Preprint
Article

This version is not peer-reviewed.

Data Trading: The Dynamic Ethics of Smart Car Privacy Equilibrium Model

Submitted: 01 September 2025
Posted: 01 September 2025


Abstract
The rise of connected and autonomous vehicles (CAVs) marks a fundamental shift in the transportation industry. However, behind this technology-driven revolution lies a fierce, silent struggle over data, privacy, and trust. The relationship between automakers and consumers is no longer simply an exchange of products and services, but has evolved into a complex, ongoing, and strategic interaction. The current market landscape often finds itself trapped in a suboptimal equilibrium characterized by low trust, where the urgent need for technological innovation and consumers' fundamental right to privacy create a profound conflict (Automotive World China, 2024). This report aims to construct a new theoretical model, the "Dynamic Ethical Equilibrium," to delineate a path toward a more stable, ethical, and ultimately mutually beneficial market for all stakeholders.

1. Introduction

The intelligent connected vehicle ecosystem faces a significant ethical conflict regarding consumer privacy. On the one hand, to achieve higher levels of autonomous driving, provide personalized services, and optimize traffic efficiency, automakers need to collect and analyze massive amounts of user data, including driving behavior, location trajectories, and even information about the in-vehicle environment (Automotive World China, 2024). This technology-driven approach is the inherent driving force behind the industry's development. On the other hand, consumers increasingly demand the protection of their personal privacy, fearing data misuse, leakage, or unauthorized commercial use (Automotive World China, 2024). Recent cases such as the "remote car lock rights protection" incident and the "vehicle data used for commercial advertising" incident vividly demonstrate the serious ethical crises and brand reputation damage that arise when this tension spirals out of control, exposing both automakers' capacity to abuse data control and a serious lack of transparency in data usage (Automotive World China, 2024).
To transcend the current unstable standoff, this report will employ a multidimensional analytical approach to construct a theoretical framework that captures the complex dynamics of the market. This framework rests on three pillars: consumer segmentation, game theory analysis, and the integration of deontological ethics. This report will systematically unfold this analytical process. First, we will define the players in the game—through an in-depth analysis of the new energy vehicle market, we identify two distinct consumer archetypes. Second, we will construct a game model to analyze the strategies and payoffs of each party and explain why a market driven purely by self-interest naturally tends toward a low-trust equilibrium that infringes on privacy. We will then introduce the core principles of deontological ethics as external rules to reshape the game landscape. Finally, integrating all elements, we will propose a “dynamic ethical equilibrium” model and explore the technological path and policy recommendations for achieving this equilibrium. This model aims not only to explain the current situation but also to provide a theoretical foundation and action guide for building a trustworthy, transparent, and mutually beneficial intelligent automotive future.

2. Game players: Consumer Segmentation in the New Energy Vehicle Market

To construct a realistic game model, the first task is to accurately identify and define the players involved. Viewing consumers as a homogeneous group seriously overlooks the diverse demands and behaviors within the market, shaped by differences in values, economic capabilities, and technological understanding. This simplistic assumption fails to explain why some consumers are open to data sharing while others are extremely wary. Therefore, this section aims to demonstrate, through empirical data analysis of China’s new energy vehicle market, that consumer groups are not monolithic but can be divided into archetypes with distinct characteristics. This segmentation is an indispensable foundation for constructing a game model with explanatory and predictive power.

2.1. The Economic Landscape of Consumer Choice

The price distribution pattern in China’s new energy vehicle market provides a first window into the differences among consumer groups. Market data shows that sales are highly concentrated in specific price ranges, which not only reflects consumers’ purchasing power but also suggests their different consumer psychology and value rankings (China Association of Automobile Manufacturers, 2025).
Data shows that the 150,000 to 200,000 RMB price range constitutes the largest single segment of the market, representing a true mainstream market share, accounting for approximately 27.8% (China Association of Automobile Manufacturers, 2025). Models such as the BYD Song and Qin perform strongly in this segment, reflecting consumers’ pursuit of value and practicality (Gasgoo, 2025). Meanwhile, the high-end market priced above 300,000 RMB also demonstrates robust growth, becoming a key battleground for emerging brands such as Ideal and NIO, as well as Tesla (China Automobile Dealers Association, 2024). Furthermore, the economy segment below 100,000 RMB is also substantial, led by models such as the BYD Seagull and the Wuling Hongguang MINI, meeting basic transportation needs (East Money Information, 2024).
This significant price stratification provides us with an analytical framework that goes beyond purely economic dimensions. Price not only reflects cost but also serves as a proxy for consumer priorities. Buyers in the mainstream market are more likely to make decisions based on “value for money,” focusing on the direct benefits of technology, such as battery life, intelligent features, and cost. Consumers in the high-end market, on the other hand, make more complex purchasing decisions. They are purchasing not just a means of transportation but also a brand identity, a cutting-edge technological experience, and higher expectations for service quality and brand ethics. These expectations naturally include respect for personal autonomy and privacy (Automotive World China, 2024). Therefore, the division of price ranges provides a solid data foundation for defining consumer archetypes with different privacy preferences.

2.2. Defining Consumer Archetypes

Based on the above market landscape analysis, we can construct two core consumer archetypes, which have adopted very different strategic stances on data privacy issues.
Archetype 1: The Value-Oriented Pragmatist
Group Profile: This archetype represents consumers who make up the majority of the market in the 100,000 to 200,000 RMB price range (China Association of Automobile Manufacturers, 2025). Their core purchasing motivation is to maximize vehicle utility within a limited budget. They are highly sensitive to the practical benefits of intelligent connected technology, such as navigation that optimizes route planning, driver-assistance features that enhance safety, or personalized information services that provide convenient entertainment (Automotive World China, 2024). For them, the core value of technology lies in its functionality.
Privacy Strategy: Their privacy perspective exhibits a pronounced "transactional" character. They tend to view personal data as a resource that can be exchanged for valuable services or features. In their "privacy calculus," convenience, functionality, and affordability often outweigh concerns about abstract privacy risks, so they are likely to click "Agree" on the "informed consent" prompt to unlock the vehicle's full intelligent features. Their trust, however, is not unconditional: the transactional relationship rests on a perception of fairness in the value exchange. If a serious data leak or misuse incident occurs (such as the cases described in Automotive World China, 2024) and causes actual damage to their interests, such as financial loss or threats to personal safety, their trust will quickly collapse and may trigger a strong backlash. Overall, though, their initial threshold for sharing data is relatively low, and their tolerance for data collection is higher.
Archetype 2: The Privacy-Conscious Principlist
Group Profile: This archetype primarily resides in the high-end market, purchasing vehicles priced above 300,000 RMB. These consumers typically have higher disposable incomes and view their cars as an extension of their personal space, identity, and values. Their brand choices go beyond simple functionality, prioritizing brand reputation, corporate ethics, a holistic user experience, and respect for consumer rights. They are at the forefront of technology, yet are also the most alert to the risks of its misuse.
Privacy Strategy: Their privacy perspective is rights-based. They view privacy not as a commodity to be traded but as an inalienable fundamental right (Automotive World China, 2024). They have very low tolerance for opaque data practices, vague privacy policies, and data collection enabled by default. When making purchasing decisions, they proactively research and compare brands' privacy commitments and practices, and they are willing to pay a premium for brands offering greater transparency, stronger user control, and more reliable data security. Privacy protection has therefore become a key competitive differentiator in the high-end market. Their brand loyalty is tightly tied to the manufacturer's ethical integrity; any betrayal of trust can lead to permanent loss of loyalty.
The core significance of this consumer segmentation lies in revealing that the data privacy issue is not a simple binary opposition of "automakers vs. users," but a more complex, multi-sided dynamic. Automakers cannot satisfy all consumers with a single privacy policy. A strategy designed to appeal to pragmatists (for example, providing extremely rich personalized features through massive data collection) may alienate principlists through its invasiveness. Conversely, an extremely strict privacy protection model may strike pragmatists as crippling functionality, making it unappealing.
This creates a profound strategic dilemma for automakers: should they prioritize the vast pragmatist majority to gain market share, risking the alienation of the high-margin principlists? Or should they court the principlists by strengthening privacy protections, making ethics a core brand competency, at the cost of some data value and mainstream appeal in the short term? This inherent tension is the core engine driving the game model constructed in the next section.
To more clearly show the differences between the two prototypes, the following table provides a systematic summary.
Table 1. Consumer Archetype Characteristics and Market Relationships.
Feature Dimension | Value-Oriented Pragmatists | Privacy-Conscious Principlists
Market positioning (price range) | Mainstream market, RMB 100,000-200,000 (China Association of Automobile Manufacturers, 2025) | High-end market, RMB 300,000 and above (Caixin, 2024)
Core purchasing drivers | Cost-effectiveness, functional practicality, economy | Brand reputation, technological leadership, user experience, ethical identity
Dominant privacy stance | Transactional: willing to exchange data for convenience and functionality | Rights-based: privacy is an inviolable fundamental right
Sensitivity to data misuse | Moderate, but reacts strongly when concrete interests are harmed | High; highly alert to opaque practices and potential risks
Representative models (examples) | BYD Song and Qin PLUS (Gasgoo, 2025) | NIO ES6, Li Auto L9, M9 (Caixin, 2024)
This table clearly transforms abstract market statistics into concrete, strategically engaged game participants, laying a solid foundation for subsequent game theory analysis. It not only describes who is playing the game but, more importantly, reveals their preferences and motivations—in other words, why they make specific strategic choices.

3. Game Model: Strategic Analysis of Data Exchange

Now that we have identified the players in the game—automakers, value-oriented pragmatists, and privacy-first principled advocates—we can construct a formal game model to analyze their strategic interactions surrounding data exchange. This section will employ the framework of non-cooperative game theory to demonstrate how, in a market driven purely by self-interest and lacking effective external regulation, the interplay of these strategies leads to an unstable equilibrium that is detrimental to consumer privacy.

3.1. Game Setting: Participants, Strategies, and Payoffs

To capture the core characteristics of this strategic environment, we will construct a three-party game model.
Players:
Automaker (A): The central player in the market; its goal is to maximize long-term profit, market share, and brand value.
Value-Oriented Pragmatists (Cp): Represent the majority of consumers in the market.
Privacy-Conscious Principlists (Cpr): Represent a smaller but high-margin consumer segment.
Strategies:
The automaker (A)'s strategy set:
SA1: Exploit Data (ED): The company employs minimal compliance measures to maximize data collection, use, and monetization. Privacy policies are vague, and user control options are limited. This strategy is exemplified by the remote locking and advertising-related data misuse cases described in Automotive World China (2024).
SA2: Protect Data (PD): The company adopts the principle of "Privacy by Design," proactively implementing strong privacy protections. Its privacy policy is transparent and easy to understand, it provides users with granular data control options, and it may employ technologies such as blockchain to enhance data security and auditability (Dorri et al., 2017).
The consumers' (Cp and Cpr) strategy set:
SC1: Share Data (SD): Accept the automaker's data collection terms in order to obtain full functionality and convenient services.
SC2: Withhold/Protect Data (WD): Refuse unnecessary data sharing, actively use privacy settings, or prioritize brands with a better privacy reputation when purchasing a vehicle.
Payoffs: The payoff to each participant is a composite function of multiple variables. We conceptualize payoffs as a combination of the following:
Profit (π): Direct sales revenue and profitability.
Market share (M): The share of the relevant market segment or of the overall market.
Data value (V_d): The value gained through data analysis, product improvement, or direct commercialization.
Trust/Reputation (T): The brand's credibility and image in the minds of consumers, an intangible asset that affects long-term development.
For example, the automaker's payoff under a given strategy combination can be expressed as the function Payoff_A = f(π, M, V_d, T). Consumers' payoffs reflect the utility of the services they receive, their sense of privacy security, and their satisfaction with the brand choice.
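To make this composite payoff concrete, the following sketch instantiates Payoff_A = f(π, M, V_d, T) as a simple weighted linear combination in Python. The weights and the numeric scores are purely illustrative assumptions, not values from this report; the point is only that raising the weight placed on trust (T) can reverse the ranking between an exploitation-oriented and a protection-oriented position.

```python
from dataclasses import dataclass

@dataclass
class AutomakerPayoff:
    """Components of the automaker's payoff, as defined in the text."""
    profit: float        # π: direct sales revenue and profitability
    market_share: float  # M: share of the relevant market segments
    data_value: float    # V_d: value from data analysis and monetization
    trust: float         # T: brand credibility, an intangible long-term asset

def payoff_a(p: AutomakerPayoff,
             w_profit: float = 1.0, w_share: float = 1.0,
             w_data: float = 1.0, w_trust: float = 1.0) -> float:
    """One possible instantiation of Payoff_A = f(π, M, V_d, T):
    a weighted linear combination (weights are illustrative)."""
    return (w_profit * p.profit + w_share * p.market_share
            + w_data * p.data_value + w_trust * p.trust)

# Hypothetical scores: an "Exploit Data" position with high data value
# but low trust, versus a "Protect Data" position with the reverse.
ed = AutomakerPayoff(profit=8.0, market_share=6.0, data_value=9.0, trust=2.0)
pd_ = AutomakerPayoff(profit=7.0, market_share=7.0, data_value=4.0, trust=9.0)
print(payoff_a(ed))                # 25.0: ED looks better at equal weights
print(payoff_a(pd_))               # 27.0
print(payoff_a(pd_, w_trust=3.0))  # 45.0: weighting trust favors PD further
```

A myopic payoff function effectively sets w_trust close to zero, which is exactly the condition under which exploitation appears rational in the analysis that follows.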

3.2. Payoff Matrix and Equilibrium Analysis

To intuitively analyze the interplay of strategies, we construct a simplified payoff matrix. This model draws on the fundamental privacy game framework (Rajbhandari & Snekkenes, 2010) and extends it to encompass three-party interactions. Because three-party games are difficult to fully represent in two dimensions, we will instead illustrate them by analyzing key strategy combinations and their outcomes.
Key strategy combination analysis:
Scenario 1: {A: ED, Cp: SD, Cpr: WD}
Analysis: This is a common situation in the current market. The automaker adopts the "Exploit Data" strategy, pragmatists "share data" to obtain functionality, and principlists choose to "withhold/protect data" (for example, by choosing other brands or strictly limiting permissions).
Payoff outcomes:
Automaker (A): In the short term, it gains very high data value (V_d) and market share (M) from the large pragmatist group, potentially generating substantial short-term profit (π). However, it loses the high-margin principlist market entirely, sacrificing long-term market potential (M), and its brand reputation (T) declines significantly because of the aggressive data strategy. This is a high-risk strategy: the pragmatists' trust could also collapse if a data scandal occurs.
Pragmatists (Cp): Gain full vehicle functionality, but expose their privacy to high risk and bear the cost of potential data misuse.
Principlists (Cpr): Protect their privacy by choosing other brands or forgoing some features, but may be unable to enjoy the brand's most cutting-edge technology.
Scenario 2: {A: PD, Cp: SD, Cpr: SD}
Analysis: The automaker adopts the "Protect Data" strategy and wins the trust of all consumers through transparency and respect for users.
Payoff outcomes:
Automaker (A): It establishes a high level of trust (T) among both consumer groups. While the value (V_d) directly monetized from data may be lower in the short term, a strong brand reputation provides multiple long-term benefits: 1) it attracts and locks in the entire high-margin principlist market; 2) it enhances brand loyalty among pragmatists, reducing customer churn; and 3) the reputation itself provides a powerful marketing advantage, helping to increase market share (M) and command price premiums, thereby increasing long-term profit (π).
Pragmatists (Cp): Enjoy all functions while also gaining privacy and security guarantees, maximizing their utility.
Principlists (Cpr): Their core privacy concerns are met while they also experience the latest technology, yielding the highest brand satisfaction.
Nash Equilibrium Analysis:
In a one-shot game with information asymmetry (consumers cannot fully monitor automakers’ data practices), the outcome of the game is likely to trend toward an “inferior” Nash equilibrium. For automakers, because the direct value of data (Vd) is obvious and immediately available, while the loss of reputation (T) is relatively delayed and uncertain, “data exploitation” (ED) may become a rationally advantageous strategy in the short term. For consumers, especially those with limited access to information and pragmatism, they often have no choice but to “share data” (SD) to use the basic intelligent functions of their vehicles (Automotive World China, 2024).
Therefore, the equilibrium of the game is likely {A: ED, Cp: SD, Cpr: WD}. This is a low-trust, high-risk, privacy-infringing equilibrium. It reflects the core contradiction of the current market: automakers, driven by short-term profits, are motivated to abuse data, while most consumers are forced to compromise their privacy under the pressure of functional demands. Only a small number of consumers with strong bargaining power and a strong preference for privacy can successfully resist. The chaos described in Automotive World China (2024) is precisely the projection of this suboptimal equilibrium in the real world.
However, this static equilibrium analysis overlooks a crucial factor: the dynamic evolution of trust. Real-world market interactions aren’t single-game transactions, but rather recurring ones that span years. Every data scandal, every policy update, and every consumer rights movement changes the payoff structure of this game.
This game is, in essence, closer to an "assurance game," in which trust is both the core resource and a fragile commodity. The principlist group acts as a "market canary": their heightened sensitivity to privacy issues makes them the earliest warning sign of eroding market trust. Their choice to "withhold/protect data" (WD) sends a clear signal to the entire market, including pragmatists and regulators, that a manufacturer's data practices are problematic.
When a major trust crisis erupts (such as the one in Automotive World China, 2024), it acts like a shock therapy to the system. This shock dramatically amplifies the perceived costs and risks of data sharing among pragmatists, potentially triggering a collective shift in their strategies from “sharing data” (SD) to “retaining/protecting data” (WD). This trust run would be catastrophic for automakers, potentially leading to a rapid collapse in market share.
Therefore, the optimal strategy for automakers cannot simply maximize short-term gains in the current round. It must factor in the risk of a trust collapse and its devastating impact on all future rounds. From this perspective, the “Protect Data” (PD) strategy is not just a moral gesture, but rather an “insurance strategy” against trust risks. It sacrifices some short-term data gains in exchange for long-term stability in the game, thus avoiding a market-wide trust avalanche. This clearly demonstrates the need for a dynamic model that captures the accumulation and loss of reputation and trust over time, a limitation of static game models.
Table 2. Three-party data privacy game payoff matrix (conceptualization).
Automaker strategy | Consumer strategy combination | Automaker payoff (Payoff_A) | Pragmatist payoff (Payoff_Cp) | Principlist payoff (Payoff_Cpr) | Equilibrium description
Exploit Data (ED) | Cp: SD, Cpr: WD | High V_d, high M (Cp only), low T, high risk | Gains functionality, sacrifices privacy | Privacy protected; functionality/brand choice limited | Low-trust equilibrium (unstable)
Exploit Data (ED) | Cp: SD, Cpr: SD | Very high V_d, very high M, very low T, very high risk | Gains functionality, but privacy seriously compromised | Core values violated; extreme dissatisfaction | Theoretically optimal in the short term, but prone to collapse after a scandal
Protect Data (PD) | Cp: SD, Cpr: SD | Medium V_d, high M, high T, low risk, high π | Gains functionality, privacy protected | Core values met; highest satisfaction | High-trust equilibrium (stable)
Protect Data (PD) | Cp: WD, Cpr: WD | Low V_d, low M, medium T | Limited functionality, but privacy secure | Core values met, but poor functional experience | Low market acceptance; automaker uncompetitive
Note: This table is a conceptual presentation intended to illustrate the returns of different strategy combinations. The specific payoff value depends on various factors, including market environment, brand positioning, and regulatory intensity.
This matrix is the core of this section, formally expressing the core conflicts in the market. It intuitively demonstrates why, under a purely rational assumption, the market can fall into an ethically undesirable equilibrium. This matrix provides a solid analytical foundation for our subsequent argument—why an external ethical framework is necessary to guide the game toward a more optimal outcome.
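The qualitative equilibrium argument can also be checked mechanically. The sketch below encodes one set of illustrative ordinal payoffs consistent with the rankings in Table 2 (the numbers themselves are assumptions, not empirical data) and enumerates pure-strategy Nash equilibria by testing unilateral deviations. Under these myopic, single-round payoffs, the only equilibrium is the low-trust profile {A: ED, Cp: SD, Cpr: WD}.

```python
# Illustrative ordinal payoffs (uA, uCp, uCpr) for each strategy
# profile (A, Cp, Cpr). The numbers only encode the qualitative
# rankings argued in the text, not measured quantities.
PAYOFFS = {
    ("ED", "SD", "SD"): (9, 4, 1),
    ("ED", "SD", "WD"): (7, 4, 3),
    ("ED", "WD", "SD"): (5, 2, 1),
    ("ED", "WD", "WD"): (3, 2, 3),
    ("PD", "SD", "SD"): (8, 6, 6),
    ("PD", "SD", "WD"): (6, 6, 4),
    ("PD", "WD", "SD"): (5, 3, 6),
    ("PD", "WD", "WD"): (4, 3, 4),
}

def is_nash(profile):
    """A profile is a pure-strategy Nash equilibrium if no single
    player can strictly gain by deviating unilaterally."""
    a, cp, cpr = profile
    u = PAYOFFS[profile]
    for alt in ("ED", "PD"):                     # automaker deviations
        if PAYOFFS[(alt, cp, cpr)][0] > u[0]:
            return False
    for alt in ("SD", "WD"):                     # consumer deviations
        if PAYOFFS[(a, alt, cpr)][1] > u[1]:
            return False
        if PAYOFFS[(a, cp, alt)][2] > u[2]:
            return False
    return True

equilibria = [p for p in PAYOFFS if is_nash(p)]
print(equilibria)  # [('ED', 'SD', 'WD')] -- the low-trust equilibrium
```

Note in particular that the high-trust profile (PD, SD, SD) is not an equilibrium here: the automaker can deviate to ED for a short-term gain (9 > 8), which is precisely the temptation the next section's ethical constraint is meant to remove.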

4. Rules of the Game: Integrating Deontological Ethics as a Normative Framework

The game analysis in the previous section revealed a disturbing conclusion: in an unconstrained market, rational self-interested behavior can lead to an equilibrium that systematically infringes on user privacy. This demonstrates that relying solely on the spontaneous interactions of market participants makes ethical outcomes difficult to achieve. To overcome this dilemma, we need to introduce a set of universally binding "game rules" that transcend utilitarian calculation. This section introduces deontological ethics as this normative framework and argues that its core principles should be viewed as fundamental constraints on game strategies, rather than simply variables that influence the payoff function.

4.1. Basic Theories of Deontological Ethics

Deontological ethics is one of the three main theories of normative ethics, alongside utilitarianism and virtue ethics (Fiveable, n.d.). Unlike utilitarianism, which focuses on the "consequences" of an action (i.e., whether it maximizes overall well-being), deontological ethics emphasizes the "intrinsic rightness" of the action itself (Modern Diplomacy, 2021). It holds that certain actions, regardless of their consequences, are inherently morally required or prohibited.
The cornerstone of this theory is Immanuel Kant’s “Categorical Imperative.” Its core ideas are threefold:
Formula of Universal Law: "Act only according to that maxim whereby you can at the same time will that it should become a universal law."
Formula of Humanity as an End: "Act in such a way that you treat humanity, whether in your own person or in the person of any other, never merely as a means, but always at the same time as an end."
Formula of Autonomy: "The will of every rational being is a will that legislates universal law."
Among these, the Formula of Humanity is the most instructive for technology ethics: when acting, people, whether oneself or others, must never be treated merely as tools, but must always be regarded as ends in themselves (Kant, 1785/2002).
Applying this philosophical approach to the technological field offers profound implications. A deontological ethical perspective argues that the design and application of technology must adhere to certain unbreakable moral obligations and respect for individual rights (Modern Diplomacy, 2021). For example, regarding data privacy, regardless of the technological advancement or commercial profits (utilitarian considerations) that may result from collecting user data, if the collection itself is conducted without the user’s genuine and free consent, it is morally wrong. This is because it reduces users to mere “tools” (data sources) to achieve corporate goals, rather than respecting them as “ends” possessing autonomy and dignity. This perspective provides a solid moral compass for evaluating the data practices of smart cars.

4.2. Deontological Ethical Principles Applicable to Data Privacy

Based on the core ideas of deontological ethics, we can extract three basic principles for the data privacy of smart cars. These principles constitute an irreducible moral bottom line.
Principle 1: The Inviolable Right to Privacy.
The right to privacy is a fundamental human right, not a commodity to be priced and sold (Modern Diplomacy, 2021). It forms the foundation of individual autonomy and dignity. Therefore, automakers’ primary ethical obligation is to respect and protect this right. Consumers should not be forced to give up their privacy to obtain basic vehicle functions, nor should they need to “redeem” their privacy rights by purchasing more expensive “premium” versions. Privacy protection should be the default setting and basic configuration of all products.
Principle 2: The Duty of Transparency.
Automakers have an ethical obligation to fully, clearly, and honestly disclose their data practices to consumers (Automotive World China, 2024). This includes: what data is collected, why it is collected, how it will be used, which third parties it will be shared with, and how long it will be stored. Any form of concealment, misleading information, or the use of obscure legalese to obscure true intentions violates this obligation. Transparency is the cornerstone of trust and a prerequisite for respecting consumers as rational decision-makers.
Principle 3: The Primacy of Informed Consent.
Obtaining user consent is a necessary but not sufficient condition for data collection. This consent must be “informed,” “active,” and “revocable.” “Informed” means consumers make decisions after fully understanding the data’s uses and risks, which requires automakers to fulfill their obligation of transparency. The prevalence of the “privacy paradox” and “privacy ignorance” phenomenon (Automotive World China, 2024) demonstrates that much of the current so-called “consent” is not truly informed consent. “Active” means data collection should be off by default, requiring users to actively choose to enable it, rather than being on by default and requiring users to struggle to find out how to disable it. “Revocable” means users should have the right to easily withdraw their consent at any time, and automakers must respect and enforce this decision. True informed consent is the ultimate expression of respect for consumer autonomy and agency (Kant, 1785/2002).

4.3. Deontological Ethics as a Game Constraint

The introduction of these three principles fundamentally changes our understanding of the game model. They should not be viewed merely as factors influencing the "trust" (T) variable in the payoff matrix, where ethical compliance "adds points" and violations "deduct points"; that utilitarian logic still treats ethics as something to be traded off. Instead, deontological ethics establishes these principles as meta-rules that determine the legitimacy of strategies.
Application of the rules:
Within the framework of deontological ethics, the automaker's "data exploitation" (ED) strategy is judged ethically impermissible because it systematically violates all three of the above principles:
It violates users' right to privacy, treating personal data as a resource the automaker may seize at will.
It relies on informational opacity, violating the duty of transparency.
It circumvents or distorts users' true wishes through complex terms and default settings, subverting the primacy of informed consent.
Consequently, the ED strategy must be removed from the set of legitimate strategies available to automakers.
This shift is revolutionary. It resolves the "prisoner's dilemma" presented in the game analysis of Section 3. In the classic prisoner's dilemma, both parties, unable to trust each other to cooperate, ultimately defect (a strategy that is individually optimal but collectively suboptimal), producing a lose-lose outcome. The data privacy game is similar: automakers fear that if they protect data while competitors exploit it, they will be at a disadvantage in data value, and therefore tend to opt for exploitation.
The introduction of deontological ethics is like introducing a binding "law" or "social contract" into the game, one that all participants must abide by. By explicitly ruling out "data exploitation," it eliminates both the motivation and the possibility of defection. Automakers are no longer tempted by the short-term gains of data exploitation, because that path is blocked by the ethical rules.
As a result, the nature of the game has shifted from a non-cooperative one characterized by mutual suspicion among participants to a constrained cooperative system seeking optimal solutions under shared rules. The strategic focus for automakers is no longer “whether” to protect user privacy, but “how” to better implement strategies for “data protection.” The core of the competition has also shifted: it’s no longer about who can extract data from users more covertly and efficiently, but rather about who can provide the most trustworthy, transparent, and user-friendly privacy protection solutions. This creates a virtuous cycle: a “race to the top” in ethical practice. This marks a fundamental shift in market logic—from one driven by data extraction to one driven by trust and user empowerment.
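This rule change can be made explicit in a toy model: take the same illustrative ordinal payoff table used for the Section 3 game (the numbers are assumptions that encode the text's qualitative rankings, not data) and simply delete ED from the automaker's admissible strategy set before solving. Under these assumed numbers, the equilibrium shifts from the low-trust profile to the high-trust profile, mirroring the "race to the top" described above.

```python
# Illustrative ordinal payoffs (uA, uCp, uCpr) per profile (A, Cp, Cpr);
# the rankings follow the text's argument, the numbers are not empirical.
PAYOFFS = {
    ("ED", "SD", "SD"): (9, 4, 1), ("ED", "SD", "WD"): (7, 4, 3),
    ("ED", "WD", "SD"): (5, 2, 1), ("ED", "WD", "WD"): (3, 2, 3),
    ("PD", "SD", "SD"): (8, 6, 6), ("PD", "SD", "WD"): (6, 6, 4),
    ("PD", "WD", "SD"): (5, 3, 6), ("PD", "WD", "WD"): (4, 3, 4),
}

def nash(a_strategies):
    """Pure-strategy Nash equilibria when the automaker is restricted
    to `a_strategies`; the restriction models deontological ethics
    as a meta-rule on the legitimate strategy set."""
    eqs = []
    for (a, cp, cpr), (ua, ucp, ucpr) in PAYOFFS.items():
        if a not in a_strategies:
            continue
        if any(PAYOFFS[(alt, cp, cpr)][0] > ua for alt in a_strategies):
            continue
        if any(PAYOFFS[(a, alt, cpr)][1] > ucp for alt in ("SD", "WD")):
            continue
        if any(PAYOFFS[(a, cp, alt)][2] > ucpr for alt in ("SD", "WD")):
            continue
        eqs.append((a, cp, cpr))
    return eqs

print(nash(("ED", "PD")))  # unconstrained: [('ED', 'SD', 'WD')]
print(nash(("PD",)))       # ED ruled out:  [('PD', 'SD', 'SD')]
```

Removing ED does not change any payoff number; it only removes the profitable deviation that destabilized the cooperative outcome, which is exactly the sense in which the ethical constraint reshapes the game rather than the incentives within it.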

5. Model Synthesis: Construction of Dynamic Ethical Equilibrium

Through the analysis in the previous three sections, we have defined the players in the game (two consumer archetypes), revealed their interaction mechanisms in an unconstrained environment (a static game model), and introduced a normative framework for reshaping the rules of the game (deontological ethics). This section will now integrate these elements to construct a unified and coherent model of “dynamic ethical equilibrium.” This model aims to explain how a stable and ethical market equilibrium is gradually formed and maintained over time and across the dimensions of reputation.

5.1. Introducing Time and Reputation: From Static Game to Dynamic Process

Real-world market interactions are not one-off transactions; they form an ongoing relationship spanning many years. Whether a consumer buys a car, receives a software update, or uses in-car services, each interaction can be considered a "round" of the game. Therefore, we need to expand our analytical framework from a static, single-shot game to a dynamic, iterated, repeated game model.
In this dynamic model, we introduce a core state variable—Trust Capital (T). Trust Capital can be considered a crucial intangible asset on an automaker’s balance sheet. It is not static but evolves dynamically with the company’s behavior. We can describe its evolution using a simplified formula:
T_{t+1} = T_t + α·Actions_t − β·Violations_t
T_t represents the trust capital stock in period t.
Actions_t represents the positive actions taken by the enterprise in period t that align with its "data protection" (PD) strategy, such as publishing clear and understandable privacy reports, launching user-friendly privacy dashboards, and passing third-party security audits. These actions contribute positively to the trust capital account, with gain coefficient α.
Violations_t represents ethical violations committed by the company during period t, such as major data breaches, exposure of opaque data-sharing practices, or forcing users to accept unfair terms (Automotive World China, 2024). Such incidents cause a sharp loss of trust capital, with penalty coefficient β. It is worth noting that the loss of trust is often nonlinear: the damage from a single serious violation (high β) may require long-term, repeated proactive actions (high α) to remediate, and may even cause permanent brand damage.
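As a minimal illustrative sketch, the trust-capital update rule can be simulated directly. All numeric values below (the initial stock, the coefficients α and β, and the per-period action and violation counts) are hypothetical assumptions chosen only to exhibit the dynamics; the model itself fixes no numbers:

```python
# Toy simulation of the trust-capital update rule
# T_{t+1} = T_t + alpha * Actions_t - beta * Violations_t.
# All numeric values here are illustrative assumptions.

def update_trust(T, actions, violations, alpha=1.0, beta=5.0):
    """One period of trust-capital evolution.

    beta > alpha encodes the nonlinearity noted in the text:
    a single violation can erase the gains of many positive actions.
    """
    return T + alpha * actions - beta * violations

def simulate(periods, T0=10.0, alpha=1.0, beta=5.0):
    """Run the update over a list of (actions, violations) pairs."""
    T = T0
    history = [T]
    for actions, violations in periods:
        T = update_trust(T, actions, violations, alpha, beta)
        history.append(T)
    return history

# Three periods of steady protection, then one serious violation:
trajectory = simulate([(2, 0), (2, 0), (2, 0), (0, 2)])
print(trajectory)  # [10.0, 12.0, 14.0, 16.0, 6.0]
```

Because β exceeds α in this sketch, the single violation in the final period wipes out more trust than three periods of proactive action had accumulated.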

5.2. How Deontological Ethics Drives Dynamic Evolution

Deontological ethical principles (Part 3) act as the "game engine" in this dynamic model. They provide clear rules for the accumulation and depletion of trust capital.
Reputation-earning mechanism: Strict adherence to the three principles of privacy, transparency, and informed consent is the primary way to accumulate trust capital (α). Conversely, any violation of these principles triggers a punitive deduction of trust capital (β).
Dynamic impact on the payoff matrix: The trust capital stock (T_t) at the beginning of each round directly modifies the payoffs of that round.
High trust capital: When T_t is high, the company enjoys a "trust premium." This is reflected in:
  • Market share (M) expands because principled people will actively choose the brand and pragmatists will be more loyal;
  • Brands can support higher pricing, thereby increasing profit margins (π);
  • In an environment of trust, consumers are more willing to share high-quality data on a voluntary and informed basis (for example, by participating in product-improvement programs), which in turn improves the quality of the data and the value (V_d) derived from it.
Low trust capital: When T_t is low, businesses suffer a "trust discount." Market share (M) shrinks, forcing them into price wars to maintain sales (damaging profit π), and consumers generally adopt a "data retention/protection" (WD) strategy, making it difficult for businesses to obtain valuable data for innovation.

5.3. Defining Dynamic Ethical Equilibrium

Dynamic ethical equilibrium is the stable state to which this iterative system gradually converges after many rounds of play. In this state, the rational strategic choice of automakers based on long-term, multi-period profit maximization fully coincides with the "data protection" (PD) strategy prescribed by deontological ethics.
The formation of equilibrium: the "tipping point." This equilibrium is achieved because, over repeated rounds of interaction, automakers ultimately realize that a rational, long-term economic decision must also be an ethical one. They find that the cumulative long-term value generated by high trust capital (including retaining principled individuals, securing the loyalty of pragmatists, reducing regulatory risks, and enhancing brand value) far outweighs any short-term gains obtainable through the "exploit data" (ED) strategy.
At the same time, the risks of adopting a "data exploitation" strategy are enormous. A single trust crisis triggers a catastrophic loss of trust capital (a high β penalty), negatively impacting returns over multiple future periods and potentially offsetting or even exceeding the short-term gains from data exploitation. Therefore, from a long-term, rational economic perspective, "data exploitation" strategies will ultimately be viewed as irrational, risky, and unprofitable. When businesses widely recognize this, the market reaches the "tipping point" of equilibrium.
Once established, this equilibrium possesses self-reinforcing stability. Any attempt to deviate from it (for example, an automaker attempting to resume data exploitation through opaque means) immediately triggers negative market feedback (media exposure, consumer complaints, regulatory investigations), leading to a decline in trust capital. That decline directly harms future profits, creating a strong incentive for the company to return to the cooperative "protect data" strategy. The result is a positive feedback loop of "ethical behavior → trust accumulation → increased long-term profits → strengthened ethical behavior," which maintains the dynamic ethical equilibrium.
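The tipping-point argument can be made concrete with a toy repeated-game computation. The payoffs, horizon, and coefficients below are hypothetical stand-ins, not values derived from the model; the point is only that a one-time exploitation bonus is dominated by the trust-dependent profit stream once a violation surfaces:

```python
# Toy comparison of cumulative payoffs under "protect data" (PD)
# versus "exploit data" (ED) in the repeated game. All numbers are
# illustrative assumptions chosen to exhibit the tipping-point logic.

def cumulative_payoff(strategy, rounds=10):
    trust = 10.0   # initial trust capital stock
    total = 0.0
    for t in range(rounds):
        if strategy == "PD":
            trust += 1.0        # steady trust accumulation (alpha)
        else:                   # strategy == "ED"
            total += 3.0        # short-term data-exploitation gain
            if t == 2:          # a violation eventually surfaces
                trust -= 15.0   # catastrophic trust loss (high beta)
        # Per-round profit scales with the current trust stock:
        # high trust supports a premium; depleted trust yields nothing.
        total += max(trust, 0.0) * 0.5
    return total

pd_total = cumulative_payoff("PD")
ed_total = cumulative_payoff("ED")
print(pd_total, ed_total)       # 77.5 40.0
assert pd_total > ed_total      # the long game favors protection
```

Under these assumptions, ED leads for the first two rounds but falls permanently behind once the violation destroys the trust stock, which is exactly the "irrational in the long run" conclusion drawn above.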
The profound significance of this model lies in its revelation that dynamic ethical equilibrium is not a distant utopian ideal, but rather an emergent property of a market system in which information flows freely and reputation has real economic consequences. It does not require companies to abandon the pursuit of profit; rather, it demonstrates that in the intelligent age, the most sustainable path to profitability is to become the most trustworthy ethical practitioner.
However, achieving this equilibrium is neither automatic nor unconditional. Its speed and stability depend heavily on one key factor: information symmetry. The "privacy ignorance" problem described in Automotive World China (2024) is precisely the greatest obstacle to achieving this equilibrium: consumers cannot punish wrongdoing of which they are unaware. Therefore, any mechanism that reduces information asymmetry is a catalyst for the market's convergence toward dynamic ethical equilibrium.
These mechanisms should include, at a minimum, independent oversight, strong laws and regulations, and advanced technological tools.
First, independent oversight, such as investigative reporting by the media, consumer protection organizations, and academic research (such as this report itself), is key to uncovering unethical behavior and triggering trust penalties.
Second, strong laws and regulations. Laws like the EU’s General Data Protection Regulation (GDPR) and California’s Consumer Privacy Act (CCPA) (Automotive World China, 2024) are crucial not only for post-event penalties but also for their “information enforcement” function. By forcing companies to fulfill transparency obligations and disclosure responsibilities, they ensure that data exploitation strategies cannot remain hidden for long.
Third, advanced technological tools, such as blockchain and decentralized identity technologies, which will be discussed in the next chapter, can empower consumers with oversight and control capabilities from a technical perspective.
Therefore, this model yields an important conclusion: the speed at which a market converges toward dynamic ethical equilibrium is proportional to the information transparency within that market. In markets with lax regulation and weak consumer awareness, the suboptimal equilibrium of low trust and privacy violations will persist longer. This provides strong theoretical support for the need to strengthen data protection legislation and promote consumer privacy education.

6. Technical and Policy Recommendations

The dynamic ethical equilibrium model depicts an ideal, stable market state, but it will not automatically materialize. Transitioning from the current conflict-ridden, suboptimal equilibrium to an ideal state based on trust requires the collaborative efforts of automakers, regulators, and consumers to proactively pave the path to equilibrium. This section aims to translate the aforementioned theoretical model into concrete, actionable recommendations, proposing practical solutions for building and maintaining a dynamic ethical equilibrium from the perspectives of technological empowerment and policy design.

6.1. Technology Empowerment: Building Trust through Design

Technology itself is neutral, but its design and application can be used to strengthen or weaken ethical principles. To accelerate convergence toward a dynamic ethical equilibrium, automakers should actively adopt and invest in technologies that fundamentally enhance transparency and safeguard user control.

6.1.1. Applying Blockchain Technology to Achieve Transparency and Auditability

The core characteristics of blockchain technology are its decentralized, tamper-proof, and transparent distributed ledger (Dorri et al., 2017). For smart car data management, a blockchain-based data rights management system can be constructed (Dorri et al., 2017). In this system, car owners can use smart contracts to precisely grant or revoke access to specific data points (such as “driving trajectory over the past 24 hours” or “average driving speed”). Whenever the automaker or an authorized third party (such as an insurance company or repair shop) accesses this data, the action is recorded as an immutable transaction and broadcasted to the entire blockchain.
This technology provides solid technical support for the principle of “transparency obligations” in deontological ethics. It transforms automakers’ commitments to “protect data” (PD) from a static privacy policy into a dynamic system that can be verified and audited by users in real time. Car owners no longer need to blindly “trust” that automakers will abide by their promises; instead, they can “verify” them. This verifiability greatly accelerates the accumulation of “trust capital” and makes any violations (i.e., attempts at unauthorized access) extremely visible, significantly increasing the cost of violations.
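As a rough illustration of the auditability property (not the architecture of any production system, including that of Dorri et al., 2017), a tamper-evident access log can be sketched as a hash chain, where each record commits to its predecessor:

```python
import hashlib
import json

# Toy tamper-evident access log: each entry commits to the hash of
# the previous one, so retroactively altering any record breaks the
# chain. A real blockchain adds distribution and consensus; this
# sketch shows only the "verify rather than trust" property.

def _hash(entry):
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

def append_access(log, accessor, data_scope):
    """Record that `accessor` was granted access to `data_scope`."""
    prev = log[-1]["hash"] if log else "0" * 64
    record = {"accessor": accessor, "scope": data_scope, "prev": prev}
    log.append({"record": record, "hash": _hash(record)})

def verify(log):
    """Recompute every hash and check each link to its predecessor."""
    prev = "0" * 64
    for entry in log:
        if entry["record"]["prev"] != prev or _hash(entry["record"]) != entry["hash"]:
            return False
        prev = entry["hash"]
    return True

log = []
append_access(log, "insurer", "average speed, last 30 days")
append_access(log, "oem", "driving trajectory, last 24 h")
assert verify(log)

log[0]["record"]["scope"] = "full location history"  # attempted tampering
assert not verify(log)  # the broken chain exposes the alteration
```

The owner (or an auditor) can thus detect any after-the-fact rewriting of access records, which is the mechanism that makes violations "extremely visible" in the sense described above.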

6.1.2. Adoption of Decentralized Identity (DID) and Self-Sovereign Identity (SSI)

Decentralized Identity (DID) and Self-Sovereign Identity (SSI) are emerging identity management frameworks whose core concept is to give users full ownership and control of their digital identities, without relying on any centralized identity provider (Abbas, n.d.). In the automotive sector, the vehicle itself, the owner, and even the passengers can have their own DIDs. When interacting with external services (such as charging stations, insurance claims, and ride-sharing platforms), users can present a "verifiable credential" containing only the minimum information necessary to complete the interaction (for example, simply proving that the vehicle is insured without revealing the specific policy number or owner's name), and the credential must be actively authorized by the user.
SSI is the ultimate technical solution for realizing the moral and ethical principles of “the primacy of informed consent” and “the inviolability of privacy.” It fundamentally overturns the current power dynamic of “corporate control of user data” and returns data control to users. This makes consumers’ “retain/protect data” (WD) strategies exceptionally powerful and easy to implement, while making “share data” (SD) strategies a truly thoughtful, refined, and proactive choice. Under the SSI framework, automakers can no longer make blanket data requests. Instead, they must demonstrate the necessity and value of each data request to users in order to gain authorization.
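The minimal-disclosure idea can be sketched as follows. Real DID/SSI stacks use public-key signatures and standardized credential formats (e.g., the W3C Verifiable Credentials model); the shared-key HMAC below is a deliberate simplification, and all names and keys are illustrative:

```python
import hashlib
import hmac
import json

# Minimal-disclosure sketch in the spirit of verifiable credentials:
# an issuer attests to a single predicate ("this vehicle is insured")
# without the holder revealing the policy number or owner name.
# A real system would use asymmetric signatures; the shared HMAC key
# here is a simplification for illustration only.

ISSUER_KEY = b"insurer-signing-key"  # hypothetical issuer secret

def issue_credential(claim):
    """Issuer signs a minimal claim dictionary."""
    payload = json.dumps(claim, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": sig}

def verify_credential(cred):
    """Verifier checks the signature over exactly the disclosed claim."""
    payload = json.dumps(cred["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, cred["sig"])

# The holder presents only the minimum predicate, not the full policy:
cred = issue_credential({"vehicle_insured": True})
assert verify_credential(cred)

# A forged or altered claim fails verification:
tampered = {"claim": {"vehicle_insured": False}, "sig": cred["sig"]}
assert not verify_credential(tampered)
```

The design point is that the credential's payload is the entire disclosure: anything not in the claim dictionary (policy number, owner name) simply never reaches the verifier.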

6.2. Regulatory Framework: Enforcing the Rules of the Game

If technology provides the tools for ethical practice, then regulation is the coercive force that ensures that these tools are widely used and that the rules of the game are followed by all participants.

6.2.1. Strengthening Global Data Protection Standards

Regulators should draw on and promote strong data protection regulations modeled after the EU’s GDPR and develop industry-specific regulations tailored to the massive and highly sensitive data generated by connected vehicles (Automotive World China, 2024). This should include mandatory requirements for automakers to conduct a Data Protection Impact Assessment (DPIA) before launching any new data-collection features, systematically identifying and mitigating privacy risks.
Strong regulation directly affects the penalty coefficient β in the dynamic equilibrium model. By imposing high fines, mandatory rectification requirements, and legal channels for class action lawsuits, it significantly increases the expected costs and risks for automakers to adopt a “data exploitation” (ED) strategy. When the cost of noncompliance becomes extremely high and certain, the economic irrationality of this strategy becomes obvious, forcing all market participants to compete within the framework of “data protection” (PD).

6.2.2. Implementing the “Privacy by Design” Certification System

Led by government regulators, industry associations, or independent third-party organizations, a privacy protection rating and certification system for smart cars should be established, similar to existing crash safety ratings (such as NCAP). This system should score vehicles based on multiple dimensions, including the necessity of data collection, the convenience of user control, the strength of data encryption, and the transparency of privacy policies, and the results should be made public.
This measure is the most effective way to address information asymmetry. It provides a clear, concise, and authoritative basis for decision-making for both pragmatists and principled individuals, allowing them to easily factor privacy protection levels into their vehicle purchase decisions. This transforms “trust capital” from a vague, perceptual concept into a quantifiable, comparable, and directly impactful “market signal” that influences sales. When privacy ratings, like safety ratings, become a focal point for consumers, the market itself will create a powerful incentive mechanism, rewarding “excellent students” who excel in privacy protection and eliminating “poor students” who slack off, thereby accelerating the industry’s evolution toward a dynamic ethical equilibrium.

6.3. The Role of Consumers: Activating Market Forces

Finally, achieving equilibrium requires the active participation of the other party in the game—consumers. A passive, ignorant consumer group cannot effectively constrain businesses.
Consumer rights organizations, the media, educational institutions, and regulatory authorities should collaborate to conduct extensive and in-depth public education campaigns to help consumers understand the data generated by smart cars, the value and risks of this data, and their legal rights (such as the right to access, the right to delete, and the right to portability).
Improving privacy literacy, especially among the pragmatists who make up the majority of the market, is key to activating market punishment mechanisms. An educated consumer base is more capable of identifying unfair data exchange terms, more inclined to use privacy protection features, and more likely to take action when their rights are violated. This increases the market’s sensitivity to corporate trust violations, ensuring that any deviation from the “data protection” strategy will be more quickly and widely detected and punished by the market, thereby consolidating the stability of the dynamic ethical equilibrium.

7. Conclusions

This article systematically analyzes the core ethical dilemma of data privacy in the intelligent connected vehicle sector by constructing a “dynamic ethical equilibrium” model that integrates consumer segmentation, game theory, and deontological ethics. Our analysis reveals that the interaction between automakers and consumers surrounding data is not simply a technical or commercial issue, but a profound strategic game involving trust and rights.
The model’s core argument is that in a mature market where information is increasingly transparent and reputation directly impacts economic returns, a company’s long-term, sustainable profitability is inextricably linked to its ethical responsibility in data management. Through dynamic analysis, we demonstrate that a “data protection” strategy, seemingly requiring the sacrifice of short-term data interests, is actually the only rational path to long-term market leadership and stable profits. In this model, trust is no longer a byproduct of commercial success but its most fundamental prerequisite. Any short-sighted attempt to gain a competitive advantage by exploiting user data is tantamount to overdrawing its most valuable asset—”trust capital”—and will ultimately pay a heavy price in the repeated market game.
Dynamic ethical equilibrium is not a passive, inevitable future. It is a socio-technical system that needs to be proactively and carefully constructed. Its realization depends on the establishment of a positive feedback loop, and the initiation and maintenance of this loop requires the collaborative efforts of all key stakeholders.
Ultimately, the path to a dynamic ethical equilibrium lies in the process of building a new automotive social contract. In this contract, the flow of data is no longer a one-way grab, but a two-way exchange of value based on trust. Technological innovation is no longer about unrestricted exploration of data boundaries, but about serving the well-being of humanity while respecting individual dignity and autonomy. This not only guarantees the healthy development of the smart car industry but also is an inevitable requirement for maintaining social trust and ethical order in an increasingly digital world.

References

  1. Abbas, S. H. (n.d.). Decentralized identity for autonomous vehicles: Secure credentialing for self-driving cars. Medium. Retrieved August 22, 2025, from https://medium.com/@syedhasnaatabbas/decentralized-identity-for-autonomous-vehicles-secure-credentialing-for-self-driving-cars-4b2b20c4f586
  2. Automotive World China. (2024). In-depth research report on the new energy vehicle industry . Retrieved August 22, 2025, from https://www.automotiveworld.cn/zh-cn/_6/_0/2024/90/50.html.
  3. Caixin. (2024). SUV Blue Book 2024 . Retrieved August 22, 2025, from https://promote.caixin.com/upload/SUV2024.pdf.
  4. China Association of Automobile Manufacturers. (2025, January). China Association of Automobile Manufacturers Information Release Conference . Retrieved August 22, 2025, from https://pdf.dfcfw.com/pdf/H3_AP202501171641965011_1.pdf.
  5. China Automobile Dealers Association. (2024, May). Analysis of the national passenger car market in April 2024. Retrieved August 22, 2025, from http://www.cada.cn/Trends/info_91_9961.html.
  6. Dorri, A., Steger, M., Kanhere, S. S., & Jurdak, R. (2017). BlockChain: A distributed solution to automotive security and privacy. arXiv preprint arXiv:1704.00073.
  7. East Money Information. (2024, September). 2024 New Car Review: Hybrid Vehicle Proportion Increases, Intensifying Competition in the SUV Red Ocean . Retrieved August 22, 2025, from https://pdf.dfcfw.com/pdf/H301_AP202409111639837051_1.pdf.
  8. Fiveable. (n.d.). Utilitarianism, deontology, and virtue ethics in AI context. AI Ethics Class Notes. Retrieved August 22, 2025, from https://library.fiveable.me/artificial-intelligence-and-ethics/unit-2/utilitarianism-deontology-virtue-ethics-ai-context/study-guide/uk9lJyQbhFMjCYkC.
  9. Gasgoo. (2025, February 25). Global new energy rankings: Geely and Li Auto surpass Volkswagen, domestic brands hold more than half the seats . Retrieved August 22, 2025, from https://auto.gasgoo.com/news/202502/25I70419376C110.shtml.
  10. Kant, I. (2002). Groundwork for the metaphysics of morals (A. W. Wood, Trans.). Yale University Press. (Original work published 1785).
  11. Modern Diplomacy. (2021, December 24). Ethical aspects relating to cyberspace: Utilitarianism and deontology . Retrieved August 22, 2025, from https://moderndiplomacy.eu/2021/12/24/ethical-aspects-relating-to-cyberspace-utilitarianism-and-deontology/.
  12. Rajbhandari, L., & Snekkenes, E. (2010). Using game theory to analyze risk to privacy: An initial insight. In IFIP Advances in Information and Communication Technology, 352 , 41-51.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.