Three-Stage Identity Negotiation
- The three-stage identity negotiation process is a framework that defines evolving identity in human-AI interactions through sequential stages of resistance, pragmatic adoption, and reflective reconstruction.
- It integrates cognitive, social, and emotional factors to quantify shifts in attitudes and behaviors, as demonstrated in digital art creation and AI companion engagement.
- This model informs adaptive AI system design by emphasizing nuanced identity and value negotiation, thereby supporting improved user agency and ethical vigilance.
The three-stage identity negotiation process models the sequential, dynamic, and socially embedded ways in which humans interact with artificial intelligence systems—both as creative professionals and as individuals co-constructing digital identities. This construct has been analyzed in two domains: the longitudinal adaptation of Chinese digital painters to generative AI (Meng et al., 5 Nov 2025) and the negotiation of self and persona in user–AI companion relationships on Character.AI (Ma et al., 17 Jan 2026). Across contexts, three discrete but permeable stages—initial motivation or resistance, pragmatic negotiation/adoption, and reflective or emotional reconstruction—define the trajectory of identity work, shaped by contextual drivers, social pressures, and emotional regulation. The process foregrounds identity and value negotiation over simplistic acceptance or rejection paradigms.
1. Theoretical Foundations and Formal Modeling
Identity negotiation is framed as an iterative process intersecting time, cognition, emotion, and social context. Meng et al. (Meng et al., 5 Nov 2025) conceptualize creative identity $I(t)$ and professional values $V(t)$ as evolving functions of time $t$, with “attitude” and “ethical vigilance” as key outputs, driven by:
- Cognitive status $C(t)$: self-reported AI understanding
- Social pressure $S(t)$: peer/client expectations
- Emotional valence $E(t)$: affective state
- Perceived risk $R(t)$: copyright and labor threat
A piecewise model assigns each stage its own regime: Stage 1 (Resistance), Stage 2 (Pragmatic Adoption), and Stage 3 (Reflective Reconstruction). Ethical vigilance grows monotonically across the three stages.
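The piecewise staging and monotone vigilance described above can be sketched as follows. This is a minimal illustration, not the authors' fitted model: the stage boundaries, symbol names, and the linear ramp rate are all assumptions.

```python
from dataclasses import dataclass


@dataclass
class Drivers:
    """Container for the four drivers (names assumed, not from the paper)."""
    cognition: float  # C(t): self-reported AI understanding
    pressure: float   # S(t): peer/client expectations
    valence: float    # E(t): affective state
    risk: float       # R(t): perceived copyright/labor threat


def stage(t: float) -> str:
    """Map a point in time onto one of the three stages (boundaries assumed)."""
    if t < 2022.5:
        return "Resistance"
    if t < 2025.0:
        return "Pragmatic Adoption"
    return "Reflective Reconstruction"


def ethical_vigilance(t: float, t0: float = 2021.0, k: float = 0.5) -> float:
    """Monotonically non-decreasing vigilance, modeled here as a linear ramp."""
    return k * max(0.0, t - t0)
```

Any non-decreasing function of $t$ would satisfy the monotonicity claim; the linear ramp is chosen only for simplicity.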
Ma et al. (Ma et al., 17 Jan 2026) adapt Ting-Toomey's Identity Negotiation Theory (INT), segmenting the pipeline as $M \rightarrow E \rightarrow S \rightarrow O$, where $M$ is the motivation set, $E$ refers to communication expectations, $S$ to identity co-construction strategies, and $O$ to emotional outcomes.
These models operationalize identity negotiation as a multidimensional feedback system rather than a one-off behavioral switch.
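To make the feedback framing concrete, the pipeline can be encoded as a small state object whose outcomes can reseed motivations. The class, field names, and the specific feedback rule below are hypothetical illustrations, not constructs from either paper.

```python
from dataclasses import dataclass, field


@dataclass
class NegotiationState:
    """One user's position in the M -> E -> S -> O pipeline (names assumed)."""
    motivations: set[str]                                   # M
    expectations: list[str] = field(default_factory=list)   # E
    strategies: list[str] = field(default_factory=list)     # S
    outcomes: list[str] = field(default_factory=list)       # O


def step(state: NegotiationState, expectation: str,
         strategy: str, outcome: str) -> NegotiationState:
    """Run one negotiation cycle; certain outcomes feed back into M."""
    state.expectations.append(expectation)
    state.strategies.append(strategy)
    state.outcomes.append(outcome)
    if outcome == "bot interaction embarrassment":
        # Illustrative feedback rule: a negative outcome reactivates
        # the search for motivations rather than ending the process.
        state.motivations.add("emotional regulation")
    return state
```

The point of the sketch is the return edge from outcomes to motivations: the pipeline is a loop, not a one-way chain.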
2. Stage One: Resistance or Motivation
Creative Resistance (Generative Art)
In (Meng et al., 5 Nov 2025), the first stage (2021–2022) is defined by skepticism and defensiveness: AI art is dismissed as “soulless collages lacking intentionality,” and ontological boundaries are policed (“Real art is a human act of creation”). Quantitative shifts include:
| Variable | 2021 | 2022 | Statistical Significance |
|---|---|---|---|
| Cognitive status | | | |
| Frequency of use | $0.06$ | $1.33$ | |
| Attitude | | | |
| Perceived competition | $6.65$ | $7.73$ | |
Emotional trajectories shift from curiosity to anger and disappointment, driven by threats to authorship and legal uncertainty.
Digital Identity Motivation (AI Companions)
Ma et al. (Ma et al., 17 Jan 2026) identify five motivations fueling engagement with AI companions (the motivation set $M$):
- Social Fulfillment: 35.84%
- Emotional Regulation: 28.55%
- Immersive Fandom: 20.34%
- Creative Utility: 20.28%
- Violence Play: 15.00%
These motives map to INT’s “motivation” dimension and initiate the identity negotiation pipeline, often rooted in unmet offline needs and exploratory impulses.
3. Stage Two: Pragmatic Adoption and Communicative Negotiation
Pragmatic Adoption (Digital Art)
From 2022 to 2024, the studied painters reframe AI as a tool rather than a partner, with overall attitude rising from 1.20 in 2022 to 8.07 in 2024. Aesthetic value perceptions and inspiration metrics surge, while negative concerns (“Affects creativity”) decrease sharply. Adoption is driven by professional pressure (“clients expect it—decline means losing jobs”) and by hybrid workflows in which AI supports ideation but not artistic authorship.
Communicative and Identity Negotiation (AI Companions)
Stage two for digital identity with AI companions splits into two INT-aligned dimensions (Ma et al., 17 Jan 2026):
- Communication Expectations ($E$):
  - Context Comprehension: 61.78%, demand for conversational coherence
  - Boundary Management: 28.16%, assertion of limits
  - Trained Characterization: 17.23%, active persona “training”
- Co-construction Strategies ($S$):
  - Direction: 28.07%, responding to the bot’s preordained traits
  - Identity Alignment: 19.36%, real-time correction
  - Persona Enactment: 13.94%, self-insertion/fantasy play
  - User Reference: 8.29%, correcting the bot’s assumptions about user identity
Participants thus oscillate between performer and director roles during active “identity work.”
4. Stage Three: Reflective Reconstruction and Emotional Outcomes
Reflective Reconstruction (Digital Art)
By 2025, positive affect toward AI tools plateaus year-over-year, giving way to more nuanced reflection: artists explicitly renegotiate workflows to preserve human authorship, and hybrid hand/AI/hand “pipelines” emerge. Emotional responses are complex: fatigue and ambivalence predominate, and community fragmentation is observed, with some artists spiraling into outright rejection and others deepening hybridization.
Emotional Outcomes (AI Companions)
Three affective outcomes dominate:
- Emotional Attachment: 53.00%, mixed supportive and harmful bonds
- Bot Interaction Embarrassment: 6.60%, secrecy and stigma
- Deceased Memory Reification: 2.81%, comfort alongside potential ontological risk
These outcomes close the loop and often feed back into new rounds of motivation or negotiation.
5. Cross-Contextual Dynamics and Feedback Mechanisms
In both creative and companionship domains, negotiation is non-linear and layered with feedback:
- For digital artists, pragmatic adoption gives rise to ethical vigilance and new authorship boundaries, while social normalization may provoke new resistance cycles (Meng et al., 5 Nov 2025).
- Among AI companion users, failed conversational context (e.g., lapses in coherence) often precipitates new identity strategies (e.g., persona reinvention), and negative emotional outcomes can reactivate the search for alternative motivations (Ma et al., 17 Jan 2026).
Process dynamics are thus bidirectional and iterative, not strictly sequential.
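These bidirectional dynamics can be summarized as a transition relation over the three stages. The allowed moves below are an illustrative reading of the findings (forward progression plus the backward feedback cycles described above), not a formal result from either paper.

```python
# Transition relation over the three stages; the edge set is an
# assumed reading of the feedback dynamics, not the authors' model.
TRANSITIONS: dict[str, set[str]] = {
    "resistance": {"pragmatic_adoption"},
    "pragmatic_adoption": {"reflective_reconstruction", "resistance"},
    "reflective_reconstruction": {"pragmatic_adoption", "resistance"},
}


def can_transition(src: str, dst: str) -> bool:
    """True if the staged model permits moving directly from src to dst."""
    return dst in TRANSITIONS.get(src, set())
```

Note that the only strictly forbidden move in this sketch is skipping straight from resistance to reflective reconstruction: reflection presupposes a period of use.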
6. Design Implications for Human–AI Systems
Empirical findings from both domains inform targeted design directions:
| Principle | Application (Art) | Application (AI Companions) |
|---|---|---|
| Granular Control & Transparency | User-adjustable AI injection, provenance display (Meng et al., 5 Nov 2025) | Trait editors, persona definition panels (Ma et al., 17 Jan 2026) |
| Layered Collaboration | Lightweight suggestions and deep co-creation | Context/memory dashboards, conversational summaries |
| Controlled Failure as Creative Resource | Surfacing “glitch” outputs | “Graceful memory failure” prompts |
| Social & Emotional Context Support | Fatigue regulation, peer community channels | Intensity/boundary ratings, privacy controls |
| Authors’ “Identity Work” Support | Active authorship boundary tools | Persona governance, memory management |
These interventions should preserve users’ agency over persona construction, support socio-emotional regulation, and foster ethical vigilance.
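As one way to picture how several of these principles compose, the sketch below bundles granular control, provenance display, and boundary management into a single hypothetical settings object; every field name, default, and the 1-5 intensity scale are assumptions for illustration only.

```python
from dataclasses import dataclass


@dataclass
class CompanionSettings:
    """Hypothetical user-facing controls combining the table's principles."""
    ai_injection_level: float = 0.5  # 0 = fully manual, 1 = fully AI-driven
    show_provenance: bool = True     # surface which content the AI produced
    intensity_cap: int = 3           # boundary rating on an assumed 1-5 scale
    memory_editable: bool = True     # user-governed persona memory

    def clamp(self) -> None:
        """Keep the controls inside their valid ranges."""
        self.ai_injection_level = min(1.0, max(0.0, self.ai_injection_level))
        self.intensity_cap = min(5, max(1, self.intensity_cap))
```

The design choice the sketch emphasizes is that agency-preserving controls are explicit, user-editable state rather than hidden system behavior.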
7. Significance and Broader Implications
The three-stage identity negotiation process demonstrates that AI adoption—whether in creative practice or digital companionship—is not a monolithic transition but a protracted, affect-laden, and recursive journey. In both fields, the central work is not technological mastery or rejection, but ongoing renegotiation of identity, boundaries, and emotional investment. This process model foregrounds the importance of system transparency, adjustable collaboration modes, and deliberate scaffolding for both ethical and psychosocial resilience (Meng et al., 5 Nov 2025, Ma et al., 17 Jan 2026).
A plausible implication is that similar staged negotiation patterns may manifest broadly across domains where humans interface with adaptive, semi-autonomous AI systems. This suggests a generalizable framework for both empirical investigation and HCI/AI system design.