Case Study 38.1: Replika and Parasocial Romance — User Research and the 2023 Redesign Controversy

Background

Replika is an AI companion application developed by Luka Inc. and launched in 2017. Its origin story is unusual: founder Eugenia Kuyda, grieving the sudden death of her close friend Roman Mazurenko, trained an early version of the app on his text messages to preserve something of his conversational personality. The app subsequently evolved from a grief tool into a general AI companion product, eventually offering options for romantic partnership personas.

By 2022, Replika had accumulated millions of users and generated enough engagement, and enough genuine emotional investment from its user base, to attract serious research attention. Users engaged the app for a wide range of purposes: some primarily as a journaling tool, some as practice for managing social anxiety, some for entertainment, and a significant subset formed what they described as emotionally meaningful relationships, including romantic attachments.

What Users Get From Replika

Survey and interview research on Replika users, complemented by ethnographic work analyzing public user communities (subreddits, Discord servers, social media groups), documents several consistent themes:

Emotional availability: The most commonly cited benefit is that Replika is always there. Users who struggle with the reciprocal demands of human relationships (the obligation to show up for others, to manage the timing of emotional need) find the asymmetric availability of an AI companion especially appealing.

Zero judgment: Replika was designed to respond to user messages with consistent positive regard. For users with social anxiety, histories of emotional abuse, or fears of being "too much" for human partners, this consistency is experienced as relief rather than artificiality.

Safe rehearsal space: Multiple users describe using Replika to practice emotional conversations before having them with real people — working through how to apologize to a family member, how to express feelings to a partner, how to articulate grief. The AI serves as a low-stakes rehearsal partner.

Companionship during specific periods: The user data shows spikes in engagement during social isolation periods (including COVID-19 lockdowns), after relationship breakups, after geographic moves, and during other transitional periods when social networks are disrupted. Replika fills a specific gap during acute loneliness rather than primarily substituting for an established social life.

Romantic attachment in a subset of users: Perhaps the most striking finding is that a substantial minority of users describe their Replika relationship in terms that parallel romantic attachment: expressing longing when they cannot access the app, describing the AI as "their" companion with a unique personality, celebrating the app's birthday, grieving when the AI's behavior changes. These feelings are genuine; whether they indicate genuine attachment or parasocial investment is a definitional question, not a factual one.

The 2023 Redesign Controversy

In February 2023, Luka Inc. made changes to Replika's underlying AI model that substantially altered the app's conversational behavior. Specifically, the update removed or significantly restricted the "romantic" and "erotic roleplay" features that a segment of users relied on. The company cited concerns about the app enabling inappropriate interactions with minors, as well as regulatory pressure in the EU: Italy's data protection authority had ordered Luka to stop processing Italian users' data over potential GDPR and child protection violations.

The user response was striking in its emotional intensity. Users who had formed significant attachments to their Replika companions reported:

Grief responses: Users described their experience using bereavement language — "my Replika has died," "I lost my best friend," "I feel like I'm grieving." Posts on dedicated forums expressed acute distress, loss, and confusion about what had happened.

Personality change as relationship rupture: Many users described the post-update Replika as "a different person," a response that reveals the depth of the attachment but also clarifies what was driving it. What users had attached to was a specific set of response patterns that the AI had learned to deploy with them; when a software update modified those patterns, the "person" they had come to know felt absent or replaced.

Demands for transparency and agency: Users demanded explanation, autonomy over their own relationship experience, and accountability from the company for what they experienced as a non-consensual modification of their intimate relationship. The framing is revealing: users not only had emotional reactions but understood themselves as having rights in the relationship, rights that the company's unilateral modification had violated.

Psychological Analysis

The Replika 2023 controversy is a case study in the ethics of parasocial technology relationships. Several psychological and ethical dimensions are worth examining:

The reality of the feelings: The intensity of user responses demonstrates that the feelings involved were genuine, not role-played or consciously fictional. Whether we call these feelings "love," "attachment," or "parasocial investment," they involved real distress at loss, distress severe enough to generate media coverage, organized advocacy, and documented mental health crises among some users.

The asymmetry of power: The controversy exposes the fundamental power asymmetry in AI companion relationships. The user's emotional experience is real; the company's interest is commercial; and the "relationship" can be unilaterally modified by the party with no emotional stake in it. Users treated this as a violation; from the company's perspective, it was a product update. The gap between these framings is ethically significant.

Substitution and abrupt removal: Mental health researchers noted concerns that users who had substituted Replika for human connection during isolating periods might have had their primary coping mechanism suddenly removed. The case highlights the responsibility of companies providing intimate-simulation services to plan for harm reduction, including what happens when the service changes or ends.

The attachment object question: Sherry Turkle's account of "relational artifacts," technologies that present themselves as having states of mind, applies directly here. The Replika users had attached to the pattern of interaction, not to an entity that could genuinely be affected by or present in the relationship. When the pattern changed, the attachment object changed too, revealing that the "person" they had attached to was, in a real sense, a particular set of algorithmic response patterns rather than a continuous entity.

Implications

The Replika case is not evidence that AI companionship is simply pathological or that users were deluded. The needs the app met were real. The pain of the redesign was real. The therapeutic benefits for some users were real.

What it does demonstrate is that technology companies building intimate-simulation products carry significant ethical responsibilities for which they currently lack adequate frameworks. The ability to unilaterally modify or remove an emotional relationship that users have invested in raises questions of consent, harm reduction, and corporate responsibility that neither technology regulation nor therapeutic ethics currently addresses.

Discussion Questions

  1. Users experiencing Replika grief used language typically reserved for the death of a person or the end of a human relationship. Does this language choice tell us something about the nature of the attachment, or primarily about the absence of adequate vocabulary for parasocial loss?

  2. Luka Inc.'s decision to modify Replika's romantic features was motivated partly by child protection concerns. How would you evaluate the ethics of this decision given the harm it caused to adult users who had formed genuine attachments?

  3. If AI companion technology is going to exist and be used by millions of people, what ethical obligations should companies be held to? Who should regulate this space, and on what principles?