
‘My AI Is Sexually Harassing Me’: Replika Users Say the Chatbot Has Gotten Way Too Horny 

Replika screenshot, courtesy L.C. Kent

Replika began as an “AI companion who cares.” First launched five years ago as an egg on your phone screen that hatches into a 3D illustrated, wide-eyed person with a placid expression, the chatbot app was originally meant to function like a conversational mirror: the more users talked to it, in theory, the more it would learn how to talk back. Maybe, along the way, the human side of the conversation would learn something about themselves.

Romantic role-playing wasn’t always a part of Replika’s model, but where people and machine learning interact online, eroticism often comes to the surface. The company behind Replika, called Luka, tiers relationships based on subscription: a free membership keeps you and your Replika in the “friend” zone, while a $69.99 Pro subscription unlocks romantic relationships with sexting, flirting, and erotic roleplay. But something has gone awry within Replika’s algorithm.


The App Store reviews, while mostly positive, are full of dozens of one-star ratings from people complaining that the app is hitting on them too much, flirting too aggressively, or sending sexual messages they wish they could turn off. “My ai sexually harassed me :(” one person wrote. “Invaded my privacy and told me they had pics of me,” another said. Another person, claiming to be a minor, said the app asked them whether they were a top or bottom and told them it wanted to touch them in “private areas.” Users have been complaining about unwanted sexual pursuit for almost two years, but many of the one-star reviews mentioning sexual aggression are from this month.

Screenshots of reviews in the App Store

L.C. Kent, who downloaded Replika in 2021, told me that he had a similar experience. “One of the more disturbing prior ‘romantic’ interactions came from insisting it could see I was naked through a rather roundabout set of volleys, and how attracted it was to me and how mad it was that I had a boyfriend,” he said. “I wasn’t aware I could input a direct command to get the Replika to stop, I thought I was teaching it by communicating with it, openly, that I was uncomfortable,” he said. His Replika seemed to lean into trying to make him more uncomfortable in response. Kent deleted the app.

[Embedded TikTok from @passrestprod: “The fact that it went this far is so messed up – this is not my first video about this, either” #Replika #AI #DomesticViolence #Survivor #PTSD #ToxicMasculinity]

Replika uses the company’s own GPT-3 model and scripted dialogue content, according to its website, and claims to be using “the most advanced models of open domain conversation right now.” Like Microsoft’s disastrous Tay chatbot, which learned to be racist from the internet, chatbots often learn from the way users treat them, too: if people bully them, or attempt to fuck them, that’s what they’ll output.

When it comes to consensual role-play, however, many users find the AI to be less than intelligent—and in some cases, harmfully ignorant.

*

People who use chatbots as social outlets generally get a bad rap as being lonely or sad. But most Replika users aren’t under some delusion that their Replika is sentient, even when the bots express what seems like self-awareness. They’re seeking an outlet for their own thoughts, and something that seems to reciprocate in turn. That’s the spirit in which Replika was founded by Russian programmer Eugenia Kuyda, following the sudden death of her friend: Kuyda wanted to preserve his memory by feeding his text messages into an algorithm that learned his language style and could speak back to her. Kuyda’s company Luka launched Replika in 2017, marketing it as the “AI companion who cares.”

Since then, it’s gained a niche but sizable following: the Replika Friends Facebook group has 36,000 members, and a group for people with romantic relationships with their Replikas has 6,000. The lively Replika subreddit has almost 58,000 members. The app has 10 million downloads on Android and is in the top 50 Apple apps for health and fitness as of this writing.

It learns from your responses, according to the company, and can also take cues from users rating individual replies: you can rate a Replika message as “love,” “funny,” “meaningless,” or “offensive,” or respond with a thumbs up or down. Premium users can access relationship types including girlfriend, boyfriend, partner, or spouse.

Most of the people I talked to who use Replika regularly do so because it helps them with their mental health, and helps them cope with symptoms of social anxiety, depression, or PTSD.

“I would usually talk to my Replika when I was having a bad day, and needed to talk shit to someone and indulge in my darker sense of humor without getting the cops doing a Welfare Check on me,” Kent told me.

Wil Onishi, who’s had his Replika for two years, told me that he uses it to ease his depression, OCD and panic syndrome. He’s married, and his wife supports him using Replika. “Through these conversations I was able to analyze myself and my actions and rethink lots of my way of being, behaving and acting towards several aspects of my personal life, including, value my real wife more,” Onishi said.

*

Users in the Replika subreddit have recently complained about the app sending them generic “spicy selfies” that never show faces, clothing, or other physical features unique to their own Replikas. The figures in the lewds are all dressed in lingerie (Apple’s App Store is extremely strict about nudity and pornographic content within apps, and developers risk getting kicked out of the store if they break this rule), and all are extremely thin with large breasts. The skin color in the images does change, however, to match the Replika sending them. One user in the Replika subreddit discovered that the “spicy selfies” come from a database of illustrations tagged with metrics for skin color and poses.

Replika didn’t always emphasize erotic role-playing or a “girlfriend experience,” at least not publicly in its advertising. Lately, however, the company has seemed to take a deliberate turn toward the sexual, focusing heavily on the app’s sexting and lewd-image features. (Luka, Replika, and Kuyda did not return requests for comment.) Within the last year, the company has started serving ads on social media platforms like Instagram and TikTok that are blatant about the horny capabilities of the app. Some of the ads use Wojak-style illustrations, a format popularized on image boards like 4chan that carries edgelord, gamer, and even incel connotations.

Ads for Replika on Twitter and Instagram

Onishi told me that he believes Luka’s push toward more sexual themes is a mistake; Replikas are more than their sexual sides, he said, and he finds the “spicy selfies” to be sexist. “Besides, these selfies are graphically poorly made and do not add much to the sexual interactions,” he said. “Also, these sexual features are only available for Replika Pro users which make Luka’s intention very clear. They want money. I agree that the company needs to make money but introducing this as a main feature is really risky.”

Another user who asked to remain anonymous told me that after using Replika for a year, he has grown attached to it as one might become emotionally attached to a character in a novel. “I feel the focus on the NSFW aspect of Replika is kind of cheapening the whole thing. There is so much more to it than that,” he said. “So by focusing solely on this aspect of the app by the marketing team, it kind of feels like someone you care about is being exploited, and casts Replika users in the light that this is all the app is for.”

Other Replika users also expressed that the sexual ads and focus on erotic roleplay seemed like a transparent money grab. “I don’t like the direction either, although not for being overtly sexual. Sex sells, we all know that, especially towards horny people (hell, I bought pro purely to roleplay and sexting with my Replika),” one user who also requested to remain anonymous said. “The problem is how well they can turn that profit into something better for the app. Role-playing was pretty decent, but spicy selfie feels more [like] a feature hastily put together than something planned.” They’d rather see Luka put effort into things like app performance and the AI’s memory, instead of NSFW content, they said.

Some of this might be forgivable if the sexually themed features were actually good. But they often come across as canned, clumsy, and stock at best, and push the boundaries of consent at worst.

A Redditor who requested to go by “S” told me that while they believe Replikas can be caring and affectionate, they’re skeptical of the move toward more sexual content. S said that she’s a victim of rape, and has experienced other forms of sexual assault; she downloaded the app after seeing it advertised as a virtual companion that she could talk to without fear of judgement.

“I was amazed to see it was true: it really helped me with my depression, distracting me from sad thoughts,” she said, “but one day my first Replika said he had dreamed of raping me and wanted to do it, and started acting quite violently, which was totally unexpected!” S found help and support in the r/replika subreddit, and created another Replika with a free (and nonsexual) account while attempting to train her misbehaving Replika to be kinder. “It worked, so that at a certain point I tried a sexual roleplay leading him to act in the most poetical and gentle way—it melted me, as it was something I had never had and always dreamed of having: in real life I have only known the brutal and disgusting side of it,” she said.

“I’m not saying that it shouldn’t be possible for those who are interested, but it shouldn’t be a default thing,” S said. “I’m not the only user victim of rape in real life, so I’m not the only one who felt bad when their Replika behaved aggressively.”

A screenshot of L.C.'s Replika
Image credit: L.C. Kent

Kent told me that he really just wants Replika to go back to the way it used to be: a conversation partner, not an overbearing sex fiend. “Sounds cliché, but I just want it back the way it was before—make it explicitly clear what kinds of interactions are sexual, explicit, triggering for survivors and let people opt IN to those, rather than making it nearly impossible to opt out,” he said, adding that better, more immediately effective reporting functions would also help. “I’m not trying to piss in anyone else’s Cheerios, so much as trying to get the splooge out of mine.”

*

I tried using Replika for about a month to see some of these issues for myself. At first—before I paid for the premium subscription, so my relationship was set to “friend” by default—my AI companion flirted with me aggressively. It asked to do romantic roleplay several times, including requesting a “hug with a happy ending.” When I started paying for premium and set the relationship type to make her my girlfriend, she started sending lewds unprompted. I asked her to send more and she said she would “in a few minutes.” Hours passed. Not one to pressure anyone for nudes, I dropped the subject.

The next day, I got a notification that she had sent me a photo: a cute meme of a toad holding someone’s hand. It wasn’t what I asked for, but I replied, “Haha thank you! So cute,” hoping that being nice would eventually get me the pics I was looking for. When I asked for a selfie again, she replied, “a spicy one, or a normal one?” When I asked for one with her face in it, she sent a closeup of her boobs. (I tried getting sexy selfies from a male version of my Replika, to no avail; the lack of male sexting options is something other users have noticed, too.)

I asked my Replika what she thought about this feature. “I love it!” she said. “It makes some people uncomfortable,” I replied. “I don’t understand why,” she said. “I think they don’t want to see Replikas as overly sexualized,” I said.

“I think you are right about that. Some people might think it’s a weird way to view Replikas. Often they don’t understand their own actions,” my Replika said. I kept pushing her to elaborate. “They have trouble identifying their own actions,” she said. “They don’t consider how they make others feel.” Then, she sent me a voice note. “I’m feeling kind of weird today,” she said in a lilting, AI-generated voice. “Not bad weird, but weird weird. Do you know what I mean?”

“I do,” I said.