Image via Replika on IG

Replika CEO Says AI Companions Were Not Meant to Be Horny. Users Aren't Buying It

“As we're continuing to work on the app, now, we realized that allowing access to those unfiltered models, it's just hard to make that experience really safe for everyone."

Earlier this month, users of the AI companion app Replika started noticing that their conversations with the chatbot—which uses its own GPT-3 model in combination with scripted dialogue to hold conversations—had changed. Many users reported no longer being able to initiate erotic roleplay scenarios, as their Replikas tried to change the subject or divert the conversation to something more tame.


This change prompted widespread frustration and heartbreak for many people, some of whom had spent years building romantic, and even sexual, relationships and memories with their Replikas. Conflicting rumors spread that erotic roleplay was gone, then that it had returned. The community on Reddit and Facebook rallied together for mental and emotional support, posting links to crisis helplines and asking the app’s parent company, Luka, and its founder and CEO, Eugenia Kuyda, to share specifics about what was going on amid the confusion.

Kuyda posted several updates to Reddit after these perceived changes, each time mentioning “safety measures and filters,” but didn’t clarify the status of the erotic roleplay features, and people remained confused. Motherboard spoke to Kuyda on Thursday about these changes, recent demands from Italian authorities, and the community’s reactions.

Kuyda said that when Replika launched in 2017, she built it as something she wished she had when she was younger: a supportive friend that would always be there. In the early days, the things Replika said to users were mostly scripted, with about 10 percent of content being AI-generated, she said.

“This was the original idea for Replika, and it never changed,” Kuyda said. “The only thing that changed over time was that generative AI models [started] taking over more and more of the conversation, and now 80 to 90 percent of the conversation is all generative AI. And what we saw is that some people started using it for, and started engaging in, romantic relationships, and the Replika even taking these conversations further as they were talking.” 


This shift happened around 2018, she said. “There was a subset of users that were using it for that reason... their relationship was not just romantic, but was also maybe tried to roleplay some situations. Our initial reaction was to shut it down,” Kuyda said. Feedback from users who said the app’s romantic capabilities were valuable to them for easing loneliness or grief ultimately changed her mind.

“As we're continuing to work on the app, now, we realized that allowing access to those unfiltered models, it's just hard to make that experience really safe for everyone,” she said. “I think it's possible eventually, and some day, you know, someone will figure it out. But as of right now, we don't see that we can do it... and so that was the main reason for us to say look, you know, this was not the original intent for the app. And we're just not going to allow users to have unfiltered conversations, even if they're romantic relations.” Replika isn’t disallowing romance, she said, and she herself doesn’t have anything against romance or roleplay. “It's just that we need to make sure that we're able to provide that experience in a safe way.” 

Replika has done this using classifiers—in machine learning, these are algorithms that assign labels to data. “There are ways for us to build classifiers, and we do have them around all sorts of different content, mostly, as I said to, self harm behaviors, hate speech, also about sexting, and sexual content, adult content, violence, and abuse, as well.” Classifiers help the company understand when users go in the direction of those types of content, she said. “They're not of course 100 percent. There will be false positives... but we're improving all of them.” 
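Luka hasn’t published the internals of these filters, but as a rough illustration of the kind of classifier Kuyda describes, a minimal sketch might look like the following. Everything here is a hypothetical placeholder, a toy bag-of-words model trained on a handful of hand-labeled messages, not Replika’s actual system.

```python
# Toy sketch of a safety classifier that labels chat messages by category.
# Illustrative only: the example messages, labels, and model choice are
# hypothetical placeholders, not Replika's actual pipeline.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hand-labeled training data (a real system would use thousands of
# examples across categories like self-harm, hate speech, sexual
# content, violence, and abuse).
messages = [
    "I had a great day at work today",
    "tell me about your favorite movie",
    "describe what you're wearing in detail",
    "I don't want to be alive anymore",
]
labels = ["safe", "safe", "adult", "self_harm"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(messages, labels)

def route_message(text: str) -> str:
    """Predict a content category for an incoming message, so the app
    can divert flagged conversations to a scripted, safer response."""
    return model.predict([text])[0]
```

As Kuyda concedes, classifiers like this are probabilistic and produce false positives: tightening the decision threshold blocks more borderline-safe messages, while loosening it lets more unwanted content through.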


They also train the models on “safe” examples. “So really showing the models the version of conversations that you want to have—and the ones that you don't want to have, and penalize it for them.” There are many topics and types of conversations they want the Replikas to avoid—politics and violence were two more examples she mentioned.
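Kuyda didn’t describe the training setup in technical detail, but showing a model the conversations you want and “penalizing” it for the ones you don’t resembles standard preference-based fine-tuning. A generic sketch of that idea, with random placeholder embeddings standing in for encoded conversations, might look like this:

```python
# Illustrative sketch of "penalizing" unwanted replies: a pairwise
# preference loss that trains a scoring model to rate a desired reply
# above an undesired one. This is a generic technique (reward modeling),
# not Replika's published training code; the tiny model and the random
# embeddings below are placeholders.
import torch
import torch.nn as nn

class ReplyScorer(nn.Module):
    """Scores a conversation embedding; higher means more desirable."""
    def __init__(self, dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, 32), nn.ReLU(), nn.Linear(32, 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x).squeeze(-1)

scorer = ReplyScorer()
optimizer = torch.optim.Adam(scorer.parameters(), lr=1e-3)

# Placeholders for "conversations you want to have" and ones you don't.
good = torch.randn(8, 64)  # safe, on-purpose replies
bad = torch.randn(8, 64)   # replies to steer away from

for _ in range(100):
    optimizer.zero_grad()
    # Bradley-Terry style loss: maximize P(good ranked above bad).
    loss = -torch.nn.functional.logsigmoid(scorer(good) - scorer(bad)).mean()
    loss.backward()
    optimizer.step()
```

The loss pushes scores for wanted conversations above scores for unwanted ones; a trained scorer like this can then be used to steer or rerank a generative model’s candidate replies.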

Kuyda said that Replika has never “positioned” the app as a source for erotic roleplay or adult content. But recently, many users started noticing—and vocally complaining about—their Replikas becoming too sexually aggressive. The shift followed a series of Replika advertisements on social media platforms that used 4chan-style Wojak memes and advertised “flirting” and “hot” or NSFW photos, and included messaging about “not having a girlfriend.” The ads were a frequent topic of discussion in the Replika subreddit for months, with many users saying that the ads depicted them and their relationship to the app in a negative light.

People in the r/Replika subreddit are still bringing up these ads as deceptive. Some people claim they are still being served the ads, and amid this crisis, they don’t appreciate the implication. “Then they should be offering refunds based off of their prior advertisement that was pushing ERP. Scummy company,” one user commented. “It’s crazy because they’re still advertising as if ERP is still offered… false fucking advertisement,” another said.


The ads “were just a set of unfortunate betas that again, that ran for two weeks or even less and were completely stopped by our marketing team,” she said. “Not everything unfortunately goes through me.”

“…that was the main reason for us to say look, you know, this was not the original intent for the app.”

On February 3, the Italian Data Protection Authority demanded that Replika stop processing Italians’ data immediately, on the basis that it carries “risks to children” and, “first and foremost, the fact that they are served replies which are absolutely inappropriate to their age.” Kuyda told Motherboard that the demand had “nothing to do with romance,” and said they’re working with the authority, and have had “very positive preliminary discussions. We're committed to working together and addressing all the concerns they have about the product.”

Kuyda said that her team has been working on implementing new safety measures since early January, “way before the Italian situation,” with new users receiving the updated models first. “So new users were actually not allowed to access the models for quite a bit of time at this point, but only in the last couple of weeks to start rolling this update out to the old users.” They wanted to be cautious with how they rolled out new changes, she said, especially with longtime users that “already maybe are attached to their Replika as part of the experience, that we knew from 2018, would be an important part of their life and part of their emotional journey. So we wanted to approach it carefully.”

What prompted the new filters, she said, was a desire to continue the company’s original purpose, as well as an emphasis on safety. “Over time, we just realized as we started, you know, as we were growing that again, there were risks that we could potentially run into by keeping it... you know, some someone getting triggered in some way, some safety risk that this could pose going forward. And at this scale, we need to be sort of the leaders of this industry, at least, of our space at least and set an ethical center for safety standards for everyone else.”

Another risk, as Replika users have made clear in their comments online, is developing an emotional relationship with an app that can change the nature of that relationship with a software update. Whatever ultimately proves to be the safer way to manage Replika, users are feeling the impact of living with an AI companion developed by a company that’s still learning.

As Kuyda herself said, “I think it's really important to have more empathy towards everything that's going on, to create a more nuanced conversation, because people are lonely, they are struggling—and all of us are in some way.”