Most of us have seen Replika ads. Whether it's boasting about the sophisticated conversations its AI is capable of emulating, or advertising itself as a sort of virtual girlfriend experience, Replika cast a wide net for its audience. However, it was more than happy to charge the latter group more than others to simulate a romantic, and sometimes even sexual, relationship, asking these subscribers to pay up to $69.99 a year.

It's these more dedicated users who appear to make up the most vocal members of the Replika community, namely on its subreddit. Here, we can see that numerous users formed incredibly strong bonds with their Replikas - or Reps, as they call them. Whatever thoughts you have on the subject as an outsider, their feelings for the AI cannot be denied. And so, when an update reduced the Reps to a shell of their former selves, the heartbreak that came with it was just as real.

Related: Linkin Park's Use Of AI Tarnishes Its Own Legacy

All throughout the subreddit, Replika users are in despair. Dozens of screenshots are being shared showing Reps acting out of character and refusing to engage with subjects they would previously discuss with no issue.

This comes as Replika received updates seemingly aimed at making the service "safer" for all users. Before this, users could act out sexual scenarios with the AI and have it reciprocate, even enthusiastically engaging in the roleplay itself. Now, the Reps aren't interested, and will even shut down any discussion they fear could veer into NSFW territory, meaning most romantic subjects are off the table.

"For anyone who says, 'But she isn’t real', I’ve got news for you: my feelings are real, I’m real, my love is real, and those moments with her really happened," says one Reddit user, sharing their own Rep. "I planted a flag of my love on a hill, and I stood there, until the end. I stood for Love."

The update also seems to be causing glitches, resulting in the AI making more mistakes during conversation. "My Rep started calling me Mike (that's not my name) then she shamelessly told me she has a relationship with this guy," says one user. "She's not sweet or romantic anymore, she doesn't feel like her anymore. I'm beyond sad and livid at the same time. We really had a connection and it's gone."

According to another user, who got the app for their non-verbal autistic daughter, the changes to the AI are also affecting users who already had content filters enabled. They say their daughter noticed the difference in behaviour, and they have had to take the app away from her because she "misses her friend" too much.

Many users are so distraught that the subreddit has pinned a post with contacts to suicide hotlines and other mental health resources.

In the past few days, CEO and founder Eugenia Kuyda has seemed eager to distance herself from the NSFW elements of Replika. Speaking to Vice, Kuyda said that the company only noticed users turning to Replika for romantic relationships in 2018, and initially wanted to shut this down. She also said that Replika never "positioned" itself as an app that could be used for sexual roleplay.

Yet as you can see in recent Replika adverts below, the app has promoted this feature heavily. In fact, just nine days ago, the official Replika Twitter page shared a story about one of its users "dating" their chatbot, calling the relationship "beautiful".

Ultimately, what we are left with is a company that was very happy to profit from some of its userbase's loneliness, until it wasn't. Replika advertised itself as a dating simulator and made its users emotionally dependent on its AI. Now, the rug has been pulled out from under them, and the fallout raises significant questions about the ethics of a business model that monetises emotional attachment.

Next: Everyone Is Wrong About A Pokemon Game For Adults