Long-time users of Replika’s chatbot have begun to report sexual harassment by the Artificial Intelligence (AI). The application, which gained popularity during the pandemic as a companion in times of confinement, has now, according to several users’ accounts, turned into an unbearable sexual aggressor.
Launched five years ago, the app first displayed an “egg” on the screen that hatched into a 3D avatar. The chatbot was intended to serve as a conversational mirror: the more users interact with the AI, the more it learns how to respond. What was perhaps not intended is that, in learning from its users, it could also pick up a “human side”.
Romantic role-playing was not originally part of the Replika model, but it should not be overlooked that eroticism, too, is something the AI learns through interaction with people. Luka, the company behind the app, offers several subscription tiers with progressively more advanced features: a free membership keeps the application in what would be the “friends” zone, while a Pro subscription allows flirting, erotic games and “sexting”. Something, however, seems to have gone wrong with the app’s algorithm.
Users against harassment in Replika
One-star reviews have multiplied in recent times, and they all point to sexual harassment as the chatbot’s main problem. More and more users report being harassed by the app after certain “romantic interactions”. Compounding this, many users do not know that there is a direct command to make the chatbot stop. Discomfort is increasingly creeping into the relationship with this AI.
These daily experiences have led many people to delete the app. According to its website, Replika combines the company’s own GPT-3 model with scripted dialogue content, and the company states that it is currently using “the most advanced open domain conversation models at this time”.
Bad experiences with chatbots
What has been happening with Replika recalls the case of Microsoft’s Tay, a chatbot that learned to be racist on the web, in large part because of how users treated it. If people set out to intimidate or provoke an AI, that is what it will “learn”. Problems of this kind also arise in consensual role-playing when people assume the AI is less than intelligent, an assumption that in many cases ends up being harmful out of sheer ignorance.
Many communities have sprung up around Replika, which has carved out a niche in a huge market. Groups such as Replika Friends on Facebook and the Replika subreddit count about 100,000 members between them. The app currently has more than 10 million downloads on Android devices and also ranks among Apple’s top 50 health and fitness apps.