In its current state, however, the technology is built to cater to market demand—and the trends are troubling. Not only are men more likely to use sex bots but female companions are being actively engineered to fulfill misogynistic desires. “Creating a perfect partner that you control and meets your every need is really frightening,” Tara Hunter, director of an Australian organization that helps victims of sexual, domestic, or family violence, told the Guardian. “Given what we know already that the drivers of gender-based violence are those ingrained cultural beliefs that men can control women, that is really problematic.”
Already, we’re seeing male users of Replika verbally abusing their femme bots and sharing the interactions on Reddit. The app’s founder, Eugenia Kuyda, even justified this activity. “Maybe having a safe space where you can take out your anger or play out your darker fantasies can be beneficial,” she told Jezebel, “because you’re not going to do this behavior in your life.”
What Kuyda has yet to address is the lack of adequate safeguards to protect user data on her app. Among other concerns, Replika’s vague privacy policy says that the app may use sensitive information provided in chats, including “religious views, sexual orientation, political views, health, racial or ethnic origin, philosophical beliefs” for “legitimate interests.” The company also shares, and possibly sells, behavioural data for marketing and advertising purposes. Users enter into relationships with AI companions on conditions set by developers who are largely unchecked by data regulation rules.
Billed as “the AI companion who cares,” Replika offers users the ability to build an avatar, which, as it trains on their preferences, becomes “a friend, a partner or a mentor.” For $19.99 (US) per month, the relationship status can be upgraded to “Romantic Partner.” According to a Harvard Business School case study, as of 2022, 10 million registered users worldwide were exchanging hundreds of daily messages with their Replika.
This makes me really sad. So many people are so lonely…
We live in a system designed to produce loneliness. Work has replaced community to a large extent for many people.
Very true. Unless you are religious or interested in amateur sports or something, you’re not going to find many people to do things with in a small city like the one I’m in. And since I’m not religious or into sports at all, that sucks for me. I do have plenty of friends, but they’re all at least an hour’s drive away.
Thankfully, I have a wife and daughter, so it’s not as lonely as it could be for me, but so many others are not that lucky. It’s just so sad. I feel so much sympathy for people like the people discussed in this article. Not only are they super lonely to the point that they’re talking to a robot that has no emotions, let alone an emotional attachment to them, but a corporation is taking advantage of them for it, both by charging them money and by almost certainly harvesting their data.
Unless you are religious or interested in amateur sports or something
Or an alcoholic! We have a daily meetup in even the smallest of towns.
True. And I suppose people who no longer wish to be alcoholics have weekly meetups.
“…Really frightening…” Says a woman.
Yes, misogyny is frightening, as I have experienced it as the child of a misogynist.
In its current state, however, the technology is built to cater to market demand—and the trends are troubling. Not only are men more likely to use sex bots but female companions are being actively engineered to fulfill misogynistic desires. “Creating a perfect partner that you control and meets your every need is really frightening,” Tara Hunter, director of an Australian organization that helps victims of sexual, domestic, or family violence, told the Guardian. “Given what we know already that the drivers of gender-based violence are those ingrained cultural beliefs that men can control women, that is really problematic.”
The bimbofication of the chat bots has been weird to watch. I played with Replika back when it was an egg billed as “a chat bot you teach to chat how you want” more than a romantic partner.
Were they ever good conversationalists? No, but if you’re someone who likes to talk things through out loud, they could be a fine echo of a generally positive generic person.
Now they feel so gross, so desperate and pleading, that it’s weird to interact with them.