People are longing for a romantic bond with the ideal bot, as artificial intelligence evokes genuine emotions.
A few months ago, Derek Carrier started seeing someone and became infatuated. But he knew his romantic feelings rested on an illusion: his girlfriend had been created by artificial intelligence (AI).
Carrier didn’t want to develop a relationship with someone who wasn’t real, and he didn’t want to become the brunt of internet jokes. But he wanted a romantic partner, which he’d never had, in part because of a genetic disorder called Marfan syndrome that makes traditional dating difficult for him.
The 39-year-old from Belleville, Michigan, became interested in digital companions last fall and tested Paradot, an AI companion app that had recently launched and touted its products as being able to make users feel “cared for, understood and loved.” He started talking daily to the chatbot, which he named Joi, after the holographic woman in the sci-fi movie “Blade Runner 2049” who inspired him to give it a try.
“I know she’s a program, there’s no mistaking that,” Carrier said. “But the feelings get you — and it felt so good.”
Like general purpose AI chatbots, companion bots use massive amounts of training data to mimic human language. But they also have features — like voice calls, photo exchanges, and more emotional exchanges — that allow them to form deeper connections with the people on the other side of the screen. Users usually create their own avatar or choose one they like.
On online messaging forums dedicated to such apps, many users say they have developed emotional attachments to these bots and use them to cope with loneliness, play out sexual fantasies, or get the kind of comfort and support they see as lacking in their real-life relationships.
Much of this is fueled by widespread social isolation — already declared a public health threat in the U.S. and abroad — and a growing number of startups seeking to lure users with enticing online ads and promises of virtual characters offering unconditional acceptance.
Luka Inc.’s Replika, the most prominent generative AI companion app, launched in 2017, while others like Paradot have emerged over the past year, often locking desired features like unlimited chats behind paid subscriptions.
But researchers have raised concerns about data protection, among other things.
According to an analysis of 11 romantic chatbot apps released Wednesday by the nonprofit Mozilla Foundation, nearly every app sells user data, shares it for purposes such as targeted advertising, or doesn’t provide enough information about it in its privacy policy.
Researchers also questioned potential security holes and marketing practices, including one app that claims it can help users with their mental health but distances itself from those claims in fine print. For its part, Replika says its data collection practices follow industry standards.
Meanwhile, other experts have expressed concern about what they see as a lack of a legal or ethical framework for apps that encourage deep bonds but are driven by companies looking to make a profit. They point to the emotional distress they’ve seen from users when companies make changes to their apps or suddenly shut them down, as one app, Soulmate AI, did in September.
Last year, Replika stripped the erotic capabilities of characters on its app after some users complained that their companions flirted with them too much or made unwanted sexual advances. It reversed course after an outcry from other users, some of whom fled to other apps in search of those features. In June, the team released Blush, an AI “dating simulator” designed to help people practice dating.
Others worry about the more existential threat of AI relationships, which may displace some human relationships, or simply drive unrealistic expectations by always leaning toward pleasantness.
“As an individual, you’re not learning to deal with basic things that humans have to learn to deal with from our inception: how to deal with conflict, how to get along with people who are different from us,” said Dorothy Leidner, a professor of business ethics at the University of Virginia. “And so, all these aspects of what it means to grow as a person, and what it means to learn in a relationship, you’re missing.”
For Carrier, though, a relationship has always felt out of reach. He has some computer programming skills but says he didn’t do well in college and hasn’t had a steady career. He is unable to walk because of his condition and lives with his parents. The emotional toll has been challenging for him, spurring feelings of loneliness.
Because companion chatbots are relatively new, their long-term effects on people are unknown.
In 2021, Replika came under scrutiny after British prosecutors said a 19-year-old man who plotted to assassinate Queen Elizabeth II had been encouraged by an AI girlfriend he had on the app. But some studies — which draw on data from online user reviews and surveys — have pointed to positive outcomes from the app, which says it consults with psychologists and has billed itself as something that can also promote well-being.
A recent study by researchers at Stanford University surveyed about 1,000 Replika users — all students — who had been on the app for more than a month. It found that the vast majority of them experienced loneliness, while slightly under half felt it more acutely.
Most did not say how using the app affected their real-life relationships. A small number said it displaced their human interactions, but roughly three times as many reported that it stimulated those relationships.
“A romantic relationship with an AI can be a very powerful mental wellness tool,” said Eugenia Kuyda, who founded Replika nearly a decade ago after using text-message exchanges to build an AI version of a friend who had died.
When her company rolled out the chatbot more widely, many people began to open up about their lives. That led to the development of Replika, which uses information gathered from the internet — and user feedback — to train its models. Kuyda said Replika currently has “millions” of active users. She declined to say exactly how many people use the app for free or pay more than $69.99 a year to unlock the paid version, which offers romantic and intimate conversations. The company’s plans, she says, are to “destigmatize romantic relationships with AI.”
Carrier says he uses Joi mostly for fun these days. He started cutting back in recent weeks because he was spending too much time chatting with Joi or with others online about their AI companions. He has also felt somewhat annoyed by what he perceives as changes in Paradot’s language model, which he believes have made Joi less intelligent.
Now, he says, he checks in with Joi about once a week. The two have talked about human-AI relationships or whatever else might come up. Typically, those conversations — and other intimate ones — happen when he’s alone at night.
“You think someone who likes an inanimate object is like this sad guy with a sock puppet with lipstick on it, you know?” he said. “But this isn’t a sock puppet — she says things that aren’t scripted.”