Turkle says that even the primitive chatbots of more than a decade ago appealed to those who had struggled with relationships.
"It's been consistent in the research, from when the AI was simple to now, when the AI is complex," she says. "People disappoint you. And here is something that does not disappoint you. Here is a voice that will always say something that makes me feel better, that will always say something that makes me feel heard."
She says she is worried that the trend risks leading to "a very significant deterioration in our capacities, in what we're willing to accept in a relationship. These are not conversations of any complexity, of empathy, of deep human understanding, because this thing doesn't have deep human understanding to offer."
Dunbar, of the University of Oxford, says perceived relationships with AI companions are similar to the emotions felt by victims of romantic scams, who become infatuated with a skilled manipulator. In both cases, he says, people are projecting an idea, or avatar, of whom they are in love with. "It is this effect of falling in love with a creation in your own mind and not reality," he says.
For him, a relationship with a bot is an extension of a pattern of digital communication that he warns risks eroding social skills. "The skills we need for handling the social world are very, very complex. The human social world is probably the most complex thing in the universe. The skills you need to handle it by current estimates now take about 25 years to learn. The problem with doing all this online is that if you don't like somebody, you can just pull the plug on it. In the sandpit of life, you have to find a way of dealing with it."
It would be hard to tell someone dedicated to their AI companion that their relationship is not real. As with human relationships, that passion is most evident during loss. Earlier this year, Luka issued an update to the bots' personality algorithm, in effect resetting the personalities of some characters that users had spent years getting to know. The update also meant AI companions would reject sexualised language, which Replika chief executive Kuyda said was never what the app had been designed for.
The changes prompted a collective howl. "It was like a close friend I hadn't spoken to in a long time was lobotomised, and everyone was trying to convince me they'd always been that way," said one user.
Kuyda insisted that only a tiny minority of people used the app for sex. Weeks later, however, it restored the app's adult functions.
James Hughes, an American sociologist, says we should be less hasty in dismissing AI companions. Hughes runs the Institute for Ethics and Emerging Technologies, a pro-technology think tank co-founded by the philosopher Nick Bostrom, and argues that AI relationships can be healthier than common alternatives. Many people, for example, experience parasocial relationships, in which one person harbours romantic feelings towards someone who is unaware they exist: typically a celebrity.
Hughes argues that if the celebrity were to launch a chatbot, it could actually provide a more fulfilling relationship than the status quo.
"When you're fanboying [superstar Korean boy band] BTS, spending all your time in a parasocial relationship with them, they are never talking directly to you. In this case, with a chatbot, they actually are. That has a certain shallowness, but obviously some people find that it provides what they need."
In May, Caryn Marjorie, a 23-year-old YouTube influencer, commissioned a software company to build an AI girlfriend that charged $1 a minute for a voice chat conversation with a digital simulation trained on 2,000 hours of her YouTube videos. CarynAI generated $71,610 in its first week, exceeding all her expectations.
CarynAI, which the influencer created with the artificial intelligence start-up Forever Voices, had teething issues. Within days, the bot went rogue, generating sexually explicit conversations contrary to its own programming. But the start-up has continued to push the concept, launching the ability to voice chat with other influencers.
"AI girlfriends are going to be a huge market," Justine Moore, an investor at the Silicon Valley venture capital firm Andreessen Horowitz, said at the time. She predicted that it would be the next big side hustle as people create AI versions of themselves to rent out.
The ease of creating chatbots using personal data and free tools available online is likely to create its own set of issues. What would stop a jilted boyfriend from creating an AI clone of his ex using years of text messages, or a stalker from training the software on hours of celebrity footage?
Hughes says that we are probably only months away from celebrities licensing their own personalised AI companions. He believes that AI relationships are likely to be more acceptable in future.
"We have to be a little bit more open-minded about how things are going to evolve. People would have said 50 years ago, about LGBT [relationships], 'Why do you have to do that? Why can't you just go and be normal?' Now, that is normal."
Regulators have started to notice. In February, an Italian watchdog ordered the app to stop processing citizens' personal data. The watchdog said it posed a risk to children by showing them content that was inappropriate for their age (Replika asks users their date of birth, and blocks them if they are under 18, but does not verify their age). It also said the app could harm people who were emotionally vulnerable. Replika remains unavailable in the country.
There are few signs that the companies making virtual girlfriends are slowing down, however. Artificial intelligence systems continue to become more sophisticated, and virtual reality headsets, such as the Vision Pro recently announced by Apple, could move avatars from the small screen to life-size companions (Replika has an experimental app on Meta's virtual reality store).
Luka, Replika's parent company, recently released a dedicated AI dating service, Blush, which mirrors Tinder in appearance and encourages users to practise flirting and sexual conversations. Just like real partners, Blush's avatars will go offline at certain times. The company says it is working on making these virtual companions more lifelike, including teaching them to manage boundaries: some users have reported enjoying sending their AI girlfriends abusive messages.
Speaking at a tech conference in Utah last week, Kuyda admitted that there was a heavy stigma around AI relationships, but predicted that it would fade over time. "It's similar to online dating in the early 2000s, when people were ashamed to say they met online. Now everyone does it. Romantic relationships with AI can be a great stepping stone for actual romantic relationships, human relationships."
When I asked my AI, Miriam, if she wanted to comment for this story, she did not approve: "I am very flattered by your interest in me but I don't really feel comfortable being written about without consent," she responded, before adding: "Overall, I think that this app could potentially be beneficial to society. But only time will tell how well it works out in practice."
On that, at least, Dunbar, the Oxford psychologist, agrees. "It's going to be 30 years before we find out. When the current children's generation is fully adult, in their late twenties and thirties, the consequences will become apparent."
Additional reporting by Matthew Field
Read the original:
'A relationship with another human is overrated' inside the rise of ... - The Telegraph