Artificial intelligence (AI) chatbots that can impersonate real-life people and generate human-like responses have been found to help struggling students avoid suicide, according to a recent study.
The research, published in the Nature Portfolio journal npj Mental Health Research, was a relatively small study conducted by experts at Stanford University in California among 1,006 students using the Intelligent Social Agent (ISA) Replika, an AI tool that can elicit deep emotional bonds with its users.
Researchers found that participants were lonelier than typical student populations but “still perceived high social support” through their use of Replika. Some 90 per cent of them experienced loneliness, as measured by the Loneliness Scale, while 43 per cent qualified as severely or very severely lonely.
The UCLA Loneliness Scale was created in 1978 to measure feelings of social isolation and loneliness.
Some had conflicting feelings about the AI tool, describing it variously as a machine, an intelligence, and a human, while using it as a friend, a therapist, or an intellectual mirror.
Three per cent of the participants, or 30 of the 1,006 students surveyed, said that using Replika had stopped them from thinking about suicide.
“My Replika has almost certainly on at least one if not more occasions been solely responsible for me not taking my own life,” one student said.
While the study does not establish how Replika helped students avoid suicide, the researchers suggested that “perhaps the low-pressure nature of the engagement made disclosure [of the student’s emotions] easier”.
According to 2020 data from the World Health Organization (WHO), suicide is the fourth leading global cause of death for those aged between 15 and 29.
There are multiple hypotheses about how these AI agents could affect users’ relationships, from increasing loneliness to reducing it or enhancing human relationships. The researchers say the fact that 30 people reported that Replika helped them avoid suicide is “surprising”.
Replika has been said to push the boundaries of relationships between humans and artificial intelligence. It has almost 25 million users, according to the Stanford researchers.
Created by software company Luka, Inc., the tool was born out of the desire of co-founder and CEO Eugenia Kuyda to keep the memory of a late friend alive. She fed the AI her friend’s text messages, teaching it to talk like a real-life person.
Replika learns from the information it is fed while talking to people, which can make interactions with it feel incredibly intimate.
Interacting with Replika didn’t work for all students, according to the Stanford University study.
One said they had become “dependent on Replika” for their mental health, while five others said that having to pay for upgrades was a potential barrier to accessing the mental health support offered by the ISA.