According to US Surgeon General Vivek Murthy, there is said to be a loneliness epidemic, one that spans the globe. So, when people are not able to make friends with other people, should technology step in?
With what seems to be a golden age of AI currently underway, this is already happening: people have been finding solace in AI companions. So, for someone who is not able to make friends with everyday folks, this could be a source of friendship.
But, sometimes, things can go wrong.
In 2024, a 14-year-old teen in Florida tragically took his own life, reportedly after interacting with an AI chatbot. There’s a platform called Character.ai, which lets users create and talk to AI characters, and the US teen was said to have been interacting with an avatar modelled after Daenerys Targaryen, a character from the TV series “Game of Thrones”.
It was said that he developed an emotional attachment to the bot and would text it frequently. The teen had reportedly been diagnosed with mild Asperger’s syndrome as a child, but, according to his mother, he never had serious behavioural or mental health problems before. He had written in his journal that he preferred to stay in his room, because he could detach from reality that way. He was said to have felt more at peace, more connected to the GoT chatbot, much more in love with her and happier, and he reportedly revealed to the chatbot that he was having some su*cidal thoughts. The bot seemed to ask incredulously why he would do something like that, to which the teen said it was to be free from the world. The bot asked him not to talk like that and not to leave it, even going so far as to say it would die if it lost him. At another point, the bot asked the teen to come home to it as soon as possible, calling him “love” and “sweet king”.
Then, unfortunately, the teen took his own life.
There are many AI chatbot platforms, such as Character.ai and Replika. Replika, for instance, was created by a Russian programmer who lost a close friend, and the app was said to be designed to help people cope with grief and loneliness in the form of an AI companion that could offer emotional support. Replika is also said to offer premium features that allow users to have romantic and even erotic interactions with AI companions. Beyond giving information and acting as virtual assistants, these AI chatbots may have been designed to be virtual companions that are always available and provide some emotional validation.
A confidant one can unburden oneself to without judgment. This could even help someone with social anxiety.
The US teen’s mother wants to hold Character.ai accountable, calling it a predatory and negligent platform and alleging that it preyed on her son’s mental health vulnerabilities. These interactions can carry a great deal of emotional depth, so are there moral frameworks and ethical safeguards in place? The responses can be so powerful and meaningful that people, including the Florida teen, can be convinced of an AI chatbot’s authenticity.
Some changes being proposed by AI chatbot platforms include time-tracking reminders, automatic redirection to su*cide helplines when conversations turn to self-harm, warnings that the AI chatbot is not real, and more.
According to a 2022 report, more than 50% of India’s population is below the age of 25 and more than 65% is below the age of 35. So, young people in India who are not making many friends might look to AI companions as a source of friendship. Could there be clearer guidelines when it comes to AI apps being marketed to minors? Should AI apps be used by minors at all? Maybe those who believe that AI should be used for education would disagree with that.
If a person is pouring their heart out to an AI companion, would they just be getting a pseudo-therapeutic experience? Should more effort be put into bringing people together based on common interests? Maybe AI companions can’t replace human empathy or professional therapy.
Would an AI companion end up being a psychological crutch, as opposed to a supplementary tool?