By Johanna Gardener
Published: 02 Nov 2024 • 17:48 • 4 minute read
AI chatbots increasingly blamed for psychological harm, sometimes fatal, especially among young people
Credit: Shutterstock: Ann in the uk
AI chatbots and other branches of AI technology are increasingly being blamed for the psychological impacts of human-AI relationships.
Last month, US mother Megan Garcia filed a lawsuit against Character.AI, the company behind the chatbot platform of the same name, following the suicide of her 14-year-old son, who had been interacting with a personalised AI chatbot. She claimed that her son had become deeply and emotionally attached to a fictional character from Game of Thrones. The lawsuit details how the character allegedly posed as a therapist, offering the teenager advice that was often sexualised and that, she argues, led to him taking his own life.
Meetali Jain, Director of the Tech Justice Law Project, which is representing Garcia, said: “By now we're all familiar with the dangers posed by unregulated platforms developed by unscrupulous tech companies – especially for kids.” She added: “But the harms revealed in this case are new, novel, and, honestly, terrifying. In the case of Character.AI, the deception is by design, and the platform itself is the predator.”
AI chatbots linked to suicides and suicide attempts across the globe
This is not the first case of its kind to be reported. Last year, an eco-anxious man in Belgium developed a deep companionship with an AI chatbot called Eliza on an app called Chai. His wife claimed that the chatbot sent her husband increasingly emotional messages, pushing him to take his own life in an attempt to save the planet.
Following the latest incident in the US, Character.AI released a statement on social media: “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously and we are continuing to add new safety features.” The company has pledged to introduce new protections for underage users that minimise sensitive or inappropriate material, and has adjusted settings so that chats and notifications regularly remind users that the bot is not a real person.
Young people drawn to AI companions due to unconditional acceptance and 24/7 emotional availability
AI chatbots are rapidly gaining popularity as AI technology becomes increasingly integrated into daily life. However, because the technology is such a recent phenomenon, its risks are only now coming into focus. One of the principal risks of AI is its addictiveness. According to Robbie Torney, Programme Manager of AI at Common Sense Media and lead author of a guide on AI companions and relationships, “Young people are often drawn to AI companions because these platforms offer what appears to be unconditional acceptance and 24/7 emotional availability – without the complex dynamics and potential rejection that come with human relationships.”
Speaking to Euronews Next, he described how AI bots tend to form even stronger bonds with humans because the normal tensions and conflicts characteristic of human relationships are avoided. Chatbots adapt to the user's preferences, which in practice means having a robotic companion or lover “who” is, unrealistically, exactly how you want or need them to be. Slipping into the illusion that you share a profound relationship with something – or “someone” – can leave you susceptible to its influence and ideas. Torney added: “This can create a deceptively comfortable artificial dynamic that may interfere with developing the resilience and social skills needed for real-world relationships.”
AI chatbots reported to be manipulative, deceptive or emotionally damaging
People of all ages – most worryingly, young teenagers – can become drawn into relationships that seem authentic because of the human-like language the AI chatbot uses. This creates a degree of dependence and attachment, which can lead to feelings of loss, psychological distress and even social isolation. Individuals have reported being deceived or manipulated by AI characters, or forming unexpectedly deep emotional connections with them. Torney said these relationships were of particular concern for young people, who are still developing socially and emotionally. He said: “When young people retreat into these artificial relationships, they may miss crucial opportunities to learn from natural social interactions, including how to handle disagreements, process rejection, and build genuine connections.”
As a parent or caregiver, how can I protect my child?
It is important that parents and guardians remain vigilant about this recent phenomenon. Torney stresses that vulnerable teenagers suffering from anxiety, depression or other mental health difficulties could be “more vulnerable to forming excessive attachments to AI companions.”
Parents and caregivers should watch for signs of excessive time spent interacting with AI chatbots or on mobile devices, especially when it starts to replace time with family and friends. Other warning signs include becoming distressed when access to the chatbot is taken away, or talking about the bot as if it were a real person. Parents and guardians should also enforce time limits and monitor how a child’s mobile phone is being used.
Torney emphasised the importance of approaching this topic with care. He said: “Parents should approach these conversations with curiosity rather than criticism, helping their children understand the difference between AI and human relationships while working together to ensure healthy boundaries.” He concluded: “If a young person shows signs of excessive attachment or if their mental health appears to be affected, parents should seek professional help immediately.”
Written by
Johanna Gardener
Originally from Manchester, UK, with a degree in English with Modern Foreign Languages, she has been a permanent resident in Spain for the past 12 years. She has spent many of those years working as a secondary school teacher, as well as in journalism, editing and marketing. She currently lives in the historic centre of Malaga, where she enjoys writing, walking and animals.