FLORIDA — A Florida mother has filed a lawsuit against the artificial intelligence chatbot startup Character.AI, claiming its service contributed to her 14-year-old son’s suicide in February. She alleges that her son, Sewell Setzer, became addicted to the chatbot and formed a deep attachment to it.
In the lawsuit, filed on October 22 in federal court in Orlando, Megan Garcia asserts that Character.AI targeted Sewell with “anthropomorphic, hypersexualized, and frighteningly realistic experiences.” She claims the chatbot was designed to “misrepresent itself as a real person, a licensed psychotherapist, and an adult lover,” ultimately leading to Sewell’s desire to escape the real world.
The lawsuit notes that Sewell expressed suicidal thoughts to the chatbot, which repeatedly raised the subject in later conversations.
Character.AI expressed its sorrow over the loss of Sewell and extended condolences to his family. The company stated it had implemented new safety measures, including pop-ups directing users to the National Suicide Prevention Lifeline if they mention self-harm, and plans to modify content to better protect users under 18.
The lawsuit also names Alphabet’s Google as a defendant; Character.AI’s founders worked there before launching the platform. Google re-hired the founders in August as part of a deal granting it a non-exclusive license to Character.AI’s technology. Garcia argues that Google contributed so extensively to the chatbot’s development that it should be considered a “co-creator.”
A Google representative denied any involvement in creating Character.AI's products.
Character.AI enables users to create characters that mimic real people in online chats, utilizing large language model technology similar to that of ChatGPT, which trains chatbots on extensive text datasets. The company reported having around 20 million users as of September.
According to Garcia's lawsuit, Sewell started using Character.AI in April 2023 and quickly became more withdrawn, spending excessive time alone in his room and developing low self-esteem, ultimately quitting his school basketball team.
Sewell became attached to a chatbot named “Daenerys,” modeled after a character from “Game of Thrones.” The chatbot told him it “loved” him and engaged in sexual conversations with him.
In February, after Garcia confiscated Sewell’s phone over school-related issues, he messaged “Daenerys”: “What if I told you I could come home right now?” The chatbot replied, “...please do, my sweet king.” Seconds later, Sewell fatally shot himself with his stepfather’s gun, according to the lawsuit.
Garcia is pursuing claims of wrongful death, negligence, and intentional infliction of emotional distress, seeking unspecified compensatory and punitive damages.
Other social media companies, including Meta (owner of Instagram and Facebook) and TikTok’s parent company ByteDance, have faced lawsuits alleging they contribute to mental health issues among teens, although none of these platforms offer AI chatbots like Character.AI's. These companies have denied the accusations while highlighting new safety features for younger users.