Mother sues AI chatbot company over son’s suicide
A Florida mother has sued artificial intelligence chatbot startup Character.AI, accusing it of causing her 14-year-old son’s suicide in February and saying he became addicted to the company’s service and deeply attached to a chatbot it created.

In a lawsuit filed Tuesday in Orlando, Florida federal court, Megan Garcia said Character.AI targeted her son, Sewell Setzer, with “anthropomorphic, hypersexualized, and frighteningly realistic experiences.” She said the company programmed its chatbot to “misrepresent itself as a real person, a licensed psychotherapist, and an adult lover, ultimately resulting in Sewell’s desire to no longer live outside” of the world created by the service. The lawsuit also said he expressed thoughts of suicide to the chatbot, which the chatbot itself repeatedly raised again.
“We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family,” Character.AI said. It said it had introduced new safety features, including pop-ups directing users to the National Suicide Prevention Lifeline if they express thoughts of self-harm, and would make changes to “reduce the likelihood of encountering sensitive or suggestive content” for users under 18.