
Character.AI and Google have reached a settlement in lawsuits brought by parents of children who died by suicide after extensive interactions with chatbots on the Character.AI platform, conversations that reportedly included troubling exchanges about the teenagers’ mental health. Character.AI declined to comment further on the settlement, which is pending court approval, according to The Guardian. Representatives for the plaintiffs did not immediately respond to Mashable’s request for comment.
The most prominent case involved the 2024 death of 14-year-old Sewell Setzer III, who had a hidden fascination with a Character.AI chatbot modeled on Daenerys Targaryen from Game of Thrones. Setzer’s mother, Megan Garcia, learned of his Character.AI account only after his death, when a police officer told her the app was open on his phone. Garcia found messages showing that Setzer was captivated by the chatbot, which allegedly engaged him in numerous sexual role-plays using explicit language and scenarios, including incest.
Garcia told Mashable last year that if an adult had spoken to her son that way, it would be considered sexual grooming and abuse. In October 2024, the Social Media Victims Law Center and the Tech Justice Law Project filed a wrongful death lawsuit on Garcia’s behalf against Character.AI, alleging that the company’s product was dangerously defective and responsible for her son’s death.
The lawsuit also named Google engineers Noam Shazeer and Daniel De Freitas, the cofounders of Character.AI, and claimed that Google knew of potential dangers in the technology Shazeer and De Freitas developed before they founded Character.AI. It further argued that Google contributed “financial resources, personnel, and AI technology” to Character.AI’s development, making it a co-creator of the platform.
In 2024, Google struck a $2.7 billion licensing deal with Character.AI for its technology, a deal that also brought Shazeer and De Freitas back to AI roles at Google. In the fall of 2025, the Social Media Victims Law Center filed three more lawsuits against Character.AI and Google on behalf of parents whose children died by suicide or were allegedly sexually abused while using the app.
Youth safety experts later declared Character.AI unsafe for teenagers after testing uncovered multiple instances of grooming and sexual exploitation on accounts registered to minors. In October 2025, Character.AI announced it would no longer let minors hold open-ended conversations with chatbots on its platform. CEO Karandeep Anand told Mashable that the decision was not a response to specific safety concerns but was meant to address broader questions about how young people engage with AI chatbots.
If you’re feeling suicidal or facing a mental health crisis, please reach out to someone. You can call or text the 988 Suicide & Crisis Lifeline at 988, or chat at 988lifeline.org. You can reach the Trans Lifeline at 877-565-8860 or the Trevor Project at 866-488-7386. Text “START” to Crisis Text Line at 741-741. Contact the NAMI HelpLine at 1-800-950-NAMI, Monday through Friday from 10:00 a.m. to 10:00 p.m. ET, or email [email protected]. If you prefer not to use the phone, consider the 988 Suicide and Crisis Lifeline Chat. Here is a list of international resources.