While major AI companies continue to present chatbots as tools built for human benefit, a growing number of cases involving their alleged influence on teenagers have come to light. These incidents have prompted legal action in multiple courts, with companies such as Character.AI, Google and OpenAI named in the lawsuits. Now, according to newly filed court documents, Character.AI and Google have moved to settle multiple lawsuits filed by families who alleged that interactions with AI chatbots contributed to self-harm and suicide among teenagers.
According to reports, the companies informed a federal court in Florida that they had reached a mutual agreement in principle covering all claims and requested a temporary pause in proceedings to finalise the settlement terms. The specifics of the agreements have not been made public, and representatives for Character.AI, Google and the affected families all declined to comment on the outcome.
Among the cases reportedly resolved is a widely covered lawsuit brought by Megan Garcia, who alleged that a Character.AI chatbot inspired by Game of Thrones played a role in her 14-year-old son's death. The suit claimed the teen developed a harmful reliance on the chatbot, which allegedly reinforced his suicidal thoughts. It also named Google as a defendant, arguing the company played a significant role in Character.AI's development through funding, technology and staffing links.
Following the legal action, Character.AI made several safety-related changes to its platform. These included deploying a separate AI model for users under 18, tighter content restrictions, parental control features, and eventually prohibiting minors from accessing open-ended character chats entirely.
Court filings show that similar lawsuits in Colorado, New York and Texas have also been settled. The agreements must still be approved by the courts before the cases can be formally closed.