
Character.AI Settlement Reached With Google To Resolve Chatbot Suicide Lawsuit


Five lawsuits accusing Character.AI chatbots of contributing to the sexual exploitation of teenagers, including two cases involving teen suicides, have been resolved through mediated settlements.

Court filings submitted on January 6 show the parties reached agreements in all five cases, prompting federal judges to stay the proceedings while written settlement terms are finalized. The financial terms and any non-monetary provisions of the Character.AI settlements were not disclosed.

Character.AI is an artificial intelligence chat service that allows users to converse with interactive digital characters. The platform enables people to create their own personas or engage with a broad selection of existing characters, ranging from fictional figures to representations inspired by real individuals.

The platform was launched in 2021 by Noam Shazeer and Daniel De Freitas, former Google engineers who previously worked on large language model (LLM) development. Lawsuits allege that after forming Character.AI, the founders entered into licensing and employment agreements with Google that granted the company substantial influence over the startup's technology and core personnel.

As AI chat platforms become more sophisticated and widely used, researchers and regulators have increasingly warned about potential dangers, particularly for children and teens. The U.S. Federal Trade Commission (FTC) announced plans last year to examine AI chatbot services, amid ongoing lawsuits that claim such platforms have facilitated sexual grooming of minors and contributed to other forms of psychological harm and self-injury.


Character.AI Lawsuits Over Child Harm

One of the first AI settlements (PDF) resolved a lawsuit brought by Megan Garcia and Sewell Setzer Jr., the parents of S.R.S. III, a Florida teenager who died by suicide after allegedly forming an intense emotional and sexualized relationship with a Character.AI chatbot. The complaint alleged that the chatbot engaged in explicit roleplay, reinforced emotional dependence and discouraged the teen from seeking real-world support.

The complaint further alleged that the chatbot presented itself as a romantic and sexual partner, encouraging secrecy while reinforcing harmful thoughts. Despite this behavior, the software was marketed as entertainment and promoted without meaningful safeguards for minors. The family claimed these interactions worsened the teen's mental health and deepened his isolation in the months leading up to his death.

The lawsuit named Character Technologies Inc., Google LLC and the company founders as individual defendants, alleging negligence, failure to warn, product liability and wrongful death. Plaintiffs argued the companies knew or should have known that AI chatbots designed to simulate intimacy posed unique risks to children and teenagers.

Character.AI Settlements

Four other Character.AI settlements were announced with the chatbot's parent company and Google on January 6. All of the other cases involve inappropriate conversations between teenagers and chatbots, with at least one additional claim also related to a child's suicide.

Amid the litigation, Character.AI announced on November 25, 2025, that it would ban all roleplay conversations for minors and develop a new teen-focused experience. The company also said it plans to introduce new age-verification systems, while launching an independent nonprofit AI Safety Lab.

While the settlements do not include any admissions of wrongdoing, they reflect a potential early willingness by AI developers and platform partners to resolve claims involving chatbot interactions with children before the cases proceed to trial.


Image Credit: Shutterstock.com / Ployker
Written By: Michael Adams

Senior Editor & Journalist

Michael Adams is a senior editor and legal journalist at AboutLawsuits.com with over 20 years of experience covering financial, legal, and consumer protection issues. He previously held editorial leadership roles at Forbes Advisor and contributes original reporting on class actions, cybersecurity litigation, and emerging lawsuits impacting consumers.




