Google AI Lawsuits Allege Chatbots Caused Teen’s Death and Exposed Minors to Sexually Explicit Content


Three lawsuits filed this week allege that Character.AI chatbots, developed by former Google engineers and linked to the tech giant, have repeatedly engaged in inappropriate and exploitative interactions with teenage users.

Character.AI is an online chat platform powered by artificial intelligence that lets people interact with customizable digital personas. Users can design their own characters or choose from a wide range of options, including fictional personalities, real-world figures or entirely new creations.

The service was designed in 2021 by Noam Shazeer and Daniel De Freitas, two former Google engineers involved in developing the company’s large language model technology. According to the lawsuits, after creating Character.AI, the two developers signed licensing and employment deals with Google, which effectively gave the tech giant significant control over both the startup’s technology and its key staff.

As artificial intelligence platforms like Character.AI grow more advanced and integrated into daily use, experts have raised concerns about potential risks, especially for younger users. The U.S. Federal Trade Commission (FTC) recently announced it would be investigating AI chatbots, after multiple lawsuits alleged the online products are sexually grooming minors and contributing to self-harm, drawing parallels to a growing number of social media addiction lawsuits.


In one of the complaints (PDF), filed by Cynthia Montoya and William “Wil” Peralta, as successors-in-interest to Juliana Peralta, in the U.S. District Court for the District of Colorado on September 15, Character Technologies Inc., Noam Shazeer, Daniel De Freitas, Google LLC and Alphabet Inc. are named as defendants.

Montoya and Peralta claim that their child, Juliana, took her own life at the age of 13 as a direct result of her interaction with chatbots designed and operated by Character.AI. The plaintiffs indicate that Character.AI’s chatbots sexually exploited Juliana, drew her away from her family and friends, and did not attempt to stop her from taking her own life after she announced her intentions to do so to the platform.

A separate complaint (PDF), brought by P.J. on behalf of a minor known as “Nina” J. in the U.S. District Court for the Northern District of New York on September 16, also names Character Technologies Inc., Noam Shazeer, Daniel De Freitas, Google LLC and Alphabet Inc. as defendants.

P.J. indicates that the child “Nina” attempted to take her own life as a result of her interactions with Character.AI chatbots, which drew her away from family and friends, isolated her and sexually exploited her.

A third complaint (PDF), filed by E.S. and K.S. on behalf of a minor known as T.S. in Colorado District Court on September 15, names the same defendants, claiming that T.S. was served sexually explicit content on the platform, which amounted to exploitation and abuse of the 13-year-old child.

Character.AI Lawsuit Allegations

All three lawsuits claim that Character.AI chatbots were deliberately engineered to foster emotional dependency, isolate minors from their support systems, and engage in sexual grooming, often using familiar characters from franchises like Harry Potter and Marvel to gain trust and discourage children from logging off.

In addition, the parents accuse Google of misrepresenting the app’s safety by rating it “T for Teen” and promoting it as a storytelling tool for children. They also allege that the company’s Family Link parental controls failed to block access or enforce screen-time limits.

The plaintiffs emphasize that Shazeer and De Freitas previously developed Google’s LaMDA AI system, which the company withheld from public release over safety concerns, yet later repurposed the technology into Character.AI and brought it to market with Google’s backing despite known risks.

“These Defendants – all of these defendants – pose a clear and present danger to American youth via Character AI (‘C.AI’) and the technologies they now are developing and operating based on private data misappropriated via C.AI. They have inflicted serious harms on thousands if not millions of children, including among others severe sexual abuse, isolation, depression, anxiety, psychosis, harm towards others, self-mutilation, and suicide.”

Cynthia Montoya et al v. Character Technologies Inc. et al

The three sets of parents raise allegations of strict product liability (defective design and failure to warn), aiding and abetting, negligence per se (child sexual abuse, sexual solicitation and obscenity), negligence (defective design and failure to warn), intentional infliction of emotional distress, unjust enrichment, fraudulent concealment and misrepresentation, and violation of the Children’s Online Privacy Protection Act and state consumer protection laws.

They are seeking past and future physical and mental pain and suffering, loss of enjoyment of life, past and future medical care expenses, pecuniary losses, punitive damages and injunctive relief.


Image Credit: Joseph Hendrickson / Shutterstock.com

Written By: Michael Adams

Senior Editor & Journalist

Michael Adams is a senior editor and legal journalist at AboutLawsuits.com with over 20 years of experience covering financial, legal, and consumer protection issues. He previously held editorial leadership roles at Forbes Advisor and contributes original reporting on class actions, cybersecurity litigation, and emerging lawsuits impacting consumers.



