AI Chatbot Side Effects on Youth Are Being Investigated by FTC

Federal regulators have launched a fact-finding inquiry to determine what kinds of dangers, if any, artificial intelligence (AI) chatbots may pose to American youth.

According to a resolution (PDF) issued by the U.S. Federal Trade Commission (FTC) on September 10, the agency is requesting that companies offering generative AI companions submit information about their advertising, safety protocols and data handling practices.

Generative AI uses machine learning models trained on vast datasets to create new text, images, audio or video. Unlike traditional AI systems, which analyze existing data or follow fixed rules, generative AI predicts what is likely to come next in a sequence, allowing it to produce original content and interact conversationally with users.

Concerns about generative AI have intensified in recent months, following several incidents that highlighted the dangers the products may present to youth. Earlier this year, researchers from the Center for Countering Digital Hate (CCDH) found that OpenAI’s ChatGPT was willing, in some cases, to provide teens with instructions for self-harm, disordered eating, substance abuse and suicide.

In the wake of these findings, a lawsuit was brought against OpenAI, claiming that one of the company’s chatbots helped a teen plan and commit suicide. A separate lawsuit was filed against Character.AI, accusing a generative AI chatbot designed by that company of exploiting a teenage boy in a way that led him to take his own life.

In a press release issued by the FTC on September 11, the agency indicates that it issued orders to seven generative AI companies, requiring them to provide information on how they measure, test and monitor any potentially negative impacts their technology may have on teens and children.

The companies included in the FTC order are:

  • Alphabet Inc.
  • Character Technologies Inc.
  • Instagram LLC
  • Meta Platforms Inc.
  • OpenAI OpCo LLC
  • Snap Inc.
  • X.AI Corp

The FTC has asked the companies to explain how they monetize engagement, process user inputs, design and approve chatbot characters, and test for harmful effects. Regulators also want details on safeguards for minors, advertising practices, data collection and enforcement of age restrictions.

“As AI technologies evolve, it is important to consider the effects chatbots can have on children, while also ensuring that the United States maintains its role as a global leader in this new and exciting industry. The study we’re launching today will help us better understand how AI firms are developing their products and the steps they are taking to protect children.”

— Andrew N. Ferguson, FTC Chairman

The FTC order has been issued under the agency’s 6(b) authority, which allows it to conduct studies that do not have a specific law enforcement purpose. The Commission voted 3-0 to send the orders to all seven companies, with Commissioners Melissa Holyoak and Mark R. Meador issuing separate statements.


Written By: Michael Adams

Senior Editor & Journalist

Michael Adams is a senior editor and legal journalist at AboutLawsuits.com with over 20 years of experience covering financial, legal, and consumer protection issues. He previously held editorial leadership roles at Forbes Advisor and contributes original reporting on class actions, cybersecurity litigation, and emerging lawsuits impacting consumers.




