Instagram Teen Safety Concerns Highlighted in New Report

A new whistleblower report accuses Instagram of deliberately designing features that endanger minors, finding that nearly two-thirds of its teen safety tools fail to work and leave children vulnerable to harmful content and abuse.

The report was released last week by Arturo Béjar, a former Meta executive who worked on child safety issues at the company, in partnership with watchdog groups Cybersecurity for Democracy, Fairplay, the Molly Rose Foundation and ParentsSOS.

Launched in 2010, Instagram has grown into one of the most widely used social media platforms in the world, with more than 2 billion monthly active users. Now a central part of Meta’s business model, the photo- and video-sharing app generates tens of billions of dollars in advertising revenue each year by keeping users engaged and scrolling. 

However, that influence has also placed the platform at the center of growing scrutiny. Instagram and other apps owned by Meta, alongside TikTok, YouTube and Snapchat, now face hundreds of social media addiction lawsuits filed by families who claim the companies knowingly exploit human psychology to keep children online longer, fostering compulsive use at the expense of their mental health. 

Many lawsuits allege that this deliberate design has fueled a youth mental health crisis, contributing to anxiety, depression, eating disorders, and in some cases, attempted or completed suicides.

In the whistleblower report, Béjar’s analysis found that most of Instagram’s well-publicized safety tools are broken, discontinued or easy to bypass.

Of the 47 safety features tested, researchers determined that 64% were ineffective, 19% had major limitations, and only 17% worked as promised.

Despite Meta’s claims, investigators discovered that teen accounts could still access suicide, self-harm and eating disorder content. Instagram’s autocomplete even suggested search terms that directed teens to harmful material. Test accounts also received recommendations for sexual content, violent videos and body-image posts, even when the strictest safety settings were enabled.

The report highlights disturbing evidence that Instagram’s algorithm encouraged children under 13 to post sexualized content in exchange for likes and views. Some posts, uploaded by children as young as six, drew tens of thousands of views and predatory adult comments. Although Meta formally bans users under 13, researchers estimated that millions of young children remain active on the platform.

Instagram Messaging and Safety Tool Failures

The testing also revealed major flaws in messaging restrictions. Adults were still able to contact teens who did not follow them, undermining promised protections. In some cases, minors were rewarded with a “rain of emojis” for enabling disappearing messages, a feature critics say predators exploit because it erases conversations and makes abuse harder to report.

Tools designed to reduce compulsive use were equally ineffective. Meta’s heavily promoted “Take a Break” feature appears to have been discontinued, while “Nighttime Nudges” failed to activate. Teens could not set their own usage limits, and changing notification settings required navigating 50 toggles across 10 different screens.

Parental controls misled families by showing sanitized activity summaries while concealing that violent, sexual and self-harm content was still being pushed to children.

“Parents should know, the Teen Accounts charade is made from broken promises. Kids, including many under 13, are not safe on Instagram. This is not about bad content on the internet, it’s about careless product design. Meta’s conscious product design and implementation choices are selecting, promoting, and bringing inappropriate content, contact, and compulsive use to children every day.”

— Arturo Béjar

The report urges Meta to overhaul its safety measures and calls on regulators to act against what it describes as misleading claims, specifically recommending passage of the Kids Online Safety Act (KOSA) in the U.S. and stronger enforcement of the Online Safety Act in the U.K., warning that self-regulation has failed.

Social Media Addiction Lawsuits

The concerns raised in Béjar’s report come against the backdrop of a rapidly expanding wave of social media addiction litigation. Families, school districts and public entities across the country are pursuing claims that platforms like Instagram, TikTok, YouTube and Snapchat were deliberately engineered to maximize user engagement, fueling a youth mental health crisis.

At the federal level, these lawsuits have been centralized in a multidistrict litigation (MDL) before U.S. District Judge Yvonne Gonzalez Rogers in the Northern District of California, while a parallel consolidation of state claims is underway in California Superior Court under Judge Carolyn Kuhl.

In the federal litigation, Judge Gonzalez Rogers has identified six school district social media addiction lawsuits that will serve as the first test trials. However, the parties informed the court earlier this month that they could not agree on which lawsuit should lead off.

Plaintiffs are pushing for the Tucson Unified School District’s case to be tried first, noting it is ready for trial and raises central allegations that are common across the litigation. Meanwhile, defendants argue that the Irvington School District’s case should go first, saying the smaller district is more representative of most claims filed by school systems nationwide.

While the outcomes of these early bellwether trials will not be binding on other lawsuits in the MDL, they are expected to heavily influence how juries respond to recurring evidence and testimony. The results could shape the direction of any eventual settlement negotiations.

The very first trial involving claims of social media addiction is set to take place in California state court, beginning on November 24, 2025, while the federal bellwether trials are not expected to start until late 2026.

Image Credit: Primakov / Shutterstock.com

Written By: Michael Adams

Senior Editor & Journalist

Michael Adams is a senior editor and legal journalist at AboutLawsuits.com with over 20 years of experience covering financial, legal, and consumer protection issues. He previously held editorial leadership roles at Forbes Advisor and contributes original reporting on class actions, cybersecurity litigation, and emerging lawsuits impacting consumers.
