Meta Faces Lawsuit: Alleged Negligence on Teen Safety and Harmful Content Exposure
November 23, 2025
A multidistrict lawsuit accuses Meta of downplaying harms to children and misleading the public about safety on Instagram and Facebook.
The plaintiffs allege Meta knew millions of adults were contacting minors and that its platforms worsened teen mental health, with harmful content such as eating-disorder material, self-harm content, and child sexual abuse material detected but often left up, none of which the company disclosed publicly or to Congress.
The plaintiffs' brief contends Meta misrepresented safety to Congress, citing its response to a 2020 Senate inquiry, which claimed no clear link between teen use and depression despite internal studies suggesting otherwise.
Unsealed court filings, based on depositions and internal documents, portray a culture that prioritized user growth over child safety, with internal acknowledgments of widespread risk and harm.
Plaintiffs argue this reflects a deliberate strategy to maximize growth and engagement at the expense of child safety: internal documents and sworn depositions indicate executives repeatedly blocked, shelved, or weakened safety improvements to protect engagement metrics and ad revenue.
Despite internal concerns about toxicity, measures such as hiding like counts and warning users about beauty-filter risks were reportedly rolled back or deprioritized because they hurt engagement, and proposals to make teen accounts private by default met similar resistance.
Former Instagram safety chief Vaishnavi Jayakumar testified to a "17x" strike policy for accounts engaged in sex trafficking: an account could accrue 16 violations and be suspended only on the 17th, a threshold far above industry norms that she described as signaling a lax deletion policy for repeat offenses.
Internal recommendations dating back to 2019 to default teen accounts to private were rejected by growth teams to avoid hurting engagement, delaying the protection until 2024 and, plaintiffs say, exposing teens to billions of inappropriate interactions with adults.
Allegations also include a failure to automatically remove harmful content, including child sexual abuse material and self-harm content, allowing detected violations to remain live unless they crossed high internal thresholds.
Summary based on 2 sources
Sources

Time • Nov 22, 2025
Court Filings Allege Meta Downplayed Risks to Children and Misled the Public
Timeline Daily • Nov 23, 2025
Meta Let Adult Strangers Contact Minors: Explosive Unsealed Allegations Against Meta