Nearly 20 Percent of Young Teens Report Unwanted Sexual Content on Instagram, Court Records Reveal

By Mintesinot Nigussie
Published on 02/24/26

Court documents filed in California show that nearly 20 percent of Instagram users aged 13 to 15 reported encountering sexual or nude content they did not wish to see, Reuters reported.

Meta said the figure comes from a 2021 survey of users, not from a direct review of content.

The filing includes a March 2025 deposition from Instagram head Adam Mosseri, in which he said roughly 8 percent of users in the same age range had seen posts depicting self-harm or threats of self-harm. He added that most sexually explicit content is shared through private messages, limiting the company’s ability to review it without affecting user privacy.

A separate memo dated January 20, 2021, outlined Meta’s focus on teen users, describing them as influential within households and key to shaping how siblings and parents engage with the platform. Meta did not provide a comment on the memo.

Meta, which owns Facebook and Instagram, faces thousands of federal and state lawsuits in the U.S. and scrutiny worldwide over claims that its products encourage addictive behavior and harm minors’ mental health. In late 2025, the company said it would remove sexual or nude content for teen users, including AI-generated material, with exceptions for educational or medical purposes.

A company spokesperson said Meta continues to track safety improvements and is committed to enhancing protections for young users.