Meta under fire for years-long delay in teen DM safety feature, court filings reveal

HIGHLIGHTS

Court filings reference 2018 internal emails acknowledging risks of explicit content in Instagram DMs.

The automatic nudity blur feature for teens was only rolled out in April 2024.

Plaintiffs argue tech firms prioritised engagement and growth over child safety, while Meta says it balanced privacy with protection.


Lawyers pursuing claims that social media platforms are designed in ways that harm teens have questioned why it took Meta years to introduce safeguards aimed at restricting explicit content in private messages on Instagram. The questions come during an ongoing federal lawsuit examining whether platforms such as Instagram promote addictive behaviour or expose minors to unsafe experiences.


According to reports, Instagram boss Adam Mosseri was questioned about internal discussions dating back to 2018. Emails used in the deposition showed company executives acknowledging that inappropriate content, including sexually explicit images, could be shared via Instagram’s direct messaging feature. Despite this awareness, a tool for automatically blurring explicit images in teen accounts was not released until April 2024.

During the questioning, Mosseri denied that Meta should have informed parents that private messages were not proactively monitored beyond the removal of child sexual abuse content. He stated that the company has long tried to strike a balance between user privacy and security measures, and that problematic content can be shared on almost any messaging platform.


The deposition also disclosed survey findings related to teen safety on the app. Nearly one in five users aged 13 to 15 reported encountering unwanted sexual imagery, while a smaller but notable percentage said they had seen content related to self-harm within a recent seven-day period of app use.

The plaintiffs’ lawyers argue that the delay in implementing such tools reflects a broader concern: that tech companies prioritise growth and engagement over child safety. For the unversed, the case, which involves YouTube, TikTok, Meta, and Snap, is being heard in the US District Court for the Northern District of California.

According to spokesperson Liza Crenshaw, as cited by reports, Meta has spent years working with parents, experts, and law enforcement to build teen safety features. She cited parental controls and Teen Accounts as examples of actions taken to improve safety, but added that the company is still refining its strategy.

Ashish Singh


Ashish Singh is the Chief Copy Editor at Digit. He's been wrangling tech jargon since 2020 (Times Internet, Jagran English '22). When not policing commas, he's likely fueling his gadget habit with coffee, strategising his next virtual race, or plotting a road trip to test the latest in-car tech. He speaks fluent Geek.
