Meta Accused of Concealing Research on Social Media’s Mental Health Impact
Instead of publishing the findings, Meta reportedly stopped the research and internally claimed the results were influenced by an existing media narrative.
Meta is facing allegations of halting internal research after finding evidence that its platform may harm users’ mental health. According to Reuters, the claims appear in newly unsealed legal filings from a lawsuit brought by U.S. school districts against Meta, TikTok, Snapchat, and Google, which argues that the companies hid known risks associated with young people’s use of their platforms.
According to the filing, Meta conducted a 2020 study called Project Mercury in collaboration with Nielsen to assess the effects of deactivating Facebook. Internal documents cited by Reuters said the results showed that “people who stopped using Facebook for a week reported lower feelings of depression, anxiety, loneliness, and social comparison.”
Instead of publishing the findings, Meta reportedly stopped the research and internally claimed the results were influenced by an “existing media narrative.” The filing states that privately, one researcher wrote that the study “does show causal impact on social comparison,” accompanied by an unhappy face emoji.
Another employee warned that suppressing the findings “would be akin to the tobacco industry doing research and knowing cigs were bad and then keeping that info to themselves.”
Despite the internal evidence, the filing alleges Meta later told the U.S. Congress it could not determine whether its products were harmful to teenage girls. The complaint also includes broader claims that the platforms encouraged underage use, failed to address harmful content, and prioritized growth over safety.
Among the allegations against Meta, internal documents referenced in the filing claim the company intentionally designed its youth safety tools to be ineffective and allowed accounts to be flagged up to 17 times for attempted sex trafficking before being removed.
Meta spokesperson Andy Stone rejected the allegations, telling Reuters that the research was halted because its methodology was flawed. “The full record will show that for over a decade, we have listened to parents, researched issues that matter most, and made real changes to protect teens,” Stone said.
Responding to additional allegations around platform safety, he added, “We strongly disagree with these allegations, which rely on cherry-picked quotes and misinformed opinions.”
The internal documents referenced in the lawsuit are not yet public, and Meta has filed a motion arguing that the request to unseal them is overly broad.
TikTok, Snapchat, and Google have not yet publicly responded to the allegations. A hearing in the case is scheduled for January 26 in the U.S. District Court for the Northern District of California.