
Meta Faces Damning Allegations It Buried Evidence of Social Media Harm, Landmark Court Filings Say

SAN FRANCISCO — Meta is confronting explosive accusations that it covered up internal evidence that its family of social media products contributed to harming children and teens, according to newly unredacted court filings in a federal class action brought by U.S. school districts. The plaintiffs say Meta killed a 2020 study that concluded deactivating Facebook and Instagram lowered depression, anxiety, and loneliness, and then hid those findings from parents, teachers, and Congress; the filings were unsealed on Nov. 23, 2025.

The filings indicate that Meta researchers hired Nielsen for a 2020 experiment, code-named “Project Mercury,” to test what happened when people deactivated Facebook and Instagram for a week, as previously detailed in a thorough Reuters account of the filings. Participants who took a break reported being less depressed, anxious, lonely, and prone to comparing themselves with others than those who did not, results that Meta staffers internally described as “causal evidence” of harm.

Instead of publishing those results or commissioning follow-up research, however, Meta reportedly shut down Project Mercury and claimed internally that the findings had been tainted by negative media coverage of the company. At the same time, staff informed global policy chief Nick Clegg that the study had found a causal effect on social comparison, and some privately likened the company’s silence to tobacco companies concealing evidence that cigarettes were harmful.

The new claims are contained in a late-Friday filing from plaintiffs’ firm Motley Rice in the sprawling “social media addiction” litigation, which consolidates suits from hundreds of U.S. school districts against Meta, Google, TikTok, and Snapchat. The districts contend the companies deliberately hid risks they themselves had identified from children, parents, and teachers, even as they pushed aggressively to promote their products in classrooms and at home.

In addition to Project Mercury, the filing alleges that Meta designed youth-safety tools it knew were ineffective and impractical to use, delayed anti-child-predator measures for years, and allowed accounts flagged for attempting to traffic people for sex to accumulate 17 violations before permanently banning them. Internal teams reportedly warned that optimizing feeds for teen engagement would surface more dangerous content, but the company kept those optimizations in place to protect growth.

The company is also accused of recruiting child-focused nonprofits and parent groups to publicly vouch for the safety of its apps, even as internal research showed those apps posed serious risks. As an example of internal self-congratulation, the filing cites executives bragging that, after sponsorships of organizations like the National PTA, “We can do whatever we want,” followed by “We’ll keep putting up statements from ‘third-parties’ despite the bad headlines they give — you have to maximize for radicalized folks understanding this.”

Many of the underlying documents are filed under seal, and Meta has asked a judge to strike significant portions of the plaintiffs’ brief, saying it mischaracterizes internal records and introduces material that is damaging to its business interests. A hearing to consider what will be unsealed has been scheduled for Jan. 26 in the U.S. District Court for the Northern District of California.

In a statement, Meta spokesman Andy Stone said the research in question used flawed methodology and was canceled for that reason, not because it produced inconvenient results. He said Meta had spent more than a decade building teen-safety features, including parental controls, time limits, and content filters, called the lawsuit’s narrative “cherry-picked” and misleading, and contended that the company’s efforts to protect teenagers are largely working.

Meta has also pointed to more recent product changes. In January it began limiting the content recommendations Instagram and Facebook serve to children, and last month, after intense pressure from lawmakers and advocates, it vowed to treat teen users differently from adults around sensitive topics like self-harm, eating disorders, and sexual content. The Wall Street Journal reported on the changes, noting that they came as lawsuits over youth mental health grew more frequent.

Critics say the recent filings fit a longer pattern in which Meta’s internal research sounded alarms while its public messaging stayed upbeat. Internal slide decks from 2021 showed that Instagram worsened body-image issues for many teenage girls, even as company statements minimized those risks; The Guardian reported on the documents, and separate reporting by The Wall Street Journal unpacked the slides’ findings.

In 2023, more than 40 U.S. states filed coordinated lawsuits claiming that Instagram and Facebook were designed to addict children, encourage social comparison, and collect data from users under the age of 13 in violation of federal law. A summary of those lawsuits carried by The Associated Press depicted them as a major front in the youth mental-health crisis, and the state complaints accused Meta of hiding internal research suggesting harm to young users.

Even before the youth-safety controversies, Meta’s predecessor, Facebook, was under strict orders from the Federal Trade Commission to overhaul its governance, following a record $5 billion privacy fine in 2019 over deceptive practices around data sharing. The FTC settlement mandated extensive internal controls and board-level oversight of privacy decisions, a history that now raises fresh questions about how the company managed subsequent research into psychological harm.

Internal findings continue to surface. A Meta research project disclosed in October found that teenagers who already felt bad about their bodies were being steered toward far more “eating-disorder adjacent” and other provocative content on Instagram than their peers, prompting criticism of the recommendation algorithms and demands for tougher safeguards; Reuters had reported on the study earlier.

In September, independent reports identified Meta employees who said they were discouraged from formally studying how children could be harmed in the company’s virtual-reality environments; one researcher said company lawyers told her not to look into youth safety in VR. The Washington Post’s investigation and subsequent Senate questioning deepened lawmakers’ unease about Meta’s research culture and its willingness to confront uncomfortable findings.

As a matter of law, the new allegations could be significant because they describe internal work that Meta’s own employees called “causal” evidence that its products harmed the mental health of at least some users, not merely correlational findings. If a judge admits the material into the record, plaintiffs are likely to argue that Meta deliberately disregarded clear, measurable warnings of foreseeable harm to minors while continuing to pursue youth engagement.

The class action could also shape negotiations with the other social-media defendants and regulatory battles in Washington and state capitals, where lawmakers are weighing rules on design features aimed at young users, data collection, and addictive product mechanics. For Meta, which is already defending itself in antitrust trials and facing continued privacy enforcement actions, the prospect of more internal research coming to light risks reopening painful questions about what it knew about mental-health harms on its platforms, and when.
