Meta, the company that owns Facebook and Instagram, is now at the center of a major legal storm over its handling of research on social media’s impact on mental health, especially for teenagers. According to newly released U.S. court filings, Meta is accused of burying its own evidence that its platforms could worsen anxiety, depression, and loneliness in young users. The lawsuit, which includes claims from school districts and parents nationwide, says Meta knew about shocking findings but kept them under wraps to protect its business interests.

In 2020, Meta teamed up with Nielsen, a top research company, for a study called Project Mercury. Their goal: track what happened to people, especially teens, when they took a break from Facebook and Instagram.
The outcome was clear: quitting these apps for just a week led to less anxiety and loneliness, and users, especially teens, felt better about themselves. This was some of the strongest evidence so far that heavy social media use can directly harm young people’s mental health.
But Meta didn’t share these results publicly. The lawsuit says the company shut down the research and later told Congress it simply didn’t have the data needed to measure these harms. Court documents also show that Meta employees sounded alarms internally, with some even comparing the situation to how the tobacco industry once covered up the danger of cigarettes.
The filings say Meta didn’t just hide the research; it also ignored or shelved new safety features that could have protected kids, because these would likely have reduced user engagement and revenue. The lawsuit alleges that Meta failed to remove harmful content, overlooked flagged accounts, and put profits above user safety, even after learning of the risks highlighted by its own internal research. Examples cited in the filings include harmful posts about self-harm and eating disorders that were left online even after detection, and features that could have made teenage accounts more private but weren’t launched over concerns they would hurt growth and engagement numbers.
Meta says the controversy is being exaggerated. Company spokespeople claim the research wasn’t shared due to “methodology problems,” not because of the findings themselves. They say the company is committed to keeping young users safe, pointing to improved tools and parental controls added over the past ten years. Critics, however, think Meta’s response has been slow and insufficient, especially now that hidden research documents have come to light.
Meta isn’t alone: similar lawsuits are targeting TikTok, Google, and Snapchat over the harm their platforms may cause to kids and teens. This case could have huge consequences for how the largest tech companies run social platforms and protect users in the future. More details are expected early next year, when a major hearing is scheduled in California.
This story is already pushing governments, schools, and families to demand more transparency and stronger protections from tech giants whose platforms shape the daily lives of millions of young people around the world.