Meta buried study linking Facebook to mental health harm

According to TechSpot, Meta launched an internal research project called Project Mercury in 2020 that partnered with Nielsen to study Facebook’s psychological effects. The study found that users who deactivated their Facebook accounts for just one week reported significantly lower levels of depression, anxiety, loneliness, and negative social comparison. Instead of publishing these findings or conducting further research, Meta’s leadership shut down the project entirely. Internal documents reveal company officials questioned the study’s validity while comparing their handling of the data to tobacco industry tactics. The company also withheld these results from Congress while maintaining it had no definitive evidence of harm to teenage users, particularly girls.

The damning internal conversations

Here’s the thing that really stands out – Meta employees were having conversations that suggest they knew exactly what was happening. One researcher argued internally that the survey showed a causal link between Facebook use and social comparison stress. And the analogy they reached for wasn’t subtle – internal discussions compared Meta’s handling of the data to how tobacco companies obscured cigarette health risks. That’s not a comparison you make lightly in internal communications. When your own people are reaching for tobacco-industry analogies, you’ve got a serious problem on your hands.

Safety features designed to fail

The lawsuit documents go way beyond this one buried study. They describe how many youth safety tools were engineered in ways that made them rarely used. Think about that for a second – safety features designed to look good on paper but not to function effectively in practice. Internal tests of more restrictive features were blocked because they might hurt user growth metrics. Meta kept a strikingly high “strike threshold,” requiring more than a dozen violations for serious offenses like sexual exploitation before an account was removed. And engagement algorithms for teens specifically boosted content tied to body dysmorphia and eating disorders, while safety engineers’ attempts to reduce these risks were deprioritized.

This isn’t just a Meta problem

Look, this pattern appears across the social media industry. The lawsuit also targets Google, Snap, and TikTok with similar allegations. Internal TikTok presentations described offering “spinach” – healthier educational content – to children on its Chinese sister app Douyin, while American kids got “opium” – hyper-engaging but potentially harmful content. Snapchat’s internal emails acknowledged that features like Snapstreaks effectively cultivated compulsive behavior among teens. And several companies apparently sponsored third-party child advocacy groups while those same groups publicly endorsed the platforms’ safety practices. Court filings allege an industry-wide pattern, right down to staff comparing their own platforms to drugs.

Where this is all heading

So what happens now? Meta spokesperson Andy Stone says the company abandoned the Facebook deactivation study because of methodological flaws, not its conclusions, and claims its safety tech is “broadly effective.” But the timing here is crucial – with the next hearing set for January 26 in Northern California federal court, we’re about to see how much of Silicon Valley’s internal decision-making gets forced into public view. These cases could set major precedents for how internal research must be disclosed across the tech industry. Basically, we’re looking at a potential turning point in how social media companies are held accountable for what they know versus what they share.
