Meta’s Influence on Teens: Unsettling Content and Ignored Health Uncovered in Company Reports

A complaint has been filed against Meta Platforms, revealing disturbing information about how the company operates its social media platforms, particularly with respect to teenagers. The complaint cites internal documents indicating that Meta’s algorithmic recommendation features steer users toward distressing content. It further claims that Meta’s relentless pursuit of usage growth, especially among teenagers, took priority over investment in automated detection systems and adequate staffing to handle reports of underage activity.

Meta, formerly known as Facebook, has long been under scrutiny for its impact on young users. According to the complaint, CEO Mark Zuckerberg repeatedly dismissed warnings from senior officials about the harmful effects of the company’s main social media platforms. The complaint also alleges that Meta intentionally designed its products to exploit the vulnerabilities of young users’ developing brains, taking advantage of their impulsive behavior, susceptibility to peer pressure, and tendency toward risky actions.

One of the most concerning revelations in the internal documents is Meta’s apparent disregard for user well-being, prioritizing increased platform usage over safety. Despite acknowledging Instagram’s negative impact on the self-esteem of teenage girls, Meta downplayed its responsibility in the tragic death of Molly Russell, a 14-year-old British girl who took her own life after being exposed to recommended content promoting self-harm.

The company’s lax approach to underage users is deeply troubling. While Instagram’s stated policy is to remove underage users once they are discovered, the complaint alleges that Meta knowingly allowed preteens to use Facebook and Instagram. It points to a backlog of 2 to 2.5 million accounts belonging to users under 13 that were awaiting action, raising doubts about the company’s ability to protect young users.

Furthermore, the complaint reveals that Meta’s internal research confirmed the negative impact of Instagram on the self-esteem of many teenage girls. The platform, originally designed for older teens with more developed cognitive and emotional skills, worsens well-being concerns, especially for younger users.

In its pursuit of usage growth, Meta relied heavily on notifications, specifically targeting teens. The company even calculated the lifetime value of a 13-year-old user, estimating it at roughly $270, and factored that figure into product decisions. This profit-focused approach raises questions about Meta’s true commitment to the well-being of its young users.

Despite Meta’s claims that its products were not designed to be addictive for teens, the internal documents suggest otherwise. According to the complaint, the company deliberately engineered its platforms to exploit youth psychology, encouraging compulsive use among teenagers.

Although Meta publicly supports legislation that would give parents control over which apps users under 16 can download, critics argue that this is a reactive measure rather than a proactive attempt to address the harm caused by the company’s platforms.

The unredacted complaint also highlights the stark contrast between Meta’s public statements and its internal communications. Several executives made public claims that allegedly contradicted the company’s internal documents, further eroding trust in Meta’s commitment to user well-being.

In response to the concerns raised in the complaint, Meta has introduced optional features like “quiet mode,” encouraging users to consider closing the app when scrolling late at night. However, critics argue that these measures are not enough to address the underlying issues affecting young users.

The lawsuit against Meta aims to hold the company accountable for its actions and demands improvements in automated detection systems, increased staff to review reports of underage activity, and stricter age verification measures.

As public awareness of the potential harm caused by social media platforms continues to grow, Meta faces increasing pressure to prioritize user well-being, especially among teenagers. It is crucial for society, regulators, and Meta itself to promptly address these concerns. Only by holding the company accountable and demanding comprehensive changes can we ensure the well-being and safety of young users in the digital age.
