Saturday, April 4, 2026

Meta Faces Historic Legal Reckoning Over Harm to Teens: Lawsuits, Fines, and Congressional Response

Meta has suffered two landmark court losses in less than 48 hours, marking the first time a jury held the company legally accountable for designing addictive features that harmed teens. With 40 state attorneys general suing Meta and thousands of cases pending, the verdicts could reshape the tech industry.

Business · By Catherine Chen · 3d ago · 6 min read

Last updated: April 4, 2026, 10:53 AM


In a historic legal reckoning, Meta suffered two unprecedented courtroom setbacks within 48 hours last week, each exposing the company’s alleged role in designing social media platforms that harmed teenagers. On October 13, 2026, a New Mexico state court jury found Meta liable for violating the state’s Unfair Practices Act and fined the company $375 million, calculated at the statutory maximum of $5,000 per violation, for deliberately creating addictive features that endangered child safety. The very next day, a Los Angeles jury ruled that Meta’s apps were intentionally designed to be addictive to teens, assigning 70% of the liability to Meta and 30% to YouTube in a lawsuit brought by a 20-year-old plaintiff identified as K.G.M. These verdicts, alongside a growing wave of litigation, signal a seismic shift in how courts may evaluate tech companies’ responsibility for the psychological harm inflicted on young users.

The twin courtroom defeats arrive amid a broader reckoning over social media’s impact on adolescent mental health, fueled by internal Meta documents, whistleblower testimony, and mounting pressure from state attorneys general. While tech platforms have historically relied on Section 230 of the Communications Decency Act to shield themselves from liability for user-generated content, these cases target not the content teens consume, but the design of the platforms themselves—endless scroll, push notifications, and algorithmic amplification—that experts say are engineered to exploit young minds. The legal strategy mirrors the tobacco industry’s downfall, where companies were held accountable not for what smokers chose to consume, but for the addictive delivery systems they engineered.

  • Meta lost two major lawsuits in 48 hours, facing $381M in combined fines for harming teens through addictive app design.
  • Thousands of pending lawsuits and 40 state attorneys general are now suing Meta over its alleged role in exacerbating teen mental health crises.
  • Internal documents reveal Meta prioritized teen engagement over safety, with executives mocking parental oversight and pushing features to hide teen activity from adults.
  • Congress is debating children’s online safety bills, but critics warn some proposals could lead to mass censorship under the guise of protecting minors.
  • Whistleblower Frances Haugen’s 2021 leaks exposed Meta’s knowledge of Instagram’s harm to teen girls, accelerating regulatory and legal scrutiny.

How Two Courtroom Losses Could Redefine Tech Liability for Harming Minors

The New Mexico verdict marked the first time a court held Meta legally accountable for endangering child safety through platform design, not content. Jurors deliberated for six weeks before finding Meta violated the state’s Unfair Practices Act, a consumer protection law typically used against companies accused of deceptive or harmful business practices. The $375 million penalty—calculated as $5,000 per violation—sends a clear message that courts may no longer treat tech giants as immune from accountability when their products are proven to exploit psychological vulnerabilities in young users.

The K.G.M. Case: A Landmark Ruling on Addictive Design

The Los Angeles lawsuit, filed by a 20-year-old plaintiff identified as K.G.M., centered on Meta’s alleged exploitation of teen psychology through features like infinite scroll and real-time notifications. A jury concluded that Meta’s apps were designed to be addictive and that this design directly harmed the plaintiff’s mental health. The ruling also assigned partial liability to YouTube (30%), highlighting the broader industry’s role in enabling compulsive social media use among adolescents. Snap and TikTok settled the case before trial, avoiding a public verdict that could set a precedent for similar litigation.

“They took the model that was used against the tobacco industry many years ago, and instead of focusing on things like content, they focused on these addictive features—how the platform is designed, and issues with the design, which is different than content, where you have this First Amendment argument. It turned out at least to be, in these two cases, a winning argument.” — Allison Fitzpatrick, digital media lawyer and partner at Davis+Gilbert

Legal experts say the K.G.M. case could become a blueprint for future lawsuits, as plaintiffs increasingly argue that social media companies are liable not for what users post, but for how their algorithms and interfaces are engineered to maximize engagement—even at the cost of young users’ well-being. Fitzpatrick noted that while the fines in these cases are relatively small for a company of Meta’s size, the cumulative effect of thousands of similar lawsuits could force systemic changes in how tech platforms operate.

Internal Documents Reveal Meta’s Decade-Long Pursuit of Teen Engagement Over Safety

Court filings and unsealed documents from the New Mexico and K.G.M. cases paint a damning picture of Meta’s corporate culture, in which executives and engineers openly discussed strategies to hook teens on their platforms, often at the expense of their mental health. Among the most revealing was a 2019 internal report summarizing 24 one-on-one interviews with users flagged for problematic usage, a category Meta estimated applied to 12.5% of its user base. The report bluntly stated: “The best external research indicates that Facebook’s impact on people’s well-being is negative.”

Executives Prioritized Teen Engagement, Even During School Hours

Internal emails and strategy documents reveal that Meta’s leadership, including CEO Mark Zuckerberg and Instagram head Adam Mosseri, prioritized teen engagement as a key metric of success. In one exchange, Zuckerberg remarked that for Facebook Live to succeed with teens, “my guess is we’ll need to be very good at not notifying parents / teachers.” Another 2021 email from Meta VP of Product Max Eulenstein captured the company’s relentless focus on maximizing usage: “No one wakes up thinking they want to maximize the number of times they open Instagram that day. But that’s exactly what our product teams are trying to do.”

The Rise of ‘Finstas’ and Hiding Teen Activity from Adults

Meta employees also discussed strategies to circumvent parental oversight, including the use of “finstas”—fake Instagram accounts teens create to hide their activity from parents and teachers. One internal email from 2021, sent to Meta Chief Product Officer Chris Cox, included the gleeful observation: “We learned one of the things we need to optimize for is sneaking a look at your phone in the middle of Chemistry :).” These revelations underscore how Meta’s design choices not only encouraged compulsive use but actively worked to conceal that use from adults.

Meta has claimed many of the unsealed documents date back nearly a decade, arguing that the company has since shifted its priorities. In response to the lawsuits, a Meta spokesperson stated that the company now “does not goal on teen time spent” and has introduced safety features like Instagram Teen Accounts, which default to private settings and send time-limit reminders after 60 minutes. For users under 16, parental permission is required to extend these limits. The spokesperson also pointed to Meta’s 2024 launch of “Take a Break” reminders and nudges to mute notifications during school hours.

Whistleblowers and Former Employees Amplify the Criticism

The legal battles have been fueled by whistleblower disclosures, most notably Frances Haugen’s 2021 leak of internal Meta research showing that Instagram worsened body image issues and depression among teen girls. Haugen’s testimony before Congress galvanized public outrage and led to multiple federal and state investigations. Former Meta employees have since come forward with their own accounts of the company’s culture. Kelly Stonelake, a former Director of Product Marketing at Meta who worked there from 2009 to 2024, is now suing the company for alleged gender-based discrimination and harassment. Stonelake, who led go-to-market strategies for Meta’s VR social app Horizon Worlds, told TechCrunch that the unsealed evidence aligns with her own experiences.

“The mountain of unsealed evidence really demonstrates what I experienced firsthand. At Meta, concerns about safety were consistently deprioritized in favor of engagement metrics. When I raised issues about content moderation in Horizon Worlds for teen users, my objections were ignored.” — Kelly Stonelake, former Meta Director of Product Marketing

Stonelake, who previously lobbied for the Kids Online Safety Act (KOSA), has since withdrawn her support, arguing that the bill’s current language could preempt state lawsuits like the one New Mexico filed against Meta. “I am urging a ‘no’ vote on the current version,” she said. “The latest draft includes preemption clauses that would close the courthouse doors to school districts, bereaved families, and states. That’s wild.” Her criticism reflects broader skepticism among advocates that federal legislation may not go far enough—or could even backfire by limiting state-level accountability.

Congress Struggles to Craft Effective Children’s Online Safety Laws

The twin courtroom losses have intensified pressure on Congress to pass legislation addressing children’s online safety, but the debate is sharply divided. Over 20 bills have been proposed, ranging from the Kids Online Safety Act (KOSA), which has gained support from Microsoft, Snap, X, and Apple, to proposals that privacy activists argue could lead to mass censorship. The most contentious proposals involve age verification systems, which critics say would effectively ban adults from accessing certain content under the guise of protecting minors.

KOSA’s Mixed Reception: Support from Tech and Skepticism from Advocates

KOSA, introduced in 2022, has emerged as the leading legislative effort to regulate social media platforms’ impact on minors. The bill would require platforms to provide minors with options to protect their information, disable addictive product features, and opt out of algorithmic recommendations. It has drawn support from major tech companies, which see it as a way to preempt more stringent state laws. Supporters argue it would give families and educators tools to mitigate harm, while opponents warn it could force platforms to over-censor content or surveil users.

Age Verification and Censorship Concerns

Critics of the broader legislative push, including digital rights groups like Fight for the Future, argue that proposals framed as “children’s safety” measures often serve as Trojan horses for broader censorship. Evan Greer, director of Fight for the Future, said in response to a proposed age verification bill: “There is no universe where passing censorship or ‘age verification’ laws, under the guise of kids’ safety, doesn’t lead to massive online censorship of content and speech that some politicians don’t like.” Greer’s concerns reflect a broader fear that well-intentioned safety laws could be weaponized to suppress political speech or marginalized communities under the pretext of protecting children.

The Broader Reckoning: How These Cases Could Reshape the Tech Industry

The New Mexico and K.G.M. rulings are likely just the beginning of a wave of litigation targeting tech platforms’ design choices. Legal experts say these cases could set a precedent for holding companies accountable not only for harmful content but for the psychological architecture of their products. The lawsuits also highlight the growing role of state attorneys general, who have filed 40 coordinated cases against Meta, mirroring the tobacco industry’s litigation in the 1990s. With thousands more cases pending, the financial and reputational stakes for Meta—and the tech industry at large—could be existential.

What Happens Next: Appeals, New Laws, and Industry-Wide Changes?

Meta has vowed to appeal both verdicts, calling the outcomes an oversimplification of complex issues affecting teen mental health. “Reducing something as complex as teen mental health to a single cause risks leaving the many, broader issues teens face today unaddressed,” the company stated. “It overlooks the fact that many teens rely on digital communities to connect and find belonging.” However, legal experts say appeals are unlikely to succeed in reversing the core findings that Meta’s design features harmed minors.

Meanwhile, Congress remains deadlocked on children’s online safety legislation, with no clear path forward. The Biden administration has signaled support for stricter regulations, but partisan divisions over censorship, surveillance, and federal preemption have stalled progress. State attorneys general, emboldened by the New Mexico verdict, are expected to continue filing lawsuits, while advocacy groups push for more nuanced solutions that balance safety with free expression.

The Human Cost: Families and Advocates Seek Justice

Behind the legal jargon and corporate denials are real families grappling with the consequences of social media addiction. For the plaintiff in the K.G.M. case, the jury’s verdict acknowledged years of documented distress linked to compulsive Instagram use. Similarly, the New Mexico lawsuit was brought by the state on behalf of families whose children experienced anxiety, depression, and self-harm linked to Meta’s platforms. These cases underscore a growing recognition that the harm inflicted by algorithmic addiction is not abstract—it is measurable, preventable, and, according to juries, legally actionable.

What This Means for Parents, Teens, and Tech Platforms

For parents, the verdicts serve as a stark reminder of the challenges of monitoring children’s digital lives, especially in an era of finstas and hidden accounts. For teens, the rulings may accelerate demands for safer, more transparent platforms. For tech companies, the message is clear: Courts are increasingly willing to scrutinize design choices that prioritize engagement over well-being. Meta’s recent safety features, like Instagram Teen Accounts, represent a step in the right direction, but critics argue they are too little, too late. The question now is whether the industry will self-regulate or face more courtroom battles—and higher costs—down the road.

Frequently Asked Questions

What did the New Mexico lawsuit against Meta decide?

A New Mexico state court jury found Meta liable for violating the state’s Unfair Practices Act, fining the company $375 million for designing addictive features that harmed teens. The verdict marks the first time a court held Meta legally accountable for endangering child safety through platform design.

How did the K.G.M. case change the legal landscape for social media companies?

In the K.G.M. case, a Los Angeles jury ruled that Meta’s apps were intentionally designed to be addictive to teens, assigning 70% of the liability to Meta. The case established that courts may hold companies liable not for user content, but for the psychological architecture of their products.

What internal Meta documents were revealed in these lawsuits?

Court filings revealed internal documents showing Meta prioritized teen engagement over safety, including emails discussing how to hide teen activity from parents and executives joking about maximizing Instagram opens during school hours.
Catherine Chen

Financial Correspondent

Catherine Chen covers finance, Wall Street, and the global economy with a focus on business strategy. A former financial analyst turned journalist, she translates complex economic data into clear, actionable reporting. Her coverage spans Federal Reserve policy, cryptocurrency markets, and international trade.
