New Mexico Sues Meta Over Child Safety: What Internal Documents Reveal
New Mexico has sued Meta for knowingly designing features that enable child exploitation on Facebook and Instagram, backed by internal documents showing executives understood the risks. The case represents the first standalone state trial against a social media company focused specifically on harm to children.

New Mexico filed a lawsuit against Meta Platforms on December 5, 2023, accusing the company of knowingly designing features on Facebook, Instagram, and WhatsApp that make it easier for children to be exploited. The state also claims Meta misled the public about how safe its platforms really are.
This case marks the first time a state has pursued a standalone trial against a social media company focused specifically on harm to children. It grew out of an undercover investigation conducted by New Mexico's attorney general into how Meta moderates content and designs its platforms.
What the Documents Show
The case became much more serious when court documents were unredacted and released on January 17, 2024. They revealed internal communications from Meta employees and executives between 2020 and 2021 discussing known safety problems affecting children on Instagram and Facebook.
These internal documents show that Meta knew adult strangers could contact children through its messaging systems and recommendation algorithms. One particularly problematic feature is "People You May Know," a tool that suggests new connections based on signals such as mutual friends and other account data. Internally, Meta flagged this feature as creating dangerous pathways for adults to contact minors.
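To make the design question concrete, here is a minimal, purely hypothetical sketch of how a "people you may know"-style recommender could rank candidates by mutual friends while adding a safety gate that blocks cross-generational suggestions between minors and unrelated adults. The `User` type, field names, and thresholds are all assumptions for illustration, not Meta's actual system.

```python
from dataclasses import dataclass


@dataclass
class User:
    user_id: int
    age: int
    friend_ids: set[int]


def mutual_friend_count(a: User, b: User) -> int:
    """Number of friends the two users share."""
    return len(a.friend_ids & b.friend_ids)


def suggest_connections(target: User, candidates: list[User],
                        min_mutuals: int = 2) -> list[User]:
    """Rank candidates by mutual friends, skipping existing friends.

    Hypothetical safety gate: never suggest an unconnected adult to a
    minor (or a minor to an adult), regardless of mutual friends.
    """
    scored = []
    for c in candidates:
        if c.user_id == target.user_id or c.user_id in target.friend_ids:
            continue
        # Assumed policy: block suggestions that cross the adult/minor line.
        if (target.age < 18) != (c.age < 18):
            continue
        mutuals = mutual_friend_count(target, c)
        if mutuals >= min_mutuals:
            scored.append((mutuals, c))
    scored.sort(key=lambda pair: -pair[0])
    return [c for _, c in scored]
```

The point of the sketch is that such a gate is a small, local change to the ranking loop; the lawsuit's allegation is that Meta understood the risk pathway but did not close it.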
The documents also show that Meta's employees knew child sexual material was circulating on Instagram, and that the platform was helping to spread it. In one striking example from 2020, Meta executives scrambled to address a situation where an Apple executive's 12-year-old child had been solicited on the platform.
In internal messages, Meta employees described these child safety issues bluntly. One framed the problem as "the kind of thing that pisses Apple off to the extent of threatening to remove us from the App Store." This suggests Meta understood both how serious the problem was and what business consequences could follow if the company didn't respond adequately.
How Algorithms Boost Engagement—At What Cost
New Mexico's lawsuit focuses heavily on how Meta designs its algorithms and systems to keep users, especially younger users, on the platform longer. It claims Meta engineered recommendation systems and notifications specifically to maximize how much time young people spend on Facebook and Instagram, despite knowing this created exploitation risks.
The complaint argues that Meta's algorithms were deliberately designed to make young users come back more often and stay longer, much like habit-forming systems. These systems kept running even as internal teams documented safety concerns and patterns of exploitation.
The broader context here is worth understanding: Meta, like most digital platforms, makes money from advertising. The more time users spend on the platform, the more ads they see. This creates a built-in financial incentive to maximize engagement. The tension at the heart of this lawsuit is whether that incentive can ever align with genuine child safety when the two goals conflict.
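The economics described above reduce to simple arithmetic: if ad impressions scale with time on platform, revenue scales with it too. The sketch below uses purely illustrative numbers (one million users, invented ad rates); none of these figures come from Meta.

```python
def daily_ad_revenue(users: int, minutes_per_user: float,
                     ads_per_minute: float, revenue_per_ad: float) -> float:
    """Illustrative model: revenue is linear in average session time."""
    return users * minutes_per_user * ads_per_minute * revenue_per_ad


# Assumed figures: 1M daily users, 30 min/day, 0.5 ads/min, $0.01/ad.
base = daily_ad_revenue(1_000_000, 30, 0.5, 0.01)
# A 10% increase in average session time (30 -> 33 minutes)...
longer = daily_ad_revenue(1_000_000, 33, 0.5, 0.01)
# ...yields a 10% increase in revenue, all else equal.
```

Under this toy model, every engineering change that lengthens sessions has a direct, measurable dollar value, which is exactly the incentive the lawsuit says worked against child safety.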
Years of Regulatory Pressure Building
New Mexico's action is part of a larger wave of state-level oversight. Attorney General Hector Balderas joined a nationwide investigation into Instagram's effects on young users in November 2021, focusing on Meta's engagement techniques and the documented harms of extended platform use.
Even before that, in May 2021, a bipartisan group of 44 attorneys general sent a public letter to Meta urging the company to cancel plans for an Instagram version designed for children under 13. The letter raised concerns about how social media might affect child development and increase exploitation risks for very young users.
Later, in October 2023, attorneys general from 33 states filed a separate lawsuit against Meta claiming that Instagram and Facebook contain features deliberately designed to hook children on using the platforms. New Mexico's case is different because it zeros in specifically on child sexual exploitation rather than broader mental health harms.
Having watched platform regulation battles unfold over more than three decades—from the Communications Decency Act debates of the 1990s forward—I recognize a pattern in how these cases tend to develop. Companies promise to police themselves, internal documents later reveal they knew about serious problems, and coordinated enforcement actions follow. The Meta case follows that familiar arc, though the focus on child sexual safety gives it particular weight.
The Technical Challenge of Moderation at Scale
The lawsuit raises genuine technical questions about how to moderate content and recommend connections safely across platforms serving billions of users. Manual review of every piece of content and every potential connection between users is simply not feasible at that scale.
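A back-of-envelope calculation shows why purely manual review fails at this scale. The numbers below are assumptions chosen for illustration (the daily upload volume and review time are not Meta's published figures), but the conclusion is robust across plausible inputs.

```python
# Assumed inputs, for illustration only.
items_per_day = 3_000_000_000    # hypothetical content items uploaded daily
seconds_per_review = 10          # hypothetical time to review one item
reviewer_hours_per_day = 8

# One reviewer can handle 8h * 3600s / 10s = 2,880 items per day.
reviews_per_reviewer = reviewer_hours_per_day * 3600 // seconds_per_review

# Covering every item would take on the order of a million full-time reviewers.
reviewers_needed = items_per_day / reviews_per_reviewer
```

Even if the assumed volume is off by a factor of ten, the required workforce stays far beyond anything feasible, which is why platforms lean on automated detection and design-level safeguards.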
That said, the internal documents in this case suggest Meta had specific knowledge of exploitation patterns and weaknesses in its recommendation systems that could have been addressed. The company could have adjusted its algorithms or added automated detection tools to catch these problems—steps that technical teams often have available but that might slow growth or engagement.
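As one hedged illustration of the kind of automated detection available to technical teams, the sketch below flags risky contact patterns with simple rules: an adult messaging a minor with no mutual friends, or sending repeated unanswered messages. The `ContactEvent` type, signal names, and thresholds are hypothetical; production systems would combine far more signals, typically with machine-learned classifiers.

```python
from dataclasses import dataclass


@dataclass
class ContactEvent:
    sender_age: int
    recipient_age: int
    mutual_friends: int
    messages_sent: int
    replies_received: int


def risk_flags(event: ContactEvent) -> list[str]:
    """Hypothetical rule-based screen for risky adult-to-minor contact."""
    flags = []
    if event.sender_age >= 18 and event.recipient_age < 18:
        if event.mutual_friends == 0:
            flags.append("adult contacting minor with no mutual friends")
        if event.messages_sent >= 5 and event.replies_received == 0:
            flags.append("repeated unanswered messages to a minor")
    return flags
```

Rules like these are cheap to run at scale; the lawsuit's contention is not that detection is impossible, but that deploying it aggressively can conflict with growth and engagement goals.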
A Different Legal Angle
New Mexico's legal strategy is noteworthy. Rather than trying to challenge Meta on the content itself—which federal law broadly protects platforms from liability for—New Mexico is focusing on consumer protection violations. The state argues that Meta made false claims about how safe its platforms are and failed to disclose known risks to users and parents. This amounts to deceptive business practices under state law.
This approach may be legally stronger because it sidesteps complex federal immunity questions and instead asks a simpler question: did Meta's public safety statements match what the company actually knew about its platforms? The lawsuit seeks both money damages and court orders requiring Meta to change how it designs certain features.
What This Means for the Broader Industry
The technical and legal outcomes of this case will likely shape how other social media companies approach platform design. If New Mexico wins, we may see state attorneys general routinely demanding internal documents from tech companies, creating new documentation and compliance requirements for how companies handle safety issues internally.
The specific focus on recommendation algorithms and engagement optimization could eventually force the industry to reconsider how it measures success—moving beyond raw engagement metrics for features that affect minors. However, implementing age-appropriate algorithmic behavior across billions of users remains a substantial engineering challenge that no company has fully solved.
Ultimately, this case tests whether states can regulate platform design in ways that federal agencies and Congress have struggled to achieve through traditional technology law and policy. It is, in effect, an experiment in whether state-level consumer protection law can do what federal oversight has not.


