A New Mexico jury has ordered Meta to pay $375 million in civil penalties after finding that the company misled users about the safety of its platforms and enabled harm, including child sexual exploitation. The verdict marks a landmark moment in holding Big Tech accountable for prioritizing profit over the safety and wellbeing of children.
The case, brought by the New Mexico attorney general in December 2023, is the first jury trial to find Meta liable for harms facilitated through its platforms. At its core, the lawsuit exposed how design choices and corporate decisions created environments where children could be targeted, groomed, and exploited.
Profit over protection
The jury imposed the maximum penalty of $5,000 per violation under the state’s Unfair Practices Act. Jurors found that Meta knowingly misrepresented the safety of its platforms while failing to act on mounting evidence of harm.
New Mexico attorney general Raúl Torrez framed the verdict as a turning point in the fight to protect children online. As reported by The Guardian, Torrez said:
The jury’s verdict is a historic victory for every child and family who has paid the price for Meta’s choice to put profits over kids’ safety.
Meta executives knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew. Today the jury joined families, educators, and child safety experts in saying enough is enough.
Internal documents and testimony revealed that company leaders understood the risks. Employees and external child safety experts repeatedly warned that Facebook and Instagram were facilitating child sexual exploitation. Despite this, executives failed to implement adequate safeguards.
In depositions shown during the trial, Meta executives, including CEO Mark Zuckerberg and Instagram head Adam Mosseri, acknowledged that harms such as sexual exploitation were inevitable given the scale of their platforms. Critics argue that this acceptance reflects a business model that tolerates harm as a cost of growth.
Platforms that enable abuse
The trial detailed how predators used Meta’s platforms to groom minors and share child sexual abuse material. Evidence included a 2024 undercover operation, “Operation MetaPhile”, in which three men were arrested after attempting to meet children they had contacted online.
Investigators and experts from the National Center for Missing and Exploited Children testified that Meta’s reporting of crimes taking place on its platforms, including the exchange of child sexual abuse material, was deficient. According to investigators,
Meta has generated high volumes of “junk” reports by overly relying on AI to moderate its platforms… These reports were useless to law enforcement and meant crimes could not be investigated.
At the same time, Meta’s 2023 decision to encrypt Facebook Messenger limited access to critical evidence. While encryption can enhance privacy, the court heard that it also shielded predators and obstructed investigations into child exploitation.
These failures highlight a broader issue: platforms designed to maximize engagement and growth can also create fertile ground for exploitation when safety is treated as secondary.
Accountability is not in Meta’s plan
Unsurprisingly, Meta has said it will appeal the ruling. However, the court rejected Meta’s attempt to shield itself under Section 230 of the Communications Decency Act, emphasizing that the case focused on product design and corporate decisions—not just user content.
Legal experts believe the verdict could open the door to further litigation and regulation. “This one certainly opens the floodgates to lots of other litigation and reforms,” said criminal defense lawyer John W Day.
A second phase of proceedings, set to begin in May, will seek additional penalties and court-mandated changes. These may include stronger age verification, more effective removal of predators, and limits on features that allow abusers to operate undetected.
The ruling comes amid growing legal pressure on tech companies. In a separate case in Los Angeles, families accuse platforms including Snap Inc., TikTok, and YouTube of designing addictive systems that harm children.
A broader reckoning
While some of the companies facing such claims have settled, others continue to fight them.
This case sends a clear message: when companies design platforms that prioritize engagement and profit, they risk enabling systems of abuse. For children targeted by traffickers and predators, the consequences are devastating.
The jury’s decision underscores a critical truth—protecting children must come before profit.