Meta Hit with $375 Million Penalty in Historic Child Exploitation Case
A New Mexico jury has delivered a landmark verdict, ordering Meta to pay $375 million in civil penalties after finding the company liable for misleading consumers about platform safety and enabling harm, including child sexual exploitation. This marks the first jury trial to hold Meta accountable for acts committed on its platforms, setting a significant legal precedent.
Jury Finds Meta Violated Consumer Protection Laws
The lawsuit, filed by New Mexico Attorney General Raúl Torrez in December 2023, alleged that Meta executives knowingly allowed harmful conditions on Facebook and Instagram, disregarding internal warnings. The jury found Meta liable under the state's Unfair Practices Act, imposing the maximum penalty of $5,000 per violation, totaling $375 million.
"The jury's verdict is a historic victory for every child and family who has paid the price for Meta's choice to put profits over kids' safety," said Torrez. He accused Meta of lying to the public and ignoring employee concerns about risks to children.
Evidence Reveals Systemic Failures
During the nearly seven-week trial, internal documents and testimony revealed that both Meta employees and external child safety experts repeatedly warned about dangers on the platforms. Evidence included details from "Operation MetaPhile," a sting operation leading to the 2024 arrest of three men charged with preying on children via Meta's services.
Law enforcement and the National Center for Missing and Exploited Children (NCMEC) testified about deficiencies in Meta's crime reporting, including the exchange of child sexual abuse material (CSAM). Investigators criticized Meta's overreliance on AI moderation, which generated high volumes of "junk" reports that hindered law enforcement efforts.
Encryption and Platform Design Under Scrutiny
The court heard how Meta's 2023 decision to encrypt Facebook Messenger blocked access to crucial evidence of crimes, as predators used the platform to groom minors and share abusive imagery. In taped depositions, Meta CEO Mark Zuckerberg and Instagram head Adam Mosseri acknowledged that harms to children, such as sexual exploitation and mental health issues, were inevitable given the platforms' vast user bases.
Despite this, executives highlighted investments in safety technology, including Instagram Teen Accounts launched in 2024, which set default protections for users aged 13 to 17.
Legal Battles and Broader Implications
Meta has announced it will appeal the ruling, with a spokesperson stating, "We respectfully disagree with the verdict and will appeal. We work hard to keep people safe on our platforms." The company accused Torrez of making sensationalist arguments by cherry-picking documents.
Meta's attempts to invoke Section 230 of the Communications Decency Act, which typically shields platforms from liability for user content, were denied in June 2024. The judge ruled that the lawsuit focused on product design and internal decisions, not just speech issues.
John W Day, a former New Mexico deputy district attorney, commented, "This opens the floodgates to lots of other litigation and reforms and regulation," reflecting broader public concern about social media's invasiveness.
Ongoing Legal Proceedings and Industry Impact
The next phase of the New Mexico case, beginning on May 4, will seek additional financial penalties and court-mandated platform changes, such as effective age verification and the removal of predators. Meanwhile, Meta faces a separate lawsuit in Los Angeles, where hundreds of families and school districts accuse tech giants, including Meta, Snap, TikTok, and YouTube, of designing addictive platforms that harm children's mental health.
While Snap and TikTok have settled, Meta and YouTube continue to contest these claims, with all companies denying wrongdoing. The jury in that case is currently deliberating a verdict, highlighting the growing legal pressure on social media firms to address safety concerns.