Meta Found Liable for Harming Children's Mental Health in Landmark $375M Ruling

A New Mexico jury has delivered a groundbreaking verdict, ruling that Meta knowingly harmed children's mental health and concealed child sexual exploitation on its social media platforms. The decision follows a nearly seven-week trial that represents a significant legal challenge to the tech giant's practices.

Jury Finds Thousands of Violations Against Meta

Jurors determined that Meta committed thousands of separate violations of New Mexico's Unfair Practices Act, which together added up to a $375 million penalty. The jury agreed with state prosecutors who argued that Meta—parent company of Instagram, Facebook, and WhatsApp—prioritized profits over user safety, particularly concerning vulnerable children.

The verdict rested on findings that the technology giant deliberately concealed what it knew about the dangers of child sexual exploitation on its platforms and the detrimental impacts on children's mental well-being. Jurors found that Meta made false or misleading statements and engaged in what prosecutors described as "unconscionable" trade practices that unfairly took advantage of children's vulnerabilities and inexperience.

Meta's Response and Broader Legal Context

"We respectfully disagree with the verdict and will appeal," a Meta spokesperson stated. "We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content. We will continue to defend ourselves vigorously, and we remain confident in our record of protecting teens online."

Meta attorneys argued during the trial that the company discloses risks and makes substantial efforts to weed out harmful content, while acknowledging that some problematic material inevitably gets past its safety systems. The company has not conceded that social media addiction exists, though executives testified that they recognize "problematic use" and want users to feel positive about the time they spend on Meta platforms.

This New Mexico case represents one of the first to reach trial in a wave of litigation involving social media platforms and their impacts on children. More than 40 state attorneys general have filed lawsuits against Meta, claiming the company contributes to a mental health crisis among young people by deliberately designing Instagram and Facebook features that are addictive.

Evidence Presented During the Trial

The trial examined extensive internal Meta correspondence and reports related to child safety. Jurors heard testimony from Meta executives, platform engineers, former company whistleblowers, psychiatric experts, and technology safety consultants. Local public school educators also testified about disruptions linked to social media, including sextortion schemes targeting children.

New Mexico's case relied on a state undercover investigation where agents created social media accounts posing as children to document sexual solicitations and Meta's response mechanisms. The lawsuit, originally filed in 2023 by New Mexico Attorney General Raúl Torrez, also alleges that Meta hasn't fully disclosed or addressed the dangers of social media addiction.

Legal Arguments and Future Proceedings

Prosecutors argued that Meta should be responsible for its role in pushing harmful content through complex algorithms that proliferate material dangerous to children. "We know the output is meant to be engagement and time spent for kids," prosecution attorney Linda Singer stated. "That choice that Meta made has profound negative impacts on kids."

Tech companies have traditionally been protected from liability for material posted on their platforms under Section 230 of the U.S. Communications Decency Act, as well as First Amendment protections. However, this verdict suggests potential legal vulnerabilities for social media companies regarding how they design and promote content.

A second phase of the trial, a bench proceeding tentatively scheduled for May, would determine whether Meta created a public nuisance and whether it should be ordered to implement changes and pay for remedies. The jury also considered whether social media users were misled by specific statements about platform safety from Meta CEO Mark Zuckerberg, Instagram head Adam Mosseri, and Meta global head of safety Antigone Davis.

Broader Implications for Social Media Regulation

This landmark verdict arrives as school districts and legislators increasingly seek more restrictions on smartphone use in classrooms. The trial that began on February 9 represents just one of numerous lawsuits against Meta, with similar cases pending in other jurisdictions.

During deliberations, the jury used a comprehensive checklist of allegations from prosecutors that included Meta's failure to disclose what it knew about problems with enforcing its ban on users under 13, the prevalence of social media content about teen suicide, and the role of Meta algorithms in prioritizing sensational or harmful content.

"What this case is about is one of the biggest tech companies in the world taking advantage of New Mexico teens," state Chief Deputy Attorney General James Grayson told the jury in closing arguments. The jury was assembled from residents of Santa Fe County, including the politically progressive state capital city.