A court in New Mexico has ordered Meta to pay $375 million (£279 million) for misleading users over the safety of its platforms for children.
A jury found that Meta, which owns Facebook, Instagram, and WhatsApp, was liable for the way its platforms endangered children, exposing them to sexually explicit material and to contact with sexual predators.
New Mexico Attorney General Raul Torrez said the verdict was 'historic', marking the first time a state has successfully sued Meta over child safety issues.
A spokeswoman for Meta, led by chairman and chief executive Mark Zuckerberg, said the company disagrees with the verdict and intends to appeal.
She stated: 'We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors and harmful content. We remain confident in our record of protecting teens online.'
The jury found that Meta violated New Mexico's Unfair Practices Act by misleading the public about the safety of its platforms for young users.
During a seven-week trial, jurors reviewed internal Meta documents and heard testimony from former employees about the company's awareness of child predators exploiting its platforms.
Arturo Béjar, a former Meta engineering leader turned whistleblower, testified about experiments showing that underage users were served sexual content on Instagram, and said his own daughter had been propositioned for sex by a stranger on the platform.
Prosecutors revealed that at one point, 16% of Instagram users reported experiencing unwanted nudity or sexual activity within a week.
Meta contended that it had worked to combat problem users on its platforms and to improve safety for minors, pointing to features such as Teen Accounts and parental alerts for potentially harmful searches.
The $375 million civil penalty stems from the jury's finding of numerous violations of the law, each carrying a maximum penalty of $5,000 (the total is equivalent to 75,000 such violations at the statutory maximum).
Meta is currently facing a separate trial in Los Angeles over a claim from a young woman who argues that her childhood addiction to platforms such as Instagram and YouTube was the result of deliberate design choices.
Thousands of similar lawsuits are progressing through U.S. courts.
The lawsuit originated in 2023, when New Mexico accused Meta's recommendation algorithms of steering young users toward sexually explicit content, exposing them to solicitation for such material, and contributing to sex trafficking.
Torrez said Meta executives were aware of the risks their products posed to children, ignored warnings from their own employees, and misled the public about what they knew, declaring: 'Today the jury joined families, educators, and child safety experts in saying enough is enough.'



















