New Mexico proposes $3.7bn fine for Meta and sweeping changes to its social platforms
State’s attorney general petitions for overhaul of Meta’s child safety protocols in second phase of landmark case
Meta has returned to court in the US this week for the second phase of a lawsuit brought by Raúl Torrez, New Mexico’s attorney general, following a March verdict that found the company liable for child safety failures and imposed a $375m fine. On Monday, the state petitioned for a legal sanction against the company, a monetary penalty 10 times the original amount, and a sweeping, drastic overhaul of Meta’s child safety protocols.
In the second part of the landmark case, known as the remedies phase, the state is asking for Meta to be declared a public nuisance and for the judge to order the company to pay $3.7bn under an abatement plan. The money would fund programs for law enforcement, mental health services and educators. The state is also requesting that the judge force a series of design changes to Meta’s platforms aimed at improving child safety, including universal age verification, the removal of end-to-end encryption from children’s messages, a guardian account linked to every child’s account, and a child safety monitor tasked with holding Meta to account for five years.
The New Mexico department of justice argues that these changes would make Meta’s social networks safer for underage users in the state. Meta, however, says the proposed reforms are unfeasible and could ultimately force it to shut down its platforms in the state altogether.
The second phase of the trial is expected to last three weeks. Before opening statements on 4 May, Judge Bryan Biedscheid said he needed to remain cognizant of free speech protections when evaluating the state’s arguments for imposing the design measures on Meta, which he said could amount to “overreach”.
Drastic changes for children’s accounts
The key measures state regulators propose for Facebook, Instagram and WhatsApp aim to proactively prevent online child sexual exploitation by restricting how adults can interact with minors on the platform, attorneys for New Mexico said.
Under the measures, Meta would need to enact age verification tools for all users and then permanently delete any personal identifying information collected to verify age. The company would be compelled to block users determined to be under 13 from using its platforms and delete their accounts.
The proposed measures would mandate a court-appointed child safety monitor to oversee Meta’s compliance for at least five years. Last week, Torrez said his office was reviewing potential independent technical monitors from around the country, but none has yet been selected.
Other proposed actions include blocking adults from finding or messaging users under 18 unless directly connected, limiting contact from users in regions linked to sextortion, and prohibiting both adults and AI chatbots from engaging minors in romantic or sexualized conversations. The rules also call for strict enforcement, including permanent bans for offending users and preventing those users from creating new accounts with the same device, IP address or phone number.
The state is calling for Meta to improve its reporting of online child sexual exploitation to the National Center for Missing & Exploited Children (NCMEC) by including actionable information in a timely manner. It also wants Meta to make it easier for users to report child sexual exploitation and to ensure these reports are reviewed by a human within 48 hours.
The state is also requesting that New Mexico law enforcement be permitted by the company to operate undercover accounts to test child safety on Meta products. It wants end-to-end encryption of direct messages on Meta’s platforms eliminated for all users aged under 18. By protecting users’ privacy, encryption prevents the company and law enforcement from reviewing messages, which, the state argues, means they cannot be scanned for child sexual abuse material.
Other measures, if enacted, would see Meta place a “warning label” on its products alerting users to the likelihood that teens will encounter violating content, report any product breaches that affect children’s safety, and make Meta’s litigation documents public.
The state is also seeking a requirement that all users of Facebook, Instagram and WhatsApp aged under 18 have a linked guardian account. Child safety experts have previously criticized social media protections that place the burden on parents or guardians to monitor and protect minors. The state wants guardians to be alerted when users under 18 are contacted by suspicious adults.
The New Mexico lawsuit was first filed in December 2023, following a Guardian investigation published in April of that year which found Facebook and Instagram had become marketplaces for child sex trafficking; the investigation was cited multiple times in the complaint.
The first trial spanned seven weeks earlier this year, and Meta has said it plans to appeal the ruling.
In the first phase of the trial, the jury heard how Meta’s design features, such as infinite scroll and push notifications, can be addictive. The state has requested design changes to alleviate this, as well as mandatory time limits for users under 18. It also wants to prohibit the use of minors’ data for personalized recommendations.