State prosecutors say the court should require stronger age checks, safer design features and outside oversight because Meta misled families and failed to protect minors from exploitation. Meta says many of the demands are unworkable, legally flawed and could threaten access to its services in New Mexico.
What the Meta trial could change for child safety
The remedies phase will be heard by a judge, not a jury, and could determine whether Meta must change how its platforms operate for young users in New Mexico. Prosecutors are expected to ask for measures including stronger age verification, changes to recommendation systems, restrictions on autoplay and infinite scroll for minors, warning labels, permanent bans for adults who facilitate exploitation, and a court-supervised child safety monitor.
In a court filing unsealed Thursday, Meta said one proposed requirement, that it verify with 99% accuracy whether users are at least 13 years old, was effectively impossible to meet. The company warned that such demands could leave it with no practical option but to stop offering Facebook, Instagram and WhatsApp in the state.
The state’s requested remedies also include restrictions tied to end-to-end encryption for minors and a high detection threshold for new child sexual abuse material uploaded to Meta services, according to The Verge. Meta argues those standards are vague and technically infeasible at the scale of platforms with billions of users.
March verdict raised the stakes
The new phase follows a March verdict in which the New Mexico Department of Justice said a jury ordered Meta to pay $375 million for violating the state’s consumer protection laws. The department said New Mexico became the first state to prevail at trial against a major tech company over claims that its platforms harmed young people.
New Mexico Attorney General Raúl Torrez called the verdict “a historic victory for every child and family,” while Meta has disputed the allegations and defended its safety work. The company says it has invested in protections for young users, including Instagram Teen Accounts, which automatically place teens in more restrictive settings and require parental permission for younger teens to loosen some protections.
The legal question now is broader than the March penalty. New Mexico wants the court to treat Meta’s platforms as a public nuisance, a finding that could allow the judge to order changes aimed at reducing alleged harms to minors. If the state succeeds, the case could become a model for other lawsuits seeking court-ordered changes rather than only damages.
Why the Meta trial did not come out of nowhere
The New Mexico case is part of a longer public reckoning over children’s safety online. In 2021, the Wall Street Journal’s Facebook Files coverage brought internal Instagram research about teen well-being into public view and intensified scrutiny of Meta’s public statements about young users.
The pressure widened in 2023, when The Guardian reported that Facebook and Instagram were being used by criminals to facilitate child sex trafficking. New Mexico later filed its lawsuit against Meta in December 2023, alleging the company’s platforms exposed children to sexual abuse, solicitation and trafficking.
That same year, Reuters reported that dozens of states sued Meta over claims that Instagram and Facebook contributed to a youth mental health crisis through addictive design features. Those cases helped turn online child safety from a parental concern into a nationwide legal and regulatory fight.
In January 2024, an unsealed filing reported by The Guardian added further pressure by citing internal allegations about the scale of sexual harassment affecting children on Meta’s platforms. Meta denied that the lawsuit fairly represented its work and said it had built tools to support safe, age-appropriate experiences.
What happens next
The court must now weigh two competing arguments. New Mexico says Meta has the resources and technical ability to redesign parts of its platforms when business needs require it, and that child safety should receive the same urgency. Meta says the state is demanding perfection, singling out one company and asking for remedies that could weaken privacy, restrict speech and disrupt services for all users in New Mexico.
A narrow ruling could order targeted changes for minors in New Mexico. A broader ruling could push Meta toward state-specific product changes and encourage other states to seek similar remedies. A ruling for Meta, by contrast, could make it harder for state attorneys general to use public nuisance claims to reshape social media platforms.
Either way, the case has already moved beyond a single verdict. The next decision could help define whether courts can require social media companies to redesign core features in the name of child safety, or whether those changes remain primarily in the hands of companies, lawmakers and regulators.