NPR
New Mexico jury says Meta harms children's mental health and safety, violating state law
By The Associated Press
A recording of Meta Founder and CEO Mark Zuckerberg's deposition is played for the jurors on March 4 in Santa Fe, N.M. Jim Weber/Santa Fe New Mexican/AP
SANTA FE, N.M. — A New Mexico jury determined Tuesday that Meta knowingly harmed children's mental health and concealed what it knew about child sexual exploitation on its social media platforms, a verdict that signals a changing tide against tech companies and the government's willingness to crack down.
The landmark decision comes after a nearly seven-week trial, and as jurors in a federal court in California have been sequestered in deliberations for more than a week about whether Meta and YouTube should be liable in a similar case.
The jury agreed with allegations that Meta made false or misleading statements and also agreed that Meta engaged in "unconscionable" trade practices that unfairly took advantage of the vulnerabilities and inexperience of children.
Jurors found there were thousands of violations, each counting separately toward a penalty of $375 million. That's less than one-fifth of what prosecutors were seeking.
The social media conglomerate won't be forced to change its practices right away. It will be up to a judge — not a jury — to determine whether Meta's social media platforms created a public nuisance and whether the company should pay for public programs to address the harms. That second phase of the trial will happen in May.
A Meta spokesperson said the company disagrees with the verdict and will appeal.
"We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content," the spokesperson said. "We will continue to defend ourselves vigorously, and we remain confident in our record of protecting teens online."
Attorneys for Meta said the company discloses risks and makes efforts to weed out harmful content and experiences, while acknowledging that some bad material gets through its safety net.
New Mexico's case was among the first to reach trial in a wave of litigation involving social media platforms and their impacts on children.
"Meta's house of cards is beginning to fall," said Sacha Haworth, executive director of watchdog group The Tech Oversight Project.
Meta hasn't agreed that social media addiction exists, but executives at trial acknowledged "problematic use" and said they want people to feel good about the time they spend on Meta's platforms.
"Evidence shows not only that Meta invests in safety because it's the right thing to do but because it is good for business," Meta attorney Kevin Huff told jurors in closing arguments. Communications Decency Act, as well as a First Amendment shield.
"We know the output is meant to be engagement and time spent for kids," prosecution attorney Linda Singer said. "That choice that Meta made has profound negative impacts on kids."
The New Mexico trial examined a raft of Meta's internal correspondence and reports related to child safety.
The jury also heard testimony from local public school educators who struggled with disruptions linked to social media, including sextortion schemes targeting children.
In reaching a verdict, the jury considered whether social media users were misled by specific statements about platform safety by Meta CEO Mark Zuckerberg, Instagram head Adam Mosseri and Meta global head of safety Antigone Davis.
ParentsSOS, a coalition of families who have lost children to harm caused by social media, called the verdict a "watershed moment."
"We parents who have experienced the unimaginable — the death of a child because of social media harms — applaud this rare and momentous milestone in the years-long fight to hold Big Tech accountable for the dangers their products pose to our kids," the group said in a statement.
New Mexico jurors sided with state prosecutors who argued that Meta — which owns Instagram, Facebook and WhatsApp — prioritized profits over safety, and violated parts of the state's Unfair Practices Act.
Meta is valued at about $1.5 trillion and the company's stock was up 5% in early after-hours trading following the verdict, a signal that shareholders were shrugging off the news.
Juror Linda Payton, 38, said the jury reached a compromise on the estimated number of teenagers affected by Meta's platforms, while opting for the maximum penalty per violation.
With a maximum $5,000 penalty for each violation, she said she thought each child was worth the maximum amount.
More than 40 state attorneys general have filed lawsuits against Meta, claiming it's contributing to a mental health crisis among young people by deliberately designing Instagram and Facebook features that are addictive.
"For years, it's been glaringly obvious that Meta has failed to stop sexual predators from turning online interactions into real world harm."
Haworth pointed to whistleblowers like Arturo Béjar, as well as unsealed documents and other evidence, saying it painted a damning picture.
New Mexico's case relied on an undercover investigation where agents created social media accounts posing as children to document sexual solicitations and Meta's response.
The lawsuit, filed in 2023 by New Mexico Attorney General Raúl Torrez, also said Meta hasn't fully disclosed or addressed the dangers of social media addiction.
"Meta designs its apps to help people connect with friends and family, not to try to connect predators."
Tech companies have been protected from liability for content posted on their social media platforms under Section 230, a 30-year-old provision of the U.S. Communications Decency Act, as well as a First Amendment shield.
New Mexico prosecutors say Meta still should be responsible for its role in pushing out that content through complex algorithms that proliferate material that is harmful for children.
Jurors also heard testimony from Meta executives, platform engineers, whistleblowers who left the company, psychiatric experts and tech safety consultants.
They also considered Meta's failure to enforce its ban on users under 13, the role of its algorithms in prioritizing sensational or harmful content, and the prevalence of social media content about teen suicide.