
A court in the US has ordered Meta to pay $375m after a jury found that the company, which owns Facebook and Instagram, misled consumers over child safety on its platforms. Lucy Hough speaks to the investigative reporter Katie McQue
A
This is the Guardian.
C
This is the first time that Meta has been held accountable in court for allegations that it puts children's safety at risk. What this case did was take aim at the design features on Meta's platforms, specifically Facebook and Instagram, that may allow these harms to happen. I'm sure that other big tech companies are looking at this with, I imagine, some concern.
A
In a landmark trial, a US court has ordered Meta, the owner of Facebook and Instagram, to pay $375m for endangering children. Could it set a new precedent for holding social media giants to account? From the Guardian's Today in Focus, this is The Latest. With me, Lucy Hough. I'm joined by Katie McQue, an investigative journalist who's been reporting on this story for us.
D
Katie, thanks so much for dialling in from New York. So you have been covering this case against Meta, which has been going on in a courtroom in New Mexico over the last seven weeks or so. Meta was essentially accused of misleading consumers about the safety of its platforms and enabling harm, particularly child exploitation.
D
We will come onto the details of the allegations made against Meta in more detail. But first of all, just describe how significant it is that Meta lost, by jury verdict, in this New Mexico court in this way.
C
Thank you, Lucy. So, as you know, this is the first time that Meta has been held accountable in court for allegations that it puts children's safety at risk. The reason why this is so significant is that big tech companies have enjoyed what's referred to as a liability shield at a federal level in the US, where there is a law called Section 230 which states that they are not liable for any content that a user publishes on their platform, whether or not that content induces harm. So these companies have always been able to invoke this defence against any allegations that harms have been committed on their platforms and that they may be liable for those. What this case did was take aim at the design features on Meta's platforms, specifically Facebook and Instagram, that may allow these harms to happen.
D
Yeah, and in part, something that was cited repeatedly through the trial was a Guardian investigation that you were involved with, published in 2023, which revealed how Facebook and Instagram had become marketplaces for child sex trafficking. So tell us a bit more about how that investigation informed the case that was brought against Meta.
C
So our investigation, which was published in April 2023, examined how Facebook and Instagram are being used by predators to target children and, in some cases, buy and sell children using features such as Facebook Marketplace. One of the things we did was pull court records that detailed conversations being had on the platforms relating to negotiations over those sales. We also spoke with moderators about the limitations of their jobs in terms of being able to report certain things. And so the main overall conclusion that our two-year investigation came to is that this activity was taking place and that the company was potentially having difficulties stemming this activity and preventing it from happening.
D
So the jury ruled against Meta, and Meta have been ordered to pay $375 million, which is obviously a drop in the ocean for a company the size of Meta, worth in excess of a trillion dollars. But I was particularly struck by the comments of the New Mexico attorney general, who said: "Meta executives knew their products harmed children, they disregarded warnings from their own employees, and they lied to the public about what they knew. Today, the jury joined families, educators and child safety experts in saying enough is enough." That must have been quite a powerful statement to hear in the courtroom.
C
Yes, absolutely. And as you say, $375 million is a drop in the ocean for a company the size of Meta. But what this trial does do is pierce through this liability shield and make way for potentially other cases. So there are other cases underway right now. There's a case in California that's in jury deliberations, and there's also a lawsuit progressing in which several states have teamed up to file against Meta as well.
D
Yeah, there are thousands of these lawsuits coming down the track in the US and elsewhere. In terms of the way these cases are being fought against Meta and other big tech companies, what kind of precedent does this set? What do you think people involved in those cases might be looking at to learn from what's happened in New Mexico?
C
Yeah, I think that this strategy that the Attorney General had of looking at the way these platforms are designed was obviously very effective because it circumvented any liability shield that the company does have under federal laws. And so I imagine that other states that are looking to embark on similar lawsuits will be adopting those types of strategies as well.
D
And how did Meta defend itself in the proceedings?
C
So in the proceedings, we had taped depositions from Mark Zuckerberg and various other executives, including the head of Instagram, Adam Mosseri. From both of those executives, the main takeaway point was that the platforms are so big that some harms are inevitable: that they can't police the platforms in their entirety or guarantee safety. They also stressed that they have invested billions of dollars in trying to implement technologies that keep children safe on the platforms. One of their flagship initiatives is Instagram teen accounts, which gives parents some control over how their teens use the platforms and adopts default privacy settings, so that, they say, it would be much more difficult for a stranger to contact a child using a teen account.
D
One of the most compelling pieces of evidence came from a Meta whistleblower, an insider at the company, who spoke about not only his experience working within the company, but also the experience of his daughter and how she was impacted by these harmful practices.
C
Yes. So some of the most compelling testimony came from former employees who were testifying as witnesses for the state, including, as you mentioned, Arturo Béjar, who was a director of engineering at Facebook. His daughter received inappropriate sexual communications over Instagram, and when he raised this with the leadership team, and, I think, suggested that a report button should be installed to allow children to block any unwanted communications, he was, by his account, very frustrated with the response he got from leadership.
D
So Meta plan to appeal this case, don't they? But is there any sense in the statements the company has issued that it accepts there have been some harmful practices, that children have been harmed through a lack of safeguarding? I mean, have lessons been learnt, do you think?
C
I think at the trial we saw some reflection, especially from Adam Mosseri, the head of Instagram, who did concede that some of the design features that Meta has, such as encryption for direct messaging, could present harms to children. And that was really interesting to see. In their public statements from the spokespeople, they have been very dismissive of the trial. They say that it's sensationalist and that the attorney general has cherry-picked certain documents to form his case. And again, they really stress in their communications to the press that they are basically trying their best, investing billions, they say, in developing technologies to keep children safe on their platforms.
A
So Katie, this comes at a moment when governments all around the world are looking at ways to safeguard and protect children and young people against the harms of social media. We have the social media ban for children that's been introduced in Australia, something that the government here in the UK is looking at with interest, considering bringing in something similar. But I'm wondering, can we see this verdict, the first time that a jury has ruled against a company like Meta in this way, as some sort of watershed moment, the start of the end of big tech impunity?
C
I think so. I think people now are much more aware of things that can take place on these platforms. In the US, as mentioned, there seems to be an absence of regulatory action at a federal level, and so states are taking their own actions. That's what we're seeing in New Mexico and California and with this other multi-state lawsuit: these states are acting in the absence of any action from the federal government in Washington DC.
D
And Katie, obviously we're talking about Meta, only one of the big tech giants, but there must be others, Google, YouTube, TikTok, Snapchat, that are looking at this case with some alarm. What do we know about that?
C
I'm sure that other big tech companies are looking at this with, I imagine, some concern. In the California lawsuit, which is in jury deliberations right now, other companies are being sued too. That lawsuit has been filed against Meta, YouTube, Snapchat and TikTok, but Snapchat and TikTok have reached settlements, whereas YouTube and Meta have denied all the allegations. And, like I said, it's in jury deliberations right now. That lawsuit focuses more on design features that have impacted children's mental health, including depression, eating disorders and self-harm.
D
Well, it will be fascinating to see how this verdict in New Mexico has implications for those other cases. Katie, thank you so much for your time and thank you for your reporting over the last few years on this.
C
Thank you Lucy.
A
That's it for today. My huge thanks again to Katie McQue, and for more of her reporting on this story, head over to theguardian.com. Thanks for listening to this episode of The Latest, the new evening edition of Today in Focus. Today in Focus will be back in your feeds tomorrow morning, looking at whether Cuba is the next country in Trump's sights. The Latest will be back tomorrow night. And do listen to our new seven-part podcast series from Guardian Investigates with reporter Melissa Segura. In 2011, a Chicago police officer was murdered. Police identified four suspects. Three confessed, but the fourth refused to break. He then embarked on a 12-year battle to prove his innocence against a system that refused to admit it might be wrong. Subscribe to the Guardian Investigates feed or listen wherever you get your podcasts. This episode of The Latest was presented by me, Lucy Hough. It was produced by Bryony Moore. The senior producer was Ryan Ramgobin and the lead producer was Zoe Hitch. This is the Guardian.
Host: Lucy Hough
Guest: Katie McQue, Investigative Journalist
Date: March 25, 2026
This episode covers a landmark legal decision in which Meta, the parent company of Facebook and Instagram, was found liable in a New Mexico court for endangering children and ordered to pay $375 million. The discussion centers on the case's significance, Meta's defense and the broader implications for Big Tech accountability. Katie McQue, whose investigative reporting formed a foundation for the trial, offers first-hand insight.
| Timestamp | Segment & Content |
|-----------|-------------------|
| 01:05 | First explanation of Meta's accountability and liability shield |
| 02:33 | Legal significance and breaking of Section 230 shield |
| 03:48 | Guardian investigation details and link to legal evidence |
| 04:52 | Attorney General's powerful post-verdict statement |
| 06:29 | How Meta defended itself: executive testimonies |
| 07:40 | Whistleblower (Arturo Béjar) testimony |
| 08:36 | Meta's public response and Adam Mosseri's acknowledgements |
| 09:16 | Global policy responses and UK/Australian context |
| 10:45 | Impact on and involvement of other tech giants |
| 11:24 | Final reflections on broader implications |