Podcast Summary: Today in Focus – The Latest
Episode: Big tech reckoning: Meta ordered to pay $375m in landmark case
Host: Lucy Hough
Guest: Katie McHugh, Investigative Journalist
Date: March 25, 2026
Episode Theme & Purpose
This episode covers a landmark legal decision in which Meta, the parent company of Facebook and Instagram, was found liable in a New Mexico court for endangering children and ordered to pay $375 million. The discussion centers on the case's significance, Meta's defense, and the broader implications for Big Tech accountability. Katie McHugh, whose investigative reporting formed a foundation for the trial, offers first-hand insight.
Key Discussion Points & Insights
1. Historic Legal Precedent Against Meta
- First Accountability:
- Meta held responsible by a jury for child endangerment, a first in U.S. courts.
- This case targets the design features of Meta’s platforms that enable harm, not just user-generated content.
- Section 230 and Liability Shields:
- Big Tech companies have traditionally relied on Section 230, which grants immunity from liability for user content ([02:33]).
- This lawsuit challenged that by focusing on how platforms were designed to facilitate harm.
2. Guardian Investigation as Evidence
- Katie McHugh explains their 2023 Guardian investigation:
- Uncovered Facebook and Instagram being used for child trafficking, including explicit sales of children on Facebook Marketplace ([03:48]).
- Court records and moderator interviews highlighted the scale and persistent difficulties in stopping these crimes.
- The investigation provided a crucial evidence base for the New Mexico lawsuit.
3. Meta’s Response and Defense
- Meta argued:
- Platforms are too large to guarantee complete safety ([06:29]).
- They have invested billions of dollars in safety technologies, including parental controls and default privacy for teens.
- Harm is "inevitable" to some extent given platform scale.
- Testimony from high-level executives:
- Taped depositions from Mark Zuckerberg and Instagram’s Adam Mosseri presented in court.
4. Key Testimonies and Notable Moments
- Whistleblowers and Insiders:
- Testimony from former Director of Engineering Arturo Behar:
- Shared his daughter’s experience of receiving inappropriate messages on Instagram.
- Recommended a “report button” for child safety, but felt leadership was not responsive ([07:40]).
- New Mexico Attorney General’s Reaction:
- “Meta executives knew their products harmed children, they disregarded warnings from their own employees, and they lied to the public about what they knew. Today, the jury joined families, educators and child safety experts in saying enough is enough.” ([04:52])
5. Precedent for Future Lawsuits
- Other lawsuits are already underway:
- Similar strategies, focusing on platform design rather than content, are likely to be replicated.
- Multiple cases pending in California and a multi-state suit; more states may follow the New Mexico model ([05:14]).
- This verdict “pierces the liability shield” and could open the door for further litigation.
6. Global and Policy Implications
- Growing international moves to regulate Big Tech:
- Australia’s social-media ban for children is influencing policy debates in the UK ([09:16]).
- With limited federal action in the US, state-level suits are leading accountability efforts.
- Other tech platforms implicated:
- Current California case also names YouTube, Snapchat, and TikTok.
- Snapchat and TikTok settled; YouTube and Meta contesting allegations. Focus is on platform features and their impact on children's mental health (depression, eating disorders, self-harm) ([10:45]).
7. Meta’s Reflections and Public Statements
- Some admissions from Adam Mosseri:
- Acknowledged that features like encrypted DMs could pose risks ([08:36]).
- Official statements remain defensive:
- Meta labels the case “sensationalist” and claims evidence has been cherry-picked.
- Company continues to emphasize the scale of investment in safety technology.
Notable Quotes & Memorable Moments
- Katie McHugh [02:33]:
“Big tech companies have enjoyed what's referred to as a liability shield ... What this case did was take aim at the design features on Meta's platform ... that may allow these harms to happen.”
- Katie McHugh [03:48]:
“Our investigation ... examined how Facebook and Instagram are being used by predators to target children, and in some cases, buy and sell children using features such as Facebook Marketplace.”
- New Mexico Attorney General, cited by Host Lucy Hough [04:52]:
“Meta executives knew their products harmed children, they disregarded warnings from their own employees, and they lied to the public about what they knew.”
- Katie McHugh [07:40]:
“Some of the most compelling testimony came from former employees ... including Arturo Behar, who was director of engineering at Facebook ... His daughter received inappropriate sexual communications ... He was very frustrated with the response he got from leadership.”
- Katie McHugh [09:56]:
“I think so. I think people now are much more aware of things that can take place on these platforms. ... these states are kind of acting in the absence of any action from the federal government in Washington D.C.”
- Katie McHugh [10:45]:
“I'm sure that other big tech companies are looking at this with ... some concern ... The lawsuit has been filed against Meta, YouTube, Snapchat and TikTok. Snapchat and TikTok have reached settlements whereas YouTube and Meta have denied all the allegations.”
Important Timestamps
| Timestamp | Segment & Content |
|-----------|-------------------|
| 01:05 | First explanation of Meta's accountability and liability shield |
| 02:33 | Legal significance and breaking of Section 230 shield |
| 03:48 | Guardian investigation details and link to legal evidence |
| 04:52 | Attorney General's powerful post-verdict statement |
| 06:29 | How Meta defended itself: executive testimonies |
| 07:40 | Whistleblower (Arturo Behar) testimony |
| 08:36 | Meta's public response and Adam Mosseri's acknowledgements |
| 09:16 | Global policy responses and UK/Australian context |
| 10:45 | Impact on and involvement of other tech giants |
| 11:24 | Final reflections on broader implications |
Concluding Thoughts
- This New Mexico verdict marks a major shift in how Big Tech may be held accountable for harms facilitated by their platforms, especially regarding child safety.
- The case could reshape legal strategies against tech giants and prompt further litigation by focusing on platform design, not just content.
- With mounting public and governmental scrutiny, the era of “Big Tech impunity” may be ending—starting with Meta.
