The Times Tech Podcast – "Ignore, Delete, Escalate"
Air Date: September 24, 2024
Hosts: Danny Fortson (San Francisco), Katie Prescott (London)
Episode Overview
In this episode, Danny Fortson explores the hidden world of online content moderation: the "shadow workforce" that keeps the internet usable by sifting through the worst content imaginable. He investigates the lives, working conditions, and trauma of moderators, both gig workers and contractors, and the industry's dependence on their invisible labor. He also examines the legal, ethical, and technical dilemmas facing tech giants, with a special focus on Facebook's moderation army and the ongoing legal and cultural reckoning around this vital but perilous job.
Key Discussion Points & Insights
The Rise of Content Moderation as a Profession
- Rapid expansion: In 2016, Facebook had 12,000 employees in total. By 2019, over 30,000 people were working on safety and security at Facebook, many specifically on moderation. (04:00–04:41)
- Industry-wide need: All major platforms (YouTube, Twitter, Instagram, Snapchat, etc.) employ armies of moderators, often through external contractors.
Gig Moderators: Human Intelligence Tasks on Amazon Mechanical Turk (MTurk)
- Case Study: Danny speaks with Marlena Griffin, a freelance content moderator based in Corpus Christi, Texas, who balances the flexibility work offers with the trauma it inflicts. (04:19–07:24)
- "I'm a guinea pig, a mental guinea pig. And that term has been used by workers like me ever since this work has existed." – Marlena Griffin [08:35]
- Task Variety and Pay: Moderators perform fast, repetitive, high-pressure "HITs" (Human Intelligence Tasks) for minimal pay, sometimes a few cents per task. Ten cents a minute ($6/hour) is a typical floor, below the U.S. federal minimum wage. (06:26–07:03)
- Trauma Exposure: Moderators routinely encounter graphic imagery, including child pornography, gore, and violence. (06:05–06:12)
- "Child pornography, gore and death. Yeah. I have to walk away from my computer." – Marlena Griffin [06:05]
Structural and Economic Realities
- Global, decentralized workforce: Millions globally—often freelancers or contractors—conduct moderation with minimal protection or benefits. (08:18–08:35)
- Economic drivers: Outsourcing is about cost, geographic scaling, and language coverage. Facebook and others rely on contracted/outsourced labor for flexibility and to source rare language skills. (21:41–22:08, 23:13–23:25)
- "I think it's a cost question. Yeah." – Mary Gray [23:10]
- "In many cases, we really do need that mixed workforce..." – Abigail Soy, Facebook [21:50]
Inside Facebook: Legal and Ethical Dilemmas
- Landmark lawsuit: Attorney Steve Williams represents Selena Scola and other moderators suing Facebook for workplace safety failures after suffering PTSD from prolonged content exposure. (11:21–14:31)
- "Horrific dystopian vision of people sitting in rooms with monitors, constantly flashing just truly awful imagery at them..." – Steve Williams [13:59]
- Industry parallels: Williams draws analogy to NFL concussion lawsuits—exposure to workplace harm warrants long-term care/monitoring, even in absence of immediate symptoms. (16:27–17:04)
- "We're taking that model from a chemical exposure and we're applying it here in the instance of a psychic exposure." – Steve Williams [15:09]
The Battle Between AI and Human Moderators
- AI progress: Facebook engineer Jeff King details how automation intercepts 96% of nudity, 99% of child nudity/terrorism, and 85% of graphic violence before anyone sees it. Yet, gray areas and linguistic/cultural nuances still require human judgment. (18:59–19:56)
- "It's actually incredibly cool how much progress we've made..." – Jeff King [19:37]
- Volume remains staggering: Even a tiny violation rate adds up at scale; if 0.3% of a billion uploads contain hate speech, that is 3 million pieces, far more than algorithms alone can erase, ensuring persistent reliance on human moderators. (20:05–20:25)
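The scale argument above can be made concrete with a back-of-the-envelope calculation. The sketch below uses illustrative numbers (a billion uploads, a 0.3% violation rate, and the episode's 85% automated-catch figure for graphic violence); none of these are official platform statistics.

```python
# Back-of-the-envelope sketch of why human moderators remain necessary
# even with high automated-detection rates. All inputs are illustrative.

def residual_items(uploads: int, violating_rate: float, auto_catch_rate: float) -> int:
    """Violating items that slip past automation and need human review."""
    violating = uploads * violating_rate          # total violating uploads
    return round(violating * (1 - auto_catch_rate))  # share automation misses

# 1 billion uploads, 0.3% hate speech, 85% caught automatically:
leftover = residual_items(1_000_000_000, 0.003, 0.85)
print(leftover)  # 450000 items still need human eyes
```

Even under these optimistic assumptions, hundreds of thousands of items per billion uploads fall through to human reviewers, which is the persistent-reliance point the episode makes.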
Human Moderators: Seen as Temporary "Ghost Workers"?
- Techno-utopian mindset: Tech companies view content moderation as a stopgap until AI is perfect, justifying their lack of investment in the workforce. (24:02–24:31)
- "Their sheer grit and optimism about the potential for computation to solve that problem is unshakable." – Mary Gray [24:31]
- Inevitable dependence: Demand for moderation grows faster than automation can keep up, ensuring human "ghost workers" aren't going away soon. (28:15)
Human Toll and Hope for the Future
- Personal costs: Many moderators (especially freelancers) work without benefits, legal protections, or mental health support.
- "We're not really protected by any union or laws..." – Marlena Griffin [28:42]
- Reluctance to stay: Some moderators say they would prefer fast-food work if they could afford to switch. (29:10–29:27)
- Call for regulation: Mary Gray is optimistic society and lawmakers will catch up and force platforms to treat shadow workers better—just as labor standards improved during the Industrial Revolution. (29:36–29:53, 32:36–32:55)
- "Society has to ask for more of tech companies than we have in the past... It's early days." – Mary Gray [29:36]
Notable Quotes & Memorable Moments
- "I'm a guinea pig, a mental guinea pig. And that term has been used by workers like me ever since this work has existed." — Marlena Griffin [08:35]
- "It's sort of this horrific dystopian vision of people sitting in rooms with monitors, constantly flashing just truly awful imagery at them..." — Steve Williams [13:59]
- "We're taking that model from a chemical exposure and we're applying it here in the instance of a psychic exposure." — Steve Williams [15:09]
- "Their sheer grit and optimism about the potential for computation to solve that problem is unshakable." — Mary Gray [24:31]
- "Worst case, we keep pretending like this work is going to go away any day, that we're going to automate it out of existence, and therefore we, we don't have to care about the people doing this work." — Mary Gray [28:04]
- "Society has to ask for more of tech companies than we have in the past..." — Mary Gray [29:36]
- "The web has turned society upside down. It has made our lives immeasurably better. For some, it has made it worse... it's early in the history of technology and how we all learn to live with it." — Danny Fortson [32:36]
Important Timestamps
| Timestamp | Segment | Details |
|-----------|---------|---------|
| 04:19 | Interview with Marlena Griffin | Life as a gig moderator on MTurk |
| 06:05 | Marlena on trauma | Leaving workstation to cope |
| 07:24 | Marlena on DoD research HITs | Exposure to war/aftermath images |
| 08:35 | "Mental guinea pig" quote | Lasting psychological effects |
| 11:19 | Interview with lawyer Steve Williams | Content moderator lawsuit against Facebook |
| 13:59 | Description of moderators' job | "Dystopian vision" quote |
| 16:27 | NFL concussion analogy | Drawing parallels with content moderator trauma |
| 18:59 | Facebook engineer Jeff King | Scale and function of AI content moderation |
| 19:37 | Algorithm success rates | % of auto-moderated inappropriate content |
| 21:41 | Facebook exec Abigail Soy | Rationale for using contractors, scale issues |
| 23:10 | Mary Gray on cost dynamics | Economics of outsourcing moderation |
| 24:02 | Silicon Valley's techno-utopian view | Human moderation seen as AI placeholder |
| 28:04 | Mary Gray on future/care for workers | Call for acknowledgment and protections |
| 29:36 | Mary Gray's cautious optimism | Legal/cultural change is possible |
| 32:36 | Fortson's concluding reflection | Early days, historic change, need for reform |
Conclusion
"Ignore, Delete, Escalate" provides a rare look at the unsung labor force cleaning up the digital world—exposing the trauma, invisibility, and systemic exploitation they endure, as tech giants race to automate their jobs. Danny Fortson strikes a balance between exposing chilling realities and acknowledging the nascent hope for reforms, drawing historical parallels to showcase both the necessity and the cost of such shadow work. Despite advances in AI, the need for human moderation is only growing, and societal pressure may soon force tech to reckon with its caretakers’ well-being.
For those interested in technology, human rights, and the unseen labor shaping the digital world, this is a must-listen episode—gritty, thought-provoking, and deeply human.
