The New Yorker Radio Hour
Episode: "A Reckoning at Facebook"
Date: February 16, 2018
Host: David Remnick
Main Guest: Nicholas Thompson (Editor-in-Chief, Wired)
Brief Overview
This episode explores Facebook’s internal crisis following revelations about its role in the 2016 U.S. presidential election and the subsequent exploitation of its platform by Russian operatives to spread misinformation. David Remnick interviews Nicholas Thompson, co-author of Wired’s cover story “Inside the Two Years that Shook Facebook,” unpacking Facebook’s response, Mark Zuckerberg’s evolution, and the enormous ethical and political challenges facing the tech giant. The episode also briefly touches on topics of corporate criticism, Silicon Valley culture, and the influence of media moguls like Rupert Murdoch.
Key Discussion Points and Insights
1. Facebook’s 2016 Election Crisis
- Context: Facebook, Twitter, and Google were summoned to testify before Congress about Russian interference in the 2016 election via social media manipulation.
- “You had a foreign government apparently buying thousands of dollars worth of advertising to create discontent and discord in the 2016 election.” (00:35)
- The hearings are described not as fact-finding exercises but as attempts to establish how far the tech companies would admit culpability.
2. Zuckerberg as Lennie – Inability to Grasp His Own Power
- Thompson relays a telling comparison by a Facebook employee likening Mark Zuckerberg to Lennie from Of Mice and Men:
- “He's like Lennie, the farm worker who's just too strong, who kills things because he doesn't know his own power.” (02:17, Nicholas Thompson)
3. Silicon Valley’s Self-Image versus Responsibility
- Contrasts the self-deprecating, “nice” ethos of Silicon Valley with the far-reaching consequences of their platforms.
- Thompson: “They saw all the fake news spreading that was helping Trump... The election happens and it's this moment of, 'Oh my God, did we do that?'” (03:10)
4. How Facebook Got Exploited
- Thompson details two key mechanisms:
- Filter Bubbles: Users see increasingly extreme versions of content that reinforce their beliefs.
- Outrage Amplification: The algorithm favors sensational or partisan content.
- “If you write a story and it says 'Trump is a monster' or 'Hillary is the worst,' it's going to get shared much more...because of how the algorithm works.” (04:21)
- The platform allowed for the proliferation of fake news and foreign interference (e.g., Russian operatives, “kids in Montenegro”).
5. Facebook as Platform vs. Publisher
- Facebook refused to take responsibility for verifying information because they saw themselves as a mere platform, not as editors or publishers.
- “A story from a publication that some kid far away makes up...more or less looks the same [as The New Yorker].” (05:50, Nicholas Thompson)
- After being accused of anti-conservative bias in 2016, Facebook became hesitant to intervene against misinformation for fear of regulatory backlash and appearing biased.
6. Zuckerberg’s Miscalculation and Slow Evolution
- Zuckerberg initially dismissed the idea that Facebook influenced the election, calling it “pretty crazy.”
- “He’s very analytical, looks at the numbers and decides that couldn’t really have had any influence.” (07:45, Thompson)
- Internal dissent: Employees feared his public tone could push Facebook “down the pariah path that Uber was on.”
- “They were worried Zuckerberg was 'gonna take Facebook down the pariah path that Uber was on and they had to flip him.'” (07:59, Thompson)
7. Employee Pressure and Internal Rift
- Growing unease among engineers and employees threatens the company’s ability to attract talent.
- “If the meme starts to spread, Facebook is making the world worse... then Facebook's in really big trouble.” (09:12)
8. Insiders and Outsiders: Who Facebook Listens To
- Facebook dismisses most external critics as not worth listening to, with one exception: Rupert Murdoch, who commands both respect and fear because of his power and his willingness to use it.
- “They don’t listen to journalists... with one exception, Mr. Murdoch…” (10:18)
- Detailed account of Murdoch’s private confrontations with Facebook and Zuckerberg’s “startled respect.” (11:47–13:14)
9. Facebook’s Slow Recognition of Russian Operations
- Facebook did not become aware of the scope of Russian propaganda—specifically the Internet Research Agency—until mid-2017, well after the election.
- “The stuff that had a huge influence on America the previous year... they didn't know that until there was a story in Time magazine in May of '17.” (14:12)
- Once discovered, Facebook tried to minimize the problem and only began acknowledging its seriousness after Congressional pressure.
10. The “Education” of Mark Zuckerberg
- Zuckerberg undergoes a slow, “hard education” in accountability.
- “I think that starting then, after that pretty crazy comment, begins the education of Mark Zuckerberg...” (08:44, Thompson)
- His public New Year’s resolution for 2018: “I’m going to fix Facebook.”
- “He got mocked… but I read it and thought, this guy’s really taking it to heart.” (16:09)
11. Facebook’s Recent Changes & Prospects
- Deprioritizing public and publisher content (“pages”) in favor of content from friends and individuals.
- Leads to lower overall news engagement, but possibly more engagement with “high-quality news.” (17:11–17:25)
- Cautious optimism from Thompson that Facebook is, slowly, improving.
- “I'm happier about Facebook than I was three months ago.” (17:34)
- The real test: Will Facebook continue amplifying extreme, partisan content in coming elections?
Notable Quotes & Memorable Moments
- On Zuckerberg as Lennie:
“It’s like Lennie, the farm worker who’s just too strong, who kills things because he doesn’t know his own power.”
– Nicholas Thompson (02:17)
- On Silicon Valley’s self-image:
“People in Silicon Valley...like to think of themselves as good and nice and innocent.”
– David Remnick (02:51)
- On Facebook’s core design choices:
“They would make everything in News Feed look the same...they did it for what they thought were good reasons, and it led to these catastrophic consequences.”
– Nicholas Thompson (05:50, 06:26)
- On Murdoch’s influence:
“Murdoch was skilled in the dark arts... Zuckerberg goes back in a panic. And that’s one of the things that starts Facebook to...reconsider its relationship to the news industry.”
– Nicholas Thompson (11:01–13:14)
- On Zuckerberg’s “education”:
“It’s a long education and it doesn’t come overnight.”
– David Remnick (08:57)
- On Facebook’s responsibility:
“If the meme starts to spread, Facebook is making the world worse...then Facebook’s in really big trouble.”
– Nicholas Thompson (09:12)
- On Zuckerberg’s humility:
“He got mocked, right...but I read it and thought, this guy’s really taking it to heart.”
– Nicholas Thompson (16:09)
Important Timestamps
- 00:35 – Introduction to the episode’s focus: Congressional hearings on tech and Russia
- 02:13 – Zuckerberg as Lennie: the pivotal analogy
- 04:21 – Filter bubbles and outrage amplification explained
- 05:50 – Facebook’s design decision: platform vs. publisher
- 07:45 – Zuckerberg’s “pretty crazy” dismissal of Facebook’s election impact
- 09:12 – Internal employee pressure and the threat to Facebook’s talent pool
- 10:48 – Facebook’s selective listening: the exception of Rupert Murdoch
- 11:47 – Murdoch confronts Facebook; detailed account
- 14:12 – Facebook’s late realization of Russian propaganda
- 16:09 – Zuckerberg’s New Year’s resolution: “fix Facebook”
- 17:34 – Recent changes to Facebook and cautious optimism
- 17:56 – The coming test: midterm elections and the platform’s influence
- 18:34 – Acknowledgment of the “chicken and egg” problem in political polarization
Tone and Style Highlights
- The conversation is measured but urgent—Remnick’s style is probing, analytical, with moments of dry humor (“no longer just a nice guy without a tie”).
- Thompson offers detailed, methodical explanations with a tone mixing professional detachment and genuine worry.
- The piece is suffused with skepticism toward tech boosterism and thoughtful concern about the consequences of unchecked power.
For Listeners New to the Episode
This episode provides a deep dive into Facebook’s existential crisis after the 2016 election. It unpacks not only the technical reasons the platform was susceptible to manipulation, but also the organizational blind spots and leadership failures that allowed the crisis to unfold. It concludes with hope that lessons are being learned, albeit slowly, and leaves open the question: can Facebook really change, or is its DNA fundamentally flawed?
For full context, the episode continues with a lighter segment on basketball from 20:00, but the main theme—Facebook’s reckoning—concludes before then.
