Radiolab: "The Trust Engineers" (February 10, 2015) – Episode Summary
Main Theme & Purpose
This episode of Radiolab dives into the inner workings of Facebook’s “Trust Engineering” team (later renamed “Protect and Care”), focusing on their attempts to make conflict resolution and emotional communication smoother on the platform. The story traces how Facebook responded to the ever-growing number of reported photos by building new tools—based on behavioral science and language analysis—to nudge users toward more understanding, empathetic digital interactions. The episode explores both the promise and the profound ethical questions of using “trust engineering” at unprecedented social scales.
Key Discussion Points & Insights
1. Facebook’s Scale and Problem of Reports
- Facebook’s Massive Presence: As of March 2014, Facebook had 1.3 billion monthly active users, more than the number of Catholics globally, leading Jad to exclaim, “That can’t be true!” (03:07)
- Photo Reports Explosion: In the week after Christmas 2011, more images were uploaded to Facebook than had ever been uploaded to Flickr in total (04:11). With that flood of photos came millions of user “reports” flagging content.
2. The Mismatch Between User Complaints and Actual Violations
- 97% Misreported: Most photo reports were mismatches, e.g., family photos flagged as harassment, puppy pictures flagged as hate speech (05:58), leading to confusion:
“Pictures of puppies reported for hate speech.” – Jad Abumrad (06:20)
- Emotional, Not Policy-Driven, Complaints: Most reporters were people appearing in the photos who simply disliked the image—exes, embarrassing party pics, even sadness over shared pets:
“It was maybe a shared puppy…maybe it’s your ex-wife’s puppy.” – Arturo Bejar (07:10)
3. Investigating Emotion and Language in Reports
- Adding Emotional Context: A new “How does this photo make you feel?” box yielded more accurate information. Users could select pre-written emotional states such as “embarrassing,” yet even with that option available, many wrote “it’s embarrassing” in their own words (08:30).
- Changing the phrasing to “it’s embarrassing” increased usage by 28%—a subtle linguistic cue shifting responsibility from self to subject:
“It’s always good to mirror the way people talk… when you just say ‘embarrassing’… it’s silently implied that you are embarrassing. But if you say ‘it’s embarrassing’, well, then that shifts the sort of emotional energy…” – Jad Abumrad (09:18)
4. Engineering Conflict Resolution: Prompts and Humanization
- Initiating Dialogue: Facebook wanted users in conflict to talk to each other, but when prompted with a blank message box, only 20% actually sent a message (11:03); a rough sketch of how such prompt comparisons can be evaluated appears after this list.
- By providing a default, pre-written message (“Hey, I didn’t like this photo. Take it down.”), the response rate soared to 50% (11:39).
“We weren’t expecting to see that big of a shift.” – Arturo Bejar (11:44)
- Including the recipient’s name (“Hey Robert, I didn’t like this photo…”) added a further 7% improvement (12:22).
- Softer phrasing (“Would you please take it down?”) outperformed the other variants tested (“Would you mind…”, “Sorry to bring this up…”), and adding “sorry” actually reduced positive outcomes:
“Turns out the ‘I’m sorry’ doesn’t actually help. It makes the numbers go down.” – Arturo Bejar (15:03)
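The prompt experiments above are, in effect, A/B tests: show different message variants to random slices of users and compare send rates. The episode does not describe Facebook’s actual statistical tooling, so the Python sketch below is only a hypothetical illustration of how such a comparison might be checked, with made-up counts standing in for the 20% and 50% figures.

```python
from math import sqrt

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Z-statistic comparing two conversion rates using a pooled standard error."""
    p_a = successes_a / n_a
    p_b = successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical counts: 20% of users sent a message from a blank box,
# 50% sent the pre-written "Hey, I didn't like this photo..." prompt.
z = two_proportion_z(successes_a=2_000, n_a=10_000,   # blank box
                     successes_b=5_000, n_b=10_000)   # pre-filled prompt
print(f"z = {z:.1f}")  # far above ~1.96, so the lift is very unlikely to be noise
```

At Facebook’s sample sizes even small differences clear the usual significance bar quickly, which helps explain how effects like the 7% name-personalization bump can be measured at all.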
5. The Science of Nudging and Ethical Dilemmas
- Massive Social Experiments: Facebook routinely runs experiments on users – “any given person is probably currently involved in… 10 different experiments.” (17:14) A sketch of how such overlapping experiment assignment can work appears after this list.
“I’ve been a research subject and I had no idea.” – Andrew Zolli (17:25)
- The “Emotional Contagion” Controversy: A public outcry followed news that Facebook had tweaked users’ news feeds to study how emotions spread online, affecting roughly 700,000 users (19:16). The criticism centered on consent and manipulation.
“There is a power imbalance at work…highly centralized power and highly opaque power at work.” – Kate Crawford (20:50)
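The claim that a typical user sits in roughly ten experiments at once is easier to picture with a concrete assignment scheme. The episode does not explain Facebook’s infrastructure; the hashing approach below is a common, hypothetical pattern for deterministically placing the same user into many independent experiments at the same time.

```python
import hashlib

def assign(user_id: str, experiment: str, variants=("control", "treatment")) -> str:
    """Deterministically map a user to a variant, independently for each experiment."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Hypothetical experiment names, for illustration only.
for exp in ["photo_report_wording", "prefilled_message", "vote_banner"]:
    print(exp, "->", assign("user_12345", exp))
```

Because the hash mixes the experiment name with the user ID, assignments are effectively independent across experiments, so one person can land in the control group of one test and the treatment group of another with no coordination required.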
6. Facebook’s Power to Shape Social and Civic Behavior
- 2010 “Get Out the Vote” Experiment: Facebook increased voter turnout by an estimated 340,000 people via social nudges (21:50), showing real-world consequences and prompting concerns about democratic influence (a back-of-the-envelope scale sketch appears after this list):
“That is a profound democratic power that you have.” – Kate Crawford (22:58)
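The 340,000 figure is an aggregate estimate rather than a count of individually identified voters. The arithmetic below uses entirely made-up numbers (neither appears in the episode) to show how a fraction-of-a-percentage-point nudge per user scales to hundreds of thousands of votes once the audience reaches tens of millions.

```python
# Hypothetical illustration only; both inputs are invented for scale.
exposed_users = 60_000_000     # users shown a get-out-the-vote prompt
per_user_lift = 0.0057         # 0.57 percentage-point bump in turnout probability
extra_votes = exposed_users * per_user_lift
print(f"~{extra_votes:,.0f} additional votes")  # ~342,000
```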
7. Debate: Engineering Compassion or Overstepping?
- Critics’ Concerns: Some, like Kate Crawford, argue against the “hubris” of engineering emotional or moral outcomes at massive scale (23:28), warning about undemocratic influence.
- Defensive Perspective: Arturo Bejar argues that these interventions are a response to real requests for help (“all of the work…begins with a person asking us for help” [25:30]), not arbitrary manipulation.
8. Digital Communication, Social Norms, and Analogy to Other Technologies
- Missing Non-Verbal Cues: Online, lacking body language or tone creates new challenges in conflict and empathy (26:03).
- “Automobile Age” Analogy: Like traffic systems, digital interaction needs new “tools” and scripts to avoid crashes and enable coexistence (30:22).
“We created turn signals so we can coexist in this great flow without crashing into each other.” – Andrew Zolli (30:43)
Notable Quotes & Memorable Moments
- “Puppies reported as hate speech.” – Jad Abumrad (06:20)
- “It’s always good to mirror the way people talk.” – Jad Abumrad (09:18)
- “Turns out the ‘I’m sorry’ doesn’t actually help. It makes the numbers go down.” – Arturo Bejar (15:03)
- “I’ve been a research subject and I had no idea.” – Andrew Zolli (17:25)
- “There is a power imbalance at work. Highly centralized power and highly opaque power at work.” – Kate Crawford (20:50)
- “That is a profound democratic power that you have.” – Kate Crawford (22:58)
- “Who are we to decide whether we can make somebody more compassionate or not?” – Kate Crawford (23:28)
- “We really care about the people who use Facebook…you really have to respect people’s response and emotions, no matter what they are.” – Arturo Bejar (24:11)
- “In the absence of [nonverbal] feedback, how do we communicate? What does communication turn into?” – Arturo Bejar (26:03)
- “We created turn signals…so we can coexist in this great flow without crashing into each other.” – Andrew Zolli (30:43)
- “Maybe they’re conversation starters. Maybe that would be a beginning.” – Jad Abumrad (30:13)
Timestamps for Important Segments
- 01:39: Beginning the story; introducing Facebook’s dilemma
- 04:11: Data: More post-Christmas photos on Facebook than all of Flickr ever
- 05:58: Discovery of mass miscategorization in photo reports
- 07:10: User motives: relationship and emotional fallout drives reports
- 08:30: Nuances of emotional language—“It’s embarrassing” effect
- 09:42: Linguistic psychology in interface design
- 11:03: Blank text box doesn’t work; prewritten messages do
- 12:22: Impact of personalization (“Hey Robert…”)
- 14:00: Weekly “trust engineer” meetings and the power of data at scale
- 17:14: Any given user is likely part of roughly 10 simultaneous Facebook experiments
- 18:59: Public backlash to Facebook’s “emotional contagion” experiment
- 21:50: Facebook boosts turnout by an estimated 340,000 votes with a simple UI tweak
- 23:28: Critics challenge the idea of engineering compassion
- 25:30: Arturo Bejar defends user-request-oriented interventions
- 26:03: Communication without nonverbal cues online
- 30:22: The analogy: Building social “road rules” for digital life
Overall Flow and Tone
- The hosts blend curiosity, skepticism, and humor.
- Arturo Bejar is depicted as earnest, thoughtful, and nuanced in his mission to make digital communication healthier.
- Robert Krulwich frequently acts as the skeptic, questioning whether “engineering compassion” is possible or desirable.
- Kate Crawford provides a sharp critical perspective, warning of the risks in private companies quietly shaping democracy and emotional life.
- The conversation moves with pace and clarity, using real-world analogies and tangible data for illustration, while retaining Radiolab's signature storytelling vibrancy.
Usefulness for Non-listeners
If you’ve never heard this episode, this summary captures:
- The fascinating “invisible” work done behind the scenes of social platforms,
- The unexpected psychological subtleties of online communication,
- The potent, sometimes unsettling reach of private tech companies into the fabric of everyday emotions, relationships, and even democracy.
You’ll get both a look at how Facebook’s engineers nudge user behavior and a sharp discussion of the broader implications of those nudges, from the language of kindness to the ethics of massive-scale A/B testing on a digital population.
For more context and related research, visit radiolab.org.
