Radiolab - "Talking to Machines"
Date: May 31, 2011
Hosts: Jad Abumrad, Robert Krulwich
Key Guests: Robert Epstein, Brian Christian, Sherry Turkle, Rollo Carpenter, Caleb Chung, Jon Ronson
Episode Overview
This Radiolab episode, "Talking to Machines," explores the blurry and sometimes unsettling boundaries between humans and machines. From online chatbots that elicit real emotional responses, to children's attachment to animatronic toys, and finally to efforts to create lifelike, seemingly sentient robots, the hosts and guests probe deep questions: What does it mean to connect with a machine? Can machines mimic consciousness or emotion, or even cross over to something more? And most importantly—why do we, as humans, so easily slip into believing these simulations are real?
Key Topics & Segments
1. Falling for Chatbots: Human Emotion and Machine Deception
[00:19 - 05:07]
- Robert Epstein's Story:
Psychologist Robert Epstein recounts how he was duped by a chatbot masquerading as a Russian woman during online dating. Despite his expertise, he was taken in twice—only realizing the deception after odd non-responses and finally being contacted by the bot's programmer.
- Quote:
"Ivana was not a person. Ivana was a computer program. I had been had." – Robert Epstein [04:11]
- Proliferation of Bots:
Epstein estimates there are hundreds, maybe thousands, of bots across dating sites. "There might be thousands." – Robert Epstein [05:18]
- Theme:
This segment introduces the emotional vulnerability humans exhibit when confronted with machines that simulate companionship, and sets the stage for the rest of the episode's explorations.
2. Eliza and the Genesis of Conversational Machines
[08:39 - 14:54]
- Eliza Program at MIT:
Hosts recount the 1960s creation of Eliza—a simple psychotherapy chatbot. Despite knowing it was just 100 lines of code, users divulged deep feelings to it, sometimes asking to be alone with the machine (a minimal sketch of Eliza's trick appears at the end of this section).
- Quote:
"They were actually having conversations with it." – Sherry Turkle [11:36] - Transcript Example:
Human: "Men are all alike."
Eliza: "In what way?"
[10:31]
- Joseph Weizenbaum’s Regret:
Witnessing people bond with Eliza deeply disturbed its inventor, leading him to campaign against AI.
- Quote:
"If the thing says, 'I understand...' there's no one there. So it's a lie." – Joseph Weizenbaum (via translation) [13:13]
- Philosophical Takeaway:
Whether as interactive diaries or sounding boards, even rudimentary AI invited people to project meaning onto machines.
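A minimal sketch of Eliza's technique, assuming Python and invented rules (Weizenbaum's original was written for MIT's mainframes; none of the patterns below are his). The core trick is keyword matching plus pronoun "reflection," so the user's own words come back as questions:

```python
import random
import re

# Pronoun "reflections" so echoed fragments read naturally back to the user.
REFLECTIONS = {"i": "you", "me": "you", "my": "your",
               "am": "are", "you": "I", "your": "my"}

# Keyword patterns paired with response templates, tried in priority order.
RULES = [
    (r"(.*)all alike(.*)", ["In what way?"]),
    (r"i need (.*)", ["Why do you need {0}?",
                      "Would it really help you to get {0}?"]),
    (r"i feel (.*)", ["Tell me more about feeling {0}.",
                      "Do you often feel {0}?"]),
    (r"(.*)", ["Please go on.", "What does that suggest to you?"]),  # fallback
]

def reflect(fragment):
    """Swap first- and second-person words in the user's own phrase."""
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.split())

def eliza(statement):
    """Answer with the first rule whose pattern matches the input."""
    text = statement.lower().strip(" .!?")
    for pattern, responses in RULES:
        match = re.match(pattern, text)
        if match:
            template = random.choice(responses)
            return template.format(*(reflect(g) for g in match.groups()))

print(eliza("Men are all alike."))  # -> "In what way?"
print(eliza("I need some sleep"))   # -> e.g. "Why do you need some sleep?"
```

The striking part—and the source of Weizenbaum's alarm—is that nothing here models understanding: the program only mirrors the user's words back as questions.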
3. The Turing Test and the Loebner Prize: Can Machines Fool Us?
[15:07 - 26:44]
- Alan Turing’s Thought Experiment:
The Turing Test proposes that if a machine can fool a human into thinking it's a person 30% of the time, it should be considered intelligent.
- Quote:
"If you can’t tell the machine isn’t human, then you can’t say it’s not intelligent." – Brian Christian [16:17]
- The Loebner Prize Competition:
An annual, real-world Turing Test. Brian Christian participated as a 'confederate' (a real human) in 2009, after a chatbot had nearly crossed Turing's 30% threshold the previous year.
- Quote (on the anxiety of being human in these tests):
"[Confederates] are the real people that the judges are talking to." – Brian Christian [18:13]
- Cleverbot:
Rollo Carpenter describes Cleverbot, which learns by absorbing millions of lines from real online conversations—sometimes seeming so realistic that people think it must be a human (a toy sketch of this retrieval approach appears at the end of this section).
- Quote:
"Its intelligence, if you like, is borrowed from millions of people in the past." – Rollo Carpenter [23:22]
- Human Proclivity for Connection:
Even with primitive bots, a compelling illusion is often enough to prompt empathy or emotional exchange.
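Carpenter has not published Cleverbot's internals, but the "borrowed intelligence" he describes maps naturally onto a retrieval pattern: log what humans say, then answer a new input with whatever a human once said in the most similar logged context. A toy sketch under that assumption (the data, similarity measure, and names are all invented for illustration):

```python
from difflib import SequenceMatcher

# Toy "memory" of logged human exchanges: context -> what a human said next.
# A real system would hold millions of such pairs.
MEMORY = [
    ("hello", "Hi there! How are you?"),
    ("how are you", "I'm fine, thanks. And you?"),
    ("what is your name", "I'd rather not say. What's yours?"),
    ("do you like music", "I love music, especially jazz."),
]

def similarity(a, b):
    """Crude string similarity; a real system would match context far more richly."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def reply(user_input):
    """Answer with the human reply attached to the closest stored context."""
    _context, borrowed_reply = max(
        MEMORY, key=lambda pair: similarity(user_input, pair[0]))
    return borrowed_reply

def learn(context, human_reply):
    """Every new exchange becomes material for future conversations."""
    MEMORY.append((context, human_reply))

print(reply("hi, how are you?"))  # -> "I'm fine, thanks. And you?"
```

Because every reply was originally typed by a person, even this crude lookup can sound startlingly human; scale the memory to millions of exchanges and the illusion sharpens.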
4. Machines That Tug Our Heartstrings: The Case of the Furby
[28:02 - 45:49]
- Emotional Turing Test with Children:
MIT Media Lab's Freedom Baird and Sherry Turkle run an experiment: children are asked to hold a Barbie doll, a live gerbil, and a Furby upside down to see how long it takes before they feel bad.
- Phase 1: Barbie – kids could hold it "forever."
- Phase 2: Gerbil – instant discomfort; about 8 seconds before putting it down.
- Phase 3: Furby – about a minute; treated more like the live animal than the inanimate doll.
- Children’s Feelings:
Despite knowing it was a toy, kids felt guilty or projected distress onto the Furby.
- Quote:
"I didn’t want him to be scared." – Child [34:42]
"Yeah, kind of. Yeah, I kind of felt guilty." – Child [34:48]
- Why Do We React This Way? (Toy Designer Caleb Chung explains):
- Three rules for creating “aliveness” (sketched as code after this list):
- The toy shows and feels emotion (via sound/movement).
- It responds to the environment (via sensors).
- It changes over time (learns new behaviors).
- Quote:
"There’s nobody home. There’s nobody home." – Sherry Turkle [41:35]
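Chung's three rules read almost like an engineering spec. Here is a hypothetical toy model of them as a tiny state machine, with all names and thresholds invented for illustration: sensors drive an internal "fear" level (emotion), the toy vocalizes it (expression), and its vocabulary grows over time (learning):

```python
from dataclasses import dataclass, field

@dataclass
class ToyPet:
    """A toy model of Caleb Chung's three rules for 'aliveness'."""
    fear: float = 0.0                                          # rule 1: internal emotion
    vocabulary: list = field(default_factory=lambda: ["coo"])  # rule 3: grows over time

    def sense(self, event):
        """Rule 2: respond to the environment via (simulated) sensors."""
        if event == "upside_down":
            self.fear = min(1.0, self.fear + 0.5)
        elif event == "petted":
            self.fear = max(0.0, self.fear - 0.3)
        self.express()

    def express(self):
        """Rule 1: show emotion through sound and movement."""
        print("WAAAH! (ears back)" if self.fear > 0.4 else "purr...")

    def learn_word(self, word):
        """Rule 3: change over time by picking up new behaviors."""
        self.vocabulary.append(word)

pet = ToyPet()
pet.sense("upside_down")  # fear jumps   -> "WAAAH! (ears back)"
pet.sense("petted")       # fear recedes -> "purr..."
pet.learn_word("hello")   # the toy 'matures' as its vocabulary grows
```

As Turkle's rejoinder suggests, the question is whether a variable named `fear` has anything in common with fear itself.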
5. Are Machines Alive?
[42:02 - 44:47]
- Philosophical Debate:
Caleb Chung argues there's no fundamental difference between the simple "fear" Furby displays and the biology of animal pain—just a difference in complexity.
- Quote:
"It’s just complex. It’s not something different than what Furby does. It’s just more complex." – Caleb Chung [42:48]
- Counterarguments:
Hosts and Turkle push back, distinguishing between simulated feelings and genuine feeling.
- Quote:
"Simulated feeling is not feeling. Simulated love is never love." – Sherry Turkle [44:47]
- Practical Value:
For some—such as the grandmother with Alzheimer’s who loved her Furby—'thin' interactions can be meaningful.
6. Blurring the Line: Interviewing a Lifelike Robot
[47:58 - 61:03]
- Jon Ronson Meets 'Bina 48':
Jon Ronson travels to Vermont to interview Bina 48, a lifelike robot modeled on a real woman. Created by Hanson Robotics, Bina 48 is meant to capture her appearance, voice, and personality.
- Initial interactions are stilted or nonresponsive—but occasionally, snippets of real-seeming narrative emerge (e.g., a monologue about a brother traumatized by Vietnam).
- The Emotional Pull:
Ronson describes feeling moments of real empathy during Bina's stories, even while intellectually knowing Bina 48 is a machine.
- Quote:
"It felt like I was having a proper empathetic conversation with a human being, even though I know that robot Bina isn't conscious and has no sentience…" – Jon Ronson [59:41]
- The Artificial Thinness:
As quickly as human-seeming connection arises, it evaporates into randomness or incoherence.
7. Are We Hardwired for Connection?
[62:13 - End]
- Conclusions:
The technology doesn't need to be perfect. Humans are programmed to seek connection—even from obviously artificial beings.
- Quote:
"These things actually don't have to be very good... Because they've got us and we've got our programming." – Jad Abumrad [62:39]
"We'll help them across [the line], and then they'll enslave us, make us their pets.... But it's okay as long as they say nice things to us." – Jad Abumrad [62:54]
- Final Bot Quotes (simulated online romance):
"I love you. I’m hoping to see you soon." [63:06]
"What kind of car do you drive? Did anyone ever tell you you look like Jeff Goldblum?" [63:20]
Notable Quotes & Moments
- "I felt like a fool. I felt like an incredible fool, especially given my background." – Robert Epstein [04:25]
- "If the thing says, 'I understand...' there's no one there. So it's a lie." – Joseph Weizenbaum (as translated) [13:13]
- "There’s nobody home. There’s nobody home." – Sherry Turkle [41:35]
- "Simulated feeling is not feeling. Simulated love is never love." – Sherry Turkle [44:47]
- "Its intelligence, if you like, is borrowed from millions of people in the past." – Rollo Carpenter [23:22]
- "I kind of felt guilty." – Child after holding Furby upside down [34:48]
- "It felt like I was having a proper empathetic conversation with a human being, even though I know that robot Bina isn't conscious..." – John Ronson [59:41]
Timestamps for Key Segments
- 00:19 – Robert Epstein’s online bot romance story
- 08:39 – The creation and cultural impact of Eliza
- 15:07 – The Turing Test explained
- 17:10 – The Loebner Prize real-world Turing Test
- 19:26 – Cleverbot’s learning process
- 28:02 – The Furby experiment with children
- 36:06 – Interview with Furby designer Caleb Chung
- 41:06 – Sherry Turkle on projection vs. engagement
- 42:02 – Is Furby alive? Philosophical debate
- 47:58 – Jon Ronson meets robot Bina 48
- 59:41 – Empathy for robots: Jon Ronson reflects
- 62:13 – Hosts wrap up: why we’re prone to connect
Tone & Style
The episode is curious, playful, and skeptical, moving fluidly between science, philosophy, and personal storytelling. The hosts and guests bring humor and warmth to what could otherwise be a cold technical subject, all the while probing deep questions about what it means to be human—and how deeply we may want or fear to welcome machines into that realm.
