Podcast Summary
Is Business Broken?
Episode: Would You Trust a Self-Driving Car?
Release Date: March 26, 2026
Host: Kurt Nickish (Questrom School of Business)
Guests:
- Carey Morewitz – Professor of Marketing, BU Questrom, expert in psychology of technology
- Bashak Erzer – Entrepreneur, board director, former chief product officer at Motional
- Mark Schieldrop – Spokesperson for AAA Northeast
- Mike Vardabedian – International Association of Machinists District 15, representing app-based drivers
Episode Overview
This episode explores the societal, psychological, and economic questions surrounding the adoption of self-driving cars. With autonomous vehicle tests expanding in cities like Boston, the discussion focuses on what drives our trust or fear, how data and lived experience shape public opinion, and the potential economic and regulatory impacts—especially for drivers and cities.
Key Discussion Points & Insights
1. Public Perception and Trust in Self-Driving Cars
Timestamps: 00:15 – 07:10
- Mixed Reactions: Bostonians express skepticism, fear, and cautious curiosity about riding in robo-taxis, with local driving conditions amplifying concern.
- “No. I would not ride it if it came to Boston because driving in Boston is already difficult as it is.” [00:30 C]
- “I think I'd wait a little bit just to get a gauge of whether there's any accidents in the first few months.” [00:51 D]
- Some early adopters found it “kind of surreal” but “felt pretty safe in it.” [00:59 C]
- Value Propositions: Cost and convenience are major drivers for potential adoption. Trust in the technology, and in the company behind it, is fundamental—sometimes outweighing safety statistics.
2. Psychology of Control: Why Are We Hesitant?
Timestamps: 03:00 – 07:10
- Loss of Control:
- “There is a psychology of adopting a new technology...we're giving up control as a species…and as a person.” [03:13 E – Bashak Erzer]
- Familiarity Reduces Fear:
- Human acceptance grows with experience and visual cues from vehicles, like sharing what sensors “see.” [04:09 G]
- “What people want is predictability...if there's a jerky moment…the passenger wants to know why.” [04:42 G]
- The Safe but “Boring” Future:
- “If the AV driving feels boring, that's when we won. That's when adoption starts.” [12:18 G]
3. Statistical Safety vs. Personal Bias
Timestamps: 07:14 – 08:54
- Safer by Numbers:
- “Waymo...looked at 127 million miles…and found...90% fewer crashes with a serious injury or death...81% fewer crashes with any injury.” [05:31 E – Carey Morewitz]
- The “Hypocrite” Effect:
- “People think that would be great for other drivers, but not for themselves. It's because they trust themselves and think they're safer drivers than other people.” [06:41 E]
4. Technical Challenges & ‘Common Sense’ for AVs
Timestamps: 07:15 – 12:27
- Lost Communication:
- “When you remove the driver...what you're removing is common sense.” (explained: subtle road communication, eye contact, hand gestures). [08:08 G]
- AVs are programmed to act “deliberate, slower than a human driven vehicle, but always predictable.” [09:13 G]
- Near Misses Are More Informative Than Accidents:
- “Tech companies can offer...not accidents, but near misses.” [09:55 G]
- AVs and Rule-Breaking:
- “Much of human behavior is illegal...friction between designing a system that is legal and...integrate with human driving.” [10:54 E]
5. Regulatory Landscape and Economic Implications
Timestamps: 15:37 – 25:14
- Boston as a Test Case:
- Waymo’s entry into Boston’s tough environment (narrow streets, snow) triggers legislative debates: full autonomy vs. mandated human operator. [17:22 B]
- Union & Labor Perspectives:
- “When the Waymo gets confused, its first thing is to just stop right in the middle of traffic...Having a live person in that driver's seat...is necessary to prevent gridlock, safety, and so on.” [18:05 H – Mike Vardabedian]
- Labor advocates also stress the risk of driver jobs shifting overseas and automation reducing middle-class opportunities. [19:10 H, 25:14 H]
- The Education Gap:
- “6 in 10 still say they're too afraid to ride in one.” (AAA survey) [19:32 F – Mark Schieldrop]
- Public confusion between consumer car automation (“Autopilot” branding) and actual robo-taxis. [20:40 F]
6. Trust-Building: Exposure, Regulation, and Lived Experience
Timestamps: 12:27 – 16:09
- “It’s a combination” of exposure, trust-building with individuals and cities, stepwise expansion, and fit-for-purpose use cases (like public/campus transport, last-mile delivery, fixed routes). [13:01 G]
- Regulatory tension: weighing safety, economic risk/benefit, and societal readiness.
- Carey: “It’s asking people to make a really big behavioral change about something they do every day...it's probably a much slower process.” [14:47 E]
7. Societal Trade-Offs and Standards of Comparison
Timestamps: 22:34 – 24:30
- Cognitive Offloading:
- As AVs take on cognitive tasks (“leave your thinking to us”), people may lose skill and awareness, with safety implications if systems fail. [22:30 D]
- What Is Acceptable Risk?
- “What's the standard of comparison...failures will happen. People also make human errors...How do we balance the kinds of mistakes that machines will make with the kinds that will happen by a person?” [23:55 D]
8. The Future: Risk, Opportunity, and the Need for Deliberation
Timestamps: 25:05 – 28:06
- Will automation deliver new, equivalent jobs? Guests are skeptical, especially about local economic benefit when technical work shifts overseas and AI reduces human monitoring needs. [25:14 H]
- Disability access, economic impacts, and “deep study” are called for as AV adoption may upend labor markets and society structures. [26:51 H]
Notable Quotes & Memorable Moments
- “If the AV driving feels boring, that's when we won. That's when adoption starts.” – Bashak Erzer [12:18]
- “People think that would be great for other drivers, but not for themselves. It's because they trust themselves and think they're safer drivers than other people.” – Carey Morewitz [06:41]
- “When you remove the driver...what you're removing is common sense.” – Bashak Erzer [08:08]
- “Much of human behavior is illegal...friction between designing a system that is legal and...integrate with human driving.” – Carey Morewitz [10:54]
- “It's more than just leave the driving to us, it's leave your thinking to us.” – Kurt Nickish [22:30]
- “If the person who's overseeing the autonomous vehicle...is thousands of miles away, are they going to have time to react...?” – Carey Morewitz [23:21]
- “6 in 10 still say they're too afraid to ride in one.” – Mark Schieldrop [19:32]
Important Timestamps
- Bostonian Reactions – [00:30–01:42]
- Expert Introductions & Problem Framing – [01:56–03:13]
- Psychology of Control Explained – [03:13–04:09]
- Visual Cues & Trust-Building in AVs – [04:09–05:15]
- Statistical Safety of AVs – [05:31–06:44]
- Communication Challenges for AVs – [08:08–09:13]
- Rule-Breaking & Law Compliance – [10:54–11:41]
- Boring Is Better (Trust via Predictability) – [12:18]
- Regulatory & Economic Debates (Legislation, Labor, Oversight) – [17:22–21:19]
- AVs, Offloading, & Cognitive Risks – [22:30–24:30]
- Societal Futures & Labor Commentary – [25:14–28:06]
Tone & Style Reflection
The conversation blends curiosity, concern, and practicality, capturing optimism about technological potential alongside a sober awareness of societal friction and complexity—mirroring Boston's appetite for blunt, evidence-driven discussion. Both host and guests interweave technical nuance with vivid real-life scenarios throughout.
Conclusion
The episode frames the self-driving car debate as more than a technological challenge: it is a matter of public trust, psychological barriers, regulatory negotiation, and economic transformation. Widespread adoption, the guests agree, will require not just better machines but a “boring” sense of safety, slow cultural normalization, and substantial societal adaptation—from workers to planners to regulators.
For a deeper dive into adjacent topics—like the future of work in an AI world—look out for upcoming episodes of “Is Business Broken?”.
