Podcast Episode Summary
Left To Their Own Devices
Episode 3: Big Tech's Origin Story
Host: Ava Smithing (Toronto Star)
Date: October 3, 2025
Episode Overview
This episode dives deep into the origins of Facebook and the broader business model that would go on to define social media and shape the experiences—and vulnerabilities—of an entire generation. Host Ava Smithing, a survivor of social media’s darker impacts, retraces Facebook’s path from dorm room project to global behemoth. She underscores how the attention-driven business models pioneered by Facebook (now Meta) not only rewrote adolescence for millions but created consequences we’re still grappling with today. Through interviews with insiders, critics, and whistleblowers—including Roger McNamee, Nir Eyal, and Frances Haugen—the episode unpacks how these platforms were deliberately engineered to keep users, including children and teens, coming back, often at significant personal cost.
Key Discussion Points and Insights
1. Facebook’s Early Days: An Exclusive College Club
- The Scene (00:03–02:29):
- Flashback to a young Mark Zuckerberg in 2005, positioning Facebook as an online directory exclusively for college students.
- Zuckerberg saw value in authentic identity and privacy controls, initially resisting grand expansion.
- Direct Quote:
“A lot of people are focused on taking over the world or doing the biggest thing, getting the most users. … There doesn’t necessarily have to be more.”
– Mark Zuckerberg (01:55)
- Expansion (02:10–02:29):
- Despite early intentions, Facebook pivots to include high schools and, eventually, everyone.
2. The “Blueprint” for Attention: Business Model Evolution
- The Numbers (02:29–03:40):
- Facebook becomes a daily habit for over a third of the planet—3.2 billion users (02:31).
"It's 3.2 billion people use one of our services every day." – Mark Zuckerberg (02:31)
- Its financial success set the template for the likes of Instagram, Snapchat, and TikTok: attract young users and keep them scrolling.
- The Business Model (06:54–08:28):
- Facebook’s core strategy: Give the service away for free, monetize users’ attention via targeted ads.
“Senator, we run ads.”
– Mark Zuckerberg, U.S. Senate hearing (07:02)
- Early product innovations like News Feed, the Like button, and persistent notifications maximized user engagement.
3. Manipulation by Design: Roger McNamee’s Warning
- Critical Moment (03:26–08:28):
- Roger McNamee, early mentor and investor, recalls warning Zuckerberg not to sell to Yahoo, believing Facebook could be a positive social force.
“You have created something that is really cool. … You’ve solved the core problems of social media. And I believe you’re going to have a very successful company.”
– Roger McNamee (05:19)
- Facebook’s exponential growth prompts new tactics for attention retention, leading social media down a path of psychological manipulation.
- McNamee’s regret:
“Their goal was to manipulate your attention. Their goal is to manipulate your behavior to drive you towards things that were economically valuable to them. … The simplest way to grab most people’s attention is to trigger fight or flight. You want to scare them or outrage them.”
– Roger McNamee (08:33)
4. Monetizing Outrage and Vulnerability
- Algorithmic Impact (08:33–09:35):
- Optimizing feeds for engagement ultimately meant promoting content that would provoke fear, outrage, and vulnerable behaviors—driving political polarization, radicalization, and negative body image among users.
5. The Creation of Habit-Forming Technology: Nir Eyal’s Perspective
- Building Addictive Products (11:26–16:58):
- Nir Eyal, author of "Hooked," explains “the hook model”: trigger, action, reward, investment (12:57–14:08).
“A hook is a design experience to connect the user's problem with the maker's product with enough frequency to form a habit.”
– Nir Eyal (12:57)
- Apps leverage both external triggers (notifications) and, more powerfully, internal triggers (boredom, anxiety) (14:08–15:15).
- Eyal argues that not all psychological manipulation is inherently bad, but cautions against normalizing these tactics on children.
“Are we gonna say stop making your devices so user friendly? No, we want them to be easy to use. … What are we doing here right now? Do you guys think this isn’t psychological manipulation? Of course it is.”
– Nir Eyal (12:03)
“Just to be clear for your episode, what I would very much hope you don’t do is to say that I’m advocating using this on children.”
– Nir Eyal (16:29)
- He sees a need for regulation to protect kids and those pathologically addicted.
6. Whistleblower Frances Haugen: Inside the Black Box
- Facebook’s Internal Knowledge (17:42–27:11):
- Frances Haugen, former Facebook product manager and whistleblower, describes obtaining internal research showing how Facebook and Instagram knowingly harmed users—especially youth (20:31–21:18).
- Facebook surveyed teens about compulsive use; problematic use peaked at age 14, but even younger kids (7–9 years old) are active on social apps (21:18–21:40).
- Internal findings showed kids fared better when platforms removed like counts and notifications, but such measures weren’t in Facebook’s business interest (22:25–24:23).
“One of the most commonly requested interventions… was removing like counts… It makes it less stressful to use this product.”
– Frances Haugen (22:25)
- Facebook prioritized metrics like “total minutes spent” over user well-being despite knowing the harms (24:23).
“You see quotes from senior executives: ‘We know that these experiments are positive, but the metric that Mark cares about this month is total minutes spent. And we can’t push out something that’s going to hit total minutes spent by 1 or 2%.’”
– Frances Haugen (24:23)
- Algorithmic Rabbit Holes (26:04–26:54):
“Anytime you trust a computer, you trust an AI to direct your attention. … Your lack of action is a choice.”
– Frances Haugen (26:04)
7. Consequences for a Generation
- Real Harm (25:43–26:04; Ava and Haugen):
- Instagram made body image and eating disorders worse for 1 in 8 teen girls, confirmed by internal research.
“One in eight teen girls said that when they felt bad about their bodies, Instagram made it worse.”
– Frances Haugen (25:43–25:49)
- Algorithms exploit vulnerabilities (26:04–26:54):
- Algorithms actively funnel already vulnerable teens into ever darker digital spaces.
- Attempts at reform—such as letting users reset algorithms—may be insufficient against a core business model that profits from maximizing screen time.
- Final Reflection (27:11–28:27):
- After decades shaped by this business model, we must question what this means for a generation’s mental health and development—a theme to be further explored in future episodes.
Notable Quotes & Memorable Moments
- “We gave children the most powerful tools in human history. Then, we left them to their own devices.”
– Ava Smithing (Episode Introduction)
- “Either Microsoft or Yahoo is going to offer a billion dollars for Facebook and your board, your management team, your parents, everybody’s going to tell you to take the money… That is all BS.”
– Roger McNamee (05:12)
- “Senator, we run ads.”
– Mark Zuckerberg, U.S. Senate hearing (07:02)
- “Their goal was to manipulate your attention.”
– Roger McNamee (08:33)
- “Are we gonna say stop making your devices so user friendly? … Of course this is psychological manipulation.”
– Nir Eyal (12:03)
- “One in eight teen girls said that when they felt bad about their bodies, Instagram made it worse.”
– Frances Haugen (25:43)
- “Your lack of action is a choice.”
– Frances Haugen (26:54)
- “What is all this screen time doing to our brains?”
– Ava Smithing (27:48; episode closing reflection)
Timestamps for Key Segments
- 00:03–02:29: Facebook’s origin and exclusive early days as a college-only network
- 03:26–06:46: Roger McNamee’s advice and refusal to sell
- 07:02–08:28: Business model: “Senator, we run ads.”
- 08:33–09:53: The turn to manipulation and outrage
- 11:26–16:29: Nir Eyal on habit-forming tech and ethics
- 17:42–27:11: Frances Haugen on Facebook’s internal research, notification experiments, and the cost to children
- 25:43–26:04: Evidence on Instagram’s harms to teen girls
- 27:11–28:27: Reflection on the business model’s generational effects and the episode’s conclusion
Structure and Tone
The episode is analytical, personal, and sometimes urgent, blending researched reporting with first-person stakes. Ava Smithing’s narrative anchors the episode emotionally while expert voices provide technical and strategic context. The tone is investigative but humane, refusing to let the listener forget who is most deeply affected: the young people raised by—and often victimized by—the machinery of Big Tech.
This summary captures the central themes, guest contributions, and pivotal revelations of the episode, providing essential orientation and detail for those who have not listened—or who want to recall its main arguments and evidence.
