Left To Their Own Devices – Episode 2: "A Tale of Two Algorithms"
Podcast: Left To Their Own Devices
Host: Ava Smithing (Toronto Star)
Date: September 26, 2025
Episode Overview
This episode explores the profound influence of social media algorithms on young people's lives, through intimate stories that show both their lifesaving potential and their capacity for harm. Host Ava Smithing shares her own perspective and guides listeners through the journeys of two young women, Kira and Cece, whose adolescent experiences were shaped (for better and worse) by the digital worlds they inhabited. Alongside social media scholar Ethan Zuckerman, the episode examines the mechanics, mystery, and ethical dilemmas of algorithmic feeds—and the challenge of keeping children safe while retaining the potential for connection, discovery, and even survival.
Key Discussion Points and Insights
1. Kira's Story: How Social Media Saved a Life
Timestamps: [00:01–10:01]
- Kira McDuff was an ambitious athlete whose persistent, unexplained injuries were dismissed by medical professionals as attention-seeking.
- "It felt incredibly frustrating. It was like I knew my body so well, but people believed that they knew it better." (Kira, [02:24])
- Through TikTok’s algorithm, Kira stumbled upon communities and content about chronic illnesses.
- She recalls how her feed shifted: "The longer I looked at it, the more it became specified because I would feed into the things that I related to." (Kira, [03:54])
- Eventually, she found videos with the diagnostic criteria for Ehlers Danlos syndrome (EDS) and recognized her symptoms.
- "I was like: oh, this is somebody’s TikTok explaining the diagnostic criterias of Ehlers Danlos. And I'm checking every box." (Kira, [04:22])
- After being dismissed again by a doctor—“We normally don't look into this until somebody in your family has basically just dropped dead”—Kira finally found a physician who confirmed her self-diagnosis ([06:13]).
- Getting the diagnosis was transformative, providing Kira with language, accommodations, and validation.
- "The best part was finally having a word and finally having a way to explain to people why I was the way I was... it's like having a key to something." (Kira, [08:23])
- Kira reflects on generational change: "People talk about, oh, this generation is so reliant on phones... but if you look at me getting better health care because I had access to social media, like, what the fuck is that?" ([08:56])
2. Algorithm Mechanics & The Power of Feeds
Timestamps: [10:01–12:41]
- Ethan Zuckerman, communications professor and social media researcher, explains that what we call "the algorithm" is a set of rules for what users are shown—but transparency is sparse.
- "We're talking about something that is immensely powerful in terms of shaping what media we pay attention to. We understand it very poorly, and it's a moving target." ([11:53])
- According to figures cited in the episode, TikTok’s algorithm can become habit-forming in about 35 minutes—roughly 260 videos. When asked, the major platforms offered little on-the-record comment about how their systems operate ([11:38]–[11:53]).
3. Cece's Story: The Algorithm as a Trap
Timestamps: [12:57–22:55]
- Cece Neltner received her first phone at age 11 and, coping with family upheaval and adolescent social pressures, found solace on Instagram.
- A search for weight-loss tips started a cascade of harmful, "thinspo" content—algorithmically curated for someone her age and situation.
- "So for me, I went on Instagram. I might have just, like, looked up, like, ways to lose weight or something... and after that, it was all kinds of stuff being fed to me." (Cece, [15:05])
- Extreme dieting, workout regimens, and body image content dominated her feed, driving her toward disordered eating behaviors.
- "It just became obsessive, too obsessive to the point nothing else mattered." (Cece, [16:58])
- The obsession led to hospitalization, forced feeding, and repeated cycles of relapse and crisis.
- "My heartbeat sleeping was at 22." (Cece, [19:53])
- Suicidal ideation followed; a last-minute text to a friend led to her rescue and intervention.
- "Before I tried to go through with that, I just texted my friend and... she immediately called my mom... next thing I know there's all these ambulances..." ([21:31])
- Though Cece is better today, she emphasizes the long tail of these experiences—her recovery is ongoing, and triggers still abound.
- "It's still there... my boyfriend could probably tell you, I get insecure, like, all of the time." (Cece, [22:27])
4. The Bell Curve of Social Media Impact
Timestamps: [23:20–25:03]
- Zuckerman frames social media experiences as a bell curve:
- A small minority at one tail (like Cece) experience intense harm.
- A similarly small minority at the other tail (like Kira) experience transformative good.
- The vast majority in the middle see little profound effect either way; social media is a background fixture in their lives.
- "For the vast majority of people, social media has almost no effect on their overall happiness and mental health." (Zuckerman, [24:34])
5. The Policy Dilemma: Harm Prevention vs. Access
Timestamps: [25:03–27:24]
- Preventing use by all under-16s, as some governments propose, could deny many the support and information that saved Kira.
- "'No one under 16 can use social media' isn't a great idea. It helps Cece, maybe, but it really hurts Kira." (Zuckerman, [25:16])
- Better solutions may include customizable tools and user agency: limits on exposure to fitness influencers, time caps, curated feeds.
- "More tools, more control for me, feels like the way to handle this." (Zuckerman, [25:44])
- The potential for therapeutic uses is real, especially for marginalized or isolated youth (e.g., those with chronic illness or LGBTQ+ teens).
- "If what we end up with is a fear of social media... that's really bad news for a whole lot of people who, like Kira, might have had that opportunity to find their people and find their community." (Zuckerman, [26:19])
6. The Challenge of Algorithmic Ethics and Responsibility
Timestamps: [27:24–29:50]
- Tech companies optimize their feeds for engagement rather than user well-being, which makes meaningful change unlikely without regulation or public pressure.
- "There's a very fine line between giving people what they want... and giving people something that's addictive and they can't turn away from." (Zuckerman, [27:32])
- "At some point, we're going to have to have a serious conversation about what are our rights to have control over the algorithms that are feeding us information." (Zuckerman, [28:21])
- Cece’s anger toward platforms is undiminished:
- "These are men who are, like, extremely wealthy, who don't give a shit... about these young girls... they don't really care about the individual users." (Cece, [28:50])
7. Accountability: Lawsuits and Legal Action
Timestamps: [29:14–29:50]
- In 2022, Cece sued Meta, alleging Instagram pushed her toward damaging content.
- "The only way to knock them down is with our voice and showing people, like, exposing people to what they are doing to youth, to— I mean, to everyone." (Cece, [29:30])
- Meta declined to comment on the case.
Notable Quotes & Memorable Moments
- “The moment that I was able to say, I can't do this, I have Ehlers Danlos… then all of a sudden, people take you seriously, and it's like having a key to something.” — Kira ([08:23])
- “We're talking about something that is immensely powerful... We understand it very poorly, and it's a moving target.” — Ethan Zuckerman ([11:53])
- “Being skinny is self respect. If that offends you, maybe it’s because it hits too close to home.” — Thinspo content, quoted by Ava/Cece ([15:40])
- “I just felt like there was no point in trying to get better. There was no point because I didn’t really want to get better...” — Cece ([21:11])
- “For the vast majority of people, social media has almost no effect on their overall happiness and mental health.” — Ethan Zuckerman ([24:34])
- “It helps Cece, maybe, but it really hurts Kira.” — Ethan Zuckerman, on prohibition ([25:16])
- “More tools, more control for me, feels like the way to handle this.” — Ethan Zuckerman ([25:44])
- “If what we end up with is a fear of social media... that's really bad news for a whole lot of people who, like Kira, might have had that opportunity to find their people and community.” — Ethan Zuckerman ([26:19])
- “These are men who are, like, extremely wealthy, who don't give a shit... they don't really care about the individual users.” — Cece ([28:50])
- “The only way to knock them down is with our voice and showing people, like, exposing people to what they are doing to youth.” — Cece ([29:30])
Key Segments & Timestamps
- [00:01–10:01] – Kira’s journey: misdiagnosis, TikTok discoveries, transformational diagnosis
- [10:01–12:41] – Zuckerman on algorithm mechanics and opacity
- [12:57–22:55] – Cece’s journey: Instagram, body image, eating disorder, algorithmic rabbit hole
- [23:20–25:03] – The bell curve of social media effects
- [25:03–27:24] – Policy, prevention, and algorithmic solutions
- [27:24–29:50] – Accountability, rights, and Cece’s legal action
Tone & Takeaway
The episode is frank, sometimes raw, and marked by the unfiltered voices of young people living through the algorithmic re-scripting of childhood—all guided with empathy by Ava Smithing. In a digital world where the same system can save or destroy depending on the user, "A Tale of Two Algorithms" asks: What do we owe the next generation as we hand them the most powerful informational tools in human history?
Bottom Line:
Algorithms shape youths' lives in deeply consequential, often unseen ways—with outcomes ranging from life-saving diagnosis to life-threatening obsession. Effective solutions demand nuance, agency, and real accountability from platforms—not just blanket bans or resignation.
