Podcast Summary: Deep Questions with Cal Newport
Episode 371: Is it Finally Time to Leave Social Media?
Date: September 22, 2025
Host: Cal Newport
Episode Overview
Cal Newport critically examines the rise of violence and toxicity linked to curated conversation platforms—social media platforms like Twitter, Facebook, Bluesky, and Threads that rely on algorithmic feeds and user conversation. Triggered by the national reaction to the assassination of Charlie Kirk, Newport makes his most direct and passionate case yet for why most people should quit these platforms entirely. He deconstructs the psychological harms, the impossibility of fixing these systems, and argues for a cultural shift away from social media toward more meaningful engagement with the world.
Key Discussion Points and Insights
1. Social Media Isn’t “Real Life”—But Its Harms Are Profound
- Newport recounts his personal response to the aftermath of Charlie Kirk’s assassination and the overwhelmingly positive public reaction to his newsletter call for leaving social media.
- Many people already know social media is harmful but feel stuck in a “purgatory”: they sense it’s bad, can’t fully articulate why, and question whether the problems are fixable.
Quote:
“They are toxic and dehumanizing. They are responsible as much as any other force for the unraveling of civil society that seems to be accelerating.”
– Cal Newport (00:30)
2. Defining the Problem: Curated Conversation Platforms
- Newport specifies his focus: algorithmically curated platforms centered on mass conversation (e.g., Twitter, Facebook, Threads).
- He explicitly excludes platforms like Pinterest and WhatsApp and notes Instagram/YouTube can fall into this category depending on the mode of use.
3. The Three Harms: Distraction, Demoderation, Disassociation
- Distraction: Addictive qualities lead to wasted time, erode the ability to focus.
- Demoderation: Loss of moderation, tribalism; users become embattled, indoctrinated, and hostile.
- Disassociation: Severe loss of social bonds and ethical constraints, potentially leading to apathy, nihilism, or violence.
The "Slope of Terribleness" Model ([12:00])
- Newport introduces a visual model (the “slope of terribleness”) that connects these harms as stages users slide through:
  - Distraction (top): Mild but constant engagement.
  - Demoderation (middle): Increasing tribalism and loss of empathy.
  - Disassociation (bottom): Numbness, withdrawal, or violent extremism.
- The “mainstream internet” sits atop; as users descend, they may fall off onto poorly moderated "dark web" platforms.
Quote:
“They are all connected. You start with one, but gravity is pulling you down towards the next and then towards the next.”
– Cal Newport (16:10)
4. Why the Slope Is Unavoidable—And Unfixable
- Distraction: Platform algorithms continually serve “compelling” content, constantly triggering dopamine and making resistance nearly impossible.
- Analogized to “trying to quit cigarettes but always having a pack in your hand.” ([18:10])
- Demoderation: Algorithms enforce echo chambers; the social structure fires up tribal community circuits rooted in evolutionary psychology.
- Our brains are hijacked back to Paleolithic-style us-vs-them wiring ([22:50]).
- Disassociation: Tribalism and social isolation (especially among youth) eventually lead to nihilism or rage.
- Offline communities are replaced with shallow, algorithmically reinforced imitations.
- The FBI now tracks “nihilistic violent extremists.”
- Attempts to regulate, moderate, or redeem these platforms miss that their core business model is inseparable from these harms.
Memorable Analogy:
“That’s like saying to Pizza Hut, look, we don’t want to put you out of business, but is there a way you can exist without selling pizza? … That is the service.”
– Cal Newport (28:40)
5. Why “Not Sliding to the Bottom” Still Isn’t Worth It
- Most people resist the extreme endpoint of the slope, but only by expending enormous mental energy and willpower.
- Even those who “stop themselves halfway” flourish less, are more exhausted, and remain subject to significant harm.
6. Alternatives to Social Media ([27:40])
- Stay informed via traditional news outlets, newsletters, podcasts.
- Replace digital tribalism with physical service and engagement.
- Seek meaningful entertainment outside algorithmic feeds.
Quote:
“You imagine, often if you’re engaged in combat on these platforms, you imagine you’re at the front of the crowd marching on Selma. In reality, you’re in an attention factory punching your time clock while your billionaire overseers laugh as their net worth goes up due to your efforts.” – Cal Newport (27:55)
Audience Q&A and Open Discussion
1. Is Instagram, YouTube, Pinterest Part of the Problem? ([33:14])
- It depends on the mode of use. Used non-interactively (following an “expert community,” watching creators), Instagram and YouTube carry lower risk. Used through comments, reels, and livestreams, they can act like conversational algorithmic platforms.
2. Social Attitude Shifts ([35:03])
- Newport’s early criticism of social media was met with skepticism and hostility; now, most audiences agree but feel stuck.
- The key is not just knowing it’s “bad,” but understanding why and changing behavior accordingly.
3. Parental Controls and Youth ([38:58])
- Most dangerous “dark corners” are only reached by sliding down the “slope of terribleness.” Keeping kids off core curated conversation platforms prevents most harm; few kids seek out extremist sites directly.
- Recommends: “No smartphone till high school. Locked-down smartphone until 16.” ([39:18])
4. Creative Work on YouTube ([43:39])
- Independent video media is not inherently bad; the main risk is “algorithmic capture”—creators pursuing viral metrics at the expense of quality or authenticity.
- For most, being a successful YouTuber is extremely time-consuming and often financially unrewarding.
5. Is the Genie Out of the Bottle? ([49:11])
- No: Social media companies hijacked, but did not create, the open publishing spirit of Web 2.0.
- The core functions of the Internet—sharing, community, creation—would outlive the demise of major platforms.
- Teaching “responsible use” is hopeless against addiction-triggering algorithms and ancient tribal circuits.
Quote:
“If you shut down the five major social media companies tomorrow, the Internet would still be there and a lot of other cool things could emerge.”
– Cal Newport (49:51)
6. Case Study: Why Artists Left Social Media ([1:00:50])
- Early social media was a boon (building direct relationships with audiences).
- Algorithmic turn destroyed depth in favor of cheap dopamine; great artists now forced into trends and shallow content for relevance.
- Quitting social media brought more focus and creative pride, if less relevance.
7. Policy and Regulatory Solutions ([1:11:10])
- Congressional efforts to regulate chat platforms (Discord, Twitch, Reddit) miss the point: “they’re targeting the wrong CEOs”—these services reflect the end of the “slope of terribleness,” not the cause.
- Real interventions require:
  - Raising the minimum age for major social media use (e.g., Australia’s stricter standards).
  - Section 230 reform—making platforms liable for published content, likely ending the business model for mass, unmoderated, algorithmic conversation.
  - Most importantly: cultural shift. Make heavy social media use uncool, “something a grown man should be a bit embarrassed about.”
Notable Quotes & Memorable Moments
- On the Inevitability of Harm:
“Once we began that shift, that’s why we have this feeling of like, why do these things make me feel dark? Because we are at the top of that slope of terribleness. We’re all happy up there. And then at some point we looked up or like, I’m sliding. And that wasn’t happening before.” (1:05:45)
- On the Futility of Moderation:
“You’re talking about 100,000 year old Paleolithic circuits that are being pushed on again and again and again. That advice ain’t going to stick.”
- On Culture Change:
“We should culturally change it. When we see people that are really engaged on these social media platforms, we should see it as if they wore a straight-up Captain Kirk costume to work … It’s a little bit embarrassing.” (1:16:10)
Key Timestamps for Major Segments
- 00:02 – Cal’s personal reflection on Kirk’s assassination and newsletter reaction
- 06:20 – Defining curated conversation platforms; algorithmic curation and conversation
- 08:30 – The three core harms: distraction, demoderation, disassociation
- 12:00 – Introducing the “slope of terribleness” model
- 18:10 – The brain science of distraction and algorithmic engagement
- 22:50 – Tribal circuits, echo chambers, and psychological inevitability
- 27:40 – Why you can’t escape the slope; alternatives to social media
- 33:14 – Are YouTube, Instagram, Pinterest as dangerous?
- 38:58 – Questions: Parental controls, video creators, “the genie”
- 1:00:50 – Case study: Social media’s effect on artists and the algorithmic turn
- 1:11:10 – Policy, Congress, Section 230, age restrictions, and the need for cultural change
Takeaways
- Curated conversation platforms' core technologies and business models are inseparable from their harms; moderation or “responsible use” is futile.
- The “slope of terribleness” is real; being “stuck halfway” is still costly.
- Quitting social media, or radically reducing its presence in your life, is increasingly the only reasonable response if you care about flourishing and civil society.
- True alternatives—meaningful news, real-world community, and independent entertainment—remain accessible.
- A deep cultural shift, making heavy social media use something to avoid rather than defend, is the most powerful antidote.
For more, subscribe to Cal’s newsletter at calnewport.com and keep up with upcoming Deep Questions episodes on your podcast app of choice.
