Podcast Summary: “The Internet May Look Different After You Listen to This”
Podcast: The Opinions (The New York Times Opinion)
Date: January 13, 2026
Host: Nadja Spiegelman
Guests: Tressie McMillan Cottom, Emily Keegan
Overview
This episode explores the rapidly shifting landscape of internet content in the age of AI-generated media. Host Nadja Spiegelman speaks with NYT columnist Tressie McMillan Cottom and creative consultant Emily Keegan about the spread of “AI slop” (low-quality or misleading AI-generated content), what it means for authenticity, trust in media, and art-making, and why they predict the internet and our standards for truth will never be the same.
Key Discussion Points & Insights
1. First Encounters with AI Deception
- When did you first realize something online was AI, not real?
- Tressie McMillan Cottom: Shared a funny “man on the street” video, only to later realize the emotions on the face were unnatural.
“Something in my gut said that was too funny, you know, it was too perfect for me, and I went back and rewatched it and then caught the sort of unnatural emotion on the face, which I think is, for the time being anyway, is still a tell for AI slop.” (01:48–02:26)
- Felt tricked and a bit embarrassed as someone who studies digital authenticity.
“The idea that I could have gotten… gotten was a little, you know.” (03:09)
- Emily Keegan: Fooled by faked images purporting to be of Nicolas Maduro’s capture in Venezuela, shared widely on Instagram by a trusted source.
"I was fooled by the photographs from last weekend of the Nicolas Maduro capture in Venezuela. ...I believed them to be real for, like, 24 hours." (03:16–03:49)
- Embarrassed as a photo editor; highlights that even professionals are vulnerable, given the speed of feeds and the small size of images on social media.
2. AI Slop: The Trust Crisis and Exacerbation of Social Distrust
- Crisis is bigger than skill; the tech outpaces solutions
- Tressie McMillan Cottom:
“This is not a problem that developing the right skill set is going to solve...technology has outstripped our ability to teach ourselves a set of tools at the level of accuracy that I think we would need.” (05:14–06:24)
- The issue is scale, sophistication, and the underlying low social trust which predates AI.
“AI slop is exacerbating [the trust crisis], but it is not creating it.” (06:54)
- Emily Keegan: Concurs, noting that even real images divide us on interpretation: “we are having conversations about how to understand those images, and we're seeing a country divided on what they're seeing.” (07:21–07:47)
3. Legacy Media’s New Role & Platform Incentives
- Media as verifier in ‘sloppy’ times
- Emily Keegan:
“What these organizations have in place are teams of people dedicated to verifying images and facts. …having their icon next to an image or next to a piece of information is helpful in verifying it as real.” (08:26–09:12)
- Social and tech platforms are optimized for emotional response, not understanding.
“None of those three platforms has done any legwork to make sure those images are being held properly for the viewer to understand what they're looking at. And now AI is here, and they have a lot of work to do.” (09:54–10:24)
- Tressie McMillan Cottom:
“There is no economic incentive for these platforms to do a better job of making consumers more informed...” (10:24)
- We can't expect platforms to solve this: "Where is the government? Where is legislation?" (11:45)
- People keep using platforms regardless of trust, because they are easy and emotionally appealing.
4. Can AI Content Be Attractive—or Even Art?
- User Distaste and Emotional Response
- Emily Keegan: “It's called slop because it sucks. …AI is trying to be photography, and it is nothing like photography. The reason that photography is interesting to us is because …it's based in the real." (12:58–13:45)
- Tressie McMillan Cottom: "AI can look like reality, but it cannot communicate emotionally to us in a way that …resonates as being authentic…you do not have the appropriate emotional response to it." (14:09–15:07)
- AI imagery may leave people “emotionally cold,” but old habits (scrolling) persist; some users remain delighted by AI (e.g., older relatives enjoying AI-generated images).
- Art vs. Slop and the Role of Human Intention
- Emily Keegan: Argues AI can be a new art tool—art happens when a person uses AI consciously.
“AI isn’t creating art. The person who’s prompting AI is creating the art…The person making the prompt is the artist…It’s a tool.” (20:19–20:45)
- Tressie McMillan Cottom distinguishes between AI as a deliberate tool for artists (fine) and “slop,” mass-generated with limited human input (problematic).
"There is a point at which the human is...can absolutely be removed from the loop entirely." (20:48–21:32)
- Memorable moment:
- Nadja jokes: "If you're dressing yourself up as a cat dancing, just send it to me." (18:52)
- Emily: "You want that? You would enjoy that." (18:53)
5. Aesthetics of AI and the Human Desire for the Real
- AI imagery is shifting from slick, perfect renderings toward a “grainy film” look.
- Emily Keegan: "Recently…AI is trying even harder to look like old photography with lots of grain, lots of pixelization. ...It will shape shift and [follow] you wherever you go.” (22:40–25:09)
- AI’s true aesthetic? Nihilistic, about reaction not meaning.
- Tressie McMillan Cottom:
"...there’s no choice...no political statement, there’s no cultural statement, there’s no artistic statement. Except I made you respond. I captured your energy for about 8 seconds." (23:11–23:59)
6. The “Analog Revival”: Real Objects, Real Craft and Hope
- Popular speculation that 2026 will be a year for analog life—crafting, in-person meetings, valuing the hand-made.
- Skepticism and hope:
“We’ve had eras where we said that before…I'm just not sure that they are true enough to say it is an antidote to whatever it is about AI slop that scares us.” (25:06–26:01)
- Advertisers, media emphasize human-made/meta-real:
- Emily Keegan: "Apple...then did behind the scenes video to make sure we all knew that there were people involved...We have to prove that...I’m not convinced that it’s a bad thing...this might actually be a great exercise in reminding ourselves what we care about..." (26:01–27:11)
- Tressie McMillan Cottom: After encountering too much AI-generated content, people will want to “cleanse” by seeking out real connection, writing, and art.
"I do think in the long run, the human experience and our desire for it wins. It's just that in the interim, there's going to be a lot of really bad stuff to wade through and sort through to reconnect with human nature." (27:34–28:51)
- Emily Keegan: The history of photography shows painting is still most valued because of the “human hand.”
7. Tips for Identifying AI Content (and Admitting the Limits)
- Emily Keegan:
- No sure way to spot AI every time: “I don’t think there’s a way, honestly.” (31:09)
- Best protection: trust in verified, reputable sources.
“The only way to know if an image is ‘real’ is if the person who’s trafficking it is a place or person that you trust and is verified.” (31:09–31:40)
- Tressie McMillan Cottom:
- Be wary of content that makes you feel strong emotion or confirms your biases.
“Does this make me want to do something? Am I enjoying it too much? Is one of the questions I ask myself, do I agree with it too much? …If you like it too much, interrogate it. That’s all.” (31:40–32:52)
- Don’t reflexively share; let it “wash over you.”
Notable Quotes & Memorable Moments
- On the limits of digital literacy:
“Technology has outstripped our ability to teach ourselves a set of tools…”
— Tressie McMillan Cottom (05:41)
- On AI’s emotional hollowness:
“You can recognize the form...but you do not have the appropriate emotional response to it.”
— Tressie McMillan Cottom (14:09–15:07)
- On AI and art:
“AI isn’t creating art. The person who’s prompting AI is creating the art...It’s a tool.”
— Emily Keegan (20:19–20:45)
- On the futility of always knowing what’s real:
“I don’t think there’s a way, honestly.”
— Emily Keegan (31:09)
- Practical advice:
“If you like it too much, interrogate it. That’s all.”
— Tressie McMillan Cottom (32:41)
- Light-hearted moment:
“If you’re dressing yourself up as a cat dancing, just send it to me.”
— Emily Keegan (33:34)
Important Timestamps
- 00:47 – Introduction to episode’s theme and guests
- 01:37 – First encounters with AI-generated deception
- 04:44 – The deeper crisis: manipulation of consequential news events
- 05:14 – “No way we can become more savvy” — the limits of digital literacy
- 10:24 – The role and incentives of media platforms
- 14:09 – AI’s failure to evoke authentic emotion
- 18:40–20:48 – Debate: Is AI-generated content art?
- 22:40 – The evolving “aesthetic” of AI imagery
- 25:06 – The idea (and limitations) of an “analog revival”
- 30:49 – Can we spot AI imagery? How do we verify realness?
- 32:41 – “Interrogate if you like it too much.”
Takeaways
- The battle over what’s real online is already lost, technologically, at the individual user level. Media trust, verification, and awareness of platform incentives are more crucial than ever.
- AI-generated “slop” is likely to rise—and alongside it, so will a longing for genuine human expression and connection.
- Vigilance is emotional: If something online honestly feels too perfect (or confirms your biases too neatly), pause before sharing.
- True change will require more than user literacy—it calls for regulatory or social intervention.
- In the meantime, enjoy the real: “Go outside and remember what the sky looks like, everybody.” (Tressie McMillan Cottom, 33:14)
For listeners:
This episode is a candid, thoughtful conversation on how AI is transforming (and destabilizing) our shared reality online—and why, despite the noise and fakery, our human craving for emotional resonance may be what saves us.
