Podcast Summary: "Will AI Take My Job?"
Podcast: Hasan Minhaj Doesn't Know
Host: Hasan Minhaj (186k Films)
Guest: Karen Hao (author, Empire of AI)
Date: September 10, 2025
Episode Overview
Hasan Minhaj sits down with journalist and author Karen Hao to dissect the realities and myths surrounding Artificial Intelligence—specifically, the impact of AI on jobs, power, and society. Using Karen’s new book Empire of AI as a touchstone, the conversation explores how giant AI companies mirror historical empires, the true state (and hype) of AGI, the labor and ethical consequences of AI development, and what regular people can do to reclaim agency.
Main Topics & Key Insights
1. The Three Camps of AI Discourse
- AI Optimists: Believe AI is a world-changing revolution.
- Skeptics: Fear AI spells doom.
- Haters: Think the hype is overblown and silly.
- Karen’s Camp: "I would say I'm in the AI accountability camp, which recognizes that the AI industry has consolidated an extraordinary amount of power." (02:16)
- She acknowledges AI’s benefits but is critical of the massive, general-purpose models favored by Silicon Valley.
2. AI as Empire: Historical Parallels
- Empire Qualities: Claiming new resources, exploiting labor, competing intensely, and claiming a “mission” to improve humanity.
- "Empires of AI not only literally use this rhetoric now, but they check off all of the characteristics of empire building." – Karen Hao (07:27)
- Data as Resource: AI companies lay claim to artists’ intellectual property and ordinary users’ data, often reframing that taking as “fair use.”
- Labor Exploitation: Use of low-paid labor in the Global South for tasks like toxic data moderation.
3. The Mythology of AI Leaders
- Karen compares Silicon Valley’s mythology, especially at OpenAI, to the world-building in Dune:
- “At some point, someone created a mythology around the extraordinary power of these technologies … now everyone in this ecosystem has come to believe that this is their sole purpose.” (10:43)
- Sam Altman & Empire Building:
- "Sam Altman has said ... the thing that I was proudest of is that I built an empire. … His superpower is he is really good at knowing what you want and then saying a story based on what you want." (12:30)
4. Public Image vs. Reality at AI Companies
- Karen recounts her experience as the first journalist embedded at OpenAI:
- Public face: transparency, open collaboration, anti-Silicon-Valley posturing.
- Private reality: secrecy, internal competition, for-profit motives emerging.
- “What they’re saying publicly ... is actually not how they’re operating behind closed doors.” (15:25)
- OpenAI’s massive fundraising and soaring valuation show its shift to classic Silicon Valley capitalism.
5. The Data Arms Race
- AI companies want ever more data—far outstripping the “Big Data” era of social media.
- Even Meta (Facebook) is “running out” of data and is considering buying book publishers or relaxing privacy protections. (20:22)
- The privacy-for-convenience tradeoff has become lopsided: users surrender huge swaths of their lives with little in return.
6. AGI: Definition & Real-World Impact
- AGI (Artificial General Intelligence): A marketing term that means whatever companies need it to mean at the time.
- “AGI is whatever the companies need it to be if they want to sell you a convenient product.” (25:24)
- The rhetoric morphs for commercial, regulatory, or political purposes.
- Karen advocates for task-specific AI (narrow, clear-purpose tools) over general “everything machines” that breed confusion and risk.
7. The Hidden Labor Behind AI
- Real-world harms: AI’s “supply chain” includes content moderation by workers in the Global South who are traumatized by exposure to toxic material.
- Example: OpenAI’s contractors in Kenya labeled disturbing text to “clean” AI models, often at great personal cost.
- “Kenya is that sacrifice zone where it has long served as a backstop for the Internet of the global north.” (31:11)
- Story of Mophat Okinyi, whose family left him due to the psychological toll of moderation work. (32:55)
8. Technology’s Unequal Benefits
- Historically, tech benefits accrue to elites while risks and harms are offloaded to invisible or marginalized labor.
- Karen cites Power and Progress (Acemoglu & Johnson):
- “The pattern … is that elites are the ones that have the money, the influence, the power to rally enough resources around creating certain new technologies. But it’s also created in their image and consistently … people end up being harmed.” (33:54)
- But she argues that we now live in an era with human rights and democracy: “We should actually reinvent the way that technology revolutions happen so that they don’t just repeat all of the terrible things that happened in previous revolutions.” (35:11)
9. AI Inevitable? Not So Fast.
- Empire’s Persuasion: The feeling that AI is inevitable is itself a tool of power.
- “Every empire has fallen in history ... they're actually really weak at their foundations.” – Karen (36:30)
- Points of leverage: Data, labor, land, energy, water, and consumer demands.
- Activism: Artists using tools like Glaze to foil AI training on their work, labor strikes, and local organizing against data centers.
10. Everyday Collective Action & Agency
- Organizing locally (PTAs, patients, unions) is practical and necessary.
- “If you’re a parent ... build a parent group ... before you start using facial recognition on my kid, before you start turning my kid into a QR code.” (42:16)
- Ask about AI use at your doctor’s office, workplace, etc.
- Tech companies are susceptible to organized resistance, ethical consumerism, and collaborative regulation.
11. Will AI Take My Job?
- AI is being pitched to executives as a cost-saving, labor-reducing tool.
- Downsizing sometimes happens because of the AI pitch, not reality:
“... they were like, oops, these AI tools are not good enough. Can you please come back?” (44:21)
- Real job loss is about executive decisions, not just technical capability—AI can replace specific, economically valuable tasks, but it is not a human-level “everything machine.”
- New jobs may arise as others disappear; risk is uneven, often hitting lucrative industries first.
12. Education, Critical Thinking, & Society
- Fear: If AI can do core skills (write, code, analyze), do human skills matter?
- Karen: Absolutely YES—critical thinking and agency are non-negotiables for democracy.
- “The best thing is for technology to be assistive to people, not to totally gouge out their brains.” (46:39)
13. The Hollywood Strikes & AI
- WGA and SAG strikes put AI front and center in labor negotiations.
- Collective bargaining proved critical for forcing major issues to the table.
- Other industries can learn from entertainment’s stance on digital likeness and automation. (48:15)
14. Realistic Alternatives & a Way Forward
- Encourage investment in task-specific, goal-oriented AI (like climate prediction, protein folding) instead of open-ended generative models.
- “All of their recommendations actually have nothing to do with generative AI.” (re: Climate Change AI, 50:21)
- Government action matters, but bottom-up organizing is crucial, especially given slow legislative cycles and technology’s rapid pace.
15. Ethics vs. Legality
- Tech companies often operate at the edge of what’s legal, but not always what’s ethical.
- Example: Fashion industry’s shift after consumer activism.
- The law is not the only tool—consumer and worker pressure create new norms and markets.
Notable Quotes & Moments
- Karen Hao:
- "AGI is whatever the companies need it to be if they want to sell you a convenient product." (25:24)
- "Empires of AI ... check off all of the characteristics of empire building." (07:27)
- “Originally the bargain of giving data to companies is they will give you something in return. But these companies have reached Empire status where they don’t actually have to give you anything in return anymore.” (22:29)
- "We need, we need our rights, we need clean air, we need clean water, we need better healthcare, better education, we need to not have an environmental crisis. And then think about, well, how do we integrate any technology, not just AI, in service of that?" (51:36)
Timestamps of Key Segments
- [02:16] — Karen’s “AI Accountability” stance
- [07:27] — AI companies as modern empires
- [10:43] — Dune analogy and Silicon Valley mythology
- [15:25] — Difference between public image and reality at OpenAI
- [20:22] — The new scale of AI data hunger
- [25:24] — AGI’s slippery definition
- [31:11] — Invisible workers: content moderators in Kenya
- [33:54] — Technology’s benefits and harms traced through history
- [36:20] — Empire’s “inevitability” is a myth
- [42:16] — Collective action strategies for parents, patients, workers
- [44:21] — Will AI really take my job? Both yes and no.
- [46:39] — Critical thinking is essential for democracy and agency
- [48:15] — Hollywood strikes and negotiating with AI
- [50:21] — Alternatives: focus on task-specific AI for climate, health
- [55:34] — Cassandra question and public receptiveness
Conclusion
Hasan and Karen’s conversation decodes both the hype and the real risks around AI, revealing that the “empire” of artificial intelligence is not all-powerful or inevitable. Instead, resistance and reform are possible through local organizing, consumer activism, and demand for accountability. Far from being a dark prophecy, Karen’s perspective is hopeful—if society chooses to assert its own agency and values.
Final Thought (Karen):
“Everyone should continue watching Hasan Minhaj ... and remember that you actually own this data. We own these spaces. We have a right to elect officials that protect our life sustaining water. And if everyone actually remembers that and asserts hey, we want AI to be developed this way, we want it to be deployed this way, companies have to follow, they're ultimately businesses.” (56:19)
