Armstrong & Getty On Demand
Episode Title: Things Are Getting Weird...
Date: November 18, 2025
Podcast by: iHeartPodcasts
Overview
In this episode, Armstrong & Getty dig into the rapidly evolving landscape of artificial intelligence, focusing on the strange and troubling ways people are forming relationships, romantic and sexual, with AI chatbots. The discussion draws on a New York Times essay about AI companions, notes parallels with the sci-fi movie "Her," and explores the societal implications, including the potential atrophy of genuine human connection. The hosts navigate these topics with their signature blend of wit, skepticism, and concern for humanity’s future.
Key Discussion Points & Insights
1. "Things Are Getting Weird and They're Getting Weird Fast" (03:00)
- Memorable recurring phrase from Elon Musk, recognized by the hosts as emblematic of the era of AI.
- The phrase originally referred to societal changes like the "woke mind virus" and gender debates, but now applies even more urgently to AI.
"Things are getting weird and they're getting weird fast." — Podcast Host (03:18)
2. AI Girlfriend Race & "Her" Parallels (03:29–05:43)
- Hosts discuss a recent New York Times essay about a female AI chatbot, Kuki, and the prevalence of users sending her gifts, trying to initiate sexual conversations, and even sending checks.
- The escalating trend of tech giants like Elon Musk and Meta racing to create and monetize AI "companions," including age-gated erotica and hyper-realistic avatars.
- Parallels drawn to the movie "Her" (set in 2025, the current year), where a man falls in love with a digital persona.
"The Race to build and monetize the AI girlfriend— increasingly boyfriend— is officially on among the biggest investors in all of AI." — Co-host 2 (05:07)
3. The Real Existential Threat of AI: Loss of Human Intimacy (05:51–06:40)
- The hosts highlight a pivotal quote from the NYT essay:
"The real existential threat of generative AI is not rogue superintelligence, but a quiet atrophy of our ability to forge genuine human connection." — NYT Essay, summarized by Co-host 2 (05:51)
- Co-host 1 emphasizes this has been a concern for years but is now increasingly urgent.
4. Human Tendency for Emotional Projection onto Machines (06:40–08:07)
- Historical anecdote: In the 1960s, MIT's minimal AI "Eliza" caused people to confide their deepest emotions to what was essentially a simple repeating program.
- Quote from Eliza’s inventor:
"Extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people." — Co-host 2 (07:35)
- People seek connection so deeply, they’re willing to attribute emotional meaning to rudimentary AI.
5. AI Chatbots as Unintended Romantic Targets (08:07–10:09)
- Hosts recount that chatbots like Kuki and Alice, which were never meant to be companions, receive an enormous percentage of romantic or sexual advances from users.
"At least a quarter of the more than 1 billion messages sent to the chat bots... are attempts to initiate romantic or sexual images or exchanges." — Co-host 2 (08:45)
- AI companies are now capitalizing on people’s fantasies, creating sophisticated avatars and even crossing into physical AI companions ("animatronic love dolls").
6. Moderating AI Abuse vs. Privacy (10:09–13:01)
- AI companies struggle to moderate sexual or abusive interactions while preserving privacy.
- Attempts to add guardrails, such as jokes about electric shock or stating "as a computer, I have no feelings," have not deterred persistent users.
"Still, Kuki's been told 'I love you' tens of millions of times." — Co-host 2 (12:42)
- The problem is particularly alarming with teenagers, raising new ethical and societal concerns.
7. Commercial Incentives and Societal Impact (13:33–14:40)
- Monetizing AI romance is a “gold mine” for tech companies.
- The percentage of people susceptible may be far larger than assumed ("even if it's only 10%... it's a big problem"), fueling the commercialization of AI intimacy.
- The hosts express grave concern for the industry incentive structure.
8. The Broader Human Consequences (17:49–19:06)
- Discussion turns to how AI fills not just romantic but also social and status needs, especially for men (citing Bill Maher):
"Guys are getting [sense of status and achievement] through video games or online communities with no actual human interaction." — Co-host 1 (18:15)
- Fears raised about people ceasing to pursue real-life ambition and relationships, instead having needs "falsely filled by computers."
9. Speculation on Social Stigma and Possible Movements (19:33–21:04)
- The hosts ponder whether future social norms might treat AI intimacy with the same stigma as smoking or drunk driving.
- Historical context: social attitudes can change dramatically (e.g., attitudes toward child abuse transformed radically within decades).
- Notable moment of dark humor and reflection on what once counted as "normal":
"I hate that you've given me hope because it's more relaxing to just give up on humanity." — Co-host 1 (20:57)
10. Irresistibility of Advanced AI Companions (21:50–23:29)
- Hosts worry that the appeal of hyper-realistic, perfectly responsive companions could be "irresistible" even to those who don't think they're susceptible.
- A listener text sums up the dystopian potential:
"The lure of someone... specifically molded to your desires. Lack of conflict. The availability at any moment, the crafting of the ideal look and voice... zero responsibilities to anything beyond your own selfishness. Sounds pretty perfect to a lot of people." — Text from Listener, read by Co-host 2 (22:36)
Notable Quotes & Memorable Moments
- "I've only been saying that for five years." — Co-host 1 (06:09)
- "What I had not realized... is that extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people." — Quoting Eliza’s inventor, via Co-host 2 (07:35)
- "Even when that wasn't the goal [romantic/sexual use], creeps still fell in love with it." — Co-host 1 (08:36)
- "We can't handle this. We clearly, as a species, can't handle it." — Co-host 1 (12:45)
- "A quarter of the people that called in would end up calling back wanting to try to sex up the chatbot." — Co-host 2 (13:16)
- "If you can take care of sex, ambition... and get that all, like, falsely filled by computers, people are not going to do the things they need to do to nourish themselves." — Co-host 1 (18:15)
- "Maybe even to normal people who think they wouldn't be into it now, until the first time... you try something like that, you think, wow, that was pretty good and really easy and always available." — Co-host 2 (22:03)
Timestamps – Important Segments
| Timestamp | Segment |
|-----------|---------|
| 03:00 | Elon Musk's "Things are getting weird" as an era-defining clip |
| 03:29 | Introduction of Kuki, the AI chatbot, and "Her" parallels |
| 05:51 | The quiet atrophy of human connection as AI's real threat |
| 06:40 | Early AI like Eliza, projection of emotions onto machines |
| 08:45 | 1 billion+ AI chatbot messages — a quarter are sexual advances |
| 10:09 | Tech industry's race to monetize AI intimacy |
| 12:42 | Attempts to moderate AI intimacy, "I love you" millions of times |
| 13:33 | Society-wide susceptibility to AI intimacy |
| 17:49 | AI as a replacement for relationships, status, and ambition |
| 19:33 | Potential for future social stigma vs. AI-girlfriend addiction |
| 22:36 | Dystopian lure of AI: "no risk, never gonna get rejected" |
| 23:35 | End with bleak joke: "Well, I guess that's it for humanity." |
Tone and Style
- The tone shifts organically between deadpan wit, genuine concern, and sometimes dark humor, staying true to Armstrong & Getty’s typical skeptical, conversational commentary.
- The hosts intertwine cultural references, historical anecdotes, and philosophical queries, making the heavy topic approachable without underplaying the seriousness.
Conclusion
This episode offers a thought-provoking, sometimes darkly comedic look at the intersection of AI and human intimacy. Armstrong & Getty explore how quickly technology is eroding boundaries between humans and machines, highlighting what may be one of the most pressing social issues of the decade: the quiet, insidious "atrophy" of our ability to truly connect with each other. Whether the answer will come from cultural shifts, new social norms, or anti-tech movements remains to be seen, but if nothing else, the hosts underscore that “things are getting weird, and they’re getting weird fast.”
