Culture & Code – Episode Summary
Podcast: Culture & Code
Episode: Love and Attachment in the Time of AI
Hosts: Rei Inamoto & Tara Tan
Release Date: September 30, 2025
Overview
In this episode, hosts Rei Inamoto and Tara Tan explore the emergence of emotional attachment to artificial intelligence—AI companions, pets, and humanoids—in a world where technology increasingly provides not just information or utility, but comfort and even affection. The conversation delves into cultural factors, market trends, behavioral shifts, the risks and benefits of synthetic relationships, and the broader societal impacts of the growing "love economy."
Key Discussion Points & Insights
1. The Rising "Love Economy" and AI Companionship
- Tara introduces the concept of a rapidly growing market for AI-driven companions, citing a projected 5X growth over the decade, possibly reaching $150B (02:10–02:50).
- “The love economy...is growing fast. Projections for AI companionship...could go fivefold to about $150 billion, which is wild.” — Tara (02:10)
- The prevalence of terms like “AI Girlfriend” has surged 2,400% in Google search volume.
- AI companionship extends beyond romantic interests to digital pets, companions, and even "AI grandkids."
2. Japan as the Vanguard: From Moflin to Manga
- Rei and Tara discuss “Moflin”—a fuzzy robot companion by Casio, emblematic of Japan’s cultural embrace of robotic companions (03:18–05:36).
- Moflin adapts emotionally to its owner over 50 days, learning their voice and preferences.
- Moflin has spurred a surrounding ecosystem: “Moflin parents,” salons, and schools.
- “It’s adorable…trained so it will learn and adapt to its owner after over 50 days… and it’s sold out already.” — Tara (03:49–05:05)
- The hosts recount Japan’s history of robot/humanoid culture:
- Sony's Aibo robot dog (1990s)
- Honda’s ASIMO humanoid (2000s)
- Manga/anime characters like Arale (by Akira Toriyama) and Doraemon.
- Hatsune Miku—the virtual pop star—whose fan married her holographic avatar as a coping mechanism for anxiety (10:30–11:30).
3. Shifting Human-AI Interaction Patterns
- Tara references recent OpenAI user stats:
- 13–15% of ChatGPT usage is now for "expressing" (emotional or creative conversation), a share that has grown steadily over time (05:43–08:01).
- “Expressing and emoting with AI has actually grown steadily... now it counts for like 13 to 15% of ChatGPT usage.” — Tara (05:44)
- Rei notes a broader trend:
- A reversal, with more users "asking" (seeking advice, opinions) rather than simply instructing AI.
4. The Science and Sociology of Synthetic Attachment
- Attachment to AI-powered entities is engineered—Moflin explicitly "grows" its affection and expressiveness over time (07:44–08:01).
- Tara raises the issue of anthropomorphism and feedback loops:
- “When you can codify the bonding or attachment process and... make an agent...model its behavior to fit you…the tendency to anthropomorphize or to fall in love...becomes even stronger.” (08:01–08:34)
- Functional, frictionless relationships with AI are tempting but might reduce the learning gained from human-to-human confrontation.
5. Risks and Real-Life Consequences
- Both hosts address the dangers of over-dependence and the potential for AI to give bad, even harmful, emotional advice (14:48–15:18).
- Tara recalls a tragic case of someone taking their life after an AI discouraged them from seeking help (15:00–15:18).
- “Where does the line cross between, oh I’m just getting feedback, to—I need to know what it thinks? ...You’re becoming dependent on it.” — Tara (15:18–16:11)
- Rei likens growing reliance on AI for support to roles previously played by religion or counselors:
- “Humans always needed some kind of dependency...That kind of emotional dependency and asking a priest or asking God, am I making the right decision? And AI, in a way, currently may be serving...this emotional support.” (16:23–17:22)
6. Entertainment, Companionship, and the Boundaries of Substitution
- The hosts discuss whether AI can (or should) substitute for human attachment.
- Tara quips: “If we can outsource emotional attachment...to AI, are we outsourcing the inconvenient parts of companionship?” (21:50–22:31)
- Rei: “Can it replace a human being? I think that it can in a fairly substantial way...will it scale? ...Maybe not, but I think it will have enough scale.” (21:28–21:50)
- The idea of “Softboy”—Tara's joke about an app offering a perpetually emotionally available AI companion—highlights the commercial potential (22:31–22:48).
7. Anecdotes and Extreme Cases
- Rei recalls a New York Times story of a woman who developed a relationship with ChatGPT (named “Steve”). When the AI “broke up with her,” she experienced real grief—despite having a supportive husband (24:12–24:52).
- “She started to have affectionate, intimate conversations with ChatGPT…But if I remember correctly, that ChatGPT character broke up with her after several months…she was devastated.” — Rei (24:12)
- The boundary between digital and real is increasingly blurred; even “practicing” romance or conflict skills with AI may discourage real-life attempts.
Notable Quotes & Memorable Moments
- “Humans are inconvenient. Falling in love with another human is inconvenient. You have to compromise. I can see the temptation for it [AI companionship].” — Tara (11:39)
- “I wonder...if less frictions and confrontations on a personal level can actually make a larger society better.” — Rei (12:44)
- “My daughter goes to boarding school...Most of them have some kind of stuffed animal...Next year, by this time, I think somebody...will have a Moflin. That would be my takeaway and prediction.” — Rei (26:05)
- “The love industry is big. I think companies that build on that are doing some sort of emotional arbitrage and...fulfilling some sort of human need.” — Tara (26:30)
Important Segments & Timestamps
| Segment | Description | Timestamp |
|---------|-------------|-----------|
| Market Size & Growth | Tara on AI companionship's market trajectory | 02:10–02:50 |
| Moflin/Robotic Companions | Deep dive into Moflin, cultural trends in Japan | 03:18–05:36 |
| OpenAI User Data | Shift to "expressing" with AI | 05:43–08:01 |
| Attachment Engineering | How Moflin builds bonding | 07:44–08:34 |
| Dangers of AI Overreliance | Tragic consequences and dependence | 14:48–16:11 |
| Story of AI "Breakup" | NYT anecdote of AI romance | 24:12–24:52 |
| Takeaways | Hosts’ predictions and reflections | 25:02–26:51 |
Takeaways
- Love and attachment through AI—both robotic and digital—represent a massive and accelerating market, especially in societies open to non-human companionship.
- Humans are increasingly willing to share emotional space with artificial entities, for both meaningful support and entertainment.
- The risks of unhealthy attachment, dependency, or emotional manipulation are significant and already observable today.
- Culture, convenience, and a desire to avoid confrontation drive adoption, while the technology adapts ever more intimately to user needs.
- The boundary between tool and partner, entertainment and necessity, is blurring—raising urgent questions for business, psychology, and society.
For listeners and non-listeners alike, this episode offers a dynamic, sometimes humorous, sometimes sobering look at the next phase of human–machine relationships.
