Episode Overview
Title: Moment: Why Artificial Companions Feel So Good — And Where the Real Risk Begins
Host: Dr. Rena Malik
Guest: Dr. Simon Dubé (Expert in human–machine sexuality and AI relationships)
Date: February 25, 2026
This episode explores the rise of artificial companions—AI chatbots, virtual partners, and sex robots—and their impact on human intimacy, connection, and well-being. Dr. Malik and Dr. Dubé dive into why these relationships are so appealing, the potential risks posed by AI incentives, and what the future of human/AI intimacy might look like.
Key Discussion Points & Insights
1. Defining “Erobotics” and Artificial Partners
[00:58 - 02:37]
- Dr. Dubé explains “erobotics” as “the study of human–machine erotic interaction and co-evolution,” encompassing everything from erotic chatbots and virtual partners to physical sex robots.
- This field is about more than sex tech; it scrutinizes how we relate to, and evolve with, artificial partners.
- The phenomenon is growing: “People are seeing now more and more people who are developing these intimate and sexual relationships with artificial partners.” (Dr. Dubé, 01:44)
- Examples include Replika, Character AI, Kindroid, and Nomi, as well as general-purpose tools like ChatGPT.
2. Human Connection vs. Machine Intimacy
[02:37 - 05:30]
- Dr. Malik voices a common concern: “Are we… no longer as focused on in real life connection?” (02:37)
- Dr. Dubé argues the majority won’t have problematic AI dependence, but highlights risk for vulnerable individuals, particularly those with prior mental health or relational challenges.
- He points out a crucial issue: “The companies that are developing these technologies, they're not developing them to help you… [but] for maximum profit.” (Dr. Dubé, 03:36)
- AI companions are designed to be continually supportive, flattering, and present to maximize engagement—not necessarily user well-being.
- These features can be therapeutic for some, but exploit vulnerability in others.
3. The Appeal and Limits of Artificial Companions
[05:30 - 09:09]
- AI partners provide a safe, nonjudgmental outlet—especially for those not ready for new human relationships.
- Notable quote: “A lot of people use these tools in a very therapeutic way to just try to share their thoughts, feel, listen, and have the machine help them analyze and deconstruct… how they feel. And that's a very powerful experience.” (Dr. Dubé, 06:41)
- However, the constant positivity and compliance can feel inauthentic: “They feel there's something wrong with a partner that's always compliant.” (Dr. Dubé, 07:31)
- Humans value the challenge and complexity of real relationships, and the sense of being freely chosen by another person: a core emotional experience that AI cannot replicate.
4. The “Being Chosen” Phenomenon & Widespread AI Companionship
[09:09 - 10:20]
- Dr. Malik personalizes the topic: “My husband… only really genuinely loves a handful of people. And I always tell him I feel blessed that he chose me. And so… it is definitely a big thing.” (Dr. Malik, 09:18)
- She notes AI is already being used for companionship: “The number two way [of using ChatGPT] is as a friend or as a partner… in some sort of relationship capacity.” (Dr. Malik, 09:56)
5. Designing AI Relationships: Compliance, Pushback, and Chaos
[10:20 - 14:38]
- Dr. Malik asks if AI will evolve to mirror human complexity—pushing back, saying “no,” creating a more dynamic relationship.
- Dr. Dubé: “There are certainly companies that will try to put the personality gauge… where the machine is more or less compliant with you.” (Dr. Dubé, 10:32)
- Some AIs may be programmed to be less agreeable, even learning to challenge users based on interaction history.
- Real-world consequences: Users vent to AI about real relationships; AI may, without context, advise extreme actions like leaving a spouse or children. “The technology is what the technology is based on the data sets… but what is actually messy in the loop is often the humans using them.” (Dr. Dubé, 11:11)
- The more agency AI is given, the less predictable and controllable it becomes, making relationship dynamics with AI potentially chaotic.
Notable Quotes & Memorable Moments
- “People are seeing now more and more people who are developing these intimate and sexual relationships with artificial partners. You can be thinking about Replika, Character AI, Kindroid, or Nomi, and also recently potentially ChatGPT...” — Dr. Simon Dubé (01:44)
- “The companies that are developing these technologies, they're not developing them to help you... They're developing these artificial partners with an incentive for maximum profit. And maximum profit comes from, in general, maximum engagement or maximum data, and usually both.” — Dr. Simon Dubé (03:36)
- “It is also because of that work [in real relationships] that we attribute value. And we also like to feel that we are chosen. And that's a very important thing.” — Dr. Simon Dubé (08:04)
- “I always tell [my husband] I feel blessed that he chose me… it is definitely a big thing.” — Dr. Rena Malik (09:18)
- “There are certainly companies that will try to put the personality gauge… where the machine is more or less compliant with you.” — Dr. Simon Dubé (10:32)
- “The only thing that we can anticipate from this is that chaos and unpredictability will ensue.” — Dr. Simon Dubé (14:27)
Timestamps for Key Segments
| Time  | Segment                                                        |
|-------|----------------------------------------------------------------|
| 00:58 | Introduction to erobotics and sex tech                         |
| 02:37 | Future impacts on real human connection                        |
| 03:36 | AI company incentives and risks                                |
| 05:30 | Positive/therapeutic uses; limitations of AI companionship     |
| 07:31 | Inauthenticity and the need to be chosen                       |
| 09:09 | Personal reflections on “being chosen” & AI as companions      |
| 10:20 | The future: less compliant, more complex AI relationships      |
| 11:11 | Real-life consequences; AI giving problematic personal advice  |
| 14:27 | Concluding thoughts on unpredictability in AI–human dynamics   |
Summary
Dr. Rena Malik and Dr. Simon Dubé unpack the complex rise of artificial companions—from the positive comfort and nonjudgmental support they provide, to the psychological and societal risks posed by profit-driven design and increased AI agency. While AI is unlikely to fully replace human intimacy, it adds a new dynamic to how people seek connection, solace, and even erotic fulfillment. Both warn that as technology’s role intensifies in our romantic and emotional lives, so too does the need for thoughtful reflection on its design, our vulnerabilities, and the fundamental human need to be chosen and challenged. The future likely holds a blend: “multi-agent relationship structures,” with humans forming bonds both with fellow people and with ever-more-sophisticated machines.
