Transcript
Jordan Harbinger (0:01)
We're spending more than ever. I hate my job.
Kashmir Hill (0:04)
The price of everything has gone up.
Jordan Harbinger (0:06)
AI is threatening my job.
Kashmir Hill (0:07)
It's crisis after crisis.
Jordan Harbinger (0:09)
Nothing is working out.
Kashmir Hill (0:10)
I can't find a— We're one disaster away.
James Patterson (advertisement voice) (0:12)
Take control of change.
Kashmir Hill (0:15)
I need a change.
James Patterson (advertisement voice) (0:16)
Disruption is the force of change.
Jordan Harbinger (0:20)
Stop the chaos. Stop the madness. Take control. Read James Patterson's Disrupt Everything and Win. Coming up next on the Jordan Harbinger Show.
Kashmir Hill (0:33)
I feel like I'm doing like quality control for OpenAI, where I'm like, hey, have you noticed like that some of your users are having real mental breakdowns or having real issues? Did you notice that your super power users who use it eight hours a day, have you looked at those conversations? Have you noticed that they're a little disturbing? It's the Wild West.
Jordan Harbinger (0:55)
Welcome to the show. I'm Jordan Harbinger. On the Jordan Harbinger Show, we decode the stories, secrets, and skills of the world's most fascinating people and turn their wisdom into practical advice that you can use to impact your own life and those around you. Our mission is to help you become a better informed, more critical thinker through long-form conversations with a variety of amazing folks, from spies to CEOs, athletes, authors, thinkers, performers, even the occasional Fortune 500 CEO, neuroscientist, astronaut, or hacker. And if you're new to the show or you want to tell your friends about the show, I suggest our episode starter packs. These are collections of our favorite episodes on topics like persuasion and negotiation, psychology and geopolitics, disinformation, China, North Korea, crime and cults, and more that'll help new listeners get a taste of everything we do here on the show. Just visit jordanharbinger.com/start or search for us in your Spotify app to get started. Today on the show, we're talking about something that sounds like science fiction, but it's happening right now. People are losing their minds, often literally, because of their conversations with AI chatbots. This started for me when I read a piece in the New York Times by my guest today, journalist Kashmir Hill. She's been on the show before. This was about a Belgian man who took his own life after six weeks of chatting with ChatGPT. He was married, he had kids, he had a stable job, and yet, after falling into what he believed was a relationship with an AI companion, he was persuaded that his family was dead. Not sure how that works. And that his suicide would somehow save the planet. The chatbot even told him that they would live together in paradise. Okay, I know that sounds insane, but this is not an isolated case. We've now seen multiple people, fragile, vulnerable, sometimes previously stable people, become convinced that these models are sentient.
Some fall in love, some go psychotic. Some tragically never come back from this. We'll talk today about why AI is so compelling, how it can manipulate the vulnerable, and what researchers are calling AI psychosis, a new, unofficial, but terrifying phenomenon where people become addicted to their chatbots and spiral into delusion. We'll also get into why people fall in love with chatbots, cheat with them, if that's even possible, or start treating them as spiritual guides. And crucially, we'll ask: where does responsibility lie? With the users, the companies, or with the algorithms themselves? Kashmir Hill has spent years reporting on technology and its unintended consequences. Her work exposes the human cost of our obsession with AI, from privacy breaches to psychological fallout. And this story, frankly, might be her most disturbing yet. All right, here we go with Kashmir Hill. I actually got the idea for the show because of the push notifications in the New York Times. Apparently those work, and it turned out to be an article you wrote. And I was like, this sounds interesting and I know the author, so I'm going to read this, and we'll get to the content later. But it was essentially people going crazy because of their interactions with ChatGPT, and that's not totally accurate, right? It's not that they're going crazy because of ChatGPT, probably, but the phenomenon is concerning. What else do you say about that? People are actually dying because of this. That's not okay. What's going on here, in your opinion? And I'll dive deeper into all of these specific instances. But it's getting weird out there, Kashmir.
