Transcript
Andy (0:06)
All right, so the way that I understand this is that the Doomers and the Accelerationists, they historically share a common origin. Yeah. Is that right?
Conor Leahy (0:17)
It's a very interesting piece of history.
Andy (0:18)
And so, like, maybe let's just start right there. Right. I don't think many people know that story.
Conor Leahy (0:23)
I would be surprised. Very, very few people have told this story in any good sense. I would love for more people to tell this story because it's a fucking crazy story, man. Oh. So the story as we have it today with this ChatGPT and AGI and ASI and AI risk, all of this stuff descends from this one weird offshoot 1980s group of futurists called the Extropians.
Gregory Warner (0:56)
This is The Last Invention. I'm Gregory Warner. Today, the case for stopping the AI race before it's too late. And surprisingly, it's a story that starts with extreme techno-optimism and an online community that came together around the belief that technology would redefine what it meant to be human. A group of people who called themselves the Extropians.
Natasha Vita-More (1:35)
This particular culture, this community of people, were really looking at what no one, really, no one else was thinking about at the time.
Gregory Warner (1:45)
Early participants in this online forum included designer and author of the Transhumanist Manifesto, Natasha Vita-More.
Natasha Vita-More (1:52)
We had dial-up, where we could get on our phone lines, communicate via the World Wide Web, and have our discussions.
Gregory Warner (2:02)
And her now-husband, co-founder of the Extropians, Max More.
Max More (2:07)
Before that, there were a few sort of chat rooms, but it was really one of the very first Internet forums of its kind.
Keach Hagey (2:12)
So there was this group of libertarian...
Gregory Warner (2:15)
...techno-utopians. Again, author and Wall Street Journal reporter Keach Hagey.
Keach Hagey (2:20)
And they call themselves the Extropians because extropy is the opposite of entropy.
Gregory Warner (2:25)
