Transcript
Julia Longoria (0:00)
AI agents are getting pretty impressive. You might not even realize you're listening to one right now.
Podcast Sponsor/Advertiser (0:05)
We work 24/7 to resolve customer inquiries. No hold music, no canned answers, no frustration.
Julia Longoria (0:12)
Visit Sierra AI to learn more.
Sean Illing (0:20)
There's a lot of uncertainty when it comes to artificial intelligence. Technologists love to talk about all the good these tools can do in the world, all the problems they might solve. And yet many of those same technologists are also warning us about all the ways AI might upend society. It's not really clear which, if either, of these narratives is true, but three things do seem to be true. One, change is coming. Two, it's coming whether we like it or not. Hell, even as I write this document, Google Gemini is asking me how it can help me today. It can't. Today's intro is 100 percent human-made. And finally, it's abundantly clear that AI will affect all of us. Yet very few of us have any say in how this technology is being developed and used. So who does have a say? And why are they so worried about an AI apocalypse? And how are their beliefs shaping our future? I'm Sean Illing and this is The Gray Area. My guest today is Vox host and editorial director Julia Longoria. She spent nearly a year digging into the AI industry, trying to understand some of the people who are shaping artificial intelligence and why so many of them believe that AI is a threat to humanity. She turned that story into a four-part podcast series called Good Robot. Most stories about AI are focused on how the technology is built and what it can do. Good Robot instead focuses on the beliefs, values, and, most importantly, fears of the people funding, building, and advocating on issues related to AI. What she found is a set of ideologies, some of which critics and advocates of AI adhere to with an almost religious fervor, that are influencing the conversation around AI and even the way the technology is built. Whether you're familiar with these ideologies or not, they're impacting your life, or certainly they will impact your life, because they're shaping the development of AI as well as the guardrails, or lack thereof, around it. So I invited Julia onto the show to help me understand these values and
Sean Illing (3:13)
the people who hold them. Julia Longoria, welcome to the show.
Julia Longoria (3:25)
Thank you for having me.
Sean Illing (3:27)
So it was quite the reporting journey you went on for this series. It's really, really well done, so first of all, congrats on that. And we're actually going to play some clips from it today.
