Transcript
A (0:06)
We are joined today by Mike Nellis. He is a former advisor to Vice President Kamala Harris, a social media influencer, an entrepreneur, and an AI political expert. How all that works together, I don't know. And so I'm fascinated by the role that AI is playing right now in our politics. Specifically, when you get into these super right wing or even super left wing echo chambers where they just feed you what you want, how is that impacting our elections, do you think?
B (0:41)
Yeah, well, there's just a ton of AI generated slop and basically no regulations to prevent it from happening. Although the Governor of New York, Kathy Hochul, just suggested a 90-day moratorium on AI in New York elections. But I think what it's going to do is, AI is sort of like every other facet of the Internet: it's going to make everything good potentially better, and it's going to make everything bad potentially significantly worse. And so to me, there are a million different tactics that have always been unsavory in politics. Like when I was in high school, I was volunteering for a state senator back home in Nebraska, and the Republicans would hand out flyers saying, you know, Republicans vote on Tuesday and Democrats vote on Wednesday. Now you're going to get a million things like that, except the scale will be bigger and the imagery will seem more real. And it's going to be hard. Unless Congress and local governments step in, we're going to have a massive problem, I think in this election, and it's only going to get worse from there.
A (1:32)
Well, and we see that Trump doesn't want any regulation on AI, for obvious reasons. But we saw the horrific murder of Renee Good, shot in the face by ICE agents. What is the role of AI in the propaganda that is being pushed out by the Trump administration? Because the lack of shame amazes me, and I think, how can anybody look at that video and not be outraged? So what is the role of AI in that?
B (2:02)
Well, the first thing is AI can sort of detach people from reality, right? It's going to make it harder for them to know what's real and what's not. And if you look at MAGA circles right now, they're sharing all kinds of fake videos of what happened to Renee Good. And let's be clear, she was murdered by that ICE agent. There was no reason for him to shoot. There was no reason for them to escalate. And now they're sharing fake videos, they're sharing fake information about her life and her kid. I saw one that popped up on X, which continues to be a complete cesspool, where they were showing her basically in a lewd way, is maybe the nicest way I can describe it. But they were trying to undermine her credibility, which happens to a lot of women.

And I don't know if you follow the story at all, but on X, Elon Musk has this AI tool called Grok, and Grok has its own image features. All these image features have, you know, sort of spiraled out of control in the last week, to the point they had to shut them down, because you could basically create child pornography, you could be creating naked images of people, and it's very horrific. And again, you come back down to this core fundamental thing: do you trust Elon Musk to hold the future of the Internet? Or can we get somebody in the government at some level to step in and regulate this stuff? Because it's going to destroy whole industries, but it's also just going to continue to hurt people.

So I think for Trump, he wants to use AI to create, you know, silly stuff like fake endorsements from Taylor Swift, which I think everybody knows are false. But it's going to be hard, because imagine you go back to 2020, when they were claiming that Democrats were stuffing ballot boxes. Now you're going to have AI generated images of Mike Nellis stuffing a ballot box, because, you know, the Republicans want to create that, and a lot of people won't be able to tell the difference.
