A
Meta is investing $600 billion in AI infrastructure, bringing jobs to communities across America. Phil, a Meta building engineer in Los Lunas, New Mexico, says welcoming Meta into our community is creating more opportunities. Learn more at meta.com/BuildingAmerica.
B
Hey, welcome back to Politico Tech. I'm your host, Stephen Overly. And on this show, I break down tech, politics and policy with the people shaping our digital world.
All eyes have been on the White House this week, where an executive order aimed at banning state AI laws could come as soon as today. This idea has been floated in Congress a couple of times this year, but so far it's fallen short. And now it seems President Donald Trump is tired of waiting on Capitol Hill. He took to Truth Social this week to declare that the US is beating all other countries in AI, but that that won't last if 50 different states write rules for the technology.
On the show today, Politico technology reporters Gabby Miller and Brendan Bordelon helped me make sense of the politics surrounding this moratorium, plus what Trump's actions could mean and why this issue is far from over. Here's our conversation.
Brendan, Gabby, welcome back to Politico Tech.
C
Thanks so much for having me.
D
Yeah, happy to be here.
B
So this idea of a federal moratorium on state AI laws feels like a cat with nine lives at this point. How did we end up at a White House executive order?
D
Right.
C
So this effort goes all the way back to this summer, when Senator Ted Cruz wanted to tack a 10-year AI moratorium, which would freeze state AI laws for about a decade, onto Trump's One Big Beautiful Bill. That initially fell out because it didn't pass the Byrd rule, which basically says that pure policy provisions won't make the cut; there has to be some kind of funding element. Cruz then pivoted and tied broadband funding, a program to expand high-speed internet to different areas, particularly rural ones, to whether states decided to go forward on AI legislation. That also failed at the 11th hour because Senator Marsha Blackburn, a Republican who has worked extensively on the Kids Online Safety Act, had expressed concerns that it would impact states' ability to keep kids safe online. So that ended up failing in the Senate, 99-1. The effort was revived recently when we heard that House GOP leadership was actually considering tacking some sort of preemption language onto the yearly defense spending bill. It really took folks by surprise. We heard from Senate Commerce ranking member Cantwell, who had been negotiating for months with Republicans after Cruz's failed attempt to try to reach a consensus on AI preemption. She was really blindsided by this introduction from House GOP leaders, and Democrats felt like they were boxed out of negotiations. No text ever came forward; they kept things close to the vest. But it didn't end up making it into the NDAA because, basically, Scalise said it wasn't the right moment for a moratorium. Things also got complicated because the White House decided to jump into this. They wanted to issue their own executive order on AI, and a draft leaked that would basically block state AI laws as well.
Trump said on Truth Social earlier this week that he was going to issue an executive order creating a single, national federal standard for AI, which amounts to an AI moratorium. But things are moving so quickly that it's really still unclear what is in the works and when it'll come. So, yeah, we're all just waiting to see what that looks like.
B
Yeah, I mean, Trump took to Truth Social and was really parroting a talking point I hear a lot from industry: this idea that you can't go to 50 different states looking for approval if you're trying to move fast with AI and with developing new technology. To the extent that tech companies actually want any AI regulation, and I think that's still very much a question, they are pushing for a federal standard. Not all of them think a state moratorium without a federal standard is a good idea. Who stands to gain and who stands to lose from this idea of banning state AI laws?
D
Yeah, I can jump in here. I think if the tech companies and the AI industry had their way in a vacuum, obviously they would prefer no regulations at all, right? They'd just be able to do what they want at the state level and at the federal level. But they don't live in a vacuum. We live in a democratic republic where the people's representatives are interested in setting rules on AI, and they are increasingly interested in doing so. If you look at the politics of this technology, it's not necessarily moving in the industry's direction. I think they see right now as probably the best moment to lock in rules at the federal level, maybe at the state level too if they can't get it at the federal level, rules that are relatively light-touch, so relatively easy for the industry to comply with. Then they can say, hey, we did our rules on frontier AI safety, we did our rules on chatbot safety for kids, these are done, these are rules we can largely live with. And they avoid, two or three years down the line, a much more aggressive set of regulations. So by and large, I think most of the tech industry would like to see some sort of compromise on AI regulation. The contours we were hearing toward the tail end of the defense bill conversation were something on kids' safety and something on frontier AI safety, so the existential risk conversation, concerns about AI killing lots of people or wiping out billions and billions of dollars. That stuff would be roped into a federal bill in exchange for a block on states regulating the technology more generally. And actually, I think a lot of Democrats would go for that too, even potentially a lot of folks who are skeptical of the technology, because it would lock in some federal rules. You do have maximalists on either side of the debate as well, though.
You have folks who are really worried about AI and want to regulate it super aggressively, and then you have folks who are allied with the tech sector but really don't want to see any rules at all. They just want a blanket ban, a blanket moratorium. You hear that kind of thing coming from Andreessen Horowitz, the venture capital firm close to the White House. You hear it coming from David Sacks, the White House AI czar, who is obviously very closely tied to industry; he's a venture capitalist as well. And you kind of hear it from folks like Jensen Huang, the Nvidia CEO, who has actually been on the periphery of this preemption fight too. So it's a little bit like four-dimensional chess. I think the tech lobby would prefer no regulations, but they are now working to get some regulation in exchange for state preemption, to avoid that patchwork of rules and hopefully head off tougher laws down the line.
C
Well, I did want to add that there are a lot of people, Republicans and some folks in industry, who beforehand had this kind of sense of, if we're going to do AI preemption, let's just block state AI laws. But like Brendan said, something is coming down the pike, and many of them are now looking to what happened in California, especially with SB 53, to take that to the national level. I spoke to Congressman Obernolte just earlier this week, and he was telling us that if we're going to do preemption, you have to replace it with something. And he is a Republican congressman who frequently speaks with the White House. So I think there is a real desire to get something done.
B
You know, when you've been covering tech a long time, like I have, like you two have, you hear this argument over and over again from industry about a patchwork of tech regulation. And the reality is that there are a lot of areas right now where the tech industry does deal with a patchwork of state-level tech regulation: varying data privacy laws, cybersecurity laws, consumer protection laws. Is AI actually any different?
D
Well, I mean, it depends who you ask. The AI industry would say yes, it is different from something like data privacy, where the tech industry faced this exact same problem maybe five or seven years ago, and I'm going to come back to that in a second. They say it's different largely because AI is a geopolitical football in a way that data privacy, or other rules around technology, even social media regulation, which I think has some geopolitical elements, is not. Those aren't national security issues, issues in the race with China, the rivalry with China, in the same way that AI is. There is very much an effort by the tech lobby to paint AI as a fundamentally different technology from the other things states have regulated in the past, because of all these other equities, particularly, I think, related to defense and national security. The reality is probably a little more complicated. I don't think a patchwork is necessarily as threatening to the industry's bottom line and to innovation as they like to say, in part because industry has actually become really good at co-opting these state patchworks. To go back to data privacy: California passed a very aggressive data privacy law back in 2018, and another one in 2020. Industry really, really, really did not like it. They went to Washington to say, hey, can you preempt this? Pass a federal privacy law that blocks this, because we don't like the California version. Washington never got off its butt. But in the meantime, the tech lobby did a very good job of going to every other state capitol and saying, hey, here's a privacy bill we can live with, you should just pass this. And they basically cordoned off the California privacy law, so the rest of the country really didn't have much of a patchwork.
I mean, there were some variations in data privacy, from, you know, Utah or Colorado or Washington state or wherever, but for the most part they were all following the same relatively pro-industry framework. And you're seeing the same thing play out with AI right now: a lot of AI lobbyists are working hard to make sure their preferred bills become law in a variety of states. And California in a lot of ways is now working with the tech lobby on this, so they've even kind of defused that bomb in Sacramento that really blew up in industry's face on privacy. So even if Washington doesn't move on this, and that is definitely an open question, and even if the executive order we've been talking about doesn't work, and a lot of folks are worried about the legal questions around the White House trying to unilaterally block state AI rules, I think the AI lobby is going to do a pretty good job of getting what it wants out of the states. They would still prefer a national approach, but they will work the states and try to make the patchwork work for them if they can.
A
Meta's AI infrastructure is bringing jobs to local communities like Los Lunas, New Mexico. Phil, who grew up in Los Lunas, has seen the positive impact of Meta's ongoing $600 billion investment in American jobs and infrastructure. I had to travel for work, missing moments I can't get back. Then Meta opened a data center and brought new jobs. Now I don't worry about missing out anymore. Learn more about Meta's investment at meta.com/BuildingAmerica.
B
Well, states are where all the action is happening right now. This last year we saw a record number of AI-related bills proposed in the states, well north of a thousand. That number has grown each year, more than doubling in recent years. Is the idea that all of that could just grind to a halt, that somehow state capitals are going to pack it up and go home?
C
I did want to clear something up about that. That figure of over a thousand bills introduced last year gets tossed around a lot, and the thing to keep in mind is that it counts any kind of bill that mentions AI. It could be a very tiny provision in, say, a healthcare bill that at its core is not about AI but has some sort of AI provision in it, and that gets lumped into the count. So when we hear over a thousand state AI laws, there are actually far fewer than that that are at their core about AI. That's the one thing I'll say to kick that off.
D
I can jump in on this if you want. I don't think the states are going to stop, because this has taken on a political life of its own, particularly in the wake of the White House's leaked executive order. You've seen people come out of the woodwork in the states, Republicans, governors, legislators, saying, hold on, we want to regulate this technology; it's very important to us and to our voters that we do this. And it's had this perverse effect: the White House is trying to stop states from regulating AI, but I think it's kind of backfiring, at least so far, in that it's raised the political salience of AI in a way that doesn't help industry or the people trying to preempt state laws. And on this executive order, the sense among industry and among the lawyers I've talked to, and it's widely shared, is that there's not really a firm legal basis for it. So at best, I think the executive order is going to get snarled in the courts for months, maybe years. Industry loves certainty, and they're not going to get that; it's not like they can ignore state laws in the interim while this works its way through the courts. And then there's a sense that at the end of the day it's going to get thrown out. So without legislation blocking state AI laws, I think industry is going to feel like they're not really in a good place. They won't feel this executive order gives them the clarity they need, they think it might not survive scrutiny, and in the meantime states are not going to stop. There's a political snowballing going on here, where folks are more and more interested in regulating AI as the preemption conversation continues and becomes, frankly, more and more partisan. So no, I don't think they're gonna stop. I think you're even gonna see Republicans buck the White House on this.
You're already seeing that.
C
I think in some ways this has actually whipped up more of a frenzy than people initially expected. There's this idea that you only get so many bites of the apple in terms of trying to pass an AI moratorium or get it across the finish line. And the opposition, like Brendan said, it's been, you know, Republicans, it's been attorney generals, it's been kid safety advocates, stakeholders who are not usually aligned and what one would consider often strange bedfellows. But this has whipped up such concern among these people that there's growing worry, even among those trying to pass an AI moratorium, that it could actually hinder future attempts at doing so.
B
Well, there are so many legal questions here. One of them is, as lawyers dig into the language of any federal moratorium, how broad is it? Does it capture some of these bills that only make a passing mention of AI and nevertheless could be interpreted as AI legislation? And one of the fundamental questions that keeps coming up again and again is the tension between state and federal power: ultimately, who has the authority to regulate this powerful, fast-moving technology? It seems like that will have to play out in court in some capacity. Brendan, you've alluded to the arguments here. What arguments can we expect each side to make in this debate?
D
Yeah, I hate to do this, but it will depend in part on the contours of the final order; as we're recording this, we're waiting to hear. But in broad strokes, from what we got in the draft executive order, there is likely to be some sort of argument on the government's side about interstate commerce: this idea that the interstate commerce clause allows the federal government to step in and prevent states from essentially blocking the ability of these companies to work across state lines. There are companies that develop the model in one state, train it in another, and mine data in a third state to feed back into the model. So that's going to be one argument; I don't know if it's going to work or not. There's also going to be an effort, to the extent that agencies like the FTC and the FCC are brought into this conversation and sort of sicced on the states, which was the goal of the original draft order, to basically leverage the full power of the federal government to pressure states to stop passing AI laws or stop enforcing AI laws. I think there are going to be a lot of conversations about existing federal law. Does the FTC Act allow the Federal Trade Commission to basically declare a state AI law an unfair or deceptive practice, or something like that? The government will probably try to shoehorn existing law to fit this executive order and what it's directing the agencies to do. Open question whether that works or not. I'm not a lawyer, but a lot of the lawyers and folks in industry I've talked to say it's going to be a high bar. That said, we have seen the courts, especially the Supreme Court, rule for this administration in ways that have surprised folks. I don't know if this will make it to the Supreme Court or not; I wouldn't be surprised.
But again, that's going to be a long time and I think industry is probably looking for a quicker fix than that.
B
So I guess the last question on my mind is how we expect this executive order to change the dynamic on Capitol Hill. Because, Gabby, you were saying earlier there have been sort of these negotiations happening. Lawmakers have been working on this. Is this likely to speed up possible AI legislation or could it kill it altogether?
C
I do think there is maybe a slightly increased sense of urgency, just because there's been so much action on this front. But at the same time, we know Congress is pretty slow to legislate on anything tech-related, and we're going into a midterm election year. There are efforts, though. For instance, earlier this week a new Democratic AI commission was announced by House Minority Leader Hakeem Jeffries. I spoke with one of its co-chairs, Representative Josh Gottheimer, and he was telling me there are Democrats, particularly more Silicon Valley-minded or moderate Democrats, who are willing to maybe get to a yes on preemption. I don't know that that would have been the case beforehand. When people first saw the AI moratorium Cruz released, it felt like, whoa, this is like nothing we've seen before. So maybe because we've been hashing this out, folks on the Hill now know this isn't necessarily going to go away, and there's more likely to be debate on it. Just the fact that Cantwell, as I was saying earlier, was discussing preemption at all, not just with folks like Cruz but with the White House and David Sacks himself, says something. Some more progressive Democrats are saying, absolutely not, we wouldn't even consider it. But Cantwell is the ranking member of the Commerce Committee, and she holds a lot of power in this debate. So there's a lot of room for them to get to a yes. I do think the carve-outs will be key here: there's going to have to be kid safety, there's going to have to be consumer protection. There's a big kid safety push on the Hill this week to get something passed before the midterms, so those kid safety carve-outs are something Congress genuinely cares about.
But yeah, we'll see how it plays out next year in a midterm election year.
D
Yeah, Gabby's making the positive case; I'm going to make the negative case for this not happening next year. I've heard from a lot of folks around these negotiations that the best option they had to get preemption was probably the NDAA, the year-end defense bill, and that just did not work out for them. I think they're going to try again: there are some must-pass legislative vehicles next year. People are talking about the continuing resolution in January; they're talking about potentially returning to this in the lame duck, so after the election. But I think the politics of this issue are increasingly crowding out the potential for a reasoned policy compromise. In part because of the executive order, because of the White House coming in so aggressively over the top, and in part because of the failure of the negotiation folks thought was probably the best shot, people are starting to run to their corners. Things are starting to freeze up a little, particularly going into the election, and I think Democrats and Republicans alike are going to have a tough time giving the other side a win, particularly when the White House continues to struggle to broker a compromise. That might be the biggest wild card. If there are folks in the White House who run AI policy and are willing to say, hey, I'm actually willing to give something for AI preemption, I'm not just going to try to ram through a moratorium while not addressing concerns about kids' safety or frontier AI safety, then maybe you can get somewhere. But a lot of folks I talk to in industry, outside it, and around the AI community think that the window is closing pretty fast. I do think Democrats are interested in compromising, and I do think some Republicans on the Hill are interested in compromising.
But there needs to be more of a signal from the White House, and so far that really hasn't come. If anything, the EO, or at least the effort to get an EO out there, is pushing things in the opposite direction. So I'm not quite as optimistic. But there are certainly a bunch of folks who would like to see this happen, so we'll have to see how it unfolds in 2026.
B
Well, I love ending on a point-counterpoint, and we'll certainly be watching what the new year brings on this. Gabby, Brendan, thanks for being on Politico Tech.
C
Thank you so much for having us.
D
Yep, anytime.
B
That's all for this week's Politico Tech. If you like the show, go ahead and subscribe and recommend it to a friend or colleague. And for more tech news, subscribe to our newsletters, Digital Future Daily and Morning Tech. Our producer is Nirmal Mulaikal. Philip Frobos edited and mixed today's episode and Praying Bandy made our theme music. I'm Stephen Overly. See you back here next week.
Date: December 11, 2025
Host: Stephen Overly
Guests: Gabby Miller and Brendan Bordelon (Politico technology reporters)
This episode examines the growing battle between the federal government—specifically the Trump administration—and U.S. states over who gets to regulate artificial intelligence. With the White House reportedly days away from issuing an executive order that would ban state-level AI laws, host Stephen Overly and Politico tech reporters Gabby Miller and Brendan Bordelon break down how we reached this moment, the political dynamics at play, why the tech industry is both pushing for and nervous about federal preemption, what precedent exists, and what the future may hold for both industry and lawmakers.
On patchwork regulation and industry claims:
“I don't think that a patchwork is necessarily as threatening to the industry's bottom line and to innovation, as they like to say, in part because industry has actually become really good at co-opting these state patchworks.”
— Brendan Bordelon [08:54]
On polarization around preemption:
“The opposition, like Brendan said, it's been, you know, Republicans, it's been attorney generals, it's been kid safety advocates...often strange bedfellows.”
— Gabby Miller [14:13]
On the future:
“If there are folks in the White House...willing to give something for AI preemption...then maybe you can get somewhere. But a lot of folks...think that the window is closing pretty fast.”
— Brendan Bordelon [20:51]
The conversation is fast-paced, wonky, but accessible—mixing deep policy analysis with anecdotes from Congress and industry. The host pushes for clarity, while the guests add color from their Capitol Hill and industry sources. The mood shifts from informative to skeptical, closing with a classic point-counterpoint.
This episode paints a picture of a government grappling with a new technology it can barely keep up with. The struggle between state and federal authority, industry lobbying, and patchwork regulation is nothing new—but AI’s perceived stakes (from economic competitiveness to national security to child protection) make the current moment unusually heated. Despite a coming White House executive order, expect more legal, political, and legislative drama—with real resolution possibly years away.