Progressive Insurance Advertiser
This episode is brought to you by Progressive Insurance. Do you ever think about switching insurance companies to see if you could save some cash? Progressive makes it easy to see if you could save when you bundle your home and auto policies. Try it at Progressive.com. Progressive Casualty Insurance Company and affiliates. Potential savings will vary. Not available in all states.
Podcast Host
Day or night, Vrbo Care is here 24/7 to help make every part of your stay seamless. If anything comes up or you simply need a little guidance, support is ready whenever you reach out, from the moment you book to the moment you head home. We're here to help things run smoothly, because a great trip starts with the right support. And hey, a good playlist doesn't hurt either. Today on I've Had It, we are joined by Alex Bronzini-Vender, who is a writer for New York magazine. He is part of the Harvard Crimson, correct?
Alex Bronzini
Yes.
Podcast Host
And he just wrote the most fascinating article for New York magazine: "Palantir Comes to Campus." At a quiet conference at Yale, the company and its allies sketched a vision for AI, state power, and how to mix the two. And so, as I was telling Alex before we got on here, my view of Peter Thiel and Alex Karp and Palantir is that they are maniacal nut jobs who want to take over the world. Peter Thiel has been outspoken about how he does not believe democracy can coexist with the future, and about this just rampant elitism that the smart people should control everybody else, which I interpret as: rich people should control poor people. That's where I come to this article. So what I thought was fascinating about this whole deal were the contradictions in it, and the fact that Alex Karp and Peter Thiel are so outspoken about what they think tech should do. They should do weapons. They should be involved in ICE. They should, you know, do the draft. They've made all this money under Trump, I think a 93% increase in revenue, yet they go on campus on the down low. So I thought that was an interesting thought. How did you perceive that?
Alex Bronzini
I mean, it was a highly low-profile conference. There was no signage in the lobby. I only found out about it through a flyer that was sent by my editor. There was nothing promoted about this publicly. The students I spoke to at this event told me they believed the invitation had been extended to them by the organizers because they were, quote, "not going to protest." And some of this makes sense. At Yale, the student government recently voted to divest from Palantir. They're not popular on college campuses. So it made sense to me that they would choose to do this so secretly. What maybe made less sense to me is the fact that they would choose to do it at Yale at all. You may know that Alex Karp and Palantir and this whole sort of intellectual scene, if we can call it that... the Palantir co-founder Joe Lonsdale is one of the trustees of the University of Austin, which is the school that was co-founded by Bari Weiss. So why are they doing it at Yale instead of at the University of Austin? That was a question I didn't fully have answered. And my guess there, if you'll allow me to kind of rant here, is that the purpose of these kinds of institutions that are meant to challenge the big institutions, you know, the University of Austin and also Bari Weiss's Free Press, is not for them to exist as an end unto themselves. The point of the Free Press is to take over CBS News, in the same sense that the point of the University of Austin is to eventually provide a staging ground from which, you know, you can take over, or at least exercise immense influence at, Yale.
Podcast Host
I thought the same thing, because the student body did vote to divest; you pointed that out in your article. But I also thought, when I saw the headline "Palantir Comes to Campus," my first thought was, oh God, now Palantir is going to be Turning Point. And it became very clear from the article where Turning Point goes: low education, lacking critical thinking. They are looking for the intellectual elites. So it kind of made sense to me that they would go to an Ivy League school to do that, because they are after this aristocracy of intellect. And I think you even said that in the piece. Am I right?
Alex Bronzini
Yeah. One of the speakers, Rusty Reno, used the phrase "aristocracy of the intellect" to describe what he believes AI will do to society. It's going to create this sort of underclass of people who just kind of get all their ideas from the models. And then the rest of us, all of us who go to elite schools and who, you know, go to the Palantir conference, are the people who give the chatbots the information on politics and on society that's then fed to this underclass.
Podcast Host
Okay. So they were trying to address questions, and I thought a very pointed question that came from the students, for obvious reasons, was: is our education going to become obsolete, and is AI going to take over? What jobs will be ours? And I thought the response from the people on the, what do you call it, the panel, the speakers, was a very bizarre answer. If you'll explain that for us.
Alex Bronzini
Yes. So Maya Sulkin, who is a writer at Bari Weiss's Free Press, had a question for two of the panelists, Roger Kimball and Rusty Reno, who are the conservative editors of The New Criterion and First Things, respectively, which was about what happens if AI takes jobs. And Kimball's answer was "a rising tide lifts all boats," sort of the idea that this is going to increase productivity and that will benefit everyone. Now, of course, what I found kind of strange about that response, and I told him this after, is that if you are someone who's a Trumpist like him, you can recognize that globalization and the opening of free trade to the rest of the world, while it created a lot of growth in the United States, did not generate equal growth and left a lot of people worse off, even though it left the United States, on balance, better off. So why can't you realize that about AI? What's the intellectual block there, where you can't extend that same scenario, the one they oppose so vocally in globalization, hence how they want tariffs and all that, to AI? And they just didn't really seem to have an answer there.
Podcast Host
Well, it was kind of like, when I was your age, the same motto was trickle-down economics: wealth growing wealth, it will trickle down to other people. And as we have seen, with the complete gap in income and assets, that did not happen, nor do I think AI is going to... you know, rising tides, what is it?
Alex Bronzini
Rising tide lifts all boats.
Podcast Host
Lifts all boats. It's not going to lift shit. Because in the same article, at the same forum, they're talking about being the elites while everybody else is going to be the consumers. And it just struck me, with the questions they were being asked, it felt like the students there got poo-pooed on. Like, they even compared it to climate change, flippantly, like, oh, we've solved climate change, that's no longer an issue. And they even went so far as to call it nihilism to not accept their form of thought. And I wonder how that hit you as a young person.
Alex Bronzini
Well, I mean, I spoke to some of the young people in the room about how they felt about the comparison of AI to climate. Some were like, well, you are this sort of long-tenured editor of this conservative magazine. You have absolute job security. Why are you telling me this? I think that other people, especially at elite colleges, feel quite secure, and I don't know if that's a reasonable assumption, you know, that just because you have a degree from Yale you will be protected in perpetuity from job loss if it happens. The other thing I was thinking about is that even if you listen to what people like Dario Amodei at Anthropic say, they are intensely afraid of AI job loss. This isn't a concern that's, like, a radical lefty thing. This is an incredibly mainstream thing to fear in the world of AI development. So I found it incredibly striking that that was not a fear that people in that room seemed to share.
Podcast Host
You're right. It's so mainstream that my MAGA, Fox News-watching parents are concerned about AI taking over jobs.
Alex Bronzini
Yeah, you're not, like, a pinko if you think that, you know, AI poses a serious risk of dislocating the labor market.
Podcast Host
Yeah, I just thought the whole thing was arrogance on steroids. Okay. One thing I wanted to get to: Google's motto used to be "don't be evil," and they just thought that was the dumbest thing they'd ever heard. If you could expound on that.
Alex Bronzini
Yeah. So Palantir's sort of mythology: if you read Alex Karp, who's the CEO, and Nicholas Zamiska, who's the director of corporate affairs and general counsel, they co-wrote a book last year titled The Technological Republic, which kind of lays out their vision of the tech sector and US society and what both should look like. And they as a company have long taken issue with Google's former motto, "don't be evil," which they say emphasizes the individual and their obligation to have moral clarity. They charge it with being this kind of techno-libertarian idea that allows the tech sector to say, we're not of the state, we have no responsibility to the state. And on one hand, I think they are narrowly correct in their observation. Big tech is not this thing that emerged from nowhere. The government funded the Internet, the government funded all the technology that then became the smartphone. Big tech commercialized these technologies; it did not invent many of them. So I think they're narrowly right in this observation, but then look at their idea of what it means. They're like, tech should serve the nation. Okay? But their idea of service to the nation is: tech should help the US military detect targets to kill. Tech should help ICE decide who to deport. Tech should do all these things that engineers at Google protested over and resigned over. And something that I think is maybe the most flawed assumption about their whole idea, that this can be a new legitimating myth for the tech sector, the idea that "we're newly willing to do the things that the US government wants us to do and that will make people like us," is that, well, actually, I don't think that providing the technology for the war in Iran, which is a loathed war, people don't like this at all.
I don't think that being the tech company that says we are the ones willing to power this war is at all going to make them popular with the American people. I think even on the American right, there is a deep suspicion of mass surveillance and of big data in that sense. So I think that they have miscalculated there.
Podcast Host
I hope you're right. And one thing they said, they go on, "don't worry about your jobs, it's gonna be fine," and then they talk about how AI is not gonna be running the government, it's going to be the government. So how do they square that circle?
Alex Bronzini
Yeah, so that was a quote from Dean Ball, who is a former Trump advisor. And he has this kind of very expansive vision about what AI is going to do to human institutions, that it could conceivably replace the Supreme Court. Actually, maybe we'd be slightly better off if AI did. Right? Maybe. But in any case, he has this whole idea about how AI is going to change the nature of the state in the same way that the printing press changed the nature of the state, and the telegraph changed the nature of the state, in that you could now communicate information over long distances. And, you know, this new technology is also going to be equally transformative for the way that we do statecraft going forward. And that may well be the case, that AI has these dramatic effects upon how governments work that we'll need to plan for. That said, that's kind of obviously contradictory to the view that Reno and Kimball have of AI, which is that it is like climate change, which is fake, and therefore we have no reason to worry about it, and young people have no reason to be concerned about the impact of this technology upon labor markets or upon the world; that's just nihilism and doomerism.
Podcast Host
But yet it's going to take over everything. Basically, it's going to run everything. Okay, one other quick question before we wrap up. I got the impression from your article that they are all-in on government, like, eminent domain and taking land for these AI centers. Like, I live in Oklahoma City. We have a law right now: you can't build one of those centers, the AI centers where they... yeah, data center. Sorry. Gosh, I'm getting old.
Alex Bronzini
It's fine.
Podcast Host
The data center. You can't do one in Oklahoma City, and I guess they'll do that every year. But they're saying, if we want to do a data center somewhere, and we don't give a shit about your electricity and your grid and your prices, the government can go in and say, this is where we're doing a data center. I found that to be really alarming, particularly when you hear about all the problems they're having in places where they're taking all the electricity: the prices are high, it's too big for the grid. What did you think about that?
Alex Bronzini
Yeah. Well, one of the speakers in attendance was this guy Elliot Gaiser, who's the Assistant Attorney General for the Office of Legal Counsel, which means that when Trump wants to do something, he's the guy who decides whether it's legal or not. One of the sort of grave concerns that speakers at this conference had was, quote unquote, NIMBYism around data centers and efforts to block data center development in certain communities, and what can we do about that, given all the data centers we need to build? And one of the answers he had: earlier this year, he had used, I believe, the Defense Production Act to preempt the California environmental law that was blocking a pipeline. He said, well, actually, we need this for purposes of national defense, and therefore the pipeline can be built. And one of the sort of, I guess, novel things that I saw at this conference, which the administration had not said before, is that they believe it would be legal, in the event of a war, or in the event of, let's say, the US government deciding, oh, we need huge AI capabilities to defend the country, however that's defined, to use the Defense Production Act to build data centers wherever we want them.
Podcast Host
It's like, tough shit, wherever you're going to do it, you don't want it. It's kind of like the Alien Enemies Act with Trump: like, there's an emergency, so now we're going to be able to take all these people in.
Alex Bronzini
Right?
Podcast Host
It's like, that's bullshit, but we get what we want. So when you left... this is kind of the end of the article... it was kind of like, but I get it. You talked to people, and they're like, I don't think they're moral, it's not a great company, but if it's the only place I can get a job, I have to work there. And I get that: if you're drowning in debt and you cannot get a job, it's a beggars-can't-be-choosers type thing. So I just got the impression that the people who were at that forum would still go work for Palantir; if they couldn't get a job anywhere else, they would still do it.
Alex Bronzini
I mean, there were a lot of people there who had serious, serious moral reservations about Palantir. People who told me, privately, like, I think this is a company that's kind of evil and abuses civil liberties, and I'm concerned about what they're doing and the products they're selling. Some of whom were, like, enthusiastic about working there, which didn't make a ton of sense to me. I think there are some people who can kind of convince themselves that, well, this company is doing important things to defend the United States from China or whatever, and therefore I want to work there, and working there is good. That's sort of one school of thought among people who were in the room. There were other people who were just like, I think this is evil and irredeemable, but the job market is so terrible that I will work there anyway, or would if it had to come to that. And people told me stories. One girl told me a story about how she had a sister who had graduated from either Columbia or Harvard, she didn't remember, who had applied everywhere in tech. And she had gotten two calls back, basically: one from a company she didn't name, and the other from Palantir.
Podcast Host
Right. And so you are in a catch-22. I mean, it really is. And that's why, when you see all the moral carve-outs that people make for Trump and his behavior and his actions, it kind of tracks: people are willing to make the moral carve-out for their personal success. And I can't say that I'm judging it, because I've been fresh out of college, fresh out of law school. You've got money you have to pay back, you have to support yourself. Like, I get it. So when you left, did you think Palantir and its ideals are going to catch on like wildfire at these campuses with the youth, or did you get more of a sense of, oh, they're bad?
Alex Bronzini
The sense that I got is that they don't really need to convince anyone. Like, I think that they believe these things. I don't think this is just a marketing pitch; I think they really do hold certain ideas about how US society should be organized, and they're not just saying this so they can sell software. But on the other hand, do they really need to convince anyone? I mean, on one hand, they're kind of the only firm that does the things that they do, so they don't really need to convince the government. Even left-leaning governments, like the UK government, buy products from Palantir because they are the only company that does these things at scale. So they don't need to convince governments. Do they need to convince students? Maybe, maybe not, because again, the jobs pay very highly and they're prestigious. So I think the question of "does anyone believe in this stuff?" is, in some sense, beside the point. That was my conclusion.
Podcast Host
I mean, that's a great point, because they really do have the edge on the market. After we saw how they've been getting military contracts, and they're involved in ICE, I read an article, or maybe somebody said it, I can't remember where I got it, but they said if you removed Palantir from the US government, it would be like removing the body's central nervous system. So, I mean, it's fucking deep in there, and we know that all of our private information is in the hands of this company. And at this point it's like, what are our options?
Alex Bronzini
Yeah. I mean, I've heard from people who work in defense and in defense tech that Palantir has what's called a vendor lock-in effect, which is to say, once you've integrated its software into your workplace and into your institution, it becomes basically impossible to remove, because there's just so much institutional memory that gets built up in the software. One thing that I think concerns a lot of the company's critics is that this is a company that has expressed opinions on how the US should conduct foreign policy. They took out a full-page ad, I believe in the New York Times or the Wall Street Journal, that said "Palantir stands with Israel." So clearly they have opinions. And the CEO and one of his, you know, deputies wrote an entire book about how the US government should function, in some sense; I'm kind of oversimplifying here. So they are clearly a very opinionated company. They did a tweet, and they also talk about this in the book, where they were like, we should bring back the draft.
Podcast Host
Yes.
Alex Bronzini
So they have all these ideas. Meanwhile, they are this whole system that is so deeply integrated into the government and into decision-making processes. So I think it's fair to wonder: on what level are they trying to exercise influence through those systems? Are these systems just kind of neutral pathways to organize information, or are they a means to an end? Because, again, this is a company that has lots of opinions, that is extremely forthcoming about how it believes the US should run its military and what countries it believes should be US allies.
Podcast Host
Yeah, and Peter Thiel is very outspoken against democracy. I mean, so you can't help but think that it rolls downhill, as they say in Oklahoma. Alex, thank you so much for being on. I loved your article. I am so impressed that at your very young age you wrote this; I thought your piece was fascinating. I really enjoyed it. I'm going to keep watching for your stuff. Thanks for coming on.
Alex Bronzini
Thanks for having me.
Wix Harmony Advertiser
AI this, AI that. I get it. I'm so sick of people telling me to just use AI. But weirdly enough, Wix's new AI website builder really works for me. It's called Wix Harmony. And here's the thing: I get to choose how to use AI. I get everything I need to create a website, and I can either have Aria, my AI agent, design things for me, or I can edit things myself. Try it for free at Wix.com/Harmony.
Podcast Host
How did you get your website to look like that? Mine's so basic. Thanks, I just used Wix Harmony. Sounds fancy. What's that? It's Wix's AI website builder. You can just tell it what you want and it builds you a whole site. So it's like vibe-coding a website. Exactly, but even better, because you can still click and edit anything by hand. You don't have to use prompts for everything. Oh, that's neat. Yeah. Try it for free at Wix.com/Harmony.
Wix Harmony Advertiser
My new website's been getting a lot of attention lately, and here's my secret: I used Wix Harmony. It's one of my favorite tools because it feels like such a natural way to create, and I have so much control over my website. I can just tell Aria, my AI agent, to create whatever I'm imagining in my head, or I can click anywhere on my site and change things myself. Try it for free at Wix.com/Harmony. That's Wix.com/Harmony.
Episode Title: MAGA AI Billionaires Infiltrating College Campuses? Insider Exposes All
Hosts: Jennifer Welch & Angie “Pumps” Sullivan
Guest: Alex Bronzini-Vender (Writer for New York Magazine; Harvard Crimson)
Release Date: May 11, 2026
This episode explores the covert infiltration of tech giant Palantir, and its right-wing billionaire backers, onto elite college campuses. Hosts Jennifer Welch and Angie “Pumps” Sullivan interview journalist Alex Bronzini-Vender, whose New York Magazine exposé detailed a secretive Palantir conference at Yale. The trio unpack the motivations behind this move, the ideology of high-profile Palantir founders, and the contradictions between their public stances and campus recruitment efforts. The discussion also dives into AI's societal impact, the ethical dilemmas faced by students seeking jobs, and the deep integration of Palantir within U.S. government systems.
"The point of the University of Austin is to eventually provide a staging ground from which... you can take over or at least exercise immense influence at Yale." —Alex Bronzini [03:50]
"It's going to have this sort of underclass of people who just kind of get all their ideas from the models... and the rest of us... are the people who... give the chat bots the information." —Alex Bronzini [04:51]
"I just thought the whole thing was arrogance on steroids." —Podcast Host [09:45]
“Their idea of service to the nation is... tech should help ICE decide who to deport. Tech should do all these things that engineers at Google protested over and resigned over.” —Alex Bronzini [10:01]
“On one hand, AI is a big nothing; on the other, it's going to take over the world.” —Podcast Host (paraphrased) [14:27]
“We can use the Defense Production Act to build data centers wherever we want them.” —Alex Bronzini [16:56]
“Some... were like, I think this is evil and irredeemable, but the job market is so terrible that I will work there anyway.” —Alex Bronzini [17:44]
“Meanwhile they are this whole system that is so deeply integrated into the government and into decision making processes... are they a means to an end?” —Alex Bronzini [22:28]
“It felt like the students there got poo-pooed on. Like they even compared [AI] to climate change, flippantly, like, ‘oh, we've solved climate change, that's no longer an issue.’” —Podcast Host [07:43]
“If you removed Palantir from US government, it would be like... removing a body’s central nervous system.” —Podcast Host [20:45]
“AI is not gonna be running the government. It's going to be the government.” —Podcast Host [12:37]
“That's why when you see all the moral carve outs that people make for Trump and his behavior and his actions, like it kind of tracks—people are willing to make the moral carve out for their personal success.” —Podcast Host [19:00]
The exchange is sharp, skeptical, and combative—laced with humor but grounded in real anxiety about unchecked corporate, tech, and government power. The hosts’ progressive tone contrasts with the seriousness and subtle alarm in Bronzini’s reporting. The episode offers a candid peek into the extent and implications of Palantir’s reach, both ideological and infrastructural, urging listeners to scrutinize where tech, politics, and academia intersect.