A
Hi, I'm Darina, co-founder of OpenPhone. My dad is a business owner and growing up, I'll never forget his old ringtone. He made it as loud as it could go because he could not afford to miss a single customer call. That stuck with me when we started OpenPhone. Our mission was to help businesses not just stay in touch, but make every customer feel valued, no matter when they might call. OpenPhone gives your team business phone numbers to call and text customers, all through an app on your phone or computer. Your calls, messages and contacts live in one workspace so your team can stay fully aligned and reply faster. And with our AI agent answering 24/7, you'll really never miss a customer. Over 60,000 businesses use OpenPhone. Try it now and get 20% off your first six months at openphone.com/business, and we can port your existing numbers over for free. OpenPhone. No missed calls, no missed customers. I'm no tech genius, but I knew if I wanted my business to crush it, I needed a website. Now, thankfully, Bluehost made it easy. I customized, optimized and monetized everything exactly how I wanted with AI. In minutes, my site was up. I couldn't believe it. The search engine tools even helped me get more site visitors. Whatever your passion project is, you can set it up with Bluehost with their 30-day money-back guarantee. What have you got to lose? Head to bluehost.com to start now. Imagine a world of extraordinary comfort, where Boll & Branch bedding wraps you in the softest embrace, the coziest experience made from the world's finest 100% organic cotton, all so you can sleep better. Start building your fall sanctuary with Boll & Branch's iconic signature sheets, made with a buttery, breathable weave that gets softer with every wash. Enjoy 15% off your first set of sheets with free shipping and returns at bollandbranch.com with code BUTTERY. See site for details and exclusions.
B
Hi, I'm Andy Levy, former Fox News and CNN HLN guy and current cable news conscientious objector. I'm a former libertarian who now sits pretty comfortably on the left.
A
Hi, I'm Danielle Moody, former educator and recovering lobbyist. But today I'm an unapologetic woke commentator on the threats to America.
C
And I'm producer Jesse Cannon, and I'm here to make sure things don't go too far off the rails.
B
We're here to have fun, smart conversations with some of the most knowledgeable and entertaining people in politics, media and beyond.
A
Our goal is to try and make sense of our current crazy world, our new Abnormal, and hopefully even make you laugh through the tears.
C
Hello, and welcome to another Sunday edition of the New Abnormal. And we thank you so much for being here. Today we have an extra special guest, science writer Adam Becker, who will tell us about his new book, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity. But first, let's have some fun. Are you guys ready to listen to some clips?
B
Clips?
A
No. But yes.
C
You know, for that, I'm going to start off with someone doing something good.
B
Oh, we all know what you mean by that.
C
Well.
B
Well, maybe we don't. Okay.
C
Throughout our time together.
B
Here we go.
A
Here we go.
B
Here it comes. Here it comes.
C
As I always say, podcast producers are the most abused workers in all of capitalism.
B
I was just gonna say. Please. This is. You are using the language of the abuser. You know, I play these clips for your own good. That's what you're saying.
C
Yeah. Listen, my friend, who is a listener of this podcast, was recently talking about raising her children and said that she's in the era of "hate me now, thank me later." That's all I'm saying here. That's all I'm saying.
A
Okay.
C
Representative Yassamin Ansari is here to give some real talk.
A
Do I know Mr. Garcia personally? I do not. Am I saying he's a perfect person? I am not. But he is a person who was in this country, a union member, a father, a working-class person, who was illegally detained and deported to a foreign prison with no future, no due process. This country has always prided itself on due process. And the point here is the fact that we have an administration that is choosing to be authoritarian, to defy the courts, defy the law. And that is what should be concerning to every single American in this country. Because what else did he say during that White House press conference? He was heard saying, next, we want to take on the homegrowns. He's talking about U.S. citizens. So he has no regard for the rule of law. And if we don't stand up to this administration now, in this moment, they will only be emboldened to keep going.
B
That actually was good.
C
Yeah, okay, so I played that by mistake. You know, I don't know if I've mentioned this to you two, but I'm a pretty good card player and I know what my hand is before we go in.
B
But look, the point she's making is a very good one. And what it boils down to is: if Abrego Garcia does not have due process rights, then none of us do. It really is just that simple. She said she doesn't know the guy. She's not saying he's a perfect person. The bottom line is he could be a member of MS-13, he could be a terrorist. You still get due process in this country, because it has to be proven. It has to be proven that you are what the government says you are. And that's the step that has been totally disregarded here. That's habeas corpus, that's due process, and it's enshrined in the Constitution. So if they can take that right away from him? Yep. They can take it away from literally any person in this country.
A
And that is exactly what Donald Trump was saying in the Oval Office to Salvadoran President Bukele. Homegrowns are next. Homegrowns are next. So if you turn your back to this moment because, oh, he is a brown person, oh, he's a person of color, oh, he is undocumented, understand that Donald Trump said unequivocally, which was followed by laughter, that he is going to seek to deport homegrown, quote unquote, people that he believes don't deserve to be in this country, and homegrown, aka American citizens, to a death camp.
C
Yeah. And I think that their defense, that this is now a foreign relations matter the Supreme Court has no oversight of, is really, really, really horrifying. That this is where we're going with this all. And, you know, this week Andy and I were joking about a tweet where somebody was pointing out that people like Bill Kristol, who were so far right and authoritarian during the Bush years, are now the people who are like, hmm, maybe the people who said abolish ICE were right. We're the same people who were right about this years ago. And we're saying this is where we need to stop and say abso-fucking-lutely not. Stop gaslighting us and telling us we're fucking crazy. We've been right about this.
A
Obscene.
C
So, you know, I work in a lot of different jobs as a freelancer, and I've seen some gross incompetence at some workplaces I've been in. But RFK Jr here, who is the head of one of our most important government institutions, is going to really show what stupid leadership looks like on a level I have never seen, sir.
B
This is an individual tragedy as well. Autism destroys families. More importantly, it destroys our greatest resource, which are our children. These are children who should not be suffering like this. These are kids who, many of them, were fully functional and regressed, because of some environmental exposure, into autism when they're two years old. And these are kids who will never pay taxes, they'll never hold a job, they'll never play baseball, they'll never write a poem, they'll never go out on a date, many of them will never use a toilet unassisted. And we have to recognize we are doing this to our children.
A
Just remind me, his medical degree is in what specific field?
C
Doing drugs in college.
A
Right. So he is an expert in what exactly?
B
Wait, am I a doctor?
A
I think so. And for families who have children who have autism, first of all, autism is a spectrum and a range. There are highly functioning and then there are lower functioning. And he just painted this broad brush of a very complicated, very complicated issue, very layered. And the man has no degree, he has no expertise in anything but conspiracy theories and lies. I don't know. I don't know.
B
Yeah, I mean, yes to everything you just said. What he did to paint a picture of people with autism not paying taxes, not writing poems is so false. I really do think that most people know someone who is autistic and who can function very well. And the idea that every autistic person is just whatever he thinks of them, that they can't do anything for themselves, is obscene. And on top of that, he's either wrong or lying. I'll let people decide which one. About this so-called autism epidemic that is being caused by, as he said, some type of environmental exposure: none of that is true. We know scientifically that autism is mostly heritable. It's mostly genetic mutations that are there at birth that cause autism. And one of the reasons that all these recent studies are showing a higher rate of autistic people in the country is that the way to diagnose it has changed. It's changed over the years and we are better at diagnosing it now. So of course the numbers are going to rise. Everything he says is just wrong. It's absolutely wrong. And that has real-life repercussions, unfortunately, because of the position that he's been given in this administration.
C
Yep. Okay, so now we have a backbencher named Representative Dan Meuser, who is going to blame Josh Shapiro for the antisemitic arson attack that happened to him over the weekend, and pin it on the left.
B
When you have a Tesla car bomb set outside a Trump hotel and it blows up, there's virtually no denouncing from the left. In fact, they're making jokes about it on, you know, comedy shows that only the left watches, for crying out loud. Because it's not funny. And all it does is try to attack the president. So things like that. And I'd also just add this. I got into a little controversy with Governor Shapiro because I stated that he makes comments where he said President Trump was an existential threat to democracy. And he recently stated, very falsely, that President Trump and the administration were withholding funding for food for hungry Pennsylvanians, keeping those that are hungry from accessing food. Complete falsehood. Completely made up. And when you say things like that, that drives, that creates hatred. It does. And where does hatred go? To violence. We've got to stop this. We can't continue this way. No room for violence, regardless of politics or religion or anything else. Congressman Dan Meuser, thanks for joining us.
B
Oh, go fuck yourself. I mean, first of all, both of those things that he quoted Shapiro as saying falsely are true. You know, DOGE is taking funding away from feeding hungry people. And Donald Trump is an existential threat to democracy. And we're seeing that every day. We've been talking about it. But this is the game they always try to play, because they know that the violence, if you want to put a number on it, 90 to 95% of political violence in this country, I would say, is from the right. And they'll find one thing, like somebody setting a couple of Teslas on fire, and then they'll try to say, see, see, both sides do it equally. I was going to say, what goddamn cable news network had him on, but then I see it was Newsmax. That's the goddamn cable news channel.
A
This is coming from the same party where, I don't know, what do you refer to Donald Trump Jr. as, the first son, or...
C
I mean, some of us call Eric the simple son and him the stupid son.
A
But sure, okay. Well, the spawn of Donald Trump, following the attack on Nancy Pelosi's husband, set off a tweet storm of quote unquote jokes about that man's skull being bashed in by a hammer. They reposted people in quote unquote costumes to mock the violent attack that had that 80-year-old man in rehabilitation for months. So the audacity for them to want to claim, oh, there was an attack in front of, you know, the towers of a Tesla, and there was a human being that was beaten, and you all thought that that was a joke. So give me a fucking break. I want to be on a different timeline. Please, somebody put me on a different timeline.
C
Hey listeners, meet Russell. Russell just launched a fitness app and he needed to get the word out to busy professionals looking to stay fit. So I turned to Acast. I used their Smart Recommendations feature to easily find shows that talk about health and fitness. Booking sponsorships through their platform was a breeze, and just like that, my app was in their ears during their morning run.
B
Sounds like a smart move, Russell. How's business looking now?
C
Sweat is pouring and so are the installs. Spread the word about your business with podcast ads on Acast. Start today at go.acast.com/advertise.
B
Adam Becker is the author of What Is Real? The Unfinished Quest for the Meaning of Quantum Physics, and has written for publications including the New York Times, the BBC, NPR, New Scientist, and Scientific American, among many others. He's got a PhD in computational cosmology, and his new book, More Everything Forever, which is out Tuesday, maps out Silicon Valley's crusade to control the fate of humanity. He joins me now to talk about it. Adam, thanks so much for being here.
C
Thank you for having me, Andy.
B
So I want to start right at the beginning of the book, because I absolutely loved the first sentence of the book, which is: "The dream is always the same. Go to space and live forever." And I just thought that's such a great summing up of this sort of tech-bro mindset. And then, as you point out, this dream, this vision is important because these are newsworthy people, so their wants and their dreams become a huge part of the discourse, don't they?
C
Yeah, exactly. You know, these are some of the most powerful people in the world, and so whatever they think is important just sort of ends up filtering out into the culture, either accidentally or more often deliberately, as they try to sort of foist their ideas about the world on the rest of us.
B
And so much of the book is sort of based off of the idea that a lot of these guys are really into effective altruism. So can you explain what exactly effective altruism is, and why it is such a key to understanding their mindset?
C
Sure, yeah. I mean, effective altruism, which is sort of part of this wider set of ideas that a lot of these people believe in, or one of a set of ideas that these tech billionaires believe in. It's one of these ideas that sounds fine on the surface, and then when you start digging in, it gets significantly worse. So on the surface, it's. Oh, it's what it sounds like. We want to find more effective ways to do good in the world. We want to find the most effective forms of charitable interventions that we can direct money toward and put more money toward those things and less money toward other things. And that sounds great until you start digging into what that means. Like, oh, okay, what do you mean by effective? What do you mean by doing good in the world? Why is directing money toward organizations the best way to effect good change in the world?
What about different kinds of change, like full on institutional reform? And then there's also what I think of as the galaxy brain problem with it, which is, you know, if you take it really far and a lot of these people do, you start coming to some really strange and difficult to defend conclusions about what the right course of action is to take in the world. Like, what's the best way to do good in the world? Well, they start defining good in terms of making the most people happy, and then they start focusing on the first half of that, of making the most people so you can make more of them happy, or at least a little bit happy. And so then they end up saying things like, there's a moral case for settling the universe, which not only is there not even a good scientific case for that working out, but the moral case for it just sort of falls apart.
B
Yeah, and before I get more into that, I want to ask you about another thing that seems to go maybe hand in hand with it, and that's something called longtermism. You talk about that in the book, and you talk about a guy named Will MacAskill. Explain what longtermism is and how it sort of fits into this paradigm.
C
Absolutely, yes. So that's sort of that galaxy-brain take on effective altruism that I was talking about: the idea of what matters most when trying to figure out how to do good in the world. And by the way, for defining good, there are questions like, what does that mean? How do we assign a number to that? Because they're really big on quantifying things. But longtermism is this idea that, oh, the best way to do good in the world is to focus on the long-term future of humanity, because there could be a lot more people who live in the future than who are alive now or who have ever lived. And so if you can find a way to make life better for all of those hypothetical future people, that matters more than making life better for the people here right now.
B
Yeah, it's really interesting. And you start to see, or you actually fairly easily see, how many of these tech billionaires fit into this paradigm. You've got Jeff Bezos's Blue Origin and Musk's SpaceX. They portray these things as, we have to go to space, we have to colonize Mars to secure the long-term future of humanity. And it's so binary for them. It's either we do this or we die out.
C
Yeah, exactly. Yeah. They don't really see a third option. Either we go extinct or we take over the universe.
B
Right.
C
And you know, that's just a really impoverished view of, like, how the future can be. There are a lot more options than that.
B
Yeah. And as you said, all these quote unquote altruistic endeavors don't seem to care all that much for the people who are alive now. That's not an unfair thing to say, is it?
C
Certainly some of these people care for some of the people alive right now. And credit where credit's due, a lot of the effective altruists will put a lot of money toward worthwhile causes like trying to address neglected tropical diseases, malaria, things like that. Although there has been some very good criticism of their approach to doing that as well. At the very least, that shows that they are trying to care for people who are here right now. They just care more about people who live in the future who don't exist. And they think, well, yeah, we have to care more about those people because there could be more of them. First of all, we don't know for sure that that's true. And second, it's just not at all clear what the right things to do to help the vast populations in the future could be. And third, and this is the most important thing, I think you can come up with any justification of basically any course of action with that line of reasoning. You can justify anything you want to do by saying, oh, well, you know, I have to do this to save the trillions of people who are going to be alive in the future. That's why I have to kill millions of people here now. Or something like that.
B
Yeah, it's a very, I guess utilitarian ends justify the means sort of thing.
C
Yeah, exactly. I mean, the people promoting this say that they don't think the ends justify the means, but their moral logic just really betrays that.
B
Yeah. And you have a great quote in the book. You write that Bezos dreams of 1000 Mozarts and 1000 Einsteins among his trillions of humans living in space.
C
Yeah.
B
But he's neglecting the potential Einsteins and Mozarts that are living and dying in poverty right now.
C
Yeah, I think that's just true. Bezos has said repeatedly he wants a trillion people living in space, because, for many outlandish reasons, he says otherwise we're gonna run out of stuff here on Earth, and because we could have a thousand Einsteins and a thousand Mozarts, which again is a convenient way of avoiding looking at the people who are here now. I should also say, first of all, thank you for saying that you like that quote. But also, while that is a quote from my book, the original idea there is a twist, or a spin, on a quote from the evolutionary biologist and paleontologist Stephen Jay Gould.
B
Oh, okay, yeah.
C
Gould originally said like 25, 30 years ago, something like, you know, I care less about the weight and shape of Einstein's brain than the fact that there have almost certainly been, you know, many people just as smart as he was who lived and died in poverty.
B
Sure, sure, absolutely. So let me pivot a tiny bit and ask how AI fits in with all of this, because it does seem to be all part and parcel of the same mindset. Right? Particularly when we're talking about so-called AGI, artificial general intelligence.
C
Yeah, absolutely. I mean, they see this as crucial for the future. They see it as both a source of great promise and great threat. It could kill all of us. But if it doesn't, then it will usher in this utopia where we all get to go to space and live forever with an AI god. And there's just no reason to believe any of that, and plenty of reason not to.
B
And then there's the notion that you write about, put forth by people like Eliezer Yudkowsky, that we're all going to be murdered by a future superintelligent AI. I think possibly because it wants to make more paperclips.
C
Yeah, this is this idea that this superintelligent future AI, which these people think could be coming soon, is going to kill us all because we're going to get in the way of it doing whatever it is that it was programmed to do in the first place, you know, no matter what that was. And it's difficult to describe without sounding nuts, because the idea is pretty crazy. It's one of those things where, when you first look at it, you think it's crazy. And then when you look at it a little more, you say, oh, okay, I guess I kind of understand why they think that. And then you look at it even more and go, no, I was right the first time. It is crazy. So they think, oh, you know, when we get an AGI, an artificial general intelligence, which is this vaguely defined thing of, you know, some AI that could do everything a human can do and more, then it will inevitably become even smarter and smarter, and it will keep wanting to do whatever it was that it wanted to do before it started the process of getting smarter and smarter. But that increased intelligence will make it more powerful and more capable of doing things, and humanity will inevitably be in the way of whatever it is that it wanted to do, like make paperclips. And so in order to make more paperclips, it's going to kill us all and turn everything into paperclips. This is seriously the argument that these people make. You know, Yudkowsky is this AI researcher, self-styled AI researcher. But you also see this argument from people like Nick Bostrom, who's a philosopher, until recently at Oxford University, and others. And you'll also see a lot of these same people, these effective altruists and longtermists, saying that this is the most pressing danger facing humanity today, more pressing than nuclear war or global warming. And when you take a close look at the arguments that they make for this, they just fall apart. They're not very good at all.
B
Yeah. So how much does all of this, and by all of this I mean the AI stuff, the Elon Musk and Jeff Bezos stuff, how does all of this intersect with our current political state? Because it sure does seem like a lot of these guys are heavily invested, and I mean that financially and otherwise, in the Trump administration.
C
Yeah, absolutely. I mean, I wrote this book before the election. I turned in, like, the final edits and stuff just a couple days before the election, and managed to get one or two in after. And it is not fun to have seen as much of this coming as I did. It is closely connected. These billionaires. You know, there's been this narrative that's been running around since the election, since really shortly before the election, that somehow tech billionaires have betrayed the true ethos of Silicon Valley by turning toward Trump and conservatism and authoritarianism. And it's just, I think, not true. I think this was always sort of lurking there. There's this idea of going to space and living forever, and techno-utopianism of that kind, that's been in Silicon Valley from the start. And it has deep authoritarian. I'm not even going to call them undertones. It's just deeply authoritarian, this idea that, you know, this AI is going to take over civilization and tell all of us, you know, what's best for us because it's godlike. That's an authoritarian idea of total control. Like, there's no democracy in a future like that. And if you feel that that kind of future is inevitably coming soon, and the only alternative is the extinction of humanity, then, yeah, you gotta take all the political power you can get, because you know what the future holds, and so only you can be trusted with power. And I really think that this is what's going through the heads of people like Elon Musk and Jeff Bezos and others. I can't know that for sure, but their public statements really make it seem like this is the kind of thing that they're thinking. Musk, in particular, you know, has said that he sees this sort of struggle between people who want humanity to go extinct and people who want humanity to expand forever and understand the universe. And the part he doesn't say out loud is: under his control. Right? Yeah.
B
How much blame do the science fiction authors of the Golden Age period get for all this? And we're talking here about my beloved Isaac Asimov, my beloved Arthur C. Clarke, and my problematic fave, Robert Heinlein. People like that.
C
Yeah, yeah. I mean, look, I want to be really clear. I love all of them, too. They are all problematic faves, but I love them. You know, I grew up reading this stuff, same as these guys. And so, you know, saying how much blame do those authors take, I don't know. But it's definitely true that some of the blame lies with the ways that these people have interpreted the science fiction that they read. That's definitely true. Now, how much of that has to do with the science fiction itself? That's a longer conversation. But, like, these guys, you know, they read these things and were like, oh, yeah, so that's the future, right? Science fiction just tells us what the future is going to be. Musk even tweeted at one point, science fiction should not remain fiction forever. And, like, really, man? What about dystopian science fiction? I don't think it would be good if that came true. I don't want 1984 to happen, and that's arguably a piece of science fiction. I don't even want the future of Neuromancer to happen, and that's definitely science fiction. That was an unpleasant future. And yet these guys seem determined to turn our entire world into a very stupid version of a cyberpunk dystopia and say, look, we did the thing.
B
Yeah, I mean, it's very much that. And if my memory serves, you used as one of the epigraphs for the book Alex Blechman's famous tweet about the scientists creating the Torment Nexus, even after having, you know, read the book that clearly says, do not create the Torment Nexus.
C
Yeah, exactly. I actually had to reach out to him to get his permission to use it.
B
I saw that.
A
Yeah.
B
Yeah. But it really is spot on.
C
Yeah.
B
I want to close by. Well, there's a couple questions I have. First of all, the name of the book, obviously, is More Everything Forever. Is there maybe an alternate name that you've kept secret?
C
Yeah, I think the title is appropriate. I love the title More Everything Forever. But in my heart, when I was working on the book, there was a different title that I was thinking of as the true name of the book, in the sort of fantasy-novel sense. I thought of it as These Fucking People.
B
I understand why you didn't go with that as a title, but it would have been a great one and obviously highly appropriate.
C
Thanks.
B
Something you include at the end of the book. And I'm not sure I've ever seen this before. You have a list of people you've interviewed, which seems totally normal. Then you follow it with a list of interview requests that you made that were either declined or ignored.
C
Yes.
B
And that includes Sam Altman, Marc Andreessen, Jeff Bezos, Nick Bostrom, Eric Drexler, Ray Kurzweil, William MacAskill, Elon Musk, etc., etc. Have I ever seen that before? I don't think I have.
C
I haven't seen it before. Maybe someone's done it before. But I included that specifically because I figured that someone would try to criticize the book by saying, you know, Becker attributes all of these views to these people, but he never even talked to them. And, like, I tried. I had this idea when I started writing this book. You know, I think the same sense that everyone gets after they successfully land a book contract of, what the fuck am I doing, right? And the form that that took for me was, why am I writing this? Shouldn't a tech journalist be writing this? And then I, you know, as I worked on it and learned more, I sort of slowly realized, right, a tech journalist probably can't write this because tech journalists need to maintain access and friendly relations with the people on that list. And I don't need to do that. I'm a science journalist, not a tech journalist. I don't need to be able to interview Elon Musk in order to be able to do my job. And the good news is these people are very loud.
B
These fucking people.
C
I didn't say that. But yeah, they say what they believe all over the place. And so it's very easy to find out what they have to say about any given subject, even though they didn't feel like they wanted to talk to me.
B
Yeah, absolutely. The book is More Everything Forever. It's out Tuesday. It's absolutely fantastic. And as I was telling Adam before we started recording, after reading this I picked up his previous book, which is called What Is Real? The Unfinished Quest for the Meaning of Quantum Physics, and I'm super excited to read that. Adam, thanks so much for joining us.
C
Thank you so much for having me, Andy. This was great.
A
Hope you enjoyed checking out this episode of the New Abnormal. We're back every Tuesday, Friday and Sunday.
B
If you enjoyed it, please share it with a friend and keep the conversation going. This podcast is a Daily Beast production with production by Jesse Cannon and Seamus Calder.
A
Imagine a world of extraordinary comfort, where Boll & Branch bedding wraps you in the softest embrace, the coziest experience made from the world's finest 100% organic cotton, all so you can sleep better. Start building your fall sanctuary with Boll & Branch's iconic signature sheets, made with a buttery, breathable weave that gets softer with every wash. Enjoy 15% off your first set of sheets with free shipping and returns at bollandbranch.com with code BUTTERY. See site for details and exclusions.
C
ACAST powers the world's best podcasts. Here's a show that we recommend.
A
Hello hello, it's Brooke Devard from Naked Beauty. Join me each week for unfiltered discussions about beauty trends, self-care journeys, wellness tips, and the products we absolutely love and cannot get enough of. If you are a skincare obsessive and you spend 20-plus minutes on your skincare routine, this podcast is for you. Or if you're a newbie at the beginning of your skincare journey, you'll love this podcast as well, because we go so much deeper than beauty. I talk to incredible and inspiring people from across industries about their relationship with beauty. You'll also hear from skincare experts, and we break down lots of myths in the beauty industry. If this sounds like your thing, search for Naked Beauty on your podcast app and listen along. I hope you'll join us.
C
ACAST helps creators launch, grow and monetize their podcasts everywhere. Acast.com.
B
Want more great listens? Check out our comedy podcast The Last Laugh and our star-studded The Daily Beast Podcast at thedailybeast.com/podcasts.
A
If you enjoyed this episode, consider becoming a Daily Beast subscriber. Subscribing is the best way to feed the beast and support all of your podcasts as we cover what might become the darkest timeline. Head to thedailybeast.com/membership/podcast and sign up today.
Episode: RFK Jr. Takes Action on His Most Dangerous Beliefs Yet
Date: April 20, 2025
Host(s): Andy Levy, Danielle Moody
Producer: Jesse Cannon
Special Guest: Adam Becker
This episode dives deep into the dangerous rhetoric and policy actions of RFK Jr., especially concerning public health and autism, contextualizing his influence within broader political and media narratives. Additionally, the second half features a compelling interview with science writer Adam Becker about the technocratic and at times authoritarian ambitions of Silicon Valley's elite—exploring themes from his new book "More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity." The episode deftly mixes policy critique, cultural commentary, and a sharp examination of tech futurism with characteristic wit and urgency.
Representative Yassamin Ansari delivers a pointed critique of the erosion of due process rights in America, referencing the case of Kilmar Abrego Garcia, who was deported without recourse and with disregard for legal norms.
The hosts see this as symptomatic of an authoritarian shift, potentially threatening all Americans’ rights.
"If Abrego Garcia does not have due process rights, then none of us do. It really is just that simple."
— Andy Levy (05:01)
"Donald Trump said unequivocally...he is going to seek to deport homegrown, quote, unquote, people that he believes don't deserve to be in this country—homegrown AKA American citizens—to a death camp."
— Danielle Moody (05:49)
Hosts dissect and strongly criticize RFK Jr.'s latest statements about autism, which include inaccurate and stigmatizing claims: that autism “destroys families,” that children with autism “will never pay taxes, never hold a job,” etc.
RFK Jr.’s framing of autism as a purely destructive force with a uniformly hopeless prognosis is condemned as both scientifically baseless and cruel.
The dangerous implications of having a figure with no medical expertise leading a major public health conversation are highlighted.
"The man has no degree, he has no expertise in anything but conspiracy theories and lies."
— Danielle Moody (08:40)
"Everything he says is just wrong...And that has real life repercussions, unfortunately, because of the position that he's been given in this administration."
— Andy Levy (09:12)
Hosts critique attempts by some GOP representatives and right-wing media to equate left-wing and right-wing violence, citing selective outrage and hypocrisy.
They call out efforts to blame Democrats for violence while the GOP fails to denounce or often encourages violence from their own ranks.
Emotional, exasperated language is used to convey frustration at false equivalency and media complicity.
"No room for violence, regardless of politics or religion or anything else."
— Andy Levy (11:51)
"This is coming from the same party where...following the attack on Nancy Pelosi's husband, set off a tweet storm of quote unquote jokes about that man's skull being bashed in by a hammer."
— Danielle Moody (12:55)
Adam Becker explains that many tech elites dream of going to space and living forever, a mindset he traces in his new book.
Becker outlines the concept of “effective altruism”—the idea of finding mathematically optimal ways to do good, which becomes twisted when focused on hypothetical future populations at the expense of people living now.
"On the surface, it's...We want to find more effective ways to do good in the world. That sounds great until you start digging into what that means."
— Adam Becker (17:14)
Discussion shifts to “longtermism,” which prioritizes the future of trillions of hypothetical humans over those alive today, justifying nearly any current action for the sake of potential future welfare.
"It's not at all clear what the right things to do to help the vast populations in the future could be. And...you can come up with any justification of basically any course of action with that line of reasoning."
— Adam Becker (21:09)
Becker discusses the obsession with AI and its potential dangers, referencing fears stoked by people like Eliezer Yudkowsky (e.g., the “paperclip maximizer” scenario).
He critiques the binary thinking of tech founders who believe humanity must either colonize the universe or go extinct.
"They see it as both a source of great promise and great threat. It could kill all of us. But if it doesn't, then it will usher in this utopia."
— Adam Becker (24:01)
Becker argues that the turn of many tech billionaires towards authoritarian politics (e.g., Trump support) is not a betrayal of Silicon Valley ideals, but a logical outgrowth. Their technocratic visions often entail anti-democratic, top-down control.
"It's just deeply authoritarian, this idea that, you know, this AI is going to take over civilization and tell all of us, you know, what's best for us because it's godlike. That's an authoritarian idea of total control. Like, there's no democracy in a future like that."
— Adam Becker (27:21)
Both Becker and Andy Levy reminisce about classic science fiction and how its misinterpretation may have inspired contemporary tech utopianism.
They reference the now-famous “torment nexus” tweet: a dystopia that fiction explicitly warned against is, ironically, being built in reality.
"These guys seem determined to turn our entire world into a very stupid version of a cyberpunk dystopia and say, look, we did the thing."
— Adam Becker (30:33)
On RFK Jr.'s autism rhetoric:
"The man has no degree, he has no expertise in anything but conspiracy theories and lies." — Danielle Moody (08:40)
On effective altruism/longtermism:
"You can justify anything you want to do by saying, oh, well, you know, I have to do this to save the trillions of people who are going to be alive in the future. That’s why I have to kill millions of people here now." — Adam Becker (21:09)
On tech billionaires and authoritarianism:
"It's just deeply authoritarian, this idea that, you know, this AI is going to take over civilization and tell all of us, you know, what's best for us because it's godlike." — Adam Becker (27:21)
On calling out media “both sides” coverage:
"This is the game they always try to play because they know that...90 to 95% of political violence in this country...is from the right." — Andy Levy (11:57)
A moment of levity:
"In my heart, when I was working on the book, there was a different title that I was thinking of as the true name of the book in the sort of fantasy novel sense. I thought of it as these fucking people." — Adam Becker (31:12)
Takeaway:
This episode delivers a forceful rebuke of reactionary, pseudo-scientific, and anti-democratic tendencies emerging from both politics (via RFK Jr./Trump) and the tech world (via Silicon Valley utopianism). The hosts combine sharp humor, deep concern for rights and justice, and a critical look at who wields power—and to what ends. Adam Becker’s interview is a highlight, unpacking the cosmological delusions and dangers in today’s tech elite, making clear how their visions could harm us, not just in the future but right now.
Recommendation:
A must-listen for anyone interested in the intersection of policy, science, culture wars, and the future being designed—often without our input—by the world’s most powerful (and sometimes, most reckless) actors.