
Hosted by Reason Video

Peter Thiel, the billionaire venture capitalist and PayPal co-founder, has a provocative theory about how the Antichrist could take over the Earth and enslave humanity. "My speculative thesis is that if the Antichrist were to come to power it would be by talking about Armageddon all the time," Thiel told Hoover Institution interviewer Peter Robinson earlier this year. The greatest danger we face, according to Thiel, might not be from global warming, terrorism, nuclear winter, or artificial intelligence going rogue. The real danger is that we're so afraid of these threats that we're willing to give up our freedom in the interest of "peace and safety," the phrase Thiel ascribed to the Antichrist, citing 1 Thessalonians 5:3. "It's the opposite of the picture of Baconian science from the 17th, 18th century, where the Antichrist is like some evil tech genius, evil scientist who invents this machine to take over the world," Thiel told the New York Times' Ross Douthat on a podcast. "In our world, it's far more likely to be Greta Thunberg." "I feel like that Antichrist would maybe be using the tools that you are building," replied Douthat. Douthat was referring to Palantir, the government contractor that Thiel co-founded in 2003 during the height of the war on terror. Today, Palantir is "in the white-hot center of the latest trend reshaping the global order," according to The Wall Street Journal, receiving more than $322 million from government contracts in the first half of 2025. It's equipping the government with tools to sift through massive data troves to identify patterns and hunt down illegal immigrants. It's helping the Feds deploy facial recognition technology, and has created AI tools to "predict" where crimes might happen in advance. "Like, wouldn't the Antichrist be like: Great, we're not going to have any more technological progress, but I really like what Palantir has done so far," Douthat asked Thiel. "Isn't that a concern? Wouldn't that be the irony of history, that the man publicly worrying about the Antichrist accidentally hastens his or her arrival?" When Thiel replied that hastening the Antichrist's arrival is "obviously" not what he thinks he's doing, Douthat agreed that it was unlikely but pressed, "I'm just interested in how you get to a world willing to submit to permanent authoritarian rule." It's a great question. While Peter Thiel is warning that the Antichrist could bring totalitarianism by exploiting our desire for "peace and safety," a company he co-founded is building tools with great potential for abuse by a totalitarian surveillance state, all based on our desire for "peace and safety." Is it too late to stop this nightmare? It's fitting that Palantir is named after the mythical stones in J.R.R. Tolkien's Lord of the Rings, which allow users to see into distant lands, eavesdrop on conversations, peer into the past, and—some claim—conjure visions of the future. Tolkien, a devout Catholic, explored the dangers of unbridled power in a way that resonates with Thiel's ideas about the Antichrist. While the Antichrist invokes peace and safety, the corrupted wizard Saruman in Lord of the Rings, when he implores his fellow wizard Gandalf to join forces with the dark entity Sauron, invokes the values of "Knowledge, Rule, Order." When Trump took the stage to accept the Republican nomination in 2016, he promised to deliver peace and safety by cracking down on crime and riots in America's big cities. 
"I have a message for all of you: the crime and violence that afflicts our nation will soon come to an end. Beginning on January 20th 2017, safety will be restored." Thiel, who was the first tech billionaire to back Trump, spoke at that convention. Eight years later, Trump was back at the RNC once again warning of a crisis and promising to restore peace and safety. "There's never been an invasion like this anywhere," said Trump, referring to a spike in illegal immigration. Exploiting a national emergency, real or manufactured, is how governments typically grow their power and limit our freedoms. As the libertarian economic historian Robert Higgs chronicled in his 1987 classic, Crisis and Leviathan, the government ratcheted up its power in the 20th century by capitalizing on the Great Depression and two world wars. More recently, the 9/11 terrorist attacks justified the war in Afghanistan, provided cover for the invasion of Iraq, and led to the expansion of the surveillance state under a new paradigm known as Total Information Awareness. "[Total Information Awareness was] what Palantir is now. What they were literally trying to do is come up with the ability to ingest data from pretty much any source and then run that against t...

In President Dwight D. Eisenhower's famous 1961 speech about the dangers of the military-industrial complex, he also cautioned Americans about the growing power of a "scientific-technological elite." "The prospect of domination of the nation's scholars by federal employment, project allocations, and the power of money is ever present," warned Eisenhower. The federal government had become a major financier of scientific research after World War II, and Eisenhower was worried that the spirit of open inquiry and progress would be corrupted by the priorities of the federal bureaucracy. And he was right. Today, many of the people protesting the Trump administration's cuts to federal funding for scientific research are part of that scientific-technological elite. But there's a good chance that slashing federal spending will liberate science from the corrupting forces that Eisenhower warned us about. "If you look at, particularly, 19th century Britain when science was absolutely in the private sector, we have some of the best science," says Terence Kealey, a professor of clinical biochemistry at the University of Buckingham and a critic of government science funding. "It comes from the wealth of the rich. Charles Darwin was a rich person. Even [scientists] who had no money had access to rich men's money one way or another. The rich paid for science." Kealey points out that Britain's gross domestic product (GDP) per capita outpaced that of 19th-century France and Germany—both of which generously subsidized scientific research—indicating that the return on state subsidies in the form of economic growth was low. As America emerged as a superpower, its GDP per capita surpassed Britain's. "So the Industrial Revolution was British, and the second Industrial Revolution was American, and both were in the absence of the government funding of science," says Kealey. Thomas Edison's industrial lab produced huge breakthroughs in telecommunications and electrification. Alexander Graham Bell's lab produced modern telephony and sound recording, all without government money. The Wright Brothers—who ran a bicycle shop before revolutionizing aviation—made the first successful manned airplane flight in December 1903, beating out more experienced competitors like Samuel Langley, secretary of the Smithsonian Institution, who had received a grant from the War Department for his research. The notion that the government needs to accelerate scientific progress was based on America's experience during World War II, when federally funded research led to breakthroughs in rocketry, medicine, and radar. The Manhattan Project, which cost $27 billion in today's dollars, employed more than half a million people and culminated in the creation of the atomic bomb, building on the discovery of nuclear fission a few years earlier. "Lobbyists took the Manhattan Project and said, 'Look what government funding of science can do,' and they then twisted it," says Kealey. He acknowledges that the government can accomplish discrete, "mission-based" scientific projects—like racing toward a bomb—but he argues that this is very different from the generalized state funding of "basic research" that followed. In November 1944, President Franklin D. Roosevelt sent a letter to Vannevar Bush, director of the U.S. Office of Scientific Research and Development during the war. Roosevelt instructed Bush to come up with a plan to make federal funding of scientific research permanent. 
"It has been basic United States policy that government should foster the opening of new frontiers," wrote Bush in calling for the nationalization of basic science research. "It opened the seas to clipper ships and furnished land for pioneers." Bush's treatise eventually led to the creation of the National Science Foundation in 1950. But it was a stunning accomplishment from America's greatest rival that would supercharge the nationalization of science. Sputnik, the world's first manmade satellite, seemed to confirm fears that the Soviets, with their centrally planned economy, might eclipse the U.S. in scientific innovation and weapons technology. That turned out to be completely wrong. But in 1957, Americans were terrified. After Sputnik, the Eisenhower administration tripled the budget of the National Science Foundation, which would provide federal grants to universities and labs. If federal funding of science is counterproductive, as Kealey argues, what explains the success of Sputnik and the Manhattan Project? Of course, government funding has led to major breakthroughs both during and after World War II, such as the synthesis and mass production of penicillin during World War II (though it was accidentally discovered in a contaminated hospital lab in 1928), cancer immunotherapy, artificial heart valves, and the gene-editing technology CRISPR. But this has to be compared to what might have otherwise happened. Good economics takes into account not only the seen, but the unseen. What are the unseen innovations the world misses out on when governments set the research agenda? "If the government funds science, it actually takes the best scientists out of industry puts them in the universities, and then industry in fact suffers," says Kealey. After Sputnik, government money pushed basic science out of the private sector. By 1964, two-thirds of all research and development was paid for by the federal government. "If you were a tool maker in Ohio in 1964, and you wanted to invest in R&D to make better tools because you wanted the beat your competitors in Utah, you wrote a grant to the Department of Commerce," says Kealey. "That's how nationalized American science was … Eisenhower's warning is absolutely correct." In academic science, process often takes precedence over outcomes. Researchers are incentivized to publish peer-reviewed papers that garner citations, which helps them secure prestigious academic posts and more federal grants. "What happens under peer review under the government is that there's homogenization, and only one set of ideas is allowed to emerge," says Kealey. The pressure to publish has created a positivity bias, where an increasing number of papers supporting a hypothesis are published, while negative findings are often buried. One biotech company could confirm the scientific findings of only six out of 53 "landmark" cancer studies. <...

"Wise words," wrote Elon Musk about this 1999 viral clip described as "Milton Friedman casually giving the blueprint for DOGE [the Department of Government Efficiency]" as he ticks off a list of federal government agencies he'd be comfortable eliminating. Musk is right. Friedman, a Nobel Prize–winning libertarian economist, did offer a solid blueprint for creating a smaller, less intrusive government. At the peak of his fame, he seemed poised to influence an American president to finally slash the federal bureaucracy. But those efforts ended in disappointment because they were blocked by what Friedman called the Iron Triangle of Politics. Slashing government waste and making the federal bureaucracy more accountable are incredibly important. But President Donald Trump and Musk are hitting the same wall President Ronald Reagan did more than four decades ago. Now more than ever, it's time to pay attention to Milton Friedman's advice for how to defeat the tyranny of the status quo. In the 1980s, Friedman's influence reached deep into the halls of power. "Government is not the solution to our problem. Government is the problem," said President Reagan during his first inaugural address in January 1981. Like Trump, Reagan was preceded in the White House by a big government liberal, who expanded the size of government and whose presidency was plagued by inflation. Reagan, who awarded Friedman the Presidential Medal of Freedom, promised to enact many of the libertarian policy ideas laid out in the 1980 bestseller co-authored with his wife Rose. "I don't think it's an exaggeration to call Milton Friedman's Free to Choose a survival kit for you, for our nation, and for freedom," Reagan said in an introduction to the television adaptation of Friedman's book. But for the most part, the Reagan Revolution failed to deliver on its libertarian promises. "Reagan's free market principles…clashed with…political reality…everywhere," wrote his former budget director David Stockman in his 1986 book The Triumph of Politics: Why the Reagan Revolution Failed. "For the Reagan Revolution to add up," he wrote, all the people "lured" by politicians into milking social services "had to be cut off." Reagan tried to keep his promises but, like most presidents, he was only partly successful. Reagan lifted price controls on oil, cut taxes, and pushed for deregulation. But his commitment to these initiatives quickly fizzled. Federal spending exploded, and he even left trade quotas in place for the automotive industry. The failure of the Reagan revolution inspired the Friedmans to write The Tyranny of the Status Quo, which examines the political obstacles that obstruct government cost cutting. Their insights are as relevant today as they were 41 years ago. The book, which came out in 1984, pinpoints the Iron Triangle of Politics as the main obstacle to cutting government. The triangle's three points reinforce each other to uphold the status quo: the Beneficiaries, the Politicians, and the Bureaucrats. The "beneficiaries" are interest groups and connected industries that profit off of government programs at the expense of taxpayers. Today's beneficiaries include farmers who receive federal dollars. The new budget bill backed by the Republican Party would extend the Farm Bill, which subsidizes crop purchases. As Friedman said, the people paying the bill are "dispersed." 
You might not have noticed your share of the $2.1 billion going to prop up corn, soybeans, wheat, and other prices when you paid your 2023 taxes, but the farmers who get that money certainly did. The "politicians" depicted on the triangle are supposed to be responsive to their constituents but end up serving interest groups instead. But it's the bureaucrats who actually distribute the money. They grow their power when politicians grow the size of their departments, which generates more spoils to distribute to the beneficiaries. It's a symbiotic relationship all at taxpayer expense. Bureaucracy tends to "proceed by laws of its own," wrote Friedman, noting that in the half-century between Franklin Delano Roosevelt's New Deal and the Reagan Revolution, the U.S. population "didn't quite double but federal government employees multiplied almost fivefold." Musk has also observed that a metastasizing bureaucracy "proceeds by laws of its own," stating in a press conference from the Oval Office that "if the people cannot vote and have their will be decided by their elected representatives…then we don't live in a democracy, we live in a bureau...

"Deny," "defend," "depose"—these three words were allegedly written on bullets found at the murder scene of United Healthcare CEO Brian Thompson. The slogan began appearing in graffiti, highway banners, and T-shirts. When the identity of the likely killer was revealed to be a man named Luigi Mangione, he developed a passionate fanbase. "So many men and women are going nuts over how good-looking this killer is," said Jimmy Kimmel in a breezy monologue joking about his writing staff's adulation of Mangione's physique. "Free Luigi!" exclaimed comedian Bill Burr on a later episode of Kimmel's show. How did a man who allegedly executed a married father of two at dawn on a New York City sidewalk become a hero? Those three bullets with words inscribed on them explain not just why the alleged killer did it, but why he received so much adulation. And it's not for the reasons most people think. It seemed like the perfect American tragedy: A handsome valedictorian with a promising future suffers a back injury and a botched surgery, robbed of life's pleasures at his physical peak—no surfing, no travel, no sex. The personal became political, so the story goes, when the health insurance industry rejected Mangione's claim for treatment. "Deny, defend, depose" was likely a reference to a 15-year-old book by Jay Feinman exposing how health insurance companies don't pay routine claims. "Frankly, these parasites simply had it coming," Mangione wrote in a manifesto. His fans embraced him as "our shooter." The media made him a symbol of American rage towards a system that denies basic treatments with an eye toward the bottom line. Former Washington Post and New York Times reporter Taylor Lorenz defended the celebrations of Thompson's murder, writing that in a nation with "a barbaric healthcare system," where "the people at the top…rake in millions while inflicting pain, suffering, and death on millions of innocent people," "it's natural to wish" that people like Brian Thompson "suffer the same fate." "I felt alongside so many other Americans, joy, unfortunately," Lorenz told an incredulous Piers Morgan when asked to describe her reaction to Thompson's murder, later clarifying that she felt, "maybe not joy, but certainly not empathy." Forty-one percent of poll respondents under age 30 say the killing of Brian Thompson was acceptable. More young people polled admitted to viewing the killer favorably than unfavorably. But these poll numbers don't actually tell us very much about popular dissatisfaction with health insurance. Most people under 30 are healthy and don't interact much with the health care industry. In fact, despite its problems, two-thirds of Americans say they are personally "satisfied" with their own insurance coverage. Yet, the "delay, deny, and defend" inscribed on bullets do explain Mangione's popularity: Equating words with weapons is a reflection of how our culture increasingly treats language and violence as morally indistinguishable. I first encountered claims that speech equaled violence a decade ago as I interviewed college students about microaggressions, trigger warnings, and deplatforming mobs. One student expressed the view that "political change is hard to conceive of without violence…even taking human life." Today, most students approve of shouting down viewpoints they disagree with; almost half are okay with blocking access to speeches; and a third say violence is a justified response to hateful ideas. 
These notions trace back to the 1960s and a group of intellectuals who were part of the "Frankfurt School." In a 1965 essay, the German-American philosopher Herbert Marcuse, who was once branded the father of the New Left, called into question the value of free speech in a "manipulated" society, arguing that we need to "reexamine…the traditional distinction between violent and non-violent action" and recognize a difference between "revolutionary and reactionary violence, between violence practiced by the oppressed and the oppressors." "People experience denied claims as an act of violence against them," Rep. Alexandria Ocasio-Cortez (D–N.Y.) said in a social media post addressing Thompson's murder. If words are violence—and denying a service is violence—then actual violence is justified as retribution. To celebrate the murder of Brian Thompson, one must first dehumanize him by transforming him from a three-dimensional human into a low-resolution symbol. But he was a real person with a family: a father of two and the son of a grain elevator operator who worked his way up the corporate ladder. Thompson reportedly rushed $135 billion to an emergency fund for health care providers, keeping "thousands of hospitals and other healthcare providers afloat during the pandemic." Meanwhile, the federal government struggled to find the money. He was soon after promoted to CEO. He made a lot of money in that role but didn't start out with the same "privileges" as his accused murderer: an Ivy League son of wealthy parents who spent the months leading up to the murder bumming around with friends in Hawaii. There's also <a href="https://apnews.com/article/luigi-mangione-united-healthcare-ceo-d148fbd...

Ross Ulbricht was arrested at 29. Now, he's 40. He faces a double life sentence plus 40 years with no possibility of parole for creating the Silk Road, a dark web drug marketplace that facilitated $1.2 billion in bitcoin-denominated transactions. "I'll spend the next few decades in this cage. Then, sometime later this century, I'll grow old and die. I'll finally leave prison, but I'll be in a body bag," he told an interviewer at a 2021 virtual blockchain conference. But a second chance might be coming for Ulbricht, from an unlikely savior. "If you vote for me, on day one, I will commute the sentence of Ross Ulbricht," Donald Trump told a crowd of attendees at the 2024 Libertarian National Convention. Trump made a deal with the Libertarian Party. And now, Ulbricht might not have to spend his middle and old age behind bars. He might not have to leave in a body bag if Trump makes good on his promise. Will he? In the coming months, career FBI officers and Department of Justice (DOJ) attorneys may dredge up lies about Ulbricht so that Trump will change his mind. They may appeal to some of his draconian instincts. They may try to pin on Ulbricht some of the disastrous outcomes of the drug war. But Trump should ignore the saboteurs and keep his promise to free Ulbricht. Here's why: Ulbricht's arrest on October 1, 2013, at the Glen Park Branch of the San Francisco Public Library was like a scene from an action movie. As he was downloading an interview with Vince Gilligan, the creator of Breaking Bad, he was simultaneously administering the Silk Road in another browser window. Undercover FBI agents staged a physical fight behind him. When he turned his head to observe the commotion, another agent snatched his laptop before he could close the cover, which would have encrypted the contents of its hard drive. The FBI keeps a picture of the computer on display as a trophy from the hunt. But Ulbricht was no Walter White, the frustrated high school chemistry teacher who transforms into a violent drug kingpin in Gilligan's series, driven by his lust for power and retribution. Ulbricht was an Eagle Scout who studied physics at the University of Texas at Dallas on a full scholarship and earned a graduate degree in materials science and engineering at Penn State. He was passionate about libertarian philosophy and Austrian economics, and read Ludwig von Mises and Murray Rothbard. He wrote on LinkedIn that he wanted to use economic theory "as a means to abolish the use of coercion and aggression amongst mankind" and create "an economic simulation to give people a first-hand experience of what it would be like to live in a world without the systemic use of force." "I was trying to do something good," said Ulbricht in a 2021 jailhouse interview. "I was trying to help us move forward." And he did. Ulbricht created an underground e-commerce website called the Silk Road. He was its first vendor, selling homegrown psilocybin mushrooms. The Silk Road became the eBay of drugs, with trusted sellers earning higher ratings, and the message boards filled with tips for safer drug use. It established an ethical code of conduct: No fake degrees, no child porn, no stolen goods. "Our basic rules are to treat others as you would wish to be treated and don't do anything to hurt or scam someone else." "I was trying to help us move toward a freer and more equitable world," said Ulbricht. 
At the same time, the Obama administration's Justice Department was pressuring banks and credit card companies to stop servicing gun shops, adult websites, and payday lenders, even though what they were doing was completely legal. The Silk Road demonstrated that, with bitcoin, you could buy things on the internet by circumventing payment rails that the government controlled. Online trade had become virtually unstoppable. "Back then, bitcoin made me feel like anything was possible," Ulbricht says in his jailhouse interview. Ulbricht became a hero to libertarians. But others say he got exactly what he deserved. "Life in prison without parole. Anybody else? Any other wise guys want to do it? That's what you'll get," gloated Bill O'Reilly on Fox News at the time of the sentencing. For his part, Ulbricht is remorseful and regretful, telling his interviewer from the jailhouse that "we all know the road to hell is paved with good intentions, and now here I am. I'm in hell." Does he deserve this fate? As Ulbricht became paranoid that he'd be caught, did he stray from his high-minded ideals and "break bad" like Walter White? And does that mean Trump should think twice before freeing him? Ulbricht's friends were shocked at his arrest. They described him as "sweet-natured," "loyal," and "guileless and nonaggressive." He comes across as poetic and sensitive in his artwork and an online interview with a friend posted before his arrest, where they each muse about their first love and plans for the future. But many who oppose freeing Ulbricht say that he was also a contract killer, pointing to uncharged allegations that he tried to hire hit men to take out digital bandits during his tenure at the Silk Road. But when you look closer, things get murky. Here's what we know: When building their case, prosecutors drew on chat logs from a moment of crisis at the Silk Road. One of the site's administrators, Curtis Green, had just been arrested. It looks like he might have stolen about $350,000 worth of bitcoin. "Nob," a participant on the Silk Road, was chatting with the site's boss, who called himself the "Dread Pirate Roberts." He told Nob, "This will be the first time I have had to call on my muscle," and asked that Green be "beat up, then forced to send the bitcoins he stole back." Later that day, the Dread Pirate Roberts messaged Nob again, asking whether the order could be changed to execute rather than torture. Nob sent the Dread Pirate Roberts pictures of what looked like Curtis Green being tortured and killed. It turns out that Nob was DEA agent Carl Force, one of two investigators on the case who went to prison for embezzling bitcoin during the investigation. He had staged Green's murder as part of a sting operation. But Ulbricht's defenders say that the Dread Pirate Roberts who was chatting with the corrupt undercover agent who set up a fake hit wasn't actually Ross Ulbricht. After all, the name comes from the film The Princess Bride, in which the Dread Pirate Roberts is not one man but a persona passed from one individual to another across generations. When the Dread Pirate Roberts granted an <a href="https://www.forbes.com/sites/andygreenberg/2013/08/14/an-interview-with-a-digital-drug-lord-the-silk-roads-dread...

Is a nuclear renaissance about to begin on the very site of the public relations catastrophe that practically destroyed the industry 45 years ago? Constellation Energy recently announced a deal with Microsoft to restore a retired reactor on Pennsylvania's Three Mile Island. Microsoft has agreed to purchase energy from the plant for 20 years to power its AI data centers. A U.S. nuclear reactor has never before been brought out of retirement. Nuclear power was once considered the clean energy source of the future, with dozens of new plants coming online in the late '60s and early '70s. But in March of 1979, a meltdown occurred at Three Mile Island's nuclear plant. There were no casualties, and there was no lingering environmental damage. But the incident spooked the nation. From a publicity standpoint, the timing was disastrous—Three Mile Island occurred while The China Syndrome, a fictional account of safety cover-ups at a nuclear plant starring Jane Fonda, Jack Lemmon, and Michael Douglas, was still in theaters. "After Three Mile Island, what was considered to be in the best interest of the public was just reducing risk to as low as possible," says Adam Stein, director of the Nuclear Energy Innovation Program at the Breakthrough Institute. "It resulted in a huge volume of regulations that anybody who wanted to build a new reactor had to know. It made the learning curve much steeper to even attempt to innovate in the industry." The meltdown was a public relations disaster for the nuclear industry, whose expansion tapered off, culminating in a 20-year stretch in which no new nuclear reactors were built in the U.S. "My view is that these supposedly environmentalist groups formed in the 1970s are not primarily pro-environment. They're really primarily anti-nuclear," says Eric Dawson, co-founder of Nuclear New York, a group fighting to protect the industry on the grounds that nuclear is "the most scalable, reliable, efficient, land-conserving, material-sparing, zero-emission source of energy ever created." He says that Three Mile Island empowered the antinuclear movement. The same year as the meltdown, about 200,000 antinuclear activists crowded into New York's Battery Park City, capping off a week of concerts featuring Pete Seeger, Jackson Browne, and Bonnie Raitt that raised awareness and funding for the antinuclear movement. "Stopping atomic energy is practicing patriotism," Ralph Nader told the crowd. "Stopping atomic energy is fighting cancer; stopping atomic energy is fighting inflation." "They are a generation that was radicalized from the Vietnam War," says Dawson. "They became antiwar. They then became anti–nuclear weapons, and then they conflated nuclear weapons with nuclear energy. And they made it their mission to shut down nuclear energy." And they succeeded in that mission. Environmentalists, in effect, may have crippled the only truly viable form of clean energy. The federal government makes permitting arduous. Many states severely restrict new plant construction and force operational ones to shut down prematurely. A striking recent example was the shutdown of Indian Point Energy Center, New York state's largest nuclear plant. Antinuclear activists had targeted the plant. Their cause gained significant traction with the support of New York State Attorney General—and future governor—Andrew Cuomo, who believed the nuclear plant was "risky." Of course, it is true that nuclear energy carries risk. So does every form of power generation. 
"If you look at energy sources, there's nothing that's perfect. There is no utopia. basically we have a choice. Everything is compared to something else," says Dawson. Decades of political attacks on the nuclear industry have caused the United States to rely more on burning fossil fuels, which brings another set of risks. "[Nuclear] would eliminate the majority of pollution-related fatalities in the US, which is thousands a year, because most of those come from coal-fired power plants," says Stein. As politicians have slowly realized that the dangers of nuclear power may have been exaggerated by activists, and the benefits of a reliable emissions-free energy source underappreciated, the regulatory landscape has slowly changed. The first new U.S. reactor built from scratch since 1974 opened in Georgia in 2022—albeit at a very high cost. The federal government issued its first ever approval for a small modular reactor in January 2023. Constellation estimates that it will spend about $1.6 billion to bring the Three Mile Island reactor online by 2028 and will seek to renew the operating license through 2054. Pennsylvania's governor Josh Shapiro wrote a letter to federal regulators asking that the application be fast-tracked. Microsoft's VP of energy calls the deal "a major milestone" in the company's effort to "decarbonize the grid" while pursuing an AI-driven future that's going to require a lot of energy. The Microsoft deal is the latest piece of evidence that nuclear energy—after being hampered by decades of hyper-cautious regulation—is poised for a comeback. Three Mile Island could one day become a symbol for nuclear's rebirth. Photo Credits: RICHARD B. LEVINE/Newscom; FRANCES M. ROBERTS/Newscom; Paul Souders / DanitaDelimont.com / Danita Delimont Photography/Newscom; LAURENCE KESTERSON/KRT/Newscom; Robert J. Polett/Newscom; Dick Darrell/Toronto Star/ZUMA Press/Newscom; St Petersburg Times/ZUMAPRESS/Newscom; Library of Congress/Bernard Gotfryd; Jmnbqb, CC BY-SA 4.0 DEED, via Wikimedia Commons; Meghan McCarthy/ZUMA Press/Newscom; Erik Mcgregor/ZUMA Press/Newscom; Joe Sohm/Visions of America/Joseph Sohm/Universal Images Group/Newscom; Reginald Mathalone/ZUMAPRESS/Newscom; Bastiaan Slabbers/ZUMAPRESS/Newscom; Anthony Behar/Sipa USA/Newscom; */Kyodo/Newscom; Pacific Press/Sipa USA/Newscom; Paul Hennessy/ZUMAPRESS/Newscom; Michael Siluk/UCG/Universal Images Group/Newscom; KEVIN DIETSCH/UPI/Newscom; Reginald Mathalone/ZUMAPRESS/Newscom; ROGER L. WOLLENBERG/UPI/Newscom Music Credits: "Bubbles Drop" ...

Average toddler day care costs in Washington, D.C., exceed $24,000 a year, outstripping expenses in cities like New York and San Francisco. Despite the steep prices, parents such as Megan McCune and Tom Shonosky, who live in a suburban D.C. neighborhood with their children John and Lizzy, believe day care is still worth it. "They're doing these amazing activities with kids. John's last teacher was planning just all these really stimulating, exciting experiences," McCune says. "That's just not something that we can feasibly do and also have full-time jobs." But day care might soon become a luxury the couple can no longer afford. In 2016, the city passed a regulation mandating that day care workers obtain a college degree. The city's logic is straightforward: If D.C.'s day care staff had college degrees, they could do a better job helping disadvantaged kids climb out of poverty. "The developmental opportunities and those early opportunities that they have really set the foundation for their potential success long term," explained local education official Elizabeth Groginsky, a proponent of the regulation. After a delay, the rule was finally implemented in December 2023. Yet contrary to its intended benefits, this regulation could lead to job losses among day care workers, increased operating costs for day cares, and higher tuition for parents. Ami Bawa, lead teacher and assistant director at a nursery school in northwest D.C., exemplifies the unintended consequences of the regulation. Although she has been working in the field for over 20 years, Bawa may now be forced out of her job. "Even though I have a lot of experiential learning, I don't meet what is now the current standard," she explains. As a veteran teacher, Bawa is technically eligible to apply for a waiver to continue working, but she's been waiting for five months for a response from the city. "All of these roadblocks make it harder. We're going to lose a lot of really good teachers," Bawa says. Proponents argue that the regulation will earn teachers more respect and higher salaries. But Bawa disagrees: "A profession like teaching specifically has to be one where you really care for and love what you're doing. What your education credential is doesn't equate to loving and being committed to the field." The regulation "makes us feel like we're interchangeable, like anybody could do this job, when that really is not the case." In addition, the college requirement complicates the process for day cares to find qualified staff. McCune explains, "It's going to be the smaller day cares, the more affordable day cares that are going to suffer because they're not going to be able to attract talent or retain it, and they're not going to be able to put their prices to the level that they need to be to cover that talent, because people like us aren't going to be able to pay it." In 2018, the libertarian-leaning public-interest law firm the Institute for Justice sued the D.C. government to overturn the education requirements, claiming they interfere with the right to earn a living. But the courts ruled in favor of the city on the grounds that the requirement was reasonable. Yet the effectiveness of college requirements remains a subject of debate. As Robert Pianta, a professor of early childhood education at the University of Virginia, points out, "The evidence for a two-year degree or a four-year degree is not strong." 
There are over 3,000 early childhood degree programs across the United States, and they vary significantly in terms of what they teach and focus on. "With all that variation under there, it's no surprise to anyone that the degree itself doesn't matter," Pianta says. Many day care teachers eager to retain their jobs have enrolled part-time at institutions such as Trinity Washington University, a small college in the district. To earn the degree required to be an assistant teacher at a D.C. day care, students at Trinity can take classes like American history and music appreciation but aren't required to take courses in early education. Councilmember Christina Henderson supports the idea that day care workers study subjects unrelated to early education, emphasizing the importance of "critical thinking and learning." In contrast, McCune remarks, "Let's just back up a little and remember that these are babies….I think the needs of children at that stage, they're pretty primal." Nicole Page, a local preschool director, believes that "it does not only take education, it takes experience" to work at a day care. "That's what we will lose if we are not able to retain our staff, is the wealth of knowledge that they have by hands-on experience." Her preschool is at risk of losing valuable staff, with at least 11 teachers failing to meet the new qualifications. One teacher even has a Ph.D. in family and children studies and is an adjunct professor teaching a policy and advocacy course for early childhood education at a local university, but she's no longer qualified to teach at a day care because her degree isn't in early childhood education. "If we are not able to retain the staff that we have, we may end up having to close some of our classrooms," Page explains. This regulation, intended to improve child care quality, may instead harm those it aims to assist. "I just think in D.C., there's a lot of bureaucracy," says Shonosky. "This is just another case where bureaucracy is going to make our lives worse." Music Credits: "Pizzi Waltz" by Kadir Demir, via Artlist; "Against the Clock" by Rhythm Scott, via Artlist; "The Morning Lights" by Francesco DAndrea. via Artlist; "Sophisticated Nostalgia" by Nobou, via Artlist; "Deep Dive" by Ty Simon, via Artlist; "The Isle" by Rhythm Scott via Artlist; "Grey Shadow" by ANBR, via Artlist; "Cur...

After surviving a disastrous congressional hearing, Claudine Gay was forced to resign as the president of Harvard for repeatedly copying and pasting language used by other scholars and passing it off as her own. She's hardly alone among elite academics, and plagiarism has become a roiling scandal in academia. There's another common practice among professional researchers that should be generating even more outrage: making up data. I'm not talking about explicit fraud, which also happens way too often, but about openly inserting fictional data into a supposedly objective analysis. Instead of doing the hard work of gathering data to test hypotheses, researchers take the easy path of generating numbers to support their preconceptions or to claim statistical significance. They cloak this practice in fancy-sounding words like "imputation," "ecological inference," "contextualization," and "synthetic control." They're actually just making stuff up. Claudine Gay was accused of plagiarizing sections of her Ph.D. thesis, for which she was awarded Harvard's Toppan Prize for the best dissertation in political science. She has since requested three corrections. More outrageous is that she wrote a paper on white voter participation without having any data on white voter participation. In an article in the American Political Science Review that was based on her dissertation, Gay set out to investigate "the link between black congressional representation and political engagement," finding that "the election of blacks to Congress negatively affects white political involvement and only rarely increases political engagement among African Americans." To arrive at that finding, you might assume that Gay had done the hard work of measuring white and black voting patterns in the districts she was studying. You would assume wrong. Instead, Gay used regression analysis to estimate white voting patterns. She analyzed 10 districts with black representatives and observed that those with more voting-age whites had lower turnout at the polls than her model predicted. So she concluded that whites must be the ones not voting. She committed what in statistics is known as the "ecological fallacy": using aggregate, group-level data to draw conclusions about individuals—you see two things occurring in the same places and assume one explains the other. You notice that a lot of people die in hospitals, so you conclude that hospitals kill people. The classic example: Jim Crow laws were strictest in the states with the largest black populations, so ecological inference leads to the false conclusion that black voters supported Jim Crow. Gay's theory that a black congressional representative depresses white voter turnout could be true, but there are other plausible explanations for what she observed. The point is that we don't know. The way to investigate white voter turnout is to measure white voter turnout. Gay is hardly the only culprit. Because she was the president of Harvard, it's worth making an example of her work, but it reflects broad trends in academia. Unlike plagiarism, which is treated as an academic crime, inventing data under the guise of statistical sophistication is something students are taught and encouraged to do. Academia values the appearance of truth over actual truth. You need real data to understand the world. The process of gathering real data also leads to essential insights. Researchers pick up on subtleties that often cause them to shift their hypotheses. Armchair investigators, on the other hand, build neat rows and columns that don't say anything about what's happening outside their windows. 
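To see how aggregate-only reasoning can mislead, here is a minimal sketch in plain Python with invented numbers (not Gay's data, districts, or model): in every synthetic district whites turn out at a higher rate than blacks, yet a regression of district-level turnout on white population share comes out negative, the sort of pattern that ecological inference would misread as whites staying home.

# Hypothetical illustration of the ecological fallacy with synthetic districts.
# In every district, white turnout is 5 points higher than black turnout,
# but districts with more white residents happen to have lower baseline
# engagement, so the aggregate regression points the wrong way.
import numpy as np

white_share = np.linspace(0.1, 0.6, 10)         # fraction of voting-age whites per district
baseline = 0.70 - 0.40 * white_share            # district-level confounder (invented)
white_turnout = baseline + 0.05                 # whites vote MORE in every district
black_turnout = baseline
overall = white_share * white_turnout + (1 - white_share) * black_turnout

# Aggregate-only analysis: regress overall district turnout on white share.
slope, intercept = np.polyfit(white_share, overall, 1)
print(f"district-level slope: {slope:.2f}")     # negative: "whites must not be voting"

# Individual-level truth: whites turn out at a higher rate everywhere.
print("white minus black turnout in each district:",
      np.round(white_turnout - black_turnout, 2))

The only way to know which story is true is the one recommended above: measure white turnout directly rather than infer it from district totals.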
Another technique for generating rather than collecting data is called "imputation," which was used in a paper titled "Green innovations and patents in OECD countries" by economists Almas Heshmati and Mike Tsionas. The authors wanted to analyze the number of "green" patents issued by different countries in different years. But the authors only had data for some countries and some years. "Imputation" means filling in data gaps with educated guesses. It can be defensible if you have a good basis for your guesses and they don't affect your conclusions strongly. For example, you can usually guess gender based on a person's name. But if you're studying the number of green patents, and you don't know that number, imputation isn't an appropriate tool for solving the problem. The use of imputation allowed them to publish a paper arguing that environmentalist policies lead to innovation—which is likely the conclusion they had hoped for—and to do so with enough statistical significance to pass muster with journal editors. A graduate student in economics working with the same data as Heshmati and Tsionas recounted being "dumbstruck" after reading their paper. The student, who wants to remain anonymous for career reasons, reached out to Heshmati to find out how he and Tsionas had filled in the data gaps. The research accountability site Retraction Watch reported that they had used the Excel "autofill" function. According to an analysis by the economist Gary Smith, altogether there were over 2,000 fictional data points, amounting to 13 percent of all the data used in the paper. The Excel autofill function is a lot of fun and genuinely handy in some situations. When you enter 1, 2, 3, it guesses 4. But it doesn't work when the data—like much of reality—have no simple or predictable pattern. When you give Excel a list of U.S. presidents, it can't predict the next one. I did give it a try though. Why did Excel think that William Henry Harrison would retake the White House in 1941? Harrison died in office just 31 days after his inauguration—in 1841. Most likely, autofill figured it was only fair that he be allowed to serve out his remaining years. Why did it pick 1941? That's when FDR began his third term, which apparently Excel consi...
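As a purely hypothetical sketch of why autofill-style guesses are not data (synthetic numbers, not the Heshmati and Tsionas dataset), the snippet below fills a patent series that has only three observed years with straight-line guesses; any regression run on the "completed" series then treats the invented points as if countries had actually reported them.

# Synthetic example of autofill-style imputation; not the actual OECD patent data.
# Three years are observed, the rest are filled by straight-line interpolation,
# with the trailing gap held at the last known value -- all invented numbers.
import numpy as np

years = np.arange(2000, 2011)
observed = np.array([12, np.nan, np.nan, 9, np.nan, np.nan, np.nan,
                     15, np.nan, np.nan, np.nan])

known = ~np.isnan(observed)
filled = np.interp(years, years[known], observed[known])

for year, obs, guess in zip(years, observed, filled):
    label = "observed" if not np.isnan(obs) else "IMPUTED"
    print(year, f"{guess:5.1f}", label)

print("share of the 'data' that is invented:", round(1 - known.mean(), 2))

In the actual paper the stakes were higher: the filled-in values fed a statistical model, which is how a dataset that is roughly 13 percent invented can end up driving a published conclusion.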

In his 1995 book, The Vision of the Anointed, economist Thomas Sowell sketched out a pattern that many of the "crusading movements" of the 20th century have followed. First, they identify a "great danger" to society, followed by an "urgent need" for government action "to avert impending catastrophe." A new book by psychologist and author Jonathan Haidt, The Anxious Generation, argues that the government must regulate social media because it's causing a teen mental health crisis. Haidt is, in many ways, a model researcher because of his rigor, transparency, and openness to dissent. On this issue, however, he fits neatly into Sowell's framework. Those best equipped to get attention from the government and the media are the most "articulate" people, Sowell observes, and they often reference opaque studies without explaining them. And Haidt is certainly articulate—his book is well-written and filled with compelling insights. But he claims far too much certainty for his views, based on research that is mostly junk. And he advocates for restrictive government policies without doing the simple tests that might support or disprove their value. Academic studies often make use of statistical techniques that are hard for the average person to decipher, which is a shame because "most published research findings are false," as Stanford's John Ioannidis argued in a 2005 paper. Ioannidis wasn't just referencing the many scandals of fabricated data, conscious or unconscious bias, and misrepresented findings. Even top researchers at elite institutions have been guilty of statistical malpractice. Peer review is worse than useless, better at enforcing conventional wisdom and discouraging skepticism than weeding out substandard or fraudulent work. Academics face strong pressure to publish flawed research. Few have the skill and drive to produce high-quality publications at the rate required by university hiring and tenure review committees. Even the best researchers resort to doing some easy, low-quality studies. Bad studies tend to be the most newsworthy and the most policy-relevant. Many of the papers Haidt compiled contained coding errors, inappropriate statistics, and other issues. Most downloaded some data of little relevance—either cheap to generate, like surveying your sophomore psychology students, or data collected for a different purpose—and analyzed it with an off-the-shelf statistical approach. Haidt cites 476 studies in his book that seem to represent an overwhelming case. But two-thirds of them were published before 2010, or before the period that Haidt focuses on in the book. Only 22 of them have data on either heavy social media use or serious mental issues among adolescents, and none have data on both. There are a few good studies cited in the book. For example, one co-authored by psychologist Jean Twenge uses a large and carefully selected sample with a high response rate. It employs exploratory data analysis rather than cookbook statistical routines. Unfortunately for Haidt, that study undercuts his claim. The authors did find that heavy television watchers, video game players, and computer and phone users were less happy. But the similar graphs for these four ways of spending time suggest that the specific activity didn't matter. This study actually suggests that spending an excessive amount of time in front of any one type of screen is unhealthy—not that there's anything uniquely dangerous about social media. 
An example of a bad study that Haidt cites in his book is one that paid $15 each to 1,787 self-selected internet respondents, aged 19 to 32, to answer 15 minutes' worth of questions online. Few were likely to have been teenage girls, and there's no reason to expect any were depressed teenage girls who used social media. In fact, I couldn't find any studies in Haidt's compendium that spent substantive time interviewing any depressed teenage girls or heavy social media users. Another study that uses low-quality data was based on surveys of 143 University of Pennsylvania students, who participated for psychology course credit. Undergraduate psychology students at an elite university are hardly a representative sample of the population. The authors seem to have made significant coding and random assignment errors. Haidt cites 17 studies he considers to be longitudinal that either find no effect or an effect in the opposite direction of his claim, and only four were true longitudinal studies, meaning they analyzed the same group of people at different times to see how changes at one time, like increased social media use, were associated with future changes, like more depression. One of the studies on Haidt's list contradicted his claim, finding that depression occurs before social media use, not the other way around. Practically all of the studies Haidt cited either have major methodological errors or didn't say what he claimed. I doubt many of the senators on the Judiciary Committee or members of their staff, to whom Haidt claimed an "urgent need" for government action, read the underlying papers. The fact that Haidt cites a lot of studies just makes the problem worse. Errors cascade; they don't cancel each other out. Even the best studies Haidt relies upon have the fatal flaw of not studying the subject. You can't establish the effect of heavy social media use on teenage girl depression unless you study heavy social media users and depressed teenage girls. None of the studies do this. Instead they study mostly adults, mostly average social media users, without serious psychological issues. When Haidt moves on to policy, a different kind of study is called for. If you want to ban phones in schools, study kids who went to phone-free schools vs. a control group of kids who were allowed to use phones in school. Even this would only tell you average effects, but at least showing the change is positive on average is a step toward testing your hypothesis. To be fair, Haidt does have some solid policy proposals that don't suffer from this flaw. When he argues that governments should stop criminalizing free play or that internet companies should insist on more reliable age-verification measures, which parents are requesting, he doesn't need studies. When Haidt first presented his argument on Substack, I critiqued the studies he cited, and he responded by suggesting that I have impossibly high standards for proof and that the consensus of hundreds of researchers studying related issues is strong enough to justify policy actions. My objection is that the researchers whose work Haidt relies on didn't bother to talk to teenagers who are heavy social media users and who are in treatment for depression. Based on poor quality studies, he wants the go...
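For the phone-ban policy question, the kind of comparison described above could be as simple as the following sketch, which uses made-up well-being scores rather than any real school data; a proper study would also need random assignment and pre-registered outcomes.

# Hypothetical comparison of a well-being score at phone-free schools vs. schools
# that allow phones. All numbers are invented for illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
phone_free = rng.normal(loc=6.3, scale=1.5, size=400)    # invented outcome scores
with_phones = rng.normal(loc=6.0, scale=1.5, size=400)

t_stat, p_value = stats.ttest_ind(phone_free, with_phones, equal_var=False)
print(f"difference in means: {phone_free.mean() - with_phones.mean():.2f}")
print(f"Welch t = {t_stat:.2f}, p = {p_value:.4f}")
# Even a small p-value here would only establish an average effect, which is
# exactly the limitation noted above.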

Once upon a time, America embraced nuclear power as the future of energy. Today it accounts for a mere 18 percent of the nation's electricity generation, while fossil fuels remain dominant at 60 percent. Why did nuclear fail to take off? From 1967 to 1972, the nuclear sector experienced significant growth, and 48 new nuclear plants were built. But in March 1979, a meltdown at Pennsylvania's Three Mile Island nuclear power plant, which resulted in no casualties and no lingering environmental damage, spooked the entire nation and empowered anti-nuclear activists. "After Three Mile Island, what was considered to be in the best interest of the public was just reducing risk to as low as possible," says Adam Stein, director of the Nuclear Energy Innovation Program at the Breakthrough Institute. "It resulted in a huge volume of regulations that anybody who wanted to build a new reactor had to know. It made the learning curve much steeper to even attempt to innovate in the industry." After the incident, the momentum behind nuclear reactor construction tapered off and no new reactors were built for the next two decades. Nowadays, the landscape remains unchanged: The federal government makes permitting arduous, while many states impose severe restrictions on new plant construction and force operational ones to shut down prematurely. For example, take Indian Point Energy Center, the largest nuclear plant in New York State. In 2007, anti-nuclear activists targeted the plant, which provided a quarter of downstate New York's electricity. Their cause gained significant traction with the support of New York state attorney general—and future governor—Andrew Cuomo, who believed the nuclear plant was "risky." Cuomo promised to usher in a new era of clean energy for New Yorkers. His moves against Indian Point garnered support from fellow Green New Deal advocates, including Senator Bernie Sanders (I-Vt.) and Rep. Alexandria Ocasio-Cortez (D–N.Y.), as well as environmental groups. The plant eventually closed in April 2021, but there was "a gulf between intentions and results," explains writer Eric Dawson, co-founder of Nuclear New York, a group fighting to protect the industry. The closure of Indian Point increased New York's carbon emissions. State utilities had to make up for the loss of energy by burning more natural gas, resulting in a 9 percent increase in energy-related CO2 emissions. At the same time, the state's energy prices also increased. This outcome isn't unique to New York. Germany also opted to phase out nuclear power, betting on wind instead. Electricity from wind turbines increased, but so did the country's reliance on coal. In 2023, Germany emitted almost eight times as much carbon per kilowatt-hour as neighboring France, which still gets the majority of its electricity from nuclear and less than 1 percent from coal. According to Dawson, nuclear power is "the most scalable, reliable, efficient, land-conserving, material-sparing, zero-emission source of energy ever created." Wind and solar aren't as reliable because they depend on intermittent weather. They also require much more land than nuclear plants, which use about 1 percent of what solar farms need and 0.3 percent of what wind farms require to yield the same amount of energy. The economics of nuclear power are undoubtedly challenging, but its advocates say that's primarily because of its thorny politics. The headache of building a new power plant is vividly exemplified by Georgia's Plant Vogtle. The first U.S. 
reactor built from scratch since 1974, the project turned into a nightmare scenario: It took almost 17 years from when the first permit was filed for construction to begin, it cost more than $28 billion, and it bankrupted the developer in the process. Nuclear regulation is "based on politics and fear-mongering and a lack of understanding," explains Indian Point's vice president, Frank Spagnuolo. If they aren't shut down, he says, power plants such as Indian Point could safely continue to provide clean energy for decades. Despite the opposition, there remains some hope for the future of nuclear energy. Companies are actively developing next-generation nuclear technologies, such as small modular reactors and molten salt-cooled reactors, to minimize the risks associated with nuclear meltdowns and explosions. And some former nuclear opponents have become advocates, acknowledging it as a vital source of clean energy. The converts include the Environmental Defense...