A
Deel's not just another payroll platform. It's one your team might actually enjoy. HR, IT, and payroll together, finally. Built in house, built for peace of mind. Visit deel.com/HBR.

I'm Adi Ignatius. I'm Alison Beard, and this is the HBR IdeaCast. Alison, it's been interesting to live through the evolution in economics. You know, in the 70s and the 80s, you had this early explosion of behavioral economics, led by people like Daniel Kahneman and Richard Thaler. And then 20 years ago, we had the Freakonomics phenomenon. You had Steven Levitt, an economist, and Stephen Dubner, a journalist, who wrote a book that popularized all of this thinking, that attempted to show the hidden side of everything, what truly motivates us as economic actors. And the field took off behind this basic tagline that conventional wisdom is wrong. Are you a fan? Yeah, absolutely. I mean, that was the book that made economics cool. Everyone wanted to read it, everyone wanted to replicate it. And I do think it changed the way that companies think about consumer behavior, and also the way that managers thought about getting the best out of their employees. Yeah, I think that's right. I think it opened our eyes to a lot of phenomena that were not immediately clear. I think it also changed academia in some ways, in that professors saw the success of Malcolm Gladwell's books and of Freakonomics and thought, well, wait a minute, I can do research that is also broadly relevant, that has popular appeal, and I can make a name for myself. So I think the book has actually had a profound impact. So 20 years later, I spoke to Stephen Dubner, the journalist half of the duo, and we talked about the book's legacy. We talked about what they got wrong, what they got right, and why, and how we can find the hidden side of everything. So here's that conversation. All right. So, Stephen, welcome to the HBR IdeaCast.
B
Thank you so much. Appreciate it.
A
So congratulations. The book is 20 years old. It has had an outsize influence, I'd say, on American culture, on global culture. Levitt, I guess, was an early member of that generation of economists that kind of shook up the profession, making use of data sets that either didn't exist yet, because we were only just entering the data era at that point, or were inaccessible, or didn't seem seemly for economists to use, and that really looked at what truly provoked consumer behavior, individual behavior. So how did the economics profession respond to him, and then to him and you, when the book came out, and especially when the book sold so many copies?
B
I think there was a range. There were some people who were irate with Levitt. There were people who felt that he was showing how the magic, the regression analyses, get performed, and I think they weren't so happy about that. And then there were a lot of economists that we started to hear from, maybe six to 12 months later, who recognized that the success of Freakonomics was actually really good for them, too. And it came in a couple of forms. One was: my mother, or maybe more often my mother-in-law, has no idea what I do; thank you for writing a book that she can read that sort of explains it. The other was that economists recognized that since this was a book that became very popular, and it did have a foundation in how an academic economist goes about doing his work, it would probably drive more students to economics at universities. And whether our book did that or just happened to coincide with a big surge in economics majors, I don't know. But there was a big surge, and therefore it did, I would say, enlarge and maybe enrich the market for economics professors. So, you know, anytime you do anything that's high profile, it doesn't even have to be popular, but high profile, a lot of people's first instinct, they're almost trained this way, if you're a journalist, if you're an academic, if you're a certain kind of cultural critic, is to look for the flaws, or to look for the ways in which it can't be as good as everybody's saying it is. So I get that. Why it captured the public's attention the way it did is hard for me to explain. I have a lot of half-baked theories; probably none of them are very correct. But I mostly dwell in the gratitude space, and then, like, let's get on with it: how can I use this opportunity, this platform, to do more work that I like to do? And Levitt felt the same.
A
So part of the, the appeal of the book, and really the point of the book is that there's a hidden side to everything. But isn't that also risky and almost conspiratorial? I mean, aren't the best explanations usually or often simple and straightforward?
B
I think a lot of times they are, for sure. We tried to point out when the conventional wisdom is, quote, wrong, or maybe partially wrong. We tried not so much to just play gotcha and say, oh, look at you being a simpleton for thinking X causes Y. What we tried to do is look at the whole ecosystem around that particular piece of conventional wisdom and look at how it was created, and who were the experts who created and published and promulgated that conventional wisdom, and did they have their thumb on the scale or a horse in the race in any way? And the answer to that question is almost always yes. I feel we've always been pretty positive about the systems, even when we critique them, because, you know, humans, I think we're a pretty great species, but we're fallible. So even when we mean well, we often mess up a little bit. We might have an intention to create some great piece of policy, some pro-social policy, maybe in the education space or the healthcare space. And then when you gather the data 10 years later, you see, oh wow, not only did it not produce the gains we thought, but it actually kind of backfired. There were some unintended consequences. But you don't go throwing those charges around willy-nilly. You wait until there's data and you try to measure the effect; you don't make the claim without any evidence.
A
I would say that the success of Freakonomics, and the success of Malcolm Gladwell's writing, and I interviewed Malcolm on this podcast a little while ago, was partly a cause of a kind of re-energizing of the economics profession, and certainly the behavioral economics profession. But I also think there's a crisis in that area, in social science research, and certainly in business and workplace-related research, where you're seeing a slew of retractions, fabricated and shoddy data, and then this whole network of self-appointed arbiters who are ready to try to replicate and to denounce the scholarship. So, you know, I think you may have unwittingly contributed to that, because you made this stuff accessible, because people realized that if you produce an interesting counterintuitive finding, you're going to get press coverage, right? You're going to get attention, you're going to sell books, and that's created an incentive that, if you're a skeptic or a cynic, makes you think, well, people are putting their finger on the scale with the data to try to come up with these things. I'd be interested in your take on that, because unintentionally you're part of this kind of new world of scholarship that can be more popular and can be more lucrative. So talk about that, and the pressure to get it right given those circumstances.
B
I mean, first of all, I would say that the incentives to get attention for yourself or your work are as old as humankind. Look at yellow journalism from, whatever, 100, 130 years ago. And, you know, I came up as a journalist, so all these axioms are drilled into your head. Dog bites man, ho hum; man bites dog, great local TV news. If it bleeds, it leads. Is it representative at all? No, it's not representative at all. I came up as a print journalist. I was at the New York Times. We liked to say that TV is called a medium because it is neither rare nor well done. So all of us have been, well...
A
And then, remember when USA Today came out, somebody said that this is for people who find TV news too complex.
B
So the incentives to get attention like that, whether you're a producer of news, a producer of research, et cetera, have always been very, very strong. It's funny, because people did single out little tidbits from our book, and they did become cocktail party fodder. Did you know that blank, blank, blank? And it's funny because we didn't treat it like that when we wrote it. I think the first chapter began with this story about a fine gone wrong at an Israeli daycare center where parents were coming late. So the school instituted a fine so that the parents would not come late, but the fine was relatively small. And what happened was that more parents began to be late, because they figured, I can just pay this small fine, a few dollars, and actually get in an extra game of tennis or whatever. It's like cheap childcare. So that was the kind of story that people would love to retell to us, which was fine. But we told that story, which was based on a research paper by Uri Gneezy and Aldo Rustichini, and the whole point of telling it was to give a really tiny example, a tiny-scale example of a few dollars, of how incentives work. So when people talk about incentives, especially economists, the first consideration is usually financial incentives. And we were trying to create a framework, or an argument, that there are all kinds of incentives that we're all always responding to. I have a strong incentive to burnish or protect my reputation. There may be moral or social incentives that matter a lot more to me than financial incentives. So, you know, I guess I feel like we weren't just dropping one-liners to get attention; this book of ours was full of stories connected to, like I said, bigger ideas or bigger themes.
In terms of the larger issue that you raise about academic fraud and the replication crisis and so on, the incentives of the researchers, and the incentives of the universities to protect research that may be bad, are so strong. And that's because the universities are run by lawyers, and lawyers are extremely, extremely risk averse. So I hate, hate, hate the fraud. I will say this: we've also covered the whistleblowers and the do-gooders, if you want to call them that. Ivan Oransky at Retraction Watch does amazing work. The three guys at Data Colada have done a lot of very selfless work, work that draws attention to them for being the whistleblowers, which is not what they want. There are a lot of people fighting the good fight there, but again, the incentives are massive. There are hundreds, maybe thousands, of essentially fake academic journals that exist to offer paid publication to scholars, or would-be scholars, but especially second- or third-tier scholars. So that's a joke. It's a terrible situation. Those same journals will also charge university libraries a huge amount to force them to carry the journal. So it's very, very murky in there. One thing that always attracted me to economists, I will say, is this. I don't want to say that economic research is prima facie more robust than psychology research or sociology research or anthropology research. But the economists that I've interviewed and gotten to know over the years (at least many dozens, probably hundreds, along with a lot of psychologists whose work I really respect), their research feels very different. There's a lot more data in an economics paper. If you really wanted to fake a massive data set, you could, but you'd be crazy to try, because of the way that peer review works.
The other thing I love about economics papers, and that I wish the other social sciences would copy, is that when you're an economist and you write a paper for publication, you say: here's my thesis, here's my data, here's my methodology, and here's my chief finding. Here's what I'm going to argue in this paper, and here's how I'm going to argue it. And then they will typically say, now, I'm arguing that X causes Y, but there are others who might have thought in the past, or might think in the future, that A causes Y, or B, or C. I'm going to go through those and explain why I don't think those are reliable answers, or why they maybe carry a small piece of explanatory power, but not as large as the piece that I'm about to give you. I find that to be an unbelievably useful exercise. It also takes, you know, two, three, four years to write a paper like that. So it's a very labor-intensive thing.
A
You know, there is the basic question of causation versus correlation. And that seems to be where amateurs trip up, or journalists trip up, or journalists can get duped, or something like that. And, you know, if you purport to show that the consumption of Pop-Tarts is driving up the price of oil, I'll run that story. But you probably have a causation-correlation problem there. How have you learned to apply rigor to that question?
B
It's really hard. I mean, the more complex something gets, the harder it is to say with any kind of certainty that X causes Y. And we do try to illustrate in the book, and certainly in the radio programs ever since then, how easy it is to fall into a bad correlation-causation trap. One example we wrote about, maybe in our second book, was polio, which was a really terrible but interesting disease in the way it was approached. The cause was essentially unknown for a long time, though the way it spread eventually came to be known, and once the vaccines were available, that question became less pertinent. But there was a theory at a certain point. Since polio tended to spike in the summer, a lot of people would keep their kids out of swimming pools and things like that, and there was a theory that the consumption of ice cream caused polio. Similar to your Pop-Tarts and oil, maybe. So you can see why, especially in the moment, especially if things are going sideways, especially if people are scared, people attach themselves to what seem to be causal relationships that are only correlational at best. What we tried to do is really just show the homework. Probably the most high-profile and highest-stakes example of a causal argument in Freakonomics was based on a paper that Steve Levitt had written earlier with John Donohue about how the legalization of abortion led, a generation later, to a decrease in crime, because it made abortion available to women whose children would have been born into circumstances that are more likely to lead them to trouble: lower income, unstable families, unstable circumstances. Interestingly, one thing we wrote that people almost never attach to that, or remember from it, is that it didn't necessarily mean that that would-be mother wouldn't have a child.
It was often a timing thing, and the child would often come later, when she was better set up to have a kid and to raise that kid productively. So it was really an argument about how unwantedness is a driver of an outcome, or a life, that is more likely to have trouble in it, whether it's crime or whatnot. But when we wrote that piece of the book, and this was really drawing on a very, very intense and robust analytical paper that Steve Levitt had already written with John Donohue, we again entertained all kinds of other potential explanations. And then you marshal what Levitt would call a collage of evidence to make your claim feel as if it's about as substantiated as it can be. So, for instance, one simple maneuver in that thought process would be to say, okay, if the legalization of abortion, represented federally, nationally, by Roe v. Wade in 1973, led to a decrease in crime 15, 16, 18 years later, what do you do to get something outside that sample set? In 1973, a switch is turned on. You're always looking for a place where a switch might have been turned on earlier or later, or a different kind of switch was turned on, or something else came in to provide what they call an instrumental variable that's clean. In the case of abortion, one piece of evidence from that collage was that, I believe, five states had previously legalized abortion separately, as states. So, okay, now you've got a different instrumental variable to play with. And you can look at a place like New York or California and ask, did crime begin to fall earlier there than it did elsewhere, because there was availability of abortion earlier? And the answer was yes. So when you're trying to make an argument, whether you're a politician or a business person, whatever, I like people who show their homework.
And then when I'm interviewing them, I like to ask, and this is a very basic question I would encourage everybody to ask all the time: what is your best evidence that the argument you've just made is true? And if you don't have any evidence, then I can probably assume that you're just BSing me or making it up. If you offer me evidence that is based on a survey of 18 people on the street, I'm going to say, well, that doesn't sound very reliable either. If you say your evidence is based on administrative data from a big state in India from the 1960s, I'm going to say, well, yeah, that's a lot of data, but it's from a long time ago. So, yeah, you've got to poke at and investigate the data as best as you can if you want to make an argument that something is actually causal. And that said, making causal arguments is really hard. That's why science is hard, and that's why I like scientists.
A
One difference between now and when the book came out: back then, you could disagree with the methodology, you could talk about correlation versus causality, but I feel like we sort of agreed that there were verifiable facts that could be analyzed and picked apart. I don't think we're in a post-fact era, because I think we have more data at our fingertips than ever. But somehow, in the political divide that we have, people are comfortable with the idea of there being alternate facts, and with dismissing inconvenient facts as somehow fake or made up. I worry about that a lot as a journalist, as you can imagine. But I guess I'm curious: you're ultimately looking at some fact-based set of numbers and circumstances that can be analyzed, but is the world open to fact-based explanation at this point?
B
I'm with you. You know, it saddens me, it frustrates me, it scares me how easy it is for people to think, to put it really basically, that something is true if it's demonstrably not true. Now, granted, sometimes it's hard to prove that something is demonstrably true. I guess I've responded to that by saying, look, I don't think I'm the person who's going to be able to fix that problem. I think there are millions of people in the world who are struggling to fix that problem, and I don't know if they're doing very well either. But it's just not something that I think I have a particular talent for. What I try to do is just practice what I kind of would like to see preached a little bit more, which is: find the people who know the stuff. Ask them questions that are fairly well informed. I try to do a good bit of prep before I interview anybody about anything. I try to give them the room to really explain themselves. I listen really hard, because people misspeak all the time in interviews, and sometimes I don't catch it. And then we're looking at the transcript later, and someone said "went inside the curve" instead of "went outside the curve," and we're like, oh, well, I guess we don't want to use that tape. But then we'll call them up and say, did you misspeak here? And if they did, then we may re-record them saying it correctly. But that's just basic journalism, basic fact-checking. That's the world where I came up, in which I still operate, and which I still like. But then, I think it's just the way you do your work, and people either subscribe to that style or not. Like our listenership at Freakonomics Radio. It's a very interesting listenership. It's broad and it's very diverse. I would argue, and this is probably an unprovable argument, that it's one of the most diverse large niche media audiences in the world.
We've got about two and a half million people who listen to at least one of our episodes every month. And the emails that we get are from such a diversity of people, doing so many different things, and the way that they think about the world is so different. And I take a lot of comfort in that, because I think there is a large community out there of people who simply want to understand the world better, often with the intention of making the world a little bit better than when they came into it. And that's me; that's what I like to do too. So I'll give you a for instance: we're working on a two-part series right now about the air traffic control system in the United States. I'm sure that if I went on Reddit for two hours, I could come up with a story that would frustrate, scare, whatever, do a lot of things, and it would be based on probably very little fact, at least very little primary-source fact, but a lot of passion. And look, passion is wonderful. So what we did for that, and I have to say the producer on this, Teo Jacobs, did a really, really good job of working through a lot of potential sources. And the sources that he came up with for me to interview are all extraordinarily substantial, smart, and good at describing the scenarios they're dealing with. One of them is the CEO of a major airline. One of them is an economist who specializes in transportation deregulation and really knows the FAA inside and out. Now, these are people who, when you hear them being interviewed, and I would argue that almost anybody hearing or seeing this would agree, make you say, okay, I get it, this person knows what they're talking about. They're not BSing, they're not pushing a position, et cetera. So that is a style of journalism, or storytelling, that I like doing. It's fun, it's interesting. I believe it when I publish it. I sleep well at night.
Look, there's a lot more room for that kind of journalism and storytelling in the world. So for anybody out there who feels like they're locked into this thing where they have to come up with tiny little cocktail party nuggets that are going to get them a new contract or an invitation to something, I would just say the water over here is good. It takes a little bit more work. You have to adjust your expectations a little bit. You know, I grew up on a farm. As a kid, we used to make maple syrup and do all this outdoor work that was back-breaking and very labor intensive. And I sometimes think of this kind of journalism as like making maple syrup. You have to go around to all the trees, you've got to bang in the taps, you put up a bucket, you have to empty out the buckets when they fill up, then you've got to boil down all that sap. And you get this much at the end of the day. And I'm like, that was a lot of work to get this much. And that's kind of the way I feel every Friday morning when we publish an episode of Freakonomics Radio. It's one little episode in a sea of things, but I feel, well, maybe it's not as delicious as maple syrup, but I think it tastes pretty good.
A
So let me ask, before we have to close: as you reread it, is there something, or are there things, that make you cringe, where you just think, wow, I would not have done that if we were doing the book now, I wouldn't do it the same way?
B
I think if we wrote it now, the tone would be different. We had a really good time writing it. We were often playful in a way that I think now would feel a bit callow. We were also younger. You know, I'd like to think that I've gotten a little bit wiser, kinder, as I've gotten older. I've tried, at least. I would say there were very few errors that needed correction. One that did, and we corrected it early, was really painful, because it meant kind of taking down a hero. We wrote about this guy named Stetson Kennedy, who had done a lot of work against the Klan and against racism in the South. At great personal risk to himself, he infiltrated these Klan meetings and Klan groups and so on. As it turns out, after we published the book, I heard from someone who had worked in some archives that intersected with this work on the Klan. And according to our best reading of all the archives that came to light after the publication of the book, this person was right: it seems as though Kennedy had exaggerated his activities, or conflated his activities with those of someone else who was an undercover agent. And that was really painful, because I had to confront him. He was an older guy; he's since died, Stetson Kennedy. And he did a lot of very, very good work. But I couldn't let it stand if he had exaggerated to that extent. So I called him up and said I needed to talk to him about something serious. He said, come on down. He was in Florida. I flew down there, we went to this place for lunch, and I just laid out the argument. The look on his face, I'll never forget. It was a very unpleasant reaction to have been the cause of. But he also didn't admit it. He just kind of said, I don't know what you're talking about; everything I wrote is true. He'd written this probably 40, 50 years ago. But the more I tried to unpeel the evidence and present it to him and say, this doesn't match with this, and this doesn't match with this...
In the end, I was persuaded I was right. So what we did in that case was write a column for the New York Times about how we had gotten this wrong, why we'd gotten it wrong, and what the truth was. It didn't change the nature of the story, but it certainly changed some of the facts. And then we rewrote that section of the book and republished it for later editions. So, look, nobody wants that to happen. Anybody who's written anything that gets read by more than 20 people has probably had one of those 20 come and say, you know, that's not quite right. So you do as much as you can to get it right. With every episode of Freakonomics Radio, we research, we interview, we fact-check, we fact-check again. We still make mistakes. And if we make one that gets into the publication, we republish. But in terms of Freakonomics, the book, I like it. I liked it a lot. I learned a lot. I like that it's out in the world and it makes people think. You know, we've gotten thousands of emails and letters from people who have been inspired, whether by our books or the radio shows, to do amazing things. It kicked off a big kidney-donation community; it kicked off a lot of education and healthcare things. So I'm very, very proud of that, happy about that. And for anybody who became more cynical because of what we wrote, I'm sad about that. But we're not cynics. Skeptics, yes. Cynics, no.
A
All right. Well, congratulations on the 20th anniversary edition and keeping the franchise going for so long. And thank you for being on HBR IdeaCast.
B
It was a lot of fun. I liked your questions. Thank you very much.
A
That was Stephen Dubner, host of the Freakonomics Radio podcast and co-author of Freakonomics: A Rogue Economist Explores the Hidden Side of Everything. Next week, Alison talks to executive coach Muriel Wilkins about something else hidden: the blockers that hold leaders back. If you found this episode helpful, share it with a colleague, and be sure to subscribe to and rate IdeaCast on Apple Podcasts, Spotify, or wherever you listen. If you want to help leaders move the world forward, please consider subscribing to Harvard Business Review. You'll get access to the HBR mobile app, the weekly exclusive Insider newsletter, and unlimited access to HBR online. Just head to hbr.org/subscribe. And thanks to our team: Senior Producer Mary Dooe, Audio Product Manager Ian Fox, and Senior Production Specialist Rob Eckhardt. And thanks to you for listening to the HBR IdeaCast. We'll be back with a new episode on Tuesday. I'm Adi Ignatius. Have you ever loved your HR or payroll software? I didn't think so. Deel changes that. It's one global, AI-native platform for HR, IT, and payroll, built in house and backed by more than 2,000 experts in over 130 countries. It keeps you compliant, it scales as you grow, and it's actually nice to use. Book your demo at deel.com. That's D-E-E-L.com/HBR.
Podcast: HBR IdeaCast
Date: October 21, 2025
Host: Adi Ignatius (A), with co-host Alison Beard
Guest: Stephen Dubner (B), co-author of Freakonomics, host of Freakonomics Radio
This episode marks the 20th anniversary of Freakonomics by Stephen Dubner and Steven Levitt, a groundbreaking book that popularized exploring the “hidden side of everything” in economics. The discussion revisits the book's impact on economics, academia, business, and broader culture, examining both its contributions and unintended consequences. Host Adi Ignatius and Dubner engage in a thoughtful conversation about truth in data, the dangers of cynicism, and the ongoing relevance of skeptical, evidence-focused inquiry in a world where facts are often contested.
Changing the Field: Freakonomics made economics accessible and “cool,” inspiring new ways of thinking among students, managers, and even professors.
“That was the book that made economics cool. Everyone wanted to read it, everyone wanted to replicate it.” (A, 01:36)
Academic Reaction: While initially controversial among economists for exposing the inner workings of their field, the book ultimately led to greater interest in economics and broader public engagement.
“There were people who felt that he was showing how the magic, the regression analyses get performed. … But there was a big surge [in economics majors] and therefore it did, I would say, enlarge and maybe enrich the market for economics professors.” (B, 03:08)
Popular Appeal: The unexpected mass appeal of Freakonomics provided a platform for further investigative work and changed how researchers thought about communicating with broader audiences.
On “Hidden Side” Thinking: Dubner clarifies that their aim was not to spin conspiracy theories but to probe the partiality of accepted wisdom, examining who shapes it and why.
“We tried not so much to just play gotcha … What we tried to do is look at the whole ecosystem around that particular piece of conventional wisdom and look at how it was created … did they have their thumb on the scale or a horse in the race in any way? And the answer to that question is almost always yes.” (B, 05:19)
Humans and Unintended Consequences: Dubner underscores the necessity for humility in policy and analysis, since even well-intended efforts often bring unintended, measurable consequences over time.
Incentivizing Counterintuitive Findings: Making research accessible and popular isn’t risk-free—it can create perverse incentives for attention-grabbing but shaky findings.
“If you produce an interesting counterintuitive finding, you're going to get press coverage, right? You're going to get attention, you're going to sell books, and that's created an incentive...” (A, 06:41)
Dubner’s Perspective on Fraud and Replication Issues: Highlights longstanding media biases toward sensational stories (“If it bleeds, it leads”) but laments the spread of academic fraud and proliferation of “fake journals” catering to low-quality or fraudulent research.
“There are hundreds, maybe thousands of essentially fake academic journals that exist to offer paid publication to scholars, or would-be scholars... That's a joke. It’s a terrible situation.” (B, 11:01)
Scientific Rigor in Economics vs. Other Disciplines: Dubner finds that economics papers often show admirable methodological transparency, laying out alternative hypotheses and the supporting evidence, which can make their findings more robust than work in some other social sciences.
The Pop-Tarts and Oil Example: Both host and guest agree that confusion between causation and correlation plagues public discourse and scholarship alike.
“If you purport to show that the consumption of Pop Tarts is driving up the price of oil, I'll run that story. But … You probably have a causation-correlation problem there.” (A, 13:11)
How to Demonstrate Causal Impact: Dubner describes how Freakonomics assembled multiple strands of evidence into a “collage”, for example comparing the timing of state-level abortion policy shifts, to strengthen causal arguments while always showing readers the underlying reasoning.
“What they call an instrumental variable that’s clean… So, for instance… did crime begin to fall earlier than it did elsewhere because there was availability to abortion earlier? And the answer was yes.” (B, 15:23)
Can Facts Still Win?: Host and guest discuss growing public cynicism toward “facts” amid political polarization and misinformation, acknowledging that it is easier than ever to harbor or promote untruths.
“It saddens me, it frustrates me, it scares me how easy it is for people to think … that something is true if it's demonstrably not true.” (B, 18:57)
Journalistic Method: Dubner advocates diligent preparation, primary sources, and rigorous fact-checking, an approach that, though labor-intensive, yields both trust and satisfaction.
“I sometimes think of this kind of journalism as like making maple syrup... and you get this much at the end of the day. And that's kind of the way I feel every Friday morning when we publish an episode of Freakonomics Radio.” (B, 22:31)
Having Fun, Gaining Wisdom: Dubner reflects that, with age and experience, he would now strike a less playful tone, and that he is more conscious today of the weight certain stories carry.
“We were often playful in a way that I think now would feel a bit callow. We were also younger.” (B, 23:48)
Correcting a Mistake: He recounts the painful but necessary process of correcting an error about civil rights activist Stetson Kennedy, documenting the mistake publicly and revising the text for later editions, even at emotional cost to those involved.
“That was really painful … But I couldn’t let that stand if he had exaggerated to that extent. … So what we did in that case was we wrote a column for the New York Times about how we had gotten this wrong, why we'd gotten it wrong, and what the truth was.” (B, 24:33)
On Cynicism vs. Skepticism: Dubner draws a line between being a healthy skeptic and a corrosive cynic.
“Skeptics, yeah. Cynics, no.” (B, 27:00)
On the transformative effect of Freakonomics:
“Everyone wanted to read it, everyone wanted to replicate it. And I do think it changed the way that companies think about consumer behavior and also the way that managers thought about getting the best out of their employees.” (A, 01:36)
On the difficulty of making causal arguments:
“Making causal arguments is really hard. And that's why science is hard and that's why I like scientists.” (B, 17:55)
On the labor of trustworthy journalism:
“It takes a little bit more work. ... I sometimes think of this kind of journalism as like making maple syrup … but I think it tastes pretty good.” (B, 22:31)
Acknowledging an error and living with humility:
“It didn't change the nature of the story, but it certainly changed some of the facts. And then we rewrote that section of the book and republished it for later editions.” (B, 25:29)
Twenty years after its publication, Freakonomics continues to resonate because of its playful yet rigorous approach to debunking conventional wisdom, its influence on how business and academia use data, and its championing of skeptical, but never cynical, investigation. Stephen Dubner’s reflections underscore the importance of methodological transparency, humility in the face of error, and resilience in the pursuit of truth, principles more vital than ever in a world awash in both information and misinformation.