
In this episode we welcome Dr. Sarah Stein Lubrano, a political scientist who studies how cognitive dissonance affects all sorts of political behavior.
For 140 years, MultiCare has been in Washington prioritizing long-term solutions, partnering with local communities, and expanding access to care. Together, we're building a healthier future. Learn more at multicare.org.

You can go to kitted.shop and use the code SMART50 at checkout, and you will get half off a set of thinking superpowers in a box. If you want to know more about what I'm talking about, check it out in the middle of the show.

David McRaney
Welcome to the You Are Not So Smart podcast, episode 325. I am David McRaney. This is the You Are Not So Smart podcast, and this is part two in a two-part series about cognitive dissonance. And I could do a 500-part series about cognitive dissonance. We could just start a whole new podcast that was only about cognitive dissonance if we wanted to. It's a very old idea in psychology, about 70 years old, and there are lots of tributaries and offshoots from the original research. In the last episode we discussed the sort of inception of all of this, and in this episode I want to talk about the landmark study that came out of the stuff we were talking about in the previous episode. So let's just get started. Let's do that.

I want to begin by telling you about one of my favorite studies ever, about anything, in all of psychology and all of science, really. And it just so happens to be one of the original landmark studies into cognitive dissonance. So to set the stage, let's go back to 1959, to Stanford University, where the psychologist Leon Festinger is about to change the way we think and feel about the way we think and feel. Officially, the authors of the paper in which this experiment appears are Leon Festinger and Merrill Carlsmith. We met Festinger in the previous episode of this show and joined him as he and his team infiltrated a doomsday cult to observe their behavior.
After the day of doom came and went, not only did most of the cult members not leave the cult after encountering pretty strong evidence they had been wrong all along, but for most, their convictions and beliefs and attitudes and loyalties grew even stronger. Festinger coined the term cognitive dissonance to describe the stress that people within that cult felt in the presence of such striking disconfirmatory evidence. And he remarked that the dissonance in this instance was so strong that they chose to not admit they were wrong at all. They changed their minds about what was real, about what was true, about what was rational and logical and reasonable instead of simply admitting that they had been mistaken. Festinger wrote a book about all of this. He became a world famous psychologist. He joined the faculty of Stanford, wrote a second book about how he thought dissonance likely worked, and then after all that, decided it would probably be a great idea to design some controlled laboratory studies to quantify and measure and test his hypotheses.
Dr. Sarah Stein Lubrano
After Festinger did his very weird and wonderful study into an alien cult, he needed to create some lab studies that could measure roughly the same kind of psychological phenomena.
David McRaney
That is the voice of Dr. Sarah Stein Lubrano. She is a political scientist and theorist and academic who studies how cognitive dissonance affects all sorts of political behavior. She's also the co-host of a podcast about activism called What Do We Want? And she wrote a book that's coming out in May of 2025 titled Don't Talk About Politics, which is about how to talk about politics without talking about politics.
Dr. Sarah Stein Lubrano
So yeah, I look at the intersection of how people change their minds or don't a lot of the time and what that means for our political life on planet Earth.
David McRaney
Okay, enough bona fides. Back to Festinger. He is riding high on the cult infiltration study and the resulting book and fame. He's now at Stanford, and he has designed a lab experiment to study a phenomenon he observed in the wild. He designed this experiment with Merrill Carlsmith, an undergraduate at the time who actually came up with the original idea for all of this. Carlsmith would go on to become an influential psychologist, but he isn't one yet. And here is how the study worked. They got 71 psychology students at Stanford to sign up for a project called Measures of Performance, which they told them was part of a project the psychology department would be conducting to improve the quality of all their psychological experiments in the future. None of that was true, but it allowed them to tell the students that they needed everyone to be completely frank and totally honest in the interviews that would come after the experiment. So one at a time, a student would arrive and then get escorted into an office holding area, where they waited for their turn to be asked to step inside the official experiment room, where they would learn what they would be doing for the next hour.
Dr. Sarah Stein Lubrano
They asked them to do an incredibly boring task. I think it was pretty much literally taking wooden knobs and turning them, like, 90 degrees or 180 degrees.
David McRaney
Yeah, that was task two. First they sat down in front of a tray filled with 12 wooden spools, and they were then instructed to empty the tray with one hand, one spool at a time, refill it one spool at a time, and repeat that for 30 minutes while working at their own pace. Meanwhile, the experimenter, the scientist, sat nearby with a stopwatch and pretended to take notes. Then, after half an hour, the experimenter removed the tray and replaced it with a horizontal board on which 48 square pegs had been mounted. The new task was to rotate each peg one quarter turn until they'd rotated them all, then repeat all of that over and over for half an hour at their own pace while the experimenter watched. And if this sounds excruciatingly boring and mind-numbingly repetitive, that was the idea. Festinger and Carlsmith hoped that by the end of all of this the participants would hate what they were doing, would regret they'd signed up for this, and would not wish this on anyone.
Dr. Sarah Stein Lubrano
And by the way, you can watch this study. They have videos and those are online. So if you Google this, there are videos of people doing this from back in the day with wild 50s haircuts and so on. So they do this task. It's very boring. And then something very clever happens. As with most psych studies, there's a lie in the study because that's how you make sure the person doesn't realize what it is you're actually measuring. So you lie about what the study is about.
David McRaney
And I love this part, the lie part. Because at the end of the hour, the experimenter in the room with the student leans back and lights a cigarette, like clink. And then through a cloud of smoke tells the student what the real study is about.
Dr. Sarah Stein Lubrano
And the researcher, wearing a lab coat and looking very professional, says, oh, thank you so much for doing this study. Actually, what we're trying to measure, just so you know, is whether, if we tell someone a task is interesting, they will find it more interesting. And obviously I didn't tell you that the task was going to be interesting or boring. But another student who's coming next is going to be told the task is really interesting. But I have a problem. And the student's like, okay. And the researcher says, I have a problem. My colleague who's supposed to tell the next student that this task is really interesting, he just didn't show up for work today. And the student goes, oh, okay. And the researcher says, could you help me, do you think, actually, just to solve this problem that I'm having? If I pay you, will you go into the hallway and tell the next student that the task is really interesting, so we can continue this study? And most of the students said yes, but they were paid different amounts of money.
David McRaney
They were paid different amounts of money. This established the conditions of the experiment. From here, they created three groups. There's the control, but most crucially, there is the $1 group, that's about $9 in today's money, and a $20 group, that's about $190 in today's money. The control group, they proceeded straight to the interview. But the $1 and $20 groups go back into the waiting area where another student is waiting, and they tell that student:

Dr. Sarah Stein Lubrano
Wow, I just did this research study. And it was really interesting.
David McRaney
To reiterate, the students spend an hour performing a very, very boring task. And then they are asked if they would mind helping the psychology department by telling the next person waiting to do this task that it is actually quite fun and interesting and not boring. They also tell these participants, these subjects in this study, that what they're studying are expectations, and that the person who is supposed to lie and say all this couldn't make it today. Also, they are told that if they agree, they will be officially hired and might be called back sometime later to help with other stuff. So all of this is out there. And so some of these students are offered the equivalent of $9 to do this, and some students are offered the equivalent of $190 to do this.
Dr. Sarah Stein Lubrano
That's the only variable. How much are they paid? Right.
David McRaney
Almost everyone agrees. And then those who have agreed are escorted back into the waiting room, where the next student awaits, who is of...

Dr. Sarah Stein Lubrano
...course not actually a student, but is in fact the lab assistant in disguise as a student.
David McRaney
And they tell this pretend next student, who they believe will be the next person doing that very boring experiment.
Dr. Sarah Stein Lubrano
Oh, yeah, I just did it. It was so interesting.
David McRaney
They lie. They lie. They tell the next student it will be fun, even though they know it will not be fun. And like Dr. Stein Lubrano said, this supposed next student they're telling all this to is actually in on the experiment. And when the experimenter leaves the room, saying they'll be back in about two minutes, the fake student waits until the participant tells them the experiment was interesting. And then she tells them she's surprised to hear that, because one of her friends who had done this experiment already told her it was terrible. And in response, nearly all the students tell her, no, it's not terrible. It's actually very fun, and she is going to enjoy it. However, two did tell the truth at this point. And one got her number so he could tell her the truth after. At least that's what he said. Also, three rejected the money during the light-a-cigarette, can-you-help-us part of the experiment. And all of those students who were just not playing along, they were excluded from the final analysis. But the rest, 65 in all, they lied. And when the experimenter returned after leading the fake student into the experiment room, those participants went down the hall for their post-experiment interviews. And here, each student was asked to honestly rate how fun the study they had just participated in really was, on a scale from negative five to positive five. And then they asked each participant what they felt the study was actually about and if they suspected anything. And at that point, five students said, is this about lying? And so they got excluded as well. So that leaves us with a pool of people who believed that they were actually helping the psychology department and who were actually lying about something that they knew that they were lying about. And some of them were doing it for $1, which is $9 in today's money, and some of them were doing it for $20, which is $190 in today's money.
Dr. Sarah Stein Lubrano
A couple of weeks later, the same students are asked in a survey, how interesting was the task really? And what's interesting is that the students who had to lie for only $1 have a very different result than the students that had to lie for $20. The students that lied for $20 are like, that task was incredibly dull, because of course it was. It was a really dull task. And the students that were paid only $1 seemed to think the study is interesting. And that is what is incredibly weird, right? They went out and lied for not a huge amount of money, and they now believe their own lie. And they really do seem to. And this is where I really recommend watching the YouTube videos, because there are interviews where Festinger or his colleague asks the student, like, you know, some other students have told us that that task was boring. And the students are in, like, shock. They're like, no, it was really fun. You must be talking about another study, because the one I did was great. It was a lot of fun.
David McRaney
This is what makes this study so incredible, because it starkly reveals the strangeness of how we often deal with the discomfort of cognitive dissonance. Both groups observed themselves saying something they did not believe. Both noticed an inconsistency, an incongruence, a dissonance between their behavior and their beliefs, their experiences and what they said about those experiences. And both groups not only noticed this inconsistency, but they had to contend with the fact that they had accepted a bribe. A bribe to do something that might be considered by others as morally questionable. But their reactions to all of that differed only depending on the size of that bribe. When they asked the students paid the equivalent of $190 how they actually felt about the boring tasks, those students said, yeah, the tasks were really, really boring. I hated it. I do not recommend it. When they asked the students who had been paid the equivalent of $9 how they felt about the boring tasks, they said, actually, in all honesty, I loved it. It was fun. It was not boring. They recommended it. They would have done the tasks for nothing and would happily participate again. And here is the thing that really blows my mind about this study. They really did feel that way. After lying about how they felt, they then changed how they felt so that it would not be a lie. No part of them, nothing inside them, was telling them that they were fibbing, that this was a sham, this was a charade, none of it. From that point forward, it was fun to them. They changed their own minds to reduce the dissonance created by the experimenters.
Dr. Sarah Stein Lubrano
So the question is, how did this person come to believe that actually the task was interesting? That's a really weird thing to gaslight yourself about, to misappropriate a word. And the answer is that the person is suffering from a cognitive inconsistency, right? They went and lied to the student and told them that was a really interesting study, and they actually found the study quite boring originally. And they need to reconcile the dissonance they face about the contradiction between their actual belief, that the study was boring, and their actual action, which was saying that it was interesting. And unlike the student that was paid a lot of money, who can reconcile this by saying, well, I was paid a lot of money, so this is consistent with the fact that it was boring but I was willing to lie because I was paid a lot of money, the student who wasn't paid very much doesn't really have a good rationale for having lied to the student in the hallway. And they appear to shift their beliefs to tolerate this contradiction. They're like, oh, gosh, I lied to that student, and it wasn't for very much money, but actually it wasn't that boring, so it's not that much of a lie. Right? They're doing a rationalization so that they don't have to live with the discomfort of a contradiction between their beliefs and their actions. This is a very repeatable finding and a very weird study, but I actually think it works really well for explaining just how funky this is. And notice that there are all kinds of elements at play in it that come up again and again in dissonance studies after this. There's your sense of yourself as a good person that's at stake. There's a sense of yourself as having actually chosen the thing that you did, even though you were really compelled to do it.
And that will fit into a lot of other things that are worth saying about dissonance and the way it features in our lives: a lot of the time, we feel dissonance about actions we kind of had to undertake but don't really feel comfortable with. And the way that we manage that is by deciding that, in the end, we definitely wanted this thing and chose it. For sure.
David McRaney
Yes. This is the thing. I'm a good person. I don't take bribes to do bad things. I don't do boring tasks. I would have quit if that had been boring. I don't just lie to people. That would be heinous. I'm trying to do something for the betterment of humanity. And there are all these opportunities for you to see the truth of what is happening and possibly benefit from being honest with yourself. But people will variably do that depending on how much money you give them, because the story you tell yourself is different. Why did I do that? Because I was paid. But even then, that's going to divide people into different groups. Some people are going to be like, I'm not okay with the fact that I did that because I was paid well to do that. I wouldn't do that no matter how much you would pay me, and so forth and so on. It gets complicated very quickly. But the fact that this is introspection that you're not aware is taking place, to the point that you actually do now truly believe that that was not a boring task, is very freaky, upsetting, and weird to me and always has been. I'm making up this story about who I am and why I am at all times, and I'm unaware that I'm doing it. And a good portion of it is going to be fiction for the sake of not thinking I'm an inconsistent, bad person. Sarah, what are we supposed to do with that? What are we supposed to do with knowing that about ourselves?
Dr. Sarah Stein Lubrano
Okay, well, look, I think it's very valid, as we like to say in pop culture discourse, to feel uncomfortable about this aspect of human psychology. And we should. And something I'm very interested in lately is a bunch of studies by a wonderful researcher who I've interviewed called Kristin Laurin. And she looks at what happens when people are faced with limitations on their actions and how quickly they do a really similar thing to the people in the study we just talked about. So in the study we just talked about, people were kind of gently coerced and/or bribed into lying, and then they came, in some cases, to convince themselves, oh, I actually didn't lie. I totally believed in that thing in the first place. Right. And Laurin's work looks at what happens when people don't like a restriction that is placed on them, but they think it's unavoidable, and how quickly they decide that actually they always liked this restriction. So she looks at two examples. She looks at a plastic water bottle ban in San Francisco, and she looks at smoking restrictions in restaurants in Ontario, and she found that the very same people would have extremely different beliefs about their preferences before and after those bans. So many people before a smoking ban would say, this is, you know, government interference and overreach. I don't want to be limited this way. Or, same with the water bottles. Right. Like, this is a huge pain. And then the day that the ban comes into effect, you can ask exactly the same people, and a lot of them are like, oh, I like this ban, and I've never really smoked that much in restaurants anyway. And they actually don't remember that they used to not like this ban. And they have also misremembered how much they were smoking. Of course, you can imagine a lot of scenarios where these kinds of restrictions are a lot more authoritarian or harmful.
We can see that the same mechanism at work probably lets people justify adhering to really awful despotic laws and doing terrible things to other people. So we should be alarmed. And I want to validate that. And I also want to say that, at the same time, I always look for this sort of utopian kernel in this as well, which is to say that actually, if human beings were really little sentient robots who were just responding to incentives and disincentives about their material interests, if all we cared about was, like, am I getting paid well, and am I getting laid, and am I getting to maybe have a fun time? I'm not that interested in that human subject, you know. Like, that's kind of boring, and it's kind of pathetic, and it's kind of not that interesting, morally. But a human subject that is desperate to tell a story about itself, that it's a good creature that does good in the world, and that keeps track of its sense of self and wants to think about itself as an active, good person who can do something in the world. I'm interested in that subject, and I find it reassuring that that is even possible for human beings, even though it can be used against us to make us do terrible things as well. It gives me a little bit of hope in a time of real political despair that human beings care about the kind of creature that they are and the kind of force that they are in the world, and that they want that to be for good.
David McRaney
We'll be right back after this break.

Hey everybody, Ted Danson here to tell you about my podcast with my longtime friend and sometimes co-host Woody Harrelson. It's called Where Everybody Knows Your Name, and we're back for another season. I'm so excited to be joined this season by friends like John Mulaney, David Spade, Sarah Silverman, Ed Helms, and many more. You don't want to miss it. Listen to Where Everybody Knows Your Name with me, Ted Danson, and Woody Harrelson, sometimes, wherever you get your podcasts.

Upgrade your laundry routine with a durable and reliable Maytag laundry pair at Lowe's, like the new Maytag washer and dryer with performance-enhanced stain-fighting power designed to cut through serious dirt and grime. And what's great is this laundry pair is in stock and ready for delivery when you need it the most. Don't miss out. Shop Maytag in store or online today at Lowe's.

The School of Thought. I love this place. I've been a fan of the School of Thought for years. It's a nonprofit organization. They provide free Creative Commons critical thinking resources to more than 30 million people worldwide. And their mission is to help popularize critical thinking, reason, media literacy, scientific literacy, and a desire to understand things deeply via intellectual humility. So you can see why I would totally be into something like this. The founders of the School of Thought have just launched something new called Kitted Thinking Tools. K I T T E D Thinking Tools. And the way this works is you go to the website, you pick out the kit that you want, and there's tons of them. And the School of Thought will send you a kit of very nice, beautifully designed, well-curated, high-quality cards.
Each one is about double the size of a playing card: matte-cello, 400 GSM stock prompt cards in a nice magnetically latching box that you can use to facilitate workshops, level up brainstorming and creative thinking sessions, optimize user and customer experience and design, elevate strategic planning and decision making, mitigate risks and liabilities, and much, much more. Each kit can, if you want to use it this way, interact with this crazy cool app. Each card has a corresponding digital version with examples and templates and videos and step-by-step instructions and more. You even get PowerPoint and Keynote templates. There are so many ways you could use this. Here are some ideas. If you're a venture capital investor, you could get the Investor's Critical Thinking Kit and use it to stress test and evaluate different startups for Series A funding. If you're a user experience designer, you can get the User Design Kit to put together a workshop with internal stakeholders for a software product. Or if you're an HR professional, you could mix and match these kits to create a complete professional development learning program tailored specifically for your team over the course of the next two years. So if you're the kind of person who is fascinated with critical thinking and motivated reasoning and intellectual humility and biases, fallacies, and heuristics, you know, the sort of person who listens to podcasts like You Are Not So Smart, you're probably the kind of person who would love these decks. If you're curious, you can get a special 50% off offer. That's right, a half-off offer, right here. You can get half off of one of these kits by heading to kitted.shop, K I T T E D dot shop, and using the code SMART50 at checkout. That's SMART50 at checkout. 5% of the profits will go back to the School of Thought, so you're supporting a good cause that distributes free critical thinking tools all over the world on top of receiving a set of thinking superpowers in a box.
Check all of this out at kitted.shop, or just click the link in the show notes. And now we return to our program.

I'm David McRaney. This is the You Are Not So Smart podcast, and we just talked a whole lot about cognitive dissonance. I'm going to pick the conversation back up again with Dr. Sarah Stein Lubrano. But first, a brief recap and a brief summary of where we are in this dissonance theory thing. First, Festinger infiltrated a cult. Then he wrote a book about it. Then he wrote a book about cognitive dissonance. And then he conducted the famous dissonance study we were just talking about before the break. So, a strange order. But it can't be overstated just how revolutionary the Festinger-Carlsmith boring tasks study was. It's still the subject of replication, meta-studies, and retrospective analyses. All of these things are still a thing. For 70 years, we have been experiencing dissonance concerning the fact that cognitive dissonance can have such an impact on human thoughts, feelings, and behaviors. We now know you can create a situation that will generate dissonance, and that dissonance will then compel people to sometimes change their beliefs. You can manipulate a person's environment and get them to tell a story about themselves that will alter their attitudes or their values, their opinions, their actions, their intentions to act, and so on. And they will do it. They will change themselves. They will rewrite the truth of who they are. Okay, so what is the definition of cognitive dissonance? Let's have Dr. Sarah Stein Lubrano answer that.
Dr. Sarah Stein Lubrano
So usually the phrase cognitive dissonance is used colloquially to tell people on the Internet that they are idiots. But that's not what it means to people who study it. And in particular, psychology uses this term to refer to the discomfort that we feel, often unconsciously, when we're faced with a contradiction between two or more of our beliefs or actions. So what that means is that we might notice, I don't know, that we want something, but we also want another thing that's incompatible with it, and we feel a tension about that. And then we erase that tension, sometimes unconsciously, by choosing one and devaluing the other. I'm using this example first because it's not about hypocrisy. And I want to distinguish. It's not always hypocrisy. Sometimes it's just ambivalence. Ambivalence about not being able to reconcile different parts of our belief system, or our actions and our beliefs. Lots of instances of dissonance also are about hypocrisy. We might believe one thing but do another. We might believe in climate change but fly to the Maldives. We might know smoking is bad for us but pick up another cigarette. And we largely experience dissonance in those occasions as well. So it's discomfort about a dissonance between our beliefs and our actions, or two of our beliefs, or two of our actions. Unconscious is an important thing there as well. Most people experiencing dissonance don't appear to be conscious of what is happening for them. And what happens a lot of the time is that we find clever ways to get rid of the discomfort without noticing that we've done that. So of the two most common ways that we deal with dissonance discomfort as human beings, the first is rationalization. The way I explain what a rationalization is: it's when you give a series of reasons for something that are not the real reasons you did that thing.
If you decide that the person who dumped you was always a terrible person and you should never, ever have met them, but actually you're just grumpy because you wish that you were still dating, that's a rationalization. It's not the real reason you think they suck, but you're not willing to admit to yourself that that's not the reason. Or if you are, you know, not going on a run and you tell yourself it's because it's rainy outside and you might fall and slip, that's a rationalization. That's not the real reason you didn't go on a run. You're just being lazy today. But it's easier to find a rationalization. So when we face dissonance, when we encounter a contradiction between our beliefs and our actions, we might come up with a rationalization like, it's fine that I'm flying to the Maldives because the plane is going to take off anyway, right? So, yeah, it's describing, kind of, human beings' discomfort with contradiction and ambiguity in their own worldview, and with their sense of self and their sense of themselves as good people.
David McRaney
Something that we haven't covered at all, for some reason, is that dissonance theory emerged in the 1950s and into the 1960s, right as psychology was going through a sort of punk response to the stodgy lab-coat behaviorism of the 30s and 40s, which had no real interest in introspection. It was all conditioning back then. Watson and Pavlov and Skinner, they had this picture of humans as basically simple animals, easily trained via rewards and punishments. And instead of seeing brains as bags of chemicals passively responding to the external environment, cognitive psychology emerged in the 1950s as a way of seeing the brain as actively involved in the construction of knowledge and meaning, actively organizing and integrating information, actively generating schemas and priors and assumptions and memories, actively, on purpose, knowingly curating concepts and consciously perceiving, interpreting, and categorizing the world. Cognitive psychology said we do a lot of this within purely contemplative spaces, in our imaginations, while worrying and thinking and ruminating, while in simulated internal worlds that we use to imagine potential futures and the outcomes of our as-yet-uncommitted acts, the results of our as-yet-undecided decisions.
Dr. Sarah Stein Lubrano
When cognitive dissonance theory was first sort of published, behaviorists were arguably the leading group of psychology researchers in America. And they were very interested in this idea of the human being, and indeed of animals in general, as creatures that seek material benefits and respond to rewards, and also, of course, are disincentivized by punishment. And to be clear, every living organism does do this to some significant degree. If I pay you a certain amount of money, you will do many things. If I am continuously rude to you, you will probably avoid me. There are lots of instances where behaviorism can, somewhat accurately anyway, summarize how human beings or even other animals respond to incentives and disincentives. But it turns out that that is only one model for how human beings behave, and that there are other systems, let's say, in our psychology that can override that. And cognitive dissonance theory is one of the descriptions we have for when that very basic theory about human beings responding to rational incentives falls apart, that there are times, maybe only very specific times in a human being's life or particular areas of cognition, where we are not calculating that way and we are not responding to what is in our rational interests. And as you say, it's because we are developing a story, for lack of a better word, about ourselves. Or actually, I like to think about it more in terms of a map. So a story might be about the past, but a lot of what dissonance is responding to, I would argue, is actually about our sense of who we are in the world and what we could possibly do in the future. And that's probably why it exists, why we have dissonance at all. If we are facing too many contradictions about what we think the world is like and what we think we are like, we won't know how to act anymore, right? We won't know, okay, is it good or bad that I'm a Democrat, a Republican, a feminist, a Christian, a Jew, or whatever?
If we face a contradiction in our sense of self that is quite profound, or in our sense of how the world actually operates, that more or less prevents us from knowing how we should take action. And I would say that while you can only ever theorize, ultimately, about why something evolved in the human subject, this is a pretty good theory for why we would have a system like this that forces us back into cognitive consistency rapidly, because otherwise we might never know how we want to act next. And also because we are such interpersonal animals. We don't just live in the present as human beings. We have these long-term projects, we collaborate with other people, we build shared systems of narrative together that structure how we collaborate. We're probably driven to have meaning like this so we can have a shared sense of meaning. Some evolutionary theorists, I think Sperber and Mercier, talk about this: that a lot of what human beings do when they engage in reasoning isn't so much about trying to find a fundamental true fact. It's about trying to have a shared set of reasons we can give each other so we can keep collaborating. And dissonance helps us do that also. It helps us stick to something that feels consistent enough that we could communicate it to anybody else.
David McRaney
Dissonance theory is something that's been evolving ever since that original research. The forced compliance experiments that we were talking about before the break, those were just the beginning. We've done lots of research since. But those studies are called the forced compliance experiments, the ones in which a person is compelled to say, write, or do something counter to their beliefs, attitudes, or values. Well, we learned from those that the weaker the external justifications for compliance, i.e., the fewer consonant cognitions and/or the more dissonant cognitions that compliance generates, the more likely a person is to produce an internal justification. But we now know dissonance can be generated in many other ways. One is called effort justification. Researchers have found that if you engage in a painful initiation ritual, or a pointless or laborious work project, or go on an expensive or terrible vacation, there's a high likelihood you will rationalize what you've gone through and see it as well worth your time instead of admitting the pain, harm, and waste. And you'll truly believe it. You'll even become defensive about it. That's effort justification, a form of dissonance reduction in which people justify their efforts by inflating the value of the outcome of those efforts. Then there's post-decision dissonance. In studies where people are asked to choose between two equally appealing products, after making their choice, people will rate the chosen item as being much better than the rejected item. That's true in all sorts of other similar situations. We will enhance our commitment to a choice to reduce any discomfort after making a difficult decision. There's also something called the hypocrisy paradigm. In one study, participants were encouraged to advocate for condom use, and when they were all reminded of the times they had not used condoms, it created dissonance.
This hypocrisy induction, as they call it, led to a higher likelihood of future condom use, and it illustrated how dissonance from perceived hypocrisy can influence future actions to reduce feeling like a hypocrite. We also know now that people must feel like they had a choice in the matter, whether or not they really did. Otherwise they just won't feel very much dissonance about doing or saying something that runs counter to what they think, feel, or believe. They can always just blame it on the coercion. They didn't actually choose to do that, so they can't be blamed for it. In studies in which people are paid a little or a lot to write an essay that runs counter to their attitudes, if they don't feel like they had a choice to opt out, the greater the reward, the more they'll adjust their attitudes to match the essay. If they do get a chance to opt out, though, then the opposite is true: the less the money, the more the attitude change, just like in the Festinger experiment. And one of the most important findings since the early days of dissonance research is the fact that people tend to actively seek situations that provide consonance and actively avoid situations in which they might experience dissonance. And we will do both without realizing we're doing either, whether that's avoiding cable news channels that might threaten our attitudes or spending time with people who will likely praise all our decisions. We don't just respond to cognitive dissonance after the fact. We actively manipulate our environment to optimize for it before it might happen. And when we notice dissonance between our attitudes and our actions, our beliefs and our experiences, our current understanding and some disconfirmatory evidence, it's the anterior cingulate cortex that lights up. When we have studied cognitive dissonance by trying to get down into the neurophysiology of what's actually going on, that seems to be mostly where this is coming from.
The portion of the brain that notices errors, the error detection system, is very active in this regard. But what also become active during these moments of strongly felt dissonance are parts of the prefrontal cortex, which is involved in higher-order thinking and decision making, planning, thinking about what you're going to do next. Then there's the dopamine system, the dopaminergic system, which is a system for motivation. Within that system, dopamine affects the feelings that arise when outcomes don't match our expectations, and varying dopamine levels will then motivate us to notice, learn, and adjust our predictions going forward. Also, upon resolving dissonance, which is to say justifying one's behavior, you get a dopamine release providing a little bit of "hmm, that's nice." The sympathetic nervous system is activated during moments of intense dissonance as well. This generates increased heart rate, sweating, anxiety, and so on. This is a lot of slapping you around from the inside to get you to pay attention. But most of all, it's the anterior cingulate cortex, the part of the brain that plays the most crucial role in error detection, emotional regulation, and cognitive control. So yes, cognitive dissonance is a real thing. It is a bodily thing. It is a physiological reaction. The dissonance reduction behavior witnessed in that boring task study, the one with the spools and the lying, today that's called the insufficient justification effect. Without a sufficient extrinsic justification, people will create an internal one. And we now know that there is an overjustification effect as well. In studies where people are told they will be greatly rewarded if they choose to engage in activities that they already enjoy doing, those people will over time report enjoying those activities less. When tasked with explaining themselves, the justifications no longer seem intrinsic. The answer to "why did I do this?" becomes "because I got paid," not "because I think this is fun."
We're the unreliable narrator in the story of our own lives, right? And something about that has always messed with me, because it leads to the next question, which is: why are we doing this? Like, best I know, this is going to be speculation, and we don't understand the mind, let alone the brain, well enough even at this point in our history to fully make sense of it. But why wouldn't it be better to pursue raw accuracy, right? When it comes to fact-based stuff, and I'm going to pull that bullet point aside first, why not try, when you notice you're wrong, to admit that you're wrong and then be right, factually speaking, evidence-based? What's up with this? Why would that not be installed into the adaptive functions of the brain?
Dr. Sarah Stein Lubrano
So I want to point out first of all that, because we've run so many studies, we know that dissonance actually doesn't happen for all factual information corrections. And actually, I would argue it only happens for a very specific set of them. If you and I are having a conversation about whether it's raining outside, and you're like, it's raining, and I'm like, it's not raining, and you go to the window and you see that it is, you will probably adjust to the reality of the weather: it's raining. And more broadly, we tend to revise our beliefs a lot, in line with what you might call Bayesian reasoning. There are lots of things we're very capable of adjusting our beliefs about. There's just a specific genre of things that we're not very good at adjusting our beliefs about.
David McRaney
Okay, this is perfect. This is perfect. What is this genre? Help me understand the genre where this is going to be more likely.
Dr. Sarah Stein Lubrano
If we had to summarize all this research over time: as researchers have done more and more studies, they've found that we only find cognitive dissonance, rationalizations, confirmation bias, and other evidence that this is happening in specific circumstances. And that's usually when people's sense of self is under threat in some way, and/or their sense of themselves as good agents, doing things in the world that have good or bad outcomes. And I could run you through tons of different ways that they measure this, but a good example for the thing about our actions is that often, if people are told, oh, that was actually just an experiment, the letter you just wrote telling the university to change its policy is now going to be thrown in the trash, they don't adjust their beliefs, because it's not a real action. It doesn't have any negative consequences, right? So they stop feeling dissonance, because ultimately their sense of self isn't thrown into question that much anymore, and their actions in the world will have no effect. And so it seems like dissonance is happening only around issues that either make us feel like we are bad people or make us feel like our actions in the world, which have had a consequence, are causing a contradiction. And again, that's actually a relatively small part of our lives. Most of the time, you and I, Sarah and David, are wandering around discovering for real whether the coffee is hot, whether we should have turned the other way, and whether parallel parking is possible on this street. We adjust our beliefs all the time, but we don't adjust our beliefs about things that affect our sense of self and our sense of agency. And unfortunately, there are some very important issues where our sense of self and agency are kind of always involved. And politics is, like, the big one. It's probably that and religion and, you know, very difficult interpersonal conflicts you've had.
And in those cases, we're going to have dissonance basically all the time.
David McRaney
We created this term, Festinger created this term, by demonstrating scenarios in which it can lead to really terrible outcomes, which is a group of people who destroyed their lives thinking that some aliens were going to come pick them up and take them off of Earth because of a flood. And then when it didn't happen, they doubled down, tripled down, and ruined their lives even further. They had opportunities to update and go, oh, well, I guess I was wrong about that, and this person is manipulating me. And they had chances to say, all right, look, okay, maybe I wasn't completely stupid to have done what I've done up till now, but to keep doing it is very stupid. And yet they doubled down. So clearly, even if this is usually adaptive, there are times when this is bad. It's not something we should do.
Dr. Sarah Stein Lubrano
Yes, that's right. And, you know, you could see the sort of tragic nature of humanity in that, in a way. But yes, the fact that we've evolved a certain way, as we know from every other evo psych study, doesn't always mean that it benefits the individual. And we also are now living in very different circumstances than the ones we evolved in, right? So we're both sometimes just situationally screwed over by cognitive dissonance. But also, more broadly, I would say that our brains are not well adapted to, like, the modern news environment.
David McRaney
Right, right.
Dr. Sarah Stein Lubrano
It was probably much easier to live with your cognitive dissonance when it essentially just allowed you to get along with your neighbors, or at least collaborate with a couple of them against the other ones or whatever. But it's pretty poorly suited to a constant barrage of information and misinformation that challenges your sense of self all the time and cognitively overwhelms you and disorients you in a world where, for a lot of political issues, it's not even clear what we could do, right? And I think that's actually a big struggle as well. It's one of the reasons that in my book I talk about not so much presenting people with constant arguments, but giving them options to change how they actually operate day to day. So if you want someone to believe in climate change, one of the best things you can do is give them something they can do about climate change that makes them feel like a good person.
David McRaney
Okay, so what do we do with this knowledge? How do we actively manipulate people, using this information, toward goals that we assume may be good? You were talking about climate change: how do you encourage someone to do things that prevent the, you know, world from ending? And you're saying, encourage them in a way where they feel good for doing the right thing, or feel good for doing the thing. But don't let me put it in my own words. I only want to hear your words here. This is really cool stuff. I like this angle. So let's switch to this, which is: with all this in mind, how do we use this in some way or another, instead of just letting it play out and then writing cool think pieces and Substacks about it? How can we actively use dissonance theory to adjust our policies and actions and so forth? Give me some ideas here.
Dr. Sarah Stein Lubrano
Right, I will, yes. I mean, look, I think all of us in a certain way are stuck in a certain kind of liberal ideology. And by that I don't mean liberal like Kamala Harris. I mean liberal like the tradition of liberalism that started, vaguely, in the 16th century.
David McRaney
Right.
Dr. Sarah Stein Lubrano
By that I mean a system that has a legal system that defends private property, a system that thinks about people as individuals first and foremost, rather than as families or communities. There are certain cultural aspects of liberalism that have very much filtered into the way we think about everything, even the way we do psych studies, for that matter. And one of the downstream effects of that shift that we've experienced in the West is that we think that we are rational agents who can just change our minds by having a discussion. And if you think about it, that's how our legal system is set up and how our parliaments are set up. We think political change, or even personal change, works like this: we have a discussion and then we have a new opinion, and that's how change happens. And my book is all about how that's mostly not true. That's not how historical change happens. It's not how we change our minds as individuals. But that doesn't mean that people don't change their minds about hugely important political issues. It's just that they change their minds when they are faced with new action possibilities that they could really take, or new relationships that they could have, and they're faced with them in a way that allows them to articulate their ambivalence, to grapple with it, and ultimately to choose a different way of looking at the problem so that they can go out in the world and behave differently. And actually, your book is one of the ones that's helped me think through this. Why does deep canvassing, which you write about in one of your books, work? It works because, in a way, the person there is articulating their ambivalence, so the cognitive dissonance is made somewhat conscious, even if it's not a term they're familiar with. But also, they're building a new relationship with the person on the doorstep who's told them an important story about their own life, right?
And often they're being given the opportunity to engage in an action, even if it's just voting in a referendum, that would allow them to be a good person even if they change their mind. So they might be faced with information about, I don't know, a ban on trans people using the bathroom of their choice, and then they're given the opportunity to learn a new thing and then engage in an action that lets them be a good person and vote in defense of these rights. And by the way, there is climate canvassing as well now. So people do deep canvassing, this technique of long-form conversations, and that is pretty effective as well in the studies we have about it. So I guess to bring it all the way back, my suggestion is that the really effective forms of political action we can take right now partially involve (there are other things we can do too) giving people the opportunity to learn about a new action they can take in the world, something they can do next, that will let them hold onto their sense of self as a good person. And actually, you can really see this in the climate research I've looked at. For example, the number one predictor of whether people will do something climate-friendly in their life, like whether they'll install a heat pump in their house, doesn't have anything to do with whether they're given arguments for it. And interestingly, it doesn't even matter whether they're given financial incentives for it as much as it matters whether their friends are doing it, which sort of aligns with some of these findings. Similarly, people are much more likely to, you know, change their minds about gay people if they discover that they know one.
And suddenly they need either to align their actions with their beliefs and defriend this person, or to align their beliefs with their actions. And often they choose the latter, by which I mean they choose to remain friends with that person and shift their beliefs on homosexuality. That's a very consistent finding. So what we would learn from this, in my opinion, is that if we give people opportunities to try new ways of living or form new relationships, they are much more likely to change their minds than if we just give them a bunch of arguments. And I want to point out that we're actually living in a low point, in American society in particular, for a lot of these opportunities and relationships. Americans have fewer friends than they did 20 or 50 years ago. They spend less time with their friends. We are actually resegregating in a lot of different ways, not just racially but often economically. Millennials in particular, but other people as well, are moving away from city centers because of a lack of affordable housing. So we're actually forming fewer and fewer relationships with our neighbors. In a lot of cases, we're at a very low point for social capital, to use a sociology term. And that's actually quite frightening to me as a political theorist, because it means we're probably much more cognitively rigid and isolated. So I guess the point is that I think the number one thing we can do, including and especially in the next four years, is try to create spaces and affordable opportunities for people to mix with people not like themselves and to try out new ways of living, even if that's just, you know, installing a solar panel or helping a refugee. And all the arguments down the road for who we should vote for and what we should believe are in some ways secondary to the options we give people for how they live their lives.
David McRaney
That is it for this episode of the You Are Not So Smart podcast. For links to everything we talked about, head to youarenotsosmart.com or check out the show notes right there in your podcast player. You can find my book How Minds Change wherever they put books on shelves and ship them in trucks. Details are at davidmcraney.com, and I'll put links to all sorts of things related to that right there in your podcast player. For all the past episodes of this podcast, go to Apple Podcasts, Amazon Music, Audible, Spotify, or youarenotsosmart.com. Follow me on Twitter, Threads, and Instagram at davidmcraney. Follow the show at notsmartblog. We're also on Facebook: You Are Not So Smart. And if you'd like to support this one-person operation, go to patreon.com/youarenotsosmart. Pitching in at any amount gets you the show ad-free, but the higher amounts get you posters, T-shirts, signed books, and other stuff. The opening music, that's Clash by Caravan Palace. And if you really want to support this show, just tell somebody about it. Share it somewhere. If there was an episode that really meant something to you, share that episode. And check back in about two weeks for a fresh new podcast.
Adam Pally
Adam Pally here, and I'm Jon Gabrus. We're a couple of actors and best friends who you may know as the hosts of the TV show 101 Places to Party Before You Die. Now we're bringing you a comedic look at health and wellness with our new show, Staying Alive. We'll have guests like our friend, actor Jerry O'Connell, ketamine therapist Dr. Steven Radowitz, Paul Scheer, Ego Nwodim, Jillian Bell, Dr. Dolittle. Staying Alive with Jon Gabrus and Adam Pally is out right now. Get them a week early and ad-free with SiriusXM Podcast Plus on Apple Podcasts.
Episode 325 – Cognitive Dissonance – Part Two (rebroadcast)
Date: October 27, 2025
Host: David McRaney
Guest: Dr. Sarah Stein Lubrano, political scientist and theorist
This episode continues the exploration of cognitive dissonance—a foundational concept in psychology first articulated by Leon Festinger. The discussion dives deeply into the seminal Festinger-Carlsmith experiment, the mechanisms behind cognitive dissonance, its roots in psychological history, and its pervasive role in individual, societal, and political behavior. The episode closes by examining how understanding cognitive dissonance might be used to promote positive change.
Episode Outline
Background and the Study Design (07:05–14:28)
The Results and Implications (30:44–33:20)
Formal Definition; Not Hypocrisy Alone (33:20–38:14)
Other Routes to Dissonance (38:14–46:03)
Effort Justification: Suffering for a group or task increases valuation of that group/task (e.g., hazing, difficult goals).
Post-Decision Dissonance: After making difficult choices, we retroactively enhance our opinion of the chosen option.
Hypocrisy Paradigm: Pointing out personal inconsistencies (like advocating for a behavior one hasn't followed) can prompt genuine behavior change.
Agency Matters: Dissonance peaks when people feel they had a real choice in their action.
Bodily Systems (46:03–49:55)
Where Dissonance Applies (48:27–52:14)
Dissonance doesn't show up everywhere: mainly when self-image or sense of agency is at stake.
For neutral facts (e.g., weather), people easily update beliefs.
For deeply held or identity-relevant beliefs (politics, religion, morality), dissonance is chronic and hard to overcome.
Persuasion & Policy; Modern Challenges (51:01–57:30)
The dialogue is inquisitive, friendly, and open about uncertainty. Both host and guest blend historical anecdote, scientific rigor, and a touch of wry humor. There is sincere unease—and hope—in recognizing our cognitive quirks, but also an optimistic search for how to harness self-storytelling for societal good.