Bob Crawford (40:58)
Yeah, exactly, exactly. That's the name of the phenomenon. So when people tried to recreate the Stanford Prison Experiment, even for television, it made for pretty boring TV, because it was bad science in the first place. It's not something that people do naturally. It's what they do when they are pushed, when they are prodded, when certain expectations are set up. It's similar with this other famous experiment that Bregman talks about, Stanley Milgram's obedience experiment, where volunteers were told to administer increasingly painful electric shocks to a stranger just because a guy in a lab coat told them to. It's another instance of, you know, are we doing these things just to please authority? Even to the point of murder? Because the dial of the shock machine was labeled as deadly past a certain point, and you could hear the screams of the victim. Of course, they were fake screams, but the participants could hear them. But what Bregman ended up uncovering is that most of the participants weren't following the orders blindly. They were following the orders, yes, but they were doing it because they believed they were doing something good, something for the good of science. Even though the shocks were uncomfortable, even though it wasn't something they wanted to do, it was a noble sacrifice in the name of progress. And the participants weren't indifferent. They were distressed, they were shaking, they were sweating, they were begging to check on the learner. But they also said things like, "He agreed to be in the experiment," or "This will help science," or "I don't want to do this, but I have to." The man in the lab coat who kept telling them, "Please continue, please continue," was calm, he was professional. And even how his nudges were framed made a difference.
So if he was directly ordering them, telling them, "You have to do this," surprisingly, people would actually be more likely to resist a direct order framed that way. But a more subtle nudge, like "The experiment requires that you continue," tended to get people to keep going. And the people who were interviewed, who did take it up to those higher voltages, said they did it because they believed they were contributing to scientific development. So it's really this misguided belief in a higher cause that also contributes to atrocity. It's very easy to get the idea that those are just monstrous people. We have this idea in pop culture that the Nazis are cartoonish monsters. They are monstrous, but they are monstrous people. They are, at the end of the day, people who did evil with the belief, to varying extents, that they were doing good. I know there were some who recanted, or who knew that what they were doing was wrong but had other pressures pushing them in that direction. There are many explanations for people's behavior in all sorts of situations, but a lot of these people thought they were contributing to the right thing. It's not that they didn't care, but that they were taught to care in the wrong direction. The bad guys don't think that they're bad guys. And whether we're talking about the Nazis of the past or the Zionists of today, they construct these elaborate narratives to frame themselves as the righteous ones. As far as the Nazis were concerned, they were purging Germany of a serious threat to their well-being, their safety, their future, and all that stuff, right? The Zionists, even though they're pariahs of the world at this point, you ask them why they believe this must continue, and they will say, you know, we have to defend ourselves.
We have a right to defend ourselves, yada, yada, yada. There are true believers within these groups who are able to commit some of the worst acts: committed ideologues who boast of their atrocities, who express no remorse, who take pride in their role. And people reach that point of ideology through a process of radicalization. The ten stages of genocide, I think, is the framework people have used to point out how a segment of a population can become a target of genocide. It's not like one day you wake up and it's just like, oh, we're going to genocide this group of people. It's a process. First you start off with classification: you create a separate category of person. You make them signify themselves in some way, carry ID cards or wear some kind of insignia on their clothing. They begin to face discrimination of some kind. That discrimination is ramped up through dehumanizing language. You compare them to vermin, to rodents, to disease. And that's just the thing, and we're going to get to that. Part of how you get people who would otherwise be caring or compassionate about their fellow human is through distance, right? The people who are most bloodthirsty tend to be very far from the front lines. The people demanding that World War I continue, for example, were very far from the actual fighting, whereas at the actual front lines of World War I you had soldiers playing football together during Christmas. That's a separate story. But you create distance: either physical distance or psychological distance. And dehumanization is one of the ways you create psychological distance. You distance people from seeing their fellow human being as a human being. Segregation is another way of creating that distance, which then lends itself to further dehumanization.
Comparing people to vermin, to animals, to anything other than human, is another step in dehumanization, in getting people to separate themselves from those people. Then, in the next stage, they create specific groups and organizations to enforce discriminatory policies. You further broadcast propaganda to polarize the population. And then steps seven, eight, nine, and ten go from actually preparing the removal and relocation of people, to the persecution, to the extermination of the group, and finally the denial that such a crime ever occurred. So that process can take years, it can take decades, but it's something that can turn even the most regular person into a virulent proponent of genocide if they are not fastidious in their opposition to any such language, especially in the early stages, because they get fed this steady stream of propaganda about how their actions are justified. Their loyalty to their in-group becomes tested by their willingness to engage in those harmful actions. If they stay with that group, they'll do whatever they're told is good, even if it leads to other people being hurt. And it just creates an evil, but it's an ordinary evil. It's an evil that is convinced of its own virtue. It is wrapped up in ideology and social conformity, because humans are social creatures, and that drives us to cooperate. But that sociality can be narrowed down to just our in-group. And that's where Bregman actually gets into an interesting point about empathy. We tend to see empathy as a positive thing, and it can be. But as Bregman notes, drawing on psychologist Paul Bloom's work, empathy can also make us partial, irrational, and even cruel, because it can narrow our focus to the people who are like us and ignore everyone else. That's why soldiers can fight and kill other people: because they feel empathy for their in-group, their homeland, their fellow citizens, or their comrades in arms.
Their loyalty and affection for the people they care about supersedes the lives of the people they don't care about. Now, of course, I want to look at systems when we're talking about this, because I don't think this hijacking of empathy is inevitable. Nationalism, propaganda, these things play a role in how people end up being separated into in-groups and out-groups. But there are also indications that in-group and out-group separation can occur even in the absence of a state, so it is something we have to be continuously vigilant about. Another aspect of a systemic analysis, or approach, is looking at how our position within society shapes how we operate, how we treat people, how we think, and how we act. Bregman cites neuroscience research demonstrating that authority literally changes how we think. Powerful people become less empathetic and are more likely to see others as tools rather than as independent people. This is not new information per se. The environments that powerful people inhabit both shape them and are shaped by them. As the saying goes, power corrupts, and absolute power corrupts absolutely. And spaces like Silicon Valley, Wall Street, Washington, D.C., corporate boardrooms, and all the other upper echelons of power divorce rulers and authorities from ordinary people. They're insular spaces that keep them from being challenged or grounded by the impact of their actions on others. So powerful people don't have to care. And I think such hierarchies are attractive to people who are already inclined to do bad, even if they believe they're doing good. The authoritarians, the supremacists, the abusers, they are attracted to those positions. But even well-intentioned people can lose themselves in authority too.
Because authorities as a whole, existing in this bubble that rewards their worst instincts, end up further shaping the system around those instincts, around distrust, selfishness, exploitation, and so on, to reward themselves and their patterns of behavior. And thus, through the social nocebo effect, people end up fulfilling the expectations created by the system.