David McRaney (4:35)
We get angry. Okay, so what is this thing, intellectual humility? Well, we usually take the psychological frame whenever we're discussing things on this podcast. And when psychologists use this term, intellectual humility, it means the degree to which you recognize, accept and willingly admit the limitations of your cognitive abilities. So what does that mean? Well, to be intellectually humble is to embrace the likelihood that on any topic, big or small, you might be wrong about some or all the things you believe, feel and assume, thanks to an assortment of biases, fallacies and heuristics. And intellectual humility requires an understanding that the word wrong can mean many things. Acknowledging the possibility of your wrongness could mean admitting the beliefs you hold with high certainty could be false, or the attitudes you currently hold might be founded on poor or incomplete evidence, or the opinions you routinely share could be biased and very well could change if someone were to present you with good arguments to the contrary. Complicating matters, and we're going to do that a lot in this episode, is the fact that we often feel like we are well aware of all of these things. But the research into intellectual humility reveals that though we may think of ourselves as open to new ideas and perspectives, and conscious of our individual levels of ignorance from subject to subject, we tend to approach most situations with an undeserved overconfidence in our understanding. For instance, in a study by Leonid Rozenblit and Frank Keil back in 2002, researchers asked participants to rate how well they understood the mechanics of everyday things like zippers, toilets and locks. People usually rated themselves as having a pretty good grasp of how such things worked. But when asked to provide detailed, step-by-step explanations, most could not do that.
And that fact came as a surprise to the people who had previously believed they could do those things, provide step-by-step explanations, explain it all in detail. Psychologists call this the illusion of explanatory depth, the belief that you understand something better than you truly do. It's a cognitive bias, one of more than 180 we've identified so far, and each of these reliably skews your perceptions and affects your judgments from one moment to the next. This one in particular, the illusion of explanatory depth, leaves you overconfident in your understanding of most things and thus unmotivated to truly understand them, until one day your zipper will not zip, your toilet will not flush, and you can't figure out why. Later studies have revealed the illusion of explanatory depth extends well beyond zippers and toilets and bicycles and helicopters and coffee makers. For instance, when researchers asked for people's opinions on topics like healthcare reform or carbon taxes, people tended to produce strong, emotionally charged positions. But when asked to explain those issues in detail, most realized they had only a basic grasp, and as a result, their certainty dipped and their opinions became less extreme. But that doesn't often happen. You don't often get that question right to your face: please explain that thing you feel very strongly about, in detail. And studies like these reveal we have a rather complicated relationship with our own ignorance. We tend to discover our incomprehension by surprise, sideswiped from our blind spots because we were unaware those blind spots existed, for the most part. That's because we rarely go looking for evidence of our ignorance unless motivated to do so. And when motivated to do so, we often try to prove to ourselves that we were right all along, especially when we feel like we have a pretty good grasp of what is and is not.
So over the years, I've learned a lot of things that required me to unlearn a lot of other things before I could add the new things I learned to the collection of things I thought I knew for sure. For instance, I used to believe the 1938 radio broadcast of the War of the Worlds led to a mass panic. But it did not. Rumors of such a panic spread via newspaper think pieces at the time about how getting news from anywhere other than newspapers was a bad idea. I also used to believe you could boil a frog, a live frog, by slowly and gradually raising the temperature of the water it was in. It turns out that is not true. They jump right out once they become uncomfortable. I also used to believe lemmings sometimes marched off cliffs because they blindly followed each other while walking in a single file. And that one's been a nugget of popular but untrue folklore since the 1800s, long before the 1990s video game that perpetuated the myth with its whimsical gameplay, and the 1950s Disney documentary that did the same by tossing an unsettling number of real lemmings off of a cliff. In each case, right up until the moment I received evidence to the contrary, all this misinformation, these supposed facts, felt true to me. I had believed them for decades, and I had accepted them in part because they seemed to confirm all sorts of other ideas and opinions floating around in my mind. Plus, they would have been great ways to illustrate complicated concepts if not for the pesky fact that they were, in fact, not facts. That's one of the reasons why common misconceptions and false beliefs like these spread from conversation to conversation and survive from generation to generation to become anecdotal currency in our marketplace of ideas. They confirm our assumptions and validate our opinions, and thus they raise few skeptical alarms. They make sense, and they help us make sense of other things.
And as Carl Jung once wrote, the pendulum of the mind oscillates between sense and nonsense, not between right and wrong. Which is another thing that I used to believe, until right before recording this, I discovered Carl Jung, in fact, never said those things. It turns out I was wrong. Which brings me back to the topic at hand: intellectual humility. It's a fun term to say, intellectual humility. It somewhat ironically sounds fancy and smart, something an educated person who is not all that humble would utter in reference to something another not-all-that-humble educated person just referenced over a tincture of absinthe or something. In reality, it's a very simple concept, at least on the surface. But then again, so is everything. So, ironically, to proceed with intellectual humility is to assume that everything is more complicated than it first appears, including intellectual humility. So, with that in mind, let's take a look at this idea one word at a time. Intellectual. The word means something that is in the domain of the intellect. And the intellect is an abstract term, a catch-all word slash concept, related in Latin to intelligence, which itself is related to the concept of understanding. And understanding is a word that originally meant to stand under or among something, metaphorically, as in to be close to it, underneath its shadow, near that which is throwing the shadow. And in terms of a thing that has meaning, to stand under it, to understand it, is to comprehend it, to grasp the idea of it. So intellect is an umbrella term, concept and category for all the stuff that feels like understanding and knowing things, but also thinking things, and thinking about thinking, and also thinking about thinking about thinking. It's everything that might fall into a conceptual bucket related to using the mind to make sense of stuff both real and imagined, both physical and mental.
In that bucket are concepts like reasoning, both the philosophical kind and the psychological kind, and the latter, the psychological kind, is just coming up with reasons and justifications and rationalizations and arguments and opinions. Intellect includes all that, plus making plans and building mental models and agreeing upon definitions and so on, all those things. But it is also understanding, or attempting to understand, all those things, especially in the abstract, which is to say, using words that aren't exactly concrete in the way a word like banana is pretty concrete, because it refers to a thing that seems to truly exist outside of the mind, something that two people could point at. Thinking in the abstract isn't like pointing at a banana. It's thinking about things using words about things that exist mostly in the mind, like reason and truth and justice and beauty and bravery and kindness and ownership, wisdom, intelligence, intellect, and, yes, that which is intellectual. Okay, so that's intellectual. The other word, humility, comes from a very old word in Proto-Indo-European. That's the language family that preceded many, many modern languages, like English and German and French and so on. And that very old word was the term for the earth, as in the ground and dirt and all things dirt-like and earthen. At some point, that became the word humus in Latin, which also meant all those things. So to be humble was to be on the ground, metaphorically, which is to say low in rank in a system of things that are ranked. Taken further metaphorically, it came to mean meek and modest and not egotistical or boastful or braggadocious or performative. Humility as an idea, as a concept, later evolved as a metaphorical category and became an abstract term for remaining unbothered by the fact that you aren't the best at everything, or even most things. To be humble and to carry yourself with humility came to mean that you accept you are not a god or a demigod.
And taken to the extreme, you aren't even a particularly fantastic human in all regards. Therefore, intellectual humility requires you to accept and be unbothered by the fact that there are things you do not know and that there are things you do not understand. And then the next level of intellectual humility is to accept and be unbothered by the fact that your opinions are changeable. The things you believe might be wrong. And the attitudes you hold might be the result of experiences you've had, and if you have new experiences, those attitudes might be different. Then full and true and total intellectual humility is to recognize that you use a brain to make sense of the world, and that brain has limitations and biases that make you inherently fallible. That fallibility is biological, and that biology generates your mind and your memories and your emotions, and it can lead to all sorts of wrongness: moral, ethical, factual, attitudinal, behavioral, and more. So, to be intellectually humble is to understand that the things you assume you understand may not be truly understood. And the things you think, feel, believe and do are always worth reconsidering, especially when you encounter new experiences and information and interactions with other brains that challenge and/or contradict your current understanding, your current assumptions, and your current concepts of what is and is not. So to be intellectually humble is to accept that you are the unreliable narrator in all the stories you tell yourself about yourself and about others and about the world in general. And then if you want to go even deeper, you can. And if you want to get even weirder, you so can. Okay, the deep weird level. Science as a word comes from the Latin scientia, meaning knowledge, from the verb scire, to know. That came from an older word which meant to divide into parts, and that same word gave us words like incision and schism. Scientia lives inside other words, like conscience.
Conscience: to know your own conduct. Omniscience: to know everything that can be known. And prescience: to know beforehand. You can be conscientious, you can imagine a being that is omniscient, and you can, kind of sorta, as a person, be prescient, which is to say you can know something is going to be true before it is a thing that has become true. For instance, if you plug the best observations and calculations we have concerning the star at the center of our solar system into a computer, you can then, quote unquote, know it will one day swell into a red giant and swallow the earth. That is decidedly prescient because it isn't a prediction. A prediction would be a maybe. Prescience is certainty. It's knowing. Which leads me to a fun word I hope you add to your vocabulary: nescience. To be nescient is to not know something that can't be known, ever, at least not by you. It's not even knowing that you can't know it. For instance, your cat. Your cat can't read or understand the terms and conditions for Spotify. So if your cat clicked on "I agree" before signing up for the service, we wouldn't consider that binding. There are vast expanses of ignorance your cat can't even imagine and could never gain the knowledge required to rid itself of that ignorance. That's the definition of nescience, at least the one I prefer. Because once you accept this definition applies to cat knowledge, you can begin to wonder about parallels in human knowledge. Are there some things that, just like your cat, you can never know, and that you can never know you can never know? Are there things that maybe no human can ever understand because our bounded rationality is limited in ways that would require a more complex and powerful mind-making thing than the human brain to comprehend? Are there levels of understanding in domains like physics that are as distant from human understanding as the human level of understanding of watchmaking is to orangutans? I think I pronounced that correctly. Orangutan.
It's fun stuff. I'd like to think that we've got the capacity to understand quite a bit, and that the universe is mostly comprehensible to us, given enough study, enough dividing into parts, enough science. But it is with great intellectual humility that I must accept that this is merely an assumption on my part. I don't know what I don't know about knowing things. In my experience, I will say that this never really bothers scientists the way it seems to bother the rest of us, the way it seems to bother people whose worldviews require an unyielding certainty. It certainly doesn't bother scientists as much as it bothers people who have found themselves within ideological and tribal identity extremes, like cult members or conspiracy theorists or religious zealots or polarized political reactionaries. Also, in my experience, the most intellectually humble people in the world are scientists, because every single time they read a new paper, it requires them to fully engage in intellectual humility, especially in their own areas of expertise. Every experiment they run is an exercise in intellectual humility. If they start to stray from that, other scientists will notice and let them know, and then they'll let everyone else know. The fall from grace for refusing to accept you might be wrong in science is a career-threatening plummet. Science, the institution, works because, unlike politics and business, the people in scientific institutions will shame and ostracize their peers for not accepting evidence that calls into question their prior assumptions. There is a primate part of us that is ultrasocial and cares a whole lot about what other people think of us, and science repurposed that to maintain a culture where one's reputation depends on constantly pursuing and demonstrating and proselytizing intellectual humility.