It all started with Cheerios. Jonah Lehrer was once again standing in a supermarket aisle, crippled by the thought of which variety of whole-oat goodness to buy: honey nut or apple cinnamon. “It was an embarrassing waste of time,” he says, “and yet it happened to me all the time.” Lehrer decided that he had to figure out what was going on inside his brain when he contemplated such issues. And his curiosity transcended breakfast. How did the brain process picking a house? Choosing stocks? Split-second, life-or-death decisions? These questions led the author on a path from the epistemological roots of rational versus emotionally driven thinking to the tools of modern-day neuroscience, which let him look inside the brain and see how it actually thinks. The result is his second book, How We Decide, a real-world exploration of the brain’s capacity for decision making. With it, Lehrer shows how different decisions require different mental tools; that rational thought isn’t always the answer; and that understanding how our brain processes information can improve the thousands of choices, big and small, we make every day.
Lehrer, who is editor at large for Seed magazine, sat down with senior editor Greg Boustead to discuss the decision-making skills of pilots and world leaders, the problem with certainty, and the best way to handle a Stephen Colbert interview (“don’t try to be funny”).
Seed: In How We Decide you talk about pilots extensively. During your research, you worked in flight simulators to get a feel for making critical decisions under duress.
Jonah Lehrer: Yes, they’re unbelievably realistic. And I spent time with pilots to try to understand how they think.
Seed: Okay, let’s start with an actual event that occurred not far from here. How did Chesley Sullenberger, the pilot of the US Airways flight that landed on the Hudson a couple of months ago, choose to divert to the river to execute that flawless emergency landing without a single fatality?
JL: Good question. The first thing Captain Sullenberger had going for him was that he had practiced very similar situations in flight simulators. In flight school, they practice all of these terrifying scenarios. Part of the practice is learning how to actually land on water, or how to steer a jet around an island full of skyscrapers. But they also learn something even more important in flight simulators: how to think under pressure and how to think through fear. What Sullenberger had learned is how to control his emotional brain and remain calm, even though he was very scared. That’s a crucial skill pilots hone, and one we can all learn from: it takes practice and hard work to learn how to control these very powerful negative emotions.
Seed: Is it an example of what you describe as rational deliberation or an emotional gut instinct?
JL: In most cases, the instincts of pilots are to follow the advice of Air Traffic Control. But they were telling him to proceed to Teterboro Airport, in New Jersey. So he actually went against that natural instinct. I think if you had tried to map the decision onto the distinct circuitry of his brain, what you would have seen is Sullenberger’s prefrontal cortex light up as it tried to compensate for the very excited activity taking place in his amygdala. That gets back to the larger point that pilots have really rigged the system: First, they’ve learned how to make decisions because they practice it; second, they’ve developed systems like autopilot, all those computers in the cockpit that help them compensate for the innate limitations of the human brain. That’s why, as bizarre as it sounds, just about the safest place you can be is 30,000 feet up in the air, traveling at 500 mph.
Seed: Should hugely consequential decisions always be rationally deliberated?
JL: No. Research suggests that it’s complex decisions, the ones that involve lots of information, that benefit the most from unconscious emotional processing. The conscious brain can only handle a very limited amount of information at one time — seven digits, plus or minus two. Unconsciously, however, you can process tons of information. It’s these complex decisions — like choosing a car, an apartment, or a leather couch — that often require the rational brain to turn off to some degree.
Seed: So you can overanalyze a situation?
JL: Definitely. I talk in the book about a study involving strawberry jam, where the more people think about which jam they prefer, the less accurate their preferences become. So they move further away from actually finding the best-tasting strawberry jam. I think that happens to a lot of us all the time. So that’s definitely something to be aware of. For instance, when shopping at the grocery store, I invent lots of reasons for why I should buy the Honey Nut Cheerios or the gouda or the peppermint floss, or whatever. A potential trap of thinking about thinking — what we call metacognition — is that you can get stuck in a recursive loop where you are thinking about thinking about thinking, and so on, until it’s like standing in a hall of mirrors. And then before you know it, half an hour has passed at the supermarket, and you still don’t know which floss to buy. All of these tricks and pieces of advice come with limits. It’s really a testament to just how mysterious and complex the human brain is.
Seed: The utility of metacognition brings to mind the story of a social psychologist who sat in a crowded bar and essentially watched a building across the street burn to the ground because he assumed the bartender or someone else had called 911. He did this even though he taught that very concept of “bystander effect” in his class every semester. What are the limitations of the brain’s capacity to be self-aware?
JL: That’s a great anecdote. The brain can be profoundly ignorant of itself. As Nietzsche said, “We’re often most ignorant of what is closest to us.” This is especially true in the context of the brain. We are often most blind to these flaws that are built into us, simply because we take them for granted. At the same time, I do think there’s evidence that once you teach people about a lot of these flaws, they can learn to compensate for them. But certainly metacognition has very real limits.
Seed: Have you found those limits?
JL: Yes, since I’ve started writing about it, I’ve discovered many of them. The brain is often a very tough thing to eavesdrop on. So that’s the challenge. But I’ve discovered that you can get a little bit better at it. I’ve become a little more sensitive to what my emotional brain tells me I want.
Seed: We have a new president with what seems to be a distinctly different decision-making process than that of former President Bush. How important are the decision-making styles of the people we put in power?
JL: There are probably few more important variables by which to evaluate a leader than the quality of their decision making. In the run-up to the election, as Obama was practicing for one of the debates, he was in a really bad mood and acting absolutely grumpy. So he sent away his entire staff for half an hour while he cleared his head. This strikes me as a perfect example of metacognition, of a kind of emotional intelligence. He knew he was in no mental state to practice, so he said, in effect, “I’ve got to do this first, and then I’ll be ready to actually sit down and do what I need to do.” That demonstrates a tremendous self-awareness.
Seed: So if our gut is best at weighty decisions, a leader ought to think, “Should we go to war? Yeah, I’m feeling pretty good about this”?
JL: Well, here’s the big caveat, and this is maybe the main distinction between Obama and Bush. There’s been extensive research over the last few decades about the danger of certainty, about believing you’re right. What that causes the brain to do is ignore all the evidence that suggests you’re wrong. We clearly tend to filter the world to conform to our ideology, to our preconceived notions. So if I had to identify one flaw of the Bush administration, it’s not simply that Bush trusted his gut instincts or that he was a “decider.” It was that he and his entire administration fell victim to the certainty trap. And I think you saw that very clearly with the Iraq war and WMDs. They believed they knew that Saddam Hussein had them. And so they ignored lots of relevant evidence and dissenting voices telling them that there were no WMDs. It wasn’t simply his gut instincts that led him astray; it was the fact that he didn’t seek out those dissenting voices. And that’s a very natural human flaw, one of the frailties of the human brain. It’s also why liberals watch MSNBC and conservatives watch Fox News. It’s nice to have one’s beliefs reinforced. But it’s dangerous when leading a country.
Seed: Okay, last one, and it’s a big one: honey nut or apple cinnamon?
JL: [Laughs.] Honey nut! I’ve since learned to go with my emotional brain, and it turns out it just wants honey nut.
Originally published March 18, 2009