## Thinking Meta


Good decision making is more than a matter of “gut feeling” — it’s the willingness to reflect on the decision-making process itself.

Page 2 of 2

The larger point, of course, is that humans aren’t rational calculators. According to Kahneman and Tversky, when people are confronted with an uncertain situation, they don’t carefully evaluate the information, or compute the Bayesian probabilities, or do much thinking at all. Instead, their decisions depend upon a short list of emotions, instincts, and mental shortcuts. These shortcuts aren’t a faster way of doing the math; they’re a way of skipping the math altogether.

So far, so bleak: The human mind is an imperfect computer, stuffed full of programming bugs. But some researchers now believe that there may be a way of avoiding such innate psychological mistakes. When we flex our metacognitive muscles and think about how we are thinking, we are able to notice when we’re thinking poorly. The end result is that we avoid those avoidable mental blunders.

Imagine, for instance, that you’re taking part in the following experiment. A scientist puts \$50 on the table and asks you to decide between two options. The first option is an all-or-nothing gamble: There is a 40 percent chance that you will win the entire \$50 and a 60 percent chance that you will win nothing. The second option, however, is a sure thing. If you choose this alternative, you get \$20. And, indeed, that’s what most people choose.

But now let’s play the game again. The gamble option hasn’t changed: You still have a 40 percent chance of winning the entire \$50. This time, however, the sure thing is described as a loss of \$30, instead of as a gain of \$20.

The two games are identical. In both cases, you could walk away with \$20 of the original \$50. But the different descriptions of the options strongly affect how people play. When the choice is framed in terms of gaining \$20, only 42 percent of people choose the gamble. But when the same choice is framed in terms of losing \$30, 62 percent of people opt to roll the dice. This human foible is known as the framing effect, and it’s a by-product of loss aversion. The effect helps explain why people are much more likely to buy meat when it’s labeled 85 percent lean instead of 15 percent fat. Or why twice as many patients opt for surgery when they have an 80 percent chance of surviving as opposed to a 20 percent chance of dying.
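The equivalence of the two frames is easy to verify with a little arithmetic. The sketch below (variable names are mine, not from the study) computes the expected value of each option and shows that all three come out to the same $20:

```python
STAKE = 50.0

# The gamble, identical in both frames: a 40 percent chance of the full $50.
p_win = 0.40
gamble_ev = p_win * STAKE          # expected value of gambling: 0.4 * 50 = 20

# Frame A: the sure thing is described as "gain $20."
sure_gain = 20.0

# Frame B: the sure thing is described as "lose $30" of the original $50.
sure_after_loss = STAKE - 30.0     # 50 - 30 = 20

# Every option leaves you with an (expected) $20 -- the frames differ
# only in wording, not in outcomes.
assert gamble_ev == sure_gain == sure_after_loss == 20.0
```

Seen this way, a purely rational player should be indifferent between the frames; the 42-versus-62-percent gap is entirely a product of how the sure thing is described.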

When neuroscientists studied people playing this gambling game inside an fMRI machine, they saw the precise brain regions activated by these two different yet equivalent frames. They found that people who chose to gamble, when faced with the prospect of losing \$30, were misled by an excited amygdala, an almond-shaped group of neurons in the brain often associated with negative feelings.

However, when the scientists looked at the brains of subjects who were not swayed by the different frames, they discovered something surprising. The amygdalas of these “rational” people tended to be just as excitable as those of people who were susceptible to the framing effect. “We found that everyone showed emotional biases; no one was totally free of them,” says Benedetto De Martino, the neuroscientist who led the experiment. Even people who instantly realized that the two different descriptions were identical — they saw through the framing effect — still experienced a surge of negative emotion when they looked at the loss frame.

What, then, caused the stark differences in behavior? Activity in the prefrontal cortex, the fold of tissue thought to be responsible for metacognition, offered some clues. When there was more activity, people were better able to resist the framing effect. They could look past their biases and realize that both descriptions were equivalent. According to De Martino, “People who are more rational don’t perceive emotion less, they just regulate it better.”

This study neatly illuminates the benefits of metacognition. Because we’re capable of reflecting on our own mental states, we can figure out why we are feeling what we are feeling. If the particular sentiment makes no sense — if the amygdala is simply responding to a loss frame, for example — then we can discount it. As Dan Ariely, a behavioral economist at Duke University, says, “Unless we are aware of our tendency to act irrationally in certain situations, then we’ll continue to act irrationally. That much is predictable.” Although the mind is full of flaws, we can learn to outsmart them.

And this doesn’t just apply to simple gambles and coin flips. Consider the results of an epic survey of expert judgment by the psychologist Philip Tetlock. In the early 1980s, Tetlock picked 284 people who made their living “commenting or offering advice on political and economic trends” and began asking them to make predictions about future events. In each case the pundits were asked to rate the probability of several possible outcomes. By the end of the study in 2003, Tetlock had quantified 82,361 different predictions.

The first thing Tetlock discovered was that most “experts” were utterly useless. Although they were paid for their keen insights into world affairs, these experts tended to perform worse than random chance. Most of Tetlock’s questions had three possible answers; the experts, on average, made a correct prediction less than 33 percent of the time. But not everybody was so inaccurate. In addition to quantifying the (mis)predictions of the experts, Tetlock interrogated them about how they’d made up their minds. He found that the most successful experts in his study were those who consistently reflected on their own thought processes. This act of “mental eavesdropping” enabled them not only to avoid stupid errors (like loss aversion) but also to modulate their minds. The end result was vastly improved performance.
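The “worse than random chance” benchmark follows from simple arithmetic: with three possible answers, blind guessing is right about a third of the time. A quick illustrative simulation (random guesses over the same number of predictions as Tetlock’s study, not his actual data) makes the baseline concrete:

```python
import random

random.seed(0)
trials = 82_361  # the number of predictions Tetlock quantified

# Each question has three possible outcomes; a blind guess picks the
# correct one with probability 1/3.
hits = sum(random.randrange(3) == 0 for _ in range(trials))

print(hits / trials)  # close to 1/3, i.e. about 0.333
```

Scoring below that line, as most of Tetlock’s experts did, means a dart-throwing amateur would have out-predicted them.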

For neuroscientists the next challenge is figuring out how different situations benefit from different kinds of thought. When should we rely on reason? And what decisions are best left to our emotions and instincts? While much remains to be discovered, there’s some preliminary evidence that simple problems — those involving a limited number of variables — are best suited for deliberate thought, so that people don’t make any obvious mistakes. In contrast, complex problems seem to benefit from the processing powers of the unconscious (which generates our emotions), as long as people first take the time to carefully assimilate all the relevant facts. Given the distinct talents of our various cognitive styles, a willingness to think about our own thought processes and to adjust our mode of decision making to the task at hand could transform the way we use the mind. Before we do anything else, we should go meta.

Originally published February 16, 2009
