In the Boston Globe this past weekend, Joe Keohane wrote a fascinating summary of how our biases influence the new information we take in, or fail to take in. It's been picked up by others in the blogosphere, including Jonah Lehrer (author of the excellent book How We Decide, among other things).
I ask you to read these pieces with how people think about energy and climate change in the back of your mind, and with how you yourself think about those issues in mind. Before you go off and read these interesting things, consider:
- Is your mind made up on climate change?
- On the Marcellus Shale?
- On nuclear power?
And, to quirk it up a bit, is your mind made up about how you feel about:
- Your spouse or significant other?
- Your ex-spouse or former significant other?
What would it take to change your mind? For a lot of us, it feels like it's next to impossible to change our minds on the ideas we hold dear. And, according to Keohane's summary, facts that counter our point of view often make us dig into our positions more deeply. Here's the first of a few longish excerpts (but I still urge you to read the whole piece):
In the end, truth will out. Won’t it?
Maybe not. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.
This bodes ill for a democracy, because most voters — the people making decisions about how the country runs — aren’t blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.
“The general idea is that it’s absolutely threatening to admit you’re wrong,” says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon — known as “backfire” — is “a natural defense mechanism to avoid that cognitive dissonance.”
Ouch.
I suppose it's unsurprising that we dig things that reinforce our beliefs and shut out things that counter them. It affirms that those of us in the science education business are in a very difficult spot. We deal with material that some see as contentious -- evolution, climate change, the Marcellus Shale. These issues tend to polarize people, and, it turns out, providing evidence can make people at both poles hold onto their conclusions more tightly. Zow. What's an educator to do? What's an advocate to do?
And talking about it in this way can make whatever position we take sound like propaganda. Sigh.
Keohane's piece and Lehrer's response to it focus largely on the political. Of course, we can't say much about climate change without saying something political, or something that should be political, anyway. Important science is always political because we want politicians to act on, or at least be well-informed about, important issues. (But my opinions on such things are mine, and not necessarily those of my employer).
Here's an explicitly political excerpt, but one that relates to verifiable data:
...it appears that misinformed people often have some of the strongest political opinions. A striking recent example was a study done in the year 2000, led by James Kuklinski of the University of Illinois at Urbana-Champaign. He led an influential experiment in which more than 1,000 Illinois residents were asked questions about welfare — the percentage of the federal budget spent on welfare, the number of people enrolled in the program, the percentage of enrollees who are black, and the average payout. More than half indicated that they were confident that their answers were correct — but in fact only 3 percent of the people got more than half of the questions right. Perhaps more disturbingly, the ones who were the most confident they were right were by and large the ones who knew the least about the topic. (Most of these participants expressed views that suggested a strong antiwelfare bias.)
Now, what's that got to do with climate change? Well, for me, it raises the interesting question of what would be the outcome of parallel questions about climate change. This line, repeated from above, is a real attention grabber:
Perhaps more disturbingly, the ones who were the most confident they were right were by and large the ones who knew the least about the topic.
It's clearly not a universal truth -- I do know a few climate change deniers who know more than the average American about climate change, but most of the deniers I know have a weak understanding of the underlying science. Couple that with strong opinion, and you've got a problem.
See more on characterizations of how different segments of the population see global warming here.
So, if facts often backfire, what are we to do? Keohane does have a few suggestions -- build self-esteem, for one. Folks who feel good about themselves are better, more open listeners. Also, if you can "hit them between the eyes" with bluntly presented, objective evidence, it may make a difference. (That was my goal in writing and talking about where gasoline goes, by the way.)
Of course, as educators, we long to believe that education is the answer, but the most educated can be the most difficult to change:
A 2006 study by Charles Taber and Milton Lodge at Stony Brook University showed that politically sophisticated thinkers were even less open to new information than less sophisticated types. These people may be factually right about 90 percent of things, but their confidence makes it nearly impossible to correct the 10 percent on which they’re totally wrong.
Ugh. (Ugh partly because that hits uncomfortably close to home).
Keohane also notes that one of the researchers he cites, Brendan Nyhan, suggests shaming the media sources. But those sources, Keohane observes, can be pretty shameless, so that may be wishful thinking.
I suggest looking back at the positions we once held deeply as individuals but no longer hold. What made us change our minds? Or what helped? (There I go asking you to be metacognitive again.) Being hit over the head with blunt evidence has worked powerfully for me, as has building my general confidence. What's worked for you?
What would these approaches actually look like in practice? Again, I'll ask you to give me feedback in the comment section. Please.
It's also worth pondering that such a shift in worldview, especially if driven by being metaphorically hit over the head, can hurt like heck. Or worse than heck -- remember, a central idea here is that being wrong about important stuff is painful, and we try to avoid such realizations. We ought to think about how to make coming down easier, not just how to bring about whatever realizations we may be after.
Thanks to my various Facebook friends who bring these interesting things to my attention.