Came across this a week or so ago. Thought it was apropos to some of the discussions we have around here. Four pages, long read. But interesting.
(Tread lightly. The Boston Globe website is a bit of a pop-up minefield.)
Any votes for making this one a sticky?
Kevin
Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.
“The general idea is that it’s absolutely threatening to admit you’re wrong,” says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon — known as “backfire” — is “a natural defense mechanism to avoid that cognitive dissonance.”
Most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas, and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence. In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts. And rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. They can cause us to twist facts so they fit better with our preconceived notions. Worst of all, they can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we’re right, and even less likely to listen to any new information.