[MD] Food for thought

Mary marysonthego at gmail.com
Tue Jul 13 17:23:35 PDT 2010


http://news.firedoglake.com/2010/07/12/exposed-to-facts-the-misinformed-believe-lies-more-strongly/

Exposed to Facts, the Misinformed Believe Lies More Strongly


By: David Dayen <http://news.firedoglake.com/author/dday/>, Monday July
12, 2010, 6:50 am

A truly disturbing study from researchers at my alma mater, the University
of Michigan, reveals
<http://www.boston.com/bostonglobe/ideas/articles/2010/07/11/how_facts_backfire/>
that political partisans reacted to facts that contradicted their worldview
by clinging to it even more tightly.

In a series of studies in 2005 and 2006, researchers at the University of
Michigan found that when misinformed people, particularly political
partisans, were exposed to corrected facts in news stories, they rarely
changed their minds. In fact, they often became even more strongly set in
their beliefs. Facts, they found, were not curing misinformation. Like an
underpowered antibiotic, facts could actually make misinformation even
stronger.

This bodes ill for a democracy, because most voters - the people making
decisions about how the country runs - aren't blank slates. They already
have beliefs, and a set of facts lodged in their minds. The problem is that
sometimes the things they think they know are objectively, provably false.
And in the presence of the correct information, such people react very, very
differently than the merely uninformed. Instead of changing their minds to
reflect the correct information, they can entrench themselves even deeper.

"The general idea is that it's absolutely threatening to admit you're
wrong," says political scientist Brendan Nyhan, the lead researcher on the
Michigan study. The phenomenon - known as "backfire" - is "a natural defense
mechanism to avoid that cognitive dissonance."

As someone who engages in political persuasion through the use of facts,
this is the kind of study that borders on making me quit the business. I do
take care, and encourage others to do the same, not to let my beliefs color
the facts, or at least not to let certainty in my beliefs override the
facts when they arise. But that doesn't appear to be the American
character. More people think with their gut than with their brain, to
paraphrase Stephen Colbert.

But the research on the subject shows this phenomenon to be part of the
human condition: the desire to order facts around a particular view of the
world. It should be noted, though, that the literature generally finds it
more prevalent on the conservative side of the ledger, which makes a fair
bit of sense if you take the term "conservative" to mean wedded to the
status quo.

New research, published in the journal Political Behavior last month,
suggests that once those facts - or "facts" - are internalized, they are
very difficult to budge. In 2005, amid the strident calls for better media
fact-checking in the wake of the Iraq war, Michigan's Nyhan and a colleague
devised an experiment in which participants were given mock news stories,
each of which contained a provably false, though nonetheless widespread,
claim made by a political figure: that there were WMDs found in Iraq (there
weren't), that the Bush tax cuts increased government revenues (revenues
actually fell), and that the Bush administration imposed a total ban on stem
cell research (only certain federal funding was restricted). Nyhan inserted
a clear, direct correction after each piece of misinformation, and then
measured the study participants to see if the correction took.

For the most part, it didn't. The participants who self-identified as
conservative believed the misinformation on WMD and taxes even more strongly
after being given the correction. With those two issues, the more strongly
the participant cared about the topic - a factor known as salience - the
stronger the backfire. The effect was slightly different on self-identified
liberals: When they read corrected stories about stem cells, the corrections
didn't backfire, but the readers did still ignore the inconvenient fact that
the Bush administration's restrictions weren't total.

Interestingly, one antidote researchers have found to this is self-esteem.
Respondents who felt good about themselves were consistently more willing to
accept new information, whereas those who felt threatened or agitated - say,
your average Rush Limbaugh listener - were not. Another way to get facts to
stick is through direct appeals. Yet media consumers get their information
indirectly, through filters and outlets they either trust or imagine to have
a bias, and they set their perceptions accordingly.

For individuals, broadening your sources of information probably helps in
arriving at a consensus on some facts, but I wouldn't be so sure even that
would work. We're rapidly moving to a post-truth era in politics, and the
data suggests that the agreed-upon set of facts has gone the way of the
dinosaur.

Mary

- The most important thing you will ever make is a realization.


