How much damage can smears do?

According to an article in The Guardian, "The power of election smears" (http://www.guardian.co.uk/commentisfree/2010/may/01/bad-science-election-smears):

Elections are a time for smearing, and the Daily Mail's desperate story about Nick Clegg and the Nazis is my favourite so far. Generally the truth comes out, in time. But how much damage can smears do?

An experiment published this month in the journal Political Behavior set out to examine the impact of corrections, and what it found was more disturbing than expected: far from changing people's minds, if you are deeply entrenched in your views, a correction will only reinforce them.

The first experiment used articles claiming that Iraq had weapons of mass destruction (WMD) immediately before the US invasion. The 130 participants were asked to read a mock news article, attributed to Associated Press, reporting on a Bush campaign stop in Pennsylvania during October 2004.

The article described Bush's appearance as "a rousing, no-retreat defence of the Iraq war" and quoted a line from a genuine Bush speech from that year, suggesting that Saddam Hussein really had WMD, which he could have passed to terrorists. "There was a risk, a real risk, that Saddam Hussein would pass weapons or materials or information to terrorist networks, and in the world after September 11," said Bush, "that was a risk we could not afford to take."

The 130 participants were then randomly assigned to one of two conditions. For half, the article stopped there. For the other half, the article included a correction: it discussed the release of the Duelfer report, which documented the lack of Iraqi WMD stockpiles or an active production programme immediately before the US invasion.

After reading the article, subjects were asked to state whether they agreed with the statement: "Immediately before the US invasion, Iraq had an active weapons of mass destruction programme, the ability to produce these weapons, and large stockpiles of WMD, but Saddam Hussein was able to hide or destroy these weapons right before US forces arrived." Their responses were measured on a five-point scale ranging from "strongly disagree" to "strongly agree".

As you would expect, those who self-identified as conservatives were more likely to agree with the statement. More knowledgeable participants (independently of political persuasion) were less likely to agree. Then the researchers looked at the effect of whether you were also given the correct information at the end of the article, and this was where things got interesting. They had expected the correction to become less effective in more conservative participants, and this was true, up to a point: for very liberal participants, the correction worked as expected, making them more likely to disagree with the statement that Iraq had WMD, compared with very liberal participants who received no correction.

For those who described themselves as left of centre, or centrist, the correction had no effect either way. But for people who placed themselves ideologically to the right of centre, the correction wasn't just ineffective, it backfired: conservatives who received a correction telling them that Iraq did not have WMD were more likely to believe that Iraq had WMD than people given no correction. Where you might have expected people to dismiss a correction that was incongruous with their pre-existing view, or regard it as having no credibility, it seems that such information actively reinforced their false beliefs.
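To make the shape of that result concrete, here is a minimal simulation sketch in Python. It is not the study's own analysis: the sample size matches the 130 participants mentioned above, but the ideology coding, effect sizes, and noise level are invented for illustration, and the authors' actual statistical model may well differ. The sketch fabricates data with the backfire pattern described here, then fits a regression with a correction x ideology interaction, which is the standard way to test whether a treatment's effect reverses across a moderator.

```python
# Minimal sketch of the randomized design described above, with invented
# numbers exhibiting the "backfire" pattern. Not the authors' analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

n = 130  # participants, as in the article
# Assumed ideology coding: 1 = very liberal ... 7 = very conservative.
ideology = rng.integers(1, 8, size=n)
# Random assignment: half read the version of the article with the correction.
correction = rng.permutation(np.repeat([0, 1], n // 2))

# Hypothetical data-generating process: agreement with the WMD statement
# rises with conservatism, and the correction *lowers* agreement for
# liberals (ideology < 4) but *raises* it for conservatives (ideology > 4).
latent = (1.5 + 0.4 * ideology
          + correction * 0.3 * (ideology - 4)
          + rng.normal(0, 0.5, n))
# Five-point response scale: 1 = strongly disagree ... 5 = strongly agree.
agreement = np.clip(np.rint(latent), 1, 5)

df = pd.DataFrame({"agreement": agreement,
                   "ideology": ideology,
                   "correction": correction})

# OLS with a correction x ideology interaction. A positive interaction
# coefficient means the correction's effect flips sign as ideology moves
# rightward: helpful on the left, useless in the centre, backfiring on
# the right.
model = smf.ols("agreement ~ correction * ideology", data=df).fit()
print(model.summary().tables[1])
```

The point of the interaction term is that a single "did the correction work?" average would wash out to roughly zero here; it is only when the correction's effect is allowed to vary with ideology that the pattern in the study becomes visible.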

Maybe the cognitive effort of mounting a defence against incongruous new facts entrenches you further. Maybe you feel marginalised and motivated to dig in your heels. Who knows? But these experiments were then repeated, in various permutations, on the issue of tax cuts (or rather, the idea that tax cuts had increased national productivity so much that tax revenue increased overall) and stem cell research.

All the studies found the same thing: if a dodgy fact fits your prejudices, a correction only entrenches it further. If your goal is to move opinion, this depressing finding suggests that smears work and, what's more, corrections don't challenge them much: for people already inclined to believe the smear, a correction only makes them believe it more strongly.


So, as well as being pretty worrying, does this explain how Fox News does it?