Here’s an article on a fascinating topic: our brains, journalism, and manipulation. Courtesy of Mark Thoma at Economist’s View.
"Unsmearing the Smear"
Free Exchange looks at a report on "How unscrupulous campaign strategists are taking advantage of a quirk in our brains – and what reporters can do to stop helping them":
Unsmearing the smear, Free Exchange, The Economist: An interesting behavioural look at the world of political mud-slinging… It helpfully begins:
According to a recent Pew Research Center survey, Americans increasingly get their news from multiple sources. More than one-third use Internet-based sources such as Web sites, blogs, and even social networking sites. Only a minority rely entirely on traditional sources, including print, radio, television, and cable news. The survey did not include chain e-mail, which has fed rumors… This proliferation of sources creates competitive pressure on journalists to bend their standards in order to get a story quickly.
It’s always good to see blogging given a clear edge over crazy, ungrammatical emails typed in multi-coloured fonts. The piece continues:
Our brains tend to remember facts that accord with our worldview, and discount statements that contradict it. …The human brain also does not save information permanently, as do computer drives and printed pages. Recent research suggests that every time the brain recalls a piece of information, it is "written" down again and often modified in the process. Along the way, the fact is gradually separated from its original context. For example, most people don’t remember how they know that the capital of Massachusetts is Boston.
This is actually quite a serious point.
Repetition in any context strengthens memory. Incredibly, it also creates its own aura of credibility:
In another Stanford study, students were exposed repeatedly to the unsubstantiated claim that Coca-Cola is an effective paint thinner. Those who read the statement five times were nearly one-third more likely than those who read it only twice to attribute it to Consumer Reports (rather than the National Enquirer), giving it a gloss of credibility. Thus the classic opening line "I think I read somewhere," or even reference to a specific source, is often used to support falsehoods. Similarly, psychologist Daniel Gilbert and his colleagues have shown that if people are distracted from thinking critically, they default to automatically accepting statements as true.
A week or so ago, Mark Thoma responded to a piece on "Libertarian Paternalism" by noting that he didn’t like the feeling of being manipulated. It’s interesting to me that even if we assume journalists are impartial actors, there is an asymmetry in the presentation of information: the smearers are presumably well aware of these findings and are using them to their advantage. We are being manipulated.
So here is the question: should journalists actively study behavioural economics and adjust their coverage so that the impression it leaves is something closer to factual truth? Either the media aims to be deliberately manipulative in an effort to produce better coverage, or it abets the manipulations of others by being predictably non-manipulative. Which is preferable?
Let me add the following from the article as an example of how journalists might apply this in their reporting:
Journalists should avoid presenting both sides of a story when one is false – and take into account how readers’ brains process the disagreements. The following four rules can guide their efforts.
1. State the facts without reinforcing the falsehood. Repeating a false rumor can inadvertently make it stronger. In covering … controversy…, many journalists repeat… the charges against the candidate – often citing polling data on how many Americans believe them – before noting that the beliefs [a]re false. Particularly damaging is the common practice of replaying parts of an ad before debunking its content.
A related mistake is saying that something is newsworthy because "the story is out there." Reporting on coverage by a less credible source such as The Drudge Report, even with disclaimers, will inevitably spread the story. False statements should not be presented neutrally since they are likely to be remembered later as being true.
2. Tell the truth with images. Nearly half of the brain is dedicated to processing visual information. When images do not match words, viewers tend to remember what they see, not what they hear. Karl Rove has said that campaigns should be run as if the television’s sound is turned down.
Television journalists should avoid presenting images that contradict the story. One recent CNN …story featured a threatening swarthy face subtitled "Obama the Antichrist?" – a statement that CNN would presumably not claim to be true.
3. Provide a compelling storyline or mental framework for the truth. Effective debunking requires replacing the falsehood with positive content. …
4. Discredit the source. Ideas have special staying power if they evoke a feeling of disgust. Indeed, brain pathways dedicated to processing disgust can be activated by descriptions of morally repellent behavior. The motives of the purveyors of falsehoods can provide a powerful story hook. A recent example is the press coverage pointing out Obama Nation author Jerome Corsi’s motivations, his past racist Web commentary, and his allegations of Bush Administration complicity in the 9/11 attacks.
To avoid contributing to the formation of false beliefs, journalists may need to re-examine their practices. In 1919, Supreme Court Justice Oliver Wendell Holmes wrote that "the best test of truth is the power of the thought to get itself accepted in the competition of the market." Our brains do not naturally obey this admirable dictum. But by better understanding the mechanisms of memory, perhaps journalists can move their modern audience closer to Holmes’s ideal.
"Journalists should avoid presenting both sides of a story when one is false." I think that alone would take us a long way. But from what I’ve seen written about economics, I wonder if journalists have the knowledge to make that determination enough of the time.
My comment: There are many examples of frequently repeated statements and beliefs gaining the status of quasi-facts despite the lack of credible evidence. Often writers are biased or financially interested; at other times they may not have a solid basis on which to evaluate the evidence, and may not know that they don’t. This is not an answer, but more an acknowledgement of the complexity of the problem across all areas, from the health sciences to the stock market to economics to politics. – Ilene
An earlier article by the authors of the report that prompted the commentary: Your Brain Lies to You, by Sam Wang and Sandra Aamodt.