Friday, January 11, 2013

Amazingly cool & important article on virulence of ideologically motivated reasoning

Political psychologist Brendan Nyhan and his collaborators Jason Reifler & Peter Ubel (NRU) just published a really cool paper in Medical Care entitled “The Hazards of Correcting Myths About Health Care Reform.” It shows just how astonishingly resistant the disease of ideologically motivated reasoning is to treatment with accurate information. And like all really good studies, it raises some really interesting questions.

NRU conducted an experiment on the effect of corrections of factually erroneous information originating from a partisan source. Two groups of subjects got a news article that reported on false assertions by Sarah Palin relating to the role of “death panels” in the Obamacare national health plan.  One group received in addition a news story that reported that “nonpartisan health care experts have concluded that Palin was wrong.”  NRU then compared the perceptions of the two groups.

Well, one thing they found is that the more subjects liked Palin, the more likely they were to believe Palin’s bogus “death panel” claims.  Sure, not a big surprise.

They also found that the impact of being shown the “correction” was conditional on how much subjects liked Palin: the more they liked her, the less they credited the correction. Cool, but again not startling.

What was mind-blowing, however, was the interaction of these effects with political knowledge.  As subjects became more pro-Palin in their feelings, high political knowledge subjects did not merely discount the “correction” by a larger amount than low political knowledge ones. Being exposed to the “nonpartisan experts say Palin wrong” message actually made high-knowledge subjects with pro-Palin sentiments credit her initially false statements even more strongly than their counterparts in the “uncorrected” or control condition!
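For readers who like to see the shape of such a result, here is a minimal, purely hypothetical sketch (simulated data, invented variable names, and made-up effect sizes, not NRU's actual dataset or model) of how a "backfire" pattern like this would show up as a positive three-way interaction in an ordinary regression:

```python
# Hypothetical illustration only: simulated data and stipulated coefficients.
# It shows the *shape* of the reported pattern, not NRU's actual analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000

df = pd.DataFrame({
    "correction": rng.integers(0, 2, n),   # 1 = saw the "nonpartisan experts" fact-check
    "palin": rng.uniform(0, 1, n),         # favorability toward Palin (0-1)
    "knowledge": rng.uniform(0, 1, n),     # political-knowledge score (0-1)
})

# Stipulated outcome: belief in the "death panel" claim. The correction lowers
# belief on average, but the three-way term makes it *raise* belief among
# subjects who are both pro-Palin and high in knowledge -- the backfire pattern.
df["belief"] = (
    0.3
    + 0.4 * df["palin"]
    - 0.25 * df["correction"]
    + 0.6 * df["correction"] * df["palin"] * df["knowledge"]
    + rng.normal(0, 0.1, n)
)

model = smf.ols("belief ~ correction * palin * knowledge", data=df).fit()
print(model.params)  # the correction:palin:knowledge coefficient comes out positive
```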

The most straightforward interpretation is that for people who have the sort of disposition that “high political knowledge” measures,  the “fact check”-style correction itself operated as a cue that the truth of Palin's statements was a matter of partisan significance, thereby generating unconscious motivation in them to view her statements as true.

That’s singularly awful.

There was already plenty of reason to believe that just bombarding people with more and more “sound information” doesn’t neutralize polarization on culturally charged issues like climate change, gun control, nuclear power, etc. 

There was also plenty of reason to think that individuals who are high in political knowledge are especially likely to display motivated reasoning and thus to be especially resistant to a simple “sound information” bombardment strategy.

But what NRU show is that things have become so bad in our polarized society that trying to correct partisan-motivated misperceptions of facts can actually make things worse!  Responding to partisan misinformation with truth is akin to trying to douse a grease fire with water!

But really, I’d say the experiment shows only how bad things can potentially get.

First, the NRU experimental design, like all experimental designs, is a model of real-world dynamics.  I’d say the real-world setting it is modeling is one in which an issue is exquisitely fraught; Palin & Obamacare are each flammable enough on their own, so when you mix them together you’ve created an atmosphere just a match strike away from an immense combustion of ideologically motivated reasoning.

Still, there is plenty of reason to believe that there are conditions, issues, etc. like that in the world. So the NRU model gives us reason to be very wary of rushing around trying to expose “lies” as a strategy for correcting misinformation. At least sometimes, the study cautions, you could be playing right into the misinformer’s hands.

Actually, I think that this is the scenario on the mind of those who’ve reacted negatively to the proposed use of climate change “truth squads”—SWAT teams of expert scientists who would be deployed to slap down every misrepresentation of climate science made by individuals or groups. The NRU study gives more reason to think those who didn’t like this proposal were right to think this device would only amplify the signal on which polarization feeds.

Second, how to interpret NRU depends in part on what “political knowledge” is actually measuring.

Measured with a civics quiz, essentially, “political knowledge” is well known to amplify partisanship.

But why exactly?

The usual explanation is that people who are “high” in political knowledge literally just know more and hence assign political significance to information in a more accurate and reliable way. This by itself doesn’t sound so bad. People’s political views should reflect their values, and if getting the right fit requires information, then the "high" political knowledge individuals are engaged in better reasoning. Low-knowledge people bumble along and thus form incoherent views.

But that doesn’t seem satisfying when one examines how political knowledge can amplify motivated reasoning.  When people engage in ideologically motivated reasoning, they give information the effect that gratifies their values independently of whether doing so generates accurate beliefs.  Why would knowing more about political issues make people reason in this biased way?

Another explanation would be that “political knowledge” is actually measuring the disposition to define oneself in partisan terms. In that case, it would make sense to think of high knowledge as diagnostic or predictive of vulnerability to ideologically motivated reasoning. People with strong partisan identities are the ones who experience strong unconscious motivation to use what they know in a way that reinforces conclusions that are ideologically congenial.

Moreover, in that case, being low in “political knowledge” arguably makes one a better civic reasoner. Because one doesn’t define oneself so centrally with respect to one’s ideology or party membership, one gives information an effect that more reliably tracks its connection to the truth. Indeed, in NRU the “low knowledge” subjects seemed to be responding to “corrections” of misinformation in a normatively more desirable way—assuming what we desire is the reliable recognition and open-minded consideration of valid evidence.

I would say that the “partisan identity” interpretation of political knowledge is almost certainly correct, but that the “knows more, reasons better” interpretation is likely correct too.  The theoretical framework that informs cultural cognition asserts that it is rational for people to regard politically charged information in a manner that reliably connects their beliefs to those that predominate in their group because the cost of being “out of synch” on a contentious matter is likely to be much higher than the cost of being “wrong”—something that on most political issues is costless to individuals, given how little impact their personal beliefs have on policymaking.  If so, then, we should expect people who “know more” and “reason better” to be more reliable in “figuring out” what the political significance of information is—and thus more likely to display motivated reasoning.
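To make that cost-benefit logic concrete, here is a toy calculation (the numbers are invented purely for illustration; they are not estimates from any study):

```python
# Toy numbers, purely illustrative: why it can be individually rational to
# track one's cultural group rather than the best available evidence.
p_pivotal = 1e-7            # chance one citizen's belief actually changes policy
harm_if_policy_wrong = 1e6  # stipulated personal cost if the policy goes badly
cost_of_dissent = 100.0     # stipulated personal cost of being "out of synch"
                            # with one's group on a contentious issue

expected_cost_of_wrong_belief = p_pivotal * harm_if_policy_wrong  # = 0.1
print(expected_cost_of_wrong_belief < cost_of_dissent)            # True
```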

In support of this, I’d cite two CCP studies. The first showed that individuals who have higher levels of science comprehension are more likely to polarize on climate change. The second showed that individuals who are higher in “cognitive reflection,” as measured by the CRT test, show an even greater tendency to engage in culturally or ideologically motivated reasoning when evaluating information.

These studies belie an interpretation of NRU that suggests that “low knowledge” subjects are reasoning in a higher quality way because they are not displaying motivated cognition.  In truth, higher quality reasoning makes motivated reasoning worse.

Because it is rational for people to fit their perceptions of risk and other policy-consequential facts to their identities (indeed, because this is integral to their capacity to participate in collective knowledge), the way to avert political conflict over policy-relevant science isn't to flood the political landscape with "information." It is to protect the science communication environment from the antagonistic social meanings that are the source of the conflict between the individual interest that individuals have in forming and expressing commitment to particular cultural groups and the collective one that the members of all such groups have in converging on the best available evidence of how to secure their common ends.

What gives me pause, though, is an amazingly good book that I happen to be reading right now: The Ambivalent Partisan by Lavine, Johnston & Steenbergen. LJS report empirical results identifying a class of people who don’t define themselves in strongly partisan terms, who engage in high quality reasoning (heuristic and systematic) when examining policy-relevant evidence, and who are largely immune to motivated reasoning.

That would make these ambivalent partisans models of civic virtue in the Liberal Republic of Science. I suppose it would mean too that we ought to go on a crash program to study these people and see if we could concoct a vaccine, or perhaps a genetic modification procedure, to inculcate these dispositions in others. And more seriously still (to me at least!), such findings might suggest that I need to completely rethink my understanding of  cultural cognition as integral to rational engagement with information at an individual level. . . . I will give a fuller report on LJS in due course.

I can report for now, though, that NRU & LJS have both enhanced my knowledge and made me more confused about things I thought I was figuring out. 

Important contributions to scholarly conversation tend to have exactly that effect!

 References

Delli Carpini, M.X. & Keeter, S. What Americans Know About Politics and Why It Matters. (Yale University Press, New Haven; 1996).

Hovland, C.I. & Weiss, W. The Influence of Source Credibility on Communication Effectiveness. Public Opin Quart 15, 635-650 (1951-52).

Kahan, D. Fixing the Communications Failure. Nature 463, 296-297 (2010).

Kahan, D. Ideology, Cognitive Reflection, and Motivated Cognition, CCP Working Paper No. 107 (Nov. 29, 2012).

Kahan, D. Why we are poles apart on climate change. Nature 488, 255 (2012).

Kahan, D.M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L.L., Braman, D. & Mandel, G. The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Climate Change 2, 732-735 (2012).

Lavine, H., Johnston, C.D. & Steenbergen, M.R. The Ambivalent Partisan: How Critical Loyalty Promotes Democracy. (Oxford University Press, New York, NY; 2012).

Nyhan, B., Reifler, J. & Ubel, P.A. The Hazards of Correcting Myths About Health Care Reform. Medical Care, published ahead of print, doi: 10.1097/MLR.0b013e318279486b (2013).

Zaller, J.R. The Nature and Origins of Mass Opinion. (Cambridge Univ. Press, Cambridge, England; 1992).

 


Reader Comments (12)

I think modern media contributes a lot to this phenomenon. It is not just the voters who behave like this; consumers in the IT market and other areas also display similar behaviors. It's just the information bomb, I think, that makes people more and more polarized.

January 11, 2013 | Unregistered Commenterzhipeng

@Zhipeng:
If the enlargement of information itself generates political polarization over what is known to science, shouldn't the invention of the printing press have made it harder rather than easier for scientific knowledge to disseminate?

January 11, 2013 | Unregistered Commenterdmk38

That’s singularly awful.

I really don't understand why you seem somewhat surprised by this.

Wasn't the most significant finding (statistically speaking), in your study of motivated reasoning in the climate debate, that more information only strengthens opinions in ways that are predictable by cultural, political, or personal orientation?

Of course the participants would view "non-partisan expert" information as biased - because if it ran counter to their starting orientation, it would be viewed as a threat to their biases. ("Non-partisan expert" opinion that was consistent with their starting orientation would be considered non-biased.) Anything that threatens their preexisting opinion is viewed as biased. In this case, anything that might put into question the veracity of Sarah Palin's world view (an extension of their own world view) is threatening. It threatens them personally because it suggests they were wrong in a previously formulated opinion. It threatens their social and cultural and political orientations.

Further, the very terms "non-partisan" and "expert" are in themselves laden with cultural and political overtones. Read any "skeptical" climate blog and look at the commonly expressed views about "experts" (Judith Curry put up a post on the topic of "experts" just the other day - providing many good examples). Distrust of "expert" opinion exists on both sides of the political divide (as can be seen in the material Judith used as the basis for her post), but it is a major source for energizing a sense of victimization on the right, and by extension, the climate "skept-o-sphere."

January 12, 2013 | Unregistered CommenterJoshua

Measured with a civics quiz, essentially, “political knowledge” is well known to amplify partisanship.

and

But that doesn’t seem satisfying when one examines how political knowledge can amplify motivated reasoning.

I have to wonder, here, about direction of causality. Does your evidence for these statements show direction?

If someone has a strong cultural, social, or personal orientation towards a particular political ideology, they are very likely to be interested in acquiring more information - because they are interested, but also because they are highly motivated to confirm their biases. The more they read, the more material they come across that might threaten their biases - and thus, as a defense against that threat, they seek to acquire more information (and in the selective treatment of evidence that goes along with motivated reasoning, the additional information they acquire further confirms their biases). It is a vicious cycle.

Someone who is more indifferent is less likely to seek out information (and accordingly has fewer biases to be threatened, and hence would be more open to "non-partisan expert" information, and would be less likely to see "non-partisan expert" information as inherently biased unless, of course, those experts agreed with them).

One explanation would be that “political knowledge” is actually measuring the disposition to define oneself in partisan terms. In that case, it would make sense to think of high knowledge as diagnostic or predictive of vulnerability to ideologically motivated reasoning.

Again - I think that direction of causality could be at issue, i.e., vulnerability to ideologically motivated reasoning drives people to acquire more information.

And in that case, being low in “political knowledge” arguably makes one a better civic reasoner.

I think that "better" is a mistake here. I think there is nothing inherent in being low in political knowledge that makes someone a better reasoner. Being low in political knowledge indicates that someone is less motivated, specifically, on issues that resonate with political identifications. More motivation to confirm bias stimulates an interest in acquiring more information. The test, I would suppose, would be to measure the reasoning of "high political knowledge" subjects against "low political knowledge" subjects on issues that have very little political overlay. My guess (obviously, based on my own biases as someone interested in politics), is that there won't be much effect, but to the extent that there is an effect, it will show "better" reasoning (again, I think that is a poor term) among those more politically informed.

To ramble a bit....

Speaking of my biases - I have worked a lot with scientists and academics from Japan, and other Asian countries. I am frequently surprised by the lack of knowledge or interest they have had in political issues - even within their own countries. In my experience, in a very general sense, American scientists and academics tend to be relatively more informed and interested in political issues. Now of course, I'm not saying that as a rule, the Asian scientists and academics have "poorer" reasoning skills - but I do think that as a general rule, their lack of knowledge and interest in politics correlates with attributes of their reasoning process that differ from what one typically finds in their American counterparts. I think we can see non-anecdotal evidence if we look at the relative prevalence of highly contested political debates in the media in different countries. Again, based on my anecdotal experiences, working as an instructor for international (again, predominantly Asian) graduate students, I have seen clear differences in the dominant educational paradigms w/r/t the value placed upon acquiring diverse viewpoints, and in how information is organized and communicated, that correlate with cultural differences. A metaphor I use is to consider asking a poet, an artist, a geologist, a historian, and a civil engineer each to describe a grassy clearing.

A bit rambling there (I'm trying to fit a lot into a small space - I hope it makes at least some sense), but my point is that I think that an interest in exploring diverse viewpoints (and as a result, interest in gaining more political knowledge) is a culturally influenced drive, and it doesn't correlate with "better" or "poorer" reasoning - but with different attributes in how one reasons.

January 12, 2013 | Unregistered CommenterJoshua

One more post - I'm just kind of free associating, and won't be offended in the least if you think that they're too incoherent to pass through moderation.

It is to protect the science communication environment from the antagonistic social meanings that are the source of the conflict between the individual interest that individuals have in forming and expressing commitment to particular cultural groups and the collective one that the members of all such groups have in converging on the best available evidence of how to secure their common ends.

Hmm. Good luck with that. The horse is already out the barn, IMO. At this point, "science communication" is already heavily laden with antagonistic social meanings. Look at polls that show notable differences in "trust in scientists" that break down along partisan lines. Note how the %'s have switched: conservatives used to be the most trusting, liberals the least - and now it is reversed (with no changes in trust level among liberals and moderates, they're the same as they were, but "trust" has dropped notably among conservatives).

Any attempts to rectify that problem will fail, IMO, in the same way that adding "more information" fails as a way to reduce partisanship in politically charged debates. It will fail for the same reason as the SWAT teams. SWAT teams that accept AGW as dangerous will necessarily antagonize "skeptics" but reinforce "realists." SWAT teams that reject AGW as dangerous will necessarily antagonize "realists" but reinforce "skeptics." Climate combatants on both sides will necessarily look to define any SWAT team in one way or the other. Those with less interest in climate will look to those with more interest to give them cues as to how to classify a SWAT team. Attempts by scientists to improve science communication will necessarily be politicized.

From your link:

The problem isn’t the public’s reasoning capacity; it’s the polluted science-communication environment that drives people apart, says Dan Kahan.

This again, to me, looks like an assumption of direction of causality. In other words, political orientation in the science prevents non-motivated reasoning because the inherent political association of the science stimulates a drive towards confirmation bias.

But I see a bilateral direction of flow. No doubt, political orientation among scientists contributes to driving motivated reasoning, but motivated reasoning drives people to politicize the science. This will happen even if the science communication is completely neutral (which is an unrealistic impossibility anyway).

In my view, the way out is through a deliberative and structured process of collaboration, as we can see in collaborative, participatory urban planning, where scientific information is provided in a non-hierarchical format and processed through stakeholder dialog where the scientists are, essentially, one stakeholder among many stakeholders. That is not a change in "science communication" per se, but a change in the context in which scientific information is processed.

January 12, 2013 | Unregistered CommenterJoshua

I love this post at two levels. First, I am trying to discover how to bridge the gap between the polarized parties. I seem to be working in the LJS sense of a non-partisan 'rational' individual. I find that the more 'polarized' the person is, the trickier it is to have a productive conversation. I find that the topic, progressive or conservative, is not particularly important. The 'strength' of the belief is the critical variable. I also find that bombarding the person with facts is counterproductive unless the facts are presented in a nuanced way. In part, I have become an amateur diplomat in attempting to have conversations on flammable subjects.
At a more interesting level, at least for me, one of my companies has created a model of learning and memory based entirely on molecular biology, biophysics, and biochemistry. This model, as an emergent property, shows the behavior discussed in the post above. The model also predicts the causes of this behavior and plausible ways to change it. As a disclaimer, some of the plausible ways take a very large amount of work.

January 12, 2013 | Unregistered CommenterEric Fairfield

Dan -

I don't know if you're familiar with Sam Wang's blog. I think you might find it interesting more generally, but specific to the gun control debate, I thought you might find these posts of interest:

http://election.princeton.edu/2012/12/14/did-the-federal-ban-on-assault-weapons-matter/

http://election.princeton.edu/2012/12/22/scientific-americans-gun-error/

Also, while I'm at it - a really interesting chart linked at Sam's blog (click to embiggen):

http://xkcd.com/1127/

January 16, 2013 | Unregistered CommenterJoshua

I agree with much of Joshua and Eric's posts. IMO, the problem/truth is that knowledge should be taken to mind, not heart. Politics is for many, if not most, a condition of the heart. I think if you could devise a way to test the indifferent or ambivalent partisan, what you would measure is that these individuals do a better job of meeting this standard. Also, meeting the standard will depend on both the individual and the subject discussed. It will be a rare individual indeed that will meet the standard in all subjects or discussions.

I also think that past knowledge and experience is a poorly constrained variable, and this relates to Joshua's comments. Use the Sarah Palin example. If a person has noted that Sarah makes mistakes and that the news media compounds and exaggerates these effects, then, on reading the correction, that person could come to side with Sarah, not from cognitive motivational error, but because the judgement is against the news source(s). This brings in Eric's point as well, memory. It also highlights the lack of constraint, or the potential unrecognized motivated reasoning of the authors rather than the subjects, that would influence their work. In terms of memory and remembrance, how does one control for this effect? I would think that this would help explain why the effect of knowledge made those who were inclined towards Sarah even more so. They would have the knowledge, and the memory of the past news bias, and would be judging that. Note that the typical left/right, liberal/conservative bias is heightened by the perceived, and in many cases real, bias that the news brings. Perhaps this is what the study is actually measuring, which highlights Joshua's comment about causality.

And I extend an invitation to read any "alarmist" blog (those that Joshua would name as "realists") and post low probabilities or IPCC-chosen poor methodologies such as: that CO2 warmth could be good for the environment; that 2XCO2 ECS may be 2 C, which the IPCC indicates is in the beneficial range; that the methodology of shortening time periods that the skeptics use for saying the warming has stopped is the same one that the IPCC and peer-reviewed authors have used for saying that the warming is accelerating; and many others, if you want further insight as to how passion can cloud judgement in areas where it can be assumed that the persons are reasonably correct. Being correct does not stop motivated reasoning, and I think that the studies indicate that "being correct" can lead to a worsening of motivated reasoning. I think you will find that these supposed "realists" will engage in motivated reasoning as much as other humans.

If you are going to re-examine your understanding, you should look at the intrinsic assumptions in The Ambivalent Partisan wrt memory and causality. I don't think you are far off. I have my doubts about how one can separate experience with anything political or policy on the level indicated.

January 16, 2013 | Unregistered CommenterJohn F. Pittman

Dan -

One of my focuses in education, at multiple levels, has been to work with minorities and people of non-American or non-traditional backgrounds, helping them function effectively in American educational or business contexts. Within that work, in particular w/r/t minorities in academia, the concept of "stereotype threat" is a frame that I have found useful for understanding what kinds of factors often lead to breakdowns in reasoning processes.

http://en.wikipedia.org/wiki/Stereotype_threat

Of course, that is only one type of "threat" that interferes with sound reasoning processes. I have also worked with students who had "learning disabilities." It is amazing to be working with such a student, and see their reasoning process become more or less completely paralyzed as soon as they feel their identity threatened by the possibility that someone is associating them with a disability.

And it certainly doesn't only happen with students identified as "learning disabled." Watch virtually anyone who has come to identify as someone who "can't do math" (another form of stereotype) when they are asked to tackle a task that requires math: again (often) near complete shutdown of reasoning processes.

You'll see people under those circumstances become more or less incapable of assimilating or processing even the simplest of information, and unable to manifest previous understanding that is easily producible in a context where no threat is perceived.

Just some thoughts.

January 17, 2013 | Unregistered CommenterJoshua

@Joshua: yes, that's a fascinating & disturbing phenomenon. My collaborator (and former next-door neighbor before he moved West) Geoff Cohen has done some great studies on how the impact of stereotype threat can be countered via affirmation; it is remarkable how simple but effective this can be. Likely you've seen them (I'm not in a position to check, but I would be shocked if they're not in the Wikipedia entry). The same approach works for some types of identity-protective motivated reasoning too.

January 17, 2013 | Unregistered Commenterdmk38

They sound like my kind of people.

January 18, 2013 | Unregistered CommenterIsabel Penraeth

The weakness I see in the work you are doing is your apparent belief that there exists an unbiased scientific consensus position on each politically/culturally contentious subject. That belief is based (incorrectly I think) on the assumption that scientific experts in a particular field are always able to come to a purely rational conclusion, free of their own cultural predispositions.

All of my many years of working in science tell me just the opposite: scientists are just as subject to bias, from multiple causes, as anyone else (and maybe more so!). Of course, if a field continues to be scientifically important for a long time, reality will ultimately impose itself, and the field will gradually reach a consensus which is reasonably consistent with reality. How long that process takes depends on many factors, including the personal and intellectual investment leaders in the field may have made in a dominant paradigm (as Thomas Kuhn described), and whether or not the field is experimental (usually faster progress) or limited to observation and modeling (usually much slower progress).

Most scientific fields linked to socially contentious issues (eg nuclear waste disposal) have at least some reasonable range of political/cultural views among recognized experts. This helps to reduce the likelihood of a dominant cultural bias. But it seems to me that certain fields, including climate science, are dominated by people who share a specific set of cultural values, values which lead to a consensus position that may be substantially biased. People who choose to *enter* the field of climate science are overwhelmingly "egalitarian, communitarian" in their personal views, almost certainly predisposed to the belief that human economic activity is badly damaging to the Earth's environment, and further believe it is *morally* correct for environmental damage to be minimized and/or reversed. Any field which attracts mainly people with a common world view and common predisposition ought rationally be considered with a degree of skepticism.

You observe that those who are technically trained become, on average, more skeptical when they learn more about climate science, and you relate that to their personal world view influencing what they choose to believe. I would argue that a perfectly rational evaluation is also involved: what is being said by an expert is important to listen to, but so is a parallel evaluation of possible bias in the expert making the statement. This is especially true when you are in a position to independently determine that there is much greater uncertainty than what is being suggested by experts. Merck's HPV vaccine fiasco was largely the result of a (correct) conclusion by many that the push for obligatory immunization was not simply motivated by unbiased expert opinion. The result was horribly counterproductive for Merck, and arguably counterproductive for public health; "idiotic" would seem to accurately describe Merck's strategy. Those advocating immediate and drastic public policy to *force* reduced CO2 emissions would do well to consider how people can (and do!) infer and evaluate the motivations of others.

When you completely discount the possibility that the consensus in climate science is biased you weaken your entire argument. Separate the opinions of climate scientists who do not hold strong egalitarian/communitarian views from the rest and I think you will find a very different consensus view.

January 22, 2013 | Unregistered CommenterSteve Fitzpatrick
