
Thursday, March 8, 2012

Misinformation and climate change conflict

reposted from Talkingclimate.org

I’m going to resist the academic’s instinct to start with a long, abstract discussion of “cultural cognition” and the theory behind it. Instead, I’m going to launch straight into a practical argument based on this line of research. My hope is that the argument will give you a glimpse of the essentials—and an appetite for delving further.

The argument has to do with the contribution that misinformation makes to the dispute over climate change. I want to suggest that the normal account of this is wrong.

The normal account envisions, in effect, that the dispute is fueled by an external force—economic interest groups, say—inundating a credulous public with inaccurate claims about risk.

I would turn this account more or less on its head: the climate change dispute, I want to argue, is fueled by a motivated public whose (unconscious) desire to form certain perceptions of risk makes it possible (and profitable) to misinform them.

As evidence, consider an experiment that my colleagues at the Cultural Cognition Project and I did.

In it, we asked the participants (a representative sample of 1,500 U.S. adults) to examine the credentials of three scientists and tell us whether they were “knowledgeable and credible experts” about one or another risk—including climate change, disposal of nuclear wastes, and laws allowing citizens to carry concealed weapons in public. Each of the scientists (they were fictional; we told subjects that after the study) had a Ph.D. in a seemingly relevant field, was on the faculty of an elite university, and was identified as a member of the National Academy of Sciences.

Whether study subjects deemed the featured scientists to be “experts,” it turned out, was strongly predicted by two things: the position we attributed to the scientists (in short book excerpts), and the cultural group membership of the subject making the determination.

Where the featured scientist was depicted as taking what we called the “high risk” position on climate change (it’s happening, is caused by humans, will have bad consequences, etc.), he was readily credited as an “expert” by subjects with egalitarian and communitarian cultural values, a group that generally sees environmental risks as high, but not by subjects with hierarchical and individualistic values, a group that generally sees environmental risks as low. However, the positions of these groups shifted—hierarchical individualists more readily saw the same scientist as an “expert,” while egalitarian communitarians did not—when he was depicted as taking a “low risk” position (climate change is uncertain, models are unreliable, more research is necessary).

The same thing, moreover, happened with respect to the scientists who had written books about nuclear power and about gun control: subjects were much more likely to deem the scientist an “expert” when he advanced the risk position that predominated in the subjects’ respective cultural groups than when he took the contrary position.

This result reflects a phenomenon known as “motivated cognition.” People are said to be displaying this bias when they unconsciously fit their understandings of information (whether scientific data, arguments, or even sense impressions) to some goal or end extrinsic to forming an accurate answer.

The interest or goal here was the stake study subjects had in maintaining a sense of connection and solidarity with their cultural groups. Hence, the label cultural cognition, which refers to the tendency of individuals to form perceptions of risk that promote the status of their groups and their own standing within them.

Cultural cognition generates my unconventional “motivated public” model of misinformation. The subjects in our study weren’t pushed around by any external misinformation provider. Furnished the same information, they sorted themselves into the patterns that characterize public divisions we see on climate change.

This kind of self-generated biased sampling—the tendency to count a scientist as an “expert” when he takes the position that fits one’s group values but not otherwise—would over time be capable all by itself of generating a state of radical cultural polarization over what “expert scientific consensus” is on issues like climate change, nuclear power, and gun control.
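This polarization dynamic can be illustrated with a toy simulation (my own sketch for exposition, not part of the study; the acceptance probabilities are assumptions chosen for illustration). Two groups observe the identical, evenly split stream of experts, but each group credits an expert as an authority more readily when his position fits the group’s predisposition:

```python
import random

def perceived_consensus(n_experts=10_000, p_match=0.9, p_mismatch=0.3, seed=0):
    """Toy model of culturally biased sampling of experts.

    Both groups see the SAME stream of experts, half taking the
    'high risk' position and half the 'low risk' position. Each group
    credits an expert with probability p_match when his position fits
    the group's predisposition and p_mismatch when it does not.
    (The probabilities are illustrative assumptions, not study results.)
    Returns each group's perceived share of credited experts who take
    the 'high risk' position -- its impression of 'expert consensus'.
    """
    rng = random.Random(seed)
    favored = {"egalitarian": "high_risk", "hierarchical": "low_risk"}
    credited = {"egalitarian": [], "hierarchical": []}
    for _ in range(n_experts):
        position = "high_risk" if rng.random() < 0.5 else "low_risk"
        for group, fav in favored.items():
            p = p_match if position == fav else p_mismatch
            if rng.random() < p:          # biased acceptance step
                credited[group].append(position)
    return {g: round(sum(x == "high_risk" for x in xs) / len(xs), 2)
            for g, xs in credited.items()}

# Both groups saw the same 50/50 mix of positions, yet each infers a
# lopsided "consensus" in its own favored direction.
print(perceived_consensus())
```

Even though no misinformer ever appears in this model, the two groups end up with sharply divergent impressions of where “expert consensus” lies, purely from who they count as an expert.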

In this environment, does the deliberate furnishing of misinformation add anything? Certainly.

But the desire of the public to form culturally congenial beliefs supplies one of the main incentives for furnishing them with misleading information. To protect their cultural identities, individuals more readily seek out information that supports, rather than challenges, the beliefs that predominate in their group. The motivated public’s desire for misinformation thus makes it profitable to become a professional misinformer—whether in the media or in the world of public advocacy.

Other actors will have their own economic interest in furnishing misinformation. How effective their efforts will be, however, will still depend largely on how culturally motivated people are to accept their message. If this weren’t so, the impact of the prodigious efforts of commercial entities to convince people that climate change is a hoax, that nuclear power is safe, and that concealed-carry laws reduce crime would wear away the cultural divisions on these issues.

The reason that individuals with different values are motivated to form opposing positions on these issues is the symbolic association of those issues with competing groups. But that association can be created just as readily by accurate information as by misinformation if authority figures identified with only one group end up playing a disproportionate role in communicating it.

One can’t expect to win an “information war of attrition” in an environment like this. Accurate information will simply bounce off the side that is motivated to resist it.

So am I saying, then, that things are hopeless? No, far from it.

But the only way to devise remedies for these pathologies is to start with an accurate understanding of why they occur. 

The study of cultural cognition shows that the conventional view of misinformation (external source, credulous public) is inaccurate because it fails to appreciate how much more likely misinformation is to occur and to matter when scientific knowledge becomes entangled in antagonistic cultural meanings.

How to free science from such entanglements is something that the study of cultural cognition can help us to figure out too. 

I hope you are now interested in knowing how—and in just knowing more!

Sources:

Kahan, D.M. Cultural Cognition as a Conception of the Cultural Theory of Risk. in Handbook of Risk Theory: Epistemology, Decision Theory, Ethics and Social Implications of Risk (eds. Hillerbrand, R., Sandin, P., Roeser, S. & Peterson, M.) 725-760 (Springer London, Limited, 2012).

Kahan, D. Fixing the Communications Failure. Nature 463, 296-297 (2010).

Kahan, D., Braman, D., Cohen, G., Gastil, J. & Slovic, P. Who Fears the HPV Vaccine, Who Doesn’t, and Why? An Experimental Study of the Mechanisms of Cultural Cognition. Law Human Behav 34, 501-516 (2010).

Kahan, D.M. & Braman, D. Cultural Cognition of Public Policy. Yale J. L. & Pub. Pol'y 24, 147-170 (2006).

Kahan, D.M., Braman, D., Slovic, P., Gastil, J. & Cohen, G. Cultural Cognition of the Risks and Benefits of Nanotechnology. Nature Nanotechnology 4, 87-91 (2009).

Kahan, D.M., Jenkins-Smith, H. & Braman, D. Cultural Cognition of Scientific Consensus. J. Risk Res. 14, 147-174 (2011).

 

 



Reader Comments (3)

Have you considered the alternative hypothesis, that people factor in the accuracy of the expert's past statements into their judgement of expertise? If an expert makes statements on 10 questions, three of which you already know the answers to, do you believe them on the remaining seven? Well, if they get the three questions wrong, they're not much of an expert, are they?

You could test that by giving examples of ideology-neutral past statements to balance their record. If they make 10 statements, 7 of which you know to be true and 3 which conflict with your beliefs, does that fare better or worse than 10 statements, 7 correct, 3 contrary to popular opinion but politically neutral?

It would also be interesting to know on what basis people judge expertise: qualifications, professional seniority, experience, observed or reported accuracy, clarity of explanation, conflicts of interest, displayed ethics, etc. There's no reason to suppose everyone weights qualifications to the same degree.

July 29, 2012 | Unregistered Commenter NiV

NiV: Nope, didn't do that. We held expertise constant & manipulated position; so whatever assumptions the subjects made about the track records of the experts would have been a wash with respect to the hypothesis-- that congruence between experts' position and one that prevails in subjects' cultural groups would exert decisive weight. Indeed, the experts were all fictional. Also, all were super qualified by ordinary criteria-- training, current employment, NAS membership. But the point is, however people were weighting conventional criteria of expertise, they were in fact being influenced by the fit between expert's *view* and the subjects' own cultural predispositions.

July 30, 2012 | Registered Commenter Dan Kahan

I think NiV's hypothesis is built into your experiment. Climate change is not a neutral question for your experiment. There is a significant amount of knowledge available to your subjects. The theoretical 10 questions NiV would use to evaluate expertise are answered for your subjects by the positions of the "experts". If your experts give answers contradictory to available knowledge they would be judged less expert. "Ordinary criteria" would be trumped by knowledge of correct answers. Three highly qualified physicians would not be judged the same if one were to say infection is caused by "bad air". You would probably reproduce your results when substituting evolution for climate change.

March 14, 2013 | Unregistered Commenter Harry Engstrom
