
The science communication problem: one good explanation, four not so good ones, and a fitting solution

I was on a panel Saturday on “public policy and science” at the CSICon conference in Nashville. My friend Chris Mooney was on it, too. I didn’t speak from a text, but this is pretty close to what I remember saying; slides here.

I’m going to discuss the “science communication problem” – the failure of sound, widely disseminated science to settle public controversies over risks and other policy-relevant facts that admit of scientific investigation.

What makes this problem perplexing isn’t that we have no sensible explanation for it. Rather, it’s that we have too many.

There are always more plausible accounts of social phenomena than are actually true. Empirical observation and measurement are necessary--not just to enlarge collective knowledge but also to steer people away from dead ends as they search for effective solutions to society’s problems.

In this evidence-based spirit, I’ll identify what I regard as one good explanation for the science communication problem and four plausible but not so good ones. Then I’ll identify a “fitting solution”—that is, a solution that fits the evidence that makes the good explanation better than the others.

One good explanation: identity-protective cognition

Identity-protective cognition (a species of motivated reasoning) reflects the tendency of individuals to form perceptions of fact that promote their connection to, and standing in, important groups.

There are lots of instances of this. Consider sports fans who genuinely see contentious officiating calls as correct or incorrect depending on whether those calls go for or against their favorite team.

The cultural cognition thesis posits that many contested issues of risk—from climate change to nuclear power, from gun control to the HPV vaccine—involve this same dynamic. The “teams,” in this setting, are the groups that subscribe to one or another of the cultural worldviews associated with “hierarchy-egalitarianism” and “individualism-communitarianism.”

CCP has performed many studies to test this hypothesis. In one, we examined perceptions of scientific consensus. Like fans who see the disputed calls of a referee as correct or incorrect depending on whether those calls favor their team or its opponent, the subjects in our study perceived scientists as credible experts depending on whether the scientists’ conclusions supported the position favored by members of the subjects’ cultural group or the one favored by members of a rival group on climate change, nuclear power, and gun control.

Not very good explanation # 1: Science denialism

“Science denialism” posits that we see disputes over risks in the US because a significant portion of the population doesn’t accept the authority of science as a guide for policymaking.

The same study of the cultural cognition of scientific consensus suggests that this isn’t so. No cultural group favors policies that diverge from scientific consensus on climate change, nuclear power, or gun control. But as a result of identity-protective cognition, the groups are culturally polarized over what the scientific consensus is on those issues.

Moreover, no group is any better at discerning what scientific consensus is than any other. The ones that seem to have it right on, say, climate change are the most likely to get it wrong on deep geologic isolation of nuclear wastes, and vice versa.

Not very good explanation #2: Misinformation

I certainly don’t dispute that there’s a lot of misinformation out there. But I do question whether it’s causing public controversy over policy-relevant science. Indeed, causation likely runs the other way.

Again, consider our scientific consensus study. If the sort of “biased sampling” we observed in our subjects is typical of the way people outside the lab assess evidence on culturally contested issues, there won’t be any need to mislead them: they’ll systematically misinform themselves about the state of scientific opinion.

Still, we can be sure they’ll very much appreciate the efforts of anyone who is willing to help them out. Thus, their motivation to find evidence supportive of erroneous but culturally congenial beliefs will spawn a cadre of misinformers, who will garner esteem and profit rather than ridicule for misrepresenting what’s known to science.

The “misinformation thesis” has got things upside down.

Not very good explanation #3: “Bounded rationality”

Some people blame controversy over policy-relevant science on deficits in the public’s reasoning capacities. Ordinary members of the public, on this view, know too little science and can’t understand it anyway because they use error-prone, heuristic strategies for interpreting risk information.

Plausible, sure. But wrong, it turns out, as an explanation for the science communication problem: higher levels of science literacy and quantitative reasoning ability, a CCP study found, don’t quiet cultural polarization on issues like climate change and nuclear power; they magnify it.

That makes sense given identity-protective cognition. People who are motivated to form perceptions that fit their cultural identities can be expected to use their greater knowledge and technical reasoning facility to help accomplish that—even if it generates erroneous beliefs about societal risks.

Not very good explanation #4: Authoritarian personality

The original authoritarian-personality research of Adorno and his colleagues is often dismissed as an exercise in polemics disguised as social science.

But in recent years, a serious body of scholarship has emerged on correlations between dogmatism, closed-mindedness, and like personality traits, on the one hand, and conservative ideology, on the other. This work is insightfully synthesized in Mooney’s The Republican Brain.

Does this revitalized “authoritarian personality” position explain public controversy over policy-relevant science?

It’s odd to think it does, given the role that identity-protective cognition plays in such controversies. Identity-protective cognition affects all types of perception (not just evaluations of evidence but brute sense impressions) relating to all manner of group affinities (not just politics but college sports-team allegiances). So why would the impact of identity-protective cognition be linked to a personality trait found in political conservatives?

But the point is, we should just test things – with valid study designs. Is the score on an “open-mindedness” test a valid predictor of the sort of identity-protective reasoning that generates disputes over climate change, the HPV vaccine, nuclear power, and guns?

I did a study recently designed to answer this question. I examined whether liberal Democrats and conservative Republicans would display identity-protective cognition in assessing evidence of the validity of the Cognitive Reflection Test (CRT)—which is in fact a valid measure of reflective, open-minded engagement with information.

They both did, and to the same degree. When told that climate skeptics got a higher CRT score (and hence were presumably more open-minded), liberal Democrats were much less likely to view the test as valid than when they were told that climate believers got a higher score (indicating they were more open-minded). The mirror-image pattern emerged for conservative Republicans.

What’s more, this effect was magnified by the disposition the CRT measures. That is, the subjects most inclined to employ conscious, reflective reasoning were the most prone to identity-protective cognition—a result consistent with our findings in the Nature Climate Change study.

The  new “authoritarian personality” work might be identifying real differences between liberals and conservatives. But there’s little reason to think that what it’s telling us about them has any connection to identity-protective cognition—the dynamic that has been shown with direct evidence to play a significant role in the science communication problem.

A fitting solution: The separation of meaning and fact

Identity-protective cognition is the problem. It affects liberals and conservatives alike, interferes with the judgment of even the most scientifically literate and reflective citizens, and feeds off even sound information as it creates an appetite for bad information.

We need a solution, then, fitted to counteracting it. The one I propose is the formation of a "science communication environment" protection capacity in our society.

Policy-consequential facts don’t inevitably become the source of cultural conflict. Indeed, they do so only in the rare cases in which they become suffused with highly charged and antagonistic cultural meanings.

These meanings are a kind of pollution in the science communication environment, one that interferes with the usually reliable faculties ordinary people employ to figure out who knows what about what.

The sources of such pollution are myriad. Strategic behavior is one. But simple miscalculation and misadventure also play a huge role.

The well-being of a democratic society requires protecting the science communication environment from toxic meanings. We thus need to use our knowledge to understand how such meanings are formed. And we need to devote our political resolve to developing procedures and norms that counteract the forms of behavior—intentional and inadvertent—that generate this form of pollution.

A wall of separation between cultural meaning and scientific fact is integral to the constitution of the Liberal Republic of Science.


Reader Comments (5)

I think your "Not good No. 3" somewhat undercuts the justification for your "Not good No. 1," which I think is generally the weakest of your four "not goods." You certainly don't offer substantiation for it. And, when fundamentalist Christians already reject the theory of evolution, we have actual countervailing evidence.

October 30, 2012 | Unregistered CommenterSocraticGadfly

I don't think you are correct in describing denialism as an explanation. Denialism is a symptom; it's not a cause of the problems people have with science itself. As we have written about extensively at the denialism blog, the central problem is ideology, which to me seems more or less synonymous with "identity-protective cognition". When you have an ideology or religion or other deeply held belief that is central to how you perceive the world, and it comes under threat, what results is not just a matter of cognitive dissonance but a nearly existential crisis. Your values, your worldview, your perception of the order of life (and even the afterlife) may come under threat from some scientifically verifiable fact, and it becomes more important to deny that fact than to allow a conflict with the central truths you hold most dear.

Denialism is the symptom of this conflict, because denialism isn't really about being "anti-science". Denialists and pseudoscientists clearly are not anti-science, because they crave the validity that science bestows on a belief. If they didn't crave it so badly, they wouldn't go to all the trouble of trying to assert the science is really on their side. Everyone wants science to confirm their deeply held beliefs; sadly, for about 99.9% of us, at some point, it won't.

November 1, 2012 | Unregistered CommenterMarkHoofnagle

I'm curious about the term, "Liberal Republic of Science." Is that a neologism of your own? If so, I'd be interested in any pointers you can give to a more-detailed discussion of what you mean by it. I'm curious, too, if you've considered that the term itself might already be encumbered with antagonistic cultural meanings by virtue of its including the word "Liberal."

That question aside, thank you very much for sharing your views on this topic. I've been following your recent writings with interest. I'm hopeful that your work points to a way forward on a set of collective problems that are both important to solve and frustratingly resistant (at least so far) to untangling.

November 12, 2012 | Unregistered CommenterJohn Callender


1. On LRS: I'm pretty sure I "originated" the term; of course, since human brains are to memes what pre-paid, disposable cell phones are to humans, does it matter?...
2. What is it? This merits a blog post. Or 3 or 4. For now: it is the reciprocal relationship between The Logic of Scientific Discovery & the Open Society (so of course Karl Popper is the James Madison of the Constitution of the Liberal Republic of Science). It is the political regime (in the sense of type of political order) that is the occasion for the science of science communication as a *political science*, which is aimed at perfecting this regime by resolving all its paradoxes (the central one of which, "Popper's dilemma," is the necessity of extinguishing any shared authoritative means for certifying what's known as a condition of enjoying the unparalleled capacity for knowledge that is science). It is also the very best collective way of life ever known in human history .....
3. On the meaning, cultural & otherwise, of "Liberal" in LRS: It's not "liberal" in the contemporary American electoral-politics sense but "Liberal" in the historical philosophical sense that sees securing the maximum reciprocal liberty of individuals as the criterion for evaluating the legitimacy of the state. And in this sense, "we are all Liberals, we are all Republicans...." to paraphrase Thomas Jefferson (in his first inaugural) -- both Democrats and Republicans, liberals & conservatives. Those differences are of tremendous consequence; but they play out against the backdrop of a shared political culture that is very distinctive, very new, that takes for granted the rejection of things that for most of human history & still in many parts of the world made people violent zealots & also ignoramuses.... I see what you are saying about how use of "Liberal" can connote "liberal" & hence have the effect you worry about. But doing science is not the same thing as communicating it, even when the science is the science of science communication! That is to say, someone else can figure out how to frame these ideas to rid them of whatever resonances might be barriers to their open-minded contemplation by others whose cultural outlooks are different from my own -- if that's even a valuable thing to do, given that I am not someone whose pronouncements figure at all in the larger public discourse. I hope what I'm saying will contribute (it's right in any case for me to try to contribute) to the perfection of the "new political science" that is needed in the Liberal Republic of Science; for that, I'll use the idioms that I think promote engagement by, or at least do not antagonize, those who are part of a certain culture (a scholarly one, a philosophical one). But what ways of communicating will then implement what I'm describing -- that's entirely different, I think.... But you see, I am talking to you; and you are saying I need to make myself clearer; and I agree!

November 14, 2012 | Registered CommenterDan Kahan

@Markhoofnagle: My clipped summary of my point likely obscures this, but I *think* we agree.

By "science denialism" here, I mean to be referring to the idea that there is a segment of society that denies the normative authority of science (& not specifically to "denial of climate change" or of any particular aspect of it). E.g., "Antiscience Beliefs Jeopardize U.S. Democracy," in Scientific American recently.

The idea that there is a persistent and perhaps growing "anti-science" element in US politics & society is definitely a widespread talking point. But I agree it *isn't* a *good* explanation for the climate change controversy -- or other political conflicts over risk -- b/c in fact the groups that are polarized *all* believe science is consistent with their view. You agree with that, I think. I agree with you, too, that the reason they perceive this -- and necessarily misperceive it some fraction of the time -- is ideology, which is shaping their engagement with information on what the state of the science is.

p.s. Sorry it took so long to respond -- overactive spam filter!

November 22, 2012 | Registered CommenterDan Kahan
