I was on a panel Saturday on “public policy and science” at the CSICon conference in Nashville. My friend Chris Mooney was on it, too. I didn’t speak from a text, but this is pretty close to what I remember saying; slides here.

I’m going to discuss the “science communication problem” – the failure of sound, widely disseminated science to settle public controversies over risks and other policy-relevant facts that admit of scientific investigation.

What makes this problem perplexing isn’t that we have no sensible explanation for it. Rather, it’s that we have too many.

There are always more plausible accounts of social phenomena than are actually true. Empirical observation and measurement are necessary—not just to enlarge collective knowledge but also to steer people away from dead ends as they search for effective solutions to society’s problems.

In this evidence-based spirit, I’ll identify what I regard as one good explanation for the science communication problem and four plausible but not-so-good ones. Then I’ll identify a “fitting solution”—that is, a solution that fits the evidence showing the good explanation to be better than the others.

One good explanation: identity-protective cognition

Identity-protective cognition (a species of motivated reasoning) reflects the tendency of individuals to form perceptions of fact that promote their connection to, and standing in, important groups.

There are lots of instances of this. Consider sports fans who genuinely see contentious officiating calls as correct or incorrect depending on whether those calls go for or against their favorite team.

The cultural cognition thesis posits that many contested issues of risk—from climate change to nuclear power, from gun control to the HPV vaccine—involve this same dynamic. The “teams,” in this setting, are the groups that subscribe to one or another of the cultural worldviews associated with “hierarchy-egalitarianism” and “individualism-communitarianism.”

CCP has performed many studies to test this hypothesis. In one, we examined perceptions of scientific consensus. Like fans who see the disputed calls of a referee as correct depending on whether they favor their team or its opponent, the subjects in our study perceived scientists as credible experts depending on whether the scientists’ conclusions supported the position favored by members of the subjects’ cultural group or the one favored by the members of a rival one on climate change, nuclear power, and gun control.

Not very good explanation # 1: Science denialism

“Science denialism” posits that we see disputes over risks in the US because a significant portion of the population doesn’t accept the authority of science as a guide for policymaking.

The same study of the cultural cognition of scientific consensus suggests that this isn’t so. No cultural group favors policies that diverge from scientific consensus on climate change, nuclear power, or gun control. But as a result of identity-protective cognition, groups are culturally polarized over what the scientific consensus is on those issues.

Moreover, no group is any better at discerning what scientific consensus is than any other. Ones that seem to have it right on, say, climate change are the most likely to get it wrong on deep geologic isolation of nuclear wastes, and vice versa.

Not very good explanation #2: Misinformation

I certainly don’t dispute that there’s a lot of misinformation out there. But I do question whether it’s causing public controversy over policy-relevant science. Indeed, causation likely runs the other way.

Again, consider our scientific consensus study. If the sort of “biased sampling” we observed in our subjects is typical of the way people outside the lab assess evidence on culturally contested issues, there won’t be any need to mislead them: they’ll systematically misinform themselves on the state of scientific opinion.

Still, we can be sure they’ll very much appreciate the efforts of anyone who is willing to help them out. Thus, their motivation to find evidence supportive of erroneous but culturally congenial beliefs will spawn a cadre of misinformers, who will garner esteem and profit rather than ridicule for misrepresenting what’s known to science.

The “misinformation thesis” has got things upside down.

Not very good explanation #3: “Bounded rationality”