
A science of science communication manifesto ... a fragment 

From something I'm working on . . . .

Our motivating premise is that the advancement of enlightened policymaking depends on addressing the science communication problem. That problem consists in the failure of valid, compelling, and widely accessible scientific evidence to dispel persistent public conflict over policy-relevant facts to which that evidence directly speaks. As spectacular and admittedly consequential as instances of this problem are, entrenched public confusion about decision-relevant science is in fact quite rare. Such instances are not a consequence of constraints on public science comprehension, a creeping “anti-science” sensibility in U.S. society, or the sinister acumen of professional misinformers. Rather they are the predictable result of a societal failure to integrate two bodies of scientific knowledge: that relating to the effective management of collective resources, and that relating to the effective management of the processes by which ordinary citizens reliably come to know what is known (Kahan 2010, 2012, 2013).

The study of public risk perception and risk communication dates back to the mid-1970s, when Paul Slovic, Daniel Kahneman, Amos Tversky, and Baruch Fischhoff began to apply the methods of cognitive psychology to investigate conflicts between lay and expert opinion on the safety of nuclear power generation and various other hazards (e.g., Slovic, Fischhoff & Lichtenstein 1977, 1979; Kahneman, Slovic & Tversky 1982). In the decades since, these scholars and others building on their research have constructed a vast and integrated system of insights into the mechanisms by which ordinary individuals form their understandings of risk and related facts. This body of knowledge details not merely the vulnerability of human reason to recurring biases, but also the numerous and robust processes that ordinarily steer individuals away from such hazards, the identifiable and recurring influences that can disrupt these processes, and the means by which risk-communication professionals (from public health administrators to public interest groups, from conflict mediators to government regulators) can anticipate and avoid such threats and attack and dissipate them when such preemptive strategies fail (e.g., Fischhoff & Scheufele 2013; Slovic 2010, 2000; Pidgeon, Kasperson & Slovic 2003; Gregory & McDaniels 2005; Gregory, McDaniels & Fields 2001).

Astonishingly, however, the practice of science and science-informed policymaking has remained largely innocent of this work.  The persistently uneven success of resource-conservation stakeholder proceedings, the sluggish response of localities to the challenges posed by climate change, and the continuing emergence of new public controversies such as the one over fracking—all are testaments (as are myriad comparable misadventures in the domain of public health) to the persistent failure of government institutions, NGOs, and professional associations to incorporate the science of science communication into their efforts to promote constructive public engagement with the best available evidence on risk.

This disconnect can be attributed to two primary sources.  The first is cultural: the actors most responsible for promoting public acceptance of evidence-based policymaking do not possess a mature comprehension of the necessity of evidence-based practices in their own work.  For many years, the work of policymakers, analysts, and advocates has been distorted by the more general societal misconception that scientific truth is “manifest”—that because science treats empirical observation as the sole valid criterion for ascertaining truth, the truth (or validity) of insights gleaned by scientific methods is readily observable to all, making it unnecessary to acquire and use empirical methods to promote its public comprehension (Popper 1968).

Dispelled to some extent by the shock of persistent public conflict over climate change, this fallacy has now given way to a stubborn misapprehension about what it means for science communication to be genuinely evidence based.  In investigating the dynamics of public risk perception, the decision sciences have compiled a deep inventory of highly diverse mechanisms (“availability cascades,” “probability neglect,” “framing effects,” “fast/slow information processing,” etc.). Using these as expositional templates, any reasonably thoughtful person can construct a plausible-sounding “scientific” account of the challenges that constrain the communication of decision-relevant science. But because more surmises about the science communication problem are plausible than are true, this form of storytelling cannot produce insight into its causes and cures. Only gathering and testing empirical evidence can.

Some empirical researchers have themselves contributed to the failure of practical communicators to appreciate this point. These scholars purport to treat general opinion surveys and highly stylized lab experiments as sources of concrete guidance for actors involved in promoting public engagement with information relevant to particular risk-regulation or related policy issues. Even when such methods generate insight into general mechanisms of consequence, they do not—because they cannot—yield insight into how those mechanisms can be brought to bear in particular circumstances.  Here too, the number of plausible surmises about how to reproduce in the field results that have been observed in the lab exceeds the number that truly will do so. Again, empirical observation and testing are necessary—now in the field.  The scarcity of researchers willing to engage in field-based research, and the reluctance of many to acknowledge candidly the necessity of doing so, have stifled the emergence of a genuinely evidence-based approach to the promotion of public engagement with decision-relevant science (Kahan 2014).

The second source of the disconnect between the practice of science and science-informed policymaking, on the one hand, and the science of science communication, on the other, is practical: the integration of the two is constrained by a collective action problem.  The generation of information relevant to the effective communication of decision-relevant science—including not only empirical evidence of what works and what does not but practical knowledge of the processes for adapting and extending it in particular circumstances—is a public good.  Its benefits are not confined to those who invest the time and resources to produce it but extend as well to any who thereafter have access to it.  Under these circumstances, it is predictable that producers, constrained by their own limited resources and attentive only to their own particular needs, will not invest as much in producing such information, and in a form amenable to dissemination and exploitation by others, as would be socially desirable.  As a result, instead of progressively building on one another's efforts, each initiative that makes use of evidence-based methods to promote effective public engagement with policy-relevant science will be constrained to struggle anew with the same recurring problems.

Fischhoff, B. & Scheufele, D.A. The science of science communication. Proceedings of the National Academy of Sciences 110, 14031-14032 (2013).

Gregory, R. & McDaniels, T. Improving environmental decision processes. In Decision Making for the Environment: Social and Behavioral Science Research Priorities (eds. G.D. Brewer & P.C. Stern) 175-199 (National Academies Press, Washington, DC, 2005).

Gregory, R., McDaniels, T. & Fields, D. Decision aiding, not dispute resolution: Creating insights through structured environmental decisions. Journal of Policy Analysis and Management 20, 415-432 (2001).

Kahan, D. Fixing the communications failure. Nature 463, 296-297 (2010).

Kahan, D. Making climate-science communication evidence based—all the way down. In Culture, Politics and Climate Change: How Information Shapes Our Common Future (eds. M. Boykoff & D. Crow) (Routledge, 2014).

Kahan, D. Why we are poles apart on climate change. Nature 488, 255 (2012).

Kahan, D.M. A risky science communication environment for vaccines. Science 342, 53-54 (2013).

Kahneman, D., Slovic, P. & Tversky, A. Judgment Under Uncertainty: Heuristics and Biases (Cambridge University Press, Cambridge; New York, 1982).

Pidgeon, N.F., Kasperson, R.E. & Slovic, P. The Social Amplification of Risk (Cambridge University Press, Cambridge; New York, 2003).

Popper, K.R. Conjectures and Refutations: The Growth of Scientific Knowledge (Harper & Row, New York, 1968).

Slovic, P. The Feeling of Risk: New Perspectives on Risk Perception (Earthscan, London; Washington, DC, 2010).

Slovic, P. The Perception of Risk (Earthscan, London; Sterling, VA, 2000).


Reader Comments (13)

link drop:

February 6, 2018 | Unregistered CommenterJonathan

"link drop:"

Yep. Seen it before. Strawman fallacy. Plus at least one outright lie.

See also:

February 6, 2018 | Unregistered CommenterNiV

asymmetric link drop:

February 6, 2018 | Unregistered CommenterJonathan

Methinks the definition of "junk" might be an obstacle for a few folks.

Just a hunch.

February 6, 2018 | Unregistered CommenterJoshua

link drop:

(no non-paywall version of the embedded paper)

February 7, 2018 | Unregistered CommenterJonathan

The Oxford study uses the methodology of the drunk looking for his keys near a lamppost. Passerby tries to help for a while, finally asks "Are you sure you dropped them near here?" Drunk says, "No, I dropped them way over there, but here the light is better."

The alt-right fled both Facebook and Twitter eons ago; a few were expelled for PC reasons, but most left in disgust at the algorithmic abuse promoting tired PC memes. The "junk" promoters left on those websites are either provocateurs, intent on wasting SJW time and resources - the list of funders for the Oxford study is fairly comprehensive - or legitimate alt-right political actors like Nehlen, posting on the old media to distract from their true action networks obviously operating beyond the scope of those foolishly ignorant researchers.

So I must take exception to Dan's
"....As spectacular and admittedly consequential as instances of this problem are, entrenched public confusion about decision-relevant science is in fact quite rare. They are not a consequence of constraints on public science comprehension, a creeping “anti-science” sensibility in U.S. society, or the sinister acumen of professional misinformers."

"Sinister acumen of professional misinformers" is precisely what is guiding the avalanche of the junk news flooding the old media - the purpose is to insidiously, but mercilessly and utterly, destroy all public credibility of "experts" who have embraced PC causes. Guns, climate change, open borders, trade deals, and on and on. The more data these experts try to add to their argument the more brutally they will be attacked.

I do not believe there is anything worth saving on the other side of the divide. Nobody on the alt-right does.

So instead of looking for lost keys where they are not, look at places where those Oxford fools didn't, like our IPs, which, same as the SJWs, are concentrated on the 2 coasts and on large metro areas in between. They are not in flyover country as Obama claimed - but hardly anything he ever said was true, so no surprise there. Here is an IP map (credit: Fashmaps - love the name, btw!)

February 7, 2018 | Unregistered CommenterEcoute Sauvage

More links:

Asheley Landrum is one of Dan's co-conspirators from the HBV/HPV, Zika and SC papers. Which makes it interesting that she co-authored this blatant OSI-non-denialism: (no non-paywall version)

February 7, 2018 | Unregistered CommenterJonathan

Stumbled over this from last summer:

February 7, 2018 | Unregistered CommenterJonathan

Dan -

Better knowledge of evolution leads to greater acceptance of the concept

What's up with that?

February 7, 2018 | Unregistered CommenterJoshua

Dan -

hmmm. This too?

Using a new demographically representative survey (N = 1100) that includes a detailed measure of evolution knowledge, we find that knowledge predicts level of acceptance, even after accounting for the effects of religion and politics. These results demonstrate that Americans’ views on evolution are significantly influenced by their knowledge about this theory and therefore might be amenable to change.

What's up with that, also?

February 7, 2018 | Unregistered CommenterJoshua

Jonathan -

Really enjoyed that "Theconversation" article. Thanks.

February 7, 2018 | Unregistered CommenterJoshua


That problem consists in the failure of valid, compelling, and widely accessible scientific evidence to dispel persistent public conflict over policy-relevant facts to which that evidence directly speaks. As spectacular and admittedly consequential as instances of this problem are, entrenched public confusion about decision-relevant science is in fact quite rare.

I'm going to try to parse these challenging-to-parse sentences.

A confluence of several factors has to come together.
1) valid, compelling, and widely accessible scientific evidence -- creates "facts"
2) There is a public conflict over "facts"
3) The facts from 1) directly speaks to the facts needed in 2)

Next sentence: This confluence rarely happens

How am I doing?? Is this what you were trying to say?

February 8, 2018 | Unregistered CommenterCortlandt

Cortlandt - any chance you can replace "confluence" with something countable (as edges of fluid dynamics equations are not) and preferably binary? Noise v. signal or similar discrete measurement would do.

Example from unrelated news:
This is bad news to the posters cited in The Hill, but an occasion for celebration in the (not-cited) opposite camp. The dead black-lives-matter activist had previously achieved notoriety for grabbing a confederate flag from a demonstrator after jumping over a police barrier.

Facts do not appear to be doubted by either group. Their interpretation clearly is. Consequently, I wonder if Dan's definition would benefit by an extension to epistemology generally, ie not limited to the sciences.

February 8, 2018 | Unregistered CommenterEcoute Sauvage
