Thursday, March 27, 2014

The sources of evidence-free science communication practices--a fragment...

From something I'm working on...

Problem statement. Our motivating premise is that advancement of enlightened conservation policymaking  depends on addressing the science communication problem. That problem consists in the failure of valid, compelling, and widely accessible scientific evidence to dispel persistent public conflict over policy-relevant facts to which that evidence directly speaks. As spectacular and admittedly consequential as instances of this problem are, states of entrenched public confusion about decision-relevant science are in fact quite rare. They are not a consequence of constraints on public science comprehension, a creeping “anti-science” sensibility in U.S. society, or the sinister acumen of professional misinformers.  Rather they are the predictable result of a societal failure to integrate two bodies of scientific knowledge: that relating to the effective management of collective resources; and that relating to the effective management of the processes by which ordinary citizens reliably come to know what is known (Kahan 2010, 2012, 2013).

The study of public risk perception and risk communication dates back to the mid-1970s, when Paul Slovic, Sarah Lichtenstein, Daniel Kahneman, Amos Tversky, and Baruch Fischhoff began to apply the methods of cognitive psychology to investigate conflicts between lay and expert opinion on the safety of nuclear power generation and various other hazards (e.g., Slovic, Fischhoff & Lichtenstein 1977, 1979; Kahneman, Slovic & Tversky 1982).  In the decades since, these scholars and others building on their research have constructed a vast and integrated system of insights into the mechanisms by which ordinary individuals form their understandings of risk and related facts. This body of knowledge details not merely the vulnerability of human reason to recurring biases, but also the numerous and robust processes that ordinarily steer individuals away from such hazards, the identifiable and recurring influences that can disrupt these processes, and the means by which risk-communication professionals (from public health administrators to public interest groups, from conflict mediators to government regulators) can anticipate and avoid such threats and attack and dissipate them when such preemptive strategies fail (e.g., Fischhoff & Scheufele 2013; Slovic 2010, 2000; Pidgeon, Kasperson & Slovic 2003; Gregory, McDaniels & Field 2001; Gregory & Wellman 2001).

Astonishingly, however, the practice of science and science-informed policymaking has remained largely innocent of this work. The persistently uneven success of resource-conservation stakeholder proceedings, the sluggish response of local and national governments to the challenges posed by climate change, and the continuing emergence of new public controversies such as the one over fracking—all are testaments (as are myriad comparable misadventures in the domain of public health) to the persistent failure of government institutions, NGOs, and professional associations to incorporate the science of science communication into their efforts to promote constructive public engagement with the best available evidence on risk.

This disconnection can be attributed to two primary sources.  The first is cultural: the actors most responsible for promoting public acceptance of evidence-based conservation policymaking do not possess a mature comprehension of the necessity of evidence-based practices in their own work.  For many years, the work of conservation policymakers, analysts, and advocates has been distorted by the more general societal misconception that scientific truth is “manifest”—that because science treats empirical observation as the sole valid criterion for ascertaining truth, the truth (or validity) of insights gleaned by scientific methods is readily observable to all, making it unnecessary to acquire and use empirical methods to promote its public comprehension (Popper 1968).

Dispelled to some extent by the shock of persistent public conflict over climate change, this fallacy has now given way to a stubborn misapprehension about what it means for science communication to be truly evidence based. In investigating the dynamics of public risk perception, the decision sciences have amassed a deep inventory of highly diverse mechanisms ("availability cascades," "probability neglect," "framing effects," "fast/slow information processing," etc.). Using these mechanisms as expositional templates, any reasonably thoughtful person can construct a plausible-sounding "scientific" account of the challenges that constrain the communication of decision-relevant science (e.g., XXXX 2007, 2006, 2005). But because more surmises about the science communication problem are plausible than are true, this form of story-telling cannot produce insight into its causes and cures. Only gathering and testing empirical evidence can.

Sadly, some empirical researchers have contributed to the failure of practical communicators to appreciate this point. These scholars purport to treat general opinion surveys and highly stylized lab experiments as sources of concrete guidance for actors involved in communicating science relevant to risk regulation or related policy issues (e.g., XXX 2009). Such methods have yielded indispensable insight into general mechanisms of consequence to science communication. But they do not—because they cannot—furnish insight into how to engage these mechanisms in the particular settings in which science must be communicated. The number of plausible surmises about how to reproduce in the field results that have been observed in the lab likewise exceeds the number that are true. Again, empirical observation and testing are necessary—in the field, for this purpose. The scarcity of researchers willing to engage in field-based research, and the reluctance of many to acknowledge candidly the necessity of doing so, have stifled the emergence of a genuinely evidence-based approach to the promotion of public engagement with decision-relevant science (Kahan 2014).

[Photo: Paul Slovic & Sarah Lichtenstein, risk perception field research, Las Vegas, 1969]
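
To make the distinction concrete, here is a minimal, purely illustrative sketch (not drawn from the post; the arm labels and counts are hypothetical) of what "gathering and testing empirical evidence" about a communication surmise might look like in the field: a two-arm message trial analyzed with a simple two-proportion z-test.

# Illustrative only: a minimal two-arm field test of a candidate science-communication
# message. The counts below are made up; the point is that the surmise "message B
# works better" is settled by observation, not by how plausible the story sounds.
from math import sqrt
from statistics import NormalDist

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-sided z-test for a difference in proportions between two message arms."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_a, p_b, z, p_value

# Hypothetical field data: residents who engaged with the material after receiving
# the standard flyer (arm A) vs. a reframed flyer (arm B).
p_a, p_b, z, p = two_proportion_z(successes_a=112, n_a=400, successes_b=149, n_b=410)
print(f"arm A: {p_a:.1%}  arm B: {p_b:.1%}  z = {z:.2f}  p = {p:.3f}")

The particular test matters less than the posture: the claim that the reframed material works better in that setting stands or falls on the observed result, not on the plausibility of the mechanism invoked to predict it.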

The second source of the disconnect between the practice of science and science-informed policymaking, on the one hand, and the science of science communication, on the other, is practical: the integration of the two is constrained by a collective action problem.  The generation of information relevant to the effective communication of decision-relevant science—including not only empirical evidence of what works and what does not but also practical knowledge of the processes for adapting and extending it in particular circumstances—is a public good.  Its benefits are not confined to those who invest the time and resources to produce it but extend as well to any who thereafter have access to it.  Under these circumstances, it is predictable that producers, constrained by their own limited resources and attentive only to their own particular needs, will not invest as much in producing such information, and in a form amenable to the dissemination and exploitation of it by others, as would be socially desirable.  As a result, instead of progressively building on their successive efforts, each initiative that makes use of evidence-based methods to promote effective public engagement with conservation-relevant science will be constrained to struggle anew with the recurring problems.
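
The underinvestment logic can be illustrated with a toy public-goods calculation; this is my own sketch, not the author's model, and the payoff function and numbers are assumptions chosen only to show the shape of the problem.

# Toy model (illustrative, not from the text): N organizations each choose how much
# effort c_i to invest in producing shareable science-communication knowledge.
# Every unit invested benefits all N organizations by b, but the quadratic cost
# c_i**2 is borne privately -- a stylized public-goods setup.

def private_best_effort(b: float) -> float:
    # Each organization maximizes  b * (c_i + others) - c_i**2,
    # so it sets b - 2*c_i = 0 regardless of what others do.
    return b / 2

def socially_optimal_effort(b: float, n: int) -> float:
    # A planner maximizes total payoff  n*b*(sum of c) - sum(c_j**2),
    # so each organization should set n*b - 2*c_i = 0.
    return n * b / 2

b, n = 1.0, 20
print(f"self-interested effort per organization: {private_best_effort(b):.2f}")
print(f"socially optimal effort per organization: {socially_optimal_effort(b, n):.2f}")
# With 20 organizations, the socially optimal investment is 20x the individually
# rational one -- the predicted underinvestment described in the text.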

This proposal would attack both of these sources of the persistent inattention to the science of science communication....

References

Fischhoff, B. & Scheufele, D.A. The science of science communication. Proceedings of the National Academy of Sciences 110, 14031-14032 (2013).

Gregory, R. & McDaniels, T. Improving Environmental Decision Processes. in Decision making for the environment: social and behavioral science research priorities (eds. G.D. Brewer & P.C. Stern) 175-199 (National Academies Press, Washington, DC, 2005).

Gregory, R., McDaniels, T. & Fields, D. Decision aiding, not dispute resolution: Creating insights through structured environmental decisions. Journal of Policy Analysis and Management 20, 415-432 (2001).

Kahan, D. Fixing the Communications Failure. Nature 463, 296-297 (2010).

Kahan, D. Making Climate-Science Communication Evidence Based—All the Way Down. In Culture, Politics and Climate Change: How Information Shapes Our Common Future, eds. M. Boykoff & D. Crow. (Routledge Press, 2014).

Kahan, D. Why we are poles apart on climate change. Nature 488, 255 (2012).

Kahan, D.M. A Risky Science Communication Environment for Vaccines. Science 342, 53-54 (2013).

Kahneman, D., Slovic, P. & Tversky, A. Judgment under uncertainty: heuristics and biases (Cambridge University Press, Cambridge; New York, 1982).

Pidgeon, N.F., Kasperson, R.E. & Slovic, P. The social amplification of risk (Cambridge University Press, Cambridge; New York, 2003).

Popper, K.R. Conjectures and refutations: the growth of scientific knowledge (Harper & Row, New York, 1968).

Slovic, P. The feeling of risk: new perspectives on risk perception (Earthscan, London; Washington, DC, 2010).

Slovic, P. The perception of risk (Earthscan Publications, London; Sterling, VA, 2000).

Slovic, P., Fischhoff, B. & Lichtenstein, S. Behavioral decision theory. Annu Rev Psychol 28, 1-39 (1977).

Slovic, P., Fischhoff, B. & Lichtenstein, S. Rating the risks. Environment: Science and Policy for Sustainable Development 21, 14-39 (1979).


Reader Comments (3)

A small point, maybe just semantic, but I don’t think so; 'conservation’ policy making evokes a narrow spectrum of issues, to me…about species and habitat and that sort of thing. It doesn’t capture the breadth of the issues to which I think you’re referring.

I think a core problem lies in the assumption, which still seems implicit in the goal of effective science communication, that there is a 'right' or 'true' or 'actual' reality about "valid, compelling, and widely accessible scientific evidence" that good communication can help people all see in some common way. It still seems to include the assumption that 'scientific truth is manifest', or at least, that good science communication can make this "truth" shine through and win people's hearts and minds. Yes, good framing and respect for cultural congeniality certainly can reduce the polarization around some issues…the "persistent public conflict over policy-relevant facts". But about this thing called 'evidence'…there is ultimately no such thing, only the way we each interpret the facts we have through the instincts and heuristics and biases and feelings and experiences and group affiliations and…through the Affect Heuristic that colors how we see EVERYTHING. As Antonio Damasio makes a powerful case for in Descartes' Error (a book Paul Slovic finds really foundational, and so do I), all perception is inescapably subjective.

The evidence for this seems pretty strong to me, ergo my quarrel with your case that "states of entrenched public confusion about decision-relevant science are in fact quite rare." The polarization is rare, yes. But good lord there sure seems to be a lot of confusion about decision-relevant facts, on tons of issues where CC-based investigation finds no polarization. There are people who dispute overwhelming evidence about: vaccines, fluoride, GMOs, the age of the Earth, evolution, homeopathy, the health effects of living near wind turbines, how effectively they can drive when they have drunk quantifiably too much alcohol, the greater likelihood of pregnancy or sexually transmitted diseases when engaging in intercourse with no barrier protection, smoking, obesity…and on and on and on. In fact, it seems far more common that a gap exists, often a chasm, between what some people believe and what the facts seem to say. That is not 'conflict', but it is deeply entrenched confusion about decision-relevant science.

So the big question here is…what do you mean by 'science communication'? What is the goal? If the idea is to communicate in ways that reduce conflict, your focus and goal…a goal I applaud and share…that's one thing. But if it is to help people understand the facts more 'correctly', a broader goal also critical to helping society make more intelligent and healthy policy choices on controversial risk issues (including those that may not have reached the status of polarization/conflict), then I think we have to be humble and respect what the study of cognition seems to say…that because of the affective way we perceive everything, no matter how much research is done on how to communicate about scientific evidence to help people understand it 'accurately', 'accurately' will ultimately remain in the eye…the heart…of the beholder.

--

You are absolutely right that efforts to communicate about risk evidence too often fail to apply the "vast and integrated system of insights into the mechanisms by which ordinary individuals form their understandings of risk and related facts", insights into risk perception which can empower more effective risk science communication. I encounter a lot of people who say they do/teach/consult in risk communication, who mostly overlook – or just plain don't know about – all the evidence that explains why people feel the way they do about risk issues. Those insights absolutely DO need to be understood and applied in order to communicate more effectively. And specific communication programs on specific issues for specific audiences need to be researched and tested, the way the Mental Models approach of Bostrom et al. recommends: http://books.google.com/books/about/Risk_Communication.html?id=kfM1OZNjeAwC
(My ever-evolving definition of risk communication tries to capture this: "Actions, words, and other interactions that demonstrate an understanding of and respect for the feelings of the intended audience, intended to establish trust and thereby increase the influence that the information being conveyed will have on the judgments and choices of the audience.")

But here too we get back to the question of goals. We have to be humble about how far we think we can get helping everybody get the facts right…to see the manifest truth in scientific evidence. That can't be achieved. I don't even claim to practice risk or science communication. I call what I teach 'risk relationship management', which is an effort to establish more positive (or at least less contentious and negative) working relationships, which can bring disagreeing parties closer to a shared perspective on a controversial risk issue, and thereby toward progress and resolution. Sort of like your effort to reduce conflict and polarization.

March 29, 2014 | Unregistered Commenter david ropeik

Well, nobody agrees or objects to my simplified scenario, so I will try again: The solution to global warming (for example) begs a "public" or "centralized control" solution. Liberals feel empowered by centralized control, conservatives feel disempowered. Conservatives thus tend to downplay the evidence of global warming; liberals tend to exaggerate it. This dispute will not be resolved (as noted above) by more solid and well-communicated science facts. It's a political problem and needs a political solution. Conservatives are not generally pro-pollution any more than liberals are generally against making responsible use of natural energy resources, and this is the common ground. I don't have a solution, but whatever it is, it must enhance or reduce the political power of both sides more or less equally. I think the blinders on both sides would tend to disappear if this were to happen. Energy spent by either side to "win" the political battle is wasted energy, only adding to the pollution of the science communication environment.

April 1, 2014 | Unregistered Commenter FrankL

@FrankL:

I agree w/ that -- although it seems to beg the question why nuclear (lots of centralized control there, both b/c of safety & b/c energy production involves natural monopolies aplenty) is low for "conservs" & high for "libs," & why "lib" control valuers aren't enticed by the opportunity to regulate "deviant" sexual practices to see homosexuality as a source of disease or threat to children etc. -- the "conservative" position...

But how about taking up a real challenge -- explain a Ludwick, pls

April 2, 2014 | Registered Commenter Dan Kahan
