I. An introductory concept: the “science communication environment”
In order to live well (really, just to live), all individuals (all of them—even scientists!) must accept as known by science vastly more information than they could ever hope to attain or corroborate on their own. Do antibiotics cure strep throat (“did mine”)? Does vitamin C (“did mine”)? Does smoking cause cancer (“. . . happened to my uncle”)? Do childhood vaccinations cause autism (“. . . my niece”)? Does climate change put us at risk (“Yes! Hurricane Sandy destroyed my house!”)? How about legalizing gay marriage (“Yes! Hurricane Sandy destroyed my house!”)?
The expertise individuals need to make effective use of decision-relevant science consists less in understanding particular bodies of specialized knowledge than in recognizing what has been validly established by other people—countless numbers of them—using methods that no one person can hope to master in their entirety or verify have been applied properly in all particular instances. A foundational element of human rationality thus necessarily consists in the capacity to reliably identify who knows what about what, so that we can orient our lives to exploit genuine empirical insight and, just as importantly, steer clear of specious claims being passed off by counterfeiters or by those trading in the valueless currency of one or another bankrupt alternative to science’s way of knowing (Keil 2010).
Individuals naturally tend to make use of this collective-knowledge recognition capacity within particular affinity groups whose members hold the same basic values (Watson, Kumar & Michaelsen 1993). People get along better with those who share their cultural outlooks, and can thus avoid the distraction of squabbling. They can also better “read” those who “think like them”—and thus more accurately figure out who really knows what they are talking about, and who is simply BS’ing. Because all such groups are amply stocked with intelligent people whose knowledge derives from science, and possess well-functioning processes for transmitting what their members know about what’s collectively known, culturally diverse individuals tend to converge on the best available evidence despite the admitted insularity of this style of information seeking.
The science communication environment comprises the sum total of the everyday cues and processes that these plural communities of certification supply their members to enable them to reliably orient themselves with regard to valid collective knowledge. Damage to this science communication environment—any influence that disconnects these cues and processes from the collective knowledge that science creates—poses a threat to individual and collective well-being every bit as significant as damage to the natural environment.
Persistent public conflict over climate change is a consequence of one particular form of damage to the science communication environment: the entanglement of societal risks with antagonistic cultural meanings that transform positions on them into badges of membership in and loyalty to opposing cultural groups (Kahan 2012). When that happens, the stake individuals have in maintaining their standing within their group will often dominate whatever stake they have in forming accurate beliefs. Because nothing an ordinary member of the public does—as consumer, voter, or public advocate—will have a material impact on climate change, any mistake that person makes about its sources or consequences will not actually increase the risk that climate change poses to that person or anyone he or she cares about. But given what people now understand positions on climate change to signify about others’ character and reliability, forming a view out of line with those in one’s group can have devastating consequences, emotional as well as material. In these circumstances individuals will face strong pressure to adopt forms of engaging information—whether it relates to what most scientists believe (Kahan, Jenkins-Smith & Braman 2011) or even whether the temperature in their locale has been higher or lower than usual in recent years (Goebbert, Jenkins-Smith, et al. 2012)—that more reliably connect them to their group than to the position that is most supported by scientific evidence.
Indeed, those members of the public who possess the most scientific knowledge and the most developed capacities for making sense of empirical information are the ones in whom this “myside bias” is likely to be the strongest (Kahan, Peters, et al. 2012; Stanovich & West 2007). Under these pathological circumstances, such individuals can be expected to use their knowledge and abilities to search out forms of identity-supportive evidence that would likely evade the attention of others in their group, and to rationalize away identity-threatening forms that others would be saddled with accepting. Confirmed experimentally (Kahan 2013a; Kahan, Peters, Dawson & Slovic 2013), the power of critical reasoning dispositions to magnify culturally biased assessments of evidence explains why those members of the public who are highest in science literacy and quantitative reasoning ability are in fact the most culturally polarized on climate change risks. Because these individuals play a critical role in certifying what is known to science within their cultural groups, their errors propagate and percolate through their communities, creating a state of persistent collective confusion.
The entanglement of risks and like facts with culturally antagonistic meanings is thus a form of pollution in the science communication environment. It literally disables the faculties of reasoning that ordinary members of the public rely on—ordinarily to good effect—in discerning what is known to science, and frustrates the common stake they have in recognizing how decision-relevant science bears on their individual and collective interests. It thus deprives them, and their society, of the value of what is collectively known and of the investment they have made in their own ability to generate, recognize, and use that knowledge.
Protecting the science communication environment from such antagonistic meanings is thus an essential element of effective science communication and, indeed, of enlightened self-government (Kahan 2013b). Because the entanglement of positions on risk with cultural identity impels ordinary members of the public to use their knowledge and reason to resist evidence at odds with their groups’ views, nothing one does to make scientific information more accessible or widely distributed can be expected to counteract the forms of group polarization that this toxin generates.
Goebbert, K., Jenkins-Smith, H.C., Klockow, K., Nowlin, M.C. & Silva, C.L. Weather, Climate and Worldviews: The Sources and Consequences of Public Perceptions of Changes in Local Weather Patterns. Weather, Climate, and Society (2012).
Kahan, D. Why We Are Poles Apart on Climate Change. Nature 488, 255 (2012).
Kahan, D.M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L.L., Braman, D. & Mandel, G. The Polarizing Impact of Science Literacy and Numeracy on Perceived Climate Change Risks. Nature Climate Change 2, 732-735 (2012).
Keil, F.C. The Feasibility of Folk Science. Cognitive Science 34, 826-862 (2010).
Watson, W.E., Kumar, K. & Michaelsen, L.K. Cultural Diversity's Impact on Interaction Process and Performance: Comparing Homogeneous and Diverse Task Groups. The Academy of Management Journal 36, 590-602 (1993).