The quality of the science communication environment and the vitality of reason

The Motivated Numeracy and Enlightened Self-Government working paper has apparently landed in the middle of an odd, ill-formed debate over the “knowledge deficit theory” and its relevance to climate-science communication. I’m not sure, actually, what that debate is about or who is involved. But I do know that any discussion framed around the question “Is the knowledge-deficit theory valid?” is too simple to generate insight. There are indeed serious, formidable contending accounts of the nature of the “science communication problem”—the failure of citizens to converge on the best available evidence on the dangers they face and the efficacy of measures to abate them. The antagonists in any “knowledge-deficit debate” will at best be stick-figure representations of these positions.

Below is an excerpt from the concluding sections of the MNESG paper. It reflects how I see the study findings as contributing to the position I find most compelling in the scholarly discussion most meaningfully engaged with the science communication problem. The excerpt can’t by itself supply a full account of the nature of the contending positions and the evidence on which they rest (none is wholly without support). But for those who are motivated to engage the genuine and genuinely difficult questions involved, the excerpt might help to identify for them paths of investigation that will lead them to locations much more edifying than the ones in which the issue of “whether the knowledge deficit theory is valid” is thought to be a matter worthy of discussion.

5.2. Ideologically motivated cognition and dual process reasoning generally

The ICT hypothesis corroborated by the experiment in this paper conceptualizes Numeracy as a disposition to engage in deliberate, effortful System 2 reasoning as applied to quantitative information. The results of the experiment thus help to deepen insight into the ongoing exploration of how ideologically motivated reasoning interacts with System 2 information processing generally.

As suggested, dual process reasoning theories typically posit two forms of information processing: a “fast, associative” one “based on low-effort heuristics”, and a “slow, rule based” one that relies on “high-effort systematic reasoning” (Chaiken & Trope 1999, p. ix). Some researchers have assumed (not unreasonably) that ideologically motivated cognition—the tendency selectively to credit or discredit information in patterns that gratify one’s political or cultural predispositions—reflects over-reliance on the heuristic-driven, System 1 style of information processing (e.g., Lodge & Taber 2013; Marx et al. 2007; Westen, Blagov, Harenski, Kilts, & Hamann, 2006; Weber & Stern 2011; Sunstein 2006).

There is mounting evidence that this assumption is incorrect. It includes observational studies that demonstrate that science literacy, numeracy, and education (Kahan, Peters, Wittlin, Slovic, Ouellette, Braman & Mandel 2012; Hamilton 2012; Hamilton 2011)—all of which it is plausible to see as elements or outgrowths of the critical reasoning capacities associated with System 2 information processing—are associated with more, not less, political division of the kind one would expect if individuals were engaged in motivated reasoning.

Experimental evidence points in the same direction. Individuals who score higher on the Cognitive Reflection Test, for example, have shown an even stronger tendency than ones who score lower to credit evidence selectively in patterns that affirm their political outlooks (Kahan 2013). The evidence being assessed in that study was nonquantitative but involved a degree of complexity that was likely to obscure its ideological implications from subjects inclined to engage the information in a casual or heuristic fashion. The greater polarization of subjects who scored highest on the CRT was consistent with the inference that individuals more disposed to engage systematically with information would be more likely to discern the political significance of it and would use their critical reasoning capacities selectively to affirm or reject it conditional on its congeniality to their political outlooks.

The experimental results we report in this paper display the same interaction between motivated cognition and System 2 information processing. Numeracy predicts how likely individuals are to resort to more systematic as opposed to heuristic engagement with quantitative information essential to valid causal inference. The results in the gun-ban conditions suggest that high Numeracy subjects made use of this System 2 reasoning capacity selectively, in a pattern consistent with their motivation to form a politically congenial interpretation of the results of the gun-ban experiment. This outcome is consistent with the view of scholars who see both systematic (or System 2) and heuristic (System 1) reasoning as vulnerable to motivated cognition (Cohen 2003; Giner-Sorolla & Chaiken 1997; Chen, Duckworth & Chaiken 1999).

These findings also bear on whether ideologically motivated cognition is usefully described as a manifestation of “bounded rationality.” Cognitive biases associated with System 1 reasoning are typically characterized that way on the ground that they result from over-reliance on heuristic patterns of information processing that reflect generally adaptive but still demonstrably inferior substitutes for the more effortful and more reliable type of information processing associated with System 2 reasoning (e.g., Kahneman 2003; Jolls, Sunstein & Thaler 1998).

We submit that a form of information processing cannot reliably be identified as “irrational,” “subrational,” “boundedly rational” or the like independent of an individual’s aims in making use of information. It is perfectly rational, from an individual-welfare perspective, for individuals to engage decision-relevant science in a manner that promotes culturally or politically congenial beliefs. Making a mistake about the best available evidence on an issue like climate change, nuclear waste disposal, or gun control will not increase the risk an ordinary member of the public faces, while forming a belief at odds with the one that predominates on that issue within important affinity groups of which he or she is a member could expose him or her to an array of highly unpleasant consequences (Kahan 2012). Forms of information processing that reliably promote the stake individuals have in conveying their commitment to identity-defining groups can thus be viewed as manifesting what Anderson (1993) and others (Cohen 2003; Akerlof and Kranton 2000; Hillman 2010; Lessig 1995) have described as expressive rationality.

If ideologically motivated reasoning is expressively rational, then we should expect those individuals who display the highest reasoning capacities to be the ones most powerfully impelled to engage in it (Kahan et al. 2012). This study now joins the ranks of a growing list of studies that fit this expectation and that thus support the interpretation that ideologically motivated reasoning is not a form of bounded rationality but instead a sign of how it becomes rational for otherwise intelligent people to use their critical faculties when they find themselves in the unenviable situation of having to choose between crediting the best available evidence or simply being who they are.

6. Conclusion: Protecting the “science-communication environment”

To conclude that ideologically motivated reasoning is expressively rational obviously does not imply that it is socially or morally desirable (Lessig 1995). Indeed, the implicit conflation of individual rationality and collective wellbeing has long been recognized to be a recipe for confusion, one that not only distorts inquiry into the mechanisms of individual decisionmaking but also impedes the identification of social institutions that remove any conflict between those mechanisms and attainment of the public good (Olson 1965). Accounts that misunderstand the expressive rationality of ideologically motivated cognition are unlikely to generate reliable insights into strategies for counteracting the particular threat that persistent political conflict over decision-relevant science poses to enlightened democratic policymaking.

Commentators who subscribe to what we have called the Science Comprehension Thesis typically propose one of two courses of action. The first is to strengthen science education and the teaching of critical reasoning skills, in order better to equip the public for the cognitive demands of democratic citizenship in a society where technological risk is becoming an increasingly important focus of public policymaking (Miller & Pardo 2000). The second is to dramatically shrink the scope of the public’s role in government by transferring responsibility for risk regulation and other forms of science-informed policymaking to politically insulated expert regulators (Breyer 1993). This is the program advocated by commentators who believe that the public’s overreliance on heuristic-driven forms of reasoning is too elemental to human psychology to be corrected by any form of education (Sunstein 2005).

Because it rejects the empirical premise of the Science Comprehension Thesis, the Identity-protective Cognition Thesis takes issue with both of these prescriptions. The reason that citizens remain divided over risks in the face of compelling and widely accessible scientific evidence, this account suggests, is not that they are insufficiently rational; it is that they are too rational in extracting from information on these issues the evidence that matters most for them in their everyday lives. In an environment in which positions on particular policy-relevant facts become widely understood as symbols of individuals’ membership in and loyalty to opposing cultural groups, it will promote people’s individual interests to attend to evidence about those facts in a manner that reliably conforms their beliefs to the ones that predominate in the groups they are members of. Indeed, the tendency to process information in this fashion will be strongest among individuals who display the reasoning capacities most strongly associated with science comprehension.

Thus, improving public understanding of science and propagating critical reasoning skills—while immensely important, both intrinsically and practically (Dewey 1910)—cannot be expected to dissipate persistent public conflict over decision-relevant science. Only removing the source of the motivation to process scientific evidence in an identity-protective fashion can. The conditions that generate symbolic associations between positions on risk and like facts, on the one hand, and cultural identities, on the other, must be neutralized in order to assure that citizens make use of their capacity for science comprehension.[1]

In a deliberative environment protected from the entanglement of cultural meanings and policy-relevant facts, moreover, there is little reason to assume that ordinary citizens will be unable to make an intelligent contribution to public policymaking. The amount of decision-relevant science that individuals reliably make use of in their everyday lives far exceeds what any of them (even scientists, particularly when acting outside the domain of their own specialty) are capable of understanding on an expert level. They are able to accomplish this feat because they are experts at something else: identifying who knows what about what (Keil 2010), a form of rational information processing that involves consulting others whose basic outlooks they share and whose knowledge and insights they can therefore reliably gauge (Kahan, Braman, Cohen, Gastil & Slovic 2010).

These normal and normally reliable processes of knowledge transmission break down when risk or like facts are transformed (whether through strategic calculation or through misadventure and accident) into divisive symbols of cultural identity. The solution to this problem is not—or certainly not necessarily!—to divest citizens of the power to contribute to the formation of public policy. It is to adopt measures that effectively shield decision-relevant science from the influences that generate this reason-disabling state (Kahan et al. 2006).

Just as individual well-being depends on the quality of the natural environment, so the collective welfare of democracy depends on the quality of a science communication environment hospitable to the exercise of the ordinarily reliable reasoning faculties that ordinary citizens use to discern what is collectively known. Identifying strategies for protecting the science communication environment from antagonistic cultural meanings—and for decontaminating it when such protective measures fail—is the most critical contribution that decision science can make to the practice of democratic government.

[1] We would add, however, that we do not believe that the results of this or any other study we know of rule out the existence of cognitive dispositions that do effectively mitigate the tendency to display ideologically motivated reasoning. Research on the existence of such dispositions is ongoing and important (Baron 1995; Lavine, Johnston & Steenbergen, 2012). Existing research, however, suggests that the incidence of any such disposition in the general population is small and is distinct from the forms of critical reasoning disposition—ones associated with constructs such as science literacy, cognitive reflection, and numeracy—that are otherwise indispensable to science comprehension. In addition, we submit that the best current understanding of the study of science communication indicates that the low incidence of this capacity, if it exists, is not the source of persistent conflict over decision-relevant science. Individuals endowed with perfectly ordinary capacities for comprehending science can be expected reliably to use them to identify the best available scientific evidence so long as risks and like policy-relevant facts are shielded from antagonistic cultural meanings.
