Saturday, October 1, 2016

Weekend up(back)date: cultural cognition vs. Bayesian updating of scientific consensus

We've been having so much fun with "Bayesian vs. X" diagrams in Law & Cognition 2016 that I thought I'd dredge up a vintage use of this heuristic. This is from Kahan, D.M., Jenkins-Smith, H. & Braman, D., Cultural Cognition of Scientific Consensus, J. Risk Res. 14, 147-174 (2011).

5.1. Summary of findings

The goal of the study was to examine a distinctive explanation for the failure of members of the public to form beliefs consistent with apparent scientific consensus on climate change and other issues of risk. We hypothesized that scientific opinion fails to quiet societal dispute on such issues not because members of the public are unwilling to defer to experts but because culturally diverse persons tend to form opposing perceptions of what experts believe. Individuals systematically overestimate the degree of scientific support for positions they are culturally predisposed to accept as a result of a cultural availability effect that influences how readily they can recall instances of expert endorsement of those positions.

The study furnished two forms of evidence in support of this basic hypothesis. The first was the existence of a strong correlation between individuals’ cultural values and their perceptions of scientific consensus on risks known to divide persons of opposing worldviews. Subjects holding hierarchical and individualistic outlooks, on the one hand, and ones holding egalitarian and communitarian outlooks, on the other, significantly disagreed about the state of expert opinion on climate change, nuclear waste disposal, and handgun regulation. It is possible, of course, that one or the other of these groups is better at discerning scientific consensus than the other. But because the impressions of both groups converged and diverged from positions endorsed in NAS ‘expert consensus’ in a pattern reflective of their respective predispositions, it seems more likely that both hierarchical individualists and egalitarian communitarians are fitting their perceptions of scientific consensus to their values.

The second finding identified a mechanism that could explain this effect. When asked to evaluate whether an individual of elite academic credentials, including membership in the NAS, was a ‘knowledgeable and trustworthy expert’, subjects’ answers proved conditional on the fit between the position the putative expert was depicted as adopting (on climate change, on nuclear waste disposal, or on handgun regulation) and the position associated with the subjects’ cultural outlooks. . . .

5.2. Understanding the cultural cognition of risk

When this dynamic is added to the set of mechanisms through which cultural cognition shapes perceptions of risk and related facts, it is possible to envision a more complete picture of how these processes work in concert. On this view, cultural cognition can be seen as injecting a biasing form of endogeneity into a process roughly akin to Bayesian updating.

Even as an idealized normative model of rational decision-making, Bayesian information processing is necessarily incomplete. Bayesianism furnishes an algorithm for rationally updating one’s beliefs in light of new evidence: one’s estimate of the likelihood of some proposition should be revised in proportion to the probative weight of any new evidence (by multiplying one’s ‘prior odds’ by a ‘likelihood ratio’ that represents how much more consistent new evidence is with that proposition than with its negation; Raiffa 1968). This instruction, however, merely tells a person how a prior estimate and new evidence of a particular degree of probity should be combined to produce a revised estimate; it has nothing to say about what her prior estimate should be or, even more importantly, how she should determine the probative force (if any) of a putatively new piece of evidence.
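To make the algorithm concrete, here is a minimal sketch in Python (my own illustration for the blog, not anything from the paper; the function name and numbers are placeholders): the update multiplies prior odds by the likelihood ratio of the new evidence and converts the result back to a probability.

```python
def update_odds(prior_prob, likelihood_ratio):
    """Bayesian update in odds form.

    prior_prob: prior probability that the proposition is true.
    likelihood_ratio: how much more consistent the new evidence is
        with the proposition than with its negation.
    Returns the revised (posterior) probability.
    """
    prior_odds = prior_prob / (1.0 - prior_prob)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

# A 0.25 prior combined with evidence four times more consistent with the
# proposition than with its negation yields a posterior of 4/7 (about 0.57).
print(update_odds(0.25, 4.0))
```

Notice that the rule says nothing about where the 0.25 prior or the likelihood ratio of 4 come from; that is exactly the gap the passage identifies.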

Consistently with Bayesianism, an individual can use pretty much any process she wants – including some prior application of the Bayesian algorithm itself – to determine the probity of new evidence (Raiffa 1968), but any process that gauges the weight (or likelihood ratio) of the new evidence based on its consistency with the individual’s prior estimate of the proposition in question will run into an obvious difficulty. In the extreme, an individual might adopt the rule that she will assign no probative weight to any asserted piece of evidence that contradicts her prior belief. If she does that, she will of course never change her mind and hence never revise a mistaken belief, since she will necessarily dismiss all contrary evidence, no matter how well founded, as lacking credibility. In a less extreme variant, an individual might decide merely to assign new information that contradicts her prior belief less probative weight than she otherwise would have; in that case, a person who starts with a mistaken belief might eventually correct it, but only after being furnished with more evidence than would have been necessary if she had not discounted any particular item of contrary evidence based on her mistaken starting point. A person who employs Bayesian updating is more likely to correct a mistaken belief, and to do so sooner, if she has a reliable basis exogenous to her prior belief for identifying the probative force of evidence that contravenes that belief (Rabin and Schrag 1999).
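Here is a rough simulation of the dynamic this paragraph describes (a sketch under assumptions of my own, not the Rabin and Schrag (1999) model itself): the agent shrinks the likelihood ratio of evidence that contradicts her mistaken prior toward 1, and we count how many signals it takes her to abandon that prior.

```python
import random

def signals_needed(discount, seed=7, cap=500):
    """Count the signals it takes an agent to abandon a mistaken belief.

    She starts at P(true) = 0.1 although the proposition is in fact true.
    Each signal supports the truth with probability 0.8 (likelihood ratio 4)
    and misleads otherwise (likelihood ratio 1/4).  `discount` is the share
    of probative weight she strips from evidence that contradicts her prior
    belief (here, the supportive signals): 0 is the ideal Bayesian, 1 the
    extreme agent who dismisses all contrary evidence.
    """
    random.seed(seed)
    odds = 0.1 / 0.9                          # mistaken prior: P(true) = 0.1
    for n in range(1, cap + 1):
        if random.random() < 0.8:             # signal supports the truth ...
            odds *= 4.0 ** (1.0 - discount)   # ... so she shrinks its weight toward 1
        else:                                 # misleading signal fits her prior
            odds *= 0.25                      # taken at full weight
        if odds / (1.0 + odds) > 0.5:
            return n
    return None                               # never corrected within `cap` signals

for d in (0.0, 0.5, 1.0):
    print(f"discount={d:.1f}  signals to abandon the mistaken belief: {signals_needed(d)}")
```

The unbiased updater (discount = 0) corrects soonest, the partial discounter corrects only after more evidence, and the extreme discounter never changes her mind, just as the paragraph says.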

When mechanisms of cultural cognition figure in her reasoning, a person processes information in a manner that is equivalent to one who is assigning new information probative weight based on its consistency with her prior estimation (Figure 9). Because of identity protective cognition (Sherman and Cohen 2006; Kahan et al. 2007) and affect (Peters, Burraston, and Mertz 2004), such a person is highly likely to start with a risk perception that is associated with her cultural values. She might resolve to evaluate the strength of contrary evidence without reference to her prior beliefs. However, because of culturally biased information search and culturally biased assimilation (Kahan et al. 2009), she is likely to attend to the information in a way that reinforces her prior beliefs and affective orientation (Jenkins-Smith 2001).

Perhaps mindful of the limits of her ability to gather and interpret evidence on her own, such an individual might choose to defer or to give considerable weight to the views of experts. But through the cultural availability effect examined in our study, she is likely to overestimate the proportion of experts who hold the view consistent with her own predispositions. Like the closed-minded Bayesian whose assessment of the probative value of new information is endogenous to his prior beliefs, then, such an individual will either not change her mind or will change it much more slowly than she should, because the same predisposition that informs her priors will also be unconsciously shaping her ability to recognize and assign weight to all manner of evidence, including the opinion of scientists (Zimper and Ludwig 2009).
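In the same hedged spirit, here is a toy version of the cultural availability effect (again my own illustration, with made-up parameters such as `trust_gap`, not the study's model): two agents with opposing predispositions sample the same population of experts, but each registers uncongenial experts as 'knowledgeable and trustworthy' less often, and so each comes away with a different impression of the consensus.

```python
import random

def perceived_consensus(predisposed_to_believe, trust_gap=0.8,
                        n_experts=100, true_share=0.75, seed=3):
    """Crude sketch of the cultural availability effect.

    `true_share` of the experts encountered actually endorse the position.
    The agent registers an expert as knowledgeable and trustworthy with
    probability 1 when the expert's position fits her predisposition, and
    with probability (1 - trust_gap) when it does not.  Returns the share
    of registered experts she recalls as endorsing the position.
    """
    random.seed(seed)
    recalled_for = recalled_against = 0
    for _ in range(n_experts):
        endorses = random.random() < true_share
        congenial = (endorses == predisposed_to_believe)
        p_register = 1.0 if congenial else 1.0 - trust_gap
        if random.random() < p_register:
            if endorses:
                recalled_for += 1
            else:
                recalled_against += 1
    return recalled_for / (recalled_for + recalled_against)

# Two culturally opposed agents sample the same expert population, 75% of
# whom endorse the position, yet walk away with very different impressions.
print(perceived_consensus(predisposed_to_believe=True))   # well above 0.75
print(perceived_consensus(predisposed_to_believe=False))  # well below 0.75
```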

References

Jenkins-Smith, H. 2001. Modeling stigma: An empirical analysis of nuclear waste images of Nevada. In Risk, media, and stigma: Understanding public challenges to modern science and technology, ed. J. Flynn, P. Slovic, and H. Kunreuther, 107–32. London/Sterling, VA: Earthscan.

Kahan, D.M., D. Braman, J. Gastil, P. Slovic, and C.K. Mertz. 2007. Culture and identity-protective cognition: Explaining the white-male effect in risk perception. Journal of Empirical Legal Studies 4, no. 3: 465–505.

Kahan, D.M., D. Braman, P. Slovic, J. Gastil, and G. Cohen. 2009. Cultural cognition of the risks and benefits of nanotechnology. Nature Nanotechnology 4, no. 2: 87–91.

Peters, E.M., B. Burraston, and C.K. Mertz. 2004. An emotion-based model of risk perception and stigma susceptibility: Cognitive appraisals of emotion, affective reactivity, worldviews, and risk perceptions in the generation of technological stigma. Risk Analysis 24, no. 5: 1349–67.

Rabin, M., and J.L. Schrag. 1999. First impressions matter: A model of confirmatory bias. Quarterly Journal of Economics 114, no. 1: 37–82.

Raiffa, H. 1968. Decision analysis. Reading, MA: Addison-Wesley.

Sherman, D.K., and G.L. Cohen. 2006. The psychology of self-defense: Self-affirmation theory. In Advances in experimental social psychology, ed. M.P. Zanna, 183–242. San Diego, CA: Academic Press.

Zimper, A., and A. Ludwig. 2009. On attitude polarization under Bayesian learning with non-additive beliefs. Journal of Risk and Uncertainty 39, no. 2: 181–212.
