
Recent blog entries
Popular papers

What Is the "Science of Science Communication"?

Climate-Science Communication and the Measurement Problem

Ideology, Motivated Cognition, and Cognitive Reflection: An Experimental Study

'Ideology' or 'Situation Sense'? An Experimental Investigation of Motivated Reasoning and Professional Judgment

A Risky Science Communication Environment for Vaccines

Motivated Numeracy and Enlightened Self-Government

Making Climate Science Communication Evidence-based—All the Way Down 

Neutral Principles, Motivated Cognition, and Some Problems for Constitutional Law 

Cultural Cognition of Scientific Consensus

The Tragedy of the Risk-Perception Commons: Science Literacy and Climate Change

"They Saw a Protest": Cognitive Illiberalism and the Speech-Conduct Distinction 

Geoengineering and the Science Communication Environment: a Cross-Cultural Experiment

Fixing the Communications Failure

Why We Are Poles Apart on Climate Change

The Cognitively Illiberal State 

Who Fears the HPV Vaccine, Who Doesn't, and Why? An Experimental Study

Cultural Cognition of the Risks and Benefits of Nanotechnology

Whose Eyes Are You Going to Believe? An Empirical Examination of Scott v. Harris

Cultural Cognition and Public Policy

Culture, Cognition, and Consent: Who Perceives What, and Why, in "Acquaintance Rape" Cases

Culture and Identity-Protective Cognition: Explaining the White Male Effect

Fear of Democracy: A Cultural Evaluation of Sunstein on Risk

Cultural Cognition as a Conception of the Cultural Theory of Risk


Still another metacognition question 

How about this one, which is a classic in the study of critical reasoning? What's the answer &, more importantly, what percent of the general public get it right? Why don't 100% get the correct answer? How do self-described "tea party" members do (whatever happened to those guys?) Answers anon . . . . 


Year in review for CCP research, including the conservation-of-perplexity principle

The -est CCP research findings of the year . . .

1. Saddest: et tu, AOT? As is so for CRT, Numeracy, Ordinary Science Intelligence, etc., higher scores on the Actively Open-minded Thinking assessment are associated with more polarization on climate change.

2. Happiest: Do you like to be surprised?  Like a pot that is too shy to boil when being observed, some research findings reveal themselves only when one wasn’t even looking for them.  Add to that category the finding that science curiosity turns out to predict a disposition to expose oneself to surprising pieces of information that are contrary to one’s political predispositions, thereby mitigating polarization. Cool. 


3. Weirdest: Easily disgusted partisans apparently converge on highly contested issues like climate change and illegal immigration.  Just as energy can neither be created nor destroyed, perplexity is always conserved in empirical research: if you make any progress in trying to understand one mystery, you can be confident your efforts will reveal at least one additional thing that defies ready understanding and that begs for further investigation.  So here is one new thing I really don’t get!


How about another meta-cognition quiz question?

About what fraction of general public gets this one correct?

Jack is looking at Anne but Anne is looking at George. Jack is married but George is not. Is a married person looking at an unmarried person? [a. Yes; b. No; c. Cannot be determined.]

Answers--to both questions--later today

1. LOOKING. Jack is looking at Anne but Anne is looking at George. Jack is married but George is not. Is a married person looking at an unmarried person? [a. Yes; b. No; c. Cannot be determined.]
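The trick here is disjunctive reasoning: whether Anne is married or not, the answer comes out the same. A tiny Python sketch (an illustration added here, not part of the original post) makes the case split explicit:

```python
# Jack (married) looks at Anne (status unknown); Anne looks at George (unmarried).
looks_at = [("Jack", "Anne"), ("Anne", "George")]

def married_looks_at_unmarried(anne_married):
    # Build the world for a given assumption about Anne, then check
    # whether any married person is looking at an unmarried one.
    married = {"Jack": True, "Anne": anne_married, "George": False}
    return any(married[a] and not married[b] for a, b in looks_at)

# If Anne is married, she is looking at unmarried George; if she is
# unmarried, married Jack is looking at her. Either way: (a) Yes.
for anne_married in (True, False):
    print(anne_married, "->", married_looks_at_unmarried(anne_married))
```

Most people answer "cannot be determined" because they never consider both cases; the enumeration shows the answer is "yes" in every possible world.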


Pathogen disgust & GM-food and vaccine-risk perceptions ... a fragment

From something I am working on ... stay tuned:

3.1. Preliminary findings

a. PDS and political outlooks. Commentators often report that disgust sensitivities, including the type measured by the “pathogen disgust scale” (PDS), are correlated with left-right political orientations (Terrizzi et al., 2013; but see Tybur et al. 2010). In this large, nationally diverse sample, however, the relationship between PDS scores and political conservatism was trivially small (r = 0.09, p < 0.01).


b. Vaccine and GM risk perceptions and political outlooks. In the popular media, both vaccine and GM risk perceptions are frequently depicted as associated with “liberal” outlooks (e.g., Shermer 2013). Empirical data do not support this view (e.g., Kahan 2015; Kahan 2016). In this study, too, there was no meaningful correlation (r = 0.00, p = 0.96) between GM risk perceptions and political outlooks. For vaccines, there were small to moderate correlations, but in the direction contrary to the popular-commentary position: right-leaning scores on the political outlook measure predicted both more concern over vaccine risks (ISRPM: r = 0.09, p < 0.01) and less support for mandatory vaccination (r = -0.24, p < 0.01). 
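For readers who want to replay this style of analysis, here is a minimal numpy sketch with simulated data; the variable names and built-in effect sizes are hypothetical stand-ins, not the study's actual dataset:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Hypothetical stand-ins for the survey variables (not the study's data):
pds = rng.normal(size=n)                    # pathogen disgust scale scores
conserv = 0.09 * pds + rng.normal(size=n)   # political outlook, ~0.09 correlation built in
gm_risk = rng.normal(size=n)                # GM-food risk perception, independent by design

r_pc = np.corrcoef(pds, conserv)[0, 1]      # small, "trivial" correlation
r_cg = np.corrcoef(conserv, gm_risk)[0, 1]  # near-zero correlation
print(f"PDS vs. conservatism:     r = {r_pc:+.2f}")
print(f"conservatism vs. GM risk: r = {r_cg:+.2f}")
```

The point the fragment makes carries over: with a large n, even an r around 0.09 will be "statistically significant" while remaining practically trivial.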


Meta probabilistic thinking quiz...

About what percentage of the US population will get the correct answer (i.e., all 3 correct)? (a) 0-10%; (b) 11% to 25%; (c) 26% to 50%; (d) 51%-75%; or (e) 76% to 100%?

Will post answer later today



Weekend reading list

Hey-- this is just like the strategy followed by commenters on this blog!


Canadians know a thing or two about cultural conflict so this is probably worth taking a close look at.


Are individualist societies doomed? Find out.


Another paper crosses line from "in press" to "in print": Actively Open-minded Thinking & climate change polarization

Journals are cranking up publication speed to meet holiday demand.  And this one is *free* too.


Zika & culturally antagonistic memes -- now in print! Great stocking stuffer

Hurry up-- get yours before it sells out! (Best thing: it's free!)


Sufficient evidence of disgust sensitivity & GM-food & vaccine risk perceptions ... a fragment

From conference paper due imminently ... more to come anon

2. Study

2.1. Inference strategy

This paper rests on a simple theoretical premise: that rejection of a “null hypothesis” with respect to the correlation between pathogen disgust sensitivity, on the one hand, and GM-food and vaccine risk perceptions, on the other, is not sufficient to support the conclusion that disgust sensitivity meaningfully explains these risk perceptions. Like all valid latent-variable instruments, any scale used to measure pathogen disgust sensitivity will be imperfect. Such a scale will be highly correlated with, and thus reliably measure, a particular form of disgust sensitivity. But such a scale can still be expected to correlate weakly or even modestly with additional negative affective dispositions. As a result, there can be modest yet practically meaningless correlations between the pathogen disgust sensitivity scale and all manner of risk perceptions that excite negative affective reactions unrelated to disgust.

A comparative analysis is thus appropriate. If disgust genuinely explains perceived risks over vaccines and GM foods, the relationship between a valid measure of pathogen disgust (PD) and those putative risk sources should be comparable to the relatively large ones between PD and attitudes one has good reason to believe are grounded in disgust. By the same token, if the correlation between the measure of PD and GM-food and vaccine risk perceptions, respectively, is comparable in magnitude to the ones between the PD measure and putative risk sources that do not plausibly excite disgust, then there will be less reason to conclude that pathogen disgust sensitivity plays an important role in explaining differences in the perceived risk of GM foods and vaccines.

This was the inference strategy that informed design of this study....
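The comparative benchmark logic sketched above can be illustrated in a few lines of Python; the risk items, labels, and effect sizes below are invented for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1500

# Simulated, hypothetical data illustrating the comparative-benchmark
# inference strategy (not the study's actual measures or effect sizes).
pds = rng.normal(size=n)
items = {
    "public toilets (disgust-grounded)": 0.45 * pds + rng.normal(size=n),
    "GM-food risk (contested)":          0.08 * pds + rng.normal(size=n),
    "elevator shafts (non-disgust)":     0.07 * pds + rng.normal(size=n),
}

for label, y in items.items():
    r = np.corrcoef(pds, y)[0, 1]
    print(f"{label:36s} r = {r:+.2f}")

# Inference rule: treat PDS as genuinely explaining a risk perception only
# if its correlation is comparable to the disgust-grounded benchmark,
# not merely distinguishable from zero.
```

In this simulation the contested item tracks the implausible, non-disgust item rather than the disgust-grounded benchmark, which is exactly the pattern that would count against the disgust explanation.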


Sad news . . . 

CCP founding member Ann Richards (TC) passed away last night. I estimate she was happy every day of her life (of 15 yrs) except for the last 3. I predict that I will, after a day or 2, be happier every day for the rest of mine as a result of having had the benefit of her companionship...






Weekend update: *This* is what scientific *dis*sensus looks like...

The two scientists depicted in this photograph are researchers in the culturally divisive "cats or birds?" field, & they are performing a so-called adversarial collaboration.



Weird ... does high disgust sensitivity mitigate political polarization??...

I did a couple of posts a while back (here & here) on disgust and GM-food- and vaccine-risk perceptions.

The upshot was that, contrary to the argument advanced by some scholars and by some popular-writing commentators, neither of these risk perceptions appeared to be distinctively related to disgust sensitivities. These perceptions, and some related policy preferences, were not any more meaningfully correlated with disgust sensitivity than were myriad other risk perceptions and policy preferences that aren’t plausibly viewed as disgust related (e.g., falling down elevator shafts, flying on commercial airliners, raising income taxes for the wealthy, enacting campaign finance laws, etc.).

But here’s another thing: the disgust sensitivity measure we used—the so called “pathogen disgust” scale (PDS), which is supposed to measure a disposition to be disgusted and hence afraid of sources of bodily invasion—has some truly weird interactions with political outlooks.

Take a look for yourself: 

Basically, increasing disgust sensitivity makes the group that otherwise is inclined to perceive low risk or express low support for risk-abating policies experience an inversion of that sensibility. As a result, on issues where there was substantial political polarization, there is a convergence of positions among the citizens of highest disgust sensitivity.

Why would that be? 

What’s especially weird is that PDS is supposed to predict political conservatism (it didn’t in our survey; the relationship between disgust and conservative outlooks was trivial in magnitude: r = 0.09); yet here we have high-disgust conservatives clearly behaving more like liberals on climate change, and high-disgust liberals behaving more like conservatives.

Maybe I just don’t feel very imaginative today, but I am not inclined to come up with a story that fits the data. 

Instead I’m experiencing a bit of uncertainty about whether I should really be trusting the “pathogen disgust” scale. It seems, basically, to be eliciting a kind of generic survey agreement bias; its influence is most detectable in that portion of the population whose members aren’t already inclined to agree with the survey item and who thus can move without the constraint of a ceiling effect in the outcome measure. . . .

But what do others think?



New NAS report on #scicomm

Here's something to read & discuss ...


Gore's sequel -- good idea or bad?  

I'll leave it to the 14 billion regular readers of this blog: you tell me: useful, helpful, etc., or not. 


Weekend update: birth announcement--twins (sort of) on politically motivated reasoning

The Emerging Trends review commentary on "politically motivated reasoning" is now officially published.

As you can see, the working paper turned out to be siamese twins, who were severed at the spleen & published as a "two part" set:


  • Kahan, D. M. (2016). The Politically Motivated Reasoning Paradigm, Part 1: What Politically Motivated Reasoning Is and How to Measure It. Emerging Trends in the Social and Behavioral Sciences. John Wiley & Sons, Inc.
  • Kahan, D. M. (2016). The Politically Motivated Reasoning Paradigm, Part 2: Unanswered Questions. Emerging Trends in the Social and Behavioral Sciences. John Wiley & Sons, Inc.






Making science documentaries that matter in a culturally divided society (lecture summary plus slides)

Here is the gist of my presentation at the World Congress of Science and Factual Producers in Stockholm on 12/7. (Slides.)

1.  I can make movies, too! Plus “identity protective cognition.” I know most of you are expert filmmakers. Well, it turns out I made a movie once myself. 

It was “produced” for use in the study featured in “They Saw a Protest.”  The production values, I’m sure, seem quite low. There are two reasons for that. One is that the production values are low. The other is that swinging my recording device around erratically helped to generate a montage of scenes that, with suitable editing, could be made to plausibly appear to be scenes from either an anti-abortion protest outside an abortion clinic or an anti-“Don’t ask, don’t tell” one held outside a college recruitment center.

Subjects, instructed to assume the role of juror, were assigned either to the “abortion clinic” condition or the “recruitment center” condition.

As you can see, subjects’ perceptions of the coercive nature vel non of the protestors, and the corresponding justification or lack thereof on the part of the police for dispersing the demonstrators, varied depending on the condition to which the subjects were assigned and their cultural values: subjects of opposing values disagreed with one another on key facts when they were assigned to the same condition; at the same time, subjects who shared cultural values disagreed with one another when assigned to different conditions.

The resulting pattern of perceptions reflects identity-protective cognition. That is, subjects of particular values gravitated toward assessments of what they saw that conformed to the position most congruent with their groups’ position on the cause of the protestors.

2. Identity-protective reasoning on climate change, etc. The gist of my talk is that many public controversies over risk fit this same pattern. That is, when appraising societal risks, individuals of opposing cultural outlooks can be expected to form perceptions of fact that reflect and reinforce their cultural allegiances.

As an example, consider the results of “Cultural Cognition of Scientific Consensus.” That study found that “hierarch individualists” and “egalitarian communitarians” were both inclined to selectively recognize and dismiss the expertise of the featured scientists in patterns that corresponded to whether the attributed position of the putative expert--on climate change, nuclear waste disposal, or concealed handguns--was consistent with or at odds with the prevailing position in the subjects’ cultural groups.

This is identity-protective cognition, too. Like the subjects in “They Saw a Protest,” the subjects in “Cultural Cognition of Scientific Consensus” selectively affirmed or disputed the expertise of the featured scientists depending on whether the scientist’s position cohered with the one in the subjects’ cultural group.

3. System 2 motivated reasoning. The “identity-protective cognition” thesis’s primary competitor is the “bounded rationality” thesis. The latter holds that disagreements among members of the public are attributable to people’s overreliance on “System 1” heuristic reasoning. This position predicts that as subjects become more proficient in the deliberate, conscious, analytic form of reasoning associated with “System 2,” they ought to converge on the best available evidence on a given societal risk.

In fact, though, as individuals’ scores on any measure of critical reasoning increase, so too does the intensity with which they affirm their group’s view and denigrate the other’s. 

This result is more consistent with the “identity protective cognition” thesis, which holds that individuals can be expected to devote all their cognitive resources to forming and persisting in the position that predominates in their group as a way of protecting their status within the group.

The problem of non-convergence is a consequence not of too little rationality but instead too much. Forced to choose between a truth-convergent and identity-protective form of reasoning, actors whose personal beliefs have zero impact on their (or anyone else’s) exposure to the putative risk at issue predictably gravitate toward formation of beliefs that secure for themselves the benefits of holding group-convergent beliefs.

But if individually rational, this form of information processing remains collectively irrational. It means that members of a diverse democratic society are less likely to converge on the best available evidence that is essential to the well-being of all. Nevertheless, the collective good associated with truth-convergent reasoning doesn’t change the psychic incentives of any individual to continue to engage information in a manner that is group-convergent instead.

This is the tragedy of the science communication commons.

4. Lab remedies.  These dynamics impose severe constraints on the use of science documentaries to inform people on controversial issues. Can anything be done to steer members of diverse groups away from this form of information processing?  Here are a couple of possibilities.

a. Two-channel communication. One is the “two channel” science communication model. This model posits that individuals assess information along two channels--one dedicated to the content of the information and the other to the identity-expressive quality of it. The two must be in sync; if they interfere with each other--if individuals perceive that the information on the “meaning” channel signifies that assent to the “content” of the information risks driving a wedge between them and others who share their cultural outlooks--then they will fail to assimilate information transmitted on the content channel, no matter how clearly it is conveyed.

The nature of the dynamics involved here is illustrated by the CCP study on geoengineering and cultural polarization. Whereas the “anti-pollution” message generated a negative or hostile meaning (“game over”; “we told you so”) for individuals predisposed to climate skepticism, the “geoengineering research” message conveyed an identity-affirming one (“yes we can”; “more of the same”). Consistent with these opposing messages, subjects in the “anti-pollution” condition displayed attitude polarization relative to the control group, while ones in the “geoengineering” condition displayed diminished polarization.

b. Science curiosity. Individuals who are “science curious” process information differently from their less curious cultural peers. They will choose, for example, to read news stories that report exciting or novel scientific findings even when doing so means exposure to information that is hostile to their cultural identity. This plausibly explains why science curiosity, of all the predispositions associated with science comprehension, does not aggravate but rather appears to mitigate cultural polarization.

A useful communication plan, then, might focus on maximizing the congeniality of information to science-curious subjects in the expectation that those individuals, when they interact in their cultural group, will convey—by words and action—that they have confidence in climate science, a message that is likely to carry more weight than “messages” by put-up “messengers” with whom they lack a cultural affinity.

5. What to do? You tell me! But these are very tentative and maddeningly general pieces of advice. What would a program that employs them look like?

I don’t honestly know!  I know nothing in particular about making science films.  What I do know is information about lots of general dynamics relating to science communication; for those insights to be translated into real-world practice would require the “situation sense” of individuals who are intimately involved in communication within particular real-world situations.

My panel mate Sonya Pemberton is in that position.  I’ll let her speak to how she is using the “two channel model” and the phenomena of “scientific curiosity” to advance her science communication objectives.

Once she has, moreover, I will happily join efforts with her or anyone else pursuing such reflective, well-considered judgments, doing what I am best equipped to do: furnishing tailored empirical information fitted to enabling that professional to make the best decisions she can.



Off to Stockholm to discuss the science of science filmmaking (& of course, "post truth")

Am off for a week to Stockholm to give a couple of talks & participate in panel discussions. Audience for first is attendees of the World Congress of Science and Factual Producers.  Here's the synopsis of what I'll be saying:

Want to make a difference? Then, don’t “message” the public; satisfy its curiosity

 Can science filmmakers promote public acceptance of the best evidence relating to the reality of human-caused climate change and other disputed science issues? Maybe, but not in the manner that one might think.  In particular, it is a mistake to believe that the simple presentation of factually accurate information, even in a dramatically compelling form, will change people’s minds. Research on cultural cognition shows that most individuals can be expected to selectively credit and discredit such information in patterns that reflect and reinforce the factual positions that predominate within their cultural groups. Indeed, this form of bias, experimental data show, grows in intensity as individuals become more adept at making sense of scientific information. Nevertheless, a segment of the general population appears to be relatively immune to these dynamics. These individuals are ones who possess the highest levels of science curiosity, a general disposition to seek out and consume scientific information for personal pleasure. Science-curious individuals are the core audience for excellent science films.  Although relatively small in number, these individuals occupy a potentially critical niche in the ecology of political opinion formation, since they are situated to credibly vouch for the validity of the best evidence within their cultural communities.  The strategic upshot is that science filmmakers ought to concentrate not on “messaging” the general public but rather on simply making excellent films that satisfy their core audience's distinctive appetite to know what is known.  The new science of science communication, moreover, can help filmmakers unlock the knowledge-promoting energy of science curious citizens by furnishing filmmakers with tools they can use to make their films as appealing to as culturally diverse an audience of viewers as possible.

Somehow this got revised in the program into a statement suggesting that I hold the position that science filmmakers are "all wrong" & that I'm going to show them how to do it by presenting research "demolishing" what they believe .... I'd never say that, and that's not the philosophy of the CCP Science of Science Filmmaking Initiative ... So I'll deal with a bit of "post truth" fact correction at the outset of my talk, I suppose. But it will be a lot of fun, I'm sure.

Then there's a second talk for SVT, the Swedish public television producer, on misinformation. The 14 billion readers of this blog know how I feel about that.

I'll try to remember to send postcards!


Weekend update: Q & A at Nature

Trump's victory and the OED's addition of "post-truth" to its latest edition have resulted in an "all talking heads on deck" alert.

Read the interview.


Is cultural cognition an instance of "bounded rationality"? A ten-yr debate

This is basically what I remember saying at a workshop at William & Mary, co-sponsored by the Law School & the Political Science Dep't, a couple of weeks ago. Slides here.

1. An old but continuing debate.  The paper you read for this workshop—Motivated Numeracy and Enlightened Self Government, Behavioural Policy (in press)—originates in a debate that started 10 yrs ago.

A group of us (me, Paul Slovic, Donald Braman, and John Gastil) had written a critique of Cass Sunstein’s then-latest book Laws of Fear.  In that book, Sunstein had attributed all manner of public conflict over risk to the public’s overreliance on “System 1” heuristic reasoning. The remedy, in Sunstein’s view, was to shift as much risk-regulatory power as possible to politically insulated expert agencies, whose members could be expected to use conscious, effortful “System 2” information processing.

Our response—Fear of Democracy: A Cultural Evaluation of Sunstein on Risk, Harvard L. Rev., 119: 1071-1109—criticized Sunstein for ignoring cultural cognition, which of course attributes a large class of such conflicts to the impact that cultural allegiances play in shaping diverse individuals’ risk perceptions.

The costs of ignoring cultural cognition, we argued, were two-fold. 

Descriptively, without some mechanism that accounts for individual differences in information processing, Sunstein could not explain why so many risk controversies (from climate change to gun control to nuclear power to the HPV vaccine) involve conflicts not between the public and experts but between different segments of the public.

Prescriptively, ignoring cultural cognition undermined Sunstein’s central recommendation to hand over all risk-regulatory decisionmaking to independent expert risk regulators. That recommendation presupposed that all disagreements between the public and experts originated in the public’s bounded rationality, a defect that it was reasonable to assume could not be remedied by any feasible intervention and that generated factual errors unentitled to normative respect in lawmaking.

Cultural cognition, we argued, showed that public risk perceptions on many issues were rooted in diverse citizens’ values.  It wasn’t obvious that expert decisionmaking was “better” than public decisionmaking on risks originating in publicly contested worldviews. Nor was it obvious that conflicts originating in conflicting worldviews could not be resolved by democratic decisionmaking procedures aimed at helping culturally diverse citizens to arrive at shared perceptions of the best available evidence on the dangers that society faces.

In his (very gracious, very intelligent) reply, Cass asserted that cultural cognition could simply be assimilated to his account of the reasoning deficits that distort public decisionmaking: “I argue,” he wrote, “that insofar as it produces factual judgments, ‘cultural cognition’ is largely a result of bounded rationality, not an alternative to it.” “[W]hile it is undemocratic for officials to neglect people’s values, it is hardly undemocratic for them to ignore people’s errors of fact” (Sunstein 2006).

This position—that cultural cognition and affiliated forms of motivated reasoning are rooted in “bounded rationality"—is now the orthodox view in decision science (e.g., Lodge & Taber 2013). 

But we weren’t sure it was right.  As plausible as the claim seemed to be, it hadn’t been empirically tested.  So we set out to determine, empirically, whether the forms of information processing that are characteristic of cultural cognition really are properly attributed to overreliance on heuristic reasoning.

2.  A ten-year research program. The answer we arrived at over a course of a decade of research was that cultural cognition is not appropriately attributed to overreliance on the form of heuristic information processing associated with “System 1” reasoning.  On the contrary, the individuals in whom cultural cognition exerts the strongest effects were those most disposed to use conscious, effortful, “System 2” reasoning.

This conclusion was supported by two testing strategies.

The first was the use of observational or survey methods. In these studies we simply correlated various measures of System 1/System 2 reasoning dispositions with public perceptions of risk and related facts. 

If public conflict over risk is a consequence of “bounded rationality,” then one should expect the individuals who evince the strongest disposition to use System 2 reasoning will form risk perceptions more consistent with expert ones than will individuals who evidence the strongest disposition to use System 1 forms of information processing.

In addition, one would expect polarization over contested risk to abate as individuals’ proficiency in System 2 reasoning dispositions increase: those individuals can be expected to “go with the evidence” and refrain from “going with their gut,” which is filled with heuristic-reasoning crap like “what do other people like me think?”

But in fact, those predictions are not borne out by the evidence.

In multiple studies, we found that the individuals who scored highest on one or another measure of the disposition to use conscious, effortful “System 2” information processing were in fact the most polarized on contentious risk issues, including the reality of climate change, the hazards of fracking, the danger of allowing citizens to carry concealed handguns etc. (Kahan, Peters et al. 2012; Kahan 2015; Kahan & Corbin 2016).
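The survey-study logic just described (does the partisan gap grow as System 2 proficiency increases?) can be illustrated with a toy simulation; the variables and effect sizes here are hypothetical, not CCP data:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000

# Hypothetical simulation in which the ideological effect on risk
# perception grows with reasoning proficiency (the observed pattern).
ideology = rng.choice([-1.0, 1.0], size=n)  # -1 = left, +1 = right
numeracy = rng.uniform(0.0, 1.0, size=n)    # System 2 proficiency score
risk = 0.2 * ideology + 0.8 * ideology * numeracy + rng.normal(size=n)

def ideology_gap(mask):
    """Mean right-minus-left difference in perceived risk within a subgroup."""
    return risk[mask & (ideology == 1)].mean() - risk[mask & (ideology == -1)].mean()

low, high = numeracy < 0.5, numeracy >= 0.5
print(f"gap among low-proficiency respondents:  {ideology_gap(low):.2f}")
print(f"gap among high-proficiency respondents: {ideology_gap(high):.2f}")
```

The "bounded rationality" account predicts the high-proficiency gap should be the smaller one; the simulated interaction term reproduces the opposite pattern, which is the one the studies report.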

This recurring finding is inconsistent with the “bounded rationality” conception; it fits better with the “cultural cognition thesis,” which posits that individuals can be expected to form identity-protective beliefs and to use all of the cognitive resources at their disposal to do so.

But to nail this inference down, we also conducted a series of experiments, the second type of testing strategy by which we probed Sunstein’s and others’ “bounded rationality” conception of cultural cognition and cognate forms of motivated reasoning.

These experiments consistently showed that individuals highest in the critical reasoning dispositions associated with System 2 information processing were using their cognitive proficiencies to ferret out evidence consistent with their cultural or ideological predispositions and to rationalize the peremptory dismissal of evidence inconsistent with the same (e.g., Kahan 2013).

Motivated Numeracy and Enlightened Self-government (Kahan, Peters et al. in press) reports the results of one of those studies.

3. So what’s the upshot? The original debate--over whether cultural cognition is a consequence of overreliance on System 1 heuristic processing--has been resolved, in my opinion. Insofar as the individuals who demonstrate the greatest disposition to use System 2 reasoning are also the ones who most strongly evince cultural cognition, we can be confident that it is not a “cognitive bias.”

But is it a socially desirable form of information processing on socially contested risks?

That’s a different question, one my own answer to which has been very much reshaped by the course of the “Ten Year Debate.”

It is in fact perfectly rational at the individual level to engage information about societal risks in an identity-protective rather than a truth-convergent manner. What an individual personally believes about climate change, e.g., won’t affect the risk she or anyone she cares about faces; whether as consumer, voter, public discussant, etc., her personal behavior will be too inconsequential to matter. 

But given what positions on climate change and other societal risk issues have come to signify about who she is and whose side she is on in a perpetual struggle for status among competing cultural groups, a person who forms a position out of line with her cultural peers risks estrangement from the people on whom she depends for emotional and material support.

One doesn’t have to be a science whiz to get this.  But if one is endowed with the capacity to make sense of evidence in the manner that is associated with System 2 information processing, it is predictable that she will use those cognitive resources to achieve the everyday personal advantages associated with the congruence between her beliefs and those of her cultural peers.

Of course, if everyone does this all at once, we are indeed screwed.  In that situation, diverse citizens and their democratically accountable representatives won’t converge, or converge nearly as quickly as they should, on the best evidence on the risks they genuinely face. 

But sadly, this fact won’t change the psychic incentives that individuals have to use the forms of reasoning that most reliably connect their beliefs to the positions that signify membership in and loyalty to the identity-defining groups to which they belong.

This is the tragedy of the science communications commons.

We should do something to dispel this condition.  But what?

That’s a hard question.  But it’s one for which an answer won't be forthcoming if we rely on accounts of public risk perceptions that attempt to assimilate cultural cognition into the “public uses system 1, experts system 2” framework.

I suspect Cass Sunstein by this point would largely agree with everything I’m saying. 

Or at least I hope he does, for the project to overcome “the tragedy of the science communications commons” is one that demands the fierce attention of the very best scholars of public risk perception and science communication.


Kahan, D.M. & Corbin, J.C. A Note on the Perverse Effects of Actively Open-minded Thinking on Climate Change Polarization. Research & Politics (2016), doi: 10.1177/2053168016676705.

Kahan, D.M. Climate-Science Communication and the Measurement Problem. Advances in Political Psychology 36, 1-43 (2015).

Kahan, D.M. Ideology, Motivated Reasoning, and Cognitive Reflection. Judgment and Decision Making 8, 407-424 (2013).

Kahan, D.M., Peters, E., Dawson, E. & Slovic, P. Motivated Numeracy and Enlightened Self Government. Behavioural Policy (in press).

Kahan, D.M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L.L., Braman, D. & Mandel, G. The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Clim. Change 2, 732-735 (2012).

Kahan, D.M., Slovic, P., Braman, D. & Gastil, J. Fear of Democracy: A Cultural Evaluation of Sunstein on Risk. Harvard Law Review 119, 1071-1109 (2006).

Lodge, M. & Taber, C.S. The rationalizing voter (Cambridge University Press, Cambridge ; New York, 2013).

Sunstein, C.R. Laws of fear : beyond the precautionary principle (Cambridge University Press, Cambridge, UK ; New York, 2005).

Sunstein, C.R. Misfearing: A reply. Harvard Law Review 119, 1110-1125 (2006).


"They saw an election"-- my 2 cents on election result