
The value of civic science literacy

Gave talk Wednesday at AGU meeting in San Francisco. Slides here. I was on panel w/ a bunch of talented scholars doing really great studies on teaching climate science. The substance of what to teach (primarily in context of undergraduate science courses) was quite interesting, but what was really cool was their (data-filled) account of the "test theory" issues they are attacking in developing valid, reliable, and highly discriminant measures of "climate science literacy" ("earth is heating up," "humans causing," "we're screwed," they recognized, don't reliably measure anything other than the attitude "I care/believe in global warming"). My talk wasn't on how to impart climate science literacy but rather on what needs to be done to assure that a democratic society gets the full value out of having civically science literate citizens: protect the science communication environment-- a matter that making citizens science literate does not itself achieve. (Gave another talk later at The Nature Conservancy's "All-Science" event but will have to report on that "tomorrow.") Here's what I more-or-less remember saying at AGU:

If this were the conversation I'm usually a part of, then I'd likely now be playing the role of heretic.

That discussion isn't about how to teach climate science to college students but rather about how to communicate climate risks to the public.

The climate-risk communication orthodoxy attributes public controversy over global warming to a deficit in the public's comprehension of science. The prescription, on this view, is to improve comprehension—either through better science education or through better public science communication. 

I’ll call this the “civic science literacy” thesis (or CSL).

I’m basically going to stand CSL on its head.

Public controversy, I want to suggest, is not a consequence of a deficit in public science comprehension; it is a cause of it. Such controversy is a kind of toxin that disables the normally reliable faculties that ordinary citizens use to recognize valid decision-relevant science.

For that reason I'll call this position the “science communication environment” thesis (or SCE).  The remedy SCE prescribes is to protect the science communication environment from this form of contamination and to repair it when such protective efforts fail.

This account is based, of course, on data—specifically a set of studies designed to examine the relationship between science comprehension and cultural cognition.

“Cultural cognition” refers to the tendency of people to conform their perceptions of risk to ones that predominate in important affinity groups—ones united by shared values, cultural or political. Cultural cognition has  been shown to be an important source of cultural polarization over climate change and various other risks.

In a presentation I made here a couple of years ago, I discussed a study that examined the connection between cultural cognition and science literacy, as measured with the standard NSF Science Indicators battery. In it, we found that polarization measured with reference to cultural values, rather than abating as science literacy increases, grows more intense.

This isn’t what one would expect if one believed—as is perfectly plausible—that cultural cognition is a consequence of a deficit in science comprehension (the CSL position).

The result suggests instead an alternative hypothesis: that people are using their science comprehension capacity to reinforce their commitment to the positions on risk that predominate in their affinity groups, consistent with cultural cognition.

That hypothesis is one we have since explored in experiments. The experiments are designed to “catch” one or another dimension of science comprehension “in the act” of promoting group-convergent rather than truth- or science-convergent beliefs.

In one, we found evidence that “cognitive reflection”—the disposition to engage in “slow” conscious, analytical reasoning as opposed to “fast” intuitive, heuristic reasoning—has that effect.

But the study I want quickly to summarize for you now involves “numeracy” and cultural cognition. “Numeracy” refers not so much to the ability to do math but to the capacity and disposition to use quantitative information to draw valid causal inferences.

In the study, we instructed experiment subjects to analyze results from an experiment. Researchers had tested the effectiveness of a skin-rash cream by assigning patients to a “treatment” condition and a “control” condition and recording the results in both. Our study subjects were then supposed to figure out whether treatment with the skin cream was more likely to make the patients’ rash “better” or “worse.”

This is a standard “covariance detection” problem. Most people get the wrong answer because they use a “confirmatory hypothesis testing” strategy: they note that more patients’ rash got better than worse in the treatment condition; also that more got better in the treatment condition than in the control; and conclude that the cream makes the rash get better.

But this heuristic strategy ignores disconfirming evidence in the form of the ratio of positive to negative outcomes in the two conditions.  Patients using the skin cream were three times more likely to get better than worse; but those not using the skin cream were in fact five times more likely to get better. Using the skin cream thus makes it more likely that the rash will get worse than not using it.

By manipulating the column headings in the contingency table, we varied whether the data, properly interpreted, supported one result or the other. As one might expect, subjects in both conditions scoring low in numeracy were highly likely to get the wrong answer on this problem, which has been validated as a predictor of this same kind of error in myriad real-world settings. Indeed, subjects were likely to get the “right” answer only if they scored in about the 90th percentile on numeracy.
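The ratio comparison at the heart of the covariance-detection problem can be sketched in a few lines. The counts below are illustrative only, chosen to match the roughly three-to-one and five-to-one ratios described above (the function name and exact figures are my own, not the study's):

```python
def better_or_worse(treated_better, treated_worse, control_better, control_worse):
    """Draw the correct inference from a 2x2 contingency table.

    The naive 'confirmatory' strategy looks only at whether
    treated_better > treated_worse. The correct inference compares
    the ratio of positive to negative outcomes across the two conditions.
    """
    treatment_ratio = treated_better / treated_worse
    control_ratio = control_better / control_worse
    return "better" if treatment_ratio > control_ratio else "worse"

# ~3:1 improvement with the cream, but ~5:1 without it:
print(better_or_worse(223, 75, 107, 21))   # -> worse

# Swapping the column headings (the study's manipulation) reassigns the
# same counts to opposite outcomes and flips the correct answer:
print(better_or_worse(75, 223, 21, 107))   # -> better
```

Note that swapping the headings leaves the surface numbers on the page identical, which is why the manipulation changes which conclusion the data support without changing the difficulty of the arithmetic.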

We assigned two other groups of subjects to conditions in which they were instructed to analyze the same experiment styled as one involving a gun control ban. We again manipulated the column headings.

You can see that the results in the “gun ban” conditions are comparable to the ones in the skin-rash conditions. But obviously, the pattern is noisier.

The reason is cultural cognition.  You can see that in the skin-rash conditions, the relationship between numeracy and getting the right answer was unaffected by right-left political outlooks.

But in the gun-ban conditions, high-numeracy subjects were likely to get the right answer only when the data, properly interpreted, supported the conclusion congenial to their political values.

These are the raw data.  Here are simulations of the predicted probabilities that low- and high-numeracy subjects would get the right answer in the various conditions.  You can see that low-numeracy partisans were very unlikely to get the right answer and high-numeracy ones very likely to get it in the skin-rash conditions, and partisan differences were trivial and nonsignificant.

In the gun-ban conditions, both low- and high-numeracy partisans were likely to polarize. But the discrepancy in the probability of getting the right answer between low-numeracy subjects in each condition was much smaller than the discrepancy for high-numeracy ones.

The reason is that the high-numeracy subjects, but not the low-numeracy ones, were able to see correctly when the data supported the view that predominates in their ideological group. If the data, properly interpreted, did not support that position, however, the high-numeracy subjects used their reasoning capacity perversely: to spring open a confabulatory escape hatch that enabled them to escape the trap of logic.

This sort of effect, if it characterizes how people deal with evidence of a politically controversial empirical issue, will result in the sort of magnification of polarization conditional on science literacy that we saw in the climate-change risk perception study.

It should now be apparent why the CSL position is false, and why its prescription of improving science comprehension won’t dispel public conflict over decision-relevant science.

The problem reflected in this sort of pattern is not too little rationality, but too much. People are using their science-comprehension capacities opportunistically to fit their risk perceptions to the one that dominates in their group. As they become more science comprehending, then, the problem only gets aggravated.

But here is the critical point: this pattern is not normal. 

The number of science issues on which there is cultural polarization, magnified by science comprehension, is tiny in relation to the number on which there isn’t.

The reason people of diverse values converge on the safety of medical x-rays, the danger of drinking raw milk, the harmlessness of cell-phone radiation, etc., is not that they comprehend the science involved. Rather, it's that they make reliable use of all the cues they have access to on what’s known to science.

Those cues include the views of those who share their outlooks & who are highly proficient in science comprehension.  That's why partisans of even low- to medium-numeracy don't have really bad skin rashes!

This reliable method of discerning what’s known to science breaks down only in the unusual conditions in which positions on some risk issue—like whether the earth is heating up, or whether concealed carry laws increase or decrease violent crime—become recognizable symbols of identity in competing cultural groups. 

When that happens, the stake that people have in forming group-congruent views will dominate the stake they have in forming science-congruent ones. One’s risk from climate change isn’t affected by what one believes about climate change because one’s personal views and behavior won’t make a difference. But make a mistake about the position that marks one out as a loyal member of an important affinity group, and one can end up shunned and ostracized.

One doesn’t have to be a rocket scientist to form and persist in group-congruent views, but if one understands science and is good at scientific reasoning, one can do an even better job at it.

The meanings that make positions on a science-related issue a marker of identity are pollution in the science communication environment.  They disable individuals from making effective use of the social cues that reliably guide diverse citizens to positions consistent with the best available evidence when their science communication environment is not polluted with such meanings.

Accordingly, to dispel controversy over decision-relevant science, we need to protect and repair the science communication environment.  There are different strategies—evidence-based ones—for doing that. I’d divide them into “mitigation” strategies and “adaptation” ones.

Last point.  In saying that SCE is right and CSL wrong, I don’t mean to be saying that it is a mistake to improve science comprehension!

On the contrary.  A high degree of civic science literacy is critical to the well-being of democracy.

But in order for a democratic society to realize the benefit of its citizens’ civic science literacy, it is essential to protect its science communication environment from the toxic cultural meanings that effectively disable citizens’ powers of critical reflection.


Reader Comments (10)

Widespread scientific literacy—in the deep sense of understanding how scientific thought works, not in the shallow sense of knowing how acids and bases work—would, if such a thing existed, be a jewel of inestimable value.

For one thing, if the average person were scientifically-literate (in the deep sense) then the march of human history could have simply bypassed all the profligate stupidity of the current "climate wars". You may not have had any occasion to notice the following, but something that's clear to anybody who spends much time campaigning on the side of "disbelief" is this: the median "believer" "believes" on the basis of (among other things) a delusion about the way science works—specifically, their conviction is predicated on the false idea that majority opinion functions as a form of evidence in science. And they are *wedded* to this falsehood. It is almost impossible to get them to acknowledge it's a lie. You might be educated enough to know it's a lie, Dan, but trust me: you're in a privileged and enlightened minority. The great bulk of "believers" really *doesn't* know any better. I've met the average believer, I've argued with him or her, and I can assure you he or she is blissfully unaware that there's anything repugnant about the ideology of consensualism—you know, that anti-scientific myth Naomi Oreskes introduced into the scientific narrative for the single, narrow purpose of pressuring the uneducated into believing a certain hypothesis in a particular field of science. The average, lay "believer" really does think—literally, no kidding here Dan, don't laugh—that it's the most normal thing in the world for scientific truths to be decided by paying a special-needs teenager like John Cook to go around collecting papers and counting them and coming up with an imbecilic "97%" factoid. You and I roll our eyes at this populist bullshit. But the contempt and boredom such sports inspire in people like you and me is NOT shared by the wider public, which is—I repeat—scientifically illiterate. It would be a major mistake to assume everyone else in the world "knows" that paper-counting is a sub-scientific, demeaning practice with all the evidentiary utility of horoscopy.
I wish they knew, but they don't. When the Oreskeites act out these vulgar little farces, ORDINARY PEOPLE THINK THEY'RE WITNESSING A FORM OF SCIENCE.

And, if I might gently rebuke a friend, I don't think it exactly helps matters for you to characterise climate science as "banal"—this unfortunate choice of adjectives can surely only serve to reinforce the popular misconception that the various superstitious rituals that have infiltrated and displaced much of what was once "climate science" are somehow *normal.* This is wrong. These bad new logics are decidedly *abnormal*. John Cook is *abnormal*. Naomi Oreskes, who "is teaching a course in Consensus Science in Vienna this summer," has no precedent in 250 years of modern science. Her existence is pathological, teratomatous. Proper science has never—and couldn't possibly—work the way these charlatans pretend it works.

Want to end the climate wars overnight, Dan?

Simples: you just need to invent and synthesise a water-soluble biomolecule (suitable for occult dispersal in the drinking-water supply) that causes everyone who drinks it to suddenly remember what they should have learned from grade school: that science has nothing AT ALL to do with consensus. I guarantee that opinion polls will then show an immediate and decisive worldwide shift to climate disillusionment. I'd be very surprised if as many as 1 in 10 "believers" still "believed" after realizing that they'd been systematically lied to about the entire epistemological basis of science.

Alternatively, if you don't have access to a chemical-warfare lab, you could try explaining this to the "believer" population.

Either way: climate peace in our time. World's scientists now free to work on non-imaginary problems like cancer. Daniel Kahan awarded Nobel Peace Prize.

December 26, 2013 | Unregistered CommenterBrad Keyes

After 18 years of substantial increase in CO2 with no increase in temperature, the global warming climate models have been scientifically invalidated to a 95% confidence level. The reason that skeptics of catastrophic global warming are more scientifically literate is that more of them have actually analyzed the data and come to the proper scientific conclusion based on that data.

February 17, 2015 | Unregistered CommenterMike


so you're trying to tell us:

the MORE scientifically-literate someone is, the more likely he/she will come to the RIGHT conclusion?

I don't buy it.

Too obvious.


February 18, 2015 | Unregistered CommenterBrad Keyes


Gotta agree w/ @Brad on this one.

BTW, I get why you think that "skeptics of catastrophic global warming are more scientifically literate."

As the post plainly states, the most scientifically literate citizens are the most polarized on AGW.

February 18, 2015 | Registered CommenterDan Kahan


The one thing Dan agrees with me on and

- I can't tell what he's agreeing with me on
- I can't tell what Dan thinks I'm disagreeing with Mike on
- I can't tell what Dan is disagreeing with Mike on

Also, Dan seems to be having one of those laughing-at-a-cosmic-joke-nobody-else-is-privy-to moments:

BTW, I get why you think that "skeptics of catastrophic global warming are more scientifically literate."
Well, come on! Tell us Dan.
As the post plainly states, the most scientifically literate citizens are the most polarized on AGW.
Argh. Dan, are you telling us that by telling us that, you're telling us why you think Mike thinks what he thinks? Or are you telling us something unrelated to the last thing?

February 19, 2015 | Unregistered CommenterBrad Keyes

The posts above are a great argument for the concept of systematic review processes designed to reduce confirmation bias. I wouldn't trust any evidential synthesis from someone so passionate about their view of the truth. I see a neon flashing sign saying confirmation bias. And no, I am not particularly passionate about holding either side of the climate debate.

May 17, 2015 | Unregistered CommenterNeil Barr


Not sure who strikes you as too passionate -- perhaps me -- but I'd say (whether me or not) "passion in defense of truth of one's hypothesis is no vice -- so long as one is willing to update based on valid data." Einstein was pretty hot for relativity even before Eddington's confirmation.

As for peer review, it's okay but hardly a guarantee against confirmation bias...

Actually, in my view, the only reliable peer review is the one that starts after publication & that never ends.

May 18, 2015 | Registered CommenterDan Kahan

Among false arguments, the claim that it is somehow irrelevant that a large majority (somewhere between 80 and 99 percent) of qualified experts find uncontroversial the basis of our understanding, namely that heat-trapping greenhouse gases accumulating in our atmosphere have increased the energy (heat) in the system (global warming), which is disrupting our planetary circulation (climate change), is clever but neither good nor correct.

If your doctor said you had a problem, you would seek out the best opinions according to professional norms. You would not seek out people who said you didn't have a problem because you didn't want to believe there was a problem.

This argument has taken hold, but it is dishonest and burks most of the known facts.

The substantial infrastructure of unskeptical "skepticism" includes think tanks and anyone and everyone (Lord Monckton, for example) who will present sciencey-looking arguments that are hard for the layperson to decipher. This is supported by the wealthiest industries on earth and derives from knowledge gained about how to maximize uncertainty and fake legitimate processes since the era of big tobacco.

Given our predisposition to avoid difficult problems, this false comfort is doing a serious disservice to the future of all of us.

June 9, 2015 | Unregistered CommenterSusan Anderson


The belief that the principle that the fantasy that the experts agree that the hypothesis that current atmospheric CO2 concentration is superoptimal is uncontroversial is meaningless is negotiable is proof pathognomonic that that literacy that we call scientific is missing from the list of things that you’ve studied.

Again: I’m not sure how much clearer this can be put. When I say that the belief that the principle that the fantasy that the experts agree that the hypothesis that current atmospheric CO2 concentration is superoptimal is uncontroversial is meaningless is negotiable is wrong, what part of that do you (sans scientific literacy) suspect I (despite my scientific literacy) am inventing?

I'm not even going to try to demonstrate the truth of what I just said. I've explained it about a billion times on the Internet by now. (Some scientists believe it's closer to 2 billion.)

Also, the resort to the perseverant and perhaps ineradicable trope about doctors and diagnoses and second opinions, NONE OF WHICH exemplify or pertain to or prove anything about science, is another shibboleth that betrays the non-native speaker of the language of science.

Listen, Susan.

As I told you God knows how many years ago, diagnosing a patient is not even vaguely analogous to forming, holding, expressing, defending or testing a scientific hypothesis.

A scientific hypothesis, if vindicated, ALTERS humanity’s understanding of how nature works.

The treatment of patients (the stuff doctors do from 9 to 5) CANNOT POSSIBLY alter it, and doesn’t even attempt to.

It takes humanity’s existing knowledge AS A PREMISE.

Doctors are NOT medical scientists. If a doctor practices medical science during office hours, that’s a malpractice suit begging to be won.

A DOCTOR runs TESTS on a PATIENT to figure out what’s going on with HIS/HER body.

A MEDICAL SCIENTIST runs EXPERIMENTS on SUBJECTS to figure out how bodies IN GENERAL work.

The act of diagnosis is your doctor’s attempt to tell YOU what’s wrong with YOUR body when YOU complain of a tingling, painful left arm that wakes you in the middle of night and gets better after an hour or so but doesn’t respond to paracetamol and is interfering with your day-to-day life.

To do that, your doctor must first read about, learn, accept, believe, be examined on, be certified on, remember all the things “we” (humans) know about “the” (human) body, arm, pain, paracetamol, sleep/wake, sleep deprivation, effects of sleep deprivation on work safety, etc…

and only THEN will she apply this knowledge in her head to deduce what’s wrong with the individual (you) in front of her.

Get the difference?

If no, indicate no by saying “no.”



June 10, 2015 | Unregistered CommenterBrad Keyes

Thanks, Dan, for an excellent article. It makes me stop to rethink my own use of data in debates of this kind. Hopefully the next article contains the "punch line" explaining effective strategies for combating this toxic environment. This article makes me want to dig deeper and explore this site in depth, so well done.

September 15, 2015 | Unregistered CommenterAdam Kuehn
