Key Insight


My paper Ideology, Motivated Reasoning, and Cognitive Reflection was published today in the journal Judgment and Decision Making.

I’ve blogged on the study that is the focus of the paper before. In those posts, I focused on the relationship of the study to the “asymmetry thesis,” the view that ideologically motivated reasoning is distinctive of (or at least disproportionately associated with) conservatism.

The study does, I believe, shed light on (by ripping a fairly decent-sized hole in) the asymmetry thesis. But the actual motivation for and significance of the study lie elsewhere.

The cultural cognition thesis (CCT) holds that individuals can be expected to form risk perceptions that reflect and reinforce their connection to groups whose members subscribe to shared understandings of the best life and the ideal society.

It is opposed to various other accounts of public controversy over societal risks, the most significant of which, in my view, is the bounded rationality thesis (BRT).

Associated most prominently with Kahneman’s account of dual process reasoning, BRT attributes persistent conflict over climate change, nuclear power, gun control, the HPV vaccine, etc. to the public’s over-reliance on rapid, visceral, affect-laden, heuristic reasoning—“System 1” in Kahneman’s terms—as opposed to more deliberate, conscious, analytical reasoning— “System 2,” which is the kind of thinking, BRT theorists assert, that characterizes the risk assessments of scientists and other experts.

BRT is quite plausible—indeed, every bit as plausible, I’m happy to admit, as CCT. Nearly all interesting problems in social life admit of multiple plausible but inconsistent explanations. Likely that’s what makes them interesting. It’s also what makes empirical testing—as opposed to story-telling—the only valid way to figure out why such problems exist and how to solve them.

In my view, every Cultural Cognition Project study is a contribution to the testing of CCT and BRT.  Every one of them seeks to generate empirical observations from which valid inferences can be drawn that give us more reason than we otherwise would have had to view either CCT or BRT as more likely to be true.

In one such study, CCP researchers examined the relationship between perceptions of climate change risk, on the one hand, and science literacy and numeracy, on the other. If the reason that the public is confused (that’s one way to characterize polarization) about climate change and other risk issues (we examined nuclear power risk perceptions in this study too) is that it doesn’t know what scientists know or think the way scientists think, then one would expect convergence in risk perceptions among those members of the public who are highest in science literacy and technical reasoning ability.

The study didn’t find that.  On the contrary, it found that members of the public highest in science literacy and numeracy are the most divided on climate change risks (nuclear power ones too).
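The interaction pattern at issue—polarization growing rather than shrinking as science literacy and numeracy increase—can be illustrated with a small simulation. All numbers below are made up for illustration; they are not the study’s data or coefficients.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Hypothetical simulation of the pattern CCT predicts: the gap between
# cultural groups' climate risk perceptions *widens* with literacy.
literacy = rng.uniform(0, 1, n)        # 0 = lowest, 1 = highest literacy/numeracy
group = rng.choice([-1, 1], n)         # two opposing cultural groups
# Risk perception = baseline + group-congruent shift that scales with
# literacy (an interaction term), plus noise. Coefficients are invented.
risk = 5 + 2.0 * group * literacy + rng.normal(0, 0.5, n)

def group_gap(mask):
    """Mean difference in risk perception between the two groups in a subsample."""
    return risk[mask & (group == 1)].mean() - risk[mask & (group == -1)].mean()

low_gap = group_gap(literacy < 0.33)    # divergence among the least literate
high_gap = group_gap(literacy > 0.67)   # divergence among the most literate
print(f"gap among low-literacy respondents:  {low_gap:.2f}")
print(f"gap among high-literacy respondents: {high_gap:.2f}")
```

Under BRT one would instead expect the high-literacy gap to be the smaller of the two, since those respondents are best positioned to “think like scientists”; the observed widening is what cuts against it.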

That’s contrary to what BRT would predict, particularly insofar as numeracy is a very powerful indicator of the disposition to use “slow” System 2 reasoning.

That science literacy and numeracy magnify rather than dissipate polarization is strongly supportive of CCT.  If people are unconsciously motivated to fit their perceptions of risk and comparable facts to their group commitments, then those who enjoy highly developed reasoning capacities and dispositions can be expected to use those abilities to achieve that end.

In effect, by opportunistically engaging in System 2 reasoning, they’ll do an even “better” job at forming culturally congruent perceptions of risk.

Now enter Ideology, Motivated Reasoning, and Cognitive Reflection. The study featured in that paper was aimed at further probing and testing that interpretation of the results of the earlier CCP study on science literacy/numeracy and climate change polarization.

The Ideology, Motivated Reasoning, and Cognitive Reflection study was in the nature of an experimental follow-up aimed at testing the hypothesis that individuals of diverse cultural predispositions will use their “System 2” reasoning dispositions opportunistically to form culturally congenial beliefs and avoid forming culturally dissonant ones.

The experiment reported in the paper corroborates that hypothesis. That is, it shows that individuals who are disposed to use “System 2” reasoning—measured in this study by use of the Cognitive Reflection Test, another performance-based measure of the disposition to use deliberate, conscious (“slow”) as opposed to heuristic-driven (“fast”) reasoning—exhibit greater motivated reasoning with respect to evidence that either affirms or challenges their ideological predispositions.
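For readers unfamiliar with the Cognitive Reflection Test: each of Frederick’s three classic items pits an intuitive-but-wrong “System 1” answer against a correct answer that typically requires deliberate “System 2” checking. A toy scorer (illustrative only, not the paper’s scoring code) might look like this:

```python
# Frederick's three classic CRT items, each with the correct answer and
# the intuitive lure. (Units: bat-and-ball in dollars, widgets in minutes,
# lily pads in days.)
CRT_ITEMS = {
    "bat_ball":  {"correct": 0.05, "intuitive": 0.10},   # ball costs $0.05, not $0.10
    "widgets":   {"correct": 5,    "intuitive": 100},    # 5 minutes, not 100
    "lily_pads": {"correct": 47,   "intuitive": 24},     # 47 days, not 24
}

def score_crt(responses):
    """Return (reflective_count, intuitive_count) for a dict of responses."""
    reflective = sum(responses.get(k) == v["correct"] for k, v in CRT_ITEMS.items())
    intuitive = sum(responses.get(k) == v["intuitive"] for k, v in CRT_ITEMS.items())
    return reflective, intuitive

# A respondent who falls for the bat-and-ball lure but reflects on the rest:
print(score_crt({"bat_ball": 0.10, "widgets": 5, "lily_pads": 47}))  # → (2, 1)
```

The number of correct (“reflective”) answers is the standard CRT score; higher scores indicate a stronger disposition to override the fast, intuitive response.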

The evidence on which subjects demonstrated motivated reasoning concerned how “closed-minded” and “unreflective” individuals of opposing ideologies are.

“Closed-mindedness” is a very undesirable trait generally.

It’s also what those on each side of politically polarized debates like the one over climate change identify as the explanation for the other’s refusal to accept what each side sees as the clear empirical evidence in favor of its own position.

One might thus expect individuals who have a stake in forming perceptions of facts congenial to their cultural commitments to react in a defensive way to evidence that those who share their commitments are less “open-minded” and “reflective” than those who harbor opposing commitments.