Some data on CRT & "Republican" & "Democratic brains" (plus CRT & religion, gender, education & cultural worldviews)
This is the latest in a series of posts (see here, here, here, here ...) on the relationship between ideology &/or cultural worldviews, on the one hand, and cognitive reasoning dispositions, on the other.
I've now got some new data that speak to this question -- & that say things inconsistent with the increasingly prominent claim that conservative ideology is associated with low-level information processing.
If you already know all about the issue, just skip ahead to "2. New data"; if you are new to the issue or want a brief refresher, read "1. Background" first.
1. Background
As discussed in a recent post, a series of studies have come out recently that present evidence--observational and (interestingly!) experimental--showing that the tendency to use heuristic or system 1 information processing ("fast" in Kahneman terms, as opposed to "slow" systematic or system 2) is associated with religiosity.
I expressed some agitation about the absence of reported data on the relationship between system 1/system 2 reasoning dispositions and ideology.
The source of my interest in such data is the increasing prevalence of what I'll call -- in recognition of Chris Mooney's role in synthesizing the underlying studies -- the Republican Brain Hypothesis (RBH). RBH posits a relationship between conservative political positions and use of low-effort, low-quality, biased, etc. reasoning styles. RBH proponents-- Mooney in particular-- conclude that this link makes Republicans dismissive of policy-relevant science and is thus responsible for the political polarization that surrounds climate change.
Although I very much respect Mooney's careful and fair-minded effort to assemble the evidence in support of RBH, I remain unpersuaded. First, RBH doesn't fit cultural cognition experimental results, which show that the tendency to discount valid scientific evidence when it has culturally non-congenial implications is prominent across the ideological spectrum (or cultural spectra).
Second, as far as I can tell, RBH studies have all featured questionable measures of low-level information processing. The only validated measures of system 1 vs. system 2 dispositions -- i.e., the only ones that have been shown to predict the various forms of cognitive bias identified in decision science -- are Shane Frederick's Cognitive Reflection Test (CRT) and Numeracy (of which CRT is a subcomponent). The RBH studies tend to feature highly suspect measures like "need for cognition," which are based on study subjects' own professed characterizations of their tendency to engage in critical thinking.
So why are researchers who are interested in testing RBH not using (or if they are using, not reporting data on) the relationship between CRT & political ideology?
A few months ago, I reported in a blog post some data suggesting that being Republican and conservative has a small positive correlation with CRT. In other words, being a conservative Republican predicts being slightly more disposed to use systematic or system 2 reasoning.
The relationship was too small to be of practical importance -- to be a plausible explanation for political polarization on issues like climate change -- in my view. But the point was that the data suggested the opposite of what one would expect if one credits RBH!
The relationship between CRT and the cultural worldview measures was similarly inconsequential -- very small, off-setting correlations with Hierarchy and Individualism, respectively.
2. New data
Okay, here are some new CRT (Cognitive Reflection Test) data that reinforce my doubt about RBH (the "Republican Brain Hypothesis").
The data come from an on-line survey carried out by the Cultural Cognition Project using a nationally representative sample (recruited by the opinion-research firm Polimetrix) of 900 U.S. adults.
The survey included the 3-item CRT test, various demographic variables, partisan self-identification (on a 7-point scale), self-reported liberal-conservative ideology (on a 5-point scale) and cultural worldview items.
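The three CRT items are the standard ones from Frederick's published test: the bat-and-ball, widget, and lily-pad problems. As a rough illustration of how responses get scored, here is a minimal sketch; the response format (numeric answers keyed by item name) is my own assumption, not the survey's actual coding.

```python
# Standard answer key for the 3-item CRT (Frederick 2005).
# The intuitive (wrong) answers are noted in comments.
CRT_KEY = {
    "bat_and_ball": 5,   # cents   (intuitive answer: 10)
    "widgets": 5,        # minutes (intuitive answer: 100)
    "lily_pads": 47,     # days    (intuitive answer: 24)
}

def score_crt(responses):
    """Return the number of correct answers (0-3) for one respondent."""
    return sum(1 for item, correct in CRT_KEY.items()
               if responses.get(item) == correct)
```

A respondent who gives all three intuitive answers scores 0, which -- as it turns out below -- is where most of the action in the data is.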
Key findings include:
- Higher levels of education and greater income both predict higher CRT, as does being white and being male. These are all results one would expect based on previous studies.
- Also consistent with the newer interesting studies, religiosity predicts lower CRT. (I measured religiosity with a composite scale that combined responses to self-reported church attendance, self-reported personal importance of religion, and self-reported frequency of prayer; α = 0.87).
- However, liberal-conservative ideology has essentially zero impact on CRT, and being more Republican (on the 7-point partisan self-identification measure; but also in simple binary correlations) predicts higher CRT. Not what one would expect if one were betting on RBH!
- Being more individualistic than communitarian predicts higher CRT; being more hierarchical than egalitarian predicts essentially nothing. Also not in line with RBH, since these cultural orientations are both modestly correlated with political conservatism.
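For readers curious about the mechanics of the religiosity composite mentioned above: Cronbach's α for a multi-item scale is computed from the item variances and the variance of the summed score. Here is a minimal sketch with made-up responses (the actual survey data are not reproduced here):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the sum score
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical responses to the three religiosity items
# (church attendance, importance of religion, frequency of prayer):
responses = [[1, 1, 2],
             [2, 2, 2],
             [3, 3, 4],
             [4, 5, 4]]
```

When the items move together closely, as in this toy data, α approaches 1; an α of 0.87 on the real data indicates the three items cohere well as a single scale.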
Now, those are the simple, univariate correlations between the individual characteristics and CRT (click on the thumbnail, right, for the correlation matrix).
But what is the practical significance of these relationships?
To illustrate that, I ran a series of ordered logistic regression analyses (if you'd like to inspect the outputs, click on the thumbnail to the left). The results indicate the likelihood that someone with the indicated characteristic would get 0, 1, 2, or all 3 answers correct on the CRT.
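For concreteness, the predicted probabilities from an ordered logistic model are just differences of cumulative logistic curves evaluated at the model's cutpoints. A minimal sketch of that arithmetic -- the cutpoints and linear predictor below are made-up numbers for illustration, not the fitted model's:

```python
import numpy as np

def ologit_probs(xb, cuts):
    """Predicted P(Y = 0), ..., P(Y = k) for an ordered-logit model,
    given the linear predictor xb and k ascending cutpoints."""
    cuts = np.asarray(cuts, dtype=float)
    cdf = 1.0 / (1.0 + np.exp(-(cuts - xb)))      # cumulative P(Y <= j)
    cdf = np.concatenate([cdf, [1.0]])            # P(Y <= k) is 1 by definition
    return np.diff(np.concatenate([[0.0], cdf]))  # cell probabilities

# Made-up cutpoints for a 0-3 CRT score, and a baseline linear predictor of 0
probs = ologit_probs(0.0, [0.4, 1.6, 2.9])
```

With cutpoints like these, most of the probability mass sits at a score of 0, so a predictor that raises xb mainly reduces the chance of scoring 0 -- one way to see why the effects described next concentrate there.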
As illustrated in the Figures above, these analyses reveal that the impact of all of these predictors is concentrated on the likelihood that someone will get 0 as opposed to 1, 2, or 3 answers correct. That is, the major difference between people with the "high-CRT" characteristic and those with the "low-CRT" one is that the former are less likely to end up with a goose egg on the test.
Indeed, that's all that's going on for both religiosity and partisan self-identification; there's no significant (& certainly no meaningful!) difference in the likelihood that those who are high vs. low in religiosity, or who are Republican in self-identification vs. Democrat, will get 1, 2 or 3 answers correct--only whether they will get more than 0.
The likelihood of getting 1 or 2 correct, but not 3, is higher for men vs. women and for more educated vs. less educated individuals. But the differences -- all of them -- look pretty trivial to me. (Not that surprising; few people are disposed to engage in system 2 reasoning on a consistent basis.)
Note, too, that there's essentially no difference between "hierarchical individualists" and "egalitarian communitarians," the members of the cultural communities most divided on environmental issues including climate change. Also none when liberal-conservative ideology and party affiliation are combined.
These models look at each predictor of interest in relation to CRT in isolation from the others. I think it's easy to generate a jumbled, meaningless model by indiscriminately "controlling" for covariates like race, religiosity, and even gender when trying to assess the impact of ideologies and cultural worldviews, or by "controlling" for ideology when assessing the impact of worldviews or vice versa. People come in packages of these attributes, so if we treat them as "independent variables" in a regression, we aren't modeling people in the real world (more on this topic in some future post).
But just to satisfy those who are curious, I've also included a "kitchen sink" multivariate model of that sort. What it shows is that religiosity, race, education, and income all predict CRT independently of one another and independently of ideology and cultural worldviews. In such a model, however, neither ideology nor cultural worldviews significantly predicts CRT.
3. Bottom line
So to sum up -- when we use CRT as the measure of how well people process information, there's no support for RBH. In fact, the zero-order effect for political-party affiliation is in the wrong direction. But the important point is that the effects are just too small to be of consequence -- too tiny to be at the root of the large schisms between people with differing ideological and cultural worldviews over issues involving policy-relevant science.
What does explain those divisions, I believe, is motivated reasoning, a particular form of which is what we are looking at in studies of cultural cognition.
The lack of a meaningful correlation between CRT, on the one hand, and cultural worldviews and political ideologies, on the other, is perfectly consistent with this explanation for risk-perception conflicts, because the evidence that supports the explanation seems to show that motivated reasoning is ample across all cultural and ideological groups.
Indeed, motivated reasoning, it has long been known (although recently forgotten, apparently), affects both system 1 (heuristic) and system 2 (systematic) reasoning. Accordingly, far from being a "check" on motivated reasoning, a disposition to use system 2 more readily should actually magnify the impact of this sort of distortion in thinking.
To be sure, being disposed to use heuristic reasoning -- or simply unable to engage in more technical, systematic modes of thought -- will produce all sorts of really bad problems. But the problem of cultural polarization over policy-relevant science just isn't one of them.
In my opinion, the sooner we get that, the sooner we'll figure out a constructive solution to the real problems of science communication in a diverse, democratic society.
David Hoffman asked what seemed -- both immediately & on reflection -- to be an interesting question about CRT & latency. One might expect correct answers to take longer than incorrect ones if CRT is a valid measure of system 2 dispositions (although there are other possible effects that might cut the other way). Turns out that Andrew Meyer, a doctoral advisee of Shane Frederick, has been investigating this question (in between 24-48 hour sessions helping Shane figure out a positive-EV video poker algorithm, which Shane is now free to work on full time after having been granted tenure). Meyer responds:
The relation between time on the CRT and various correlates (intertemporal choice, risky choice, belief in god) is pretty murky. But in our data, people who get the correct answer certainly do spend more time than people who get the intuitive answer. From the data set that I have in front of me: intuitive bat and ball takes 26 seconds; correct bat and ball takes 50. And interestingly, when you start messing with the problem very much at all, bolding parts or preceding it with related questions, that difference in latency attenuates.