So today I’ll finally tell you what we did in the information-seeking experiment featured in our new paper “Science Curiosity and Political Information Processing.”
It was pretty darn simple.
We assigned subjects to one of two conditions. In each, subjects were presented with two news story headlines: a “climate realist” one, which announced that scientists had uncovered evidence consistent with human-caused climate change; and a “climate skeptical” one, which announced that scientists had uncovered evidence that qualified or called into question the human contribution to climate change.
The difference in the conditions concerned the relative novelty of the opposing pieces of scientific evidence being featured in the respective headlines.
Thus, in Condition 1 (“Realist surprising, Skeptical unsurprising”), the realist headline read “Scientists Report Surprising Evidence: Arctic Ice Melting Even Faster Than Expected,” while the skeptical one read “Scientists Find Still More Evidence that Global Warming Actually Slowed in Last Decade.”
In contrast, in Condition 2 (“Realist unsurprising, Skeptical surprising”), the realist headline read “Scientists Find Still More Evidence Linking Global Warming to Extreme Weather,” while the skeptical one read “Scientists Report Surprising Evidence: Ice Increasing in Antarctic, Not Currently Contributing To Sea Level Rise.”
Subjects were instructed to “pick the story most interesting to you,” and told they’d be asked some questions after they finished reading it.
Aversion to “counterattitudinal” information—that is, information that is contrary to one’s political outlooks—is one of the characteristic manifestations of politically motivated reasoning. When given the option, partisans tend to seek out information that is consistent with their predispositions rather than information that is contrary to them (Hart, Albarracín et al. 2009).
That’s exactly what we observed among subjects who were relatively low in science curiosity.
Among subjects who were relatively high in science curiosity, however, we saw the opposite effect. Thus, relatively right-leaning science-curious subjects—who tended to be climate skeptical—nevertheless preferred the novel or “surprising” realist news story over the unsurprising skeptical story.
Likewise, relatively left-leaning science-curious subjects—who tended to be climate concerned—preferred the surprising skeptical story over the unsurprising realist one.
The effect sizes, moreover, were quite large: moderately science-curious subjects were on average 32 percentage points (± 19, LC = 0.95) more likely to select the story that was contrary to their political predispositions than were moderately science-incurious ones.
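To make the arithmetic behind a claim like that concrete, here is a minimal sketch of the textbook two-proportion comparison, using made-up selection counts rather than the paper’s actual data (the paper’s estimate is model-based, so this is only an illustration of the form of the quantity):

```python
import math

def prop_diff_ci(k1, n1, k2, n2, z=1.96):
    """Difference in two sample proportions (p1 - p2) with a
    Wald-approximation confidence interval (z = 1.96 for ~95%)."""
    p1, p2 = k1 / n1, k2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Hypothetical counts: 66 of 100 science-curious subjects vs. 34 of 100
# science-incurious subjects picked the counterattitudinal story.
diff, lo, hi = prop_diff_ci(66, 100, 34, 100)
print(f"difference = {diff:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```

A “± 19” interval around a 32-point difference, as reported, simply means the interval runs from 13 to 51 percentage points at the 0.95 level of confidence.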
We were motivated to investigate this hypothesis by an unexpected observation in our “science of science filmmaking” studies. As subjects’ science curiosity increased, their perceptions of contentious risks tended to move in the same direction regardless of their political outlooks. Moreover, high-curiosity subjects seemed to resist the normal tendency of individuals to polarize as their proficiency in science comprehension increases.
We surmised that these individuals might be indulging their appetite for surprise by more readily examining evidence that contravened their political predispositions. Being exposed to a greater volume of “counterattitudinal data,” they’d form views that were more uniform and less prone to polarization conditional on science comprehension.
The experiment results supported this hypothesis.
Does this “prove” that science curiosity negates politically motivated reasoning?
It’s a mistake to think empirical evidence ever proves anything.
What it does, if it is the product of a valid design, is furnish more reason than one otherwise would have had for crediting one competing account of some phenomenon over another.
As I explained yesterday, the hypothesis that science curiosity offsets politically motivated reasoning is a plausible conjecture, but so is the hypothesis that science curiosity, like other cognitive elements of science comprehension, magnifies this biased form of information processing.
On the scale that registers the strength of the evidence for these respective hypotheses, the experiment result puts an increment of weight down on the side of the first hypothesis.
How much weight?
Well, you can decide that!
But if you are curious about our own views, read the paper: it catalogs our qualifications and sources of residual uncertainty, and outlines a set of questions for further investigation.
We’re really curious to see if this result stands up to even more critical testing!
Hart, W., Albarracín, D., Eagly, A.H., Brechan, I., Lindberg, M.J. & Merrill, L. Feeling validated versus being correct: a meta-analysis of selective exposure to information. Psychological Bulletin 135, 555-588 (2009).