Wednesday, Jul 23, 2014

Constructing an "Ordinary climate science intelligence" assessment: a fragment ...

From Climate Science Communication and the Measurement Problem, Advances in Pol. Psych. (forthcoming):

6.  Measuring what people know about climate science

What do members of the public know about scientific evidence on climate science? Asking whether they “believe in” human-caused climate change does not measure that.  But that does not mean what they know cannot be measured.

a. A disentanglement experiment: the “Ordinary Climate Science Intelligence” instrument. Just as general science comprehension can be measured with a valid instrument, so can comprehension of the science on climate change in particular. Doing so requires items the responses to which validly and reliably indicate test-takers’ climate science comprehension level.

The idea of “climate science comprehension” is hardly straightforward. If one means by it the understanding of and facility with relevant bodies of knowledge essential to doing climate science research, then any valid instrument is certain to show that the level of climate science comprehension is effectively zero in all but a very tiny fraction of the population.

But there are many settings in which the quality of non-experts’ comprehension of much more basic elements of climate science will be of practical concern. A high school science teacher, for example, might aim to impart an admittedly non-expert level of comprehension in students for the sake of equipping and motivating them to build on it in advanced studies. Likewise, without being experts themselves, ordinary members of the public can be expected to benefit from a level of comprehension that enables them reliably to recognize and give proper effect to valid climate science that bears on their decisionmaking, whether as homeowners, businesspeople, or democratic citizens.

Assume, then, that our goal is to form an “ordinary climate science intelligence” (OCSI) instrument. Its aim would certainly not be to certify possession of the knowledge and reasoning dispositions that a climate scientist’s professional judgment comprises. It will come closer to the sort of instrument a high school teacher might use, but even here it will no doubt fall short of delivering a sufficiently complete and discerning measure of the elements of comprehension he or she is properly concerned to instill in students. What the OCSI should adequately measure—at least this would be the aspiration for it—is a form of competence in grasping and making use of climate science that an ordinary person would benefit from in the course of participating in ordinary decisionmaking, individual and collective.

There are two challenges in constructing such an instrument.  The first and most obvious is the relationship between climate change risk perceptions and individuals’ cultural identities.  To be valid, the items that the assessment comprises must be constructed to measure what people know about climate science and not who they are.

A second, related problem is the potential for confounding climate science comprehension with an affective orientation toward global warming risk.  Perceptions of societal risk generally are indicators of a general affective orientation. The feelings that a putative risk source evokes are more likely to shape than be shaped by individuals’ assessments of all manner of factual information pertaining to it (Loewenstein et al. 2001; Slovic et al. 2004).  There is an ambiguity, then, as to whether items that elicit affirmation or rejection of factual propositions relating to climate change are measuring genuine comprehension or instead only the correspondence between the propositions in question and the valence of respondents’ affective orientations toward global warming. Existing studies have found, for example, that individuals disposed to affirm accurate propositions relating to climate change—that burning fossil fuels contributes to global warming, for example—are highly likely to affirm many inaccurate ones—e.g., that atmospheric emissions of sulfur do as well—if those statements evince concern over environmental risks generally (Tobler, Visschers & Siegrist 2012; Reynolds et al. 2010).

Two steps were taken to address these challenges in constructing an OCSI instrument, which was then administered to the same survey participants whose general science comprehension was measured with the OSI scale. The first was to rely on an array of items the correct responses to which were reasonably balanced between opposing affective orientations toward the risk of global warming. The multiple-choice item “[w]hat gas do most scientists believe causes temperatures in the atmosphere to rise” (“Carbon”) and the true-false one “human-caused global warming will result in flooding of many coastal regions” (“Floods”) evince concern over global warming and thus could be expected to be answered correctly by respondents affectively predisposed to perceive climate change risks as high. The same affective orientation, however, could be expected to incline respondents to give the incorrect answer to items such as “human-caused global warming will increase the risk of skin cancer in human beings” (“Cancer”) and “the increase of atmospheric carbon dioxide associated with the burning of fossil fuels will reduce photosynthesis by plants” (“Photosynthesis”). By the same token, respondents affectively disposed to be skeptical of climate change risks could be expected to supply the correct answers to Cancer and Photosynthesis but the wrong ones to Carbon and Floods. The only respondents one would expect to answer all four correctly are those who know the correct responses and are disposed to give them independent of their affective orientations.
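The balancing logic can be illustrated in code. What follows is a minimal sketch, not the authors' scoring procedure: all four items are reduced to true-false propositions whose affirmation evinces concern over global warming (Carbon is in fact multiple-choice and is simplified here), and respondents are modeled, hypothetically, as answering either from affect alone or from knowledge.

```python
ITEMS = {  # item name -> scientifically correct response (simplified to T/F)
    "Carbon": True,           # CO2 causes atmospheric temperatures to rise
    "Floods": True,           # warming will flood many coastal regions
    "Cancer": False,          # warming will NOT increase skin-cancer risk
    "Photosynthesis": False,  # more CO2 will NOT reduce plant photosynthesis
}

def affect_driven(concerned):
    """Affirm every concern-evincing proposition iff affectively concerned."""
    return {item: concerned for item in ITEMS}

def knowledge_driven():
    """Answer according to the best available scientific evidence."""
    return dict(ITEMS)

def score(answers):
    return sum(answers[item] == correct for item, correct in ITEMS.items())

print(score(affect_driven(concerned=True)))   # 2: Carbon and Floods right
print(score(affect_driven(concerned=False)))  # 2: Cancer and Photosynthesis right
print(score(knowledge_driven()))              # 4: only knowledge gets all four
```

Either affective orientation, answering on valence alone, tops out at half the items; a perfect score requires responses independent of affect, which is what makes the balanced array diagnostic of knowledge.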

The aim of disentangling (unconfounding) affective orientation and knowledge was complemented by a more general assessment-construction tenet, which counsels use of items that feature incorrect responses likely to seem correct to those who do not genuinely possess the knowledge or aptitude being assessed (Osterlind 1998). Because the recent hurricanes Sandy and Irene both provoked considerable media discussion of the impact of climate change, the true-false item “[h]uman-caused global warming has increased the number and severity of hurricanes around the world in recent decades” was expected to elicit an incorrect response from many climate-concerned respondents of low or modest comprehension (who presumably would be unaware of the information the IPCC 5th Assessment (2013, I: TS p. 73) relied upon in expressing “low confidence” in “attributions of changes in tropical cyclone activity to human influence” to date, based on a “low level of agreement between studies”). Similarly, the attention furnished in the media to the genuine decrease in the rate at which global temperatures increased in the last 15 years was expected to tempt respondents, particularly ones affectively disposed toward climate-change skepticism, to give the incorrect response to the true-false item “globally averaged surface air temperatures were higher for the first decade of the twenty-first century (2000-2009) than for the last decade of the twentieth century (1990-1999).”

The second step taken to address the distinctive challenge of constructing a valid OCSI assessment was to introduce the majority of items with the clause “Climate scientists believe that . . . .” The goal was to reproduce the effect of the clause “According to the theory of evolution . . .” in eliminating the response differential among religious and nonreligious individuals to the NSF Indicators’ Evolution item. It is plausible to attribute this result to the clause’s removal of the conflict relatively religious respondents experience between offering a response that expresses their identity and one that signifies their familiarity with a prevailing or consensus position in science. It was anticipated that using the “Climate scientists believe” clause (and similar formulations in other items) would enable respondents whose identity is expressed by disbelief in human-caused global warming to answer OCSI items based instead on their understanding of the state of the best currently available scientific evidence.

To be sure, this device created the possibility that respondents who disagree with climate scientists’ assessment of the best available evidence could nevertheless affirm propositions that presuppose human-caused climate change. One reason not to expect such a result is that public opinion studies consistently find that members of the public on both sides of the climate debate do not think their side’s position is contrary to scientific consensus (Kahan et al. 2011).

It might well be the case, however, that what such studies are measuring is not ordinary citizens’ knowledge of the state of scientific opinion but their commitment to expressing who they are when addressing questions equivalent to “belief in” global warming. If their OCSI responses show that individuals whose cultural identity is expressed by denying the existence of human-caused global warming nevertheless do know what scientists believe about climate change, then this would be evidence that it is the “who are you, whose side are you on” question and not the “what do you know” one that they are answering when they address the issue of global warming in political settings.

Ultimately, the value of the information yielded by the OCSI responses does not depend on whether citizens “believe” what they say they know “climate scientists believe.” Whether they do or not, their answers would necessarily remain valid measures of what such respondents understand to be scientists’ view of the best available evidence. Correct perception of the weight of scientific opinion is itself a critical form of science comprehension, particularly for individuals in their capacity as democratic citizens. Items that successfully unconfound “who are you, whose side are you on” from “what do you know” enable a valid measure of this form of climate science comprehension.

Achieving this sort of decoupling was, it is important to reiterate, the overriding motivation behind construction of the OCSI measure. The OCSI measure is at best only a proto-assessment instrument. A fully satisfactory “climate science comprehension” instrument would need to be simultaneously broader—encompassing more knowledge domains—and more focused—more calibrated to one or another of the settings or roles in which such knowledge is useful.

But validly assessing climate-science comprehension in any setting will require disentangling knowledge and identity.  The construction of the OCSI instrument was thus in the nature of an experiment—the construction of a model of a real-world assessment instrument—aimed at testing whether it is possible to measure what people know about climate change without exciting the cultural meanings that force them to pick sides in a cultural status conflict.

References

IPCC. Climate Change 2013: The Physical Science Basis. Working Group I Contribution to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change (Cambridge University Press, Cambridge, England, 2013).

Kahan, D.M., Jenkins-Smith, H. & Braman, D. Cultural Cognition of Scientific Consensus. J. Risk Res. 14, 147-174 (2011).

Loewenstein, G.F., Weber, E.U., Hsee, C.K. & Welch, N. Risk as Feelings. Psychological Bulletin 127, 267-287 (2001).

Osterlind, S.J. Constructing Test Items: Multiple-Choice, Constructed-Response, Performance, and Other Formats (Kluwer Academic Publishers, Boston, 1998).

Reynolds, T.W., Bostrom, A., Read, D. & Morgan, M.G. Now What Do People Know About Global Climate Change? Survey Studies of Educated Laypeople. Risk Analysis 30, 1520-1538 (2010).

Slovic, P., Finucane, M.L., Peters, E. & MacGregor, D.G. Risk as Analysis and Risk as Feelings: Some Thoughts About Affect, Reason, Risk, and Rationality. Risk Analysis 24, 311-322 (2004).

Tobler, C., Visschers, V.H.M. & Siegrist, M. Addressing Climate Change: Determinants of Consumers' Willingness to Act and to Support Policy Measures. Journal of Environmental Psychology 32, 197-207 (2012).


Reader Comments (6)

The last figure doesn't make sense.
1) The title implies it would be comparing "Ordinary Climate Science Intelligence" to "Global Warming beliefs" but the chart seems to show only the latter.
2) The vertical dimension is labeled "No. of correct responses" -- that scale makes little sense when applied to beliefs.
3) The question asked is not stated - or at least not fully and explicitly.
4) The text seems to indicate the question related to something about warming in the last few decades and names three answers. Could the respondents answer yes to more than one?
5) For the warming that occurred in the 1980s and 90s the first two answers are both plausible. For this century the last is plausible.

January 31, 2015 | Unregistered CommenterCortlandt Wilson

@Cortland:

It's not well explained -- indeed, it is unexplained -- by the post.

The bars reflect the # of OCSI items correct for test-takers divided on the basis of their response to a previous item on climate change: those who believe there is no convincing evidence of global warming in recent decades; those who believe there is evidence of such warming and of its being caused "mainly" by human activity; and those who believe there is evidence of such warming but only of its being caused by "natural cycles."
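Schematically, it's just a group-mean computation, as in this toy sketch (the data and column names here are hypothetical, not from the study dataset):

```python
import pandas as pd

# Toy stand-in data; "belief" and "ocsi_correct" are illustrative names only.
df = pd.DataFrame({
    "belief": ["no warming", "natural cycles", "human-caused",
               "human-caused", "natural cycles", "no warming"],
    "ocsi_correct": [4, 5, 6, 7, 5, 3],  # correct responses out of 9 OCSI items
})

# One bar per belief group: mean number of correct OCSI responses.
print(df.groupby("belief")["ocsi_correct"].mean())
```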

The results are more completely described in the paper.

Also in this blog post, among others.

January 31, 2015 | Registered CommenterDan Kahan

The test is poorly constructed. It presumes "climate scientists" are monolithic in their beliefs. Clearly they are not. Some questions are leading. And one is nonsense.

A belief I'd question as a majority position is that hurricane frequency/intensity is not driven upward by human-caused global warming. For one thing, frequency and severity are two separate issues: some believe that frequency is down while severity is up. The reactions to hurricanes Rita, Katrina, and Sandy argue against a majority of scientists not believing the odds of storms like those are higher now due to AGW. Yet the "correct" answer is that "climate scientists" do not believe in any correlation.

A leading question is "what gas" causes temperature to rise. The correct answer is that there is no one gas, and the gas most responsible for so-called greenhouse warming (80%+) is water vapor, which somehow didn't make the cut among the answers to choose from.

A nonsense question is the North Pole Ice Cap. There is no such thing. There is Arctic sea ice, which does not raise sea level if it melts, and there is the Greenland Ice Sheet, which most certainly will raise sea level some 20 meters if it were to completely melt. "North Pole Ice Cap" is thus so ambiguous as to make the question unanswerable.

February 1, 2015 | Unregistered CommenterDavid Springer

@David:

The "which gas" questin is multiple choice-- so why not just look to see if others raise temeperature?

I don't quite understand your response on hurricanes; what do "reactions" to Rita & Katrina & Sandy "argue against"? Whose reactions? Anyway, you seem to agree that the answer is that climate scientists believe the evidence is currently inconclusive on whether hurricane activity has been influenced "in recent decades" by human climate change (IPCC says "no"; changes in impact expected in future).

Curious: Do you think there is disagreement among climate scientists on whether increased CO2 will reduce photosynthesis? Cause skin cancer?

The North Pole question has been the source of lots of adventure already.

One more thing: do you think that all scientists agree that the earth goes around the sun rather than vice versa? And do you think whether they do is of decisive importance in assessing whether that question (one I don't myself like much, but it's useful for illustration) is a "valid" one for inclusion in a standardized public "science literacy" assessment? I think you are likely thinking about the OCSI items in a way that I would regard as misunderstanding the theory of how a standardized aptitude test is supposed to work -- & if you don't like the theory, that would be a more interesting thing to discuss than whether the North Pole "really" has an "ice cap".

February 1, 2015 | Registered CommenterDan Kahan

Dan,

Thanks, but the question is not made clear by your references.
The "Climate-Science Communication and the Measurement Problem" paper (which I thought asked some very salient questions) doesn't list the "belief" questions in the appendix.
1) So I'm still not clear on what the "belief" question was.
2) The text for Fig 12 in the paper indicates the vertical axis shows percentage of correct answers on your OCSI (climate science knowledge) test. The text for the similar graph in the referenced blog post indicates the vertical axis shows correct answers on three different tests.
3) The text for Fig 3 in the paper uses an aggregated score of two questions. The first question was "there [is] solid evidence that the average temperature on earth has been getting warmer over the past few decades". Is that the same "belief" question reported in the last figure in the present post?

4) The "getting warmer over the past few decades" is the kind of question that grates on me when I try to answer survey questions. I think the best answer is "the question is poorly constructed". I interpret getting warmer as meeting "positive rate of change". There is strong evidence that it got warmer between 1980 and about 1999. After 2000 the rate of change has been effectively zero. Was this "forced choice" between two "wrong" answers done intentionally?

4 b) I note that the last OCSI item distinguishes between the decade of 2000-2009 vs. 1990-1999. That distinction would have made sense for the "getting warmer" question.

February 2, 2015 | Unregistered CommenterCortlandt Wilson

@Cortland--

Ah, I see.

Take a look at pp. 9-10 of the preprint & Fig. 5. The "belief in" question (which doesn't use the words "belief in," actually; see fn. 1 on that) is not part of the "climate science literacy" scale. (I am guessing that is what you referred to as "Figure 3" in your comment?) The point of constructing the scale was to *see* if the belief-in or acceptance question measures knowledge as opposed to something else, like the sort of cultural style or identity that is measured by how people answer questions about political outlooks.

For Figure 12, the y-axis is just # of correct responses out of 9 items on the OCSI. The bars are for 3 different groups of subjects -- those who said "no warming" in the last 3 decades, those who said "warming but natural causes," & those who said "warming mainly due to humans."

I understand the point of your objection to the "belief in" question. But I would say to you, as I alluded to in my answer to @David, that in theory it's okay for the questions to have those sorts of problems.

The assessments don't try to *grade* test-takers based on "getting the right answer"; they try to *measure* an unobserved disposition or trait or aptitude or inventory of knowledge by eliciting responses that *correlate* with possession of that disposition or trait etc. It's possible for questions that don't have right answers to do that; indeed, even questions where the *wrong* answer is treated as "correct" can do that. In theory, at least. All that matters is that one be able to show (a) that the responses to the items cohere in a manner consistent with their measuring a single thing & (b) that the single thing being measured is what one is trying to measure.

(A) is pretty simple.

(B) is harder -- requires external validation. The only real external validation I have here is that those subjects who score highest on the general science comprehension test (OSI), which has been validated pretty well as a measure of the knowledge & critical reasoning essential for science comprehension, also score highest on OCSI, which is what you'd expect if OCSI were measuring some subset of science comprehension relating to climate science.
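For concreteness, here's a hedged sketch of both checks on simulated data (the variable names and the data-generating model are hypothetical illustrations; the paper's own analyses are different). Check (a) is done here with Cronbach's alpha, check (b) with the correlation of OCSI totals against OSI scores:

```python
import numpy as np

def cronbach_alpha(items):
    """items: respondents x items matrix of 0/1 scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
ability = rng.normal(size=500)                     # latent comprehension trait
# 9 binary OCSI items, each driven by the shared latent trait plus noise
ocsi = (rng.normal(size=(500, 9)) + ability[:, None] > 0).astype(int)
osi = ability + rng.normal(scale=0.7, size=500)    # noisy general-science score

print(cronbach_alpha(ocsi))                        # (a) items cohere
print(np.corrcoef(ocsi.sum(axis=1), osi)[0, 1])    # (b) correlates with OSI
```

Both numbers come out high here only because the simulation builds in a single shared trait; with real responses, that is exactly what the checks are testing.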

February 2, 2015 | Registered CommenterDan Kahan
