Should we care about the public's *climate science* literacy? What is "ordinary climate science intelligence" *for*?
A friend of mine posed this very appropriate, very basic, very important question/challenge to me after reading the Measurement Problem:
One thing i was left wondering after reading the paper was "should we care about people's OCSI ["Ordinary Climate Science Intelligence"] scores"? does "climate science comprehension" matter for anything meaningful? should we invest (any) resources in trying to increase people's science comprehension in this arena? and if so, WHY? does it, for example, correlate with people's ability to meaningfully/productively engage in the sorts of collective decision-making and planning processes happening in southeast florida? or do you simply see it as a valid end, not a means to anything on the "confronting climate change" front? i'm not sure what your take is on this, exactly, as it seems at some points that you do advocate communicators learn how to do exactly this (improve comprehension), perhaps by looking to science teachers who have figured out effective strategies in other contexts (evolution); but at other points (i think) you say that climate sci comprehension has nothing/little to do with the cultural polarization that seems to inhibit any large-scale collective response to the issue (which, as you say, should come as little surprise). or perhaps both of these statements are true to your thinking, but then it's less clear why (for what purpose) you advocate the former (coming back to the question of, "what is climate science comprehension FOR in a culturally polarized world?").
On Ordinary Climate Science Intelligence (OCSI) assessment scores: I think the scores are useful for exploring questions and testing hypotheses about why there is public conflict over climate change.
E.g., is the source of public conflict on this issue attributable to differences in comprehension of the rudimentary mechanisms of climate change?
Or perhaps to the public's "unfamiliarity" with the evidence that climate scientists have compiled on the causes and consequences of it?
OCSI can help to answer those questions: because it shows that members of groups polarized over the existence of climate change and the contribution humans are making to it have comparable understandings (and misunderstandings) about those matters, it gives us less reason to credit those explanations than we’d have otherwise.
Indeed, OCSI helps us to see, too, that survey items that assess public "acceptance" of human-caused climate change simply aren't measuring anything having to do with knowledge of climate science at all. Responses to those items have no correlation whatsoever with scores on a rudimentary climate literacy test; instead they cohere with, and behave exactly like, other observable indicators of respondents' cultural identities.
As I explain in The Measurement Problem, OCSI was meant essentially to be a model of an assessment test, one designed to examine whether it is possible, with appropriately worded items, to disentangle (unconfound) cultural identity & knowledge when measuring how much people know about climate change.
But if we want to measure people's understandings of climate science, then I am sure it would be possible to do better than OCSI!
Your question, I take it, is actually more basic: given what we can see from OCSI, why should we even want to measure how much people understand about climate science? Why should we care?
There's no answer, of course, that doesn’t presuppose some sort of normative goal.
The goal of most people who collect data on the issues we are interested in is simply to "move the needle" of public opinion on "belief in" human-caused climate change. For them, I guess the results of the paper suggest that they should "not care" whether anyone comprehends anything meaningful about climate change. Because what people "believe" about human-caused climate change turns out not to have anything to do with what they know.
Indeed, the paper shows that, if the goal is simply the instrumental one of generating public engagement with the issue of climate change, advocates should stop obsessively measuring and minutely analyzing the percentage of the public who say they believe in (accept) human-caused climate change.
Again, what people say about that measures only their cultural identities, who they are. Their responses to "do you believe in human-caused climate change" questions not only don't measure what they know; they don't even measure whether they are worried and concerned about climate change!
The consistently wrong answers most believers & skeptics give about the extent and nature of the dangers posed by climate change (e.g., that it will cause increases in skin cancer, or prevent plant photosynthesis) strongly suggest that believers and skeptics alike (in the general public at least) are very alarmed, as an emotional or affective matter, about the risks human-caused climate change poses.
What those who are trying to mobilize public opinion in this way should be trying to figure out is why their style of advocacy doesn’t tap into this reservoir of concern but instead reliably, predictably, inevitably triggers the identity-protective response that is reflected in the “No, I don’t, you asshole!” answer that 1/2 the US public gives when asked (over & over & over in polls that aren't advancing anyone's understanding of anything at this point) “do you believe in human-caused climate change?”
But there are a bunch of other goals one could have, and I'd say should have, besides the navel-gazing one of "needle moving." All of them support developing an even better instrument for assessing what ordinary people know about the science of climate change.
One is to help ordinary members of the public recognize information important to the decisions that they will make as citizens in self-governing communities the welfare of which will be affected by actions they take in relation to a changing climate.
Another would be to create communication materials that make it possible for the relatively small portion of the population that is genuinely curious about what we know about a changing climate to satisfy that interest.
Another would be to educate young people who might, if they are taught well and made excited by what they learn, become either climate scientists or adults who are genuinely curious about what we know and who, in any event, will become people who need to make decisions informed by the best evidence in their own private lives (as, say, property owners or business people; or farmers) or as citizens whose communities will be affected by climate change.
For all of those goals and related ones, there will be value in having not one "OCSI" but a variety of them, each suited to the goal at hand.
As I said in the paper, e.g., I think it is silly to measure whether citizens know that the North Pole ice cap melting won't cause flooding; it's enough for them to know that melting ice sheets are creating a risk for people as a result of climate change.
But if one's goal is to educate young people, "North Pole" might be a pretty good item -- which is to say, it might actually contribute to measurement of the latent comprehension capacity that educators should be trying to instill. So there is value in having OCSIs for these purposes that are actually tied to the sorts of knowledge and comprehension capacities that it makes sense for those transmitting scientific information to focus on in the context in which they are operating.
The reason it would be nice to have an OCSI for the "curious consumer of science," too, is so that those who are part of the (truly amazing!) profession that is committed to serving him or her can figure out whether their efforts are working as well as they want.
For all of these actors, there will be domain-specific “OCSIs” better than the experimental one featured in The Measurement Problem. But I hope that the experimental OCSI can help those developing these practical, real-world OCSIs to see that they can and must “disentangle” identity and knowledge in constructing them.
Well, those are my thoughts. What do you think?