Monday, October 27, 2014

Unconfounding knowledge from cultural identity--as big a challenge for measuring the climate-science literacy of middle schoolers as of grown-ups

A friend (of the best sort—one who has “got your back” to protect you from entropy’s diabolical plan to deprive you of the benefits of advances in collective knowledge) sent me a very interesting new study:

Stevenson, K. T., Peterson, M. N., Bondell, H. D., Moore, S. E., & Carrier, S. J. (2014). Overcoming skepticism with education: interacting influences of worldview and climate change knowledge on perceived climate change risk among adolescents. Climatic Change, 126(3-4), 293-304.

I very much like the SPBMC  paper.

One cool thing about it is that it tests the influence of cultural predispositions on the global-warming beliefs of middle schoolers. It’s not the only study that has adapted the cultural cognition worldview measures to students, but it’s one of only a few, and the only one I know of that applies the measures to kids this young.

Consistent with research involving adult subjects, SPBMC find that cultural outlooks—in particular “individualism”—predict skepticism about climate change.

SPBMC decided not to use (or at least not to report results involving) the hierarchy-egalitarianism worldview measure (maybe they figured some of the items weren’t suited for minors; I could understand that).

Instead they used a “social dominance” one and found that it didn’t predict anything relating to climate change attitudes—also interesting.

But of course the most important & interesting thing is what  SPBMC have to say about the relationship between climate-literacy & acceptance/belief in human-caused global warming, & the influence of cultural individualism on the same.

I found this part of the paper extremely valuable & informative.  I have a strong feeling that they have mined only a portion of the rich deposits of knowledge that their data contain.

Nevertheless, I found myself unconvinced (at least at this point) that the results they reported had the significance that they attached to them.

SPBMC present two principal findings. One is that acceptance of human-caused climate change in their student sample was associated with higher climate-science literacy. 

The other is that climate-science literacy had a bigger impact on kids who were relatively individualistic.  That is, as those kids display higher levels of climate science literacy, the change in the probability that they will believe in human-caused climate change increases even more than it does in kids who are relatively “communitarian” as their science-literacy levels increase.

SPBMC infer from these findings that “[c]limate literacy efforts designed for adolescents may represent a critical strategy to overcoming climate change related challenges, given stable or declining concern among adults that is driven in part by entrenched worldviews.”

For adults, worldviews are well entrenched and exert considerable influence over climate change risk perception. During the teenage years, however, worldviews are still forming, and this plasticity may explain why climate change knowledge overcomes skepticism among individualist adolescents . . . .

I myself strongly agree with SPBMC that climate-science education can make a big contribution to overcoming cultural polarization on climate change—although for reasons that I think differ from those of SPBMC.   But put that aside for a second.

The problem, in my view, is that the measure of climate-science literacy that SPBMC constructed fails to address what existing research teaches us is the biggest challenge in measuring public understanding of climate science.

That challenge is how to unconfound or disentangle genuine knowledge from the positions people take by virtue of their cultural identity.  An assessment instrument must  overcome this challenge in order to be a valid measure of climate-science literacy.

In general, people’s perceptions of risk reflect affective appraisals—positive or negative—of the putative risk source (nuclear power, guns, vaccines, etc.).

For most people most of the time, these feelings don’t reflect their comprehension of scientific data or the like. On the contrary, how people feel is more likely to shape their assessments of all manner of information, which they can be expected to conform to their pro- or con-attitude toward the putative risk source.

In this circumstance, survey items that elicit people’s understandings of the risks and benefits associated with some activity or state of affairs are best understood as simply indicators of the unobserved or latent affective orientation that people have toward that activity or state of affairs. That attitude is all they are genuinely measuring (Loewenstein et al. 2001; Slovic et al. 2004).

This is a huge issue for measuring climate-science literacy.

Sadly, propositions of fact on climate change—like whether it is happening & whether humans are causing it—have become entangled in antagonistic cultural meanings, transforming them into badges of membership & loyalty to affinity groups of immense significance in people’s everyday lives.

Study respondents can thus be expected to answer questions relating to climate change in a manner that reflects the pro- or con- affective stance that corresponds to their cultural identities. 

If they are the sort of persons who are culturally predisposed to believe in human-caused global warming (or “accept” it; let’s be sure to avoid the confused & confusing idea that there’s an important distinction between “believing” something & “accepting” or “knowing” it), they will affirm pretty much any proposition that to them sounds like the sort of thing one who “believes in” climate change would say.

As a result, they’ll incorrectly agree that human-caused global warming will increase the incidence of skin cancer, that industrial sulfur pollution is causing climate change, that water vapor traps more heat than any other greenhouse gas, etc.

Their “acceptance” of human-caused global warming, in other words, doesn’t reflect knowledge of the basic mechanisms that drive climate change or of the scientific evidence for how they work.

Rather, it expresses who they are.

Study after study after study after study has demonstrated this (Bostrom et al. 1994; Reynolds et al. 2010; Tobler, Visschers & Siegrist 2012; Guy et al. 2014).

To be valid, then, a climate-science literacy scale must successfully distinguish respondents whose correct answers reflect only their identity-based affective orientation toward global warming from those whose correct answers show genuine climate-science comprehension.

The only way to design such a scale is to include a sufficiently large number of appropriately weighted items for which the incorrect answers are likely to seem correct to someone who is culturally predisposed to believe in climate change but who lacks understanding of the scientific basis for that position.
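To make the point concrete, here is a minimal simulation sketch. Everything in it is an illustrative assumption (sample size, item counts, effect sizes, variable names); it is not anyone's actual data. A scale built only from items that a culturally predisposed "believer" tends to get right ends up tracking the predisposition rather than knowledge, while mixing in items whose wrong answers sound believer-friendly makes the score responsive to actual comprehension:

```python
# A minimal simulation of the identity/knowledge confound described above.
# Everything here is an illustrative assumption; it is not based on SPBMC's
# or anyone else's data.
import numpy as np

rng = np.random.default_rng(0)
n = 5000

identity = rng.normal(size=n)   # latent pro-belief cultural predisposition
knowledge = rng.normal(size=n)  # latent climate-science comprehension, independent of identity

def simulate_items(driver, k):
    """k true/false items whose probability of a 'correct' answer is a logistic
    function of the given latent trait."""
    p = 1.0 / (1.0 + np.exp(-driver[:, None]))
    return (rng.random((len(driver), k)) < p).astype(int)

congenial = simulate_items(identity, 10)    # items a predisposed "believer" gets right regardless of knowledge
diagnostic = simulate_items(knowledge, 10)  # items whose wrong answers *sound* believer-friendly; only knowledge helps

skewed = congenial.sum(axis=1)                              # scale built only from congenial items
balanced = np.hstack([congenial, diagnostic]).sum(axis=1)   # scale that mixes in diagnostic items

for name, scale in [("skewed", skewed), ("balanced", balanced)]:
    print("%-8s r(identity) = %.2f, r(knowledge) = %.2f" %
          (name, np.corrcoef(scale, identity)[0, 1], np.corrcoef(scale, knowledge)[0, 1]))
```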

Now here’s the most interesting thing: if one includes a mix of items that successfully distinguishes those who “accept” human-caused climate change based on their predispositions from those who genuinely get the mechanisms of climate change, then one will discover that those who don’t “accept” or “believe in” human-caused climate change know just as much about the mechanisms of climate change as those who say they do accept it.

For sure, most “skeptics” are painfully ignorant about climate change science.

But that’s true for most “believers” too!

Only a very small portion of the general public—consisting of individuals who score very high on a general science comprehension test—can consistently distinguish propositions that most expert climate scientists accept from propositions that sound like ones such experts might accept but that in fact are wholly out of keeping with the basic mechanisms and dynamics of global warming.

Yet even among these very climate-science literate members of the public, there is no consensus on whether global warming is occurring.  Just like their climate-science illiterate counterparts, their “beliefs” about human-caused global warming are predicted by their cultural identities (Kahan in press).

In sum, “acceptance” of or “belief in” human-caused global warming is not a valid indicator of climate-science comprehension in them either. It is an indicator of who one is, culturally speaking—nothing more and nothing less.

Judging from the results they reported in their paper, at least, SPBMC did not construct a climate-science literacy measure geared to avoiding the “identity-knowledge” confound.

In fact, they actually selected from a larger battery of items  (Tobler, Visschers & Siegrist 2012) a subset skewed toward ones that a test-taker who is culturally predisposed to “believe in” human-caused global warming could be expected to answer correctly regardless of how much or little that person actually knows about the mechanisms of climate change (e.g., “For the next few decades, the majority of climate scientists expect  a warmer climate to increase the melting of polar ice, which will lead to an overall rise of the sea level ”;  “... an increase in extreme events, such as droughts, floods, and storms”; “... a cooling down of the climate”; “The decade from 2000 to 2009 was warmer than any other decade since 1850.”). 

SPBMC left out of their battery items that Tobler et al. (2012) and other studies have found believers in climate change are highly likely to get wrong (e.g., “For the next few decades, the majority of climate scientists expect  an increasing amount of CO2 risks will cause more UV radiation and therefore a larger risk for skin cancer”; “Water vapor is a greenhouse gas”; “In a nuclear power plant, CO2 is emitted during the electricity production”; “On short-haul flights (e.g., within Europe) the average CO2 emission per person and kilometer is lower than on long-haul flights (e.g., Europe to America).”)

By my count, only 3 of the 17-19 items SPBMC identify as ones included in their scale (there is a discrepancy between the number they report using in the text and the number that appears in the online supplementary information, where the item wording appears) are ones that existing studies have shown were likely to elicit wrong answers from low climate-science-comprehending respondents who are nonetheless culturally predisposed to believe in climate change (“the ozone hole is the main cause of the greenhouse effect [true-false]”; “For the next few decades, the majority of climate scientists expect a precipitation increase in every region worldwide”; “Carbon dioxide (CO2) is harmful to plants”).

If one constructs a “climate science literacy” scale like this, it is bound to correlate with “acceptance” of global warming because the scale will itself be measuring the same cultural predisposition that inclines people to accept human-caused global warming.

Indeed, included in the SPBMC scale were true-false items that measured acceptance of human-caused climate change:

  • The increase of greenhouse gasses is mainly caused by human activities.
  • With a high probability, the increase of carbon dioxide (CO2) is the main cause of climate change.
  • Climate change is mainly caused by natural variations (such as changes in solar radiation and volcanic eruptions).

Obviously, if one is testing the hypothesis that acceptance/belief in human-caused global warming is caused by understanding of climate science, then the former must be defined independently of the latter. 

Because SPBMC put "acceptance" items in their climate literacy scale, their finding that global-warming acceptance is associated with climate-science literacy is circular.
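A toy simulation shows how mechanical this is (again, a sketch with arbitrary item counts): even when every non-acceptance item is answered completely at random, a scale that folds in the acceptance items is guaranteed to correlate with acceptance, while the same scale without them is not:

```python
# A toy illustration of the circularity: fold acceptance items into the
# "literacy" scale and it will correlate with acceptance even if every other
# item is pure noise. Item counts (3 + 14) loosely mirror the scale discussed
# above but are otherwise arbitrary.
import numpy as np

rng = np.random.default_rng(1)
n = 5000

accepts_agw = rng.integers(0, 2, size=n)                        # 0 = skeptic, 1 = accepts human-caused warming
acceptance_items = np.repeat(accepts_agw[:, None], 3, axis=1)   # answered purely on the basis of acceptance
noise_items = rng.integers(0, 2, size=(n, 14))                  # "knowledge" items answered completely at random

scale_with = np.hstack([noise_items, acceptance_items]).sum(axis=1)
scale_without = noise_items.sum(axis=1)

print("scale incl. acceptance items: r = %.2f" % np.corrcoef(scale_with, accepts_agw)[0, 1])
print("scale excl. acceptance items: r = %.2f" % np.corrcoef(scale_without, accepts_agw)[0, 1])
```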

The same problem, in my view, characterizes SPBMC’s finding on the relative impact of climate-science literacy on students who are relatively individualistic.

Again, SPBMC’s climate-science measure is itself measuring acceptance of human-caused climate change.

So for them to say (based on a correlational model) that “climate science literacy” has a bigger impact on individualists' willingness to “accept climate change” than it does on communitarians’ is equivalent (mathematically/logically) to saying: “Reducing climate-skepticism in cultural individualists who don't believe in climate change would have a bigger impact on their willingness to accept human-caused climate change than would reducing the skepticism of cultural communitarians who already believe in climate change. . . .”

Can’t argue with that—but only because it’s essentially a tautology.

The practical question has always been why individualists are so strongly predisposed to skepticism (and communitarians to belief—same thing).

There is already evidence that the cultural individualists who score highest on a valid climate-science literacy scale are not more likely than low-scoring cultural individualists to say they accept/believe in human-caused global warming.

Because it is unclear that SPBMC constructed a scale that measures knowledge & not just a pro-belief affective orientation—indeed, because their climate-science comprehension scale includes acceptance of human-caused climate change—their study doesn’t support any inference that greater climate-science comprehension would have such an effect in culturally individualistic middle schoolers.

As I mentioned, I do believe that improving climate-science education would make a very big contribution to dissipating political polarization on global warming.

The reason isn’t that understanding climate science in itself can be expected to induce people to say they “believe in” climate change. Again, what people say they believe about climate change isn’t a measure of what they know; it is a measure of who they are.

But precisely for that reason, learning to teach kids climate science will require teachers to learn how to dispel from the classroom the toxic affiliation between climate change positions and identities that now divides adults in the political realm.  When teachers learn how to do that—as I’m confident they will—then we can apply those lessons more broadly to the political domain so that there too we can use what we know rather than fight over whose side the state is going to take in a mean, illiberal status competition.

Indeed, that SPBMC performed a study like this in an educational context fills me with deep admiration.  This is the sort of research we desperately need more of, in my view.

And notwithstanding the critique I’m offering, I’m convinced there is a lot that can be learned from this paper. 

In particular, I really really hope SPBMC will report more of their data—including the psychometric properties of their climate-science literacy scale and summary data on how scores actually are distributed in their sample. 

They’d certainly be welcome to do so in this blog!

Still, as a scholar grappling with the central psychometric issues involved in measuring climate science literacy, I just don’t think the particular results SPBMC have reported support the conclusions that they purport to draw.

I’m sure they’d agree with me, too, that scholars investigating these issues are obliged to speak up when they see a study that they think hasn’t fully addressed them.  If scholars don't do this out of some misplaced sense of politeness (or any other sensibility, for that matter, that constrains open and candid scholarly exchange), then science communicators and educators who are relying on empirical work to make informed judgments will end up making serious and costly errors.

It should also go without saying that it is a mistake to think peer review happens only before a paper is published.  If anything, that’s precisely when meaningful peer review begins.

Refs

Bostrom, A., Morgan, M. G., Fischhoff, B., & Read, D. (1994). What Do People Know About Global Climate Change? 1. Mental Models. Risk Analysis, 14(6), 959-970. doi: 10.1111/j.1539-6924.1994.tb00065.x

Guy, S., Kashima, Y., Walker, I., & O'Neill, S. (2014). Investigating the effects of knowledge and ideology on climate change beliefs. European Journal of Social Psychology, 44(5), 421-429.

Kahan, D. M. (in press). Climate-science communication and the measurement problem. Advances in Political Psychology.

Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127, 267-287.

Reynolds, T. W., Bostrom, A., Read, D., & Morgan, M. G. (2010). Now What Do People Know About Global Climate Change? Survey Studies of Educated Laypeople. Risk Analysis, 30(10), 1520-1538. doi: 10.1111/j.1539-6924.2010.01448.x

Slovic, P., Finucane, M. L., Peters, E., & MacGregor, D. G. (2004). Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk, and rationality. Risk Analysis, 24, 311-322.

Stevenson, K. T., Peterson, M. N., Bondell, H. D., Moore, S. E., & Carrier, S. J. (2014). Overcoming skepticism with education: interacting influences of worldview and climate change knowledge on perceived climate change risk among adolescents. Climatic Change, 126(3-4), 293-304.  

Tobler, C., Visschers, V. H. M., & Siegrist, M. (2012). Addressing climate change: Determinants of consumers' willingness to act and to support policy measures. Journal of Environmental Psychology, 32(3), 197-207. doi: http://dx.doi.org/10.1016/j.jenvp.2012.02.001


Reader Comments (15)

==> "That is, as those kids display higher levels of climate science literacy, the change in the probability that they will believe in human-caused climate change increases even more than it does in kids who are relatively “communitarian” as their science-literacy levels increase."

I still don't get how you can write that type of comment about a cross-sectional study. You are describing a longitudinal effect.

I also think that the implications of this kind of study - for science education - are inherently limited by the cross-sectional structure.

October 27, 2014 | Unregistered CommenterJoshua

@Joshua--

yes I wrote that sentence just to bug you.

I meant to be describing a function: what happens to y as x changes

Disagree w/ you about what can be learned from a study of this design. Too categorical. As explained before, one can learn something when a well-framed hypothesis about how the world works implies a correlation; if the correlation isn't there, then one has less reason to believe the hypothesis.

October 27, 2014 | Unregistered Commenterdmk38

Hi Dan,

Thank you for your interest in our paper. We appreciate the space you have given to this research focused on kids, and believe addressing the K-12 demographic is critical. Your suspicions are correct – there is more to learn from our data. We are in the process of digging deeper and hope to have a follow-up manuscript out soon that answers several of your concerns. We do want to respond to your key point about the potential for conflating climate knowledge and ideology here.

Intuitively, children are different from adults in many respects, and that appears to extend to how they assess climate change risk. Although we did not report them in this paper, we did explore correlations between worldview and knowledge levels. Since we used a true/false scale, we forced students to choose an answer. If worldview and knowledge were linked, we would have expected individualist students to answer more questions incorrectly, particularly those that might be linked to ideology. However, we found no correlation between individualism and any of the three knowledge sub-scales (physical knowledge, knowledge of causes, and knowledge of impacts). An individualist student is no more or less likely to answer one of the knowledge scales correctly than a communitarian. Another take on your interpretation of our findings is that at low levels of knowledge, adolescents do seem to rely on their worldviews to decide what they think about climate change. We see the same polarization along individualist/communitarian lines as you find with adults. However, more knowledgeable students rely less on worldviews, suggesting that at this age, worldviews do not display as much of an influence over climate change perceptions as they do among adults. It then becomes less of an issue of “reducing climate-skepticism,” as you say, and more of one of building knowledge. As kids understand climate more, they seem to rely on their worldviews less.

We believe your research (and other related studies) suggests a need for new knowledge scales that avoid conflating knowledge and ideology among adults, but do not see that problem as being so large among children. That said, it would certainly be great to devise a metric for K-12 contexts that parallels what you are developing for adults to facilitate valid comparisons between studies focused on adults (where disentangling knowledge and ideology seems to be pretty important) versus children.

As you say, worldviews are an expression of who we are, and developmentally, middle school students have not fully determined who they will become.

Thank you again for your interest in the study, and we appreciate the opportunity for discussion!

SPBMC (Kathryn Stevenson, Nils Peterson, Howard Bondell, Susan Moore & Sarah Carrier)

October 28, 2014 | Unregistered CommenterKathryn Stevenson/SPBMC

Thanks, @Kathryn!

This is really very interesting information. And again, it is really great that you are examining these dynamics in students, given the challenge that educators are facing here.

I am pretty open to the hypothesis that the dynamics you are examining will work differently in adolescents from how they work in adults.

But I'm still not satisfied that the study results, as reported, support the very important & interesting conclusions you are offering.

Probably, I’m just greedy but I wonder if you & the rest of the @SPBMC team might want to do a guest blog & tell us more about the data?

Here are some examples of why that would be helpful.

A. How do we reconcile these propositions (the first two are in the paper & the 3rd is in your response above)?

1. Climate science literacy (CSL) is positively correlated with "acceptance" of human-caused global warming (AGW).

2. Individualism-communitarianism (I-C) is negatively correlated with AGW acceptance.

3. I-C & CSL are uncorrelated.

Imagine we plotted changes in probability of accepting AGW on the y-axis & standardized values of both CSL & I-C on the x-axis.

Given 1 & 2, the slope for CSL would be positive & that for I-C would be negative.

Insofar as they have the opposite signs, we'd observe that they are negatively correlated with one another-- contrary to (3).

I’m sure there is some explanation. Maybe it has something to do with the magnitude of the slopes or w/ the interaction you report.

But at least some of the explanations that I can piece together (including ones having to do w/ your reported interaction) fit the conjecture I offered in my post.

I'm sure many of the very imaginative 14 billion regular readers of this blog would have at least 28 billion ideas too!

But I don't think that anyone can be sure what to make of the data w/o seeing the zero-order correlations & more informative summary data (as you note, the paper doesn't contain those; it reports only a portion of a multivariate regression model).

B. Is CSL really measuring a single thing & if so what is it?

In your comment, you suggest there are multiple subscales; in the paper, the analyses involve only 1 scale, formed, I take it, from summing the correct responses to all the items.

So I’m curious whether the multiple subscales that you now refer to formed a single reliable scale—a matter that wasn’t addressed in the paper.

The subscales didn't form a reliable scale in Tobler et al., the source of your items. They were clear about that—it was one of the coolest findings in their study.

In my post, I identify 3 items that are best viewed as measures of *acceptance* of AGW. Tobler et al. included those items (along w/ others) in a subscale they labeled “climate change & causes.”

They also had a “physical knowledge” subscale, which contained items relating to properties of CO2 as “greenhouse gas,” the impact of CO2 on plants, the relationship between ozone layer & climate change, the relationship of fossil fuel burning to production of CO2, etc.

They reported that the “physical knowledge” subscale was only “weakly correlated” with the “climate change & causes” scale – the former explained only about 5% of the variance in the latter!

That is really strong evidence that acceptance of AGW and knowledge about the mechanisms of climate change are really different things.

I’m curious—what did you find about the relationship between the subscales in the middle schoolers?

Actually, I think it would be interesting to see the usual sorts of analyses of the item covariances that enable judgments about whether the items are properly viewed as measuring a single latent variable or multiple ones. And also data on whether the items all have the same relationship to the underlying factors for both individualists & communitarians--since that goes to the heart of the matter.
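For concreteness, here is a sketch of what such diagnostics look like, run on simulated placeholder responses since the item-level data aren't public; the `responses` array and the item count are hypothetical:

```python
# A sketch of the kinds of scale diagnostics being asked about: Cronbach's alpha
# for the internal consistency of a summed scale, and the eigenvalues of the
# inter-item correlation matrix as a rough check on dimensionality. The random
# placeholder data below have no structure, so the point is the computation,
# not the particular numbers.
import numpy as np

def cronbach_alpha(responses):
    """alpha = (k/(k-1)) * (1 - sum of item variances / variance of the total score)."""
    k = responses.shape[1]
    item_vars = responses.var(axis=0, ddof=1)
    total_var = responses.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def item_eigenvalues(responses):
    """Eigenvalues of the item correlation matrix, largest first; one dominant
    eigenvalue suggests a single latent factor, several comparable ones suggest more."""
    r = np.corrcoef(responses, rowvar=False)
    return np.sort(np.linalg.eigvalsh(r))[::-1]

rng = np.random.default_rng(2)
responses = rng.integers(0, 2, size=(500, 17))   # 500 respondents, 17 items (placeholder)

print("alpha =", round(cronbach_alpha(responses), 2))
print("top eigenvalues:", np.round(item_eigenvalues(responses)[:4], 2))
```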

These are the sorts of analyses that help to show that global warming and evolution items don't have the same relationship to science comprehension across cultural subgroups.

As you know, Tobler et al. performed their analyses on Swiss adults & were clear that there was no reason to assume that these sorts of items would have the same psychometric properties in samples from other societies—which makes sense given how much variance we see on climate change attitudes across nations.

So you can see why I’m eager for more than just a report on how all of the items, when added together, correlated with AGW.


C. CSL/acceptance-of-AGW circularity issue

It is not surprising that a scale that contains the 3 acceptance items I mentioned would correlate w/ acceptance of AGW. As I mentioned in the post, those items *are* measures of acceptance of AGW (Tobler et al. characterized them that way).

Because the $64K question is “how does knowledge of climate science relate to acceptance of AGW?,” it would be interesting to see if a valid scale that didn’t include those 3 items predicted AGW, as measured by those 3 items or by the items you used to measure AGW.

There are various other things (like what model exactly was used to generate Figure 2; Figure 1 uses linear regression, whereas Figure 2 reports probabilities of answering a particular question--so presumably it was a logit regression?), but really these are the main ones.

They all involve just knowing enough about the data – before it is put into a multivariate regression w/ path analysis and interaction terms etc. – to be able to see what’s going on & figure out what sort of modeling is appropriate.

You have such a great trove of data! If you wanted to put something together that addressed these & other questions & post it on this site or another, that would be great! (Obviously, too, it wouldn't have to be overnight!)

October 29, 2014 | Registered CommenterDan Kahan

"Insofar as they have the opposite signs, we'd observe that they are negatively correlated with one another-- contrary to (3)."

Correlation doesn't work that way.

Consider two random variables X and Y that are independent of one another. Calculate Z = X - Y. Then Z is positively correlated with X and negatively correlated with Y, but X and Y are uncorrelated.
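A quick numerical check of the counterexample (illustrative only; the variable names are generic):

```python
# Numerical illustration of the counterexample: X and Y independent, Z = X - Y.
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=100_000)
y = rng.normal(size=100_000)
z = x - y

print("corr(z, x) = %.2f" % np.corrcoef(z, x)[0, 1])   # positive (about +0.71)
print("corr(z, y) = %.2f" % np.corrcoef(z, y)[0, 1])   # negative (about -0.71)
print("corr(x, y) = %.2f" % np.corrcoef(x, y)[0, 1])   # essentially zero
```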

Not that I'm saying that's what's happening. It's just a possibility.

I would agree about the questions mixing climate science knowledge and climate change belief. Several of the questions on "climate change causes" are not about causes. Some of them are disputed or acknowledged to be unknown even in the mainstream. Several others are not normally disputed by sceptics. And it asks no questions on the areas of science that sceptics use as the basis of their arguments (e.g. feedbacks), and could form the basis of a scientifically informed decision.

I wasn't impressed by the quality of questions on the "hierarchy" scale either. "Some groups of people are simply inferior to other groups"? People who fail exams are just as good as people who pass them? Criminals are just as moral as policemen? People who work hard are just as good as people who are lazy and stupid? What sort of "groups"? "Inferior" in what sense? I'm guessing this is trying to determine whether people value 'equality of opportunity' or 'equality of outcome'.

It needs to be more specific, and in a context the children are familiar with. For example: do children think that some of the marks earned by clever, hard-working children who score highly on tests should be taken away and given to poorer students, who don't have so many marks and need them to lead a good and happy life? Should all children be given the same marks in tests? Will that lead to a better society? Should we strive to make test scores as equal as possible, or as high as possible, and should we do it by redistributing scores or by teaching?

October 29, 2014 | Unregistered CommenterNiV

@NiV:

Agree that this could be true. Thank you.

I would amend/emend my point: we can't know from models w/o seeing summary data (or raw data presented in an informative way) what inferences to draw. We need to see the summary data 1st to figure out what inferences are plausible; we can then test or extend that inference w/ a model.

Here the two predictor variables (climate literacy & individualism) interact; or Pr(accept AGW) = b1*CSL - b2*I-C + b3*CSL*I-C. (Assume I-C & CSL are centered at the mean.) If b1, b2 & b3 are all > 0, it is surprising to be told I-C and CSL are uncorrelated w/ each other (these are continuous variables, too; it would be easier to form sensible hypotheses about how they could be uncorrelated if they were binary--like race & gender, say). I can imagine various alternative stories about how this could be true. The model at that point is obscuring, not testing, what inference is supported by the data.
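For concreteness, a sketch of how that model would be specified, assuming a logit link and using hypothetical column names and random placeholder data in place of the actual SPBMC dataset:

```python
# A sketch of the model form being discussed: a logit of AGW acceptance on
# centered CSL, centered I-C, and their interaction, plus the zero-order
# correlation between the two predictors. The DataFrame and column names
# (accept_agw, csl, ic) are hypothetical, and the data below are random
# placeholders, so the fitted coefficients themselves are meaningless.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
df = pd.DataFrame({
    "accept_agw": rng.integers(0, 2, size=500),  # 0/1 acceptance of human-caused warming
    "csl": rng.normal(size=500),                 # climate-science literacy score
    "ic": rng.normal(size=500),                  # individualism-communitarianism score
})

df["csl_c"] = df["csl"] - df["csl"].mean()   # center so main effects are evaluated
df["ic_c"] = df["ic"] - df["ic"].mean()      # at the sample means

# "csl_c * ic_c" expands to csl_c + ic_c + csl_c:ic_c
model = smf.logit("accept_agw ~ csl_c * ic_c", data=df).fit(disp=0)
print(model.params)
print("zero-order corr(csl, ic) = %.2f" % df["csl"].corr(df["ic"]))
```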

Likely I should be clearer about my own attitude, too.

I think I would bet against the claim that SPBMC are making. But I'd be hedged: if I lost the bet, the gain in knowledge would more than compensate me for whatever amount I lost on the bet!

More specifically:

If after getting the sort of information I described one could see that individualism is uncorrelated with a valid climate science literacy scale constructed with SPBMC items that don't presuppose acceptance of AGW (my 3d point in comment) *and* that individualism is negatively correlated w/ acceptance of AGW, that would be consistent with the results for adults in Measurement Problem.

But to learn in addition that individualism predicts acceptance of AGW as scores on the climate science literacy scale *increase*--that would be pretty amazing.

It could be true. But before accepting an explanation for it, I'd like to be sure (or be assured) it is true!

October 29, 2014 | Registered CommenterDan Kahan

I think that the special case of middle schoolers also needs to be seen from their special position in adolescence. Babies may crave adult attention, and 2 1/2 year olds may happily try out the newly acquired word No!, but middle schoolers' responses to adult desires, or to perceived desired responses on test questions, can be very complicated.

Middle Schoolers don't necessarily approach the topic of climate science as fresh minds encountering this independently. Way back in 2007, I was involved with a case of a public school teacher, who was a creationist as well as climate science denier: http://pandasthumb.org/archives/2007/03/sixth-graders-d.html. There were several layers of outside influence here, obviously the teacher. But the teacher was linked to a large mega church in town. And parents of other children in the classroom had already broached the idea of showing Al Gore's film: "An Inconvenient Truth" in the classroom. The students themselves apparently displayed an interesting mixture of acceptance of parental/ and or teacher viewpoints and rebellion from that. Which I don't think actually is the same as whether or not they were individualistic or communitarian overall. One of the apparently favorite arguments presented by the teacher had to do with the supposed problems of using CO2 measurements from Mauna Loa, an active volcano. The idea that clever middle schoolers could see something that silly adult scientists had apparently failed to notice was quite appealing to them. Obviously not every middle school classroom climate science lesson erupts into a national controversy involving churches and national bloggers like PZ Meyers, Pandas Thumb, and Chris Mooney. But they are not operating as fresh minds in a vacuum. And whether they choose to go along with what parents or media or schools have to say or not is a complicated matter of acquiescence and rebellion on the way to forming true independence. And very complicated to measure by any sort of survey. We'd certainly need a survey that attempted to control for inconsistencies in answers placed there by mixed motivation and/or even attempting to be devious middle school survey takers.

October 30, 2014 | Unregistered CommenterGaythia Weis

@Gaythia--

Do you agree that there is huge risk in using the Al Gore movie or an equivalent as a teaching tool? It entangles rather than disentangles "who you are, whose side?" with "what do you/we know?"

In that regard, consider this... Do you have to be a Nobelist to see why proposing this would be a horrible idea? Actually, I'd think anyone who saw Flock of Dodos ought to recognize how bad an idea this would be.

October 30, 2014 | Registered CommenterDan Kahan

Maybe people who know more about climate change are more polarized - because they use the new information they learn to reinforce their preexisting identifications.

Maybe people who are more identified with an ideology know more about climate change - because their strong identifications motivate them to assimilate more information to reinforce their ideologies.

Maybe there is some mixture of the two.

W/o longitudinal study designs, how do you know which is the case?

October 30, 2014 | Unregistered CommenterJoshua

Dan -

==> "Do you have to be Nobelist to see why proposing this would be a horrible idea?"

What's "this" in that sentence?

October 30, 2014 | Unregistered CommenterJoshua

"What's "this" in that sentence?"

"using Al Gore movie or equivalent as teachng tool".

i.e. what half the audience will see as a transparent politically-partisan propaganda piece full of obvious errors, that will drive them even further away from paying any attention, and deepen the polarisation.

Because " It entangles rather than disentangles "who you are, whose side?" with "what do you/we know?""

October 30, 2014 | Unregistered CommenterNiV

I agree that the Al Gore movie would not be an appropriate teaching tool for a science classroom. I meant the entire example in my comment above to indicate how polarized and entangled climate change already was for many in the community, including middle schoolers. The adults on both sides were intent on teaching the controversy, not the science. And this was back in 2007. My point is that I think it would be very hard to devise a survey for middle schoolers that measured anything more than: "Do they feel, on that particular day, like going along with or disagreeing with their parents and/or teacher on something that they are already well aware is a charged issue?"

I'm not a parent of students at this school, but was involved via Colorado Citizens for Science. What we did was bring in a speaker from the National Center for Science Education, their then faith outreach person, Dr. Peter Hess, for a public forum. This was an attempt to reach the majority of those in the community presumed to be between the two extremes. This is also when I became more involved with bloggers such as PZ Meyers and Chris Mooney. Some of their outrage at a national level actually made it harder for the school district to do what they wanted to (and did) do, which was to usher the teacher out the door at the end of the semester. They had already moved the teacher down to middle school, from high school. Evolution was not part of the middle school curriculum and he was apparently instructed not to teach it. The teacher, in contrast, seemed to have devised this event as an attention getter. He had invited news media into his classroom and seemed to be using the controversy to promote both himself and an upcoming book.

October 30, 2014 | Unregistered CommenterGaythia Weis

@Joshua

This is this.

This would be a useful thing to view for anyone who thinks that is a good idea.

October 30, 2014 | Registered CommenterDan Kahan

I just read your research paper on cognitive bias in depth. I agree. I fit profile A as perfectly as you fit profile B. I got every question right, and have gone to the trouble in past years of downloading as much data as I could on tree ring studies, surface temperature, solar variations, and greenhouse mathematics. I have read the raging debates in detail. I can say with confidence that there is plausible variance in theories. Even as a skeptic, I answer quite differently to differently worded questions. "Is the planet warmer in the presence of higher CO2" would get a yes, while "is there strong evidence of a pending crisis in temperature rise" would get a no. You would do the same analysis and get a different response. But what you are not asking is why does profile A react the way it does? I would say that history is full of politically driven holocaust predictions. And it is rational to hold them with great skepticism. I would also say that politicized science tends to behave very badly. When lawsuits are filed, and scientists are marginalized for their beliefs, there becomes a second set of red lines crossed that very rationally will polarize people. I see bad research happening in medicine too, with lawsuits and marginalization of opponents. It just is not good science.

But when I pull back a greater distance, I drive an electric car, and am all for reducing emissions. But I am not for reducing the population or deindustrializing. I believe fossil fuels will be obsolete faster than we think, and the planet will do fine even with 2C warming. I am profile A. But profile B is just as biased, is obviously being ignored, and will likely not be remembered kindly because the ultra-wealthy world of 2100 will be living further from the equator, and thinking nothing of it. Energy harvesting will be 5X today's level, and food will be grown in higher densities on larger quantities of newly arable land. Profile B says this is nonsense. But such a conjecture is not provable either way. War, disease, and decreased birth rates are the true risks, not climate change.

November 6, 2014 | Unregistered Commenterbob goodwin

@Bob:

I wouldn't purport to explain any individual -- that's more like psychoanalysis than psychology. But I agree it can be useful to interrogate one's own constellation of views against the background of patterns like these.

November 6, 2014 | Registered CommenterDan Kahan
