
Tuesday, August 6, 2013

Homework assignment: what's the relationship between science literacy & persistent political conflict over decision-relevant science?

I've agreed to give a talk at the annual American Geophysical Union meeting in December. It will be part of a collection on "climate science literacy."

Here's the synopsis I submitted:

The value of civic science literacy

The persistence of public conflict over climate change is commonly understood to be evidence of the cost democracy bears as a result of the failure of citizens to recognize the best available decision-relevant science. This conclusion is true; what’s not is the usual understanding of cause and effect that accompanies this perspective. Ordinarily, the inability of citizens to comprehend decision-relevant science is identified as the source of persistent political conflict over climate change (along with myriad other issues that feature disputed facts that admit of scientific investigation). The truth, however, is that it is the persistence of public conflict that disables citizens from recognizing and making effective use of decision-relevant science. As a result, efforts to promote civic science literacy can’t be expected to dissipate such conflict. Instead, the root, cultural and psychological sources of such conflict must themselves be extinguished (with the use of tools and strategies themselves identified through valid scientific inquiry) so that our democracy can realize the value of educators' considerable skills in making citizens science literate. 

I have ideas along these lines -- ones that have figured in various papers I've written, informed various studies I've worked on, and appeared in one or another blog posts on this site.

But I haven't come close to working all this out.  

What's more, I worry (as always) that I could be completely wrong about everything.

So I welcome reflections by others on the basic claim expressed here -- reflections on how to convey it effectively; on what to do about the practical problem it reflects; but also on how to continue to probe and test to see whether it is true and to help identify any alternative account that's even better founded and that furnishes an even more useful guide to action.

So get going -- don't put this off until the day before the talk & pull an all-nighter!

 


Reader Comments (22)

I'd be interested in seeing the science that has been done, or in seeing new science done, that explores the question. The key claim appears to be that the arrow of causality does not point from scientific illiteracy to persistent public conflict, but from persistent conflict to scientific illiteracy.

Absent the resources (or time) to conduct actual experiments, one approach would be to try to look at historical data to see if you can find evidence that supports or disconfirms the hypothesis. I wonder if there are historical cases in which public conflict was objectively reduced, where you could look to see if there were indications of a change in the level of scientific literacy in the aftermath. Or vice versa: Cases where public conflict increased, and some form of measurement would reveal a subsequent change in scientific literacy.

A tricky aspect of this would be confounding variables. There's also the question of whether you could actually distinguish which change was occurring first and which was occurring second. Perhaps with enough cases, though, it would be possible to see a real signal that bears on the question?

I wonder what academic historians and sociologists would have to say about all this.

August 6, 2013 | Unregistered CommenterJohn Callender

@John:

The claim isn't that political conflict causes science illiteracy; it is that political conflict interferes with comprehension of decision-relevant science, including by those who are the most science literate. If that's so, then the remedy isn't "more science literacy"; it's to remove the conditions that inhibit citizens from using their understanding of science (including their ability to recognize valid science).

Many studies & related commentaries on this site. Easiest thing is for me to post them in the "follow up" field.

They aren't conclusive. It would be a horrendous form of misunderstanding -- a kind of science illiteracy of the profoundest sort! -- to think that they are, b/c of course any conclusions one draws from evidence are always to be combined with conclusions from other sources of evidence & also treated as subject to revision based on new evidence.

But they are my starting point -- & so others can say if they think those studies & interpretations reflect a wrong or incomplete view.

August 6, 2013 | Registered CommenterDan Kahan

"The key claim appears to be that the arrow of causality does not point from scientific illiteracy to persistent public conflict, but from persistent conflict to scientific illiteracy."

I'm not sure that it points either way. Scientific illiteracy is more-or-less uncorrelated with conflict about science.

The vast majority of people are comparatively scientifically illiterate. Very few people would be able to give more than the most basic explanation about how electricity works or why steel is strong or why angular momentum is conserved (or even what that means). Why is the sky blue? Why are there two tides a day when the moon only goes round once? Why do clouds stay up in the sky? Half the public thinks the sun goes round the Earth, and the other half would seriously struggle to explain how we know that it doesn't. I'd guess around 95% of the general public don't have a clue on science.

And yet, much larger proportions than 5% hold firm opinions on a range of scientific topics, including radiation, health and nutrition, energy, space flight, pesticides, agriculture, population, wildlife, genetics, and many more. Some happen to agree with the prevailing scientific opinion and some disagree with it, but even the ones who agree don't understand why, or what the evidence is. So for the 95%, it's not a question of being literate or not, because virtually everybody is effectively illiterate. Which side of any topic one falls depends on a multitude of other factors, like what newspaper you read.

But even for those who are scientifically literate, there can still be conflict. Again, climate change is the classic, but it's not the only subject. Ask the physicists about the 'Many Worlds' interpretation of quantum mechanics sometime, if you want an incomprehensible but extremely vehement diatribe on why people who do/don't believe in Many Worlds are idiots. In the past there have been similar arguments about black holes, the luminiferous aether, action at a distance, particles or waves, differentials and infinities and imaginary numbers and so on. Conflict is normal in science.

And I'd personally argue that it's healthy, too. A little bit of conflict keeps things lively. It makes people think. It shakes them out of their comfort zone. Catches common misunderstandings and subtle errors, creates new insights for explaining things.

Conflict can motivate people to learn about science, but that doesn't necessarily reduce the conflict. And people who know more science may be better at the conflict as a result. It probably goes both ways, in a complicated way mixed up with everything else, and probably with no single right answer or simple explanation. Like most things.

August 6, 2013 | Unregistered CommenterNiV

Some thoughts:
"What's more, I worry (as always) that I could be completely wrong about everything."

I was struck by that comment after first reading no acknowledgement of uncertainty in the following comment and the subsequent few sentences:

"This conclusion is true; what’s not is the usual understanding of cause and effect that accompanies this perspective."


One of the things that I think that "skeptics" get right (Judith Curry, in particular) is the argument that in order to communicate climate science effectively (and I would argue, to communicate any thesis in the face of controversy), uncertainty needs to be foregrounded. I don't buy their argument that foregrounding uncertainty will gain buy-in from any significant % of "skeptics" (because I think their perspective on climate change is pretty much hard-wired due to cultural cognition), and I don't think that it would substantively affect the problem you're addressing here, but I will say that when I read those first sentences I was inclined to find reasons to disagree, and when I read the follow-on admission of doubts, my orientation was immediately less oppositional.

But that's really just a comment on style, and not substance....

I am intrigued by John C's comment above. What historical precedent is there for a dissolution of politically-charged polarization that might serve as an example of how to address the polarization about climate change, nuclear energy, gun control, etc., where different camps line up behind different "experts" and dismiss those who disagree with them as being informationally, analytically, or morally inferior?

Have you looked at that question? What parallels might be found that might be instructive? More specifically - what might be instructive w/r/t science communication? Have there been, in the past, issues where a broadly-shared perspective among scientists was in conflict with the views of significant %'s of the public, and where those differences were clearly correlated with political orientation, and where eventually either one side or the other or both sides moved towards some more commonly-shared perspective?

It is interesting to me that I can't think of one example. Any science-related issue that I can think of where there was a strong shift in public opinion can be characterized by a parallel shift in the prevalent view among scientists (or experts). Am I missing some obvious examples?

August 6, 2013 | Unregistered CommenterJoshua

@ Joshua

1. It is conventional to offer a conclusion as a conclusion, a belief as a belief, in making an evidence-based policy prescription. It is understood that any claim supported by evidence is subject to being challenged -- based on strength or quality of evidence &, more importantly, new evidence. If someone doesn't make clear what the evidence is, then others can't assess the claim, & are well advised to ignore the person making it. By the same set of conventions, people offer qualified conclusions when they themselves have a current belief that is a weak or qualified one; one can have a strong current belief -- prior -- w/o being unwilling to update it, so it is unnecessary to convey that one is willing to update by understating the strength of a prior. If the convention were otherwise -- sound uncertain even when your priors are strong, too, lest people not think you are willing to consider new evidence -- information would be lost; we'd never know if the equivocator had a strong prior but was trying to convey the obvious (I update based on new evidence) or instead had a genuinely weak belief based on all the evidence.


2. Those are the conventions in a conversation among people engaged in scholarly & practical engagement with public policy. That's the conversation I'm in. I study the science of science communication; I'm not a public science communicator. But as someone who studies the science of science communication, I would say (based on evidence, subject to counterevidence, etc.) that it would be absurd to think that how I talk about my conclusions has an impact on public opinion on climate change -- and thus odd for me to "frame" things in ways that convey some sort of uncertainty about my views so as to persuade them (I'm not saying anything about whether climate change is happening, what causes it, etc. either -- so why would I think my argument is even relevant to a climate skeptic, as opposed to someone who'd like to learn about how science communication works?).

3. For an example of "change" in comprehension of decision-relevant science being triggered by change in cultural meaning, see cigarettes. I have added that to the "reference list" of blogs (or will momentarily; if I don't, call 911).

August 6, 2013 | Registered CommenterDan Kahan

@NiV:

Ask any physicist whose son is a better musician -- Hugh Everett's, Niels Bohr's, David Bohm's or Richard Feynman's? I predict you'll find scientific consensus (97% +/- 5% at 0.95 LC) on that.

August 7, 2013 | Registered CommenterDan Kahan

@Joshua:

Having slept (briefly) on it, I can see my point 2 is not responsive. You were arguing by analogy.

So substitute this:

"2. As for the 'framing' effect of certitude in this context: My goal isn't so much to persuade anyone as to furnish evidence relevant to those engaged in conversation w/ me (the conversation being mainly a professional one, but occasionally and rewardingly one w/ curious people outside of my profession). I can realize that aim only if I can (by legitimate means, of course!) get them to notice that I have some evidence that they ought to consider. So what's more likely to guide people to my talk at AGU rather than to the billion other things they could do instead -- a synopsis that conveys my position sharply and confidently or one that is filled w/ 'ummmm, well, maybe ... I'm not sure, but here's one possibility ...' etc? Tocqueville in intro to DOM says that it's necessary to exaggerate or no one will pay attention. I think he was exaggerating a bit -- to get our attention -- but he had a legitimate point."

August 7, 2013 | Registered CommenterDan Kahan

"Have there been, in the past, issues where broadly-shared perspective among scientists was in conflict with the views of significant %'s of the public, and where those differences were clearly correlated with political orientation, and where eventually either one side or the other or both sides moved towards some more commonly-shared perspective?"

Eugenics?

August 7, 2013 | Unregistered CommenterNiV

@NiV:

say more?

August 7, 2013 | Unregistered Commenterdmk38

Dan,

It seemed to me to be an example that fitted Joshua's criteria. A broad scientific consensus formed around a theory that suited some people's political agenda, and fitted with their views about mankind. The theory held sway for a number of years, policies based on it were half-heartedly implemented by governments, and those who disagreed were ridiculed as unscientific. Eventually the theory was discredited: first politically, when a particularly unpopular world leader graphically showed what a full-blooded implementation actually looked like, and once unsupported by the politics the science soon caught up with a new understanding of why the theory was actually wrong.

Truthfully, when I suggested it I hadn't been thinking of any parallels with modern times. It was just the first thing that popped into my mind when Joshua asked for examples, as an example of a scientific controversy mixed up in politics that was later resolved. In retrospect, I probably shouldn't have brought it up.

So I'll offer instead 'black holes'. I don't think there was a clear political divide, but apart from that, it kinda fits. There was a broad consensus in the astrophysics community that they couldn't possibly exist, with both Eddington and Einstein coming out against them. However, the general public quite liked the idea, thinking they were dramatic and cool. Eventually some decades later the old guard of scientists retired or died, a new younger generation grew up without their preconceptions, and the consensus changed again. Now everyone agrees.

The research did get mixed up a little with the politics of the cold war, partially because a lot of the physics of stellar collapse was the same as was used in atomic bomb research. The Russians were pursuing their research on them while the West was still hung up on trying to find reasons why the world shouldn't work like that. Establishment science tries to maintain the world as a sensible place. Revolutionary science likes to overturn common sense. It's a different cultural mindset.

Although Einstein was an odd case, since he started off as a revolutionary, but changed into a conservative later on on black holes and quantum mechanics.

It reminds me somewhat of the individualist/communitarian divide, in your scheme. Do you try to fit in with the scientific community, or do you strike out as a maverick individual? Do you think there might be an individualist/communitarian split on other scientific questions? :-)

August 7, 2013 | Unregistered CommenterNiV

NiV -

I don't think there was a clear political divide, but apart from that, it kinda fits.

Well, in terms of my focus of interest, political divide is a basic exclusion criterion.

As for eugenics - do you have some evidence to show where there was difference in prevalence of view among the scientific community as compared to the public more generally - as that, also, is a basic exclusion criterion?

August 7, 2013 | Unregistered CommenterJoshua

Dan -

Cigarettes is an interesting example. How is the change in public perception about cigarettes consistent with a theory of cultural cognition?

In order to be analogous, it would have to be true that initially significant %'s of the public rejected a prevalent view among "experts" that cigarettes are harmful- so good so far. (I think) It may be a safe assumption that a discrepancy in prevalence of opinion that cigarettes are harmful was correlated with political orientation (I have no idea if it actually was, but it does seem to be somewhat logical conjecture).

So it seems that a concerted PR campaign by that "expert" community is what eventually changed public perception. In that sense, it seems that the deficit theory would apply for the change in perception about cigarettes; by providing more information even though it was potentially politically polarizing, eventually the prevalent view among scientists prevailed. In that case, the cultural/social/political group identification did not hold sway (long-term) against a predominance of evidence presented by the scientific community. While those identifications may have affected the initial state of the "debate," they did not prove decisive long-term.

Now I tend to agree with you in broad strokes about the weakness of the deficit theory in the context of climate change (even if I disagree with you w/r/t some of the details) so I've got me a problem. So I've got some questions:

1) Am I misunderstanding something about the context with cigarettes that confuses my understanding of whether or not the deficit theory applies?

2) Is there no way to effectively generalize? Could the weakness of the deficit theory simply apply in one circumstance but not in another? Please note this is not quite the same as possibility #4.

3) Is the deficit theory actually applicable in both circumstances, and that it is simply too early to assess what the situation will be long-term with how the deficit theory applies w/r/t climate change?

4) Is there something fundamental about the circumstance with climate change that just means that it is more "wicked"? In other words, maybe there is just too much room with climate change for the complicated matrix of influences on risk perception to surface and overwhelm the power of overcoming a "deficit" - such as short-term weather phenomena or economic circumstances. This is different from #2, because #2 is suggesting that the whole dynamic is too complex for a pattern to emerge, whereas this possibility rests on identifying specific influences that make the deficit theory apply for cigarettes but not for climate change.

Other possibilities?

August 7, 2013 | Unregistered CommenterJoshua

@Joshua--

The story of cigarettes is so much more interesting than that! It is precisely b/c there is so much evidence that the account you offered -- that the change was a "PR campaign by experts" curing the "knowledge deficit" -- is false that I listed it as an example of how "changes in cultural meanings" come before recognition of decision-relevant science on a culturally contested matter. Check out the blog post I linked (and added to the list in the update) in response to your request.

I'll think about the rest (although the link does address the very issue you are asking about).

August 7, 2013 | Registered CommenterDan Kahan

Also, NiV - contrary to your description - a brief bit of research seems to indicate that (at least in the U.S.): (1) eugenics had fairly broad support among a political cross-section of the public (thus resulting in associated legislation in many states with diverse political demographics) and (2) the scientific community played a significant role in discrediting the scientific validity of eugenics theory. So weren't the factors leading to changing opinions more complex than what you outlined in the following description?:

...and once unsupported by the politics the science soon caught up with a new understanding of why the theory was actually wrong.

August 7, 2013 | Unregistered CommenterJoshua

Dan -
It declined after public health advocates initiated a vicious and viciously successful social meaning campaign that obliterated all the various positive cultural meanings associated with smoking (or most of them) and stigmatized cigarette use as "stupid," "weak," "inconsiderate," "repulsive," etc. At that point, people not only accepted the evidence in the SG's 1964 Report but started to accept all sorts of overblown claims about 2nd hand smoke etc. Yup -- it was all about "eventually accepting evidence"; nothing to do with social meanings there... (not). (I discuss the issue, and relevant sources including 2000 Surgeon General's Report on smoking & social norms, in an essay entitled The Cognitively Illiberal State.)

So, two questions about that excerpt:

The first is whether those conducting the PR campaign about the harm from cigarettes (government or scientific-establishment "experts"), and their tactics, might be analogous to those engaged in the PR campaign against "deniers," who use a similar tactic of associating "denialism" with socially repugnant characteristics.

The second is whether you're being too categorical about your disassociation between the impact of scientific evidence and the socially-oriented PR campaign. In other words, you seem to be arguing that the gap between when the report was published and when public opinion came around suggests that the impact of the scientific report was null - but maybe it just took a lot of time for the scientific evidence to grab hold (relatedly, I've seen the argument made by "realists" that over time, those who question the "consensus" view of climate change are, essentially, dying out - and that "skepticism" is much less prevalent among younger people. Perhaps the changes in public opinion about the "harm" of homosexuality would be a similar example there?). Further, related to item #1 - weren't the socially-oriented campaigns conducted by those affiliated with the scientific community? Wouldn't that suggest that the cleavage you're identifying wasn't nearly as categorical as you suggest?

...but started to accept all sorts of overblown claims about 2nd hand smoke etc.

Uh oh. I really don't want to get into that one - and I won't because it is way more than challenging enough just to try to deal with the questions at hand - but I just want to go on record as saying that I think that issue deserves a much, much more comprehensive treatment.

August 7, 2013 | Unregistered CommenterJoshua

"(1) eugenics had fairly broad support among a political cross-section of the public (thus resulting in associated legislation in many states with diverse political demographics) and (2) the scientific community played a significant role in discrediting the scientific validity of eugenics theory."

(1) Data?

(2) I didn't say otherwise.

August 8, 2013 | Unregistered CommenterNiV

Not much direct evidence, but how do laws in 32 states, and national laws like the Immigration Act of 1924 get enacted and pass SCOTUS muster w/o fairly broad political support from citizens of fairly diverse political demographics?

http://en.wikipedia.org/wiki/Eugenics_in_the_United_States

I have no background on the source below, but the author mentioned, Paul, does seem like someone who has studied the history of Eugenics extensively

Another common supposition is that eugenics was a right-wing movement. But as Paul and many others have pointed out, from the end of the nineteenth century to the middle of the twentieth, eugenics enjoyed huge popular support amongst all sections of society.

http://www.hgalert.org/topics/geneticDiscrimination/eugenics.htm

Paul's book:

http://www.amazon.com/The-Politics-Heredity-Biomedicine-Nature-Nurture/dp/0791438228

Do you have data that indicate otherwise?

August 8, 2013 | Unregistered CommenterJoshua

Also - another source that seems to support the view that eugenics had fairly broad support among a political cross-section of the public.

http://www.amazon.com/Popular-Eugenics-National-Efficiency-American/dp/0821416928

August 8, 2013 | Unregistered CommenterJoshua

@Joshua

The blog post refers/links to sources (including the 2000 Surgeon General's Report on public understanding of/reaction to smoking risks) that afford "more comprehensive treatment" to all the points you raise -- take a look at them & let me know if you find the evidence they present unsatisfying & why

August 8, 2013 | Unregistered Commenterdmk38

Dan,
When you say "cultural and psychological sources of such conflict must themselves be extinguished (with the use of tools and strategies themselves identified through ..."

that, itself, strikes me as a potential, cultural source of conflict.

There is an implicit value-system being advanced, which might be crudely stated as, "science is the best source of knowledge in all matters", or "one may not validly argue with scientific claims made by a majority of scientists".

One might take this goal as being a veiled threat to one's culture, if members of that culture were in some respect "counter cultural". What counter-cultural people know is the history of various abuses being conducted under a scientific banner. They also remember that many of the great scientists were ridiculed in their time.

Just a thought.

Vastly enjoying your research, however! Please continue!

October 22, 2013 | Unregistered CommenterMr. Lynch

@Mr.Lynch
It is fair to say that the premise of pretty much every normative claim being advanced across these posts is that science's way of knowing is superior to any alternative way of knowing -- although I think that science's way of knowing is confined to matters that admit of being known by empirical observation & valid causal inference. But that definitely doesn't reduce to "one may not validly argue with scientific claims made by a majority of scientists"; scientists do this all the time & it is a premise of science's way of knowing that any current position is subject to revision upon the presentation of additional evidence.
For sure this is a culturally partisan stance. I admit to being a science partisan.
But I think that this creates no real cultural conflict in our society; we are past the point of questioning science's way of knowing. People in the US really can't even conceive of what it would be like to treat one of science's rivals as superior to science's way of knowing.
Does this seem wrong to you?

October 22, 2013 | Unregistered Commenterdmk38

@dmk38: Let me start by thanking you for your gracious reply, and recognition of the cultural aspect of "partisanship in science", for lack of a better term.

You write: " People in the US really can't even conceive of what it would be like to treat one of science's rivals as superior to science's way of knowing."

I don't think this is quite true. One example is in healthcare: people are willing to be quite unscientific about what treatments they will consider for themselves, and know that the medical profession (including the drug and device industry) has occasionally messed up, big time. They also know that scientific claims of knowledge can be polluted by motivated reasoning and calculation (disclosure: I used to work in that industry, and was accused of this often).
I guess my main point is that it is reasonable -- even for the unschooled general public -- to be skeptical about scientific claims when the personal or societal stakes are high. I think that is human nature, and unlikely to be overcome by better scientific communication. As a socially concerned citizen, I'm not sure it is in society's best interest to have that skepticism overcome, either.

Best regards, etc.

October 22, 2013 | Unregistered CommenterMr. Lynch
