Thursday, March 7, 2013

Marschack Lecture at UCLA on Friday March 8

Will file a "how it went" afterwards for those of you who won't be able to make it.

 

Monday, March 4, 2013

Informed civic engagement and the cognitive climate

Gave a talk today at an event sponsored by the Democracy Fund. Topic was how to promote high-quality democratic deliberations in 2016.

Pretty sure the guy who would have been ideal for the talk was Brendan Nyhan. Maybe he wasn't available. But I did the best I could, which included advising them to be sure to consult Nyhan's work on the risk of perverse effects from aggressive "fact checking."

Outline of my remarks below (delivered in 10 mins! Barely time for one sentence; of course, even w/ 120 mins, I still wouldn't use more than one sentence). Slides here.

1. Overview: Cognition & reasoned public engagement

Promoting reasoned public engagement with issues of consequence requires more than supplying information. The public’s assessment of information is governed by cognitive dynamics that are independent of information availability and content. Indeed, such dynamics can produce perverse effects: e.g., polarization in response to accurate information, or intensification of mistaken belief in the face of “fact checking” challenges. The anticipation of such effects, moreover, can create incentives for political campaigns to foster public engagement that isn’t connected to the best available evidence, or simply to ignore issues of tremendous consequence.

2.  2012: Two dynamics, two missing issues

a.  Climate change was largely ignored in the 2012 Presidential election b/c of “culturally toxic meanings.” When positions become symbols of group membership & loyalty, citizens resist engaging information that is hostile to their group, and draw negative inferences about the values and character of political candidates who present it. It is thus safer for candidates in a close election to steer clear of the issue than to try to persuade. This explains Obama's and Romney's decisions to avoid climate: raising it wouldn't have informed the public, and it would have exposed them to a much bigger risk of alienating voters they hoped they might otherwise appeal to.

b. Campaign finance, arguably the most important issue confronting the US, was ignored, too, not because of toxic meaning but because of “affective poverty.” Public opinion reflects widespread support for all manner of campaign finance regulation. But the issue is inert; it doesn’t generate the images, stories, and associations through which citizens apprehend matters of consequence for their lives. Thus, focusing on it would be a waste from the candidates’ point of view.

3.  2016: Managing the cognitive climate

a.  The influences that determine cognitive engagement can’t be ignored, but they also shouldn’t be treated as fixed or given. If a cognitive mechanism that frustrates engagement can be identified, responsive strategies can be formulated to try to counteract the operation of that mechanism. I’ll focus on the issues ignored in 2012 as examples, but the same orientation would be appropriate for any other issue.

b. Local political activity on adaptation is vibrant even in regions--e.g., Fla & Az--in which climate change mitigation is a taboo topic for political actors. Adaptation is free of the toxic meanings that surround the climate change debate and is indeed congenial to locally shared ones. Promoting constructive deliberations on adaptation has the potential to free the climate debate from meanings that block public engagement and scare politicians off. The cognitive climate would then be more hospitable for national engagement in 2016.

c. Between now & 2016, there is time to work on the affective enrichment of campaign finance. Just as public health activists did with cigarettes, so activists can create and appropriately seed public discourse with culturally targeted narratives that infuse campaign finance with motivating resonances. This would create an incentive for candidates to feature the issue rather than ignore it in their campaigns.

4.  Proviso: Cognitive climate management must be evidence based.

The number of plausible strategies for positively managing the cognitive climate will always exceed the number that will actually work. Imaginative conjecture alone won’t reliably extract the latter from the sea of the former. For that, it’s necessary to use evidence-based strategies. Activists confronted with practical objectives and possessed of local knowledge should collaborate with social scientists to formulate hypotheses about strategies for managing the cognitive climate, and to use observation and measurement for fine-tuning and assessing those strategies. And they should start now.

Sunday, March 3, 2013

How common is it to notice & worry about the influence of cultural cognition on what one knows? If one is worried, what should one do?

Via Irina Tcherednichenko, I encountered a joltingly self-reflective post by Lena Levin on G+:

Just yesterday, I successfully stopped myself from telling a person that their expressed belief has not a shred of evidence to support it (just in case: it wasn't a religious belief; it was something that could be demonstrated scientifically, but hasn't been). I stopped myself (pat on the head goes here) because, for one thing, I knew it would lead nowhere; and for another, I have my share of beliefs with a similar status of not being supported by scientific evidence (but not disproved by it either).

Just like anyone else beyond the age of five or ten, I have a worldview, my own particular blend of education, research, life experiences, internalized beliefs, etc. And by now, this worldview isn't easy to shake, let alone change. It doesn't mean that I disregard new scientific evidence, but it does mean that whenever I hear of new findings that seem to be in explicit contradiction with my worldview, I make a point of finding the source and reading it in some detail (going to a university library if need be). In 99 cases out of 100 (at least), it turns out that I don't have to change my worldview after all: sometimes the apparent contradiction results from BBC-style popularization with a healthy dose of exaggeration or downright mistakes on a slow news day; sometimes the original research arrives at some almost statistically insignificant result based on far too small a sample, prettified to make it publishable; or something else, or both.

But the dangerous thing is, if a reported finding does agree with my worldview, I usually don't go to such lengths to check the original source and the quality of research (with few exceptions). There is, of course, a certain degree of confirmation bias at work here, but my time on this earth is limited and I cannot spend it all in checking and re-checking what is already part of my worldview. What I do try to avoid in such cases is the very tempting assumption that now, finally, this particular belief is knowledge based on scientific evidence (unless I really checked it at least with the same rigor as described above). I am afraid I am not always successful in this... are you?

I thought others might enjoy reflecting/self-reflecting on this sort of self-reflection too.

Here are my questions (feel free to add & answer others):  

1.  What fraction of people are likely to be this self-reflective about how they know what they know?

2.  Would things be better if in fact it were more common for people to reflect on the relationship between who they are & what they know, on how this might lead them to error, and on how it might create conflict between people of different outlooks? If so, how might such reflection be promoted (say, through education, or forms of civic engagement)?

3.  Okay: what is the answer to the question that Levin is posing (I understand her to be asking not merely whether others who use her strategy think they are successful with it but also whether that strategy is likely to be effective in general & whether there are others that might work better)? What should a person who knows about this do to adjust to the likely tendency to engage in biased search (& assimilation) consistent w/ worldview?

Saturday, March 2, 2013

Check out Jen Briselli's cool pilot study of cultural cognition of vaccine risk perceptions

She called my attention to the study a few days ago & I'm just now getting a chance to think about it in a serious way. So far what's grabbing my attention the most is the scatterplot of "preferred arguments," although I definitely have a range of thoughts & questions that I plan to relay to Jen.  I'm sure she'd like to know what others think too.  Plus check out her site & learn about her really interesting general project.

Wednesday, February 27, 2013

Dear Seth Mnookin & other great science journalists

Dear Seth,

Fighting falsehood and selfishness with facts & public-spirit!

I've reflected a bit more on this (& this). I've pinpointed the source of my frustration: the conflation of the "anti-vaccine movement" with a "growing crisis of public confidence," a "growing wave of public resentment and fear," an "epidemic of fear," etc., that has pushed us to the "tipping point" at which "herd immunity breaks down"--or indeed, over it, "causing epidemics" of whooping cough & other diseases because of the "low vaccination rate."

The first is real, is a menace, and warrants being vividly identified and analyzed and also effectively repelled with fact and public spirit.

The second is a phantom. It also warrants being identified & analyzed. How do so many come to be so terrified of something that is genuinely terrifying but that doesn't truly exist?  Psychological dynamics are involved, certainly, but I suspect manipulative forms of self-promotion -- ones that reflect a betrayal of craft  -- are also at work.  

Whatever its cause, though, the propagation of the assertion that there is a "growing crisis of public confidence" in vaccines -- a claim frequently bundled with the empirically unsupported proposition that science is "losing authority" in our society -- deserves being opposed too.  Our science communication environment should not be polluted with misrepresentation.  Fear should not dilute the currency of reason in public discussion. The Liberal Republic of Science shouldn't tolerate partisan resort to "anti-science" red-scare tactics (on left or right).

The moral force of these  principles doesn't depend on proof of the bad consequences that disregarding them produces. But violating them does predictably generate  very bad consequences, including the disablement of our capacity to recognize and be guided by the best available scientific evidence in our personal and collective decisions. 

Be like Ralph! & Danny!

Ironically, our society, which possesses more science intelligence than any in history, lacks an organized science-communication intelligence. But many, in many sectors of society, recognize this deficit and are taking effective steps to remedy it.

Science journalists are, of course, playing the leading role in this effort. We have always relied on them to make what's known by science known to those whose quality of life science can enhance. They will necessarily play a key role if our society is to succeed in replacing the blundering, unreflective manner in which it now handles transmission of scientific knowledge with a set of scientifically informed practices and institutions consciously geared to performing this critical task.

So it would be ungrateful and ignorant to be angry at "the media" for being the medium of the "anti-vaccine = anti-science public" phantom. If we turn to science journalists for help in counteracting the propagation of this pernicious trope, it's not a call to "clean house." It's just a request that the thoughtful and public-spirited members of that profession keep doing exactly what we rely on them to do and what they have already been doing: modeling for the rest of us what contributing to the public good of maintaining a clean science communication environment looks like.

Your grateful admirer,

Dan

 

Tuesday, February 26, 2013

Six modest points about vaccine-risk communication

1. Public fears of vaccines are vulnerable to exaggeration as a result of various influences, emotional, psychological, social, and political.

2. Fears of public fear of vaccines are vulnerable to exaggeration, too, as a result of comparable influences.

3. High-profile information campaigns aimed at combating public fear of vaccines are likely to arouse some level of that very type of fear. As Cass Sunstein has observed in summarizing the empirical literature on this effect, “discussions of low-probability risks tend to heighten public concern, even if those discussions consist largely of reassurance.”

4.  Accordingly, an informed and properly motivated risk communicator would proceed deliberately and cautiously.  In particular, because efforts to quiet public fears about vaccines will predictably create some level of exactly that fear, such a communicator will not engage in a high-profile, sustained campaign to “reassure” the general public that vaccines are safe without reason to believe that there is a meaningful level of concern about vaccine risks in the public generally. 

5.  Not all risk communicators will be informed and properly motivated. Some communicators are likely to be uninformed, either of the facts about the level of public fear or of the general dynamics of public risk perception, including the potentially perverse effects of trying to “reassure” the public. Others will not be properly motivated: they will respond to incentives (e.g., to gain attention and approval; to profit from the fears of people who understandably worry that public vaccination rates will decline) to exaggerate the level of public fear of vaccines.

6.  Accordingly, it makes sense to be alert both to potential sources of misinformation about vaccine risk and to potential sources of misinformation about the level of public perceptions of the risk of vaccines.  Being alert, at a minimum, consists in insisting that those who are making significant contributions to public discussion are being strictly factual about both sorts of risks.

Monday, February 25, 2013

What is the evidence that an "anti-vaccination movement" is "causing" epidemics of childhood diseases in US? ("HFC! CYPHIMU?" Episode No. 2)

note: go ahead & read this but if you do you have to read this.

This is the second episode of “Hi, fellow citizen! Can you please help increase my understanding?”--or “HFC! CYPHIMU?”--a spinoff of CCP’s wildly popular feature, “WSMD? JA!” In “HFC! CYPHIMU?,” readers compete against one another, or collectively against our common enemy entropy, to answer a question or set of related questions relating to a risk or policy-relevant fact that admits of scientific inquiry. The questions might be ones that simply occur to me or ones that any of the 9 billion regular subscribers to this blog are curious about. The best answer, as determined by “Lil Hal,”™ a friendly, artificially intelligent robot being groomed for participation in the Loebner Prize competition, will win a “Citizen of the Liberal Republic of Science/I ♥ Popper!” t-shirt!

I'm simply perplexed here. What's the evidence to support the claim that public resistance to childhood vaccination is connected to an  increased incidence of any childhood disease? Where do I find it?

If one does a Google search, one can easily find scores of alarming news reports about a "growing" anti-vaccine "movement" and its responsibility for outbreaks of diseases such as whooping cough.

But it's really really hard to find a news story that presents the sort of evidence that a curious and reasonable person might be interested in seeing in support of this genuinely scary assertion.

Look, I'm 100% positive that there are vocal, ill-informed opponents of childhood vaccination. Seth Mnookin paints a vivid, disturbing picture of them in his great book The Panic Virus. These groups assert that childhood vaccinations cause autism, a thoroughly discredited claim that has been shown to have originated in flawed (likely fraudulent) research.

If the question is whether we should condemn such folks, the answer is clearly yes.

But if the question is whether we should conclude that "[t]he anti-vaccine movement [has] cause[d] the worst epidemic of whooping cough in 70 years," etc., then we need more than the spectacle of such know-nothings to answer it. For such a claim to be warranted, there must be empirical evidence of (a) declining childhood vaccination rates that are (b) tied to disease epidemics.

Actually, it's pretty easy to find evidence--outside of media reports on the anti-vaccine movement--that tends to suggest (a) is false. Consider this table from a recent (Sept. 2012) Centers for Disease Control and Prevention Morbidity and Mortality Weekly Report:


What it shows is DTaP vaccination rates for pertussis (whooping cough) holding steady at 95% for 3 or more doses and about 85% for 4 or more doses over the period 2007-2011.

For MMR (measles, mumps, rubella), the rate hovers around 92% for the entire period.

The rate of "children receiv[ing] no vaccinations" remains constant at about 0.7% (i.e., less than 1%). (In between these rows of data are rates for various other vaccinations -- like the one for Hepatitis B -- which all seem to show the same pattern. See for yourself.)

As for (b), it's also not too hard to find public health studies concluding that the outbreak in whooping cough was not caused by declining vaccination rates. One, published recently in the New England Journal of Medicine, found that the incidence of whooping cough was actually slightly higher among children who had received a full schedule of five DTaP shots than among those who hadn't, and that their immunity decreased every year after the fifth shot. That's not what you'd expect to see if the increased incidence of this illness was a consequence of nonvaccination.

"So what are the causes of today's high prevalence of pertussis?," asked a opinion commentary writer in NEJM.

 First, the timing of the initial resurgence of reported cases suggests that the main reason for it was actually increased awareness. What with the media attention on vaccine safety in the 1970s and 1980s, the studies of DTaP vaccine in the 1980s, and the efficacy trials of the 1990s comparing DTP vaccines with DTaP vaccines, literally hundreds of articles about pertussis were published. Although this information largely escaped physicians who care for adults, some pediatricians, public health officials, and the public became more aware of pertussis, and reporting therefore improved.

Moreover, during the past decade, polymerase-chain-reaction (PCR) assays have begun to be used for diagnosis, and a major contributor to the difference in the reported sizes of the 2005 and 2010 epidemics in California may well have been the more widespread use of PCR in 2010. Indeed, when serologic tests that require only a single serum sample and use methods with good specificity become more routinely available, we will see a substantial increase in the diagnosis of cases in adults.

In addition, of particular concern at present is the fact that DTaP vaccines [a newer vaccine introduced in the late 1990s] are less potent than DTP vaccines. Five studies done in the 1990s showed that DTP vaccines have greater efficacy than DTaP vaccines. Recent data from California also suggest waning of vaccine-induced immunity after the fifth dose of DTaP vaccine. Certainly the major epidemics in 2005, in 2010, and now in 2012 suggest that failure of the DTaP vaccine is a matter of serious concern.

Finally, we should consider the potential contribution of genetic changes in circulating strains of B. pertussis. It is clear that genetic changes have occurred over time in three B. pertussis antigens — pertussis toxin, pertactin, and fimbriae. . . .

Nothing about declining vaccination rates. Nothing.   

The writer concludes, very sensibly, that "better vaccines are something that industry, the Center for Biologics Evaluation and Research of the Food and Drug Administration, and pertussis experts should begin working on immediately."  

He also admonishes that "we should maintain some historical perspective on the renewed occurrences of epidemic pertussis and the fact that our current DTaP vaccines are not as good as the previous DTP vaccines: although some U.S. states have noted an incidence similar to that in the 1940s and 1950s, today's national incidence is about one twenty-third of what it was during an epidemic year in the 1930s."

I should point out too that in research I've done, I've just not found any evidence that a meaningful proportion of the general public views childhood vaccination as risky, or that there are any meaningful cultural divisions on this point.

Indeed, such vaccinations are one of the most commonly cited grounds members of the U.S. general public give for their (remarkably) high regard for scientists.

So ... what to make of this?  

Here are some questions:

1. Is there evidence I'm overlooking that suggests there really is a meaningful, measurable decline in vaccination rates in the U.S.? If so, please point it out, and I will certainly post it!

2. Is there evidence that nonvaccination (aside, say, from that in newly arrived immigrant groups) is genuinely responsible for any increase in any childhood disease? Ditto!

3. If not, why do the media keep making this claim? Why do so many people not ask to see some evidence?

4. If there isn't evidence for the sorts of reports I'm describing, is it constructive to make people believe that nonvaccination is playing a bigger role than it actually is in any outbreaks of childhood diseases? Might doing so actually reduce proper attention to the actual causes of such outbreaks, including ineffective vaccines? Might such reports stir up anxiety by inducing people to believe that more people are worried about vaccines than really are?

Can you please help increase my understanding, fellow citizens?


Sunday, February 24, 2013

What sort of vice is overstatement?

I don't really know, but I'm sure the sort of character deficiency that overstatement indicates is even more serious if someone who indulges in it doesn't recognize or acknowledge having done so, feel regret about it, thank the friends who pointed it out, and resolve to try to avoid recurrence.

My post on the "false & tedious defective brain meme" contained some regrettable elements of overstatement.

Before grappling with them, I want to start by extracting from the post the points that I do want to stand by and that I'm quite willing to defend in engaged discussion with others. They are essentially two: (a) that "defective rationality" accounts of polarization over policy-relevant science are ill-supported; and (b) that the frenetic and repetitive propagation of these accounts in wide-eyed, story-telling modes of presentation demeans serious public discussion and distracts thoughtful people from thoughtful engagement with this serious problem.

These are strong claims but I want to advance them strongly because I feel they are right and important, and because I believe that obliging people to confront them, to the extent that I can, will advance common understanding -- either by helping people to see why views they might hold should be abandoned or, if it turns out I'm wrong (I certainly accept that I might be), by fortifying the basis for confidence they can have in them once they've dealt with evidence that seems to suggest a very different explanation for the difficulty we face.

Here are the elements of the post that I now recognize to be in the nature of regrettable overstatement:

  • The singularity and certitude with which I advanced my alternative explanation. In fact, I feel the position I articulated--one I & others have been engaged in elaborating theoretically and testing empirically for a sustained period--is the best explanation for the phenomenon I mean to address, viz., conflict over societal risks and related facts. But there are other reasonable and plausible hypotheses (ones that are also much more subtle than the "our brains make us stupid!" trope); there are also many open questions, the investigation of which can furnish evidence that warrants revising the degree of confidence a reasonable person can have that the position I advanced, and not these others, is correct. It is disrespectful of other researchers and thoughtful appraisers of research to carry on as if this were not so. The cast of mind I displayed also demeans the enterprise of empirical inquiry by evincing the vulgar attitude that science is about reaching "final" and "conclusive" answers to difficult questions. Intrinsic to science's way of knowing is recognition of the permanent provisionality of what is known; expressing oneself in a manner that obscures or denies this not only risks misleading people but is ugly. One can have and communicate conviction in favor of, and can passionately advocate action based on, one's beliefs without concealing that what one believes is necessarily based on one's best understanding of the currently available evidence.
  • The thoughtless conflation of discrete and complex matters. I meant to be addressing something particular: polarization over risks and other policy-relevant facts in controversies like climate change, gun control, fiscal policy, etc. But I wrote in a manner that invited the interpretation that I was discussing something much more general. Motivated reasoning, biased assimilation, and the like are not confined to these matters; the dynamics involved in attitude polarization won't reduce to the single one I was giving. The careless generality with which I presented my views injected an air of grandiosity into them that is embarrassing as well as potentially misleading.
  • Reckless imprecision in criticism. I framed my argument as a criticism of science journalists. Often science journalists do, I think, deliberately frame findings of decision science that don't bear any such interpretation as evidence of defects in human rationality--and in particular, as inconvenient leftovers in the evolution of the "brain." I am quoted often in articles that squeeze themselves into this template even though I don't see "cultural cognition" that way at all, and have been careful to emphasize in my discussions with writers that polarization originating in cultural cognition reflects unusual, correctible conditions inimical to reason (by analogy: if you can't see after someone shines a bright light in your eyes, that doesn't mean your eye is "defective"; it means that the normally reliable faculty of sight is disabled by the flashing of intense bursts of light into your face). I should have noted, though, that many science journalists don't make this mistake. And even more important, many of those who do are only transmitting truly awful scholarship being performed by researchers (and scholarly synthesizers) who exploit the peculiar fascination with "brain"-centered explanations. It's really a huge injustice to express dissatisfaction with science journalists (whose craft skills should in fact be mined for insights into how to improve science communication in multiple sectors of our society) for this regrettable spectacle, which continues notwithstanding high-profile exposure of the defects in methods still routinely used in many such studies. (Oh--just to avoid compounding my problems: I don't mean to say that all neuroscience studies that feature fMRI use these bogus methods. Indeed, some of the coolest studies I've ever seen are based on fMRI used as a distinctively discerning measure in connection with inspired experimental designs. See, e.g., this.)
Friday, February 22, 2013

Is A. Gelman trying to provoke me, or is that just my narcissism speaking?


Friday, February 22, 2013

The false and tedious "defective brain" meme

I know expressing exasperation doesn't really accomplish much but:

Please stop the nonsense on our “defective brains.”

Frankly, I don’t know why journalists write, much less why newspapers and newsmagazines continue to publish, the same breathless, “OMG! Scientists have determined we’re stupid!!!” story over & over & over. 

Maybe it is because they assume readers are stupid and will find the same simplistic rendering of social psychology research entertaining over & over & over.

Or maybe the writers who keep recycling this comic book account of decision science can't grasp the grownup version of why people become culturally polarized on risk and related facts—although, honestly, it’s really not that complicated!

Look: the source of persistent controversy over risks and related facts of policy significance is our polluted science communication environment, not any defects in our rationality.

People need to (and do) accept as known by science much much much more than they could possibly understand through personal observation and study.  They do this by integrating themselves into social networks—groups of people linked by cultural affinity—that reliably orient their members toward collective knowledge of consequence to their personal and collective well-being.

The networks we rely on are numerous and diverse—because we live in a pluralistic society (as a result, in fact, of the same norms and institutions that make a liberal market society the political regime most congenial to the flourishing of scientific inquiry).  But ordinarily those networks converge on what’s collectively known; cultural affinity groups that failed to reliably steer their members toward the best available evidence on how to survive and live well would themselves die out.  

Polarization occurs only when risks or other facts that admit of scientific inquiry become entangled in antagonistic cultural meanings. In that situation, positions on these issues will come to be understood as markers of loyalty to opposing groups.  The psychic pressure to protect their standing in groups that confer immense material and emotional benefits on them will then motivate individuals to persist in beliefs that signify their group commitments.

They'll do that in part by dismissing as noncredible or otherwise rationalizing away evidence that threatens to drive a wedge between them and their peers. Indeed, the most scientifically literate and analytically adept members of these groups will do this with the greatest consistency and success.  

Once factual issues come to bear antagonistic cultural meanings, it is perfectly rational for an individual to use his or her intelligence this way: being "wrong" on the science of a societal risk like climate change or nuclear power won't affect the level of risk that person (or anyone else that person cares about) faces: nothing that person does as consumer, voter, public-discussion participant, etc., will be consequential enough to matter. Being on the wrong side of the issue within his or her cultural group, in contrast, could spell disaster for that person in everyday life.
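To put that cost-benefit logic in stylized form (the notation is mine, purely illustrative, not anything taken from the studies): let $p$ be the probability that any one person's behavior--voting, consuming, arguing--changes the collective outcome, $\Delta W$ the welfare that person has at stake in that outcome, and $C$ the personal cost of taking a position at odds with his or her cultural group. Forming the group-congruent belief is individually rational whenever

$$p \cdot \Delta W < C$$

and because $p$ is effectively zero for any single individual, the inequality holds no matter how large $\Delta W$ is--even though the aggregate consequences of everyone reasoning this way can be disastrous.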

So, in that unfortunate situation, the better our "brains" work, the more polarized we'll be. (BTW, what does it add to these boring, formulaic "boy, are humans dumb!" stories to say "scientists have discovered that our brains  are responsible for our inability to agree on facts!!"? Where else could cognition be occurring? Our feet?!)

The number of issues that have that character, though, is miniscule in comparison to the number that don’t. What side one is on regarding pasteurized milk, fluoridated water, high-power transmission lines, “mad cow disease,” use of microwave ovens, exposure to Freon gas from refrigerators, treatment of bacterial diseases with antibiotics, the inoculation of children against Hepatitis B, etc., etc., isn't viewed as a badge of group loyalty and commitment for the affinity groups most people belong to. Hence, there's no meaningful amount of cultural polarization on these issues--at least in the US (meaning these pathologies are local; in Europe there might be cultural dispute on some of these issues & not on some of the ones that divide people here).

The entanglement of facts that admit of scientific investigation—e.g., “carbon emissions are heating the planet”; “deep geologic isolation of nuclear wastes is safe”—with antagonistic meanings occurs by a mixture of influences, including strategic behavior, poor institutional design, and sheer misadventure. In no such case was the problem inevitable; indeed, in most, such entanglement could easily have been avoided.

These antagonistic meanings, then, are a kind of pollution in the science communication environment.  They disable the normal and normally reliable faculties of rational discernment by which ordinary individuals recognize what is collectively known.

One of the central missions of the science of science communication in a liberal democratic state is to protect the science communication environment from such contamination, and to develop means for detoxifying that environment when preventive or protective measures fail.

This is the account that is best supported by decision science. 

And if you can’t figure out how to make that into an interesting story, then you are falling short in relation to the craft norms of science journalism, the skilled practitioners of which continuously enrich human experience by figuring out how to make the wonder of what's known to science known by ordinary, intelligent, curious people.

Thursday, February 21, 2013

Local adaptation & field testing the science of science communication

from Making Climate-Science Communication Evidence-based—All the Way Down:

Consider this paradox. If one is trying to be elected to Congress in either Florida or Arizona, it is not a good idea to make “combating global climate change” the centerpiece of one’s campaign. Yet both of these states are hotbeds of local political activity focusing on climate adaptation. A bill passed by Florida’s Republican-controlled legislature in 2011 and signed into law by its tea-party Governor has initiated city- and county-level proceedings to formulate measures for protecting the state from the impact of projected sea-level rises, which are expected to be aggravated by the increased incidence of hurricanes.

Arizona is the site of similar initiatives. Overseen by that state’s conservative Governor (who once punched a reporter for asking her whether she believed in global warming), the Arizona proceedings are aimed at anticipating expected stresses on regional water supplies.

Climate science—of the highest quality, and supplied by expert governmental and academic sources—is playing a key role in the deliberations of both states. Florida officials, for example, have insisted that new nuclear power generation facilities being constructed offshore at Turkey Point be raised to a level higher than contemplated by the original design in order to reflect new sea-level rise and storm-activity projections associated with climate change. These officials’ projections rest on the same scientific models that Florida Senator Marco Rubio, now considered a likely 2016 presidential candidate, says he still finds insufficiently convincing to justify national regulation of carbon emissions.

The influences that trigger cultural cognition when climate change is addressed at the national level are much weaker at the local one. When they are considering adaptation, citizens engage the issue of climate change not as members of warring cultural factions but as property owners, resource consumers, insurance policy holders, and taxpayers—identities they all share. The people who are furnishing them with pertinent scientific evidence about the risks they face and how to abate them are not the national representatives of competing political brands but rather their municipal representatives, their neighbors, and even their local utility companies.

What’s more, the sorts of issues they are addressing—damage to property and infrastructure from flooding, reduced access to scarce water supplies, diminished farming yields as a result of drought—are matters they deal with all the time. They are the issues they have always dealt with as members of the regions in which they live; they have a natural shared vocabulary for thinking and talking about these issues, the use of which reinforces their sense of linked fate and reassures them they are working with others whose interests are aligned with theirs. Because they are, in effect, all on the same team, citizens at the local level are less likely to react to scientific evidence in the defensive, partisan way that sports fans react to contentious officiating calls.

Nevertheless, it would be a mistake to assume that local engagement with adaptation is impervious to polarizing forms of motivated reasoning. The antagonistic cultural meanings that have contaminated the national science communication environment could easily spill over into the local one as well. Something like this happened—or came close to it—in North Carolina, where the state legislature enacted a law that restricts use of anything but “historical data” on sea level in state planning. The provision got enacted because proponents of adaptation planning legislation there failed to do what those in the neighboring state of Virginia did in creating a rhetorical separation between the issue of local flood planning and “global climate change.” Polarizing forms of engagement have bogged down municipal planning in some parts of Florida—at the same time as progress is being made elsewhere in the state.

The issue of local adaptation, then, presents a unique but precarious opportunity to promote constructive public engagement with climate science. The prospects for success will turn on how science is communicated—by scientists addressing local officials and the public, certainly, but also by local officials addressing their constituents and by myriad civic entities (chambers of commerce, property owner associations, utility companies) addressing the individuals whom they serve. These climate-science communicators face myriad challenges that admit of informed, evidence-based guidance, and they are eager to get guidance of that kind. Making their needs the focus of field-based science-communication experiments would confer an immense benefit on them.

The social science researchers conducting such experiments would receive an immense benefit in return. Collaborating with these communicators to help them protect their science communication environment from degradation, and to effectively deliver consequential scientific information within it, would generate a wealth of knowledge on how to adapt insights from lab models to the real world.

There are lots of places to do science communication field experiments, of course, because there are lots of settings in which people are making decisions that should be informed by the best available climate science. There is no incompatibility between carrying out programs in support of adaptation-science communication simultaneously with ones focused on communicating science relevant to climate policymaking at the national level.

On the contrary, there are likely to be numerous synergies. For one thing, the knowledge that adaptation-focused field experimentation will likely generate about how to convert laboratory models to field-based strategies will be relevant to science communication in all domains. In addition, by widening the positive exposure to climate science, adaptation-focused communication is likely to create greater public receptivity to open-minded engagement with this science in all contexts in which it is relevant. Finally, by uniting on a local level all manner of groups and interests that currently occupy an adversarial relation on the climate change issue nationally, the experience of constructive public engagement with climate science at the local level has the potential to clear the air of the toxic meanings that have been poisoning climate discourse in our democracy for decades.

Tuesday, February 19, 2013

On science communication & the job of the scientist: a thoughtful response from a scientist

Below is an extremely thoughtful comment relating to my 2d post on my experience in giving a presentation to a group of public-spirited citizen scientists at the  North American Carbon Program a couple of weeks ago.

Just by way of context: I stressed that it is a mistake to think that the job of the scientist is to communicate as opposed to doing science -- not because scientists shouldn't communicate with the public (the ones who take on that added demand are heroes in my book) but because a democratic society that expects or relies on its scientists to bear the responsibility for making what's known to science known to citizens necessarily doesn't get the central tenets of the science of science communication: (1) that there is a distinction between "doing" and "communicating" valid science; and (2) that the latter demands its own science, its own professional training, and its own reliable implementing institutions and practices. Not getting (1) & (2) is the source of the persistent public conflict on climate science & risks squandering in general what is arguably our society's greatest asset -- the knowledge that science confers on how to secure collective health, safety, and prosperity.

But the one thing I am more confident is correct than this argument is that the surest means for remedying the deficit in our society's science-communication intelligence is through the process of conjecture and refutation that is the signature of science. Let's articulate as many experience-informed hypotheses as we can; and let's test them by doing and modeling them within our universities and within all the other settings in which science and science-informed policymaking are practiced.

So consider this inspired account of what's to be done. If it weren't an "n of 1," I myself would accept that it in itself refutes my claim that it's a mistake to conflate excellence in doing science with excellence in communicating it.

from Paul Shepson:

Dan - you said in your revised post, that "Their job is not to communicate their science to non-experts or members of the public." This did strike me as a weird thing to say. When I am doing science, I try to do it in a scientifically defensible way. When I am communicating to the public about science, I try to do it in a way in which they learn something, and hopefully laugh a few times. But what my job is, that's for me and my employer to negotiate, and hopefully, for me to be creative about. My job is to feel good about what I do, and at the same time hopefully help people, and get to eat. But, as I said in my email to you, it is indeed our responsibility to do exactly this (communicate to members of the public), as I said, especially when the scientific results have large social, ethical, economic, human and ecosystem health impacts. And, it is the case that Federal agencies, e.g. NSF, that fund the scientific community REQUIRE that we communicate our science outside of the scientific community.

For me, doing this is an integral part of who I am as a scientist. I have learned, from a variety of personal experiences, like marriage counseling, and communicating about climate change to Rotarians, etc., that it is very important to "get into the heads of" the members of the audience. But, until your presentation at the NACP meeting, I didn't fully have the jargon about, and the better informed ideas about, the importance and impact of cultural cognition. This has helped me a great deal, and I am sure it will in future presentations; I am already implementing changes (in my head) as a result of your blogs and your presentation. But I don't typically expect scientists to communicate, as you have said, the "validity of valid science". Scientists more often are communicating about the process of science, which can be far more interesting and entertaining, than trying to hammer home the idea that some set of climate science-related conclusions are valid. For me, a quantitative scientist, to discuss the "validity" of my work requires the use of error analysis, and thus, for a general audience, might require them to use stimulants of some sort. People sometimes use the word valid or validate when referring to one of the most important tools of science, the model. But, models are almost never valid, they are a representation and most often simply a test of our understanding of a natural system, such as the Earth. It is hard for me to imagine an Earth System model as ever being valid. But what is fun to tell people about is the process of finding things out, to use a Feynman-ian-like term, since you have referred to Feynman in your blog. People will listen to stories about how hypotheses are developed, e.g. about warming in the Arctic, and then about how you went there to test it, and observed a similar warming, and a similar loss of sea ice, but how that loss of sea ice is occurring faster than the models predicted, and then how that comparison led you to think harder about what is wrong with a model. Models aren't ever valid, they are wrong, and it is learning about the wrongness that leads to scientific progress. The finding things out, and the wrongness is the excitement of science. People love to hear stories about what an Inupiat Eskimo taught you about ice that you never learned from other scientists, and how that helped you rethink your model. Science is a process, not a bunch of end results that are either valid, or not. Ah, but enough ranting.

Regarding making my University bear its share of the burden, I can't really make my University do much of anything. I have tried! But, I can motivate myself to try to inspire young people about the process of science, and to tweak people's minds to think about things in a different way, and hopefully, in a positive, constructive way. So, when I asked you about taking a renewable energy engineer with me to the Rotary Club, I was suggesting that it might be effective for people who value individualism and a hierarchical world to see the unprecedented investment opportunities in renewable energy, which everyone on the planet will likely eventually need. It's a darn big market! And that pursuit of such investment opportunities might "symbolize human resourcefulness", in a way that is fully consistent with the values of the cultural group with which they identify. Shouldn't we try to take Warren Buffet with us to the Rotary Club? I think the climate science community should be communicating that everyone can win, and that includes the cultural groups with which they strongly identify, in the pursuit of the solutions to climate change.

While you might not think that I am, I will take the liberty of saying thank you for helping me to think more clearly.

 

Monday, February 18, 2013

The two-channel strategy/model for satisfying the public's appetite to know what is known by science

Below is a summary of my remarks (or what I can remember of them!) at the AAAS panel I participated in on Friday on Engaging Lay Publics in Museums on Provocative Societal Questions Related to Science. My slides are here. It is part 1 of a 2-part series; in the 2d part, I'll summarize the presentations of co-panelists Lucy Kirschner and Elizabeth Kunz Kollman on a truly astonishing exploratory field experiment that the Boston Museum of Science conducted in the form of an exhibit designed to promote reflection on the dynamics of public engagement with science relevant to controversial policy issues.

A two-channel strategy (model!) for enlarging satisfaction of the public appetite to know what’s known

1. There are two situations in which professional science communicators get into trouble. The first is when they rely entirely on their intuitions unfortified with evidence. The second is when they ask social scientists what to do based on the evidence and the social scientists actually purport to tell them.

The problem with the evidence-free approach is not that professional communicators don’t have any sound intuitions about what to do; it’s that they have too many of them. Their experience-informed insights are always plausible, but here, as elsewhere with complicated social matters, more things are plausible than are true. Hypothesis, observation, and measurement are needed to cull the latter from the former.

The problem with communicators relying on social scientists to tell them what to do is that the social scientists don’t have practical, experience-based insights into communication. They have models. The models, if they are well-designed, identify the mechanisms of consequence in particular communication settings. Those mechanisms are important for determining which of the communicators’ plausible intuitions are most likely to work. But the models that identified the mechanisms are not themselves communication materials. Communicators need to turn those models into materials that will reproduce those effects in the real world. Social scientists can’t do it for them: they don’t have evidence on that, and if they just try to guess what will work, they will say many implausible (also empty, self-contradictory) things because they lack local knowledge.

I certainly don’t have reliable intuitions on how to communicate science in a manner that satisfies the appetite of the public (or the appetite of that portion of it that has one) to enjoy the thrill and wonder of knowing what’s known. I am part of that public, and recognize with admiration and gratitude the special craft sense of those who feed the curiosity of me and others who share my interest. 

Those who have this special professional skill are intent all the same on improving their art.  I have through empirical study acquired knowledge of some of the mechanisms that shape public engagement with science.  Is what I know something that will help these communicators? Once they’ve heard what I said, they should tell me.

2.  The science of science communication can help communicators only through evidence-based experiments based on social scientist/practitioner collaboration. Based on what the social scientist knows about mechanisms, the communicator will be filled with ideas about how to fashion communication strategies that successfully reproduce the effects of the social scientists’ models in the world. So social scientists shouldn’t tell communicators what to do; communicators should tell social scientists what they think will work. Because here too the communicators will have more plausible intuitions than can be true, their proposals should be regarded as hypotheses. The social scientists can then help the communicators to structure their programs as experiments, ones that generate observations that can be measured and that support valid inferences about what does and doesn’t work. They can use that information. But they should also share it, so others can learn too.

3.  A two-channel strategy. The two-channel strategy is a model of communicating science. It tests a hypothesis about how mechanisms associated with science communication conflicts can be neutralized. The basic idea is that ordinary members of the public receive science information along two channels. One transmits content. The other transmits meaning: what is the significance, if any, for my standing in my cultural group associated with crediting or discrediting this information? Conflicts over climate change reflect a conflict between the signals being transmitted along the content channel and the meaning channel; many citizens “push back”—they don’t engage the communication attentively and with an open mind—because the information conveys meanings that threaten their cultural identity. The CCP experiment on “geoengineering and the science communication environment” is a model of how conscious regulation of the information on the meaning channel can improve engagement with content transmitted along the content channel.

4.  The two-channels model and satisfying the public appetite to know what’s known. Some professional science communicators—including science documentary producers and science museum directors—subscribe to what might be called the “missing science audience thesis” (MAT): that the number of people who enjoy their materials is smaller than the total who possess an appetite to know what’s known and who would find it satisfied (amply and exhilaratingly) by the work these communicators do. Could the two-channel model be of value in overcoming MAT?

The reason to surmise it might be is that the demographic characteristics of these communicators’ current audience suggest the underrepresentation of people who share the cultural style of those who react dismissively to climate science. These individuals—many of whom have hierarchical and individualistic worldviews—are not anti-science (no significant portion of the American public actually is): they are science literate and share in the prevailing positive view of scientists in American society; they have admiration for technological innovation, including nuclear power, nanotechnology, and geoengineering; and like everyone else, they favor making use of science in public policymaking—indeed, like their opponents in culturally factionalized debates over policy-relevant science, they believe (sometimes correctly, sometimes incorrectly) that the positions that predominate in their group are consistent with scientific consensus. The two-channel strategy suggests that communicators can tap into the latent receptivity of these citizens to the content of scientific information on climate change by combining that information with cultural meanings that are congenial rather than hostile to their worldviews.

Could MAT originate in an unintended conflict between the information being conveyed along the content and meaning channels? If so, what elements of the information being communicated generate the hostile meanings? How might those be modified to make the signal transmitted along the meaning channel more congenial without changing the one being conveyed along the content channel—since, indeed, the supposition is that the content of these communicators’ materials is exactly what would satisfy the appetite of these citizens to know what’s known?

The communicators at the Boston Museum of Science aren’t asking me those questions; they are showing me and others their own answers, which are the animating conjectures of practical field experiments conducted as part of their own work. They are also sharing with others in their extraordinary profession the valuable knowledge that their efforts have generated.

To me, the results bear all the signatures of the scientific advancement of knowledge.

And not surprisingly, given that these field experimenters are also expert communicators, their results inspire in me the same thrill and awe that I experience whenever I cross the bridge that their craft supplies between my curiosity and the wondrous discoveries of science.

Saturday, February 16, 2013

Is it plausible that higher cognitive reflection (system 2) increases polarization?

This is from correspondence with @Joshua, who says:

I"m having difficulty understanding [your claim that "in a polluted science communication environment, there will be the equivalent of a psychic incentive to form group-congruent beliefs. People who are higher in science comprehesnion will be even better at doing that."]

When you say "better at doing that," doesn't it mean, essentially, better at being polarized and hence, more polarized? If someone is driven to acquire more data by virtue of a system 2 orientation, and accordingly is better at filtering those increased data to confirm bias, doesn't that necessarily translate into being more polarized?

That doesn't quite fit with my non-empirical assessment of human nature. My guess is that scientific literacy probably has little effect on one's tendency towards polarization (not zero effect - I assume that "literacy" as a general characteristic on a macro-scale is associated with less antagonistic behavior) , but someone who is more unequivocal in their viewpoint is more likely to seek out information to confirm their bias (because their identity is more closely associated with that viewpoint and they have more to lose if they're wrong) - and even more so if they happen to have a system 2 orientation.

My response:

I think you've got it -- "it" being my claim: (1) that in an environment in which positions on risk or facts of policy-significance become suffused with identity-signifying meanings, there will be cultural polarization b/c of the pressure members of diverse communities experience to protect their standing in the group; and (2) that such polarization will be greater among individuals who are most disposed and able to engage in conscious, effortful information processing (system 2), because people who are better in general at making use of information to advance their interests will, in this polluted environment, use those abilities to attain a tighter fit between their beliefs and their identities (through motivated search for information, through closer scrutiny of messages that might contain meanings threatening to or affirming of group identity, & through formulation of innovative counterarguments).

You say you have trouble with this claim b/c it doesn’t fit your own observation & sense of human nature?

My guess would be that this position both fits many impressions most people have about how things work, and is at odds with many impressions they have formed that suggest something else could be going on. I certainly feel this way.

This is the situation we are in usually -- possessed of more plausible conjectures about what is going on than can really be (helpfully) true. That's why we should hypothesize, measure, observe, & report; it is why we shouldn't tell stories, that is, confidently present what is imaginative conjecture embroidered w/ bits of psychological research as "scientifically established" accounts that disguise uncertainty and stifle continued investigation.

So I don't offer my account as any sort of "conclusively proven!" show stopper. I offer it as my hypothesis.

And I offer both the "science comprehension & polarization" study and the "cognitive reflection, motivated reasoning, and ideology" experiment as evidence that I think gives us reason to treat this hypothesis as more likely true (or closer to useful truth) than alternatives. Then I wait for others to produce more evidence that we can use to adjust further. But if I have to act in the meantime, I do what seems sensible based on my best current understanding of what's true.

So I am content if people start with the idea, "this expressive rationality thesis (ERT) you keep talking about: sure, it's plausible, but what's the evidence that that rather than [9 other plausible conjectures] is the source of the problem?"

If someone says, "ERT is not plausible," I'm puzzled; most of us have enough common material in our registers of casual observation to be able to recognize how people could believe one or another of the things that any one of us finds plausible.

But if that person finds ERT implausible, I will simply say to her, "well, still consider my evidence, please. I imagine after you do you will still not be convinced ERT is the source of disputes over climate change & nuclear power & the like, since you are starting w/ prior odds so long against this being so. But my hope is that you'll conclude that the evidence I have collected is sound and supplies a likelihood ratio > 1 in support of ERT, and that you will then at least have posterior odds that are less long against it."
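
To make the arithmetic behind that invitation concrete, here is a minimal sketch in Python (every number is invented for illustration) of updating in odds form, where posterior odds equal prior odds multiplied by the likelihood ratio:

    # Bayesian updating in odds form: posterior odds = prior odds
    # times the likelihood ratio. All numbers are hypothetical.

    def update_odds(prior_odds, likelihood_ratio):
        """One round of Bayesian updating expressed as odds."""
        return prior_odds * likelihood_ratio

    # A skeptic starts at 1:20 against ERT; a proponent at 4:1 in favor.
    skeptic, proponent = 1 / 20, 4.0

    # Suppose both agree each new study supplies a likelihood ratio of 2
    # in support of ERT (the evidence is twice as likely if ERT is true).
    for study in range(1, 6):
        skeptic = update_odds(skeptic, 2.0)
        proponent = update_odds(proponent, 2.0)
        print(f"study {study}: skeptic {skeptic:.2f}:1, proponent {proponent:.0f}:1")

After five such studies the skeptic's posterior odds are 1.6:1, no longer long against ERT, while the proponent's are 128:1. The ratio between their odds never changes (each multiplies by the same factor), but both are being pushed toward certainty, which is the sense in which iterated updating on commonly valued evidence produces convergence.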

If the person then accepts the invitation, considers the evidence open-mindedly, and gives it the weight that it is due under appropriate criteria for judging the validity of empirical proof, that will make me happy, too.

As long as we both keep iterating & updating, we'll converge eventually. 

Thursday
Feb142013

Terrorism, climate change, and surprise

In one of the enlightening "drunkard's walks" that the internet enables, I bumped into this fascinating blog post at the site Grow this City. Shouldn't one show one's gratitude for the gratuitous conferral of this sort of benefit by making an effort to enable others to enjoy it too?  So I repost; and then offer a conversational response.

At a recent meeting of a class on climate change policy, my professor led a discussion on the psychology of climate change and why it is so difficult to motivate people to act on the dire warnings published by climate scientists.

The basis of our discussion was a set of three articles published by psychologists on the topic. Two were by Elke U. Weber: “Public Understanding of Climate Change in the United States” and “Why Global Warming Does Not Scare Us (Yet)”. A third was by Dan M. Kahan titled “The Tragedy of the Risk-Perception Commons: Culture Conflict, Rationality Conflict, and Climate Change”. A couple lines from the abstract of one of Weber’s articles sum up the conclusion that both she and Kahan reach:

“When people fail to be alarmed about a risk or hazard, they do not take precautions… The time-delayed, abstract, and often statistical nature of the risks of global warming does not evoke strong visceral reactions.”

Basically, people do not take action to prevent or prepare for climate change because climate change is not scary enough.

Reading those findings got me thinking – is there a phenomenon similar to climate change that does scare people?

Eureka! There is such a thing! It’s called Terrorism. And, unlike climate change, it scares the shit out of people.

The analogy between climate change and Terrorism holds up for these three reasons:

1. They are diffuse in their causes and in their harms.

2. Preventing them requires large-scale social coercion and massive diversions of resources.

3. They cannot be prevented with total certainty even if we employ all the coercion and resources we can muster.

I brought this idea up in class and might as well have detonated a flash-bang grenade. My peers were shell-shocked. Their ethical circuitry shorted out. A business major blurted, “Terrorism isn’t like climate change. It’s a big danger that we have to fight to defend our country.”

To this I said, “The chances of being injured or killed in an act of terror are very low. You have a better chance of being struck by lightning.”

The business major countered, “Look at Oklahoma City, the World Trade Center, the Shoe Bomber. Terrorism happens all the time.”

I then suggested that it may be the case that the US government has responded more decisively and with more resources to the threat of terrorism than to the threat of climate change because the United States is a fossil fuel-based regime. The reason that there was such a thorough (and effective) propaganda campaign to justify the “War on Terror” was that it generated support for the invasion and decade-long wars in Iraq and Afghanistan. Those wars, I said, secured Middle Eastern oil for the United States, strengthening its fossil fuel-based regime. On the other hand, preventing climate change is not as strategically important to the USA, so our government has devoted more resources to fighting Terrorism than to addressing the problem of climate change.

My classmates went pale. My professor stayed silent. And the business major came at me again.

“The wars in Iraq and Afghanistan were about terrorism. They had nothing to do with oil. They made us more safe from terrorism.”

I said, “Come on, the idea that we invaded those countries because of oil is not a crazy one. It’s obvious.”

But my classmates looked at me like I was insane, like I had jumped on the big oval table in the middle of the room and defecated before them.

But the normally quiet girl to my right spoke up. “It might also have something to do with class. 9/11 blew up a skyscraper in Manhattan. Climate change hurts poor people first.”

But my professor, who has a JD from Stanford and an aversion to talking about class or speaking ill of the US government, intervened. He changed the subject, and ‘terrorism’ didn’t enter into the same sentence as ‘climate change’ from then on.

Bonus fact: the Iraq War has been more expensive than the anticipated cost of the Kyoto Protocol to the US.

1. This is a really compelling & cool anecdote that powerfully illustrates how intriguingly & oddly selective perceptions of risk are. Obviously, an element of the phenomenon is how unaware people (we!) normally are of how oddly selective our perceptions are — they just seem so given, so obvious, that we don’t notice. The failure of people (like your classmates, but also everyone else, including you and me at one time or another) to “get” how oddly selective risk perceptions are — to react in fact w/ incomprehension mixed with irritation — when this is pointed out is obviously bound up with whatever it is in us that makes us form such strange schedules of risk perception in the first place.

Two other cool things in the story: at least for a curious person, the surprise of discovering instances of the odd selectivity & the realization that they beg for explanation are pleasurable; and for the curious person the disappointment of finding out that other people actually resist being made to confront the puzzle is offset by what that teaches her about the shape of the pieces she needs to solve the puzzle.

2. The thesis — we overestimate terrorism risks relative to climate change ones because of the vivid and immediate character of the former and the less emotionally sensational, more remote character of the latter — is very plausible, because it's rooted, as you point out, in real dynamics of risk perception. For a wonderful essay that elaborates on this hypothesis (without presenting it as a hypothesis, unfortunately; conjecture is beautiful, and supplies the motivation for investigation, unless it is disguised as a “scientific, empirical fact,” in which case it risks stifling scientific, empirical engagement; you aren’t doing that, btw!), see Sunstein, C.R. On the Divergent American Reactions to Terrorism and Climate Change. Columbia Law Rev 107, 503-557 (2007).

3. I want to reciprocate the friendly gesture reflected in your sharing this genuinely engaging and thoughtful insight (and the infectious nature of the excitement of your discovery of it) by suggesting that I think that explanation is not quite right!

The paper of mine that you cite — “The Tragedy of the Risk-Perception Commons,” a working paper version of Kahan, D.M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L.L., Braman, D. & Mandel, G., The polarizing impact of science literacy and numeracy on perceived climate change risks, Nature Climate Change 2, 732-735 (2012) — is actually meant to pit that hypothesis against a rival one.

You surmise — again, quite plausibly, in light of mechanisms of cognition that we know are very important for risk perception — that the public's relative ranking of terrorism and climate change risks is a consequence of the tendency of people to process information about risk heuristically, intuitively, emotionally (Kahneman’s “fast” system 1), as opposed to consciously, deliberately, analytically (“slow” system 2).

Our study presents evidence, though, that the disposition to think consciously, deliberately, analytically (to use system 2) doesn’t uniformly predict more concern about climate change. In fact, it predicts greater cultural polarization over climate change risks and a whole bunch of other ones too! We treat this as evidence that public conflict or confusion over climate change risks is a consequence of “cultural cognition,” a dynamic that unconsciously motivates people to attend selectively to information about risk in patterns that reinforce their commitment to opposing groups. Those who see climate change as higher in risk actually see terrorism risks as less of a concern for society. (Take a look, e.g., at the group variation reflected in this chaotic graphic.) The effect only gets stronger as people's ability to engage in reflective, dispassionate analytical reasoning increases.
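
For readers who want to see what that pattern looks like in stylized form, here is a toy simulation in Python. It is my illustration, not the study's data; every parameter is invented. The point is just that if reasoning proficiency is spent fitting beliefs to group identity, the gap between groups grows with proficiency instead of shrinking:

    # Toy simulation (invented parameters, not the study's data): if
    # proficient reasoners use their ability to fit beliefs to identity,
    # polarization increases with reasoning proficiency.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000

    group = rng.choice([-1, 1], size=n)   # two opposing cultural groups
    skill = rng.uniform(0, 1, size=n)     # reasoning proficiency score

    # Hypothetical model: perceived risk = weak common signal, plus an
    # identity-congruent component that scales with proficiency, plus noise.
    risk = 0.2 + 0.8 * group * skill + rng.normal(0, 0.5, size=n)

    for label, mask in [("low skill", skill < 0.33), ("high skill", skill > 0.67)]:
        gap = risk[mask & (group == 1)].mean() - risk[mask & (group == -1)].mean()
        print(f"{label}: between-group gap = {gap:.2f}")

In this toy setup the between-group gap comes out around 0.3 in the bottom proficiency tercile and around 1.3 in the top one: more proficiency, more polarization, the opposite of what a simple "public irrationality" account would predict.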

4. As I said, this observation is meant to reciprocate the spirit of your post. My aim is not to “set you straight,” but to deepen if I can your sense of wonder over things that are, as you recognize, filled with surprise!

If you in turn surprise me back by showing me that my solution to this tiny patch of the puzzle is also incomplete — I will be shocked (but not surprised again to find myself surprised), and once again grateful to you.

What a strange world!

But also what a sad situation the citizens of our democracy are in — to be in disagreement over such consequential things, and to feel motivated to react with resentment toward others who see things differently from them.

Maybe by indulging our curiosity, you and I and others will learn things that can be used to help the members of our culturally pluralistic society converge in their understandings of the best available evidence of the dangers we face and how to abate them.

Wednesday
Feb132013

Evidence-based Climate Science Communication (new paper!)

Here's a new paper. Comments welcome!

There are 2 primary motivations for this essay.

The first might be pretty obvious to people who have been able to observe organized planning and execution of climate-science communication first hand. If not, read between the lines in  the first few pages & you will get a sense.  

Frankly, it frustrates me to see how ad hoc the practice of climate-science communication is. There's a weird disconnect here. People who are appropriately concerned to make public-policy deliberations reflect the best available scientific evidence don't pursue that goal scientifically.

The implicit philosophy that seems to animate planning and executing climate-science communication is "all opinions are created equal."

Well, sorry, no. All opinions are hypotheses or priors. And they can't all be equally valid. So figure out empirically how to identify the ones that are.

Indeed, take a look & see what's already been tested. It's progress to recognize that yesterday's plausible conjecture is today's dead end or false start. Perpetually recycling imaginative conjectures instead of updating based on evidence condemns the enterprise of informed communication to perpetual wheel-spinning.

My second motivation is to call attention to local adaptation as one of the field "laboratories" in which informed conjectures should be tested. Engagement with valid science there can help promote engagement with it generally. Moreover, the need for engagement at the local level is urgent and will be no matter what else happens anyplace else. We could end carbon emissions today, and people in vulnerable regions in the U.S. would still be facing significant adverse climate impacts for over 100 yrs. The failure to act now, moreover, will magnify the cost -- in pain & in dollars -- that people in these regions will be needlessly forced to endure.

So let's get the empirical toolkits out, & go local (and national and international, too, just don't leave adaptation out).

Thursday
Feb072013

The declining authority of science? (Science of Science Communication course, Session 3)

This semester I'm teaching a course entitled the Science of Science Communication. I have posted general information on the course and will be posting the reading list at regular intervals. I will also post syntheses of the readings and the (provisional, as always) impressions I have formed based on them and on class discussion. This is the third such synthesis. I eagerly invite others to offer their own views, particularly if they are at variance with my own, and to call attention to additional sources that can inform understanding of the particular topic in question and of the scientific study of science communication in general.

In Session 3, we finished off “science literacy and public attitudes” by looking at “public attitudes” toward science. The theory for investigating the literature here is that if one wants to understand the mechanisms by which scientific knowledge is transmitted in various settings, it likely is pretty important to consider how much value people attach to being informed of what science knows.

1.  So what are we talking about here? I’m going to refer to the “authority of science” to mean assent to its distinctive understanding of “knowing” as valid and as superior to competing understandings (e.g., a religious one that treats as known matters revealed by the word of God, etc.). The relevant literature on “attitudes toward science” tries to assess the extent of the authority of science, including variation in it among different groups and over time.

Indeed, a dominant theme in this literature is the declining or contested status of the authority of science. “Many scholars and policy makers fear that public trust in organized science has declined or remains inadequate,” summarizes Gauchat, a leading researcher in this field. What accounts for that?

2. Well, what are they talking about? But before examining the explanations for the growing resistance to the authority of science, it’s useful to interrogate the premise: why exactly would anyone worry that the authority of science is seriously in doubt in American society? 

Pew did an amazingly thorough and informative survey in 2009 and concluded “Americans like science.” They “believe overwhelmingly that science has benefited society and has helped make life easier for most people.”

This sentiment, moreover, is pretty widespread. “Partisans largely agree on the beneficial effects of science,” the Pew Report continues, “with 88% of Republicans, 84% of independents and 83% of Democrats saying the impact is mostly positive. There are differences—though not large—tied to race, education, and income.”

“[L]arge percentages,” too, “think that government investments in basic scientific research (73%) and engineering and technology (74%) pay off in the long run.” Again, this is not something that generates meaningful political divisions.

Data collected over three decades' time by the NSF suggests that this 2009 picture from Pew is but a frame in a thirty-year moving picture that shows -- well, a stationary object. Americans love science for all the wonderful things it does for them, want government to keep funding it, and have for decades.


Amusingly, the Pew Report seems to feel compelled to pay respect to the “declining authority” perception, even in the course of casting immense doubt on it. The subtitle of the Report is “Scientific Achievements Less Prominent Than a Decade Ago.” The basis of this representation turns out to be a question that asked subjects to select the “Nation’s greatest achievement” from a specified list. Whereas 47% picked “Science/medicine/technology” in 1999, only 27% did in 2009. Most of the difference, though, was reflected in the 12 percentage point increase in “Civil rights/Equal rights,” and nearly all the rest in “Nothing/Don’t know,” the only option chosen more often than “Science/medicine/technology.”

A better subtitle, then, would have been “After Election of America’s First African-American President, Recognition of Gains in Civil Rights Eats Away at Americans’ Awe of Science.”

3.  Uncritically examined assumptions tend to multiply.... I keep mentioning the bipartisan or nonpartisan aspect of the public’s warm feeling toward science because my guess is that the premise that the authority of science is in “decline” is an inference from the sad spectacle of political polarization on climate change. If so, then this would be a case where the uncritical acceptance of one assumption--that conflict over climate change reflects a decline in the authority of science-- has bred uncritical acceptance of another--that the authority of science is declining.

I could sort of understand why someone might hypothesize that people who are skeptical about climate change don’t accept science’s way of knowing, but not why anyone would persist in this view after examining any reasonable amount of evidence. 

The people who are skeptical about climate change, just like those who believe in it, believe by an overwhelming margin that “scientists contribute to the well-being of society.”  The reason that there is public division on climate change is not that one side rejects scientific consensus but that the two disagree about what the “consensus” on climate change is, a conclusion supported by numerous studies including the Pew Report.

A related mistake is to treat the partisan divide on climate as evidence that “Republicans” are “anti-science.”  Not only do the vast majority of such individuals who identify as Republican view science and its impact on society positively. They also, as the Pew Report notes, hold views on nuclear power more in keeping with those of scientists (who are themselves overwhelmingly Democratic) than the vast majority of ordinary members of the public who call themselves “Democrats.”

Another probable basis for the ill-supported premise that science’s authority is low or in decline etc. is the high proportion of the U.S. population—close to 50%--who say they believe in divine creation.  In fact, the vast majority of those who say they don’t believe in evolution also have highly positive views about the value of science.

I suppose one could treat the failure to “accept” evolution (or to “believe” in climate change)  as “rejection” of the authority of science by definition. But that would be a boring thing to do, and also invite error.

It would be boring because it would foreclose investigation of the extremely interesting question of how people who hold one position they know is rejected by science can nevertheless persist in an extremely positive view of science in general -- and simply live in a manner that so pervasively assumes science’s way of knowing is the best one (I don’t know for sure but am pretty confident that people who don’t believe in evolution are not likely to refuse to rely on a GPS system because its operation reflects Einstein’s theories of relativity, e.g.).

The error that's invited by equating rejection of evolution or climate change with “rejection of the authority of science” is the conclusion that the rejection of the authority of science causes those two beliefs. Definitions, of course, don’t cause anything. So if we make the awkward choice to analytically equate rejection of evolution or of climate change with rejection of the authority of science, we will have to keep reminding ourselves that “rejection of the authority of science” would then be a fallacious answer to the question of what really does cause differences in public beliefs about evolution and about climate change.

4. But then what are the “public attitude” measures measuring? The public attitude scholars, and in particular Gauchat, report lots of interesting data on the influences on attitudes toward science. The amount of variance they find, moreover, seems too large to be understood as an account of the difference between the 85% of Americans who seem to think science is great and the 15% or so who seem to have a different view. The question thus becomes, what exactly are they measuring and what’s its relationship to people’s disposition to be guided by science’s way of knowing on matters of consequence to their decisionmaking?

Literally what these scholars are measuring is variance in a composite scale of attitudinal Likert items that appear in the GSS and the NSF Science Indicators. The items consist of statements (with which respondents indicate their level of disagreement or agreement on a 5- or 7-point scale) like these:

  1. Because of science and technology, there will be more opportunities for the next generation.
  2. We depend too much on science and not enough on faith.
  3. Scientific research these days doesn’t pay enough attention to the moral values of society.
  4. Science makes our way of life change too fast.
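
For anyone curious about the mechanics, here is a minimal sketch (in Python, with made-up responses and invented column names) of how such a composite scale is typically built: negatively worded items are reverse-coded so everything points the same direction, and the items are then averaged into a single score (whose internal consistency researchers usually check with a statistic like Cronbach's alpha):

    # Minimal sketch of building a composite Likert scale.
    # Responses and column names are hypothetical.
    import pandas as pd

    df = pd.DataFrame({
        "more_opportunities": [5, 4, 2, 5],  # pro-science wording
        "too_much_science":   [1, 2, 4, 1],  # anti-science wording
        "ignores_morals":     [2, 2, 5, 1],  # anti-science wording
        "changes_too_fast":   [1, 3, 4, 2],  # anti-science wording
    })

    # Reverse-code the negatively worded items: on a 1-5 scale the
    # reversed value is 6 minus the original, so high always = pro-science.
    for col in ["too_much_science", "ignores_morals", "changes_too_fast"]:
        df[col] = 6 - df[col]

    df["science_attitude"] = df.mean(axis=1)  # the composite scale score
    print(df)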

I think these items are measuring something interesting, because Gauchat has found that they correlate in interesting ways with other individual characteristics. One of these is an attitudinal disposition that Gauchat calls “institutional alienation,” which measures trust in major institutions of government and civil society. They also correlate highly with science literacy.

But in truth, I’m not really sure what the disposition being measured by this type of “public science attitude” scale is. Because we know that in fact the public reports having high regard for science, a composite “science attitude” scale presumably is picking up something more general than that. I am unaware (maybe a reader of this blog will direct me to relevant literature) of attempts to validate the “science attitude” scale in relation to whether people are willing to rely on science in their lives—for example, in seeking medical treatment from physicians, or making use of safety-related technologies in their work, etc. I would be surprised if that were so, given how unusual it is in the US & other modern, liberal democratic societies to see behavior that reflects genuine distrust of science’s authority. My guess is that the “public science attitudes” scales are measuring something akin to “anti-materialism” or “spiritualism.” Or maybe this is the elusive “fatalism” that haunts Douglas’s group-grid!

Indeed, I think Gauchat is interested in something more general than the “authority of science,” at least if we understand that to mean acceptance of science’s way of knowing as the best one.  He is looking for and likely finding pockets of American society that are unsatisfied with the meaning (or available meanings) of a life in which science’s authority is happily taken for granted by seemingly all cultural communities, even those for whom religion continues to furnish an important sentimental bond. 

For his purpose, though, he probably needs better measures than the ones that figure in the GSS and NSF batteries. I bet he’ll devise them. I suspect when he does, too, he’ll find they explain things that are more general than (& likely wholly unrelated to) partisan political disputes over issues like climate change.

Finally, in a very interesting paper, Gauchat examines variance in a GSS item that asks respondents to indicate how much “confidence” they have “in the people running . . . the Scientific Community”—“a great deal,” “only some,” or “hardly any.”  Gauchat reports finding that the correlation between identifying themselves as politically “conservative” and selecting “great deal” in response to this item has declined in the last 15 years. It’s interesting to note, though, that only about 50% of liberals have over time reported “a great deal” of confidence in “the people running . . . the Scientific Community,” and the individuals historically least likely to have a “great deal of trust” identify themselves as “moderates.”

I have blogged previously on this paper. I think the finding bears a number of possible interpretations. One is that Republicans have become genuinely less “confident” in the “people running the Scientific Community” during the period in which climate change has become more politically salient and divisive. Another is that climate skepticism is exactly what the GSS “confidence” item—or at least variance in it—is really measuring; it seems reasonable that conservatives might understand the (odd!) notion of “people running the Scientific Community” to be an allusion to climate scientists.  Gauchat’s finding thus points the way for additional interesting investigations.

But whatever this item is measuring, it is not plausibly understood as a measure of a general acceptance of the authority of science, at least if that concept is understood as assent to the superiority of science’s way of knowing over alternative ones.

Republicans continue to go to doctors and use microwave ovens—and continue to say, as they have for decades, that they admire scientists and science, no doubt because it furnishes them with benefits both vital and mundane. 

They don’t (for the most part) believe in climate change, and if they are religious they probably don’t believe in evolution (same for religious Democrats).

But that’s something that needs another, more edifying explanation than “decline in the authority of science.”

Reading list

Wednesday
Feb062013

Yet another installment of: "I only *study* science communication ..." 

Man, I suck at communicating!

I’ve now received 913 messages (in addition to many many comments) from scientists saying  “I attended your recent presentation, and you did fine—everyone loved you. Seriously. Don’t jump – here’s a number to call for help.  Okay? Okay?”

I see exactly what happened, of course. Despite my intentions, I came across like a whining, self-pitying baby, because I wrote something that made me sound like a whining, self-pitying baby!

Actually, the potential miscommunication I am most anxious to fix is any intimation that the audience at the North American Carbon Program meeting made me feel I wasn't playing a constructive role in the discussion. Definitely no one did in Q&A. And after, the comments from the many people who lingered to discuss consisted of "very interesting!" (n = 3), "thanks for giving us something to think about" (n = 2), & "[really interesting observation/question relating to the data & issues]” (n = 7). (Like I said in the talk, it is essential to collect data, and not just go on introspection, when assessing the impact of science communication strategies.)

The source of the disappointment was wholly internal.  Also—but please don’t take this as reason to console me; I’m fine!—I remain convinced it was warranted.  I have proof: interrogating the feeling has enabled me to learn something.

So let me try this again . . . .

Something astonishing and important happened on  Monday.

I got the opportunity to address a room full of scientists who, by showing up (& not leaving for 2 hrs!), by listening intently, by asking thoughtful questions, by sharing relevant experiences, and by offering reasonable proposals proved that they, like me, see fixing the science communication problem as one of the most pressing and urgent tasks facing our society.

Of course, I stand by my position (subject, forever, to revision in light of new evidence) on what the source of the problem is. Also, I am happy, but hardly surprised, to learn that members of the audience didn’t at all resent my registering disagreement when I felt doing so would serve the goal of steering them—us—clear of what I genuinely believe to be false starts and dead ends.

What disappoints me is not that I felt obliged to say “no,” "I don't think so," and “not that.”

It is that I failed to come fully prepared to identify, for an audience of citizen scientists who afforded me the honor of asking for my views, what I believe they can do as scientists to help create a science communication environment in which diverse citizens can be expected to converge on the best available scientific evidence as they deliberate over how best to secure their common ends.

I said (in my last post), “the scientist’s job is to do science, not communicate it.”  I didn’t convey my meaning as clearly as I wish I had (because, you see, science communication is only a hobby for me; my job is to contribute to scientific understanding of it).

Of course, scientists “communicate” as part of their job in being scientists.  But that communication is professional; it is with other scientists. Their job is not to communicate  their science to nonexperts or members of the public.

This is a critical point to get clear on, so I will risk going on a bit.

The mistake of thinking that doing valid science is the same as communicating the validity of valid science is what got us into the mess we are in! Communicating and doing are different; and the former is something that admits of and demands its own independent scientific investigation.

In addition, the expert use of the scientific knowledge that the study of science communication creates is something that requires professional training and skill suited to communicating science, not doing science. Expecting the scientist to communicate the validity of her science because she had the professional skill needed to generate it is like expecting the players in a major league baseball game to do radio play-by-play at the same time, and then write up sportspage accounts for the fans who couldn’t tune in.

Yes, yes, there’s Carl Sagan; he’s the Tim McCarver of science communication. For sure be Carl Sagan or better still Richard Feynman if you possibly can be, b/c as I said, if you can help me and other curious citizens to participate in the wonder of knowing what is known to science, you will be conferring an exquisite benefit of immeasurable intrinsic value on us! Still, that won’t solve the climate change impasse either.

But neglecting to add this was my real mistake: just because what you say in or about your job as a scientist won’t dispel controversy over climate change does not mean that it isn’t your duty as a citizen scientist to contribute to something only scientists are in a position to do, something that is essential not only to dispelling controversy over climate science but to addressing what caused that controversy and numerous others (nuclear power . . . HPV vaccine), and that will continue to cause us to experience even more of the same (GM foods . . . synthetic biology) if not corrected.

The cause of the science communication problem is the disjunction between the science of science communication and the practice of science and science-informed policymaking.  We must integrate them—so that we can learn as much as we can about how to communicate science, and never fail to use as much as we know about how to make what’s known to science known by those whose well-being it can serve.

Coordinated, purposeful effort by the institutional and individual members of the scientific community is necessary to achieve this integration (not sufficient; but I’ll address what others must do in part 5,922 of this series of posts). That was the message—the meaning—of the National Academy of Sciences’ “Science of Science Communication” Sackler Colloquium last spring.

Universities are where both science and professional training of those whose skills are informed by science take place. Universities—individually and together—must organize themselves to assure that they contribute, then, to the production of knowledge and skill that our society needs here.

What does that mean? Not necessarily one thing (such as, say, a formal “science of science communication” program or whathaveyou). But any of a large number of efforts that a university can make, if it proceeds in a considered and deliberate way, to make sure that its constituent parts (its various social science graduate departments, its professional schools, its interdisciplinary centers and whatnot) predictably, systematically interact in a manner that advances the integration of the forms of knowledge that must be combined.

So make this happen:

Combine with others within your university and petition, administer, or agitate as necessary to get your institution both to understand and make its contribution to this mission in whatever way intelligent deliberation recommends.

Model it yourself by teaching—or better yet co-teaching with someone in another discipline that also should be integrated—a course called the “Science of Science Communication” that’s cross-listed in multiple relevant programs.

Infect a brilliant student or two or fifty with excitement and passion for contributing to the creation of the knowledge that we need—and do what you can to demonstrate that should they choose this path their scholarly excellence will be conferred the recognition it deserves (or at least won’t compromise their eligibility for tenure!).

Is that it? No other things that scientists can do? 

I’m sure there are others (to be taken up in later posts, certainly, I promise). But making your university bear its share of the burden of contributing to the collective project of melding science and science-informed policymaking with the science of science communication is the single most important thing you can do as a scientist to solve the science communication problem.

But don’t stop doing your science, and just keep up the great work (no need to change how you talk) in that regard.

Okay. Next question?  

Tuesday
Feb052013

Another installment of: "I only study science communication -- I didn't say I could do it!" 

Gave a talk yesterday at the North American Carbon Program’s 2013 meeting, “The Next Decade of Carbon Cycle Research: From Understanding to Application.”

Obviously, I would have been qualified to be on any number of panels (best fit would have been “Model-data Fusion: Integrated Data-Model Approaches to Carbon Cycle Research”), but opted to serve on the “Communicating Our Science” one (slides here).

The highlights for me were the excellent presentations by Jeff Kiehl, an NCAR scientist who has really mastered the art of communicating complicated and controversial science to diverse audiences, and former Rep. Bob Inglis, who now heads up the Energy and Enterprise Initiative, a group that advocates using market mechanisms rather than centralized regulation to manage carbon emissions. I also learned a lot from the question/answer period, where scientists related their experiences, insights, & concerns.

To be honest, I’m unsure that I played a constructive role at all on the panel, & I’ve been pondering this.

The theme of my talk was “the need for evidence-based science communication.” I stressed the importance of proceeding scientifically in making use of the knowledge that the science of science communication generates. Don't use that knowledge to construct stories; use it to formulate hypotheses about what sort of communication strategy is likely to work -- and then measure the impact of that strategy, generating information that you & others can use to revise and refine our common understanding of what works and what doesn't.

I'm happy w/ what I had to say about all of this, but here's why I’m not really sure it was useful:

1.  I don’t think I was telling the audience what they wanted to know. These were climate scientists, and basically they were eager to figure out how they could communicate their science more effectively.

My message was one aimed, really, at a different audience, those whom  I think of as “science communication practitioners.”  Like Bob Inglis, who is trying to dispel the fog of accumulated ideological resonances that he believes obscures from citizens who distrust government regulation the role that  market mechanisms can play in reducing climate-change risks. Or Jeff Kiehl, who is trying to figure out how to remove from the science communication environment the toxic partisan meanings that disable the rational faculties that citizens typically use to figure out what is known to science.  Or municipal officials and others who are trying to enable parties in stakeholder deliberations on adaptation in Florida and elsewhere to make collective decisions informed by the best available science.

2.  Indeed, I think I told the audience a number of things its members actually didn’t want to hear. One was that it’s almost certainly a mistake to think that how scientists themselves communicate their science will have much impact on the quality of public engagement with climate science.

For the most part, ordinary members of the public don’t learn what is known to science from scientists. They learn it from interactions with lots of other nonscientists (typically, too, ones who share their values) in environments that are rich with cues that identify and certify what’s collectively known.

There’s not any meaningful cultural polarization in the U.S., for example, over pasteurization of milk. That’s not because biologists do a better job explaining their science than climate scientists have done explaining theirs. It’s because the diverse communities in which people learn who knows what about what are reliably steering their members toward the best available scientific evidence on this issue—as they are on a countless number of other ones of consequence to their lives.

Those communities aren’t doing that on climate change because opposing positions on that issue have come to be seen as badges of loyalty to opposing cultural groups. It’s possible, I think, to change that. But the strategies that might accomplish that goal have nothing to do with the graphic representations (or words) scientists use for conveying the uncertainty associated with climate-model estimates.

I also felt impelled to disagree with the premises of various other genuinely thoughtful questions posed by the audience. E.g., that certain groups in the public are skeptical of climate change because it threatens their “interests” or lifestyle as affluent consumers of goods associated with a fossil-fuel driven economy. In fact (I pointed out), wealth in itself doesn’t dispose people to downplay climate change risks; it magnifies the polarization of people with different values.

Maybe I was being obnoxious to point this out. But I think scientists should want their views about public understandings of science to accord with empirical evidence.

I also think it is important to remind them that if they make a claim about how the public thinks, they are making an empirical claim. They might be right or they might be wrong. But personal observation and introspection aren’t the best ways to figure that out; the sort of disciplined observation, measurement, and inference that they themselves use in their own domain are.

Shrugging one's shoulders and letting empirically unsupported or contestable claims go by unremarked amounts to accepting that a discussion of science communication will itself proceed in an unscientific way.

Finally, I felt constrained to point out that ordinary citizens who have the cultural identity most strongly associated with climate-change skepticism actually aren’t anti-science.

They love nanotechnology, e.g.

They have views about nuclear power that are more in keeping with “scientific consensus” (using the NAS reports as a benchmark) than those who have a recognizable identity or style associated with climate change concern.

If you want to break the ice, so to speak, in initiating a conversation with one of them about climate science, you might casually toss out that the National Academy of Sciences and the Royal Society have both called for more research on geoengineering. “You don’t say,” he’s likely to respond.

Now why’d I do this? My sense is that the experience with cultural conflict over climate change has given a lot of scientists the view that people are culturally divided about them.  That’s an incorrect view—a non-evidence-based one (more on that soon, when I write up my synthesis of Session 3 of the Science of Science Communication course). 

It’s also a misunderstanding that I’m worried could easily breed a real division between scientists and the public if not corrected. Hostility tends to be reciprocated. 

It's also sad for people who are doing such exciting and worthwhile work to labor under the false impression that they aren't appreciated (revered, in fact).

3.  Finally,  I think I also created the impression that what I was saying was in tension with the great advice they were getting from the one panelist most directly addressing their central interest.

I’d say Jeff Kiehl was addressing the question that members of the audience most wanted to get the answer to: how should a climate scientist communicate with the public in order to promote comprehension and open-minded engagement with climate science?

Jeff talked about the importance of affect in how people form perceptions of risk.  The work of Paul Slovic, on whom Jeff was relying, 100% bears him out.

In my talk, I was critical of the claim that the affect-poor quality of climate risks relative, say, to terrorism risks, explains why the public isn’t as concerned about climate change as climate scientists think they should be. 

That’s a plausible conjecture; but I think it isn’t supported by the best evidence. If it were true, then people would generally be apathetic about climate change. They aren’t; they are polarized.

It’s true that affective evaluations of risk sources mediate people’s perceptions of risk. But those affective responses are the ones that their cultural worldviews attach to those risk sources. Super scientist of science communication Ellen Peters has done a kick ass study on this!

What’s more, as I pointed out in my talk, people who rely more on “System 2” reasoning (“slow, deliberate, dispassionate”) are more polarized than those who rely predominantly on affect-driven system 1.

But this is a point, again, addressed to communication professionals: the source of public controversy on climate change is the antagonistic cultural meanings that have become attached to it, not a deficit in public rationality; dispelling the conflict requires dissipating those meanings—not identifying some magic-bullet “affective image.”

What Kiehl had to say was the right point to make to a scientist who is going to talk to ordinary people.  If that scientist doesn’t know (and she might well not!) that ordinary members of the public tend to engage scientific information affectively, she will likely come off as obtuse!

What’s more, nothing in what I had to say about the limited consequence of what scientists say for public controversy over climate change implies that scientists shouldn’t be explaining their science to ordinary people, and doing so in the most comprehensible, and engaging way possible.

Lots of ordinary people want to know what the scientists do. In the Liberal Republic of Science, they have a right to have that appetite—that curiosity—satisfied!

For the most part, performing this critical function falls on the science journalist, whose professional craft is to enable ordinary members of the public to participate in the thrill and wonder of knowing what is known to science.

Secondary school science teachers, too: they inculcate exactly that wonder and curiosity, and wilily slip scientific habits of mind in under the cover of enchantment!

The scientist’s job is to do science, not communicate it.

But any one of them who out of public spiritedness contributes to the good of making it possible for curious people to share in the knowledge of what she knows is a virtuous citizen.

Regardless of whether what she's doing when she communicates with the public contributes to dispelling conflict over climate change.

Friday
Feb012013

Cultural cognition & cat-risk perceptions: Who sees what & why?

So like billions of others, I fixated on this news report yesterday:

Obvious fake! These are professional-model animals posing for a staged picture. Shame on you, NYT!

For all the adorable images of cats that play the piano, flush the toilet, mew melodiously and find their way back home over hundreds of miles, scientists have identified a shocking new truth: cats are far deadlier than anyone realized.

In a report that scaled up local surveys and pilot studies to national dimensions, scientists from the Smithsonian Conservation Biology Institute and the Fish and Wildlife Service estimated that domestic cats in the United States — both the pet Fluffies that spend part of the day outdoors and the unnamed strays and ferals that never leave it — kill a median of 2.4 billion birds and 12.3 billion mammals a year, most of them native mammals like shrews, chipmunks and voles rather than introduced pests like the Norway rat.

The estimated kill rates are two to four times higher than mortality figures previously bandied about, and position the domestic cat as one of the single greatest human-linked threats to wildlife in the nation. More birds and mammals die at the mouths of cats, the report said, than from automobile strikes, pesticides and poisons, collisions with skyscrapers and windmills and other so-called anthropogenic causes.

My instant reaction (on G+) was: bull shit!

My confidence that I knew all the facts here -- and that the study, published in Nature Communications, was complete trash and almost surely conducted by researchers in the pocket of the bird-feed industry -- was based on my recollection of some research I’d done on this issue a few yrs ago (I’m sure in response to a rant against cats and bird “genocide” etc.). I recalled that there was "scientific consensus" that domestic cats have no net impact on wildlife populations in the communities that people actually inhabit (yes, if you put them on an island in the middle of the Pacific Ocean, they'll wipe out an indigenous species or two or twelve). But I figured (after posting, of course) that I should read up and see if there was any more recent research.

What I found, unsurprisingly, is that either there is no scientific consensus on the net impact of cats on wildlife populations or there is no possibility that any reasonable and intelligent nonexpert could confidently discern what that consensus is through the fog of cultural conflict!

Check this out:

And this:

This is definitely a job for the science of science communication!

So what I’d like is some help in forming hypotheses. E.g.,

1. What are the most likely mechanisms that explain variance in who perceives what and why about the impact of cats on wildlife populations? Obviously, I suspect motivated reasoning: people (myself included, it appears!) are conforming their perceptions of the evidence (what they read in newspapers or in journals; what they “see with their own eyes,” etc.) to some goal or interest or value extrinsic to forming an accurate judgment. But what are the other plausible mechanisms? Might people be forming perceptions based on exogenous “biased sampling”—systematically uneven exposure to opposing forms of information arising from some influence that doesn't itself originate in any conscious or unconscious motivation to form or preserve a particular belief (e.g., whether they live in the city or country)? Something else? What sorts of tests would yield evidence that helps to figure out the relative likelihood of the competing explanations? (See the toy sketch after this list of questions for one way to think about that.)

2. Assuming motivated reasoning explains the dissensus here, is the motivating influence the dispositions that inform the cultural cognition framework? How might perceptions of the net impact of cats on wildlife populations be distributed across the hierarchy-egalitarian and individualist-communitarian worldview dimensions? Why would they be distributed that way?

3.  Another way to put the last set of questions: Is there likely to be any relationship between who sees what and why about the impact of cats on wildlife population and perceptions of climate change risks? Of gun risks? Of whether childhood vaccinations cause autism? Of whether Ray Lewis consumed HGH-laced deer antler residue?

4.  If the explanation is motivated reasoning of a sort not founded on the dispositions that inform the cultural cognition framework, then what are the motivating dispositions? How would one describe those dispositions, conceptually? How would one measure them (i.e., what would the observable indicators be)?
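
As promised above, here is a toy sketch (in Python; every number is invented) of why motivated reasoning and biased sampling are observationally distinguishable even though both can produce the same displaced belief. The diagnostic is experimental: hand everyone an identical, balanced evidence stream and see whose beliefs converge. Only the biased sampler's belief snaps back:

    # Toy contrast between two mechanisms that could generate the same
    # dissensus (all parameters invented). Diagnostic: equalize the
    # evidence streams and see which population's beliefs converge.
    import numpy as np

    rng = np.random.default_rng(1)

    def belief(evidence, congruent_weight=1.0):
        """Mean of +1/-1 evidence items, optionally overweighting items
        congruent with the agent's prior (coded +1)."""
        w = np.where(evidence > 0, congruent_weight, 1.0)
        return np.average(evidence, weights=w)

    balanced = rng.choice([-1, 1], size=1000)               # 50/50 stream
    skewed = rng.choice([-1, 1], size=1000, p=[0.2, 0.8])   # skewed stream

    print("unbiased reasoner, skewed stream:  ", belief(skewed))    # ~0.6
    print("unbiased reasoner, balanced stream:", belief(balanced))  # ~0.0
    print("motivated reasoner, balanced stream:",
          belief(balanced, congruent_weight=3.0))                   # ~0.5

The unbiased reasoner's belief is displaced only while the stream is skewed; the motivated reasoner's belief stays displaced even on balanced evidence, because the bias lives in the weighting, not the exposure.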

Well? Conjectures, please -- on these or any other interesting questions.

By the way, if you'd like to see a decent literature review, try this:

Barbara Fougere, Cats and wildlife in the urban environment.

 
