Many thanks to all the people who sent me emails asking whether I'd seen Cass Sunstein's op-ed on "biased assimilation" today in the NYT: they made sure I didn't miss a good read!
Sunstein's basic argument is that inundating people with "balanced information" doesn't promote convergence on sound conclusions about policy because of "biased assimilation." For this, he cites (via the magic of hyperlinked text) the classic 1979 Lord, Ross & Lepper study on capital punishment.
Sunstein's proposal for counteracting this dynamic is to recruit ideologically congenial advocates to challenge people's preexisting views: "The lesson for all those who provide information," he concludes, is "[w]hat matters most may be not what is said, but who, exactly, is saying it."
Op-ed word limits and the aversion of editors to even modest subtlety make simplification inevitable. Given those constraints, what Sunstein manages in 800 words is a nice feat.
But being free of such constraints here, I'd say the growing "science of science communication" literature suggests a picture of public conflict over science that is simultaneously tighter and richer than the one Cass was able to present.
To begin, "biased assimilation" doesn't itself predict that identity-congruent messengers should be able to change minds. LR&L found only that people will construe information on controversial issues to reinforce what they already believe -- "confirmation bias," essentially.
I believe the phenomenon at work in polarized science debates is something more general: identity-protective motivated reasoning. This refers to the tendency of people to conform their processing of information -- whether scientific evidence, policy arguments, the credibility of experts, or even what they see with their own eyes -- to conclusions that reinforce the status of, and their standing in, important social groups.
"Biased assimilation" might sometimes be involved (or appear to be involved) when identity-protective motivated reasoning is at work. But because sticking to what one believes doesn't always promote one’s status in one’s group, people will often be motivated to construe information in ways that have no relation to what they already believe.
E.g., in a study that CCP did of nanotechnology risk perceptions, we did find that individuals exposed to "balanced information" became culturally polarized relative to ones who hadn't received balanced information. But those in the "no-information" condition, most of whom knew little about nanotechnology, were not themselves culturally divided; they had priors that were random with respect to their cultural views. Thus, the subjects exposed to balanced information selectively assimilated it not to their existing beliefs but to their cultural predispositions--which were attuned to affective resonances that either threatened or affirmed their groups' way of life.
Or consider a framing experiment we did involving "geoengineering." In it, we found that individuals culturally predisposed to be dismissive toward climate-change science were much more open-minded in their assessment of such science when they were first advised that scientists were proposing research into geoengineering and not only stricter CO2 limits as a response to climate change.
Biased assimilation -- the selective crediting or discrediting of information based on one's prior beliefs -- can't explain that result, but identity-protective motivated reasoning can. The congeniality of geoengineering, which resonates with pro-technology, pro-market, pro-commerce values, reduced the psychic cost of considering information to which individuals otherwise would have attached value-threatening implications--such as restrictions on commerce, technology, and markets.
Identity-protective motivated reasoning also explains the persuasiveness of the ideologically congenial advocates that Sunstein alluded to at the end of his column. The group values of the advocate are a cue about what position is predominant in a person's cultural group. If that cue is strong and credible enough, then people will go with the argument of the culturally congenial advocate even if the information he is presenting is contrary to their existing beliefs.
We examined this in a study of HPV-vaccine risk perceptions. In that experiment, we found that "balanced information" did polarize subjects along lines that reflected positions (and thus existing beliefs) predominant within their cultural groups. But when arguments were attributed to "culturally identifiable experts" -- fictional public health experts to whom we knew subjects would impute particular cultural values -- individuals consistently adopted the position advocated by the expert whose values they (tacitly) sensed were most like theirs.
This study shows not only that the influence of culturally congenial experts is distinct from, and stronger than, biased assimilation. It also helps to deepen our understanding of why.
Indeed, a reliable understanding of "why" -- and not merely analytical clarity -- is what's at stake here. As I'm sure Cass would agree, one needs to do more than reach into the grab bag of effects and mechanisms if one wants to explain, predict, and formulate prescriptions. One has to formulate a theoretical framework that integrates the dynamics in question and supplies reliable insights into how they are likely to interact. Identity-protective cognition (of which cultural cognition is one conception or, really, operationalization) is a theory of that sort, whereas "biased assimilation" is (at most) one of the mechanisms that theory connects to others.
If I'm right (I might not be; show me the evidence that suggests an alternative view) to see identity-protective cognition as the more general and consequential dynamic in disputes about policy-relevant science, moreover, then it becomes important to identify what the operative group identities are and the means through which they affect cognition. Sunstein suggests ideological affinity is important for the credibility of advocates. Well, sure, ideological affinity is okay if one is trying to measure identity-protective motivated reasoning. But for reasons I’ve set forth previously, I’d say cultural affinity is generally better -- if we are trying to explain, predict and formulate prescriptions that improve science communication.
As for whether recruiting ideologically congenial advocates is the "lesson" for those trying to persuade "climate skeptics," that's a suggestion that I'm sure Cass would urge real-world communicators to consult Bob Inglis about before trying. Or Rick Perry and Merck.
These two cases, of course, are entirely different from one another: Inglis took a brave stance based on how he read the science, whereas Perry took a payment to become a corporate sock-puppet. But both cases illustrate that deploying culturally congenial advocates to spread counter-attitudinal messages isn't a prescription that emerges from the literature in nearly as uncomplicated a manner as Sunstein might be seen to be suggesting.
The point generalizes. It's important to attend to the wider literature in the science of science communication because the lessons one might distill by picking out one or another study in social psychology risk colliding head on with opposing lessons that could be drawn from others examining alternative mechanisms.
Actually, I'm 100% positive Sunstein would agree with this. Again, one can't possibly be expected to address something as complex as reconciling off-setting cognitive mechanisms (here: "trust the guy with my values," on one hand, vs. "excommunicate the heretic" & the "Orwell effect," on the other) in the cramped confines of an op-ed.
Okay, enough of that. Going beyond the op-ed, I'm curious what Sunstein now thinks about the relationship between "biased assimilation" -- and identity-protective motivated reasoning generally -- and Kahneman's "system 1/system 2" & like frameworks of dual process reasoning.
This was something on which a number of CCP researchers including Paul Slovic, Don Braman, John Gastil & myself, debated Cass in a lively exchange in the Harvard Law Review before he took on his post in the Obama Administration. Sunstein's position then was that cultural cognition was essentially just another member of the system 1 inventory of "cognitive biases."
But research we've done since supports the hypothesis that culturally motivated reasoning isn't an artifact of “bounded rationality,” as Sunstein puts it. On the contrary, cultural cognition recruits systematic reasoning, and as a result generates even greater polarization among people disposed to use what Kahneman calls “system 2” processing.
Indeed, in our Nature Climate Change paper, we argued that this effect reflects the contribution that identity-protective cognition makes (or can make) to individual rationality. It's in the interest of individuals to conform their positions on climate change to ones that predominate within their group: whether an individual gets the science "right" or "wrong" on climate change doesn't affect the risk that climate change poses to him or to anyone else -- nothing he does based on his beliefs has any discernible impact on the climate; but being "wrong" in relation to the view that predominates in one's group can do an individual a lot of harm, psychically, emotionally, and materially.
The heuristic mechanisms of cultural cognition (including biased assimilation and cultural-affinity credibility judgments) steer a person into conformity with his or her cultural group and thus help to make that person's life go better. And being adept at system 2 only gives such a person an even greater capacity to "home in" on & defend the view that predominates in that person's group.
Of course, when we all do this at once, we are screwed. This is what we call the "tragedy of the risk perception commons." Fixing the problem will require a focused effort to protect the science communication environment from the sort of toxic cultural meanings that create a conflict between perceiving what is known to science and being who we are as individuals with diverse cultural styles and commitments.
I'm glad Cass is now back from his tour of public service (and grateful to him for having taken it on), because I am eager to hear what he has to say about the issues and questions that risk-perception scholars have been debating since he's been gone!
Kahan, D., Braman, D., Cohen, G., Gastil, J. & Slovic, P. Who Fears the HPV Vaccine, Who Doesn’t, and Why? An Experimental Study of the Mechanisms of Cultural Cognition. Law Human Behav 34, 501-516 (2010).
Kahan, D.M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L.L., Braman, D. & Mandel, G. The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Climate Change, advance online publication, http://www.nature.com/doifinder/10.1038/nclimate1547 (2012).
Lord, C.G., Ross, L. & Lepper, M.R. Biased Assimilation and Attitude Polarization - Effects of Prior Theories on Subsequently Considered Evidence. Journal of Personality and Social Psychology 37, 2098-2109 (1979).