This is just the first post in a series to address a very small question that I’m sure we can quickly dispose of.
But here’s the question:
I’m sure the vast majority of you need no further explanation. But for newbies, this is a “tweet” from “Fearless Dave” Ropeik, the public risk perception expert who correctly believes it is irrational to worry about anything. Likely you all remember the discussion we recently had about how Fearless Dave had his kids go over & play with the next-door neighbors’ children when they had Ebola because he figured it was much better for his kids to get the disease when they were young than when they were grown-ups. Of course—this is the perfect System 2 rationality we all aspire to!
But anyway, what he’s asking is—why do cultural affinities (like being an “egalitarian communitarian” as opposed to a “hierarch individualist”) make such a big difference in perceptions of the risk of climate change, or owning a handgun, or nuclear energy?
Fearless Dave doesn’t mean why as in “what are the mechanisms that generate such big disparities in the proportion of people of one type who believe that human beings are heating up the climate & the proportion of another type who believe that?”; he’s quite familiar with (and a very lucid expositor and insightful interpreter of) all manner of work on risk perception, including the research that shows how people of opposing identities conform all manner of information—from their interpretation of data to their assessments of arguments to their perception of the expertise of scientists to what they observe with their own eyes—to the position that predominates in their group.
What he wants to know is why these cognitive mechanisms are connected to group identities. Why are people so impelled to fit their views to their groups'? And why do the groups disagree so intently?
Is there, Fearless Dave wonders, some sort of genetic hard wiring having to do with the evolutionary advantages, say, that “Democratic” or “nonreligious” cavepeople & “Republican” “religious” cavepeople got from forming opposing estimates of the risk of being eaten by a sabre-toothed tiger on the savannah--and then going to war w/ each other over their disagreement?
Really good question.
I don’t know.
But I and a few other twitterers offered some conjectures:
Now probably this exchange needs no explanation either.
But basically, I and Jay Van Bavel are disagreeing about the reason cultural identities generate conflicting perceptions of risk and like facts.
Or maybe we aren’t. It’s hard to say.
While Twitter is obviously the venue most suited for high-quality scholarly interaction, I thought I’d move the site of the exchange over to the CCP Blog--so that you, the 12 billion regular readers of this blog (for some reason 2 billion people unsubscribed after my last post!), could participate in it too.
Just to get the ball of reasoned discussion rolling, I’m going to sketch out two competing answers to Fearless Dave’s question: the “Tribal Science Epistemologies Thesis” (TSET) and the “Polluted Scicomm Environment Thesis” (PSET). The answers aren't "complete" even on their own terms, but they convey the basics of the positions they stand for and give you a sense of the attitudes behind them too.
TSET. People are by nature factional. They use in-group/out-group distinctions to organize all manner of social experience—familial, residential, educational, occupational, political, recreational (“f***ing Bucky Dent!”). The ubiquity of this impulse implies the reproductive advantage it must have conferred in our formative prehistory. Its permanence is testified to by the unbroken narrative of violent sectarianism our recorded history comprises.
The mechanisms of cultural cognition reflect our tribal heritage. The apprehension of danger in behavior that deviates from a group’s norms fortifies a group’s cohesion. Imputing danger to behavior characteristic of a competing group’s norms helps to stigmatize that group’s members and thus lower their status. Cultural cognition thus reliably converts the fears and anxieties of a group’s members into the energy that fuels that group’s drive to dominate its rivals.
In a democratic political order, these dynamics will predictably generate cultural polarization. Opposing positions on societal risks (climate change, gun ownership, badger infestation) supply conspicuous markers of group differentiation. Democratically enacted policies endorsing or rejecting those positions supply evocative gestures for marking the relative status of the groups that hold them.
Nothing has really changed. Nothing ever will.
PSET. Cultural conflict over risk and related facts is not normal. It is a pathology peculiar to the pluralistic system of knowledge certification that characterizes a liberal democratic society.
Individuals acquire their understanding of what is known to science primarily through their everyday interactions with others who share their basic outlooks. Those are the people they spend most of their time with, and the ones whose professions of expertise they can most reliably evaluate. Because all self-sustaining cultural groups include highly informed members and intact processes for transmitting what they know, this admittedly insular process nevertheless tends to generate rapid societal convergence on the best available evidence.
But not always. The sheer number of diverse groups that inhabit a pluralistic liberal society, combined with the tremendous volume of scientific knowledge such a society is distinctively suited to generating, makes occasional states of disagreement inevitable.
Even these rare instances of nonconvergence are likely to be fleeting.
But if by some combination of accident, misadventure, and strategic behavior, opposing perceptions of risk become entangled in antagonistic cultural meanings, dissensus is likely to endure and feed on itself. The material advantage any individual acquires by maintaining her standing within her cultural group tends to exceed the advantage of holding personal beliefs in line with the best evidence on societal risks. As a result, when people come to regard positions on risk as badges of membership in one or another group, they will predictably use their reason to persist in beliefs that express their cultural identities.
This identity-protective variant of cultural cognition is the signature of a polluted science communication environment. The entanglement of risks in antagonistic cultural meanings disables human reason and deprives the citizens of the Liberal Republic of Science of their political regime’s signature benefits: historically unprecedented civil tranquility and a stock of collective knowledge bountiful enough to secure their well-being from all manner of threat, natural and man-made.
But we can use our reason and our freedom to overcome this threat to our reason and our freedom. Dispelling the toxin of antagonistic cultural meanings from our science communication environment is the aim of the science of science communication—a “new political science for a world itself quite new.”
So? Which is closer to the truth—TSET or PSET?
What are the key points of disagreement between them? What might we already know that helps us to resolve these disagreements, and what sorts of evidence might we gather to become even more confident?
What are the alternatives to both TSET and PSET? Why might we think they are closer to the truth? How could we pursue that possibility through observation, measurement, and inference?
And what does each of the candidate accounts of why “group affiliation” has such a profound impact on our perception of risk and like facts imply about the prospects for overcoming the barrier that cultural polarization poses to making effective use of scientific knowledge to promote our ends, individual and collective?
BTW, why do I say "closer to the truth" rather than "true"? Because obviously neither TSET nor PSET is true, nor is any other useful answer anyone will ever be able to give to Fearless Dave's question. The question isn't worth responding to unless the person asking means, "what's a good-enough model of what's going on--one that gives me more traction than the alternatives in explaining, predicting, and managing things?"
So ... what's the answer to Fearless Dave's question? Do TSET & PSET help to formulate one?