Friday, Feb 22, 2013

The false and tedious "defective brain" meme

I know expressing exasperation doesn't really accomplish much but:

Please stop the nonsense on our “defective brains.”

Frankly, I don’t know why journalists write, much less why newspapers and newsmagazines continue to publish, the same breathless, “OMG! Scientists have determined we’re stupid!!!” story over & over & over. 

Maybe it is because they assume readers are stupid and will find the same simplistic rendering of social psychology research entertaining over & over & over.

Or maybe the writers who keep recycling this comic book account of decision science can't grasp the grownup version of why people become culturally polarized on risk and related facts—although, honestly, it’s really not that complicated!

Look: the source of persistent controversy over risks and related facts of policy significance is our polluted science communication environment, not any defects in our rationality.

People need to (and do) accept as known by science much much much more than they could possibly understand through personal observation and study.  They do this by integrating themselves into social networks—groups of people linked by cultural affinity—that reliably orient their members toward collective knowledge of consequence to their personal and collective well-being.

The networks we rely on are numerous and diverse—because we live in a pluralistic society (as a result, in fact, of the same norms and institutions that make a liberal market society the political regime most congenial to the flourishing of scientific inquiry).  But ordinarily those networks converge on what’s collectively known; cultural affinity groups that failed to reliably steer their members toward the best available evidence on how to survive and live well would themselves die out.  

Polarization occurs only when risks or other facts that admit of scientific inquiry become entangled in antagonistic cultural meanings. In that situation, positions on these issues will come to be understood as markers of loyalty to opposing groups.  The psychic pressure to protect their standing in groups that confer immense material and emotional benefits on them will then motivate individuals to persist in beliefs that signify their group commitments.

They'll do that in part by dismissing as noncredible or otherwise rationalizing away evidence that threatens to drive a wedge between them and their peers. Indeed, the most scientifically literate and analytically adept members of these groups will do this with the greatest consistency and success.  

Once factual issues come to bear antagonistic cultural meanings, it is perfectly rational for an individual to use his or her intelligence this way: being "wrong" on the science of a societal risk like climate change or nuclear power won't affect the level of risk that person (or anyone else that person cares about) faces, because nothing that person does as consumer, voter, public-discussion participant, etc., will be consequential enough to matter. Being on the wrong side of the issue within his or her cultural group, in contrast, could spell disaster for that person in everyday life.

So, in that unfortunate situation, the better our "brains" work, the more polarized we'll be. (BTW, what does it add to these boring, formulaic "boy, are humans dumb!" stories to say "scientists have discovered that our brains are responsible for our inability to agree on facts!!"? Where else could cognition be occurring? Our feet?!)
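To make the asymmetry in that calculus concrete, here is a minimal back-of-the-envelope sketch of the expected-cost comparison. Every number in it is invented purely for illustration (none appears in the research discussed here); all that matters is the orders-of-magnitude gap.

```python
# Illustrative sketch of the individual-rationality argument above.
# All magnitudes are assumptions chosen for illustration, not data.

p_pivotal = 1e-8    # chance one person's vote/consumer choice changes the policy outcome
risk_cost = 1e6     # assumed personal cost if the societal risk is actually realized
group_cost = 1e3    # assumed personal cost of being on the "wrong" side within one's group

# Expected personal cost of holding a mistaken belief about the science:
cost_wrong_on_science = p_pivotal * risk_cost   # = 0.01

# Expected personal cost of deviating from one's cultural group:
cost_wrong_in_group = group_cost                # = 1000.0

# The group penalty dwarfs the policy penalty, so forming identity-protective
# beliefs is individually rational, even though, aggregated across everyone,
# it is collectively disastrous.
assert cost_wrong_on_science < cost_wrong_in_group
```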

The number of issues that have that character, though, is minuscule in comparison to the number that don't. Where one stands on pasteurized milk, fluoridated water, high-power transmission lines, "mad cow disease," use of microwave ovens, exposure to Freon gas from refrigerators, treatment of bacterial diseases with antibiotics, the inoculation of children against Hepatitis B, etc., etc., etc., isn't viewed as a badge of group loyalty and commitment for the affinity groups most people belong to. Hence, there's no meaningful amount of cultural polarization on these issues—at least in the US (meaning pathologies are local; in Europe there might be cultural dispute on some of these issues & not on some of the ones that divide people here).

The entanglement of facts that admit of scientific investigation—e.g., “carbon emissions are heating the planet”; “deep geologic isolation of nuclear wastes is safe”—with antagonistic meanings occurs by a mixture of influences, including strategic behavior, poor institutional design, and sheer misadventure. In no such case was the problem inevitable; indeed, in most, such entanglement could easily have been avoided.

These antagonistic meanings, then, are a kind of pollution in the science communication environment.  They disable the normal and normally reliable faculties of rational discernment by which ordinary individuals recognize what is collectively known.

One of the central missions of the science of science communication in a liberal democratic state is to protect the science communication environment from such contamination, and to develop means for detoxifying that environment when preventive or protective measures fail.

This is the account that is best supported by decision science. 

And if you can’t figure out how to make that into an interesting story, then you are falling short in relation to the craft norms of science journalism, the skilled practitioners of which continuously enrich human experience by figuring out how to make the wonder of what's known to science known by ordinary, intelligent, curious people.



Reader Comments (12)

That's an interesting point and I can agree with much of it from my perspective in a totally different field. However, it's interesting that you claim that

"it is perfectly rational for an individual to use his or her intelligence this way:being "wrong" on the science of a societal risk like climate change or nuclear power won't affect the level of risk that person (or anyone else that person cares about) faces, while being on the wrong side of the issue within his or her culture could spell disaster for that person in everday life."
I would say this is perfectly "self-interested" but it's not rational, especially when there is no conscious intent to deceive or misrepresent.

You also write

"Indeed, the most scientifically literate and analytically adept members of these groups will do this with the greatest consistency and success. So the better our "brains" works, the more polarized we'll be."
but this denies any role for honesty, self-awareness, metacognition, and objectivity in thinking analytically, scientifically, or our brains working "better". I'm not sure I'm so cynical as to totally give up on those ideas as you seem to do.

I think that asserting that adaptive/self-interested wrongness is somehow optimal because it is successfully self-serving is wrong in a different way than presenting it as a defect is wrong.

February 22, 2013 | Unregistered CommenterMattK

@MattK:

Should have qualified, elaborated, since the impression you formed -- while not the one I meant to convey -- is reasonable.

In the study that is hyperlinked -- Polarizing Impact of Science Literacy & Numeracy on Perceived Climate Change Risks -- we take care to distinguish "individually rational" from "collectively rational" (same in Ideology, Motivated Reasoning, & Cognitive Reflection). The sense of "rationality" here is the one from rational choice theory; we are positing a collective action problem created by the incentive individuals experience to form perceptions of fact in identity-protective fashion & the collective interest of diverse people in converging on facts that are policy relevant. But nothing in such an account implies that what is "rational" at the individual level is normatively desirable; on the contrary, rational people will want to change the conditions that make it rational for each individual to behave in a way that dooms the whole to bad outcomes.

Also nothing cynical (intended) in this. On the contrary, the "we're so brainless" trope is cynical. My point is we can use our intelligence -- a science of science communication -- to protect our science communication environment. In a clean environment -- one that doesn't have contamination of antagonistic meanings tied to facts that admit of scientific investigation -- we don't polarize. We converge on best evidence.

Generally, I find talking about "rationality" more confusing than helpful, but here I like it as a remedy for the "bounded rationality" story-telling template that keeps getting slapped down on politically polarized debates like climate change. One can't tell whether someone's mode of reasoning is "biased" unless one starts w/ a defensible account of what he or she is trying to achieve. The "we're such irrational creatures!" trope glides right over that.

February 22, 2013 | Registered CommenterDan Kahan

Hi Dan,

I'm a frequent reader of the blog and have high regard for you. However, I'm quite disappointed in this post. I agree that journalists and even us laypeople can get caught up in the "we're stupid" meme. And I also agree that we're not -- we're smart beings!

That said, it seems you've become almost too entrenched in your cultural cognition explanation. This post comes close to offering a "magic bullet" explanation of polarization and misunderstanding of scientific findings.

You state, "Polarization occurs only when risks or other facts that admit of scientific inquiry become entangled in antagonistic cultural meanings." Really? So you've (and others) tested all other plausible alternative explanations for polarization and *only* found yours to have rigorous empirical evidence? I'm skeptical. I know you have tested some, but I am taken aback that someone with your disposition -- from what I read, motivated to find the best answers through scientific inquiry -- is this certain about something. You clearly have evidence for your claims, and your findings might even be the best explanation of polarization and interpretation of what's known in science.

But you also know that behavior is complex, and there are most likely many different causes of actions. The cultural cognition explanation is a good one, and certainly the communication environment is worth our attention! But I highly doubt it's the only cause. As you know, humans do have many fallibilities, both individually and collectively, and these should be taken into account as well, along with whatever other conjectures turn out to have evidence behind them.

I think Stanovich, from his "Thinking Straight about Psychology", put it best (and I'm sure you'd agree):

"But often people forget that behavior is multiply determined. They seem to want to find the so-called magic bullet—the one cause of the behavioral outcome that interests them…The world is complicated, and the determinants of behavior are many and complex. Just because we have demonstrated a cause of behavior does not mean that we have uncovered the only cause or even the most important cause. To provide a thorough explanation of a particular behavior, researchers must study the influence of many different variables and amalgamate the results of these studies to give a complete picture of all the causal connections.”

February 22, 2013 | Unregistered CommenterColin D

"In a clean environment-- one that doesn't have contamination of antagonistic meanings tied to facts that admit of scientific investigation -- we don't polarize. We converge on best evidence"

Do we? Or do we simply converge? Is it possible that we only notice it when there's a controversy?

If a social subset of the population can converge on a position that is wrong, and this is people behaving as people do, then what would it look like if the subset was the whole set? Wouldn't it look like everyone converging on the same non-controversial conclusion that everyone agreed was right? Wouldn't our identity as rational people who are members of the group being referred to when people say "everyone knows that..." be just as strong an inducement to conform?

After all, it's bad enough having half the population thinking you're nuts; how much more pressure would there be if the whole population thought you were nuts? So aren't we just as constrained by our social networks even when there isn't a political controversy?

It was just a thought. :-)

February 22, 2013 | Unregistered CommenterNiV

@Colin D:
I see your point & am troubled by it. I will have to reflect on it a bit.

February 22, 2013 | Unregistered Commenterdmk38

Ok, good clarification, thank you. More careful reading, or familiarity with social cognition, would probably have let me catch the nuance. Personally, I am concerned with preserving objectivity and metacognition as an aspiration or ideal, but I definitely see the need to have a better understanding of and sympathy for the reality of how people really think and how that can guide science communication strategy. I suppose one could draw parallels (and I don't mean this to be arrogant or condescending to anyone) between wanting to create effective policies to deal with high obesity (or smoking, alcohol, or any number of other issues) without undermining the value of personal responsibility and without denigrating/blaming people.

February 22, 2013 | Unregistered CommenterMattK

I'm in agreement with Colin's points. As much as I have been accused of obsessively pointing to "motivated reasoning" on climate blogs (it happens often, ask NiV), I had a similar reaction to your "only".

February 22, 2013 | Unregistered CommenterJoshua

"The psychic pressure to protect their standing in groups that confer immense material and emotional benefits on them will then motivate individuals to persist in beliefs that signify their group commitements."

Or, as Upton Sinclair put it, "It is difficult to get a man to understand something, when his salary [or, I would add, his position within his community, or his hope of eternal salvation] depends upon his not understanding it!"

February 23, 2013 | Unregistered Commenterpaulbraterman

The article has great value in that it points to conformity as an important driver behind public discourse on various topics. But there are several distinct categories into which people fall. Take, for example, the difference between informational conformity, where a person conforms because he/she believes the group's information to be true, and normative conformity, where the target knows the information is false, but pretends that it's true to "get along" or avoid social costs. Socially, both these individuals could speak the same non-truth, but one is a true fool and the other is a self-serving agent. Polarization results in either case on the surface, but the underlying cause is quite different.

Both moral psychology (Haidt, who tracks in-group preferences among other things) and evolutionary psychology, with its focus on sexual selection as well as natural selection, tell us why group status can be important. The answer, in my view, is to align truth-seeking with survival and status, and then educate the true fools.

February 23, 2013 | Unregistered CommenterEyesOpen

I agree with Colin for another reason. Some people are just contrary.

Though, I would point out that you have not been outlining a magic bullet in this seminar.

IMO, you still need a step (or steps) in the iterative process that asks whether the science is meeting the requirements of your model citizen to make a decision, rather than automatically assuming the problem is caused by antagonistic cultural meanings. As NiV pointed out, including this will make it a more powerful statement. And if the SoSC should determine that the science is not meeting the requirements of your model citizen, your SoSC will be more useful.

February 24, 2013 | Unregistered CommenterJohn F Pittman

I should correct an error I made in my first comment: The title of Stanovich's book is "How to Think Straight about Psychology"

February 24, 2013 | Unregistered CommenterColin Doms

This is a very interesting idea. I agree with you that ordinary Americans are not as dumb as their opponents would like to believe. I also agree with other posters that group identity cannot be the only criterion people use when deciding upon controversial issues - for instance, this article describes how conservatives and liberals use different parts of the brain to assess risk in gambling (http://www.upi.com/Health_News/2013/02/16/Left-right-process-risk-differently/UPI-21191361072361/).

I had two thoughts your article did not address. First, group leadership often intentionally promotes highly emotional and polarizing terms on issues where the facts are not really arguable (evolution vs creation), and no matter how carefully the scientific community communicates their findings, this leadership will create polarity for political reasons, employing the "most scientifically literate and analytically adept" people to develop their rhetoric. Their constituents will then, as you say, adopt and defend the position of trusted leaders using all the rational means they possess, for the rational reason that they need their group. You say these issues "become entangled in antagonistic cultural meanings" as if this is a passive process (sheer misadventure), and I believe it is not.

Second, people who have similar beliefs self-select as groups. My friend always had very conservative views, and finally moved out of the liberal North to the central South, where he feels people are "normal". He does not defend their platform (using his considerable brainpower) in order to secure his position in their culture; he made the rational choice to seek out a culture whose platform he could really defend.

You make a great point that people are generally not too dumb to understand the findings of science, but use criteria other than the scientific method to define their position on important issues. As for "detoxifying" scientific discourse, well, that is a tall order. Scientists are people too, and use all kinds of criteria, same as everybody else.

And you have to admit that some very loud people are really dumb.

February 25, 2013 | Unregistered CommenterLeslie
