
Saturday, Aug 23, 2014

Weekend update: "Culture is prior to fact" & what that implies about resolving political conflict over risk

The idea that cultural cognition and related dynamics are peculiar to "unsettled" issues, or ones where the scientific evidence is not yet "clearly established," is a recurring theme.  For some reason, the recent "What exactly is going on in their heads?" post has stimulated many commentators -- in the discussion thread & in correspondence -- to advance this claim.  In fact, that view is at odds with the central tenet of cultural cognition as a research program.

The cultural cognition thesis asserts that "culture is prior to fact" in a cognitive sense: the capacity of individuals to recognize the validity of evidence on risks and like policy-relevant facts depends on cognitive faculties that are themselves oriented by cultural affiliations. Because cultural norms and practices certify that evidence has the qualities that entitle it to be credited under science's criteria for valid proof, ordinary members of the public won't be able to recognize that scientific evidence is "clear" or "settled" unless doing so is compatible with their cultural identities. 

Below I reproduce one relatively early formulation of this position. It is from  Kahan, D.M. & Braman, D. Cultural Cognition of Public Policy. Yale J. L. & Pub. Pol'y 24, 147-170 (2006).  

In this essay, Don "Shotgun" Braman & I characterize the "cultural cognition thesis" as a "conjecture."  I am happy to have it continue to be characterized as such. Indeed, I'd prefer that it forever be referred to as "conjectural," no matter how much evidence is adduced to support it, rather than as "proven" or "established" or the like. That way of talking reflects a vulgar heuristic substitute for science's own way of knowing, which treats every current best understanding as provisional, subject to modification and even rejection in light of additional evidence. 

But in fact, since this essay was published, the Cultural Cognition Project has conducted numerous experiments that support the "cultural cognition thesis."  These experiments present evidence on mechanisms of cognition the operation of which implies that "clear" or valid evidence can be recognized as such only when assent to it affirms rather than denigrates perceivers' cultural identities.  Such mechanisms include (1) culturally biased search and assimilation; (2) cultural source credibility; (3) the cultural availability effect; and (4) culturally motivated system 2 reasoning.  

As the excerpt emphasizes (and as is documented in its many footnotes, which are not reproduced here), all of these involve extensions of well-established psychological dynamics.  The nerve of the cultural cognition research program has been simply to demonstrate important interactions between known cognitive mechanisms and cultural outlooks, a process that we hypothesize accounts for persistent political conflict over risk and other policy-relevant facts that admit of scientific investigation.

Knowing what I (provisionally) do now, there are collateral elements of the account below that I would qualify or possibly even disavow! I'm sure I'll continue to discover holes and gaps and false starts in the future, too--and I look forward to that.

V. FROM HEURISTIC TO BIAS 

Public disagreement about the consequences of law is not just a puzzle to be explained but a problem to be solved. The prospects for enlightened democratic decisionmaking obviously depend on some reliable mechanism for resolving such disputes and resolving them accurately. Because such disagreements turn on empirical claims that admit of scientific investigation, the conventional prescription is the pursuit and dissemination of scientifically sound information.

The hope that democracy can be enlightened in such a straightforward manner, however, turns out to be an idle one. Like most heuristics, cultural cognition is also a bias. By virtue of the power that cultural cognition exerts over belief formation, public dispute can be expected to persist on questions like the deterrent effect of capital punishment, the danger posed by global warming, the utility or futility of gun control, and the like, even after the truth of the matter has been conclusively established.

Imagine—very counterfactually—that all citizens are perfect Bayesians. That is, whenever they are apprised of reliable information, they readily update their prior factual beliefs in a manner that appropriately integrates this new information with all existing information at their disposal.

Even under these circumstances, conclusive discovery of the truth is no guarantee that citizens will converge on true beliefs about the consequences of contested public policies. For while Bayesianism tells individuals what to do with relevant and reliable information, it doesn’t tell them when they should regard information as relevant and reliable. Individuals can be expected to give dispositive empirical information the weight that it is due in a rational-decisionmaking calculus only if they recognize sound information when they see it.

The phenomenon of cultural cognition suggests they won’t. The same psychological and social processes that induce individuals to form factual beliefs consistent with their cultural orientation will also prevent them from perceiving contrary empirical data to be credible. Cognitive-dissonance avoidance will steel individuals to resist empirical data that either threatens practices they revere or bolsters ones they despise, particularly when accepting such data would force them to disagree with individuals they respect. The cultural judgments embedded in affect will speak more authoritatively than contrary data as individuals gauge what practices are dangerous and what practices are not. And the culturally partisan foundation of trust will make them dismiss contrary data as unreliable if they perceive that it originates from persons who don’t harbor their own cultural commitments.

This picture is borne out by additional well-established psychological and social mechanisms. One constraint on the disposition of individuals to accept empirical evidence that contradicts their culturally conditioned beliefs is the phenomenon of biased assimilation. This phenomenon refers to the tendency of individuals to condition their acceptance of new information as reliable based on its conformity to their prior beliefs. This disposition to reject empirical data that contradict one’s prior belief (for example, that the death penalty does or doesn’t deter crime) is likely to be especially pronounced when that belief is strongly connected to an individual’s cultural identity, for then the forces of cognitive dissonance avoidance that explain biased assimilation are likely to be most strongly aroused.
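The Bayesian point above can be made concrete with a toy numerical sketch: an ideal updater applies the likelihood ratio of new evidence as-is, while biased assimilation amounts to discounting evidence whose direction conflicts with the prior. Everything below (the function names, the single `discount` parameter, the specific numbers) is an illustrative assumption on my part, not a model from the Kahan & Braman essay:

```python
def bayes_update(prior, likelihood_ratio):
    """Textbook Bayesian update in odds form:
    posterior odds = prior odds * likelihood ratio."""
    odds = prior / (1.0 - prior) * likelihood_ratio
    return odds / (1.0 + odds)

def biased_update(prior, likelihood_ratio, discount=0.5):
    """Biased assimilation: evidence whose direction conflicts with the
    prior is treated as less reliable, so its likelihood ratio is dampened
    (raised to a fractional power); congruent evidence is taken at face value."""
    congruent = (likelihood_ratio > 1.0) == (prior > 0.5)
    weight = 1.0 if congruent else discount
    return bayes_update(prior, likelihood_ratio ** weight)

# Two culturally opposed agents see the *same* pro-risk evidence (LR = 3).
skeptic, believer = 0.2, 0.8
print(bayes_update(skeptic, 3.0))    # ideal skeptic moves up to ~0.43
print(biased_update(skeptic, 3.0))   # biased skeptic barely moves, ~0.30
print(biased_update(believer, 3.0))  # believer updates at face value, ~0.92
```

On these made-up numbers, ideal updating narrows the gap between the two agents (roughly 0.92 vs. 0.43), while biased assimilation widens it (roughly 0.92 vs. 0.30) even though both saw identical evidence -- the persistence-of-disagreement result the essay predicts.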

Two additional mechanisms reinforce the tendency to see new information as unreliable when it challenges a culturally congenial belief. The first is naïve realism. This phenomenon refers to the disposition of individuals to view the factual beliefs that predominate in their own cultural group as the product of “objective” assessment, and to attribute the contrary factual beliefs of their cultural and ideological adversaries to the biasing influence of their worldviews. Under these conditions, evidence of the truth will never travel across the boundary line that separates a factually enlightened cultural group from a factually benighted one.

Indeed, far from being admitted entry, the truth will be held up at the border precisely because it originates from an alien cultural destination. The second mechanism that constrains societal transmission of truth—reactive devaluation—is the tendency of individuals who belong to a group to dismiss the persuasiveness of evidence proffered by their adversaries in settings of intergroup conflict.

We have been focusing on the impact of cultural cognition as a bias in the public’s recognition of empirically sound information. But it would be a mistake to infer that the immunity of social and natural scientists to such bias improves the prospects for truth, once discovered, to penetrate public debate.

This would be a mistake, first, because scientists aren’t immune to the dynamics we have identified. Like everyone else, scientists (quite understandably, even rationally) rely heavily on their priors when evaluating the reliability of new information. In one ingenious study, for example, scientists were asked to judge the experimental and statistical methods of what was represented to be a real study of the phenomenon of ESP. Those who received the version of the fictitious study that found evidence of ESP rated the methods to be low in quality, whereas those who received the version that found no evidence of ESP rated the methods to be high in quality, even though the methods were in fact independent of the conclusion. Other studies showing that cultural worldviews explain variance in risk perceptions not just among lay persons but also among scientists who specialize in risk evaluation fortify the conclusion that for scientists, too, cultural cognition operates as an information-processing filter.

But second and more important, any special resistance scientists might have to the biasing effect of cultural cognition is beside the point. The issue is whether the discovery and dissemination of empirically sound information can, on its own, be expected to protect democratic policymaking from the distorting effect of culturally polarized beliefs among citizens and their representatives.

Again (for the umpteenth time), ordinary citizens aren’t in a position to determine for themselves whether this or that scientific study of the impact of gun control laws, of the deterrent effect of the death penalty, of the threat posed by global warming, et cetera, is sound. Scientific consensus, when it exists, determines beliefs in society at large only by virtue of social norms and practices that endow scientists with deference-compelling authority on the issues to which they speak. When they address matters that have no particular cultural valence within the group-grid matrix—What are the relative water-repellent qualities of different synthetic fabrics? Has Fermat’s Last Theorem been solved?—the operation of these norms and practices is unremarkable and essentially invisible.

But when scientists speak to policy issues that are culturally disputed, then their truth-certifying credentials are necessarily put on trial. For many citizens, men and women in white lab coats speak with less authority than (mostly) men and women in black frocks. And even those who believe the scientists will still have to choose which scientists to believe. The laws of probability, not to mention the professional incentives toward contrarianism, assure that even in the face of widespread professional consensus there will be outliers. Citizens (again!) lack the capacity to decide for themselves whose work has more merit. They have no choice but to defer to those whom they trust to tell them which scientists to believe. And the people they trust are inevitably the ones whose cultural values they share, and who are inclined to credit or dismiss scientific evidence based on its conformity to their cultural priors.

These arguments are necessarily interpretative and conjectural. But in the spirit of (casual) empirical verification, we invite those who are skeptical to perform this thought experiment. Ask yourself whether you think there is any credible scientific ground for believing that global warming is/isn’t a serious threat; that the death penalty does/doesn’t deter; that gun control does/doesn’t reduce violent crime; that abortion is/isn’t safer than childbirth. If you believe the truth has been established on any one of these issues, ask yourself why it hasn’t dispelled public disagreement. If you catch yourself speculating about the possible hidden cognitive motivations the disbelievers might have by virtue of their cultural commitments, you may proceed to the next Part of this Essay (although not until you’ve reflected on why you think you know the truth and whether your cultural commitments might have anything to do with that belief).  If, in contrast, you are tempted to answer, “Because the information isn’t accessible to members of the public,” then please go back to the beginning of this Essay and start over.

VI. OVERCOMING CULTURAL BIAS: IDENTITY AFFIRMATION

Nothing in our account implies either that there is no truth of the matter on disputed empirical policy issues or that the public cannot be made receptive to that truth. Like at least some other cognitive biases, cultural cognition can be counteracted. . . .  

 


Reader Comments (12)

Dan said in another thread about cultural affinity-

"People are likely to be better at “reading” people—at figuring out who really knows what about what—when they are interacting with others with whom they share values and related social understandings. They are, sadly, more likely to experience conflict with those whose values and understandings differ from theirs, a condition that will interfere with transmission of knowledge."

You seem to be operating/researching under the assumption that the average US citizen prioritizes their thinking and thought processes first and foremost according to their political affiliation, which would be fallacious. Most people look at their political affiliation as just one of many subcategories that make up their entire "culture" as a whole. For example, in order to BE a Democrat or Republican, one must first belong to a larger group identified as "Americans" or "US Citizens". As members of that group, they share many values and understandings. In fact:

In a 2008 study comparing values between Republicans and Democrats, Kennan Sheldon said, "The one thing that struck me the most was that the value differences were rather small – really, people were more alike than different, in that almost everybody favored intrinsic values more than extrinsic values. It was just a small relative difference between the two parties."

More recently, a 2014 Pew Survey shows that there is far more diversity in BOTH parties than most people think there is.
http://www.pewresearch.org/fact-tank/2014/07/01/a-closer-look-at-who-identifies-as-democrat-and-republican/

http://www.pcusa.org/news/2014/7/11/forget-republican-or-democrat-americans-divide-the/

Now, maybe you are unaware of the intrinsic and shared values that exist between Republicans and Democrats. Or maybe you were aware of them, but you chose to ignore them for some reason. But if you did not actually ASK your test takers specifically what their values and understandings are...and what priority their political party affiliation has in their lives, then you are working on nothing but the assumption that all of your test takers identify themselves first and foremost by their party identifications and therefore assume that it is the "cultural affinity" they have within their political parties that most affects their thought processes regarding climate science!

But in reality, most people I know will give you very different answers about which group they primarily identify themselves by if asked to prioritize all the groups they identify with.

For example, I might say "Christian, then American, then citizen, then Mother, Wife, neighbor, etc.," ALL before I got to my political affiliation.

My neighbor might say "American, Democrat, husband, father, atheist" etc.

If you only went as far as to ask your test takers what political party they are affiliated with in order to be able to attribute some to one side and some to the other, and didn't ask them individually to identify and prioritize their personal values as well, then you are making a HUGE assumption about their cultural affinities based on your own PRIOR assumptions about political affiliations and cultural affinities!

In reality the two groups SHARE many values and understandings, and attempting to determine how/why they respond the way they do to climate science based upon the assumption that they don't is always going to produce confusing, and false, results.

August 23, 2014 | Unregistered Commenter Saudadia

Dan said-
“Citizens (again!) lack the capacity to decide for themselves whose work has more merit. They have no choice but to defer to those whom they trust to tell them which scientists to believe. And the people they trust are inevitably the ones whose cultural values they share, and who are inclined to credit or dismiss scientific evidence based on its conformity to their cultural priors.”

Upon what do you base this declaration? Average citizens can do math. Average citizens can read. Average citizens can compare the results of one study to the results of another. Average citizens should be able to grasp margins of error, and apply basic logic and reason. You don’t have to understand calculus or be able to extrapolate vast mathematical equations in order to determine whether or not the sea level at the local beach is rising. Or read a thermometer. Or understand the difference between a calculated “estimate” and an “empirical measurement”.

Maybe I’m not an average citizen, but I actually READ the scientific papers myself, and determine whether or not I agree with their conclusions based upon whether or not both the evidence and the arguments they used have MERIT on their own. I have zero idea or concern with the “cultural values” held by the scientists themselves. I also REFUSE to defer to someone else telling me which scientists to believe…because “I believe in SCIENCE”….I don’t “believe in scientists”. Since there are no “people I defer to”, my evaluations are not clouded by someone else’s “cultural values” nor their inclinations or priors.

I trust scientists whose work is logical, methodical, consistent, and which contains the LEAST amount of assumptions, estimates, and contaminating priors as possible. If there is empirical evidence, I trust it over modeled output every time. If the measurements are inconsistent with each other, I distrust all of them until the discrepancies can be worked out. I discount the statements of all and any scientist directed at anything other than the science or evidence in question-especially when it contains appeals to emotion, fear, morals, or reflects their own personal opinion more than it does concrete scientific evidence. If you can’t convince me based upon objective evidence and reasoning, then your work probably contains less science and more speculation than it should.

Dan said-

"But in the spirit of (casual) empirical verification, we invite those who are skeptical to perform this thought experiment. Ask yourself whether you think there is any credible scientific ground for believing that global warming is/isn’t a serious threat; that the death penalty does/doesn’t deter; that gun control does/doesn’t reduce violent crime; that abortion is/isn’t safer than childbirth. If you believe the truth has been established on any one of these issues, ask yourself why it hasn’t dispelled public disagreement. If you catch yourself speculating about the possible hidden cognitive motivations the disbelievers might have by virtue of their cultural commitments, you may proceed to the next Part of this Essay (although not until you’ve reflected on why you think you know the truth and whether your cultural commitments might have anything to do with that belief). If, in contrast, you are tempted to answer, “Because the information isn’t accessible to members of the public,” then please go back to the beginning of this Essay and start over."

What if my first thought is Occam's Razor....that in the absence of certainty, the fewer assumptions made the better...and my first question is "How many of the "disbelievers" have actually seen the information accessible to members of the public?"

My second is that I see a distinct difference between "credible scientific ground for believing" something, and "the truth about something being established". I suspect that YOU understand that distinction as well because you used two different phrasings....the first involved grounds for "belief" and the second is pretty much the definition of "A FACT".

Your posts are fascinating to me, as someone on the outside observing a supposed impartial and objective "scientist".

August 23, 2014 | Unregistered Commenter Saudadia

I have to read the above in detail.
I have a couple of initial thoughts that I am willing to discuss with others.
'Culture is prior to facts.'
From a neuroscience, cell by cell, perspective, this sentence does not have meaning. Culture is a set of 'facts' to the brain. Science is a related set of 'facts', not different in detail from 'culture.' Culture may precede 'scientific facts' and may dominate them but is not, at ground level, different from them.
As for political implications, we have found, in a group of 40 with very different backgrounds, that patience, persistence, trust, and respect at in-person meetings seem to be the only way to partially redo 'culture.' We have also found that apparent changes in culture may appear to stick for a few weeks but then revert, for clear neuroscience reasons.
Ask questions if the above is interesting. The cool part of the discussion is likely to be offline.

August 23, 2014 | Unregistered Commenter Eric Fairfield

Hi Dan,

Thanks for delving into this deeper because I think it helped me understand why I was feeling a bit like we were talking past one another on the other thread.

"The issue is whether the discovery and dissemination of empirically sound information can, on its own, be expected to protect democratic policymaking from the distorting effect of culturally polarized beliefs among citizens and their representatives."

This is not the issue for me. I am interested in what you and your colleagues come up with for this question as you proceed in your research, but it is not the reason I am on this discussion board. My concern is that the distorting effects you cite have crept into the assessment of the science itself, particularly in regard to climate change but on other issues as well. For me it is not beside the point, it is the point! My concern is that the position of science as a trusted source of information will be damaged if it is just another marketing weapon deployed in policy battles. IF in fact climate scientists are not portraying a complete picture of the uncertainties due to their biases AND they can be convinced that such distortions are possible in their thinking, then perhaps they may work harder to overcome the distorting effects and portray a more accurate picture of what is known and unknown.

Will this make a lick of difference in policy outcomes? Doubtful, as everyone will continue to emphasize the science that suits their purposes. But, the scientific information would hopefully be trusted by all, so that the debates can be in the policy realm as they should be. So this is why it is so important to me to determine where the distortions are taking place, and not so much the fact that they are taking place.

It is also why I jumped in on that particular thread. You seemed to be struggling to figure out why people who understood the nuances of climate science could also express no belief in its consensus conclusions. Simply put, many people who understand the theories well (including 48% of the membership of the American Meteorological Society as mentioned before) are not convinced that those theories are right. You have expressed that this is beside the point for your main area of research, but to me it seems like a very straightforward explanation for the particular question you were asking on that thread.

August 23, 2014 | Unregistered Commenter Evan

"Again (for the umpteenth time), ordinary citizens aren’t in a position to determine for themselves whether this or that scientific study of the impact of gun control laws, of the deterrent effect of the death penalty, of the threat posed by global warming, et cetera, is sound."

I'd be careful on this for climate science. Yes it is true that the ordinary citizen cannot determine what the science says on climate change. But for climate, people need not necessarily even consult science to gather personally relevant evidence for climate. Hamilton and Stampone (2013) showed that opinions of the reality of climate change varied with the daily weather. People can, and I would argue primarily do, determine how threatening climate change is from their own personal climate experiences. Since climate change (as it stands today anyway) is not physically perceptible by any given individual, I think only those motivated by political, financial, or environmental interests even bother to see what anyone else has to say.

August 23, 2014 | Unregistered Commenter Evan

Evan said-
"My concern is that the distorting effects you cite have crept into the assessment of the science itself, particularly in regard to climate change but on other issues as well. For me it is not beside the point, it is the point! My concern is that the position of science as a trusted source of information will be damaged if it is just another marketing weapon deployed in policy battles."

I suspect you are trying to be kind, or at least trying to break this concept to Dan gently. For the millions of people who have the most basic knowledge and experience with marketing tactics and how they are used to manipulate people into buying things, it's almost painful to see "scientists" attempting to use those marketing tactics because they think they will make the science more readily accepted. All it does is degrade the science into a "product", the scientists into "salesmen" and the message into a "sales pitch".

For example, one particular group of scientists wrote what they think is a clever guide on "sticky thoughts/ideas", and keep creating shiny little app/"widgets" as a means to "sell" their view of climate change to what they think are willing buyers. I attended a sales seminar decades ago that described the best tactics for convincing someone to purchase a given product and both of those concepts were described to a room full of would be sales people at length. Along with the psychological reasons for using them! Every Amway, Avon, Tupperware, (insert any direct sales marketing company here) sales rep on the planet can recognize them a mile away!

There's a REASON people cringe when you say "used car salesman". Who hasn't been in the uncomfortable presence of the blatant, over the top, smarmy, seedy, shady, in-your-face behavior that term conjures up? And even though this particular group of "scientists" think themselves to be far more clever, and more subtle, their tactics only make me feel slightly less in need of a shower every time I see them.

Climategate didn't help. It does NOT matter what the scientists say about it; it DID have a hugely negative impact on how people viewed what they formerly thought of as a "distinguished, mature group of people with great integrity". To read the petty, conniving, disrespectful, and sometimes sneaky thoughts and plans of this group of "scientists" made people realize that they are nothing more than ordinary, everyday people. And normal, ordinary, everyday people will lie, cheat, steal, and manipulate to get what they want. We no longer elevate "scientists" to the level we once did simply because they bear the title of "scientist". For years, they have been demonstrating that they don't deserve elevation or our trust based purely on their title/professional declaration anymore.

But you don't have to take my word for it. Here's proof-
http://www.huffingtonpost.com/2013/12/21/faith-in-scientists_n_4481487.html

"A whopping 78 percent of Americans think that information reported in scientific studies is often (34 percent) or sometimes (44 percent) influenced by political ideology, compared to only 18 percent who said that happens rarely (15 percent) or never (3 percent).

Similarly, 82 percent said that they think that scientific findings are often (43 percent) or sometimes (39 percent) influenced by the companies or organizations sponsoring them."

http://www.washingtonpost.com/blogs/the-fix/wp/2013/04/22/how-americans-see-global-warming-in-8-charts/

"Trust in climate scientists is not universal, and has dropped in recent years. Just 26 percent of Americans said they trust scientists "completely" or "a lot" in a 2012 Washington Post-Stanford University poll, down from 32 percent in 2007. More, 35 percent, said they trust scientists only "a little" or "not at all." In a striking finding, more than one-third of the public believed climate scientists who say global warming is real make their conclusions based on money and politics."

(Climategate happened in November of 2009...and graph 4 of that survey shows that while today's survey respondents show more belief in global warming actually occurring than they did in 2009, the numbers have never recovered to what they were in 2006)

This study delved rather deeply into the impact of Climategate specifically on American's trust levels.

http://environment.yale.edu/climate-communication/files/Climategate_Opinion_and_Loss_of_Trust_1.pdf

August 23, 2014 | Unregistered Commenter Saudadia

Saudadia

How would you summarize the results of that Yale study w/r/t the impact of climategate on public opinion?

What do you think about how the data presented led the authors to say the following in the discussion?:

"These results also provide evidence of the important roles that cultural worldviews, political ideology, and motivated reasoning play in mediating public interpretations of and responses to global warming."

What do you think about the association between political orientation and the response of the participants w/r/t the impact of climategate?

Finally, given you said the following:

You seem to be operating/researching under the assumption that the average US citizen prioritizes their thinking and thought processes, first and foremost according to their political affiliation. Which would be fallacious. Most people look at their political affiliation as just one of many subcategories that make up their entire "culture" as a whole.

do you have any thoughts on the reasons for the strong association between political orientation/world view and views on climate change?

August 24, 2014 | Unregistered CommenterJoshua

Joshua,

Strong association/correlation doesn't prove causation. If the authors didn't ask the respondents to weight their own answers according to specific priorities, then the authors' conclusions reflect their own determinants...not the survey takers'. You can regression-model all you want, but people are unique and individual, and trying to judge them as if they are the same, or as if they view themselves the way strangers view them, seems insulting, if not irrational.

I think that liberals and conservatives in general, think differently, use their brains differently, and react differently to certain things, and that treating them as if they think and react the exact same way is what is causing Dr Kahan to be so frustrated.

I'm NOT saying that culture worldviews or political ideologies DO NOT affect how people respond to various things. I just think that assuming that those two categories are where people find their most valid and complete identities in relationship to society and those around them, is a HUGE freaking a priori bias that affects the results MORE than either category does.

August 25, 2014 | Unregistered CommenterSaudadia

Saudadia -

==> "Strong association/correlation doesn't prove causation. "

Of course not!

==> "If the authors didn't ask the respondents to weight their own answers according to specific priorities, then the authors conclusions are subjective to their own determinants...not the survey takers'. "

The authors were speculating about potential explanations for associations found in the data. I don't see what you think might have been improved by asking the respondents to weight their answers. It wouldn't have changed the existing associations found in the data (not to mention the obvious problems with somehow verifying their weighting).

==> "You can regress model all you want to, but people are unique and individual and trying to judge them as if they are the same or view themselves the way strangers view them seems insulting, if not irrational."

So do you think it would be rational to ask the respondents whether their views are biased by their political orientation?

==> "I think that liberals and conservatives in general, think differently, use their brains differently, and react differently to certain things, and that treating them as if they think and react the exact same way is what is causing Dr Kahan to be so frustrated."

Interesting. So do you think this difference is physiologically based? Something different in the architecture of the brains of libz and conz, respectively? If a kid is born into a particular political culture, do they then start using their brain in a way that matches their group's brain use?

Is this difference some across-the-board difference? So libz use their brain differently in all tasks?

Can you point to some actual evidence to support your conjecture? Some kind of fMRI data or something like that - showing difference in brain "use" where different parts of the brain light up differently? Perhaps some controlled studies to validate your "brain use" theory? Something empirical?

Because, ironically, it seems to me that perhaps your conjecture here (if you have no supporting evidence) might fit the description of how you think libz use their brains, and I'm assuming here that you aren't a lib.

==> "I'm NOT saying that culture worldviews or political ideologies DO NOT affect how people respond to various things. I just think that assuming that those two categories are where people find their most valid and complete identities in relationship to society and those around them, is a HUGE freaking a priori bias that affects the results MORE than either category does."

Maybe you should read more about the theory before positing such a criticism. As I understand the theory, those "two categories" are not necessarily thought to be the ones where people find their "most valid and complete identities,..." - but are part of a larger matrix of evidence that goes together. And, of course, worldview is a pretty broad category to begin with. Also, it could well be that different categories of identification are associated with different opinions in different ways. It could be that political and world view is associated with views on climate change but not with, say, views on whether little babies are cuddly. The question, then, would be why there are associations with some beliefs and not with others. I think that part of the thinking w/r/t climate change is that it is a politically polarized issue - which then becomes associated with that category of identification that overlaps with political identifications.

Anyway, can you provide some evidence for how much of the strong association between political/world view and views on climate change is due to libz and conz using their brains differently, and how much you think is due to political/world view identification itself?

August 26, 2014 | Unregistered CommenterJoshua

For what it is worth, we are on the brink of being able to simulate what would happen if you had a central set of facts and thousands of individualized 'people' reacting to those facts. The people would be individualized so that each had their own life history and weighted response to new facts.
Since our ability to create this simulation is only two days old and the idea that a simulation might be useful is only 10 minutes old, I don't have any designed experiments yet. A simulation, however, might be a decent test bed to try out propositions about liberals and conservatives (and even independents!) with respect to climate change or other topics. The individuals in the simulation could hold their beliefs throughout a simulation or could interact with other individuals so that everyone's beliefs would change a bit over time.

August 26, 2014 | Unregistered CommenterEric Fairfield
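[The kind of simulation Eric Fairfield describes above - a central set of facts and many individualized 'people', each with their own weighted response to new information and some drift toward their peers - can be sketched in a few lines. This is a hypothetical illustration, not Eric's actual system; the agent class, update rule, and all parameter values (fact strength, social-mixing rate, weight range) are assumptions chosen for simplicity.]

```python
import random

class Agent:
    """One simulated individual with a belief in [0, 1] and a personal
    'weight' governing how strongly new facts move that belief."""
    def __init__(self, belief, weight):
        self.belief = belief  # current position on the issue
        self.weight = weight  # responsiveness to new evidence

    def observe(self, fact):
        # Take a weighted step from the current belief toward the fact.
        self.belief += self.weight * (fact - self.belief)

def simulate(n_agents=1000, n_steps=50, fact=0.8, social_mix=0.1, seed=1):
    """Expose every agent to the same central fact each step, then let
    each agent drift slightly toward the population mean (peer influence).
    Returns the final mean belief."""
    rng = random.Random(seed)
    agents = [Agent(rng.random(), rng.uniform(0.0, 0.3))
              for _ in range(n_agents)]
    for _ in range(n_steps):
        for a in agents:
            a.observe(fact)
        mean = sum(a.belief for a in agents) / n_agents
        for a in agents:
            a.belief += social_mix * (mean - a.belief)
    return sum(a.belief for a in agents) / n_agents

final = simulate()
```

[Setting `social_mix=0` would model Eric's first variant, where individuals hold their beliefs independently; raising it models the interacting variant where everyone's beliefs shift a bit over time.]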

Joshua-

“So do you think it would be rational to ask the respondents whether their views are biased by their political orientation?”

More rational than assuming that the correlation suggests causation one way more than it does the opposite way. It is not just as rational to think that people’s political orientation is biased by their views?

“Interesting. So do you think this difference is physiologically based? Something different in the architecture of the brains of libz and conz, respectively? If a kid is born into a particular political culture, do they then start using their brain in a way that matches their group's brain use?”

Since there is no conclusive evidence that the differences are physiologically based, or not physiologically based, I truly don’t have an opinion one way or the other. I “think” we simply don’t know yet.

“Is this difference some across-the-board difference? So libz use their brain differently in all tasks?”

I haven’t seen any research that really puts liberals and conservatives through the paces of “all tasks” “across-the-board”, so I can’t answer to that. But there have been various studies that show brain differences between liberals and conservatives and various theories that they process information differently.

http://www.nature.com/neuro/journal/v10/n10/abs/nn1979.html
http://www.ncbi.nlm.nih.gov/pubmed/21474316
http://www.ncbi.nlm.nih.gov/pubmed/19562629

“Can you point to some actual evidence to support your conjecture? Some kind of fMRI data or something like that - showing difference in brain "use" where different parts of the brain light up differently? Perhaps some controlled studies to validate your "brain use" theory? Something empirical?”

(see above)

“Because, ironically, it seems to me that perhaps your conjecture here (if you have no supporting evidence) might fit the description of how you think libz use their brains, and I'm assuming here that you aren't a lib.”

Oddly, I have repeatedly referred to differences in BOTH groups; you, however, seem to have “conjectured” that my thoughts pertain only to how I “think libz use their brains” (and not how conservatives use theirs) based on nothing more than your assumption that I’m not a lib. I prefer to examine both sides of the issue and attempt to remove my own biases as much as possible, but you seem to prefer, “ironically”, to think more about what YOU think I “might be” thinking based on who you think I am.

“I think that part of the thinking w/r/t climate change is that it is a politically polarized issue - which then becomes associated with that category of identification that overlaps with political identifications.”

The fact that climate change has become politically polarized doesn’t mean that people reach their beliefs about climate change because of their political party. Again, correlation is not causation.

August 26, 2014 | Unregistered CommenterSaudadia

Saudadia -

==> "More rational than assuming that the correlation suggests causation one way more than it does the opposite way. It is not just as rational to think that people’s political orientation is biased by their views?"

I'm afraid I don't understand what you're saying here. Did you mean to say..." Is it not" rather than "it is not" - as suggested by the question mark? But even assuming that, I still can't understand the question "Is it not just as rational to think that people's political orientation is biased by their views?"

==> "Since there is no conclusive evidence that the differences are physiologically based, or not physiologically based, I truly don’t have an opinion one way or the other. I “think” we simply don’t know yet."

So we don't know yet, yet you say earlier:

"I think that liberals and conservatives in general, think differently, use their brains differently, and react differently to certain things,"

So I'm having trouble reconciling not knowing and then making a non-conditional statement that you think that generally libz and cons use their brains differently... Do you mean that the part that we don't know is whether the difference is physiologically based, but we do know that conz and libz use their brains differently (generally)?

==> "I haven’t seen any research that really puts liberals and conservatives through the paces of “all tasks” “across-the-board”, so I can’t answer to that. But there have been various studies that show brain differences between liberals and conservatives and various theories that they process information differently."

Well, by looking at the abstracts only, I can't evaluate those studies - but I'd be curious if you might be able to explain how those studies translate into a theory of causality behind the strong association between political/world view orientation and views on climate change among libz and conz, respectively. How might you apply what those studies say about physiological differences in libz' and conz' brains to the context of climate change?


Me: “Because, ironically, it seems to me that perhaps your conjecture here (if you have no supporting evidence) might fit the description of how you think libz use their brains, and I'm assuming here that you aren't a lib.”

You: "Oddly, I have repeatedly referred to differences in BOTH groups, you however seem to have “conjectured” that my thoughts pertain only to how I “think libz use their brains” (and not how conservatives use theirs)"

Not at all. You speculated about differences in how libz and conz use their brains differently, and I am asking you for evidence as to how you think libz use their brains (differently than conz)... The difference from conz is necessarily implied.

==> "I prefer to examine both sides of the issue and attempt to remove my own biases as much as possible, but you seem to prefer, “ironically”, to attempt to think more about what YOU think I “might be” thinking based on who you think I am."

I am asking you to provide the evidence that you use to theorize (determine?) that differences in how libz and conz use their brains translate into an association between political/world view and opinions about climate change. I'm still waiting. My guess is that you haven't provided that information because you don't have supporting evidence. Further, I'm speculating that one of the differences that you see in how libz and conz use their brains is that libz (generally) are more likely to formulate opinions w/o solid evidence (or, perhaps, on an emotional basis), and that we see evidence of that in the broad differences in opinions among libz and conz w/r/t climate change. So, it would be ironic if you've formed your views about brain-usage differences explaining differences in views among libz and conz on climate change without having evidence. I could certainly be wrong about that, but I see nothing wrong with speculating until you provide evidence that shows my speculation wrong. I would be happy to see such evidence provided.


==> "The fact that climate change has become politically polarized doesn’t mean that people reach their beliefs about climate change because of their political party. Again, correlation is not causation."

Right - correlation doesn't equal causation. We agree about that.

But what remains is a correlation for which (1) you have yet to offer a substantial, evidence-based theory about a potential causation - which Dan has done - and (2) you are inaccurately simplifying the evidence-based theory Dan has presented: he has not said that people reach beliefs because of their political party. If that's your summary of motivated reasoning, and of how Dan applies the theory of motivated reasoning to climate change, then I think that you should read more about his work and the theory.

So again, I ask you, what is your explanation - with some depth and supporting evidence - for why libz and conz view climate change differently (generally) if it isn't because of differences in political/world view orientation. And if I could add to that question - why do you think that climate change has become politically polarized?

I'm guessing that you have access to full articles, so you might consider starting here (following from your links)...

The findings provide the first neuroimaging evidence for phenomena variously described as motivated reasoning, implicit emotion regulation, and psychological defense. They suggest that motivated reasoning is qualitatively distinct from reasoning when people do not have a strong emotional stake in the conclusions reached.

http://www.ncbi.nlm.nih.gov/pubmed/17069484

Perhaps being "right", or having one's social group be "right", might create a strong emotional stake for people in evaluating climate change. Certainly, that would seem to be somewhat explanatory for the widespread identity-aggressive and identity-defensive behaviors so commonly found related to climate change (on both sides of the political aisle). It might also explain why people seem to think that views on climate change are explained by differences in how libz and conz use their brains - as opposed to commonalities in how libz and conz use their brains when reasoning about the science of highly polarized issues when they are generally quite poorly informed about the scientific evidence. Please don't forget that last part. A key piece of this is that many people who feel so strongly about the science of climate change are not familiar with the science.

August 27, 2014 | Unregistered CommenterJoshua
