
Friday, July 14, 2017

Weekend reading: Professional judgment and biased information processing

For those interested, as am I, in how politically motivated reasoning affects professionals:

 


Reader Comments (12)

Jonathan -

Note:

A further experiment, in which policy professionals engage in discussion, shows that deliberation may be able to mitigate the effects of some of these biases.

It would be interesting to see someone research whether, in contrast to "discussion," an explicit "intervention" focused on "motivated reasoning" - one that examines the phenomenon and mechanism from a theoretical frame and then in a particular context - might have a mitigating effect.


From the perspective of an educator who believes in the power of metacognition, I think it would. As to how generalizable those benefits might be to other contexts, or how sustainable the effects might be, I am more circumspect.

July 14, 2017 | Unregistered CommenterJoshua

Hmmm. From the conclusion.

Group deliberation might reduce confirmation bias, but have no effect on the framing of risk, and could even increase a policy professional's risk aversion if the group makes risk-aversion in the authorizing environment more salient.

That seems to me to be rather broad speculation, given that their "group deliberation" lacks any elements designed to specifically target biases that are associated with framing risk.

July 14, 2017 | Unregistered CommenterJoshua

From the paper abstract: '...and, most strikingly, confirmation bias correlated with ideological priors, despite having an explicit mission to promote evidence-informed and impartial decision making.'

Indeed we cannot expect even folks with such a mission to be immune to bias. This will especially be the case with long exposure to culturally aligned narrative frames that are loaded with high emotive content. Scientists sometimes self-report that emotive hot-buttons have been hit.
http://judithcurry.com/2015/04/24/contradiction-on-emotional-bias-in-the-climate-domain/

...this also highlights, in addition to the risks of communicating science via cultural framings noted in comments one thread down, the extra danger inherent in pumping such frames with high emotive content. Unwisely, the highest emotive impact that does not produce backlash is actually being deliberately sought in some cases.

July 14, 2017 | Unregistered CommenterAndy West

=={ Indeed we cannot expect even folks with such a mission to be immune to bias.}==

Of course not. We cannot expect anyone, or any particular group to be "immune" to bias. That would be totally unrealistic.

Related points, however:

(1) I think it would be advisable to be cautious about applying this study with one particular group to other groups - when it comes to the impact of the biases on work output. Scientists should, in theory at least, be applying an external control against bias (the scientific method), whereas other groups, such as "policy based professionals" would very likely not have an expectation of applying an external bias-controlling mechanism as a part of their work.

(2) One useful question is whether policy-based professionals, or scientists, or other groups which similarly engage in an explicit process of analysis, are more or less prone to these biases than other groups. While the results of this paper are quite interesting, it may still be theoretically possible that the biases that they find are actually less prevalent in the group that they target for their research than many other groups.

=={ This will especially be the case with long exposure to culturally aligned narrative frames that are loaded with high emotive content. }==

Of course, that may or may not be true. Long exposure to culturally aligned narrative frames may, in fact, lead to people being more aware of the pitfalls of ideological biases. People who are less aware of such biases might be more prone to their influence. It seems to me that the more salient measure is the awareness of, and commitment to, controlling for bias, not an assumption of a proportional, or linear, dose-dependent relationship between exposure to cultural narrative frames and susceptibility to related bias. Those factors may well operate to some degree, or to a large degree, independently of each other. To assume otherwise seems to be, ironically, potentially the product of confirmation bias - say, for example, if you were prone toward labeling some people (in groups other than your own) as being more "emotive" than others, and therefore more biased than others - present company excepted, of course ;-)

July 14, 2017 | Unregistered CommenterJoshua

" Scientists should, in theory at least, be applying an external control against bias (the scientific method), whereas other groups,..."

"On the one hand, as scientists we are ethically bound to the scientific method, in effect promising to tell the truth, the whole truth, and nothing but — which means that we must include all the doubts, the caveats, the ifs, ands, and buts...." ?

"...whereas other groups, such as "policy based professionals" would very likely not have an expectation of applying an external bias-controlling mechanism as a part of their work."

Depends on where you are. Some governments have politically appointed policy professionals who are openly biased and get swapped out every time a new President takes power. Other governments require their policy professionals to serve all parties impartially and without bias, and regard the scientific method (to the extent it can be applied to political/moral/economic questions) as a good way to achieve that.

"While the results of this paper are quite interesting, it may still be theoretically possible that the biases that they find are actually less prevalent in the group that they target for their research than many other groups."

True. Or they could equally well turn out to be *more* prevalent! We already know that scientific literacy increases polarisation on politically contentious topics, and who is more scientifically literate than a scientist?! Dan keeps telling me they're not, but I don't remember him giving me any evidence of it.

" People who are less aware of such biases might be more prone to their influence."

Depends whether they're aware of them in the abstract, or personally. I find a lot of people are very much aware of the existence of biases, but firmly of the belief that they only apply to *other* people. Their own superior education, scientific literacy, and the evident superiority and truth of their political position immunises them against the biases that affect the poor, deluded souls on the other side.

The problem with cognitive biases is that everyone is *blind* to their own. It's like knowing about your visual blind spot - a lot of people know that if you close one eye, there's a large patch of your visual field in front of you that not only can't you see, you can't even see that you can't see it (until something disappears into it). But knowing about it doesn't make you any less blind to what's going on in there: awareness isn't enough. You have to make the effort to do something about it, too.

The authors of all those 'The Republican Brain' books and articles are clearly more aware of the existence of biases - did that make them any less prone to their influence?

July 15, 2017 | Unregistered CommenterNiV

J:
All individuals are subject to bias. However, regarding the functional arms they may belong to in society (science, the law, government, business, etc.), I did not anywhere say that equal amounts of exposure to the same levels of charged narrative would result in equivalent manifestations of bias in such arms. This depends not only on the resistance algorithms of the functional arm, but on its relationship to the contested domain. The scientific method is great resistance against bias, yet uncertainty is a window of opportunity for emotive narratives and consequent bias, which, as Lewandowsky reminds us, gain leverage in contested domains because scientists are only human too. The law sets up algorithms that protect core culture and rational executives based upon that core, presenting a high threshold to alien cultural waves and slowing their advance. Yet a sufficiently strong rising cultural wave, concentrated in a short enough time, may still shift or overcome even fundamental laws rather than merely breaking against the barrier, and very likely compromise rationality until stability is once again achieved. History right up to the present strongly suggests no functional arm is immune, including science.


>'Long exposure to culturally aligned narrative frames may, in fact, lead to people being more aware of the pitfalls of ideological biases.'

Notwithstanding that social scientists at least should in theory regard all such frames equally, for the public generally it is exposure to culturally unaligned or challenging frames that does this.

July 15, 2017 | Unregistered CommenterAndy West

...that does this. But indeed it's not exclusive; long exposure to cultural excess can for instance trigger skepticism in some of the erstwhile aligned. As noted, where deliberate emotive framings are being sought, the idea seems to be to get the most possible impact that does *not* trigger a backlash.

July 15, 2017 | Unregistered CommenterAndy West

Perhaps not surprisingly:
https://www.nature.com/articles/s41562-017-0130

Across a range of policy settings, people find the general use of behavioural interventions more ethical when illustrated by examples that accord with their politics, but view those same interventions as more unethical when illustrated by examples at odds with their politics. Importantly, these differences disappear when behavioural interventions are stripped of partisan cues, suggesting that acceptance of such policy tools is not an inherently partisan issue.

July 15, 2017 | Unregistered CommenterJonathan

"Perhaps not surprisingly:"

I didn't find it surprising, but I don't think describing it as a "bias" is accurate.

What they're describing is the 'Authoritarian' mindset: people who consider it their right and duty to impose their own standards of lifestyle and morality on other people through their control of society. This philosophy by definition supports coercive measures in service of its own agenda, but opposes the same coercive measures when opponents impose them. So asked a question like "Do you support coercive measures to bring about a desired behaviour change?" their answer is "Yes? No? All of the above? It depends what behaviour change you're talking about." It's like asking "Do you like one side of the political divide winning on policy decisions?" Only if it's them winning.

So it's not a "bias" in the sense of getting a wrong or unintended answer because of faulty processing. It's logically consistent and entirely correct, according to the intended definition of their Authoritarian policy framework. Results only appear confused because the questioners didn't give them the option of saying "Only if we're in charge" as an answer in their multiple choice.

Of course, it's also quite likely that if you did ask them baldly "Are you an Authoritarian?" or words to that effect, a lot of those same people would lie and say no, and come up with justifications for why it was OK for some policies and not others. Authoritarianism has acquired a bit of a bad reputation from history (although everyone will tell you that's only because it was the wrong side in charge of society). But it's still as popular as ever, and probably always will be.

July 15, 2017 | Unregistered CommenterNiV

Interesting contrasts.

Twenty years ago, fewer Americans were consistently liberal or conservative in their views about politics and society and even those who were ideologically oriented did not express the animosity toward the other side that is common today. In 1994 – hardly a moment of goodwill and compromise in American politics – just 23% of consistent liberals expressed a very unfavorable view of the Republican Party. And just 28% of consistent conservatives saw the Democratic Party in equally negative terms.

But today, the majority of ideologically-oriented Americans hold deeply negative views of the other side. This is particularly true on the right, as 72% of consistent conservatives have a very unfavorable opinion of the Democratic Party. Consistent liberals do not feel as negatively toward the GOP; nonetheless, 53% of consistent liberals have very unfavorable impressions of the GOP, more than double the share that did so two decades ago.

http://www.people-press.org/2014/06/12/section-2-growing-partisan-antipathy/

And...


A new volume by political scientists Christopher Achen and Larry Bartels further helps to fine-tune our understanding of how people vote and which party they identify with. Their book, “Democracy for Realists: Why Elections Do Not Produce Responsive Government,” suggests “issue congruence [between voters and parties], in so far as it exists, is mostly a byproduct of other connections, most of them lacking policy content.” In other words, we don’t think through issues, policies and candidate characteristics, but instead see elections as “us versus them.” These scholars argue voters tie themselves with racial, ethnic, occupational, religious, recreational and other groups, with partisanship as the byproduct. Our group identity, not policy concerns or ideology, determines vote choice. That is to say, we gather comfortably with our tribe and tune out other points of view.

https://theconversation.com/why-some-are-applauding-donald-trump-jrs-win-at-all-costs-attitude-80885

I tend to agree that consistency on issues and policy stands in interesting juxtaposition with increasing ideological identification, which is why I find the taxonomy that Dan uses in much of his work (e.g., "hierarchical individualism") to be less than optimal for describing identity-protective reasoning/behaviors.

July 16, 2017 | Unregistered CommenterJoshua

Joshua,

Coincidentally, I'm reading "Democracy for Realists" now.

July 16, 2017 | Unregistered CommenterJonathan

A third of all World Bank reports are never downloaded by anybody, and the sample posted here shows why not even the authors' mothers can be bothered with reading this impenetrable prose. The new chief economist, Paul Romer - one of few honest theorists in the trade - was removed from supervisory responsibilities for the development economics report-writing crew (originators of the report linked by Dan here) for making that very observation:
https://www.economist.com/blogs/graphicdetail/2017/05/daily-chart-20

However the points Dan is trying to make concerning biases are valid, and are made clearly and forcefully by quantitative modelers, unconstrained by the mind-numbing platitudes of consensus-seeking organizations like the World Bank. Great example:

https://www.researchaffiliates.com/en_us/publications/articles/621-i-was-blindbut-now-i-see-bubbles-in-academe.html

".....Bubbles in academic finance and economic theory generally do not jeopardize lives, but they can adversely affect an individual’s career, not to mention collective prosperity. In finance, large sums of money are often put to work chasing popular ideas, whether they ultimately prove brilliant or flimsy. When strategies blow up, markets crash, and/or the economy tanks—or all three, as in the case of the global financial crisis—professional investors and academics may eventually need to reckon with monsters they have created..................."

July 16, 2017 | Unregistered CommenterEcoute Sauvage
