
Saturday, Dec 10, 2016

Weekend update: birth announcement--twins (sort of) on politically motivated reasoning

The Emerging Trends review commentary on "politically motivated reasoning" is now officially published.

As you can see, the working paper turned out to be Siamese twins, who were severed at the spleen & published as a "two-part" set:

 

  • Kahan, D. M. (2016). The Politically Motivated Reasoning Paradigm, Part 1: What Politically Motivated Reasoning Is and How to Measure It. Emerging Trends in the Social and Behavioral Sciences. John Wiley & Sons, Inc.
  • Kahan, D. M. (2016). The Politically Motivated Reasoning Paradigm, Part 2: Unanswered Questions. Emerging Trends in the Social and Behavioral Sciences. John Wiley & Sons, Inc.

Reader Comments (7)

Dan -

Not that you seem particularly interested, but more on my theme of how "values" may have limited value for evaluating "motivated reasoning."


What could be a more basic value in this country than beliefs about fair representation and the power of the people to put politicians into office?

--snip--

Currently, 19% of Republicans and Republican-leaning independents favor basing the winner on the popular vote, down from 49% in October 2004 and 54% in 2011.

--snip--

http://content.gallup.com/origin/gallupinc/GallupSpaces/Production/Cms/POLL/vfmrqqtmq0ulvu5d83cj6a.png

Also fascinating that only 56% of the Republicans polled thought that Clinton won the popular vote.

http://www.gallup.com/poll/198917/americans-support-electoral-college-rises-sharply.aspx?g_source=Politics&g_medium=newsfeed&g_campaign=tiles

Talk about partisan filters.

Also fascinating:

http://www.people-press.org/2016/12/08/6-awareness-of-election-results/6_3-5/

But wait, it gets even better:

--snip--

Nearly 40 percent of the president-elect’s supporters believe the stock market has gone down under President Obama, despite the fact that it’s nearly doubled during Obama’s tenure.

--snip--


http://15130-presscdn-0-89.pagely.netdna-cdn.com/wp-content/uploads/2016/12/Screen-Shot-2016-12-08-at-9.26.23-PM-701x367.png

And


--snip--

Trump supporters are also living in an alternate reality with respect to the unemployment rate. Even though the jobless number has gone from 7.8 percent in January 2009 to 4.6 percent last month, a whopping 67 percent of Trump voters think it’s actually increased.

--snip--

http://twitter.com/keithboykin/status/807047554034704384/photo/1


I suppose the unanswered question is whether this kind of filtering reality through a partisan filter has increased in any way over time.

December 10, 2016 | Unregistered CommenterJoshua

BTW, the whole Pew Report is really quite something:

http://www.people-press.org/2016/12/08/low-approval-of-trumps-transition-but-outlook-for-his-presidency-improves/


All I can say is thank god that "skeptics" are immune from the impact of motivated reasoning. Imagine how much worse it would be if they weren't the only holdouts.

December 10, 2016 | Unregistered CommenterJoshua

@Joshua

Thanks for the references.

I have to admit that I can't quite remember your theme, but as for "why values?" take a look at pp. 11-12 of "part 1."

December 10, 2016 | Registered CommenterDan Kahan

"I suppose the unanswered question is whether this kind of filtering reality through a partisan filter has increased in any way over time."

I noticed all the examples you gave were of Republican partisan filters, not Democrat ones.

So, would you say your comments are showing that sort of imbalance more now than in the past? I'd have thought that was an easy enough question for you to answer, but - well, you know - motivated reasoning...

:-)

December 11, 2016 | Unregistered CommenterNiV

Dan,

"For example, someone who is convinced that human-caused climate change is not happening might infer from the contrary view of the National Academy of Sciences that NAS members have no expertise on this issue, and thus dismiss an Academy “expert consensus” report as unentitled to weight."

This example supposes that 'Argument from Authority' constitutes evidence, and furthermore, ignores any evidence that might have gone into forming those opinions in the first place.

I realise you discuss heterogeneous priors later in the paper, but it's not clear from what you say that it applies here as well. It's written as if you believe it, and not as an example of an invalid argument. Did you intend to?

It's an interesting example to use to illustrate the same point, though. Consider the following equivalent argument:
For example, someone who is convinced that 2+2=4 might infer from the contrary view of the National Academy of Sciences that NAS members have no expertise on this issue, and thus dismiss an Academy “expert consensus” report as unentitled to weight.

If the NAS announced that 2+2=5, and that a consensus of 97% of scientists agreed with them, because a government panel had said so in a thousand-page report, would you believe they knew what they were talking about? That's a serious question, by the way.

Some people would disagree because they were able to count fingers, and thus have direct evidence that the NAS is wrong. Other people, perhaps not as confident at counting fingers themselves, would nevertheless know a fair number of those who are, and who argue that the NAS is wrong. They look at their arguments using a variety of everyday heuristics (like whether they seem like generally smart people who are right on lots of other issues too) and find them convincing. They come to believe that these critics "have expertise on this issue", and modify their Bayesian confidence accordingly. The other group, who already "know" that 2+2=5, conclude that the critics have no such expertise, precisely because they say things the group knows not to be true, and call them "deniers".

Apart from those who can count fingers, both sides are using exactly symmetrical arguments, coming to exactly symmetrical judgements about expertise and truth. The conclusion is clear: Argument from Authority is not a truth-convergent heuristic. Argument from Authority does not constitute "evidence".

And yet, you seem to cite the failure to credit expertise as an example of non-truth-convergent reasoning. People chose not to let expert opinion override prior beliefs they had more confidence in. The 2+2 example reminds us that these are not necessarily the priors of total ignorance that plague the philosophy of Bayesian probability, but may be the outcome of many previous rounds of evidence gathering. Conservatives do not naturally - ab initio, from the cradle - disbelieve in human-caused climate change. They have only come gradually to that conclusion after many years of vociferous public debate.

Judging expertise by checking the statements the putative expert makes that you already know the answer to, and then weighting your belief in statements they make about other questions you don't know the answer to accordingly, is arguably a sensible truth-seeking heuristic. However, if you start with incorrect beliefs, the heuristic tends to lock you in to trusting the wrong set of experts. Confirmation bias can thus arise from truth-seeking behaviour.
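
A minimal sketch of that lock-in, in Python (purely illustrative; the question names, expert counts, and reliability split are invented for the example): an agent trusts only the experts who agree with the one answer it can already "check", then defers to those experts on the question it cannot check. Start it with a wrong belief and the same truth-seeking procedure delivers the wrong conclusion.

    TRUTH = {"checkable": True, "uncheckable": True}  # hypothetical questions

    def make_expert(reliable):
        # A reliable expert answers both questions correctly; an unreliable one gets both wrong.
        return {q: (TRUTH[q] if reliable else not TRUTH[q]) for q in TRUTH}

    experts = [make_expert(reliable=(i < 7)) for i in range(10)]  # 7 reliable, 3 unreliable

    def final_belief(prior_on_checkable):
        # Step 1: credit only the experts who agree with what the agent already "knows".
        trusted = [e for e in experts if e["checkable"] == prior_on_checkable]
        # Step 2: adopt the trusted experts' majority view on the question the agent can't check.
        votes = sum(e["uncheckable"] for e in trusted)
        return votes > len(trusted) / 2

    print(final_belief(prior_on_checkable=True))   # correct prior  -> correct conclusion (True)
    print(final_belief(prior_on_checkable=False))  # incorrect prior -> locked into error (False)
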

(A way to test it is to give an example where the additional information provided about the experts is non-political. Do people still tend to discredit experts who say things they believe to be untrue, even when not tied up with cultural identities?)

It's the fundamental problem with Argument from Authority. If you don't have the knowledge to judge for yourself, how can you tell who is a trustworthy "expert"? The AfA approach tells you to consult an expert on expertise, who will tell you which 'experts' to trust. But how do you tell which expert on experts to trust? Well, obviously you need to consult an 'expert expert' expert, who will tell you what expert experts are real experts on experts. Great! We're making lots of progress! But there's just one question remaining... which expert expert experts are truly trustworthy? How can we tell? Hmmm.

Argument from Authority is fundamentally unsound as a way to discover the truth - but millions of people have no alternative but to use it. Sometimes they pick different sets of experts, come to different conclusions, and we can see the evidence of its failure. But if both sides happened by chance to pick the same set of experts to trust, should we have any more confidence than we did before that they picked the right ones? Why? What's different about what they're doing?

December 11, 2016 | Unregistered CommenterNiV

I'm falling way behind, Dan. I added some of my thoughts, which include the papers above, in a comment on the post: Is cultural cognition an instance of "bounded rationality"? A ten-yr debate.

December 11, 2016 | Unregistered CommenterGaythia Weis

My original purpose in stopping by here again today was to share this link: http://www.hcn.org/issues/48.21/can-this-video-game-make-alaskas-inupiat-more-visible. Transmitting culture, both language and values, by means of a video game.

"“Values come out of the past,” says Ronald (Aniqsuaq) Brower Sr., who grew up listening to Nuna’s story and contributed traditional knowledge and voiceovers to the game. Such values include perseverance, compassion, sharing, conflict avoidance, respect for humans and non-humans, and adaptability in the face of a fickle environment. Kids nowadays see so many movies about killing, Brower says. “We want to see something about living. The past is very much alive.”"

"“The joy of the feast of wisdom lingers,” Ishmael (Angaluuk) Hope, a Tlingit-Inupiaq storyteller and collaborator, says on a webpage that promotes Never Alone. “Though it would require more deep investigation than one video game to fully understand ... this video game offers a tasty morsel, enough to know and to remember what we’ve been hungering for this whole time.”"

I think that this ought to segue into a discussion not just of how to measure "scientific curiosity" but also of how to instill it. Or, in a political context, how was it that a society ready for hope and change could, in 8 years, turn toward fear and a narrowing, ethnocentric world view?

December 15, 2016 | Unregistered CommenterGaythia Weis
