
Monday, September 18, 2017

The conservation of perplexity . . .

Every time one feels one has made progress by examining an important question empirically, at least one more important, unanswered empirical question reveals itself.

Reader Comments (11)

Yes, well, Aristotle supposedly said "The more you know, the more you know you don't know."

September 18, 2017 | Unregistered Commenter Joshua

Nevertheless, the number of fully (in context) or partially answered questions constantly increases (in a two-steps-forward-one-step-back kind of way), which means more and more acquired territory from which to launch assaults on the many remaining mysteries.

September 18, 2017 | Unregistered Commenter Andy west

link drop - is need for evidence an alternative to science curiosity?:
https://doi.org/10.1371/journal.pone.0184733

September 19, 2017 | Unregistered Commenter Jonathan

@Andy--well put!

September 19, 2017 | Registered Commenter Dan Kahan

@Jonathan-- thanks. I'm skeptical of the paper's thesis; every negative mental state wrapped up in one? C'mon...

Have to admit, too, that I'm skeptical of any psychology paper that ends up in PLOS ONE. People dump their worst papers there-- ones they know they can't get into standard journals. Sad, b/c the idea of the journal is a good one. But it can't work w/o serious editors & reviewers.

September 19, 2017 | Registered Commenter Dan Kahan

Jonathan,

"A growing body of scholarship suggests that some individuals’ beliefs are only weakly correlated with their knowledge of relevant information. For example, knowledge about climate science is a poor predictor of conservatives’ belief in climate change [27]. Many on the political right know that climate scientists believe anthropogenic climate change is real, while simultaneously rejecting the conclusion themselves."

And how is that evidence that "individuals’ beliefs are only weakly correlated with their knowledge of relevant information"?!

The whole paper is a mess of implicit partisan assumptions, biased topic choices, and invalid measures loaded with over-specific 'false dilemma' question choices. Worthless.

September 19, 2017 | Unregistered Commenter NiV

(better?) link drop:
https://osf.io/preprints/psyarxiv/mqzue/download?format=pdf

September 19, 2017 | Unregistered Commenter Jonathan

Much better! I can't actually tell what the political orientation of the authors is just from reading the paper! (Apart from the use of the word "deniers", which does kinda give a big hint.)

Fascinated by this bit:

The group that exhibited the most motivation to avoid crosscutting information was Donald Trump supporters. The one group that did not significantly exhibit the effect was climate change deniers. However, the small sample of deniers (and low power; see Table 4) may mean that the sample size prevented us from detecting a real effect. Recoding the targets as either congenial (like-minded) or uncongenial (unlike-minded) and re-running the ANOVA allowed us to test whether desires to remain within one’s ideological bubble is ideologically asymmetric. We consistently found null interactions, meaning that we found no evidence of asymmetry. The one exception, which yielded just-significant results, was climate change (believers > deniers).

On the one hand, I find it a congenial result in accord with my ideological preconceptions! :-) On the other, I'm suspicious on general principle, because it conflicts with my prior belief that people are psychologically basically the same. I'm wondering whether it might be because openness to the opposing arguments is specifically part of the climate sceptic ideology (at least in the science-oriented sub-culture I'm more personally familiar with, not so much the politically-oriented sub-culture), or because of some sort of selection effect, or because it's a spurious, only marginally significant result.
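As a back-of-the-envelope illustration of that low-power caveat, here's a minimal Python sketch. The cell size (40 per group) and effect size (Cohen's d = 0.5) are hypothetical stand-ins, not the actual counts from the paper's Table 4:

from statsmodels.stats.power import TTestIndPower

# Chance of detecting a medium-sized effect (d = 0.5) in a two-sample
# t-test with a hypothetical 40 deniers per cell, at alpha = 0.05.
power = TTestIndPower().solve_power(effect_size=0.5, nobs1=40, ratio=1.0, alpha=0.05)
print(round(power, 2))  # ~0.6

At roughly 60% power, there's about a 40% chance of missing a real medium-sized effect, so a null result in the small denier subsample is weak evidence either way.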

And this next bit is truly excellent! They actually did the "apolitical topics" test!

One limitation of Study 4a was the absence of non-political issues. To test whether liberals or conservatives have a stronger desire to avoid uncongenial information on non-political issues, we conducted a follow-up study [...] And we also found greater desire to hear from like- versus unlike-minded others on questions such as preferred beverages (Coke vs. Pepsi), seasons (spring vs. autumn), airplane seat (aisle vs. window), and sports leagues (the NFL vs. the NBA).

Their evidence is that it's not specific to ideological/partisan/controversial issues. I don't think there's any urge to protect one's partisan identity as a Pepsi-lover, or peer-group image as someone who likes autumn! It doesn't necessarily apply to other types of motivated reasoning, of course...

I've seen this paper being discussed previously (I remember hearing about the Pepsi-vs-Coke result) but I don't think I've ever read the paper itself before. Thanks.

September 19, 2017 | Unregistered Commenter NiV

One issue I have with Frimer et al. is that quality aspects of the opposition opinion really should matter when one is choosing whether or not to be exposed to it - and they don't tell their subjects this info. For instance, being a believer in evolution, if I were told I would be exposed to an opposing opinion, I would very much like to know if it would be just the same old "because the Bible says so" or something more exploratory and explanatory (or even just novel). However, I doubt I'd give up $ (or a chance to win more $) just to avoid hearing "because the Bible says so". Well, maybe....

Also, this has me contemplating the following: I think we all agree that most people don't change their minds much when exposed to opposition opinions. That would imply that there is low risk of social distress ("undermine shared reality") from being exposed to opposition opinions. Yet, Frimer et al. find this kind of risk avoidance as one of their explanations. Also, if it is evolutionarily more "fit" (for social reasons) to not change one's mind when exposed to opposition opinions, one would expect that the not-change behavior would be the lower-energy behavior (vs. change) - hence it shouldn't be expensive to resist changing one's mind when listening to an opposition opinion. Hence cognitive dissonance shouldn't be a problem either. Of course, that's just an evo-just-so story, but it does make me suspicious of the actual motivation. Additionally, if Mercier & Sperber are right that reason evolved to check opposition opinions, then again, what is the issue with this aversion to opposition opinions?

Is it possible that we have evolved (or perhaps learned) this aversion to opposition opinions for some entirely different reason? For example, could it be a type of immune reaction to potentially toxic memes (combined as they are with our own biases that enhance them, such as the availability bias)? Consider the similarity to malware-protection software.

September 19, 2017 | Unregistered Commenter Jonathan

Heeheehee.

As teachers, we judge our students by the questions they ask, not by the answers they are capable of giving. Why should we judge ourselves any differently?

September 20, 2017 | Unregistered Commenter dypoon

"I think we all agree that most people don't change their minds much when exposed to opposition opinions. That would imply that there is low risk of social distress ("undermine shared reality") from being exposed to opposition opinions."

For that particular form of social distress, I agree.

"Also, if it is evolutionary more "fit" (for social reasons) to not change one's mind when exposed to opposition opinions, one would expect that the not-change behavior would be the lower energy behavior (vs. change) - hence it shouldn't be expensive to resist changing one's mind when listening to an opposition opinion."

If all you're doing is shutting your ears to their arguments, then yes. But people don't. They need to feel their opinions are justified and true, and so listening to opposing arguments requires them to think and come up with counter-arguments (even if only internally), which costs cognitive energy.

They don't expect to lose, but it costs time and effort to win.

"Additionally, if Mercier & Sperber are right that reason evolved to check opposition opinions, ..."

Reason evolved to solve generic life problems in a more general and powerful way. Communication evolved to share knowledge and methods of solving problems. Lying evolved to use communication to manipulate others' behaviour to one's own advantage. Scepticism and counter-argument evolved as a defence against lying and error in communication. Then reason evolved further in endless iteration to make communication, lying, and counter-argument more effective.

<I>"... then again, what is the issue with this aversion to opposition opinions?"

Any form of hostile or angry social interaction raises the risk of violent escalation. In the era before written/electronic communication, this could get physical, and in any case can damage future cooperation. It can also evoke a competition for status, territory, and control - whose arguments are 'better'? Who 'wins' the argument? Both of those can trigger the fight-or-flight adrenaline response. That has an energy cost, and can be distressing, even if nothing happens. There's a clear evolutionary advantage to avoiding avoidable fights - they're dangerous. On the other hand, there's also an evolutionary necessity for defending one's territory and status.

"For example, could it be a type of immune reaction to potentially toxic memes (combined as they are with our own biases that enhance them, such as the availability bias)? Consider the similarity to malware-protection software."

I think it's certainly possible that people can learn or be taught that. A lot of toxic memes include measures to shut out opposing arguments - in much the way that malware will sometimes try to disable the anti-virus software. I've come across a few people advising newbies not to argue with the climate deniers, because unless they're experienced in the debate they'll lose. (Not because they're wrong, of course; but because we use trickery!) But my experience has been that this behaviour has to be taught - people assume by default that their own ideas and arguments are the strongest, and will be able to defeat and destroy any toxic memes easily. It takes quite a lot of experience to learn otherwise.

Gazelles are so fast and agile primarily because of the lions - so in an evolutionary sense lions do the gazelles a lot of good. (Gazelles don't see it that way, though!) Young gazelles soon learn that lions are fast and agile, too, and their survival prospects are only enhanced with lots of experience. But even an experienced gazelle confident in its own speed and agility will still try to avoid unnecessary encounters with lions.

It's a far more sophisticated post-evolutionary tactic to deliberately expose yourself to intellectual lions in order to weed out the weak beliefs, and I don't think it's inbuilt in humans. That's why it took so long for the Enlightenment and Science to be invented. People have to be taught to do it.

September 20, 2017 | Unregistered Commenter NiV
