
Wednesday, March 25, 2015

"You -- talking to me? Are *you* talking to *me*?" Actually, no, I'm not; the data don't tell us how any individual thinks (or that your side is either "biased" or "right").

A thoughtful correspondent writes:

I am a physician . . .  I was reading an article on Vox debunking the theory which states that more information makes people smarter. This article referenced your study concluding that those with the most scientific literacy and technical reasoning ability were less likely to be concerned about climate change and the safety of nuclear energy.

I read the paper which shows this quite nicely.

I am confused about the conclusions. I scored a perfect score on the science literacy test and on a technical reasoning test as well. I do not believe climate change is a settled science and I believe nuclear power is the safest form of reliable energy available.

The conclusion that I am biased by my scientific knowledge is strange.

In medical experiments data are scientifically gathered and tabulated. Conclusions are used as a way to explain the data. Could an alternate conclusion be reached that scientific and reasonable people downplay the danger of climate change and nuclear power precisely because we are well informed and able to reason logically? It seems just as likely a conclusion as the one you reached, yet it was never discussed.

My response:

Thanks for these thoughtful reflections. They deserve a reciprocally reflective and earnest response.

1st, I don't think the methods we use are useful for explaining individuals. The study you described identifies, in large samples, patterns that furnish more support than one would otherwise have for the inference that some group-related influence or dynamic is at work that helps to explain variance in general.

One can then do additional studies, experimental in nature (like this & this), that try to furnish even more support for the inference -- or less, since a valid study must be in a position to do either.

But once one has done that, all one has is an explanation for some portion of the variance in groups of people. One doesn't have an explanation for all the variance (the practical & not merely "statistical" significance of which is what a reflective person must assess). One doesn't have an instrument that "diagnoses" or tells one why any particular individual believes what he or she does.
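The group-vs-individual point can be made concrete with a small simulation. This is a hypothetical sketch with invented numbers (the variable names, effect size, and data are all assumptions, not taken from the study): a genuine group-level difference can "explain variance" in a sample while leaving any one individual essentially undiagnosable from group membership alone.

```python
# Minimal sketch (invented data) of why a group-level pattern that
# "explains variance" cannot diagnose any one individual: a real mean
# difference between groups still leaves most individuals unpredictable.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Two groups with a genuine mean difference in a "risk concern" score...
group = rng.integers(0, 2, n)              # 0 or 1, e.g. cultural outlook
score = 0.5 * group + rng.normal(0, 1, n)  # ...but lots of within-group spread

# Share of variance explained by group membership (eta-squared / R^2)
ss_total = np.sum((score - score.mean()) ** 2)
group_means = np.array([score[group == g].mean() for g in (0, 1)])
ss_between = np.sum((group_means[group] - score.mean()) ** 2)
r2 = ss_between / ss_total
print(f"variance explained by group: {r2:.2%}")   # ~6%: a clear group pattern

# Yet "diagnosing" an individual's group from their score alone is poor:
predicted = (score > score.mean()).astype(int)
accuracy = np.mean(predicted == group)
print(f"individual classification accuracy: {accuracy:.1%}")  # ~60%, far from certain
```

The same asymmetry holds in reverse: knowing someone's group tells you little about their individual score, which is the sense in which the methods explain groups, not persons.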

And most important of all you don't have a basis for saying anyone on any of the issues one is studying is "right" or "wrong": to figure that out, do a valid study on the issue on which people like this disagree; then do another & another & another & another. And compare your results w/ others doing the same thing.

2d, I don't believe the dynamic we are looking at is a "bias" per se. Things are more complicated than that, at least for me!

I'm inclined to think that the dynamics that we observe generating polarization in our studies are the very ones that normally enable people to figure out what is known by science.

They are also the very same processes that enable people to effectively use information for another of their aims, which is to form stances and positions on issues that evince commitments they care about and that connect them to others. That is a matter that is cognitively demanding as well -- & of course most people, even ones who don't get a perfect score on "science comprehension" tests, possess the reasoning proficiency it takes to perform it.

What to make of the situations, then, in which that same form of reasoning generates states of polarization on facts that admit of empirical inquiry is a challenging issue -- conceptually, morally & psychologically. This is very perplexing to me!

I suspect sometimes it reflects the experience of a kind of interference between or confounding of mental operations that serve one purpose and those that serve another.  That in effect, the "science communication environment" has become degraded by conflicts between the stake people have in knowing what's known & being who they are.  

At other times, it might simply be that nothing is amiss from the point of view of the people who are polarized; they are simply treating being who they are as the thing that matters most for them in processing information on the issue in question. . . .

3d, notwithstanding all this, I don't think our studies admit of your "alternate conclusion": that "scientific and reasonable people downplay the danger of climate change and nuclear power precisely because we are well informed and able to reason logically."

The reason is that that's not what the data show. They show that those highest in one or another measure of science comprehension are the most polarized on a small subset of risk issues, including climate change.

That doesn't tell us which side is "right" & which "wrong."

But it tells us that we can't rely on what would otherwise be a sensible heuristic -- that the answer individuals with those proficiencies are converging on is most likely the right answer.  Because again, those very people aren't converging; on the contrary, they are the most polarized.
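The pattern just described -- polarization growing rather than shrinking with proficiency -- can be caricatured in a few lines. The numbers below are invented purely for illustration (a toy linear trend, not the study's data): if the gap between groups widens as comprehension rises, then convergence among the most proficient cannot serve as a heuristic for the right answer.

```python
# Toy illustration (invented numbers) of the reported pattern: the gap
# between two cultural groups widens, rather than narrows, as a
# hypothetical "science comprehension" measure increases.
import numpy as np

literacy = np.linspace(0, 1, 11)           # hypothetical comprehension scale
concern_group_a = 0.5 + 0.4 * literacy     # one group: concern rises with literacy
concern_group_b = 0.5 - 0.4 * literacy     # the other: concern falls with literacy

gap = np.abs(concern_group_a - concern_group_b)
print(f"gap at lowest literacy:  {gap[0]:.2f}")   # 0.00 -- least proficient agree most
print(f"gap at highest literacy: {gap[-1]:.2f}")  # 0.80 -- most proficient disagree most
```

Under the "sensible heuristic" the trend would run the other way, with the gap shrinking toward zero at the high end of the scale.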

Many people write to me suggesting that an "alternative explanation" for our data is that "their side" is right.  

About 50% of the time they are part of the group that is "climate skeptical" & the other half of the time the one that is "climate nonskeptical" (I have no idea what terms I'm supposed to be using for these groups at this point; if they hold a convention and vote on a preferred label, I will abide by their decisions!).

I tell them every time that that can't actually be what the data are showing -- for all the reasons I've just spelled out.

Some fraction (only a small one, sadly), say "ah, yes, I see."  

 I can't draw any inferences, as I said, about the relationship between their "worldviews" & how they are thinking.  

I have no information about their scores on "science comprehension" or "critical reasoning" tests.

But at that point I can draw an inference about their intellectual character: that they possess the virtue of being able and willing to recognize complexity.


Reader Comments (24)

==> "The conclusion that I am biased by my scientific knowledge is strange."

It's really quite fascinating to me how often I see smart people, who presumably have well-developed skills in scientific reasoning, make that obvious mistake.

Is there any other, better, evidence of motivated reasoning?

March 25, 2015 | Unregistered CommenterJoshua

Dan -

==> "About 50% of the time they are part of the group that is "climate skeptical" & the other half of the time the one that is "climate nonskeptical" (I have no idea what terms I'm supposed to be using for these groups at this point; if they hold a convention and vote on a preferred label, I will abide by their decisions!)."

Yeah. There aren't any good terms, but I will say that "climate skeptical" and "climate nonskeptical" is, like all of them, problematic. There are those whose views would be classified as "climate skeptical" who, I would argue, are not particularly skeptical. And there are those whose views would be classified as "climate nonskeptical" who, I would argue, are not particularly nonskeptical.

Not only are there no terms that are better than the others, there aren't any that are worse, either, IMO. If people are personally invested and so inclined, they can take offense no matter what terms are used.

March 25, 2015 | Unregistered CommenterJoshua

@Joshua--

well, remember, most -- as in 95% -- just don't care about any of this. Their only goal is to avoid the misadventure of having to get into an argument about it w/ someone they are otherwise buying something from or selling something to.

March 25, 2015 | Registered CommenterDan Kahan

"And most important of all you don't have a basis for saying anyone on any of the issues one is studying is "right" or "wrong":"

I agree! I think that's probably the most important statement in your reply - symmetry!

It's a real shame the Vox article didn't get it.

"Is there any other, better, evidence of motivated reasoning?"

I don't know, alternative hypotheses are available in this case.

The correspondent apparently doesn't have a lot to go on besides the Vox article, which far from emphasising the symmetry of the polarization, instead positively revels in a polarized application of it to their political opponents themselves. It's understandable that a reader could interpret that to mean Dan's results and theories were only talking about conservatives. With only half the data, the rational conservative seems like a reasonable alternative hypothesis. Only when you see the evidence of symmetry, the evidence the Vox article didn't mention, can that alternative be rejected.

The climate sceptic has to try to explain why the more science-literate a climate believer is, the more they believe in it. If coming to the right conclusion is just a matter of being "well informed and able reason logically" then how can that happen?

March 25, 2015 | Unregistered CommenterNiV

NiV -

Huh?

How do you get from this:

There’s a simple theory underlying much of American politics. It sits hopefully at the base of almost every speech, every op-ed, every article, and every panel discussion. It courses through the Constitution and is a constant in President Obama’s most stirring addresses. It’s what we might call the More Information Hypothesis: the belief that many of our most bitter political battles are mere misunderstandings. The cause of these misunderstandings? Too little information — be it about climate change, or taxes, or Iraq, or the budget deficit. If only the citizenry were more informed, the thinking goes, then there wouldn’t be all this fighting. It’s a seductive model. It suggests our fellow countrymen aren’t wrong so much as they’re misguided, or ignorant, or — most appealingly — misled by scoundrels from the other party. It holds that our debates are tractable and that the answers to our toughest problems aren’t very controversial at all. The theory is particularly prevalent in Washington, where partisans devote enormous amounts of energy to persuading each other that there’s really a right answer to the difficult questions in American politics — and that they have it. But the More Information Hypothesis isn’t just wrong. It’s backwards. Cutting-edge research shows that the more information partisans get, the deeper their disagreements become.

and

Perhaps humans reason for purposes other than finding the truth — purposes like increasing their standing in their community, or ensuring they don’t piss off the leaders of their tribe.


and

This kind of problem is used in social science experiments to test people’s abilities to slow down and consider the evidence arrayed before them. It forces subjects to suppress their impulse to go with what looks right and instead do the difficult mental work of figuring out what is right. In Kahan’s sample, most people failed. This was true for both liberals and conservatives.

and

Presented with this problem a funny thing happened: how good subjects were at math stopped predicting how well they did on the test. Now it was ideology that drove the answers. Liberals were extremely good at solving the problem when doing so proved that gun-control legislation reduced crime. But when presented with the version of the problem that suggested gun control had failed, their math skills stopped mattering. They tended to get the problem wrong no matter how good they were at math. Conservatives exhibited the same pattern — just in reverse.


To a conclusion that there is some lack of symmetry in the theory?

March 25, 2015 | Unregistered CommenterJoshua

I imagine the following conversation:

"Are there any conclusions you have reached that you know are false or arrived at using what you consider irrational arguments?"

"Uh, no."

"Do you think that's only true of you, that everyone else knows their conclusions are wrong or think they're based on irrational arguments?"

"Maybe that NiV guy, he's a real schmuck, but probably not many other people, no."

"And yet people disagree with each other, about thousands of different things."

"Well sure, but on issue X, I'm actually right and NiV is up to his typical biased ass hattery."

"Man what did NiV ever do to, look, nevermind. Would it surprise you to learn that people on your side generally think the people on the other side are biased, and they think your side is actually the biased one?"

"I guess that doesn't seem completely out of the cards."

"My research seems to indicate that you are all probably right."

March 25, 2015 | Unregistered CommenterRyan

Dan -

A comment from another blog that struck me as interesting:

In response to this comment from me:

==> "“Of course, it’s always important to remember that people who inhabit the blogosphere are a self-selecting lot, and they’re outliers in many ways,”


Mark Ryan said the following:

"The tricky bit is that, on the one hand, the people with the greatest motivation are the most likely to have already formed their opinions. They do use facts and theories to build things, but it is barricades they are building, not towers"


I'm not sure exactly why I liked the comment so much, because when I think about it, it makes a fairly obvious point - but still it cleared away just a bit of the (ubiquitous) fog that hangs out in my brain.

IMO, Mark's comment largely explains the relationship between more information and greater polarization that I see in the "climate-o-sphere," as well as in the blogosphere more generally w/r/t other contentious, politicized discussions.

I have often posted comments about the complexity of understanding the direction of causality in that association between information and polarization, and I have often found it hard to distinguish which part of motivated reasoning is attributable to strengthening group identification and which is part of strengthening people's sense of self within a more personal, psychological sense, or whether those two drives can't really exist in any meaningfully isolated state.

But beyond all of that, what that comment describes seems to me to be a pretty clear description of what I have seen over the years in blogospheric discussions.

March 25, 2015 | Unregistered CommenterJoshua

Arrgh. Meant to end the blockquote after "towers," in case it wasn't obvious.

March 25, 2015 | Unregistered CommenterJoshua

Dan -

Since you haven't told me to shut up yet - to continue with my bloviating:

As a thought experiment, imagine that we collect a sample from the blogosphere, and from the "climate-o-sphere" more specifically.

Imagine that in such a sample, we can see a disproportionate level (when compared to a nationally representative sample) of "scientific literacy" and "intelligence" (I use quotation marks because in my non-thought-experiment mode, I pretty much flatly reject how most people define "intelligence").

And imagine that in our sample taken from the "climate-o-sphere," we see a very high degree of polarization.

So how could we speculate about the related causality?

Here's a thought. Those who participate in the polarized "climate-o-sphere" are those who are best able to see how their views are biased and as a result, are those who are the most motivated to force fit a justification, and are the most "motivated" to prove their justifications (even if they know, in their deeper self, that they are only justifying bias).

March 26, 2015 | Unregistered CommenterJoshua

"How do you get from this: [...] To a conclusion that there is some lack of symmetry in the theory?"

There is no lack of symmetry in the theory. There's a lack of symmetry in the way the Vox article applies it to the climate change controversy in particular.

For example:
"If the problem was truly that people needed to know more about science to fully appreciate the dangers of a warming climate, then their concern should’ve risen alongside their knowledge. But here, too, the opposite was true: among people who were already skeptical of climate change, scientific literacy made them more skeptical of climate change."

Where does it say the same of believers in climate change?

"This will make sense to anyone who’s ever read the work of a serious climate change denialist. It’s filled with facts and figures, graphs and charts, studies and citations. Much of the data is wrong or irrelevant. But it feels convincing. It’s a terrific performance of scientific inquiry."

Where does it say the same of believers in climate change?

"More information, in this context, doesn’t help skeptics discover the best evidence. Instead, it sends them searching for evidence that seems to prove them right"

Where does it say the same of believers in climate change?

If you read the article casually without parsing every nuance, it's easy to get the impression that it's saying climate sceptics are biased and wrong because of the effects Dan is studying.

Since Vox is written for a liberal audience, this does arguably have some justification. The key observation that would show a liberal believer that the Science Comprehension Thesis is wrong is that conservatives with more science training are more likely to disbelieve. The observation that liberals with more science training are less likely to disbelieve is not interesting, because it fits their null hypothesis (i.e. that conservatives are climate-sceptical because they don't understand science).

But a conservative reading it has a different null - that liberals are climate believers because they don't understand science, while conservatives are sceptical because they do. The observation that Vox highlights is predicted by the null, and so doesn't distinguish the hypotheses. A conservative reader will be rightly puzzled why the alternative (his/her null) wasn't considered.

What Vox say about climate change makes it clear that they're not applying this principle to that debate, they're still totally confident that they're right and conservatives are wrong, so it's natural (if you're not reading carefully) to assume that this is because it doesn't apply. That it's a consequence of the theory.

Now, it is true that if you take what Dan says about the skin rash/gun control experiment and extrapolate it to the climate debate, you can recognise the symmetric implications, and that the argument is applicable to the liberal beliefs on climate change too, and Vox's one-sided conclusion about who's biased doesn't follow. But it requires cognitive reflection/system 2 processing. An easier and more accessible heuristic is that if a proposed theory draws an incorrect conclusion then the theory is wrong.

Dan's correspondent could have worked it out with the information provided - the symmetry in the gun control case extrapolated to symmetry in the climate change case, the implication that the more scientifically literate liberals are the more they believe, and the contradiction between this and the conservative's version of the Science Comprehension Thesis - but it's a lot to do when the other side of the balance is being explicitly spelt out. The reader's asymmetric conclusion may be explained (or at least explainable) by Vox's asymmetric presentation.

It's a possibility.

"Here's a thought. Those who participate in the polarized "climate-o-sphere" are those who are best able to see how their views are biased and as a result, are those who are the most motivated to force fit a justification, and are the most "motivated" to prove their justifications (even if they know, in their deeper self, that they are only justifying bias)."

Do you?

March 26, 2015 | Unregistered CommenterNiV

==> "Dan's correspondent could have worked it out with the information provided - the symmetry in the gun control case extrapolated to symmetry in the climate change case, the implication that the more scientifically literate liberals are the more they believe, and the contradiction between this and the conservative's version of the Science Comprehension Thesis - but it's a lot to do when the other side of the balance is being explicitly spelt out."

So yes, someone with scientific training could have read selectively, placing his reading emphasis only on the part of the article that would confirm his biases, without double-checking whether the most obvious of his own biases might have influenced his take on the article - before drawing conclusions that a careful reading would dispel. Yes, that scientifically trained person could read the article without skepticism of his own tendency towards bias - in addition to reading with skepticism towards the thesis being presented.

But then, IMO, that would be an example of motivated reasoning. Which was my point. Motivated reasoning in someone with scientific skills, IMO, can include not applying due skeptical diligence to his own conclusion formation.

Lol!

==> "Do you?"

I assume you mean "are you?" In answer to which - of course!

I thought you were a fan of Feynman? You know, the easiest person to fool is yourself, and all of that. Why else would I participate as much as I do in these discussions where no one gets convinced of anything that changes their views?

But at least I recognize that we all have those tendencies, and don't waste my time pretending that those who I agree with aren't inclined towards justifying biases. When I criticize "skeptics" for doing that, it isn't because I think that they're unique for doing so, or that they have that habit disproportionately relative to "realists." But what I do find amusing is just how confident they are that they're immune from those biasing influences and in reality just unbiased "truth" seekers - even though they have no actual evidence to back their claim other than those they justify by their appeals to self-authority.

March 26, 2015 | Unregistered CommenterJoshua

"So yes, someone with scientific training, could have read selectively and only placed his reading emphasis on the part of the article that would confirm his biases, and not double-checked to see if the most obvious of his own biases might have influenced his take on the article - before drawing conclusions that a careful reading would dispel."

The problem is in paying attention to those parts of the article that conflict with their prior beliefs - that's what draws the attention. And the scientific person, on being told something that conflicts with their beliefs, examines it more closely to see if the reasoning follows. Thus, on being told that conservatives are wrong about climate science not because they don't understand science but because they're politically motivated, the evidence presented being that conservatives with more scientific training are more likely to be sceptical, a conservative reader will try to figure out if this line of reasoning is valid. Are there no alternative hypotheses?

And since the conservative's prior belief - that climate change alarm is unscientific bunk - predicts that conservatives with more scientific training will be more sceptical, there clearly is an alternative hypothesis to explain the observation, and the reasoning fails. There's no actual need to go any further. The Vox argument is not valid.

But Dan's correspondent evidently found it surprising that Dan could have missed such an obvious alternative explanation, and therefore asked for more information. Is Dan really supporting the Vox argument that conservatives are biased by motivated reasoning on climate change because scientific knowledge makes them more sceptical? Is Dan really talking about conservatives, as in The Republican Brain? Dan replies: "Actually, no, I'm not; the data don't tell us how any individual thinks (or that your side is either "biased" or "right")."

"But at least I recognize that we all have those tendencies, and don't waste my time pretending that those who I agree with aren't inclined towards justifying biases. When I criticize "skeptics" for doing that, it isn't because I think that they're unique for doing so, or that they have that habit disproportionately relative to "realists.""

When you criticise "skeptics", and only "skeptics", for doing that, people assume you're just reflecting your own biases; that it's a partisan tactic. It seems hypocritical to lecture people on their political bias in a politically biased way. Thus, people don't respond to your comments about motivated reasoning as scientific statements; they respond to them as politically-motivated attacks on their beliefs, which, as your own understanding of motivated reasoning could easily predict, tends to further pollute the science communication environment.

Now, it may be that you don't think they have that habit disproportionately relative to "realists", but that's not the impression people get from reading you. Whether there's anything that could or should be done about that depends on what your intentions are. If the effect is intentional, or at least not contrary to your intentions, then no.

I'm not criticising. I'm a partisan myself.

March 27, 2015 | Unregistered CommenterNiV

Dan,
I wonder what you would call the person who sits midway between climate-sceptical and climate-nonskeptical? Surely all half-way positions are also climate-sceptical.

March 27, 2015 | Unregistered CommenterMichael

NiV -

==> "The problem is in paying attention to those parts of the article that conflict with their prior beliefs - that's what draws the attention."

Absolutely.

==> "And since the conservative's prior belief - that climate change alarm is unscientific bunk - predicts that conservatives with more scientific training will be more sceptical, there clearly is an alternative hypothesis to explain the observation, and the reasoning fails. There's no actual need to go any further. The Vox argument is not valid."

Except that's not an accurate representation of the full argument that is presented in the article. So apparently this reader found himself in disagreement with the argument presented and read further only to confirm his disagreement without checking to see whether his bias led to a selective reading of the argument presented in the first place. I already listed the many ways that the argument laid out a thesis of symmetry.

You repeating that you don't think that's an example of motivated reasoning won't convince me that isn't.

==> "When you criticise "skeptics", and only "skeptics" for doing that, people assume you're just reflecting your own biases; that it's a partisan tactic. "

Lol!

First, I don't know why you repeatedly think that you should keep "explaining" this to me. It's pretty amusing. Yes, NiV, I know that the "people" you're talking about think that I am politically biased. I hear it constantly. They also tell me why they think I'm biased, just as you have these many, many times.

"People" can assume whatever they want. And in fact I know quite well that "people" will do so, independent of what I say, because their main drive is to confirm their biases. I have no problem with clarifying with "people" who are interested in checking for their own biases, that my view is that the issues in play are symmetrical. In my Internet discussions, I am satisfied that those "people" who are open to what I say, and deal with what I say on its own merits, understand my perspective in that regard. I'm not going to spend time trying to convince "people" who think that they can reverse engineer from my political opinions - to conclude for example that I think that depriving people of freedom of expression is a laughing matter. Or who think that I'm a "Casual Totalitarian."

==> "Thus, people don't respond to your comments about motivated reasoning as scientific statements, they respond to it as a politically-motivated attack on their beliefs, which as your own understanding of motivated reasoning could easily predict, tends to further pollute the science communication environment."

I don't think that it "further pollutes" the environment. The most salient characteristic of the environment is that it is polluted with or without my involvement. With "people" who are interested in good-faith exchange with me, the environment doesn't get further polluted. We exchange views, and I think grow in our understanding from that experience. Those who confirm their biases to see political bias in my views will do so no matter how I express my views, because they aren't responding to what I say, but instead are reverse engineering from my political views to confirm biases that they take as a matter of faith, exist.

==> "Now, it may be that you don't think they have that habit disproportionately relative to "realists", but that's not the impression people get from reading you. "

Lol! There you go again. "People." How do you know who gets what impression from reading me?

==> " If the effect is intentional, or at least not contrary to your intentions, then no."


Once again - I sometimes have good faith exchanges with "people" in these discussions. I tend to think that I have more good faith exchanges than most people I see engaged in these discussions - but I may very well be biased in that regard. I certainly think that I don't have any fewer good faith exchanges than is the norm. At any rate, I think that those "people" I encounter that are interested in good faith exchange can see that my "intent" is to explore, through a shared exchange, the process by which bias affects reasoning.

March 27, 2015 | Unregistered CommenterJoshua

"You repeating that you don't think that's an example of motivated reasoning won't convince me that isn't."

My point was that there's a distinction between motivated reasoning and all the many other forms of sloppy or biased reasoning. It's a matter of definitions. 'Motivated reasoning' is (as I understand it) reasoning that aims to build comfortable beliefs rather than true ones. That's got nothing to do with reading an article sloppily or lazily, or using common logical fallacies, or only picking out part of an article to argue with. It's not even necessarily the cause of politically correlated bias.

It's like you've got yourself a hammer, so every problem looks like a nail.

Had Dan's correspondent extrapolated the symmetry argument as I suggested, there wouldn't have been a conflict. The conclusion would have been "left-wing rag gets the science wrong again, liberals no better at science than conservatives" and biases would have been confirmed. The failure to follow the complex chain of reasoning isn't due to the conclusions being politically uncomfortable - so there must be some other reason.

"Lol! There you go again. "People." How do you know who gets what impression from reading me?"

Because they say so. As you very well know, since you wrote: "Yes, NiV, I know that the "people" you're talking about think that I am politically biased. I hear it constantly. They also tell me why they think I'm biased, just as you have these many, many times."

" I certainly think that I don't have any fewer good faith exchanges than is the norm."

I do. Offhand, I can't recall seeing anyone comment to the effect that you argued in good faith, and I can remember seeing many comments saying that you don't. That might be a biased selection on my part, but I've generally received the impression that you have a pretty poor reputation for 'good faith' discussions. Not the worst, but notable enough to be worth commenting on.

I'm sure you don't care, and I can't say I'm bothered by it either. I can't make up my mind about whether you know it or not. I've generally assumed you do - as you say, people have told you often enough - but then you go and write something like that and I'm not sure any more.

March 28, 2015 | Unregistered CommenterNiV

I don't think that it is exactly true that "95% -- just don't care about any of this". I think that they can be made to care, especially if they are led to believe that the "this" impacts immediate things like jobs or lifestyle, or is in alignment with other matters of identity that they do care about. And I believe that considerable marketing and political advertising budgets go into doing just that, deployed by those with a vested interest in the outcomes.

In my opinion, we should be looking harder at how newly introduced items of concern, especially those of political and economic importance, are presented to the public or to special interest groups within the public, by those with economic clout and/or those seeking political influence.

The public in general is not just randomly wandering about, independently picking up new bits of information and performing some sort of strictly personal evaluation of where this new bit should fit into their value system.

March 28, 2015 | Unregistered CommenterGaythia Weis

==> "That's got nothing to do with reading an article sloppily or lazily, or using common logical fallacies, or only picking out part of an article to argue with. It's not even necessarily the cause of politically correlated bias."

Lol! Really? Nothing to do with it?

I have seen thread after thread of "skeptics" pulling the Travis Bickle. They read Dan's work, or read about Dan's work, and they say something on the order of:

"But he doesn't understand - because he's a liberal, alarmist, academic - that "skeptics" are skeptical because the warmunistas are trying to sell them a bill of goods. And that's how we know that his research is biased, and an attempt to convince the public that "skeptics" are skeptical because they're pathological or politically motivated. And how do I as an individual know that? Because I am seeking scientific truth, and I used to accept the work of mainstream climate scientists until I started investigating the evidence and realized how incredibly obvious the widespread flaws are, flaws that could only exist in that magnitude if those mainstream climate scientists are dumb or corrupt."

I see it over and over and over. From "skeptics" who come to this very site and that I know have read how his work shows symmetry among groups in the phenomena he's examining. And they read him say that, over and over.

Of course, I often read something similar when I see "realists" reacting to Dan's work. It's not personalized in quite the same way, of course - with respect to reverse engineering about the veracity and bias of Dan's work from what they assume to be his political orientation (what would be the point of their doing that if they see themselves as roughly aligned with him politically!). But they do essentially offer the same basic argument that Dan's work doesn't apply to "us" because "we" are seekers of scientific truth and so we aren't biased by our ideological orientation. And I know that because I am seeking scientific truth. In essence, it's Travis Bickling also.

IMO - when someone has a scientific background...

...and they know the basic outlines of how to control for variables to attribute cause and effect, and they are presented with evidence that over and over states a symmetrical phenomenon for groups in general, and over and over they not only miss the symmetry outlined, but go further to claim that the work doesn't apply to them as individuals and that, therefore, the work offers a fallacious explanation for the behavior of their group in general, and even further argue as fact - without controlling for the reality that they are obviously susceptible to all kinds of observer bias - that Dan's work is biased by his political orientation, all without offering empirical evidence to refute Dan's evidence-based work...

...motivated reasoning is a very likely explanation. (Of course, anything is possible. : - )


==> "Because they say so. As you very well know, since you wrote...:"

I know that my writing is often verbose and unclear, but sometimes I think that you must not read what I write before you respond. I was pointing to the sloppiness and circular reasoning in how you describe "people." You described what "people" think - as if it isn't possible to discriminate between "them," and I spoke of people who reverse engineer from my political orientation to determine whether I'm engaging in good faith.

The same point applies to your other comments that follow: I am quite content that among people who have significantly different views than I, but don't judge whether I'm engaged in good faith on the basis of my political orientation or where I'm located within the constellation of views on climate change, I have a reasonably high % of reasonable exchange of views - of the sort that is characteristic of people exchanging views in good faith.

==> "I'm sure you don't care,"

I have no problem if people who are not open to discussions with me in good faith, determine that I'm not engaging in good faith. When I see "skeptics," on a regular basis, make mistakes about my views and my values on the basis of their prejudices as opposed to what I say (for example, when they assert with complete confidence what I believe about climate change even though they haven't even read me state what I believe about climate change and then draw conclusions about my beliefs that are completely false), I have no problem if they have determined that I'm not engaged in good faith.

I do care, however, if people who are open to good faith exchange with me (because they haven't pre-judged me by reverse-engineering about my intent on the basis of my political views) determine that I'm not engaging in good faith. That happens sometimes too, and when it does, it causes me to reflect on why that has occurred.

March 28, 2015 | Unregistered CommenterJoshua

Dan,

So when someone says in effect "but I am using System 2 thinking to reach my (polarized) conclusion" you wrote that:

I tell them every time that can’t actually be what the data are showing—for all the reasons I’ve just spelled out.

To you, the phrase "the data" refers to the results reported in a paper -- so I assume. But from my point of view I have additional data on how scientifically literate (and even how smart and successful at using System 2 thinking) I am. So what "the data" are we talking about?

I suggest that more highly skilled communicators make explicit what they mean by "the data". I try to do this when doing technical communication in order to lessen the "cognitive load" on the reader.

It's a useful exercise to ask "OK, given the data reported by Dan in paper X, what additional data would be needed to show Y".

March 29, 2015 | Unregistered CommenterCortlandt Wilson

Dan,

In response to a question/objection raised by others you wrote:

I don't think our studies admit of your "alternate conclusion": that "scientific and reasonable people downplay the danger of climate change and nuclear power precisely because we are well informed and able reason logically."

The reason is that that's not what the data show. ...

More specifically, your data neither rejects nor confirms the proposition. Yes? The data from your study does not address the question. (I am reminded of the phrase "you might very well think so, but I couldn't possibly comment".)

On a related point: dialectically, I tend to see the situation as competing systems of System 2 based thinking.
Or more to the point of your work: I don't see anything in your data that contradicts the theory that both "sides" of the climate change issue are using some form of System 2 thinking - thinking that is well informed and able to reason logically. Yes?

This raises the question of how to design an experiment to get more data on the theory. Or is it even possible?
It seems the motivating question behind the question about System 2 thinking gets at issues beyond being well informed and able to reason logically, and more at an ideal of rationality - an ideal of rational thinking that is akin to what is meant by wisdom.

March 29, 2015 | Unregistered CommenterCortlandt Wilson

OK, I'm another sceptical doctor like your correspondent - high level of technical knowledge, polarized views. Is there any way that I can tell if I'm right or wrong? I understand that this is an issue that Dan is not particularly interested in ... but it's an interesting question in itself.

In climate change Dan has picked an excellent topic - essentially no one knows what will happen in the future. Even in my lifetime and the lifetime of my children, we will not know the answer for sure! Neither my sceptical views nor the widely proclaimed views of Joshua will ever be falsified.

We are whistling in the dark over this one, and I can see that there are many aspects of my psychological makeup that cause me to whistle the skeptical tune ... but why not? There's not a lot else to do.

Dan is investigating something that is like Harry Potter's Mirror of Erised. We look into it and see our desires. In the end, all we see is ourselves - a perfect subject for a psychologist to investigate!

April 2, 2015 | Unregistered CommenterMichael

I suspect that the study was intended to show that skeptics are less scientifically literate and less numerate than those who believe in catastrophic anthropogenic global warming. When the results showed the opposite, those writing the paper went off on a tangent about polarization.

When actual data or science is involved, one would expect more knowledgeable people to have more of a consensus. When a POLITICAL question is involved, one would expect more polarization the more knowledgeable a person is. Test a group of people on the US Constitution, then rate them as to their political party and frequency of voting, and I suspect that one would find the same polarization found in this "study".
To be rated as a valid study, and not mere cherry picking after the fact, as I suspect this study was, they should have studied polarization in other areas, not just "catastrophic anthropogenic global warming".

April 6, 2015 | Unregistered CommenterAlan McIntire

Alan -

==> "I suspect that the study was intended to show that skeptics are less scientifically literate and less numerate than those who believe in catastrophic anthropogenic global warming."

It's not entirely clear to me which study you're referring to... but I'm curious: what is the evidence on which you base your suspicion?

I think you might find it interesting to spend some time looking at Dan's research findings w/r/t the association between "scientific literacy" and views on climate change.

April 6, 2015 | Unregistered CommenterJoshua

==> "I suspect that the study was intended to show that skeptics are less scientifically literate and less numerate than those who believe in catastrophic anthropogenic global warming."

Maybe "intended to show" means what the researcher expected to find?

Everyone should note that the Vox article referenced in the blog put a huge "spin" on Dan's research, which says a lot about the mind-set of the writer at Vox and is not supported by Dan's study.

April 6, 2015 | Unregistered CommenterCortlandt Wilson

The essence of science is skepticism, so it is apt to get scientists' dander up when actual group thinkers turn things around, arrogantly calling themselves skeptics and branding scientists as group thinkers.

December 27, 2015 | Unregistered CommenterHal Morris
