
Guest post: early interest in science predicts long-term trust in scientists

Once again, we bring you the cutting edge of #scicomm science from someone who can actually do it! Our competitors can only watch in envy.

The Enduring Effects of Scientific Interest on Trust in Climate Scientists in the U.S.

Matt Motta (@matt_motta)

Americans’ attitudes toward scientists are generally positive. While trust in the scientific community has been on the decline in recent years on the ideological right, Americans are usually willing to defer to scientific expertise on a wide range of issues.

Americans’ attitudes toward climate scientists, however, are a notable exception. Climate scientists are amongst the least trusted scientific authorities in the U.S., in part due to low levels of support from Republicans and Independents.

A recent Pew study found that less than a third (32%) of Americans believe that climate scientists’ research is based on the “best available evidence” most of the time. Similar numbers believe that climate scientists are mostly influenced by their political leanings (27%) and the desire to advance their careers (36%).

Why do (some) Americans distrust climate scientists? This is an important question, because (as I have shown in previous research) negativity toward scientists is associated with the rejection of scientific consensus on issues like climate change. It is also associated with support for political candidates (like George Wallace and Donald Trump) who are skeptical of the role experts play in the policymaking process.

Figuring out why Americans distrust climate scientists may be useful for devising new strategies to rekindle that trust. Previous research has done an excellent job documenting the effects of political ideology on trust in climate scientists. Few, however, have considered the effect of Americans’ interest in science and knowledge of basic scientific principles – both of which have been linked to positivity toward science and scientists.

In a study recently published in Nature Climate Change, I demonstrate that interest in scientific topics at young ages (12-14) is associated with increased trust in climate scientists decades later in adulthood, across the ideological spectrum.

In contrast, I find little evidence that young adults’ levels of science comprehension (i.e., science knowledge and quantitative skills) increase trust later in life. To the extent that they do, the effects of science knowledge and quantitative ability tend to be strongly conditioned by ideology.

In addition to considering the effects of science interest and comprehension on trust in climate scientists, my work offers two additional points of departure from previous research. First, few have investigated these potential determinants of attitudes toward climate scientists in young adulthood. This is surprising, because previous research has found that this is a critical stage in the development of attitudes toward science.

Second, fewer still have studied how these factors might interact with political ideology to shape opinion toward climate scientists. As readers of this blog might expect, Americans who are highly interested in science should exhibit higher levels of trust across the ideological divide. This is consistent with research suggesting that science curiosity encourages open-minded engagement with scientific issues – thereby increasing acceptance of science and scientific consensus.

In contrast, science comprehension should polarize opinions about climate scientists along ideological lines. If science knowledge and quantitative skills increase trust in climate scientists, we might expect this effect to be greater for liberals – who tend to be more accepting of climate science than conservatives. Again familiar to readers of this blog, this point is consistent with research showing that people who “think like scientists” tend to use their skills to reinforce existing social, political, and cultural group allegiances.

Using panel data from the Longitudinal Study of American Youth (LSAY) I model American adults’ trust in climate scientists (in 2011) as a function of their science interest and comprehension measured at ages 12-14 (in 1987). I structure these models hierarchically because respondents were cluster sampled at the school level, and control for several potentially-relevant demographic factors (e.g., race, sex). For a more-technical discussion of how I do this, please consult the study’s methods section (just after the discussion).
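The structure of this analysis can be illustrated in miniature. The sketch below is purely hypothetical: the coefficients, sample sizes, and variable scalings are all invented, and where the paper fits hierarchical models to account for school-level cluster sampling, this toy version simulates school-level intercepts and fits a plain OLS regression for brevity:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical mimic of the LSAY design: students cluster-sampled within
# schools; early science interest (1987) predicting trust in climate
# scientists (2011). All effect sizes here are invented for illustration.
n_schools, per_school = 50, 40
school = np.repeat(np.arange(n_schools), per_school)
school_effect = rng.normal(0, 0.3, n_schools)[school]  # school-level intercepts

interest = rng.normal(0, 1, school.size)   # science interest at ages 12-14
ideology = rng.normal(0, 1, school.size)   # liberal (-) to conservative (+)
trust = (0.6 * interest                    # invented long-run interest effect
         - 0.2 * ideology                  # invented ideology effect
         + school_effect
         + rng.normal(0, 1, school.size))  # individual-level noise

# Simplified fit: plain OLS with demographic-style controls omitted.
# (The paper instead estimates hierarchical models at the school level.)
X = np.column_stack([np.ones_like(trust), interest, ideology])
beta, *_ = np.linalg.lstsq(X, trust, rcond=None)
print(beta)  # beta[1] should recover roughly 0.6
```

Ignoring the clustering, as this sketch does, leaves the point estimates unbiased here because school membership is independent of the predictors; the hierarchical specification in the paper matters chiefly for correct standard errors.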

I measure Americans’ trust in scientists using self-reported measures of trust in information from four different groups: science professors, state environmental departments, NASA/NOAA, and the Intergovernmental Panel on Climate Change (IPCC). I also look at a combined index of all four.

I then measure science interest using respondents’ self-reported interest in “science issues.” I also operationalize science comprehension using respondents’ scores on standardized science knowledge and quantitative ability tests.

The results suggest that self-reported science interest at young ages is associated with trust in climate scientists about two decades later (see the figure below). On average, science interest in young adulthood is associated with about a 6% increase in trust in climate scientists. Young adults’ science knowledge and quantitative skills, on the other hand, bear little association with trust in climate scientists measured years later. 

The effects of science interest in young adulthood hold when levels of science interest measured in adulthood are also factored into the model. I find that science interest measured in young adulthood explains more than a third (36%) of the variable’s cumulative effect on trust in climate scientists.

Critically, and perhaps of most interest to readers of this blog, I find that the effects of interest are not conditioned by political ideology. Interacting science interest with political ideology, I find that young adults who are highly interested in science are more trusting of climate scientists – irrespective of their ideological allegiances.

In contrast, the effect of science comprehension in young adulthood on trust in climate scientists is significantly stronger for ideological liberals. This was true in nearly every case, for both science knowledge and quantitative skills. The lone exception is that the interaction between quantitative skills and ideology fell just short of one-tailed significance in the NASA/NOAA model (p = 0.13), and two-tailed significance in the IPCC model (p = 0.06).
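This kind of ideology-conditioned effect is what an interaction term in a regression captures. The sketch below is a hypothetical illustration with invented coefficients and simulated data, not the paper's actual model: comprehension is given a weak main effect plus a boost that operates only for liberals, and the fitted interaction coefficient recovers that conditioning:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4000

# Invented data: science comprehension raises trust in climate
# scientists mainly among liberals (coefficients are made up).
comprehension = rng.normal(0, 1, n)
liberal = rng.integers(0, 2, n)            # 1 = liberal, 0 = conservative
trust = (0.1 * comprehension               # weak main effect of comprehension
         + 0.3 * liberal
         + 0.4 * comprehension * liberal   # comprehension matters more for liberals
         + rng.normal(0, 1, n))

# OLS with an interaction term: trust ~ comprehension * ideology
X = np.column_stack([np.ones(n), comprehension, liberal,
                     comprehension * liberal])
b, *_ = np.linalg.lstsq(X, trust, rcond=None)
print(b)  # b[3], the interaction coefficient, should land near 0.4
```

A significant positive b[3] is the statistical signature of the pattern described above: the slope of trust on comprehension is steeper for one ideological group than the other.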

As I discuss in the paper, these results suggest an exciting path forward for rekindling public trust in climate scientists. Efforts to boost scientific interest in young adulthood may have lasting effects on trust, decades later.

What these efforts might look like, of course, is an open question. Board and video games aimed at engaging young audiences could potentially be effective. A key challenge, however, will be to figure out how to use these tools to engage young adult audiences that are not already highly interested in scientific topics. 

I also think that this research underscores the usefulness of longitudinal approaches to studying Americans’ attitudes toward science. Whether these dynamics hold for Millennials and Generation Z (who tend to be more accepting of scientific consensus on climate change than older generations) is an interesting question, and one future longitudinal research should attempt to answer.



Reader Comments (205)

Very interesting. Thanks.

May 3, 2018 | Unregistered CommenterJoshua


What's your thinking on the possibility that early-life interest and later-life confidence are only spuriously correlated--i.e., that there might be some third variable out there causing both?

May 3, 2018 | Registered CommenterDan Kahan

Testing Dan's - rather obvious - hypothesis of a third variable causing both variables examined should be easy enough:

Instead of correlating tenuous early adulthood interest in science with trust in the alleged "97% consensus-climate-science", why not skip "potential" for "actual" scientific knowledge and only poll persons with actual degrees in math and science?

In particular, we KNOW this group supports greater use of nuclear power generation - even among respondents who subscribe to the "greenhouse-gases-will-turn-Earth-into-Venus" scenario. But we also know that "climate scientists" as a group, do not. That is both necessary and sufficient to justify the fact 68% of the general public does not trust the climate-science sub-group, while trusting scientists as a whole.

May 3, 2018 | Unregistered CommenterEcoute Sauvage

I've thought about this a bit! I've sketched out a couple of thoughts below:

First, I think one thing we need to figure out is *when* people formulate attitudes toward climate scientists (and other scientists) in relation to when they start to get interested in science. The current data don't permit me to do that to the extent that I'd like, but that's something I'd love to explore more in the future. My take on the existing literature is that interest should develop before attitudes toward the four groups I study here.

Actually, the IPCC outcome variable might also offer some useful insights on that front. The measure of interest I've got in this data was actually administered before the IPCC was created. This, of course, doesn't totally rebut the "common cause" argument – which I think would apply to climate scientists more broadly. But I can at least say with a relatively high degree of confidence that IPCC attitudes and early-life interest weren't formulated at the exact same time. At least in this data.

More broadly, I've given some thought to what a "common cause" might be. I think doing more work to identify the antecedents of interest/curiosity could be instructive not only in helping us better understand interest, but in identifying candidates that could make this and other relationships spurious. Right now, I'm not sure any one candidate comes to mind. But, open to suggestions!

What I think may be more likely is that there's some factor that causes interest that also explains long-run attitudes toward scientists. Maybe something *very* early in childhood, with interest acting as a mediator of sorts? Would love to explore more!

May 3, 2018 | Unregistered CommenterMatt

Matt -

You say the following:

Climate scientists are amongst the least trusted scientific authorities in the U.S., in part due to low levels of support from Republicans and Independents.

I looked at the associated link, and as near as I can tell the related information refers to some combination of levels of trust in "scientific advice" and deference towards "scientific expertise" - with those levels being relatively low among Republicans/conservatives in reference to "global warming/climate change" and "global warming," respectively.

I have a question with regard to how to tease out whether the lower levels of trust and deference shown reflect an assessment of the scientists themselves, as compared to perceptions about the advice and/or the product of the expertise (i.e., not the scientists or their expertise per se).

In other words, I would guess that Republicans, Democrats, and Independents would feel quite differently, respectively, about various climate scientists dependent upon (1) signs related to the political orientation of those scientists, (2) the stated opinions of those scientists about global warming/climate change and/or (3) the policies associated with those scientists' scientific opinions.

You also say this:

A recent Pew study found that less than a third (32%) of Americans believe that climate scientists’ research is based on the “best available evidence” most of the time. Similar numbers believe that climate scientists are mostly influenced by their political leanings (27%) and the desire to advance their careers (36%).

Do you have a reference to data that look at other scientists along parallel lines of analysis? For example, how do Americans score on polling about the researchers who study GMOs, or the effects of gun control, the safety of nuclear energy, the damage that pesticides and mining/fracking do to the environment, the long term psychological impact on women from having abortions, the association between vaccines and autism, etc.?

May 3, 2018 | Unregistered CommenterJoshua

@Matt--interesting, thanks!

May 3, 2018 | Registered CommenterDan Kahan


Thanks for this! A couple of quick replies (grouped by quote):

1. I think this is an important point. We need to do more work to separate perceptions of scientists from attitudes toward their advice, research, etc. (Which is what I try to do in some of my other research!) We also need to situate climate scientists in relation to other types of scientists -- with various partisan "reputations." We might also look at broader opinion toward "scientists" and "the scientific community," although it's not always clear what pictures survey respondents carry in their heads when thinking about scientists in such broad terms. Maybe an unsatisfying answer, but one I promise I'm trying to figure out with new data!

2. Actually, this is ALSO something I'm working on right now! At the moment, I'm unaware of research asking the Pew questions of other scientific groups. In general, social scientists need to do a much better job measuring people's attitudes toward scientists and other experts.

Like you suggest, scientists studying more salient and politicized topics are probably going to be more likely to be distrusted in some respects. But I think there's also reason to believe that climate scientists are pretty unique in this respect -- in addition to the reasons cited in the post. The Climategate and Climategate 2.0 controversies from the mid-to-late 2000s, for example, come to mind as being pretty high profile examples of where the public's ability to trust climate scientists was called into question.

You mentioned gun research, and I actually think that would be **fascinating** to study right now. The partial repeal (or re-interpretation) of the Dickey Amendment banning federally funded gun research from agencies like the CDC has made the research/science side of the gun violence issue more salient. I'd be interested to see how trust in these researchers might change over time, and how it compares to the pre-repeal era.

May 3, 2018 | Unregistered CommenterMatt

Thanks Matt -

I think it would also be interesting to contextualize views about climate scientists with information on the degree to which people think that advice from lawyers, plumbers, auto mechanics, doctors, priests, CEOs, car salespeople, etc., is free from the influence of self-interest.

May 3, 2018 | Unregistered CommenterJoshua


Did you see anything in your data that would suggest that science interest is or is not stable longitudinally?

May 3, 2018 | Unregistered CommenterJonathan

Hi Jonathan,

That's a great question. I know I looked at this, and (if I recall correctly) the stability coefficient is in r = 0.30 territory? I've observed similar results in other data too, with shorter time lags (r = 0.30 - 0.50). I'd caveat that while some instability is probably explained by variation over time, these measures are also not ideal (they're good -- but I'd prefer approaches to measurement that look more like Dan's in his work on science curiosity).

So, I'd say that science interest is fairly stable over time, but there's certainly room for movement over time.

Hope this is useful!

May 3, 2018 | Unregistered CommenterMatt


"...but there's certainly room for movement over time."

Did you see any evidence that it increases more often than decreases over time?

May 3, 2018 | Unregistered CommenterJonathan

Hi Jonathan,

In another study (just because it's what I happen to have handy right now), I find that this movement is pretty equal across groups. Roughly a third of the sample exhibits no (or very small) change, another third gains interest, and another third decreases. A quick look at the data suggests that the "decrease" side is slightly larger than the "increase" side, but not by much.

I don't know if that's helpful, but I find it very interesting!

May 3, 2018 | Unregistered CommenterMatt

Matt -


The Climategate and Climategate 2.0 controversies from the mid-to-late 2000s, for example, come to mind as being pretty high profile examples of where the public's ability to trust climate scientists was called into question.

I would be curious as to the validity of self-report data that Climategate (1.0 or 2.0) significantly modified anyone's preexisting views on the reliability of climate scientists' advice on climate change. As I recall from the evidence I've seen, not a huge segment of the public even heard much about Climategate, and of that segment, not a terribly large % followed very closely.

And then within that group, those who self-report a significant impact (in either direction) reported changes that suspiciously aligned with ideological orientation. W/o pre-test/post-test data, I wonder if people wouldn't have a tendency to assert a causality post-hoc and, essentially, use Climategate to confirm a bias rather than significantly determine views towards climate scientists.

May 3, 2018 | Unregistered CommenterJoshua


Another follow up - any data about how the trust in climate scientists in adulthood varied with those groups that gained/lost/stayed science interested?

May 3, 2018 | Unregistered CommenterJonathan


Interesting study.

"As readers of this blog might expect, Americans who are highly interested in science should exhibit higher levels of trust across the ideological divide."

There are two viewpoints on science. One is that scientific experts, scientific consensus, peer-reviewed scientific journals, and so on have replaced religion as a source of infallible truth and wisdom. The other is that science is based on systematic scepticism of *all* scientific dogmas, no matter how well established; this view rejects Argument from Authority, holds that science is the belief in the ignorance of experts (Nullius in Verba), cites the Galileo example, and so on. Scientists are human and fallible.

When measuring people's interest in and knowledge of science, and climate science in particular, do you make any distinction between the two viewpoints?


I've long had a hypothesis on Dan's results on science curiosity that there is a certain group of the public who, rather than putting their blind trust in ideological authorities, instead put their blind trust in scientific authorities, and in particular, popular science in the media, like science documentaries, and so on. People who like and enjoy watching science documentaries will tend to believe whatever the documentaries tell them, whether that's the scientific consensus, or based on the best scientific evidence, or not.

Hence the anomalous result on fracking Dan found. Although geologists and safety experts agree that fracking is safe - it is the "scientific consensus" - the media presentations on it tend to play up the controversy for the sake of the drama, and therefore (according to the hypothesis) people higher in "scientific curiosity" are more likely to have greater concerns about fracking. My prediction was that in any other area where the popular media presentation differed from the best scientific evidence, science-curious people would follow the media, not the evidence.

So people are more trustful of climate scientists simply because the media takes a particular side in the debate and portrays them that way. As soon as the media shifts its view, the public will follow.

I see parallels with the similar media consensus on the dangers of overpopulation that were popular in the 1960s and 1970s. We were told that the world was doomed, that food and resources were running out, that famine and war were unavoidable, and that pesticides would wipe out the oceans. People who trusted science believed - including many scientists - although expert economists already knew that the reasoning was wrong and contradicted by the evidence. And since then, there have been a long series of further environmental scares that arise, are widely believed to be unarguable scientific truth, and then gradually peter out and disappear as their predictions fail to come to pass, only to be replaced by a new one.

Is it necessarily a good thing for the public to trust too blindly in scientific authority? Wouldn't widespread public scepticism of scientific dogma, with more demands to see the evidence, and to check and challenge it, be more "scientific"?

May 3, 2018 | Unregistered CommenterNiV

A few replies:

Joshua: All possible! Although I cited the Climategate example, I'd be wary that any one event would have a big impact on opinion. My guess is that trust in climate scientists soured gradually, over time (similar to the argument Gordon Gauchat makes in his 2012 ASR paper). But we'd need more longitudinal data on that!

Jonathan: That's an interesting question! I'll definitely give that a look in the data.

NiV: I had a good conversation on Twitter about this earlier today, and I'll re-iterate some of what I said there here. I totally agree with you -- some amount of skepticism is healthy. In fact, it's the cornerstone of the scientific method! Trusting climate scientists shouldn't (in my view) mean blind acceptance.

Like you suggest, I think scholars need to do a better job asking people to tell us not only whether they trust information from a particular group, but what the *nature* of the trust is. Is it blind acceptance; almost like a type of religious deference to science? Or does it include a healthy skepticism – a recognition that even the most trustworthy experts will get things wrong on occasion? My data can't really tease these two things apart, but I think this is a really important area for future research.


I'd also note more generally that these comments have been _really_ great for thinking about extensions of this project and future research. Thanks everyone!

May 3, 2018 | Unregistered CommenterMatt

Pyrrhic policy victories of the religious right:

May 3, 2018 | Unregistered CommenterJonathan

Matt - at your convenience please look up queries by Dan and me earlier here concerning a possible third variable underlying - and directly causing - the apparent correlation you find between your own two variables.

May 3, 2018 | Unregistered CommenterEcoute Sauvage

@Matt. Longitudinal studies very useful indeed :)

Is this the *type* of trust we would want to encourage? If purely isolated, the quantitative ability and science knowledge we can measure are, theoretically at least, not emotive. Whereas the quality of curiosity aka 'interest', even when we are purely measuring this, *is* emotively based. This means that any enhanced trust in science resulting from encouraging (only) interest, would itself be more emotively based. So, rather than cutting through the emotively based affiliations given to the two main parties in the US, we would be adding a 3rd emotive loyalty, also a part of identity, i.e. a loyalty to science, which steals somewhat from the former two. The problem with this is that we'd not be solving the existing issue, but rather masking it. Emotive loyalties are blind; if the mainstream science went wrong in any particular domain, for instance as has happened in the past with Eugenics, this kind of trust would be less able, not more able, to spot that issue. A trust that is based more upon reason, as indeed the whole enterprise of science is itself, by virtue of finding a way to reduce the existing emotive investments that intersect science issues, would be preferable to a method that merely swaps one type of emotive investment with another (and also, essentially increases the number of confounding variables by adding 'science as culture' to the existing Rep / Dem cultures).

Notwithstanding above and assuming it is after all the kind of trust we want, I've wondered out loud here before about another potential issue. Teachers appear to act universally to encourage curiosity, because it leads eventually to increased results (as indicated by exam passes, though in some cases the full benefits may not show until years later), i.e. more knowledge and more ability. Well, just because this is the de-facto wisdom of the educational system doesn't mean it's true. But let's say that it is, unless you are pretty certain it's not. So if, in a bid to increase trust in science, we embark on an ambitious program to say double the level of science curiosity in young adults, this will lead to many more of them who are, later in life, more knowledgeable and able. Which in turn means they will be more subject to polarization on conflicted issues. So... whether there is a *net* increase in 'trust in science' would depend upon the level of increased polarization due to more knowledgeable / able folks, versus the level of decrease due to more trust based on youthful interest.

If such effects ever do reach equilibrium values in a population, what would the new equilibrium look like with double the encouragement? In the current equilibrium you indicate that the youthfully interested maintain high trust later, but within this group is the effect itself a net of those who became more knowledgeable and hence more subject to polarization, and those for whom the teacher's trick already didn't work (I recall being encouraged to science curiosity by parents / teachers long before the age of 11 - and indeed looking earlier as you suggest would be cool), i.e. they retained curiosity but it didn't lead to knowledge. Hmmm.... way too many questions; some general comment will suffice, esp. if you have longitudinal data to address this, or know that the teachers are simply wrong anyhow. But how early the encouragement happens may matter regarding a net of conflicting effects.

I wrote 'only' in brackets above to simplify the case. Yet in reality we would for sure not want to hugely increase interest / curiosity, and yet then act to prevent the (later) natural fulfilment of this within individuals (i.e. potential increases in knowledge and ability, if the teachers are right), in order to prevent swings back to polarization.

Do we know whether or not we've reached a saturation point in science interest encouragement (in developed societies) already? There's an awful lot of science encouragement out there. Presumably, however high this encouragement was pitched, many in the population will simply always be more interested in other things. Is there any data that tells us where the saturation point lies and whether there is sufficient gain in the system still left to make a significant impact? My own pure guess is that there is headroom for more scientific interest, but whether enough to make a major impact on the big beasts of culture (without turning science itself into a culture), is a much more difficult proposition to consider.

May 3, 2018 | Unregistered CommenterAndy West

@Ecoute -- so sorry I missed your original post! I did reply to Dan's comments on spuriousness [it's my first comment; forgot to "at" Dan. I'm new to commenting here!]. To your additional points, I would say that you're absolutely right to note that the knowledge tests I and others use are no substitute for actual experience in scientific fields. Polling scientific experts is possible (although difficult to do longitudinally).

Although, if I'm understanding your argument correctly, it's at least in part predicated on the idea that climate scientists _do not_ recognize the safety of nuclear power in the same numbers as the scientific community more broadly? If so, I'd be interested to see some data on that; on the extent to which climate scientists deviate from the scientific community more broadly in their support for nuclear power. (It very well could be the case! I'm just not aware of any studies that have shown this, and would love to check them out).

@Andy -- this is fascinating! I will reply in full shortly! (I've got family in town this weekend and limited time at my computer). Briefly, I'll say that I did give this dynamic some thought in the letter, and wish I could've expanded more on it. In the meantime, I'd suggest Asheley Landrum's excellent work on the mutually reinforcing relationship between knowledge and trust:

May 4, 2018 | Unregistered CommenterMatt

Matt - it's faster if I just give you my source

He actually thinks there is some climate change - nobody denies THAT - but has had a very hard time convincing the greenhouse-gases-will-turn-Earth-into-Venus crowd to even CONSIDER the newer, safer, smaller nuclear generation designs of a startup he's associated with. We're not talking here about people like Al Gore or other ignorant "greens", we are talking about a group of perfectly competent scientists and modelers - including even the late Stephen Hawking.

Call him for background info - he doesn't understand the opposition any more than I do, but he's more familiar with it.

May 4, 2018 | Unregistered CommenterEcoute Sauvage

P.S. to Matt: and tread cautiously - massive paranoia has set in among the Earth-to-Venus set, and blind panic may make them dangerous, at least legally. They are dangerous financially, given their takeover of supercomputer time to back up government databases NOT being deleted in case they MIGHT be. If that's not paranoia I don't know what is.

May 4, 2018 | Unregistered CommenterEcoute Sauvage

Jonathan -

It's a good thing I don't believe in asymmetry:

May 4, 2018 | Unregistered CommenterJoshua

Interesting take on Moniz's emotiveness:

Fill in the blank: I’m scared of _________________.

…the risks posed by climate change and by nuclear weapons in the wrong hands. Addressing these challenges is a big part of the DOE’s work and calls for continued American leadership and enhanced international cooperation for a long time.

May 4, 2018 | Unregistered CommenterJoshua

An interesting angle on the "smarter people are more polarized" framework:

May 4, 2018 | Unregistered CommenterJoshua


“In these studies we have shown that both Liberals and Conservatives are equally likely to believe fake news. The psychological motivations associated differ between the two groups though. Liberals believe news stories to maintain a favourable feeling about their own group. But Conservatives believe news stories because of a tendency to use their gut instincts.

“Understanding the similarities and differences between groups will be important as we seek to develop strategies to reduce the growing tide of political polarisation in our democratic societies.”

That's one point I made a while back about different genotype but same phenotype, which Dan didn't seem to enjoy. Although, with all the symmetry exposure here, I was questioning it myself. The problem is if there's symmetry due to different motivations, then why the symmetry? Why are liberals collectively narcissistic in just the same amount that conservatives are gut-goers? Would liberals be equally gut-goers if their narcissism didn't interrupt the intuitive process first? Or, vice versa for conservatives?

Hmmm - one point I made ... interrupt the ... process - curious wording, there....

May 4, 2018 | Unregistered CommenterJonathan

Jonathan -

My gut instincts tell me that conz are just as likely to have their reasoning held hostage by group identity and sense of moral superiority as libz.


May 4, 2018 | Unregistered CommenterJoshua

curious wording, there....

I see what you did there..

My reverse jinx hot streak seems to have run its course. It was a good run.

Now I'm on to looking forward to seeing the C's crushed by Bron (or, if not, looking like a G League team while getting swept in the Finals).

May 4, 2018 | Unregistered CommenterJoshua

An interesting angle on the "smart people are more polarized" framework:

Would Moniz, as someone identified as "cognitively proficient" and then assessed against a background of a dichotomized scale of "world view," be simultaneously "more polarized" on climate change and "less polarized" on fracking and nuclear energy? Or maybe it's more that there's something about his personal life trajectory that puts him on his personal grid of polarization highways, rather than his cognitive proficiency.

I'm struggling to find a way to exempt that from intuitive speculation....

May 4, 2018 | Unregistered CommenterJoshua


"My gut instincts tell me that conz are just as likely to have their reasoning held hostage by group identity and sense of moral superiority as libz."

Ha. But, taken seriously, does that mean you are writing off the finding of that BPS article (can't find the paper it's based on - must not have been published yet)?

Or, are you suggesting that group identity/moral superiority would occur to the same degree if allowed to, but intuition in conz trumps that? In other words, conz don't get a chance to gloat because they're too busy not reflecting, but would if they could? What would happen to high-CRT conz in this case?

I think it's possible that libs have trained themselves to override their guts, but haven't done so for schadenfreude (Freud throwing shade?).

May 4, 2018 | Unregistered CommenterJonathan

@Matt: 'I will reply in full shortly!'

I look forward to it, but no hurry. I have some family issues here too so my own further responses could get delayed.

May 4, 2018 | Unregistered CommenterAndy West

Jonathan -

My comment was a joke, but it's also true that my gut instincts say there's something wrong with the paper. I just see too much moral superiority and group identity among conz to believe that there is a materially significant difference between libz and conz in that regard.

So it is hard for me to get past a background skepticism about the study. I don't dismiss it, but I don't see one study as sufficient for me to go with their conclusion as generalizable. In particular, I go back to the "diversity within groups is greater than the diversity between the groups" thingy. How much would a difference they found - even if it did pan out over multiple studies (with longitudinal analysis), with large sample sizes, in a variety of contexts (cultural and otherwise), etc. - really explain about the real world?

The study is interesting and worth thinking about.

I remain open to the possibility that my gut instincts are true, or that they're false and there really is an asymmetry, and my biases (including identity-defensive/identity-aggressive cognition) explain my "gut instincts" and observations. I also try to remain open even to the possibility that somehow (as yet unexplained) my gut instincts and the authors' conclusions aren't incompatible.

So, for me, the goal is not to look at the study as overriding my past observations, but to keep the study in mind as I continue to observe in the future - holding my gut instincts in relief against the potential explanatory power of the study. I think of it as being like how I balance theory and practice as a teacher, using each to inform the other.

Or, are you suggesting that group identity/moral superiority would occur to the same degree if allowed to,

I more question whether they really occur to different degrees.

but intuition in conz trumps that?

Wouldn't, then, there have to be some kind of counterbalancing mechanism in the other direction, where reflecting would overwhelm an innate tendency towards gut instinct in libz?

In other words, conz don't get a chance to gloat because they're too busy not reflecting, but would if they could? What would happen to high-CRT conz in this case?

As per above, it's just really tough for me to get past the contention that conz don't gloat... but I'm not sure how you pull "gloating" out of the study. Or why high CRT would be inversely correlated with gloating.

I think it's possible that libs have trained themselves to override their guts, but haven't done so for schadenfreude (Freud throwing shade?).

That's interesting - as that does, kind of, dovetail with my gut instincts. It isn't so much that I go with the idea of "training," but that it might be possible that libz have a dispositional disinclination towards going with gut instincts (different orientation towards self-efficacy?) - while not enjoying schadenfreude even one tenth of one smidgen less than conz.

All of that was evidence-based and intuition-free, of course.

May 4, 2018 | Unregistered CommenterJoshua

Paper by everyone at Yale not named Dan on link between fake news belief, cognitive style, and dispositions to dogmatism, fundamentalism, and delusionality:

May 4, 2018 | Unregistered CommenterJonathan

@Jonathan -- there are more than 5 students, post docs & faculty at Yale, which isn't to say the 5 authors on this one are "wrong" etc (that depends on strength of evidence & inferences therefrom, not nose counting)

May 5, 2018 | Registered CommenterDan Kahan

Dan - I'm convinced your original hypothesis here is correct. We are looking at a 3-dimensional object which we are trying to fit in 2 dimensions - and by the simple introduction of a z-axis we can solve the problem. Not that it will help us much, as x-y correlations are easier to grasp. There is a wonderful visual prepared by a Japanese mathematician, Prof. Sugihara, of a 3D object mulishly interpreted as 2D by our perception - nothing to be done about it; the illusion persists even after you see it in 3D.

So, to your point: the climate scientists would gain instant credibility (eventually maybe reaching that of other scientists) if they dropped the Earth-to-Venus greenhouse gas "cures" (including impossibilities, like absorbing CO2 out of the atmosphere; dangerous plans, like seeding clouds with SO2; and public monies wasted in financial black holes like solar and wind) and focused on KNOWN zero greenhouse gas emission technologies. As long as they don't, I'm with the 68% of the general population who wouldn't believe the climate scientists for the time of day.

May 5, 2018 | Unregistered CommenterEcoute Sauvage


"...there are more than 5 students,post docs & faculty at Yale..."

Just practicing my alternative facts about crowd size, hopefully in a way that won't set off any uncontrolled meme fires.

BTW, here's a recent bit of circumstantial evidence that science knowledge (lack thereof) might be involved:

May 5, 2018 | Unregistered CommenterJonathan


"...public monies wasted in financial black holes like solar and wind..."

not when there's a buck to be made and a politician to be bought:

May 5, 2018 | Unregistered CommenterJonathan

Picture of Dan's sheepish grin starts reddit flame war:

Who knew Dan had such a famous pedigree?:

Vince Vaughn and Bill Hader lovechild in thumbnail?

With a dash of Rainn Wilson.

And Kelsey Grammer

Rainn Wilson and Conan O'Brien

May 5, 2018 | Unregistered CommenterJonathan

Ugh - here's that reddit link again

May 5, 2018 | Unregistered CommenterJonathan

"Just practicing my alternative facts about crowd size, hopefully in a way that won't set off any uncontrolled meme fires."

Oh? Was that it? I assumed you meant that all the others were called 'Dan'. (And that since you had, by your statement, implicitly just called them all 'Dan', this was now technically true.) Logically, there's nothing wrong with the statement.

When it comes to alternative facts, it ain't what you don't know that gets you into trouble. It's what you know for sure that just ain't so. Vice versa, too. :-)

May 5, 2018 | Unregistered CommenterNiV

Jonathan - the problem is even worse than black holes in wind and solar when you include ethanol. Literally burning food for dismal thermal generation (BTU) results while ALSO ruining engines. But the man who kept all those subsidies in the current budget, Sen. Chuck Grassley, chairman of the Judiciary Committee, is from Iowa, so there's nothing to be done about them currently, other than for whatever Iowa does not produce, like electric cars. Tesla and GM are both losing their tax credit ($7,500/car) as they will cross the maximum permitted sales of vehicles (200,000) this year, at which production volume both are supposed to cease requiring subsidies.

If you want to follow the money flow, forget direct donations to politicians and check out the PACs. Tom Steyer sank $100 million into his NextGen renewable energy PAC and has exactly zero results to show for it. His current effort, another PAC to impeach president Trump, is sure to meet the same fate - but that's why Sen. Grassley is needed where he is now. There is a beautiful symmetry here :)

May 5, 2018 | Unregistered CommenterEcoute Sauvage

Motivated reasoning vs. the campus wars, featuring Dan's favorite football game (but with Charles Murray at QB):

May 5, 2018 | Unregistered CommenterJonathan


Interesting paper. Although I do definitely like the politics, I've got a suspicion it's still politically biased. Is the best argument that could be put forward by the protestors really: "We contend that experiences and emotions are valid ways to see the world, and that the hegemony of rational thought-based perspective often found in a university setting limit our collective creativity, health, and potential"?

The following snippet was also new to me:

Furthermore, the trend toward political homogeneity increased across cohorts: Whereas 10% of faculty were conservative, only 2% of graduate students and postdocs were.

Wow! That's a bigger split than on climate change! I knew about the 10%, but the 2% surprises me. (I probably ought to check it, but my own bias is saying 'why bother?') That's surely going to have an effect on the next generation of academics.

I wonder why more psychologists are not researching the causes of this opinion-polarisation effect? The stronger the effect, the easier it is to study, surely?

Or maybe they already know why it's happening, and don't need to look?

May 5, 2018 | Unregistered CommenterNiV


I think the "Who Decides" paper is great, but it misses one point that I think it should have covered, considering its modern cognitive-psychological take on the issue: given the many cognitive biases that humans share, is the marketplace-of-ideas ideal based on faulty assumptions of convergence to truth given sufficiently friendly conditions? Certainly, we should try to overcome these biases as much as possible - but the current marketplace is rife with them. Given that people know this, perhaps indirectly through the effects they see - and perhaps this is exacerbated by a bias to feel that others are even more biased than they are - but also given that their demands and grievances aren't going to wait for these biases (real and perceived) to vacate the marketplace - how should people act? Add to this the fact that the grievances we think we have are themselves likely based on biases to varying degrees.

Also, what if we become self-gaslighted by our own de-biasing attempts? There could be some recognition of this risk in that "hegemony of rational thought-based perspective" justification.

May 5, 2018 | Unregistered CommenterJonathan

"given the many cognitive biases that humans share, is the marketplace-of-ideas ideal based on faulty assumptions of convergence to truth given sufficiently friendly conditions?"

I agree that the common assumption of convergence to truth is faulty, but I don't think it has anything to do with the marketplace of ideas. The reason they're sometimes associated is that a marketplace of ideas reveals the failure to converge. When different groups are culturally isolated from one another they converge on different points. But each one is doing in isolation what humans do. There are many scientific questions that are not the subject of political controversy, on which public ignorance and error abounds. (Does the sun orbit the Earth or vice versa? Do electrons whizz down electrical wires at high speed?) When the human tribe is culturally united, why would things work any differently? We would all converge on the wrong point, but would never realise it because alternative opinions don't occur (or are not permitted) to make the possibility of alternative conclusions visible to us.

One of the major points of having a marketplace of ideas, of intellectual diversity not only permitted but actively sought out and encouraged, is to increase the likelihood of us noticing that we've got it wrong. Humans are fallible. All those biases lead us into many sorts of error. Scientific training can reduce some of them, but does not remove them. And how can we possibly calculate correctly with a faulty calculating engine? Simple! We collect together engines with many different failure modes, with different sources of error, and note where their answers differ. That's where the errors are most likely to lie. And then we closely examine those parts of the argument, in debate, to try to spot them. It's still not perfect, but it's far better than anything else we've got.
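NiV's "faulty calculating engines" scheme is, in software terms, essentially N-version programming or ensemble cross-checking: run several independently written implementations of the same computation and treat any disagreement as a pointer to a likely error. A toy sketch of that idea (all function names are hypothetical, invented for illustration):

```python
# Toy illustration of cross-checking "many fallible engines":
# three independent implementations of the same computation, each
# with its own potential bug surface. Agreement builds confidence;
# disagreement localizes where to look for an error.

def engine_a(x):
    return x * x                          # one implementation

def engine_b(x):
    return x ** 2                         # independently written

def engine_c(x):
    return x * x if x >= 0 else -x * x    # buggy for negative input

def cross_check(x):
    """Return (majority answer, whether all engines agreed)."""
    answers = [engine_a(x), engine_b(x), engine_c(x)]
    if len(set(answers)) == 1:
        return answers[0], True           # consensus: likely correct
    # Disagreement flags a probable bug in the dissenting engine.
    majority = max(set(answers), key=answers.count)
    return majority, False

print(cross_check(3))    # all three engines agree
print(cross_check(-3))   # engine_c dissents, flagging its bug
```

Note that consensus here does not guarantee correctness (all three engines could share a bug), which is NiV's point: the method only surfaces errors where the failure modes differ, so it depends on the implementations being genuinely independent.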

No, I'd argue that the current problem isn't the marketplace of ideas, it's intellectual protectionism. Protectionism in economics is where you protect weak businesses from competition by raising barriers to import. Intellectual protectionism does the same by seeing opposition and dissent not as an opportunity, a resource to be nurtured, but as a threat. We don't see disagreement as an opportunity to correct our own errors, because we're all convinced we have none, but instead as unarguable evidence of the manifest errors of the opposing views, of their mental and moral faults. We don't listen. We try to exclude them, to silence them, to stop the spread of their pernicious and dangerous ideas. There's no point in debate; there's nothing to learn from such ridiculous error. We try to ensure only reliable authorities are trusted.

It's the symmetric asymmetry thesis. We believe in the asymmetry thesis - that our side is intellectually better than the other, that we're right and they're wrong - and *both* sides believe the same thing symmetrically.

We don't have a marketplace of ideas - we're obviously not going to trade for their shoddy goods - instead we have a trade war, where we fight to keep theirs out, and our own pure.

Getting 98% of the next generation of academics to be non-conservative is an impressive achievement in that line. And I'm sure the result will be greater convergence on a consensus, and greater trust in the authority of experts. But convergence to truth?! I don't think so.

May 6, 2018 | Unregistered CommenterNiV


>'When different groups are culturally isolated from one another they converge on different points. But each one is doing in isolation what humans do'

Indeed. Something the widely traveled ancient Greeks first noticed (cultural isolation more often matched geographical isolation then), causing them to question their own values and introduce formally reasoned skepticism (as distinct from value dependent innate skepticism).

>'...instead we have a trade war...'

Trade is a good analogy, because economics shares some mathematics with evolutionary processes, whether biological or cultural evolution. The optimum balance in both economics and evolution is one of 'co-opetition'.

The polarization in some current culturally conflicted topics may be 'too wide'. But the optimum is not zero. This would indicate a tyranny or maybe a dying society clinging to some past ideal.

Regarding the lack of diversity in social psychology, see:

May 6, 2018 | Unregistered CommenterAndy West

I don't for a moment believe the 98% number - in fact I know it to be false, because the poll involves people brought up wholly under the tyranny of political correctness who have learned that stating true facts simply gets you booted off the public internet. Latest casualty, for those not following the subject, is the domain, pulled by the domain registrar after some publicly funded legal "charity" in DC complained.

Supporting evidence: yesterday was the funeral of Pamela Mastropietro, an 18-year-old murdered and dismembered by Nigerian migrants in Italy. Her body parts were found in two suitcases, minus her heart and liver, presumed kept by the Nigerians for voodoo rituals. A huge crowd accompanied the coffin in Rome. Her mother carried the flowers sent by Luca Traini, a local man so appalled by this horror that he started driving around shooting random Africans in the street; he is currently in jail awaiting trial for attempted murder.

A search for "funeral Pamela Mastropietro" in English gives ZERO results. Nobody in the English-speaking mainstream media covered it. NOBODY. I had to look up the Italian press (Corriere della Sera, one of their best newspapers) to find out what her mother said about Luca Traini's bouquet she chose to carry, and since my Italian isn't the best I am re-posting the original as well as my translation:

«Ci hanno fatto piacere — dicono —, è stato un omaggio apprezzato, di vicinanza. Anche perché se avessimo rifiutato la sua corona, allora non avremmo dovuto nemmeno stringere la mano a tutti quei politici che non hanno fatto nulla per evitare la morte di Pamela»

"It was a homage of friendship that was appreciated and gave us much pleasure. And if we had turned down his tribute, we should also have refused to shake the hands of all those politicians who did nothing to avert Pamela’s death."

However, the news was all over the alt-right media - which have simply gone underground - which is how I know about it. Absence of - easily accessible - evidence is not evidence of absence.

May 6, 2018 | Unregistered CommenterEcoute Sauvage

P.S. The above is relevant not only to Matt's hypothesis on Gen Z (patently false) but also to the question of how electoral polls could fail to predict results, from the election of President Trump to Brexit to the triumphs of the right in Austria, Poland, Hungary, Italy, the Czech Republic, and almost everywhere in the West there has been an election recently.

People have learned to keep their mouths shut. Or lie to pollsters.

May 6, 2018 | Unregistered CommenterEcoute Sauvage

"A search for "funeral Pamela Mastropietro" in English gives ZERO results. Nobody in the English-speaking mainstream media covered it. NOBODY."

Apart from the New York Post, The Daily Mail, The Express, The Telegraph, The Times of Israel, ...

The problem with "true facts" is that everyone has their own set of them, all equally convinced that theirs are the only "true" ones, all equally convinced that the threat they see justifies violence and forceful suppression of the opposition, instead of debating with it. All equally convinced they're under threat of being unjustly suppressed themselves.

Nobody has a monopoly on truth. Nobody is free from error. The more they believe in their own infallibility, the more fallible and mistaken you know they are.

May 6, 2018 | Unregistered CommenterNiV

Ecoute -

but also on the question how could electoral polls fail to predict results from the election of president Trump

Actually, polling on the popular vote was pretty accurate.

People have learned to keep mouth shut. Or lie to pollsters.

Could you provide some evidence to reinforce your theory that people lying to pollsters or keeping their mouths shut had a material impact on polling? Your theory looks like one of self-victimizing snowflakes convincing themselves that they are being persecuted. But maybe I'm wrong. I'd like to see your evidence that I am.

As just one aspect of that question, as it turned out, Trump outperformed his polling in areas where there was widespread support for him - which suggests to me that your theory is false.

I know that your argument is one that has been promoted widely by alt-righters, but it looks to me like a classic case of people inventing mechanisms, without data, to support theories that fit their preconceived agenda.

Thanks in advance.

May 6, 2018 | Unregistered CommenterJoshua