Tuesday, Jun 13, 2017

Science comprehension without curiosity is no virtue, and curiosity without comprehension no vice

For a conference talk in Stockholm in September. The "without comprehension" part might be a slight exaggeration, but the point is that people normally recognize valid decision-relevant science without understanding it. 25 CCP points to the first person who correctly identifies the allusion in the title; hopefully it won't perplex the Swedes too much.

It has been assumed (very reasonably) for many years that the quality of enlightened self-government demands a science-literate citizenry (e.g., Miller 1998). Recent research, however, has shown that all manner of reasoning proficiency—from cognitive reflection to numeracy to actively open-minded thinking—magnifies politically motivated reasoning and hence political polarization on policy-relevant science (e.g., Kahan, Peters et al. 2012, 2017; Kahan 2013; Kahan & Corbin 2016). The one science-comprehension-related disposition that defies this pattern is science curiosity, which has been shown to make citizens more amenable to engaging with evidence that challenges their political predispositions (Kahan, Landrum et al. 2017). The presentation will review the relevant research and offer conjectures on its significance, both theoretical and practical.

References

Kahan, D.M. & Corbin, J.C. A note on the perverse effects of actively open-minded thinking on climate-change polarization. Research & Politics 3 (2016).

Kahan, D.M. Ideology, Motivated Reasoning, and Cognitive Reflection. Judgment and Decision Making 8, 407-424 (2013).

Kahan, D.M., Landrum, A., Carpenter, K., Helft, L. & Hall Jamieson, K. Science Curiosity and Political Information Processing. Political Psychology 38, 179-199 (2017).

Kahan, D.M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L.L., Braman, D. & Mandel, G. The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Climate Change 2, 732-735 (2012).

Miller, J.D. The measurement of civic scientific literacy. Public Understanding of Science 7, 203-223 (1998).


Reader Comments (83)

Dan:

On April 14th at this blog you said: "I'm persuaded by your argument about 'reasoning proficiencies' & science curiosity."

My argument there is that: "...curiosity (whether of science or music or philosophy or whatever), while it may motivate reasoning, is more usually considered an emotive phenomenon related both to anxiety and pleasure. I.e. it is not a reasoning proficiency, so your assumption would first have to be established."

Given your agreement, and notwithstanding that you distance yourself somewhat by using the word 'related' above (i.e. "The one science-comprehension-related disposition that defies this pattern is science curiosity..."), this sentence seems to contradict your agreement of that date. At the very least, before writing sentences that re-introduce a strong association with reasoning proficiency, surely one has to first establish what SC actually is.

Notwithstanding evidence that SC makes 'citizens more amenable to engaging with evidence that challenges their political predispositions', if it is primarily an emotive phenomenon this will not overall grant a better recognition or support of decision-relevant science. As I noted regarding SC on the April thread:

"As such it would merely introduce a different kind of emotive bias to that imposed by political identity, and rather than removing an interference, it is replacing this interference with another that will therefore have its own different issues. Much like political bias this will mean sometimes operating in a direction that seems desirable, and sometimes in a direction that seems undesirable (e.g. against the scientific consensus on fracking according to your Figure 8 data)."

Your referenced paper doesn't seem to mention SC, so it doesn't seem like your arguments rest upon an association of SC with reasoning proficiency anyhow.

It's possible that most science has been accepted simply because it is sufficiently below the social radar not to create controversy. For the science that does become controversial, likely as many folks evaluate wrongly as rightly, as the existence of many polarized domains suggests. (Iteratively, those who resist correct science will nevertheless diminish, though it may be a very long process.) Without, as your paper notes, "even a rudimentary level of expertise in any of the myriad forms of science essential to their well-being", the public has no rational system available for judgement that can be free of social interpretation and negotiation. In highly contested domains the validity of sources is just as complex to unravel as the science itself (hence they have no expertise in this either), and 'pattern recognition' of real science seems highly speculative. Pattern recognition includes instinctive elements, and science hasn't been around long enough for those to be relevant. However, we've evolved a system for recognizing collective deception over a hundred millennia or more, so theoretically we can spot that which is *not* correct science, i.e. when it is primarily cultural in origin even if wearing the mask of science. But the huge let-down of this system is that it only works for folks who are not already aligned to the culture that is wearing the mask. So it is just as often inapt as apt.

Your 'natural rationality' seems to me to have an issue if we take it to the limit (and granted, taking things to the limit provides insight but may also be unrealistic). If the public can recognize 'real science' by some subtle yet reliable rationality-based system, then when scientists themselves are conflicted we could simply ask the public to deploy their system and tell us who is right. I.e. you are proposing a magical system. Or alternatively, you are simply proposing that the public always recognizes the authoritative position (which certainly seems highly plausible), and equates this with correctness. However, history shows that the authoritative position has frequently been wrong, and although science self-corrects in time, per above that time may be very long. Nor do the public always equate such authority with correctness anyhow, otherwise there'd be no controversial domains.

June 13, 2017 | Unregistered CommenterAndy West

@Awest-- the Kahan, Landrum paper does "mention" SC; it's all about how we formed the scale & what relationship it had to formation of opinions on decision-relevant science.

What I agree with is that science curiosity, while not an element of science comprehension, is an enabler of it. Compare Dewey, How We Think (1910), ch. 3.

On "recognition of valid science"-- I don't mean people intuitively know what science is yet to find out. I mean they form relatively reliable intuitions about what science knows when it comes to know it.

June 13, 2017 | Registered CommenterDan Kahan

Dan,

Title allusion is to Barry Goldwater, right?
“Extremism in defense of liberty is no vice. Moderation in pursuit of justice is no virtue.”

June 13, 2017 | Unregistered CommenterJonathan

@Jonathan-- you win! You can redeem your points for CCP merchandise at any time

June 13, 2017 | Registered CommenterDan Kahan

Dan,

I'll consider my points redeemed if you answer this: what is your opinion on the Goldwater Rule vs. scientists' obligations to communicate with the public? How about in contrast to the Johnson Amendment? Should the two "non-overlapping magisteria" keep silent or go full in?

Sorry for the hard questions, but a "CCP Rules!" coffee mug just isn't my thing.

June 13, 2017 | Unregistered CommenterJonathan

Self-licensing effect?:
https://phys.org/news/2017-06-emphasizing-individual-solutions-big-issues.html

June 13, 2017 | Unregistered CommenterJonathan

@Jonathan, I think it is a good rule. Scientists should form & communicate opinions based on genuine evidence.

June 13, 2017 | Registered CommenterDan Kahan

Sorry, I meant your pre-print paper from the hyperlink at the top of your post. I still need to look at the Landrum one.

However, I don't think this exits you from the difficulties of the approach. While 'a positive intellectual force', without which I presume science would not even exist, Dewey says in the main that curiosity motivates a search for knowledge, which, if sustained over time and a range of changing input, may also encourage 'consistent and orderly thinking'. Yet if via such a search some knowledge of a controversial domain is acquired, then per your other findings this will lead to more polarization, depending on the particular level of domain knowledge. And 'ordered thinking' is a process, which without domain knowledge (i.e. without 'even a rudimentary level of expertise') has no input to work on that even in principle could be objective (essentially only social information remains as input, hence all social bias is in play). Plus in some polarized domains, the effortful navigation of labyrinthine paths of source-information validity and its heaps of challenges and counter-challenges (which indeed is all social information) is just as out of reach for most of the public as the science itself anyhow. While an obvious (i.e. even to the public) authority consensus dominates some domains, this is no guarantee of correct science either; yet in any case from your charts the folks who are more science curious lean *towards* the consensus on CC but *away* from the consensus on Fracking. So if you take consensus as the gold standard for both (your position, I believe), then either the ordered thinking is not reliable and so not predictive, or the ordered thinking is submerged beneath other effects. So for instance the effect of a different set of biases to political bias, which may result from the emotive nature of curiosity.

"I don't mean people intuitively know what science is yet to find out. I mean they form relatively reliable intuitions about what science knows when it comes to know it."

For the second sentence only, and with a complete free hand about *how* such an intuition might work: how can there be any intuition about the correctness of science knowledge that *could* work, when there is only social information as input? This input can only tell you things about identity, not truth. So it may tell you what is *not* science in some cases, because it is culture instead (though one won't see this if one is value-aligned), but it cannot tell you what *is* science. I don't see how even in principle your second sentence amounts to anything other than magic. That may be my poor grasp, yet your SC charts for CC and Fracking suggest indeed that there is no reliable intuition, however it works.

June 13, 2017 | Unregistered CommenterAndy West

"On "recognition of valid science"-- I don't mean people intuitively know what science is yet to find out. I mean they form relatively reliable intuitions about what science knows when it comes to know it."

Do they?

"Is all radioactivity man-made?" - 83% Correct.
"Do lasers work by focusing sound waves?" - 68% correct.
"Are electrons smaller than atoms?" - 69% correct.
"Which gas makes up most of the Earth's atmosphere? [Hydrogen, Nitrogen, Carbon Dioxide, Oxygen]" - 25% correct.
"Does the Earth go around the Sun, or does the Sun go around the Earth?" - 60% correct.
"Do antibiotics kill viruses as well as bacteria?" 65% correct.

People seem to me just as likely to get it wrong even on scientific issues that are not the subject of political controversy. The difference is that if told that 40% of people think the sun goes round the Earth, people just shrug. So? Does anyone besides the astronomers really need to know that? But if they're told that a similar number don't believe in evolution, or climate change, or any other political shibboleth that just happens to be about science, they're outraged that in this modern world such scientific ignorance and error should be allowed to exist! What's wrong with these people?!

If ordinary people form reliable intuitions about what science knows, why don't they form them when it comes to what science knows about the constitution of the atmosphere, or the effects of antibiotics? What went wrong? If a scientist in a white lab coat told them that antibiotics killed viruses, would they be able to intuit that he was incorrect? Or is that not what you mean?

June 13, 2017 | Unregistered CommenterNiV

Dan,

I was hoping for a more elaborated opinion. Next time I'll just go with the coffee mug.

BTW: Nyhan has a new post on the Wood & Porter article:
http://www.dartmouth.edu/~nyhan/nature-origins-misperceptions.pdf

June 13, 2017 | Unregistered CommenterJonathan

Dan:

"social proof gets the job done."

Social proof is, unsurprisingly, a social phenomenon. In line with my comment above, social phenomena with social information only as input (e.g. for this phenomenon, what one's peers or authority figures signal) cannot result in anything intuitive about the correctness of science in disputed domains (or any science). Social-proof rules vary with cultural environment, and even if, as many argue, trust in science from gentleman's codes (or however such arose) wasn't being eroded in recent times, the particular rules here, such as the special authority status of science, will tend to lend weight towards support for a mainstream scientific consensus whether or not this consensus is correct. While this may more often than not be correct, the *reason* for leaning that way has nothing to do with any intuition about what science knows; it is just a social contract. And when grouping around social proof is wrong it can be spectacularly wrong (the 50-year-old collapsed consensus on saturated fats may well have harmed the health of hundreds of millions of people). Nor do all disputed domains have a dominant mainstream consensus, so other social loyalties will be even more prevalent. And sometimes the public and scientific perceptions of consensus are out of kilter too. So what about data... one can expect to see the social contract with science weighted against other social contracts in play, other aspects of identity, if you will, both strong and weak. This looks very like the case from your trust-in-science charts at the 30th May post 'Generalized trust in science: nukes vs. climate', which I mention in comments there.

June 13, 2017 | Unregistered CommenterAndy West

P.S. Dan the data you kindly post here is great stuff !

June 13, 2017 | Unregistered CommenterAndy West

sigh. That new Nyhan article contains nothing new - it's really just a survey.

Dan - have you seen this? It might even warm NiV's heart:
DOI: 10.1111/tops.12186
"We argue in this article that although a simple Bayesian view cannot accommodate belief polarization, a more sophisticated variant involving Bayesian belief networks can give rise to polarization even though agents behave entirely 'rationally'"
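The quoted claim can be sketched with a toy two-node network (this is an illustration, not the paper's actual model; every number below is hypothetical). Two fully Bayesian agents share all likelihoods and differ only in their prior trust in a source; on seeing the same report, their beliefs in the hypothesis move in opposite directions:

```python
# Two-node belief network: hypothesis H and source trustworthiness T.
# Agents share every likelihood and differ only in their prior on T.

def posterior_h(prior_h, prior_trust, p_report):
    """P(H=1 | source reported "H is true"), summing out the trust node T."""
    like = {h: sum(prior_trust[t] * p_report[(h, t)] for t in (0, 1))
            for h in (0, 1)}
    num = prior_h * like[1]
    return num / (num + (1 - prior_h) * like[0])

# P(report "H is true" | H, T) -- hypothetical numbers: a trustworthy
# source (T=1) tracks the truth; an untrustworthy one (T=0) is modeled
# as asserting the claim mostly when it is false.
p_report = {(1, 1): 0.9, (0, 1): 0.1,
            (1, 0): 0.3, (0, 0): 0.7}

# Both agents start undecided on H (0.5) but differ on trust in the source.
a = posterior_h(0.5, {1: 0.9, 0: 0.1}, p_report)  # trusts the source
b = posterior_h(0.5, {1: 0.2, 0: 0.8}, p_report)  # distrusts it
print(round(a, 2), round(b, 2))  # -> 0.84 0.42: same report, opposite moves
```

Both agents apply Bayes' rule exactly; the divergence comes entirely from the trust node, which is roughly the mechanism the quoted passage describes.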

June 13, 2017 | Unregistered CommenterJonathan

The opening sentence of one author's doctoral thesis in psychology honestly (though probably inadvertently) summarizes the text of the article just linked - and he doesn't even bother with 97%, instead going for 100%! Any attempt to overlay this repulsive dogmatism with Bayesian networks is like adding theatrical makeup to a decomposing corpse:
https://skepticalscience.com/docs/Cook_dissertation.pdf
CLOSING THE CONSENSUS GAP
"There is a consensus among climate scientists that humans are causing global warming."

June 14, 2017 | Unregistered CommenterEcoute Sauvage

NiV - the difference between ignorance of scientific knowledge and ignorance of prevailing scientific opinion (even if prevailing means 100%, as Mr Cook of Western Australia maintains) matters little unless enforced by some law.

Years ago I ran into the late professor Netanyahu (in a very large reception in Vienna) and asked him how on earth he could support idiotic laws making it criminal to come up with ANY number other than Six Million concerning persons of the Jewish faith murdered in Nazi-occupied Europe during WWII when his very own meticulous actual enumeration never exceeded roughly (allowing for obvious uncertainties) five and a half million. How could he know that nobody in Austria - where his book had been freely published - would decide to prosecute him criminally for breach of precisely that law? He vaguely said that to a mathematician (me) numbers have a significance greater than they do to a historian (him) which is true but hardly a reason for passing criminal laws one knows to be factually incorrect - fortunately this could never happen in the US.

But climate change is a purely mathematical issue, and if you followed the comment on the immediately preceding blog post (Jonathan misquoting Plato's Politeia) you will see that the only way to pass such laws is to limit freedom of speech. And while I find those "100%, 97%, etc climate scientists" laughable, the new proposed restrictions on what may be posted on the internet - not coincidentally in the same countries enforcing the "six-million" law, plus China, Russia, assorted other totalitarian-tendencies regimes - is in my view a cause for major alarm. Ignorance of particle physics or any other items on your ignorance list here is unfortunate but not a cause for concern.

https://documents.trendmicro.com/assets/white_papers/wp-fake-news-machine-how-propagandists-abuse-the-internet.pdf

June 14, 2017 | Unregistered CommenterEcoute Sauvage

Ecoute,

"(Jonathan misquoting Plato's Politeia)" - I was not attempting to quote Plato. My line: "As for Shearman's dream for a Plato-style Republic, well, the plutocrats currently in charge would never allow it" was an attempt at mildly humorous cynicism. Any resemblance it bears to anything from Plato is purely coincidental.

June 14, 2017 | Unregistered CommenterJonathan

"Dan - have you seen this? It might even warm NiV's heart:"

It looks familiar, I think it's been discussed here before.

I still find it funny the way they keep on repeatedly stating the 97% myth over and over again all through the paper, with the whole list of citations given every time, as if it was some meta-experiment in Argument from Authority. If making the statement works on the general public, maybe it will work on the scientific community, too? That they cite Anderegg is even funnier than them citing Doran and Zimmerman. I think Anderegg's list was about 34% climate sceptics, although it didn't even pretend to be uniformly sampled.

Yes, they're correct that you can fit events into a Bayesian model. You can even fit it into a single node, if you accept the possibility of more than two hypotheses. All that is required is that you include the reliability of the source of information as being subject to uncertainty. Then if someone repeatedly asserts something you know to be untrue, it doesn't affect your belief in the assertion, but their perceived reliability drops quite precipitately.
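That reliability drop is easy to put in numbers (a minimal sketch with hypothetical parameters, not taken from any cited paper): treat the claim as known false and update the posterior probability that the source is trustworthy after n assertions:

```python
# Posterior trust in a source after it asserts a claim the agent already
# knows to be false, n separate times. Hypothetical parameters:
# p_true_src / p_junk_src are the chances that a trustworthy vs. an
# untrustworthy source would assert the claim anyway, given it is false.

def trust_after_false_claims(prior_trust, n, p_true_src=0.1, p_junk_src=0.7):
    """P(source trustworthy | n assertions of a known-false claim)."""
    odds = (prior_trust / (1 - prior_trust)) * (p_true_src / p_junk_src) ** n
    return odds / (1 + odds)

for n in (0, 1, 3, 5):
    print(n, round(trust_after_false_claims(0.9, n), 3))
# Trust starts at 0.9 and collapses toward 0 as n grows, while the agent's
# belief in the claim itself (already known false) never moves.
```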

Speaking of unreliable sources, I wouldn't trust Messrs Cook and Lewandowsky if they wrote a paper saying rain was wet. But fortunately the argument for it is independent of their authority.

"NiV - the difference between ignorance of scientific knowledge and ignorance of prevailing scientific opinion (even if prevailing means 100%, as Mr Cook of Western Australia maintains) matters little unless enforced by some law."

J S Mill would have certainly argued with that ("...a social tyranny more formidable..."). But fortunately, this time around such a law is not going to happen, despite the ambitions of people like Cook and Lewandowsky. The belief is already politically a dead duck, and in a sense it has been since the Byrd-Hagel resolution.

The reason I point it out is first to make clear what sort of people these are, and second to provide a warning for next time. Because we've been here before. Back in the 1970s there was the overpopulation scare, that demonstrated with mathematical and scientific certainty that food and resources were about to run out, resulting in starvation, war, and the collapse of civilisation, and the only hope was to immediately impose authoritarian government to take the harsh measures necessary to save what we could, because people would never do it voluntarily. (We're talking population control measures here, like China's one child policy, compulsory sterilisation programmes, and worse.)

And even today there are still people who believe it is true. Like every millennarian cult, when the prophecies don't happen on the predicted date, they just move the date further down the line. (Some of the earliest global warming predictions foretold that America would be a dust bowl by now.) But they're a tiny and ineffective minority now - as far as political power goes, there was a point where it lost the critical momentum and the moral panic died. It never got publicly debunked or discredited, it simply faded from the public consciousness as the media stopped reporting on it.

However, that didn't matter because it was soon replaced. It is the "whole aim of practical politics". And now that global warming too is dying, we need to be on the look out for the next one. None of them will achieve all their aims in one step, but if they can carve out a little extra territory with each wave, they will eventually claim control over the entire beach.

So long as we can maintain a mildly humorous cynicism about their attempts, I think there is little to worry about. But if we ever stop being cynical ..., well. History tells us that people always believed it was impossible it could ever happen to them, too. Ask the Greeks what's happened to Plato's civilisation.

June 14, 2017 | Unregistered CommenterNiV

NiV, Ecoute:

My point with the Cook & Lewandowsky article was merely an attempt to refute Dan's insistence that there is no rational Bayesian account that can accommodate polarization and backfire effect. It is even possible that higher OSI folks will have the ability to create and maintain more complex Bayesian networks than those at the lower end, and that might explain their greater polarization. This is certainly similar to discussions we've had before, but I was hoping that Dan might be more amenable to an explanation provided by Cook & Lewandowsky.

June 14, 2017 | Unregistered CommenterJonathan

Jonathan -

=={ It is even possible that higher OSI folks will have the ability to create and maintain more complex Bayesian networks than those at the lower end, and that might explain their greater polarization. }==

It doesn't seem to me that polarization requires complexity. That is why I question the assigning of causality in the association between "science comprehension" and polarization. People who don't know science have absolutely no problem coming up with pathways of reasoning to reject evidence or interpretations of scientific evidence they don't like.

I can't understand Dan's confident conclusions of causality there... Maybe you can help me understand.

June 15, 2017 | Unregistered CommenterJoshua

NiV - Mill would most certainly agree with me for the very reason you mention, a reason shared by Kant >
http://www.gutenberg.org/files/46873/46873-h/46873-h.htm
> and consisting of recognizing that excessive debt is dangerous, and a nation that builds up an enormous pyramid of credit - especially if part of that debt is due to foreigners - becomes a danger to itself as well as to its neighbors.

Taxation is, of course, based on a law. And continued membership of the US in that absurd UN Climate fund would have cost taxpayers $3 billion for openers. Obama actually managed to pay out half a billion, on top of a previous half billion, just 3 days before president Trump was inaugurated. Nor was this an isolated case of a shakedown in favor of friends and relatives of the previous administration - monies properly due to the Treasury have been diverted to other dubious groups: http://www.judicialwatch.org/press-room/press-releases/judicial-watch-sues-justice-department-records-forcing-corporations-fund-leftist-groups/

Dan and any other legal eagles here can probably come up with a term more elegant than shakedown - blackmail is probably too strong, since a shakedown depends on convincing the reluctant giver he owes the money because of past misdeeds. But Indians asking for money in order to stop drinking from the Ganges because of an excess of CO2? That doesn't even meet the giggle test.

June 15, 2017 | Unregistered CommenterEcoute Sauvage

Joshua,

"It doesn't seem to me that polarization requires complexity. That is why I question the assigning of causality in the association between "science comprehension" and polarization"

I was not assigning causality. I was offering an alternate (I thought) explanation. I had thought I understood that Dan was claiming, under the identity-protective hypothesis, that some type of cognition that was not truth conducive must be going on in polarization cases, especially when polarization is seen to increase at the high OSI end of the spectrum, and most especially when backfire occurs. Maybe he's not doing that (?). But, assuming that Dan did make such a claim, I was offering up this "more complex Bayesian network" idea from Cook & Lewandowsky as a counter claim that seems more Occam's Razor-ish to me because it could explain polarization + increased polarization at high OSI + backfire but doesn't propose any abandonment of truth-conducive Bayesian cognition.

In other words, I'm still a skeptic about whether identity protection (if I understand it properly as non-truth conducive) is really going on in these cases. Are people really more concerned about fitting in with their selected groups than about the truth of the matter, or is something else going on that is truth conducive yet also manages (as a side effect) not to threaten their affiliations?

June 15, 2017 | Unregistered CommenterJonathan

Oops - meant "convergent" in all cases where I said "conducive" in the previous post (because Dan uses "convergent", and I meant to refer to his particular usage, which I was mis-remembering as "conducive").

June 15, 2017 | Unregistered CommenterJonathan

Jonathan -

thanks for the clarification:

=={ Are people really more concerned about fitting in with their selected groups than about the truth of the matter, or is something else going on that is truth conducive yet also manages (as a side effect) not to threaten their affiliations? }==

My own view is that there can be different brands of identity-protective cognition. One is connected to fitting in with selected groups, and another is more of an internal process, where people are driven to protect their identity as someone who is right, or someone who seeks truth, or someone who has high moral values, by not allowing bias to influence her/his reasoning. It is closely connected to the affinity-group brand, but also somewhat independent. I have never seen Dan account for that aspect of identity-protective cognition (assuming it exists), even though he has wondered why some people identify with beliefs that are in conflict with their affinity groups' beliefs. IMO, a shift within one individual compared to another, w/r/t the relative magnitude of different kinds of identity-protective reasoning, sometimes in association with particular issues, could help explain the "outliers."

I don't see why there can't be some combination of factors. It certainly seems to me that it doesn't have to be either affinity-group "motivation" or a "motivation" to see oneself as "right" - which I think may be a form of truth-seeking.

June 15, 2017 | Unregistered CommenterJoshua

I think that there are many cultural components as to how "science" is viewed. A lot has to do with views of authority. It starts with the family. Are children to be encouraged to explore the world around them? Or should they keep quiet and obey? A big one would be education. Is science (and other topics) part of a list of things to be memorized with proficiency determined by testing? Or is it a process of exploration and further learning?

Our society has many divides that correspond to the socio-economic divide. These often relate to what individuals do about science once they've heard about it. They also relate to how successful one can be in a highly technological society. And even why these same groups tend to set up education funding in manners that perpetuate that lack of success. Denial, or at least deliberately ignoring the available evidence that trickles down to that individual is a strong option. So is rejection of messages coming from an annoying external power structure. For example: https://www.washingtonpost.com/national/americas-new-tobacco-crisis-the-rich-stopped-smoking-the-poor-didnt/2017/06/13/a63b42ba-4c8c-11e7-9669-250d0b15f83b_story.html?utm_term=.77ff9423f412. "America’s new tobacco crisis: The rich stopped smoking, the poor didn’t"

Note that tobacco companies know all of the above. They have large teams of dark-side Dan Kahans working for them. And they know how to target these weaknesses to their own economic advantage. This is accentuated by the cultural gap shown in the map in the accompanying article. We live in two separate worlds, with a large gap between what "everybody knows" in each, formed by the observations made in the world around you.

Sort of the flip side can be seen in this casual, man on the street poll given here: https://www.washingtonpost.com/news/wonk/wp/2017/06/15/seven-percent-of-americans-think-chocolate-milk-comes-from-brown-cows-and-thats-not-even-the-scary-part/.

This is not new. I just lent out my several volumes of "Practical Education", a John Dewey-style teachers' curriculum guide written in 1922, mostly by Cornell professors, intended to bridge the gap in understanding of a then newly urbanizing society. I got this from an older woman who had it from her college days and preserved it through many other waves of educational theory. I'm hoping to influence a young woman who has been a classroom teacher but who is now an instructor and mentor for other elementary school teachers. This advocates practical hands-on instruction in science and nature. It also advocated a classroom-corner library for free reading, instruction in history aimed at including diverse ethnic groups, and the idea that introductory algebraic concepts in mathematics are appropriate topics for third graders. Evaluations were to be based on individual assessments. Nothing new there. And nothing new in the way we escaped from Cotton Mather only to end up with Betsy DeVos.

Advances are made, only to be beaten back again. It's not just individuals; it's how they interact with the system, and the relationship between their power to shape their economic future (unions, for example) vs. the power of those with money to retain control.

Another problem is that much of the difficulty that those who are different have in fitting in is resolved by simply moving away. But with a political system that is based on geography and not just total population, this becomes a policy issue.

I think that the analysis of cultural influence needs to pay more attention to the drivers imposed by those in power.

After all, Native Americans grew tobacco and managed to limit its use to largely ceremonial occasions. Their view of nature also involved fitting in with that system, not harnessing aspects of it for individuals to make megabucks.

There is a dynamic tension between what we know as "Scientific Progress" and the freedom to wreak socio-economic havoc.

June 16, 2017 | Unregistered CommenterGaythia Weis

"A big one would be education. Is science (and other topics) part of a list of things to be memorized with proficiency determined by testing? Or is it a process of exploration and further learning?"

If the teacher tells you about global warming endangering the planet, should you simply memorise and repeat what you're told for the exam; or should you explore alternative viewpoints, chase down the evidence, download data, and check things for yourself? Opinions differ, on that, and in a way not aligned with traditional stereotypes about political ideologies.

"Denial, or at least deliberately ignoring the available evidence that trickles down to that individual is a strong option. So is rejection of messages coming from an annoying external power structure. [...]"America’s new tobacco crisis: The rich stopped smoking, the poor didn’t""

The issue is that the more fanatical ideologues with a particular policy aim tend towards a very unbalanced, one-sided view of the world. They only see evidence and argument on one side of the debate, and deny any possibility of the other side's arguments having any validity.

Tobacco is a classic case. It's a product that brings pleasure to millions in the short term, but has serious long-term health effects. There's a trade-off there - would you rather forego the short-term benefits in hope of better chances of a longer life in your old age (chances that are still far from certain), or would you rather take the short-term benefit and pay the price in a higher chance of an earlier death? It's a personal choice - different people have different values and weigh the risks differently.

There's no problem with people deciding for themselves that the risk isn't worth it, and that smoking is a bad idea. That's their right and their choice. But some of them are so convinced that there is no other possible decision, no other possible rational point of view, that they see the failure to persuade everybody else of their viewpoint as some sort of national policy failure. And if persuasion doesn't work, compulsion is thereby justified.

That elitist authoritarian impulse can itself breed resistance, because in the same sort of way that smoking can be a long-term hazard, so is political authoritarianism. Historically, it's killed a considerable number of people, too (in excess of 100,000,000 in the 20th century alone), and continues to do so even today in many other parts of the world. Vaccinating people against authoritarianism might be seen as just as critical to preventing society-threatening epidemics of it as vaccinating them against measles - there is a herd immunity effect at work there too.

Businesses gain wealth and power, of a sort, by giving other people what they want and need; by working for the benefit of others. It is automatically limited by the fact that they only keep the power for as long as they keep delivering what people actually want. But there is another sort of power - the power to tell other people what to do based not on delivering what the people themselves want, but on what the powerbrokers think they *ought* to want. The ones bringing in no-smoking laws are among those with such power. And yes, there is a conflict steadily building between the people and those seeking power over them to dictate to them their permissible activities "for their own good", like smoking.

There are people who warn society against the dangers of creeping authoritarianism, and there are those who deny or deliberately ignore those warnings, aided by a cadre of "Dark Side Dans". Isn't that the same thing? Isn't this what the symmetry thesis says we ought to expect?

June 16, 2017 | Unregistered CommenterNiV

@Jonathan & Joshua--

the individual "dissonance avoidance" alternative to "cultural cognition" is plausible but doesn't account for evidence that individuals change their beliefs readily in response to cues about position that predominates in affinity group. E.g.,

1. Kahan, D., Braman, D., Cohen, G., Gastil, J. & Slovic, P. Who Fears the HPV Vaccine, Who Doesn’t, and Why? An Experimental Study of the Mechanisms of Cultural Cognition. Law Human Behav 34, 501-516 (2010); and

2. Cohen, G.L. Party over Policy: The Dominating Impact of Group Influence on Political Beliefs. J. Personality & Soc. Psych. 85, 808-822 (2003).

In any case, there's no reason to think that the sort of confirmation bias associated with dissonance avoidance that could account for politically motivated reasoning steers people to truth. Consider Rabin, M. & Schrag, J.L. First Impressions Matter: A Model of Confirmatory Bias. Quarterly Journal of Economics 114, 37-82 (1999).

June 17, 2017 | Registered CommenterDan Kahan

"the individual "dissonance avoidance" alternative to "cultural cognition" is plausible but doesn't account for evidence that individuals change their beliefs readily in response to cues about position that predominates in affinity group."

What "dissonance avoidance"? I thought we were talking about a "source credibility effect"?

The hypothesis is that people are judging both the credibility of the claim and the credibility of the source simultaneously, allowing for the fact that sources can be wrong, and they themselves can be misled by incomplete information.

Thus, on being told that an expert supports a position that the subject previously believed strongly to be false, they'll downgrade the expert's credibility more than they will their disbelief in the position. But if given stronger evidence that the expert is actually credible, the likelihood ratio shift moves back to the content of the belief. If the reasons for doubting credibility are knowledge that it is a politically divided subject, then the lack of credibility is equated to the expert having an opposing political bias. Evidence that such a bias isn't applicable is therefore evidence of extra credibility when the claim goes against the stereotype. The presumption is that if other Republicans (who probably know more about the details of the policy) support a generous welfare policy, it can only be because they've seen especially solid evidence to convince them. They wouldn't go against the stereotyped expectations without it. Subjects are deducing the existence of strong but unseen evidence from the nature of the people who have been convinced by it.
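That joint update of claim-belief and source-credibility can be sketched as a toy two-by-two Bayesian model (all of the numbers below are my own illustrative assumptions, not anything from the studies being discussed):

```python
# Toy joint Bayesian update over (claim true?, source reliable?).
# All priors and likelihoods are invented for illustration.
prior_claim = 0.10      # subject strongly disbelieves the claim
prior_reliable = 0.70   # but thinks the source is probably reliable

# P(source asserts the claim | claim state, reliability state)
lik = {
    (True,  True):  0.95,  # reliable sources assert true claims
    (False, True):  0.05,  # ...and rarely assert false ones
    (True,  False): 0.50,  # unreliable sources assert at random
    (False, False): 0.50,
}

# Joint posterior after observing the assertion
post = {}
for claim in (True, False):
    for rel in (True, False):
        p_c = prior_claim if claim else 1 - prior_claim
        p_r = prior_reliable if rel else 1 - prior_reliable
        post[(claim, rel)] = p_c * p_r * lik[(claim, rel)]
z = sum(post.values())
post = {k: v / z for k, v in post.items()}

p_claim_post = post[(True, True)] + post[(True, False)]
p_reliable_post = post[(True, True)] + post[(False, True)]
print(f"belief in claim: {prior_claim:.2f} -> {p_claim_post:.2f}")
print(f"source credibility: {prior_reliable:.2f} -> {p_reliable_post:.2f}")
```

With these particular numbers the subject downgrades the source's credibility (0.70 to about 0.40) by more than they upgrade their belief in the claim (0.10 to about 0.33) - the pattern described above, falling out of a single ordinary Bayes step.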

So to take an example from a different part of science, like quantum entanglement. This consists of a ghostly interconnection between distant events that defies all everyday logic, and is therefore inherently non-credible. On watching a presentation from an excited and enthusiastic adherent to the theory, extolling its mystery but at the same time giving assurances of its truth, people are likely to be sceptical still, considering the possibility that the guy is just a crank, and this is crank physics on the level of crystal healing and many other strange uses of the word "quantum", which is widely seen as a "hall pass" excusing abandonment of all common sense. But if you're then given examples of the same guy decrying other examples of junk science and "woo", vocally opposing all the quantum healing nonsense, you might actually give it more serious consideration. If such a guy has been convinced, despite surface appearances of the theory going against what you know of his stance, maybe there's more to this "quantum entanglement" than the usual quantum crankery. In fact, if someone known for their scepticism of spurious quantum claims supports this, the evidence of there being evidence for it is especially strong. No politics is involved in this decision, you note. It's pure truth-seeking.

Or to take another intuitive example: an atheist, once convinced that scripture is an unreliable source, can thereafter be supplied with an infinite supply of scriptural testimony supporting the existence of God and still never change their mind. If the initial lump of evidence reduces source credibility to zero, all the subsequent evidence is ignored. (It's an interesting question whether this procedure is actually truth-convergent, on average, but I think it's clear enough that it's intended to be.)

People consider the credibility of sources as well as the credibility of the content, they use the previously known truth or falsity of this and other statements from the same source as evidence of source credibility, and they regard political truths in the same way as any other truths. It's a highly unreliable heuristic, but it's what a lot of people do.

To disprove this hypothesis, you need to test the same effects on factual questions on which people have strong prior beliefs, but which are *not* entangled with political or cultural identity. If people fail to revise source credibility when sources make implausible factual claims on non-political subjects that the people nevertheless have strong prior beliefs on, but do when opinions are politically divided, that would be good evidence that there's something special about political identity.

June 17, 2017 | Unregistered CommenterNiV

NiV - non-locality is not against everyday experience, since we all know gravity. Entanglement just is - there is no arguing with experiment, and nature is under no obligation to explain anything to us.

I think I follow Dan's distinction between dissonance avoidance and cultural cognition, but I can tell you the best explanation I ever heard for entanglement involves calling it "magic". Not in any religious sense - instead in the same sense that Keynes, great collector of Newton's alchemical writings, called Newton a "magician". Read the whole speech, it's fascinating and will make you more comfortable with the concept :)

http://www-groups.dcs.st-and.ac.uk/history/Extras/Keynes_Newton.html

>>>> In the eighteenth century and since, Newton came to be thought of as the first and greatest of the modern age of scientists, a rationalist, one who taught us to think on the lines of cold and untinctured reason.

I do not see him in this light. I do not think that any one who has pored over the contents of that box which he packed up when he finally left Cambridge in 1696 and which, though partly dispersed, have come down to us, can see him like that. Newton was not the first of the age of reason. He was the last of the magicians, the last of the Babylonians and Sumerians, the last great mind which looked out on the visible and intellectual world with the same eyes as those who began to build our intellectual inheritance rather less than 10,000 years ago. Isaac Newton, a posthumous child born with no father on Christmas Day, 1642, was the last wonderchild to whom the Magi could do sincere and appropriate homage. <<<<<<<<<

June 17, 2017 | Unregistered CommenterEcoute Sauvage

... and a new study by Pew Internet on free speech online:
http://www.pewinternet.org/2017/03/29/the-future-of-free-speech-trolls-anonymity-and-fake-news-online/

June 17, 2017 | Unregistered CommenterEcoute Sauvage

"non-locality is not against everyday experience, since we all know gravity"

Gravity (in general relativity) is local. It's not in the Newtonian version - that was one of the reasons used to show that it must be wrong.

"Entanglement just is - there is no arguing with experiment, and nature is under no obligation to explain anything to us."

People argue with experiment all the time.

But I'm talking about people who haven't seen the experiments being done - all they've got is some fella with wild hair talking about it on the Discovery channel. Should they believe him? And if so, should they believe all scientists about everything?

"but I can tell you the best explanation I ever heard for entanglement involves calling it "magic". Not in any religious sense - instead in the same sense that Keynes, great collector of Newton's alchemical writings, called Newton a "magician". Read the whole speech, it's fascinating and will make you more comfortable with the concept"

Arthur C Clarke was a lot more concise: "Any sufficiently advanced technology is indistinguishable from magic."

However, there is a distinction to be made between phenomena that follow as yet unknown rules, and phenomena that follow the rules of narrative storytelling/fantasy, which is what most "magic" is. Newton was trying to take the latter and turn it into the former. It's a common symptom of schizophrenia (aberrant salience). People build their understanding of the world out of the cultural components available to them at the time they live.

But actually, there is a perfectly reasonable, local, deterministic, realist explanation for entanglement already known - it's just not commonly used in mass media pop science presentations of the idea. It's a lot more fun, and sells the wonders of science better, to leave a bit of mystery and magic to it. (And it's also a minority view among physicists - so those who prioritise "scientific consensus" tend to disagree with it purely on that basis.)

June 17, 2017 | Unregistered CommenterNiV

Much here that relates to polarization over climate change, methinks:

The survey responses, along with follow-up interviews and focus groups in rural Ohio, bring into view a portrait of a split that is tied more to social identity than to economic experience.

https://www.washingtonpost.com/graphics/2017/national/rural-america/?hpid=hp_hp-top-table-main_no-name%3Ahomepage%2Fstory&utm_term=.b8aff0a72b6d

Perhaps a lot there that explains the causal mechanism of polarization over climate change, and perhaps more than "science comprehension" or "science curiosity," IMO

While urban counties favored Hillary Clinton by 32 percentage points in the 2016 election, rural and small-town voters backed Trump by a 26-point margin...

Consider the association between votes for president in 2016 and views on climate change.

June 17, 2017 | Unregistered CommenterJoshua

"I thought we were talking about a "source credibility effect"?"

So did I. Whether it's "groupthink" as in Cook & Lewandowsky, or some other possibility that isn't influenced by one's identity (though it may correlate with it) that can flip the sense of evidence.

The framed defendant is another example of this - if the implicating evidence is really good, is the defendant likely guilty, or likely innocent because the evidence is too good and thus implies the defendant is being framed? Both possibilities are potentially real, hence Bayesian networks that can take really good evidence in either direction are truth convergent. The Bayesian networks that account for the possibility that the defendant is framed have an extra node, sensitive to implicating evidence of suspiciously high quality, that flips the sense of all such evidence from adding to belief in guilt to adding to belief in innocence.

However, I suspect that in some cases when this happens, some people construct Bayesian networks that include not practically falsifiable nodes. They added these nodes when they began to suspect or perhaps even encounter some evidence for them. Which seems sensible. However, they don't consider falsifiability. So, once added, these nodes produce a "cognitive rut" that they can't get out of.
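The framed-defendant flip above can be reproduced with a toy three-hypothesis Bayes update (all priors and likelihoods here are invented for illustration, not taken from any real case or study):

```python
# Three hypotheses about a defendant, updated on the *quality* of the
# implicating evidence. Numbers are illustrative assumptions only.
priors = {"guilty": 0.50, "innocent": 0.45, "framed": 0.05}

# P(observed evidence quality | hypothesis): a framer manufactures
# implausibly perfect evidence far more often than reality produces it.
lik = {
    "weak":    {"guilty": 0.20, "innocent": 0.90, "framed": 0.05},
    "strong":  {"guilty": 0.60, "innocent": 0.09, "framed": 0.15},
    "perfect": {"guilty": 0.20, "innocent": 0.01, "framed": 0.80},
}

def posterior(quality):
    unnorm = {h: priors[h] * lik[quality][h] for h in priors}
    z = sum(unnorm.values())
    return {h: p / z for h, p in unnorm.items()}

p_strong = posterior("strong")["guilty"]
p_perfect = posterior("perfect")["guilty"]
print(f"P(guilty | strong evidence)  = {p_strong:.2f}")
print(f"P(guilty | perfect evidence) = {p_perfect:.2f}")
```

The "framed" node does the flipping: strong evidence raises belief in guilt, but evidence that is *too* good mostly feeds the framing hypothesis, so belief in guilt comes back down.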

June 18, 2017 | Unregistered CommenterJonathan

"However, I suspect that in some cases when this happens, some people construct Bayesian networks that include not practically falsifiable nodes. They added these nodes when they began to suspect or perhaps even encounter some evidence for them. Which seems sensible. However, they don't consider falsifiability. So, once added, these nodes produce a "cognitive rut" that they can't get out of."

Another example is the 'conspiracy theory' trap. If one forms a hypothesis adjoined to the belief that the evidence has been manipulated by a powerful conspiracy (e.g. oil companies campaigning on global warming), then any evidence against your hypothesis can be set down to manipulation by the conspiracy, and any evidence for is true evidence that bypassed the conspiracy.

The trouble is that on pure Bayesian grounds, a world in which evidence-manipulating conspiracies can exist *does* have this property. You can't actually deduce *anything* from observation of them, because evidence no longer has any firm relationship with the truth. The observed evidence has the same probability under every hypothesis, and the likelihood ratio is 1.
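The dead end described above is visible in a two-line odds calculation (toy numbers, mine): if the conspiracy would show you the same evidence whether or not the hypothesis is true, the two likelihoods are equal and the posterior never moves.

```python
# If a conspiracy can manufacture any evidence, the observed evidence
# has the same probability under both hypotheses. Toy numbers, mine.
prior_odds = 0.25 / 0.75          # prior odds on the hypothesis

p_evidence_given_h = 0.6          # the conspiracy shows you this either way...
p_evidence_given_not_h = 0.6      # ...so the two likelihoods are identical

likelihood_ratio = p_evidence_given_h / p_evidence_given_not_h
posterior_odds = prior_odds * likelihood_ratio

print(likelihood_ratio)               # 1.0
print(posterior_odds == prior_odds)   # True: no evidence ever moves belief
```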

So in this case we have to argue for *rejection* of the Bayesian framework as a cognitive ideal, because although truth-converging (in a world that genuinely *has* evidence-bending conspiracies), it's not useful. The Bayesian method is nearly blind, and often locks on to noise randomly.

It's not a new idea.

In short, truth-converging algorithms can only be effective in a limited set of possible worlds, and only under the (unsupported) assumption that you are in such a world. The trouble is, experience suggests that we might not be (or at least, that large parts of it approximate a world in which it's not true arbitrarily closely), and in that case any limited-worlds truth-converging algorithm will likely converge on untruth.

This is especially the case when we also consider agents with finite computational resources. There might be evidence revealing the truth, but it's inaccessible without a disproportionate amount of effort, and therefore for all practical purposes might as well not exist.

Perhaps the best we can hope for is some sort of model dependent realism?

June 18, 2017 | Unregistered CommenterNiV

NiV:

don't know about theoretical other worlds, but it's worth noting that in ours a social consensus is necessarily untrue. It is arrived at by selection, and is that which is the central narrative of a culture, which via social enforcement defines and holds the group together. We are subject to these mechanisms because culture has been a huge evolutionary advantage; emergent consensuses are geared to satisfy the necessary social equations and so cannot reflect truth. So it's worse than 'maybe locking on to noise': social consensus will *always* lock on to untruth; it is the job of culture to create an arbitrary consensus in the face of the unknown.

Idealised science is a truth algorithm; hard evidence short-circuits emotive memes (which tend to have the highest selection value in the above formation of consensus). Fortunately it doesn't have to be ideally practised to still work very well on a long timescale. Yet science is also a relatively new endeavor and vulnerable to the long-evolved cultural mechanisms, being very frequently adversely impacted and sometimes completely derailed, via a whole raft of bias effects that reflect cultural incursion. And no possible algorithm based only upon social information can possibly intuit scientific truths in conflicted domains (or for any science, though I guess it is only conflicted topics that generate a strong need to know).

Social information (for instance via social proofing or the special authority status of science, as Dan suggests above) can only tell you about identity, not truth. Given that science is often diverted or derailed by cultural mechanisms, the identity of the group currently holding the authority baton in a conflicted domain is of absolutely no use to us. Even in simple cases where there's a 1 to 1 mapping of baton holders to scientists (rather than the baton holders turning out to be a complex mix of scientists, policy makers and politicians etc. serving a common narrative). Sometimes history will prove them right, and sometimes it will prove them wrong. It does not even matter if, statistically, the 'current' baton holders will most often be proved right. If you need to know the truth in a particular domain no amount of, or algorithm regarding, social information can ever tell you this; these can only tell you about identity.

Regarding people without 'even a rudimentary level of expertise', Dan says: "I don't mean people intuitively know what science is yet to find out. I mean they form relatively reliable intuitions about what science knows when it comes to know it."

The problem with this is that the proposition in the second sentence still implies that which is sensibly rejected in the first. Intuition means based upon feelings or instinct, *not reason*. So if instinct can reliably determine which side is right regarding a socially conflicted science issue, while history tells us of many cases where the mainstream / scientific authority side has been wrong, then this instinct must be divining more than just identity; it must be divining correct science. In which case, the same instinct should be able to do this just as well for those competing theories where there is not yet one side claiming an unchallengeable consensus. This goes against our whole understanding of what science is and the principle of advance based not upon feelings but on evidence only, and your list of science misunderstandings emphasizes how wrong the public usually is anyhow.

We do have long-honed instincts that can tell us things about identity. BUT these instincts are themselves modulated by identity. Only those who are not already culturally aligned can detect a culture even when it wears the mask of science. This is the origin of public skepticism, which in the climate change domain, for folks without 'even a rudimentary level of expertise', is rampant even in countries that have weak or no left-right cultural alignment on the issue. The culture detected in the latter cases is not primarily left or right, but climate-calamitous.

June 18, 2017 | Unregistered CommenterAndy West

"don't know about theoretical other worlds, but it's worth noting that in ours a social consensus is neccessarily untrue."

Mmm. I think that's a long way from axiomatic. There's a social consensus that 2+2 = 4. Is that necessarily untrue?

"We are subject to these mechanisms because culture has been a huge evolutionary advantage; emergent consensuses are geared to satisy the neccessary social equations and so cannot reflect truth."

Social consensus is a consequence of both internal and external forces - like rigid body dynamics. The internal forces require all parts of society to hold the same opinion, but don't set any constraints on what opinion that is. The external forces apply constraints to satisfy some desirable condition. This might be truth-seeking, as in social consensus on factual knowledge, or it might be the efficient and effective communication of ideas between people, as with defining language, or conflict reduction at high population densities, as with defining morality. External constraints on language and morality leave a lot of degrees of freedom remaining, and I think it would be fair to say that there is no "truth" they're seeking, but that's not the case for factual knowledge.

"If you need to know the truth in a particular domain no amount of, or algorithm regarding, social information can ever tell you this; these can only tell you about identity."

I agree. Argument ad populum is a heuristic - an unreliable approximation process that works well enough to be worth using when computational or informational resources are limited. But a lot of people don't have an alternative, and Dan is interested in understanding the internal forces by which social consensus is developed and spread. It's an interesting scientific question in itself.

"Intuition means based upon feelings or instinct, *not reason*."

Intuition, in the sense we use it here, means implicit models of the world developed by subconscious reasoning. As toddlers, we all develop physical intuitions about forces and balance and motion, sufficient to stack blocks on top of one another. That understanding is developed by extremely sophisticated subconscious reasoning - to the point that ordinary teenagers can watch an approaching bowler/pitcher, solve a set of under-determined equations in projective geometry to map the image of the ball's trajectory on their retina into a 3D trajectory and get the bat moving to intercept it and place it precisely in the outfield; some of them fast enough to do it with a ball moving past them at 100 mph! Then another teenager can solve a similar set of differential equations to run and catch it!

There's nothing unreasonable or unsophisticated about intuition. Kids who can catch a ball couldn't explain the subconscious mathematics that enables them to do so. But the mathematics works, and very accurately in most cases.

"So if instinct can reliably determine which side is right regarding a socially conflicted science issue, while history tells us of many cases where the mainstream / scientific authority side has been wrong, then this instinct must be divining more than just identity, it must be divining correct science."

The problem is that we're dealing with an extremely effective and powerful algorithm being applied to an even more impossibly difficult problem. Because our understanding is subconscious, we tend to underestimate the difficulty of what we're trying to do.

What we're asking is for people to draw conclusions about questions with some very non-obvious answers on which they have no direct experience or direct evidence. (How many people have personally analysed the chemical constitution of the atmosphere?) They're required to weigh up multiple conflicting half-remembered sources of extremely indirect evidence, modeling the psychology and motivations of the people involved, and the chains of communication between them, to figure out the most likely answer, all within a few seconds. It's a miracle they do as well as they do!

I agree with you that it's a lot less reliable than Dan seems to think, and that his faith in Argument from Authority / argument ad populum / trust in Scientific Consensus is misplaced. People don't converge particularly reliably on the scientific truth even on uncontested subjects - but we only notice or care about that fact on the contested ones.

"For only for those who are not already culturally aligned, they can detect a culture even when it wears the mask of science."

Yes, and we all have a culture. So we can all see into the cultural blind spots of all the other cultures, and we're all blind to and unaware of our own cultural blind spots. Which is kinda Dan's main point.

Dan happens to believe in climate change, because that's part of his culture. We don't, because that's part of ours. We are both of us convinced that science is on our side. The effects are all symmetric, because we're all humans and are therefore all using the same basic cognitive algorithms and heuristics. Dan thinks that political culture in particular creates the blind spots and distortions that throw the algorithms off track. I personally think that any prior knowledge can do the same, and we only notice it when the political divide causes us to draw different conclusions from the same data. But it's still an open question.

June 18, 2017 | Unregistered CommenterNiV

NiV and Andy - you are underestimating both the genius of Newton and the power of intuition generally.

Eratosthenes first had the intuition that the earth is a sphere, tilted on its axis, and only THEN did he go out and calculate exact numbers (remarkably close to the numbers we have today). And Newton studied the old records not because he believed in magical incantations but because he wanted to see how knowledge was discovered, and why it could be lost for millennia, as the calculations of Eratosthenes were.

Bayesian networks are how big data is used to program AI, and the results are so far unimpressive - the bots lack any common sense because they lack intuition. As to heuristics, take a dog to a park!
www.bis.org/review/r120905a.pdf

>>>>>>>>>.Catching a frisbee is difficult. Doing so successfully requires the catcher to weigh a complex array of physical and atmospheric factors, among them wind speed and frisbee rotation. Were a physicist to write down frisbee-catching as an optimal control problem, they would need to understand and apply Newton’s Law of Gravity. Yet despite this complexity, catching a frisbee is remarkably common. Casual empiricism reveals that it is not an activity only undertaken by those with a Doctorate in physics. It is a task that an average dog can master. Indeed some, such as border collies, are better at frisbee-catching than humans. So what is the secret of the dog’s success? The answer, as in many other areas of complex decision-making, is simple. Or rather, it is to keep it simple. For studies have shown that the frisbee-catching dog follows the simplest of rules of thumb: run at a speed so that the angle of gaze to the frisbee remains roughly constant.<<<<<<<<<<<<<<
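Read as a constant-bearing pursuit rule, the dog's heuristic can be simulated in a few lines. This is a toy 2-D construction of mine, not anything from the speech being quoted: a "frisbee" gliding at constant velocity, and a "dog" that spends just enough of its fixed running speed sideways to stop the gaze angle rotating, putting the rest into closing along the line of sight.

```python
import math

# Toy constant-bearing interception: hold the gaze angle steady and run.
fx, fy = 0.0, 10.0        # frisbee position (m)
vfx, vfy = 2.0, 0.0       # frisbee velocity (m/s)
dx, dy = 0.0, 0.0         # dog position (m)
dog_speed = 5.0           # dog's constant running speed (m/s)
dt = 0.01

bearing0 = math.atan2(fy - dy, fx - dx)
min_dist = math.hypot(fx - dx, fy - dy)
max_drift = 0.0           # how much the gaze angle wanders en route

for _ in range(int(10 / dt)):
    ex, ey = fx - dx, fy - dy
    dist = math.hypot(ex, ey)
    min_dist = min(min_dist, dist)
    if dist > 1.0:  # measure drift only while the geometry is well-defined
        max_drift = max(max_drift, abs(math.atan2(ey, ex) - bearing0))
    if dist < 0.05:
        break
    ux, uy = ex / dist, ey / dist          # unit line-of-sight
    px, py = -uy, ux                       # unit perpendicular
    w = vfx * px + vfy * py                # frisbee's cross-LOS speed
    # Match the cross-LOS speed so the bearing never rotates; spend the
    # rest of the speed budget closing along the line of sight.
    along = math.sqrt(dog_speed ** 2 - w ** 2)
    dvx, dvy = w * px + along * ux, w * py + along * uy
    dx, dy = dx + dvx * dt, dy + dvy * dt
    fx, fy = fx + vfx * dt, fy + vfy * dt

print(f"closest approach: {min_dist:.3f} m")
print(f"max gaze-angle drift: {max_drift:.5f} rad")
```

The dog never solves for the landing point; holding the angle constant makes the relative motion lie entirely along the line of sight, so the pursuit ends in an interception - which is the point of the rule-of-thumb argument.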

To sum up, I think Dan is right and that you both are adding complexity by confusing proof of the existence of an optimum with the assumption that the optimum is also calculable - hardly ever the case in real-world problems. Not to forget those other Everett worlds - one exasperated physicist correctly noted that anyone who believes in their existence should start playing Russian roulette for high stakes, since in some of those other worlds he is sure to be alive and very rich. Experiment always trumps theory.

June 18, 2017 | Unregistered CommenterEcoute Sauvage

>>There's a social consensus that 2+2 = 4.

There is not. This is manifest. Ice age peoples made counting marks on bones, and likely long before that could count on their fingers, as can most everyone in the modern world. Much that is far more complex is likewise manifest. Tower blocks stay up, shells hit their targets over many miles, rockets land on the moon, cars run. The maths that determines these things is likewise obvious to all. Where things are not so obvious, neither are they socially contentious in the great majority of cases, so likewise no social consensus forms regarding specific issues (there is, as noted in this thread, nevertheless a net trust in science generically, which *is* a social contract). Such consensuses formed long before science and are emergent phenomena that create arbitrary and artificial 'certainty' in the face of uncertainty. Religions are good examples, plugging the hole of uncertainty about who we are, where we came from, and what happens to us next. Where science has significant uncertainty, or even perceived uncertainty, in domains that for whatever reason become socially contentious, a social consensus along the same lines is likely to form. Sometimes it will not last, sometimes it will. Sometimes social inertia means it will live on in some of the population long after the original uncertainty has been removed. Whether entangled with science issues or not, such consensuses are always wrong. Their typically emotive stories 'fill in' for the unobtainable truth.

>>Social consensus is a consequence of both internal and external forces...

I don't know what you mean by that. Social consensus is the result of an iterative process across society, at the heart of which is selection. I'm not talking about consensus in terms of, say, a jury output, but the output on a larger scale; cultural consensus is the better term. The motivation of the participating individuals is not necessary to know over a large enough number. The consensus is emergent, and occurs as the narratives with the highest selection value rise to the top. Long-evolved policing mechanisms emerge to reinforce the winning narrative.

>>...of them fast enough to do it with a ball moving past them at 100 mph!

Instincts that evolved for balls and all other directly experienced physical phenomena are not useful or applicable to what we didn't evolve with directly, i.e. the hidden questions of science.

>>There's nothing unreasonable or unsophisticated about intuition.

I didn't say intuition was unsophisticated, and it is not 'unreasonable' in the sense that it fulfils a specific and valid evolutionary function in social situations. But it is not reason.

>>Kids who can catch a ball couldn't explain the subconscious mathematics that enables them to do so.

This mathematics is not accessible for anything other than catching balls or similar tasks. It comes from an analogue engine highly constrained and optimized for the (many) direct physical challenges we humans face. It has no bearing on applying reason to abstract problems. We have great reasoning capability too, of course, but I presume you are not suggesting that our ball-catching abilities can feed this reason to inform us about which contentious science issues may be correct.

>>The problem is that we're dealing with an extremely effective and powerful algorithm being applied to an even more impossibly difficult problem.

The problems are indeed extremely difficult, impossibly so where sufficient evidence is not yet forthcoming. But it doesn't matter how powerful or sophisticated our intuitive (i.e. not consciously reasoning) mental algorithms are. Applied to the public, for instance, to those who, as Dan says, lack 'even a rudimentary level of expertise': if their intuition could tell us who is right on contentious science issues, some of which the scientists themselves are genuinely conflicted about (whether or not the public knows this), then we wouldn't really need science. We'd just ask the more intuitive among us what the truth is, or indeed whether sufficient evidence yet exists. This is clearly wrong. As no scientific information is input, no possible computation can output anything about the science issue. Only social information is input, so only conclusions about identity can be output. The mind is no more than a very sophisticated computer and subject to the same basic limitations; it cannot form an answer for which it has no relevant input.

>>I agree with you that it's a lot less reliable than Dan seems to think, and that his faith in Argument from Authority / argument ad populum / trust in Scientific Consensus is misplaced.

Yes.

>>Yes, and we all have a culture. So we can all see into the cultural blind spots of all the other cultures, and we're all blind to and unaware of our own cultural blind spots. Which is kinda Dan's main point.

A point I agree with and have never challenged.

>>Dan happens to believe in climate change, because that's part of his culture. We don't, because that's part of ours. We are both of us convinced that science is on our side.

Although I'm far from convinced that skeptical climate science is on 'my side'. In that sense I don't have a side. I don't know whether ACO2 will work out to be good, bad, or indifferent, and I note many contentions and many good scientists on both sides, many convoluted arguments, and far too much complexity (despite a degree in physics, which barely arms me at all) to figure it out. What I can see from social data is that the narrative of certainty of imminent climate calamity, as advocated in the most emotive and urgent manner by virtually the entire western authority matrix from presidents and prime ministers on downwards, is a cultural narrative supported by neither skeptical nor orthodox climate science. Hence this narrative does not straddle the polarization in any case, and is wrong, like all cultural narratives and their associated consensuses.

June 18, 2017 | Unregistered CommenterAndy West

Ecoute:

I never mentioned Newton at any point. However, those intuitions that yielded to his voracious reasoning, via maths and experiment, became a foundation stone of our science. Those intuitions that did not yield to his voracious reasoning (e.g. the alchemical papers) remained nonsense, despite years of effort and no doubt continued belief. Intuition is just as likely to be wrong as right on matters scientific; reason filters the one from the other.

>>...you both are adding complexity by confusing proof of the existence of an optimum with the assumption that the optimum is also calculable

I can't speak for Niv, but I'm doing no such thing.

June 19, 2017 | Unregistered CommenterAndy West

"And Newton studied the old records not because he believed in magical incantations but because he wanted to see how knowledge was discovered, and why it could be lost for millennia, as the calculations of Eratosthenes were."

Don't think so. I think he was a lover of occult knowledge, in the original sense of "hidden, known only to a few". He studied mathematics and geometry for the same reason. Given that pretty much everyone at the time was very religious, and believed in all sorts of supernatural effects as parts of the real world, it would be surprising if people intent on picking apart how the universe works hadn't studied magic.

It was only as a result of their failure to find magic that worked that we eventually dropped it. We like to make up anachronistic stories about the pioneer scientists as if they were people with a totally modern perspective dropped into the past, inventing/presenting science as a polished and complete worldview. It wasn't like that. Kepler's 'Harmonies of the World' is packed with mystic drivel. Newton juxtaposed the mathematics in Principia with little commentaries on God's plan. That was how they saw the world. The universe operated according to hidden, supernatural rules, and mathematics was just one way of trying to find out what they were. Study of hidden meanings in ancient writings was another. They were the same thing - the geometry of the Greeks was just one of those ancient sources of arcane knowledge.

That doesn't in any way take away from Newton's genius. But he was still a man of his times.

"Or rather, it is to keep it simple. For studies have shown that the frisbee-catching dog follows the simplest of rules of thumb: run at a speed so that the angle of gaze to the frisbee remains roughly constant."

And how do you know that's the right rule to follow?
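For what it's worth, the gaze-angle rule quoted above is concrete enough to simulate. Here is a minimal, hypothetical sketch (drag-free ballistic flight, a pursuer allowed whatever ground speed the rule demands, all numbers invented for illustration) suggesting that holding the gaze elevation angle constant does carry the pursuer to the landing point:

```python
import math

def chase(v0=8.0, launch_deg=60.0, x0=3.0, z0=2.0, dt=0.001):
    """Toy 2D simulation of the constant gaze-angle heuristic.

    A drag-free 'frisbee' follows a ballistic arc; the 'dog' starts at the
    origin and picks its ground speed so that the elevation angle of its
    gaze to the frisbee stays constant (d(theta)/dt = 0).  Holding the
    angle constant forces the horizontal gap to zero as the frisbee lands.
    """
    g = 9.81
    vx = v0 * math.cos(math.radians(launch_deg))   # frisbee horizontal speed
    vz0 = v0 * math.sin(math.radians(launch_deg))  # frisbee launch vertical speed
    t, x_d = 0.0, 0.0
    x_f, z_f = x0, z0
    while z_f > 0.0:
        vz = vz0 - g * t
        gap = x_f - x_d
        # Setting d/dt [ z_f / gap ] = 0 gives the required dog speed:
        #   v_d = vx - vz * gap / z_f
        # (negative early in the flight, so the dog simply waits).
        v_d = max(0.0, vx - vz * gap / z_f)
        x_d += v_d * dt
        t += dt
        x_f = x0 + vx * t
        z_f = z0 + vz0 * t - 0.5 * g * t * t
    return x_d, x_f  # dog position and frisbee landing point, in metres

dog, landing = chase()
print(f"dog at {dog:.2f} m, frisbee lands at {landing:.2f} m")
```

Note that the rule requires no trajectory prediction at all: the needed speed falls straight out of demanding that the angle not change, which is what makes it a plausible dog heuristic in the first place.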

"Not to forget those other Everett worlds - one exasperated physicist correctly noted that anyone who believes in their existence should start playing Russian roulette with high stakes, since in some of those other worlds he is sure to be alive and very rich. Experiment always trumps theory."

That makes no sense. Besides the obvious point that the winner would know that in other 'worlds' he is sure to be dead, and so by the exact same reasoning claimed here ought to be very unhappy about the win, it's intuitively obvious that your other selves are effectively different people. Questions of identity are more complicated in an Everett-Wheeler interpretation, but what we experience is analogous to that.

The point about Everett-Wheeler is that where people thought theory and experiment were in conflict (we only see one world, not many), Everett pointed out that this is exactly what the standard theory predicts we should experience. Theory and experiment are in perfect agreement, it's only the common misunderstanding of the theory that is in conflict with it.

June 19, 2017 | Unregistered CommenterNiV

NiV - the fact is and remains that nobody has taken the Russian roulette bet so far, and therefore the particular experiment involving that universal Schroedinger wave function has not been run. Whether running it would provide a solution to the continuum hypothesis https://www.ias.edu/ideas/2016/pires-hilbert-hotel I don't know. But I do think this is right:
https://www.aps.org/publications/apsnews/200905/physicshistory.cfm
"...As Everett wrote in his original 1957 dissertation: “Once we have granted that any physical theory is essentially only a model for the world of experience, we must renounce all hope of finding anything like the correct theory … simply because the totality of experience is never accessible to us.”..." Of course theories are models, what else could they be?

Not to digress from Dan's topic here, but what do you mean when you say that you, Andy, I, and anyone else questioning that climate change "consensus" belong to one culture, while Dan and all those holding the opposing view belong to a different one? I just can't see the criterion you use. Thank you.

June 19, 2017 | Unregistered CommenterEcoute Sauvage

PS

"....And how do you know that's the right rule to follow?...."

Well I apologize to Andy for adding his name here >>

" "....you both are adding complexity by confusing proof of the existence of an optimum with the assumption that the optimum is also calculable...."
"I can't speak for Niv, but I'm doing no such thing." "

>> but isn't this exactly what you are doing? I don't claim the border collie has calculated an optimum, just that the simple rule works as a dog heuristic.

June 19, 2017 | Unregistered CommenterEcoute Sauvage

Dan - these are 2 articles on climate change countermeasures:

http://www.scragged.com/articles/incendiary-truths
https://www.technologyreview.com/s/608059/why-bad-things-happen-to-clean-energy-startups/

June 19, 2017 | Unregistered CommenterEcoute Sauvage

Ecoute:

>> but isn't this exactly what you are doing?

No, I'm not doing anything at all wrt your question, because...

>>I don't claim the border collie has calculated an optimum, just that the simple rule works as a dog heuristic.

...it doesn't matter what you claim for your dog heuristic, how optimal it may or may not be, or whether your rule works. The whole issue is not relevant to the thread; nor have I mentioned anything about optima, or proofs of such, regarding these things anyhow. These types of skills in both dogs and humans evolved as an engine to deal with direct physical interactions over hundreds of millions of years, and are not transferable to things they didn't evolve to deal with, such as the abstract questions of science. I presume you would not ask your dog, or even a fantastic baseball player, to adjudicate for us on, say, climate change, as though their superlative ball skills implied enhanced reliability at deducing such issues. I have no idea where you are going with the whole dog/frisbee thing.

June 19, 2017 | Unregistered CommenterAndy West

Andy - sorry for the confusion; I should have made it clearer that the PS was a continuation of the prior message to NiV and had nothing to do with anything you said. Again, I regret the confusion.

June 20, 2017 | Unregistered CommenterEcoute Sauvage

Ecoute:

ah... ok. Thanks.

June 20, 2017 | Unregistered CommenterAndy West

PS (to Andy!)
I'm posting here (and everywhere on the open internet) under a pseudonym: "ecoute sauvage" means "illegal wiretap", but it sounds so much better in French - well most things do. I have no access to this account from my work computer (too many firewalls) or my phone (ditto) or I would have written to correct my bad edits sooner.

June 20, 2017 | Unregistered CommenterEcoute Sauvage

More on the climate wars - Van Jones, Mark Ruffalo, and Bernie Sanders now have modeling expertise:

https://www.technologyreview.com/s/608126/in-sharp-rebuttal-scientists-squash-hopes-for-100-percent-renewables/
"....Various political and advocacy figures have embraced Jacobson’s ideas.[....] He also cofounded a clean-energy advocacy group, the Solutions Project, whose board members include actor and activist Mark Ruffalo and commentator Van Jones. In late April, Senator Bernie Sanders co-wrote an op-ed with Jacobson in the Guardian, highlighting the 50-state research....."

June 20, 2017 | Unregistered CommenterEcoute Sauvage

Ecoute: if you are going to talk about a policy advocacy group, of course it ought to contain politicians and other thought leaders who are not themselves scientists. There are many such groups with many different compositions and positions. https://www.bloomberg.com/news/articles/2017-06-20/exxonmobil-and-stephen-hawking-just-agreed-to-the-same-climate-fix

The pertinent issue here is how to keep discussions with and between such groups science-based. It is quite likely that they could still do that with quite different policy objectives, all based on the same best available science.

June 20, 2017 | Unregistered CommenterGaythia Weis
