Tuesday
Dec 26, 2017

Still more entries in "Cultural Cognition Dictionary/Glossary (whatever)"

This--creating a dictionary/glossary of terms used in the study of cultural cognition--is kind of fun. So I'll add terms whenever the mood strikes me. I've arranged the new entries for today in a sort of thematic order.  In the new page that houses the growing number of dictionary/glossary entries, however, everything is alphabetical (I'll likely add cross-reference links where one term is best understood in relation to one or more other ones).

Secular harm. Refers to a set-back to interest the nature of which is independent of assent to any culturally partisan conception of the best way to live. Principal examples include damage to individuals’ physical security and impediments to their apprehension of collective knowledge. Precisely because such harms can be experienced universally by citizens of diverse cultural identities, protecting citizens from such set-backs is a legitimate end for law in a liberal state. [Sources: Rawls, Political Liberalism 175, 217-18 (1993); Mill, On Liberty, ch. 1 (1859). Date added: Dec. 26, 2017.]

Sectarian harm. Refers to a set-back to interest the nature of which is dependent on assent to a partisan conception of the best way to live. A principal example is the offense individuals experience when they are exposed to behavior that expresses commitments to values alien to theirs. Precisely because such harms depend on—cannot be defined independently of—adherence to a particular conception of the best life, using law to avert or remedy them is illegitimate in a liberal state. [Source: Mill, On Liberty, ch. 1 (1859). Date added: Dec. 26, 2017.]

Cognitive illiberalism. Refers to a tendency to selectively impute cognizable secular harms to behavior that generates non-cognizable sectarian harms. Such a tendency is unconscious and hence invisible to the actor whose information-processing capabilities have been infected by it. Indeed, the bias that cognitive illiberalism comprises can subvert a decisionmaker’s conscious, genuine intent to exercise legal authority consistent with liberal ideals. [source: Kahan, Hoffman & Braman, Harv. L. Rev., 122, 837-906 (2009); Kahan, Hoffman, Braman, Evans & Rachlinski, Stan. L. Rev., 64, 851-906 (2012). Date added Dec. 26, 2017.]

Cognitively illiberal state. Refers to a liberal political regime pervaded—and hence subverted—by institutions and laws that reflect the unconscious tendency of legal and political decisionmakers to impute secular harms to behavior that imposes only sectarian ones. [source: Kahan, Stanford L. Rev. 60:115-54 (2007). Date added Dec. 26, 2017.]

Saturday
Dec 23, 2017

Weekend update: more "Cultural Cognition Dictionary/Glossary"

Cultural Cognition Dictionary (or Glossary, whatever)

Note: this is part of a document under construction. New terms will be added intermittently during periods in which there is nothing else to do or in which there is something else to do and hence an opportunity to engage in creative procrastination.

Cultural cognition thesis. The conjecture that culture is prior to fact in debates over contested societal risks and related facts. Culture is prior not just in the normative sense that cultural values guide action conditional on beliefs about states of affairs; it is also prior in the positive sense that cultural commitments, through a variety of mechanisms, shape what individuals believe the relevant facts to be. [source: Kahan, Slovic, Braman & Gastil, Harvard Law Review 119, 1071-1109 (2006), p. 1083. Date added Dec. 23, 2017].

Identity-protective reasoning.  The tendency of individuals to selectively credit and dismiss factual assertions in a manner that reflects and reinforces their cultural commitments, thereby expressing affective orientations that secure their own status within cultural groups. [source: Kahan, Slovic et al., J. Empirical Legal Studies, 4, 465-505 (2007). Date added Dec. 23, 2017]

The “knowledge deficit fallacy.” A theory (either explicit or implicit, conscious or unconscious) that treats simple unfamiliarity with facts as the cause of the public’s failure to converge on the best available scientific evidence on human-caused climate change, human evolution, the safety of nuclear power generation, etc. The theory also assumes (explicitly or implicitly, consciously or unconsciously) that simple communication of the best available evidence will dispel public conflict over facts.  [Date added Dec. 19, 2017]

The “ ‘knowledge deficit fallacy’.”  A theory (either explicit or implicit, conscious or unconscious) that treats simple unfamiliarity with the “knowledge deficit fallacy” as the cause of science communicators’ failure to converge on the best available scientific evidence on how to communicate human-caused climate change, human evolution, the safety of nuclear power generation, etc. The theory also assumes (explicitly or implicitly, consciously or unconsciously) that simple communication of the best available evidence on science communication will dispel science communicators’ reliance on the knowledge deficit theory. [added Dec. 19, 2017]

“Motivated System 2 reasoning.” A summary of the empirical research finding that as individuals’ cognitive proficiencies (measured by a wide variety of critical thinking assessments, including the Cognitive Reflection Test, Numeracy, Actively Open-minded Thinking, and the Ordinary Science Intelligence assessment) increase, so does their tendency to display identity-protective reasoning in their perception of relevant facts. [source: Kahan, Landrum et al., Advances in Pol. Psych., 38: 179-199, pp. 181-182 (2017). Date added Dec. 23, 2017].

MS2R. An abbreviation for “Motivated System 2 reasoning.” [source: Kahan, Landrum et al., Advances in Pol. Psych., 38: 179-199, p. 182; Cultural Cognition Blog, passim. Date added Dec. 23, 2017].

 

Thursday
Dec 21, 2017

Great stocking stuffers--Politically Motivated Reasoning Paradigm, parts 1 & 2!

Buy now to ensure arrival before 12/25!


Wednesday
Dec 20, 2017

Motivated System 2 Reasoning (MS2R) ... a fragment

from something I'm working on 

2. Background

2.1. MS2R in general. Where expert and lay judgments of risk diverge, cultural polarization, not mere confusion, is the most conspicuous feature of public opinion. Any satisfactory explanation of the public’s failure to assent to scientific consensus in these instances, then, must account for public dissensus among individuals of diverse cultural identities (Kahan, Braman, Cohen, Gastil & Slovic 2010).

One widespread account of this kind is rooted in dual process reasoning theory.  According to this view, accurate perception of risk and like facts demands the consistent and sustained use of conscious and effortful “System 2” reasoning, a form of information processing associated with expert judgment.  Members of the public are obviously not experts. Because they lack the time, knowledge, and mental discipline that System 2 reasoning demands, members of the public are forced to resort to a heuristic substitute that is rapid, intuitive, and emotion-laden (Kahneman & Frederick 2005). “What do people like me think?” (myside bias) is one of the unconscious heuristics associated with this type of “system 1” information processing (Baron 1995).  As a result, overreliance on System 1 reasoning not only generates error but also creates a correlation between error and membership in one or another identity-defining group (Sunstein 2003, 2007).

Basic observational data, however, is inconsistent with this account. If over-reliance on heuristic reasoning explained why the average member of the public was out of synch with scientific experts, then we’d expect conflict—between experts and the public, and also between different public factions—to lessen as individuals became more disposed to rely instead on conscious, effortful information processing. But that’s not what we see; indeed we observe the opposite: correlational data consistently show that political polarization, far from abating, increases in lockstep with cognitive reflection (Kahan 2013; Kahan & Stanovich 2016), actively open-minded thinking (Kahan & Corbin 2016), science comprehension (Kahan, Peters et al. 2012), and like capacities and aptitudes.
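To make that correlational claim concrete, here is a minimal sketch (hypothetical file and variable names; not the code or data from the cited studies) of the kind of interaction model that would detect polarization growing with cognitive reflection:

```python
# Illustrative only: tests whether the left-right gap in risk perception
# widens as Cognitive Reflection Test scores rise (an outlook x CRT interaction).
# The file name and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survey.csv")
# risk_perception: e.g., perceived climate-change risk (0-10)
# conservrepub:    right-leaning political outlook scale (z-scored)
# crt:             Cognitive Reflection Test score (z-scored)
model = smf.ols("risk_perception ~ conservrepub * crt", data=df).fit()
print(model.summary())
# On the MS2R account, the conservrepub:crt coefficient should be negative and
# sizable: higher reflection goes with a larger gap between left and right,
# not a smaller one.
```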

This pattern suggests an alternative theory of public risk-perception and cultural conflict. On this account, instead of using their cognitive proficiencies to discern the truth, individuals disposed to, and capable of, System 2 reasoning can be expected to use their cognitive proficiencies to conform their beliefs to the ones that have come to signify membership in a particular cultural group (Stanovich 2013; Kahan 2013).

This form of reasoning is perfectly rational at the individual level. Individuals have stakes in both being protected from societal risks and being judged socially competent and trustworthy by their peers. But it is only the latter that is affected by their own personal beliefs. Forced to choose between “getting it right” from a scientific perspective and being who they are from a cultural one (Kahan 2015), individuals can be expected by and large to pick the latter (not consciously, but unconsciously, as a result of habits of mind that conduce to their well-being).

Collectively, however, this distinctively expressive form of information processing is irrational. Where democratic citizens all form their perceptions in this way, they are less likely to converge on best evidence of the dangers they all face.  This prospect, however, does not change the advantages that any individual obtains by forming and persisting in group-affirming beliefs. 

This is the tragedy of the science communications commons, the dispelling of which is one of the primary aims of the science of science communication (Kahan, Peters et al. 2012).
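A toy numerical illustration of that individual/collective asymmetry (the numbers are mine, invented for exposition; nothing here comes from the cited papers):

```python
# One citizen's belief has essentially no effect on what policy gets adopted,
# but a real effect on her standing within her cultural group.
p_pivotal = 1e-7                        # hypothetical chance one person's belief changes policy
harm_from_bad_policy = 1_000_000        # hypothetical stakes of society getting the risk wrong
cost_of_group_incongruent_belief = 100  # hypothetical social cost of "heresy"

expected_gain_from_truth_seeking = p_pivotal * harm_from_bad_policy  # = 0.1
print(expected_gain_from_truth_seeking < cost_of_group_incongruent_belief)  # True
# Individually, "being who you are" dominates "getting it right"; summed over
# everyone, society forgoes convergence on the best available evidence.
```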

2.2. Motivated numeracy in particular. * * *

References

Baron, J. (1995). Myside bias in thinking about abortion. Thinking & Reasoning, 1(3), 221-235. doi: 10.1080/13546789508256909

Kahan, D. M. (2015). Climate-Science Communication and the Measurement Problem. Advances in Political Psychology, 36, 1-43. doi: 10.1111/pops.12244

Kahan, D. M. (2013). Ideology, Motivated Reasoning, and Cognitive Reflection. Judgment and Decision Making, 8, 407-424.

Kahan, D. M., & Corbin, J. C. (2016). A note on the perverse effects of actively open-minded thinking on climate-change polarization. Research & Politics, 3(4). doi: 10.1177/2053168016676705

Kahan, D. M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L. L., Braman, D., & Mandel, G. (2012). The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Clim. Change, 2(10), 732-735.

Kahan, D., Braman, D., Cohen, G., Gastil, J., & Slovic, P. (2010). Who Fears the HPV Vaccine, Who Doesn’t, and Why? An Experimental Study of the Mechanisms of Cultural Cognition. Law and Human Behavior, 34(6), 501-516. doi: 10.1007/s10979-009-9201-0

Kahan, Dan M. and Stanovich, Keith E., Rationality and Belief in Human Evolution (September 14, 2016). Annenberg Public Policy Center Working Paper No. 5. Available at SSRN: https://ssrn.com/abstract=2838668.

Kahneman, D., & Frederick, S. (2005). A model of heuristic judgment. In K. J. Holyoak & R. G. Morrison (Eds.), The Cambridge handbook of thinking and reasoning (pp. 267-293). Cambridge University Press.

Stanovich, K. E. (2013). Why humans are (sometimes) less rational than other animals: Cognitive complexity and the axioms of rational choice. Thinking & Reasoning, 19(1), 1-26.

Sunstein, C. R. (2007). On the Divergent American Reactions to Terrorism and Climate Change. Columbia Law Review, 107, 503-557. 

Tuesday
Dec 19, 2017

"Knowledge deficit theory^2": a definition

From the Cultural Cognition Dictionary (Mockingbird Univ. Press, forthcoming):

The “knowledge deficit fallacy.” A theory (either explicit or implicit, conscious or unconscious) that treats simple unfamiliarity with facts as the cause of the public’s failure to converge on the best available scientific evidence on human-caused climate change, human evolution, the safety of nuclear power generation, etc. The theory also assumes (explicitly or implicitly, consciously or unconsciously) that simple communication of the best available evidence will dispel public conflict over facts. 

* * *

The “ ‘knowledge deficit fallacy’.”  A theory (either explicit or implicit, conscious or unconscious) that treats simple unfamiliarity with the “knowledge deficit fallacy” as the cause of science communicators’ failure to converge on the best available scientific evidence on how to communicate human-caused climate change, human evolution, the safety of nuclear power generation, etc. The theory also assumes (explicitly or implicitly, consciously or unconsciously) that simple communication of the best available evidence on science communication will dispel science communicators’ reliance on the knowledge deficit theory.

Thursday
Dec 14, 2017

"Gateway belief" illusion--published and critiqued (?)

Get your copy now before it sells out!

 

VLFM, minus F, "respond"; 100 CCP points, redeemable in CCP gift shop, to anyone who can explain what cultural cognition has to do with the critique of VLFM for not reporting their control condition data.

Saturday
Dec 9, 2017

A draw in the “asymmetry thesis meta-analysis” steel-cage match? Nope. It’s a KO.

As the 14 billion regular subscribers to this blog know all too well, I’ve been discussing the so-called “asymmetry thesis” (AT) on this site (and in published papers [Kahan 2013]) for approximately 65 years now.

AT posits that the impact of ideologically motivated reasoning is asymmetric in relation to so-called “liberal” and “conservative” orientations. Conservatives, AT proponents maintain, are substantially more vulnerable to this form of biased information processing than are liberals (e.g., Jost et al 2003).

What about AT opponents? What do they say?

Well, I don’t recall any empirical researcher who asserts that liberals are more biased than conservatives (maybe motivated reasoning is causing me to overlook or just not recall such research).

Rather, AT opponents contend politically motivated reasoning is uniform—i.e., symmetric—across the conventional left-right spectrum.  So let’s call this position “ST” for “symmetry thesis.”

The fight between AT and ST looks like the kind of dispute that ought to be adjudicated by meta-analysis.  And in fact, in the last 6 mos. or so, we’ve been treated to two meta-analytic investigations, one by John Jost (2017) and another by Pete Ditto & a large contingent of collaborators (in press).

The problem, however, is that Jost and Ditto et al. appear to strongly disagree with one another about what their massive literature surveys imply.

Jost reports finding approximately 280 studies involving almost 400,000 subjects. From the “need for closure” to “dogmatism” to “self-deception”—the self-report measures featured in these studies support the conclusion that conservatives are more biased than are liberals.

Meanwhile, Ditto et al. report the results from 51 experiments, comprising 18,000 subjects. Their conclusion? That “there was strong support for the symmetry hypothesis: liberals (r = .235) and conservatives (r = .255) showed no difference in mean levels of bias across studies”—a compelling affirmation of ST over AT.***
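For the curious, here is a rough back-of-the-envelope comparison of those two correlations using the standard Fisher z transform. The per-side Ns are my own round-number assumption (roughly half of the ~18,000 subjects each); Ditto et al.'s actual meta-analysis, with proper study-level weighting, is of course the authoritative treatment.

```python
# Rough check, not Ditto et al.'s analysis: how far apart are r = .235 and
# r = .255, given samples of this rough size?
import math
from scipy.stats import norm

def fisher_z(r):
    """Fisher z transform of a correlation coefficient."""
    return 0.5 * math.log((1 + r) / (1 - r))

r_lib, r_con = 0.235, 0.255
n_lib, n_con = 9000, 9000  # hypothetical split of the ~18,000 subjects

z = (fisher_z(r_con) - fisher_z(r_lib)) / math.sqrt(1 / (n_lib - 3) + 1 / (n_con - 3))
p = 2 * (1 - norm.cdf(abs(z)))
print(round(z, 2), round(p, 3))
# With these assumed Ns the gap is small relative to its standard error
# (z is about 1.4), consistent with the "no meaningful difference" reading.
```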

So now what? Do we just throw up our hands and give up?

The answer is no. It turns out that Jost’s and Ditto et al.’s results can be reconciled pretty easily. All one has to do is examine what they were measuring and how.

Jost’s meta-analysis was based on survey data correlating conservatism and various measures of cognitive style.  Jost did not present any meta-analytic data on motivated-reasoning experiment results.

That’s what Ditto et al. measured.  They included in their sample, moreover, only experimental studies that conformed to the Politically Motivated Reasoning Paradigm (“PMRP”). PMRP identifies a method specifically crafted to avoid the myriad confounds that can rob a study of politically motivated reasoning of its validity (Flynn et al. 2017; Johnston & Ballard 2016; Kahan 2016a).  Focusing on studies that meet the PMRP standard, Ditto et al. conclude that liberals and conservatives were equally vulnerable to politically motivated reasoning.

More or less as an aside, Jost does refer to several experimental studies in his paper. But he doesn’t say anything about the criteria he used for singling them out, much less about whether they were consistent with PMRP.

Indeed, it’s clear that the main criterion Jost used to flag these particular experimental studies was that they reached a result congenial to his hypothesis.  We can tell that he resorted to cherry-picking of this sort* because he didn’t cite a single one of the myriad experimental studies that suggest that liberals are as prone to ideologically motivated cognition as conservatives.  We know there are many studies like that because plenty of them were featured in Ditto et al., an earlier version of which is in fact cited by Jost.****

There’s no reason, though, to doubt that Jost used appropriate criteria, applied with appropriate impartiality and care, to select studies that report the relationship between liberal-conservative ideology and one or another self-report measure of cognitive style.

But that only makes things worse for AT.  For notwithstanding the preponderance of evidence that conservatism is associated with a closed-minded style based on “epistemic” self-report  measures, Ditto et al. demonstrate that liberals are every bit as likely to succumb to politically motivated reasoning when one tests partisans’ information processing experimentally. This combination of results, then, implies that the self-report measures Jost analyzes are externally invalid indicators of what we actually care about—viz., how individuals of opposing political outlooks actually process information.

The only objective reasoning-style disposition that Jost reports on is the Cognitive Reflection Test (CRT), on which liberals, according to Jost, have a modest performance advantage over conservatives.

But here, too, Jost’s fixation on correlational studies and his resolute disregard for experimental ones undermine his conclusions. MS2R—“motivated system 2 reasoning”—describes the tendency of those who score highest on objective measures of cognitive proficiency (including not only CRT but also Numeracy and Ordinary Science Intelligence) to display more bias, not less, when they process political information (Kahan 2016b).

Thus, if we take Jost’s compilation of studies featuring CRT at face value, his finding that liberals score higher on it is a reason to infer that liberals are more vulnerable, not less, to politically motivated reasoning than are conservatives.

But we shouldn’t do this.

If one is trying to figure out who is more disposed to process political information in a biased manner—conservatives or liberals—one should examine how they actually reason.

Ditto et al. do this.  Jost doesn’t.

Thus, the “meta-analysis steel-cage match” was no tie. 

On the contrary, it was a knock-out victory for ST over AT.

Refs

Ditto, P. H., Liu, B., Clark, C. J., Wojcik, S. P., Chen, E. E., Grady, R. H., & Zinger, J. F. (in press). At Least Bias Is Bipartisan: A Meta-Analytic Comparison of Partisan Bias in Liberals and Conservatives. Perspectives on Psychological Science. Working paper available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2952510.

Flynn, D. J., Nyhan, B., & Reifler, J. (2017). The Nature and Origins of Misperceptions: Understanding False and Unsupported Beliefs About Politics. Political Psychology, 38, 127-150. doi: 10.1111/pops.12394

Johnston, C. D., & Ballard, A. O. (2016). Economists and Public Opinion: Expert Consensus and Economic Policy Judgments. The Journal of Politics, 78(2), 443-456. doi: 10.1086/684629

Jost, J. T. (2017). Ideological Asymmetries and the Essence of Political Psychology. Political Psychology, 38(2), 167-208.

Jost, J. T., Glaser, J., Kruglanski, A. W., & Sulloway, F. J. (2003). Political Conservatism as Motivated Social Cognition. Psych. Bull., 129(3), 339-375.

Kahan, D. M. (2013). "Ideology, Motivated Reasoning, and Cognitive Reflection." Judgment and Decision Making 8: 407-424.

Kahan, D. M. (2016a). The politically motivated reasoning paradigm, part 1: What politically motivated reasoning is and how to measure it. Emerging Trends in the Social and Behavioral Sciences: An Interdisciplinary, Searchable, and Linkable Resource.

Kahan, D. M. (2016b). The politically motivated reasoning paradigm, part 2: Open questions. Emerging Trends in the Social and Behavioral Sciences: An Interdisciplinary, Searchable, and Linkable Resource.

* John convinced me that the stricken language comes across as asserting that he engaged in wrongdoing, which is not what I meant to assert.  My point is that he cites the experiments in question for illustration, not for proof that experimental studies show the asymmetry that he reports for cognitive-disposition measures.

** Not in original post.

*** Revised to reflect "in press" version of Ditto et al.

**** John still (reasonably) objects to the discussion of his treatment of experiments in the paper. I included that discussion only b/c I anticipated John would point out that he did look at experimental evidence too (albeit by non meta-analytic techniques). But the post doesn't require the relevant paragraphs  to make its points--none of which is to imply that John acted in bad faith.

Monday
Dec 4, 2017

Hey, want to know something? Science curiosity is a culturally random variable!

Friday
Dec 1, 2017

Dewey on curiosity & science comprehension

Wow . . . . (downloaded from here).

How We Think

John Dewey

1910, Boston: D.C. Heath & Co.; selections from Part One, “The Problem of Training Thought,” spelling and grammar modestly modernized

§1. Curiosity

The most vital and significant factor in supplying the primary material whence suggestion may issue is, without doubt, curiosity. The wisest of the Greeks used to say that wonder is the mother of all science. An inert mind waits, as it were, for experiences to be imperiously forced upon it. The pregnant saying of Wordsworth:

“The eye—it cannot choose but see; We cannot bid the ear be still;
Our bodies feel, where’er they be, Against or with our will”—

holds good in the degree in which one is naturally possessed by curiosity. The curious mind is constantly alert and exploring, seeking material for thought, as a vigorous and healthy body is on the qui vive for nutriment. Eagerness for experience, for new and varied contacts, is found where wonder is found. Such curiosity is the only sure guarantee of the acquisition of the primary facts upon which inference must base itself.

(a)  In its first manifestations, curiosity is a vital overflow, an expression of an abundant organic energy. A physiological uneasiness leads a child to be “into everything,”—to be reaching, poking, pounding, prying. Observers of animals have noted what one author calls “their inveterate tendency to fool.” “Rats run about, smell, dig, or gnaw, without real reference to the business in hand. In the same way Jack [a dog] scrabbles and jumps, the kitten wanders and picks, the otter slips about everywhere like ground lightning, the elephant fumbles ceaselessly, the monkey pulls things about.” The most casual notice of the activities of a young child reveals a ceaseless display of exploring and testing activity. Objects are sucked, fingered, and thumped; drawn and pushed, handled and thrown; in short, experimented with, till they cease to yield new qualities. Such activities are hardly intellectual, and yet without them intellectual activity would be feeble and intermittent through lack of stuff for its operations.

(b)  A higher stage of curiosity develops under the influence of social stimuli. When the child learns that he can appeal to others to eke out his store of experiences, so that, if objects fail to respond interestingly to his experiments, he may call upon persons to provide interesting material, a new epoch sets in. “What is that?” “Why?” become the unfailing signs of a child’s presence. At first this questioning is hardly more than a projection into social relations of the physical overflow which earlier kept the child pushing and pulling, opening and shutting. He asks in succession what holds up the house, what  holds up the soil that holds the house, what holds up the earth that holds the soil; but his questions are not evidence of any genuine consciousness of rational connections. His why is not a demand for scientific explanation; the motive behind it is simply eagerness for a larger acquaintance with the mysterious world in which he is placed. The search is not for a law or principle, but only for a bigger fact. Yet there is more than a desire to accumulate just information or heap up disconnected items, although sometimes the interrogating habit threatens to degenerate into a mere disease of language. In the feeling, however dim, that the facts which directly meet the senses are not the whole story, that there is more behind them and more to come from them, lies the germ of intellectual curiosity.

(c)  Curiosity rises above the organic and the social planes and becomes intellectual in the degree in which it is transformed into interest in problems provoked by the observation of things and the accumulation of material. When the question is not discharged by being asked of another, when the child continues to entertain it in his own mind and to be alert for whatever will help answer it, curiosity has become a positive intellectual force. To the open mind, nature and social experience are full of varied and subtle challenges to look further. If germinating powers are not used and cultivated at the right moment, they tend to be transitory, to die out, or to wane in intensity.

This general law is peculiarly true of sensitiveness to what is uncertain and questionable; in a few people, intellectual curiosity is so insatiable that nothing will discourage it, but in most its edge is easily dulled and blunted. Bacon’s saying that we must become as little children in order to enter the kingdom of science is at once a reminder of the open-minded and flexible wonder of childhood and of the ease with which this endowment is lost. Some lose it in indifference or carelessness; others in a frivolous flippancy; many escape these evils only to become incased in a hard dogmatism which is equally fatal to the spirit of wonder. Some are so taken up with routine as to be inaccessible to new facts and problems. Others retain curiosity only with reference to what concerns their personal advantage in their chosen career. With many, curiosity is arrested on the plane of interest in local gossip and in the fortunes of their neighbors; indeed, so usual is this result that very often the first association with the word curiosity is a prying inquisitiveness into other people’s business.

With respect then to curiosity, the teacher has usually more to learn than to teach. Rarely can they aspire to the office of kindling or even increasing it. Their task is rather to keep alive the sacred spark of wonder and to fan the flame that already glows. Their problem is to protect the spirit of inquiry, to keep it from becoming blasé from overexcitement, wooden from routine, fossilized through dogmatic instruction, or dissipated by random exercise upon trivial things.

Sunday
Nov 26, 2017

Clarendon Law Lectures 2017: what happened

When I was an infant academic, one of my senior colleagues advised me that if I devoted my first summer to mapping out all the classes for my upcoming fall course, I’d find out that I spent three months preparing for the first one. Each class thereafter, from the second until the last, would have to be planned the night before.

 He was right.                                                               

Now, if any future Clarendon Lecture invitee should happen to consult me, I’d advise her (or him) that if she attempts to use the entire interval between the invitation and the start of the series mapping out each of the three lectures,  she will discover that she spent 18 months preparing to deliver the first one. The remaining two lectures, she (or he)  will find out, will have to be prepared the night before.

 Or in any case, such was my experience.

After my first lecture, I realized that I had better abandon my plan for the second and prepare a new one to address in depth a theme persistently pursued by the audience questioners. Did I really have sufficient basis, they wanted to know, to infer that the difference between the culturally polarized responses of the general public and the unpolarized ones of judges in the “ ‘Ideology’ or ‘Situation Sense?’ ” (aka “They saw a statutory ambiguity”) study was attributable to the professionalization of the latter?  Maybe judges were more disposed to use “System 2” information processing (conscious, effortful, “slow”) rather than rely on “System 1” (intuitive, automatic, “fast”). Or perhaps judges differed from ordinary members of the public in some other form of critical reasoning.

So in the 22-hr interval that separated the first lecture from the second, I fashioned a new presentation addressing this issue.  It featured MS2R (“motivated system 2 reasoning”), a cognitive dynamic that rebuts the conjecture that differences in cognitive proficiency accounted for judges’ domain-specific immunity from identity-protective information processing. Indeed, if anything, before the study was conducted, this line of research might have led one to believe that judges, lawyers, and law students—to the extent that they do score higher on critical reasoning assessments—would actually display more, not less, bias in the “saw a statutory ambiguity” experiment.

I also introduced the audience to the Science Curiosity Scale. High scores on it, research suggests, do constrain polarization on societal risks and related policy-relevant facts.  But there was little reason, it seemed to me, to believe members of the legal profession are more science curious than members of the public generally.

Having made this change in focus for lecture 2, I had to revise the content of the final lecture as well.  For that one, I knit together compressed versions of the planned lecture 2 & lecture 3.  Accordingly, the audience was exposed to modest amounts of the “evidence rules impossibility theorem” and the “(real) realist program for the science of judging and adjudication.”

Audience questions and insights persisted. But the series had drawn to a close.

So you’ll have to watch for more engagement with the Clarendon Lecture audience here “tomorrow.”™

Lecture slides: No. 1, No. 2, No. 3.

Sunday
Nov 19, 2017

Weekend update: paradox of scientific knowledge dissemination in the liberal state

From The Cognitively Illiberal State, an early formulation of Popper's Revenge:

A popular theme in the history and philosophy of science treats the advancement of human knowledge as conjoined to the adoption of liberal democratic institutions. It is through incessant exposure to challenge that facts establish themselves as worthy of belief under the scientific method. Liberal institutions secure the climate in which such constant challenging is most likely to take place, both by formally protecting the right of persons to espouse views at odds with dominant systems of belief and by informally habituating us to expect, tolerate, and even reward dissent.

But at the same time that liberalism advances science, it also ironically constrains it. The many truths that science has discovered depend on culture for their dissemination: without culture to identify which information purveyors are worthy of trust, we’d be powerless to avail ourselves of the vast stores of empirical knowledge that we did not personally participate in developing. But thanks to liberalism, we don’t all use the same culture to help us figure out what or whom to believe. Our society features a plurality of cultural styles, and hence a plurality of cultural certifiers of credible information.

Again, the belief that science will inevitably pull these cultural authorities into agreement with themselves reflects unwarranted optimism. In accord with its own professional norms and in harmony with the social norms of a liberal regime, the academy tolerates and even encourages competitive dissent. As a result, cultural advocates will always be able to find support from seemingly qualified experts for their perception that what’s ignoble is also dangerous, and what’s noble benign.  States of persistent group polarization are thus inevitable—almost mathematically—as beliefs feed on themselves within cultural groups, whose members stubbornly dismiss as unworthy insights originating outside the group.

Because we have the advantage of science, we undoubtedly know more than previous ages about what actions to take to attain our collective wellbeing. But precisely because we tolerate more cultural diversity than they did, we are also confronted with unprecedented societal dissensus on exactly what to do. 
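To see why the excerpt treats persistent polarization as "almost mathematically" inevitable, here is a toy bounded-confidence simulation of my own devising (not from the paper): each group credits only claims coming from sources close to its current position, and a small initial gap hardens into a large, stable one even though both groups face the same mixed pool of expert claims.

```python
# Toy illustration of beliefs "feeding on themselves" within cultural groups.
import random

random.seed(1)
expert_claims = [random.choice([0.3, 0.7]) for _ in range(200)]  # mixed pool of expert positions
groups = {"A": 0.45, "B": 0.55}   # slightly different starting beliefs
credibility_window = 0.2          # claims farther than this from the group's view are dismissed

for claim in expert_claims:
    for name, belief in groups.items():
        if abs(claim - belief) <= credibility_window:      # only "congenial" experts are credited
            groups[name] = 0.9 * belief + 0.1 * claim      # modest update toward the credited claim

print({g: round(b, 2) for g, b in groups.items()})
# The initial 0.45 vs. 0.55 difference ends up at roughly 0.3 vs. 0.7:
# each group converges on the experts it already trusted.
```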

Friday
Nov 17, 2017

Where am I?... Part 2

Ummmm... this is a typical view of the podium when I give a talk...

  But you can watch/listen at https://www.youtube.com/watch?v=ktHtLIF8R6Q&feature=youtu.be.

Friday
Nov 17, 2017

Where am I?... part 1

Just wanted to reassure the 14 billion readers of this blog that I haven't been kidnapped by aliens; I'm simply busy preparing for this --

Drop by if you get a chance!

Tuesday
Nov 14, 2017

Science curiosity, not science literacy, is prime virtue in Liberal Republic of Science (here are my slides; see any glitches or mistakes?) 

Talking in a few hours here at Northwestern University. Basic message/title of presentation:  "Comprehension without curiosity is no virtue, and curiosity without comprehension no vice." Sums up the quadrillions of studies finding that cognitive proficiency magnifies political polarization and the less-than-a-year-old research suggesting that science curiosity helps to offset this perverse dynamic.

If you hurry & look through, you can still advise me on what to say up until about noon US eastern time!

Watch out for your ears-- we're ready for a fookin good show!

Wednesday
Nov 8, 2017

Midweek update: teaching criminal law--voluntary manslaughter

I usually start class (sessions of which are 120 mins. this semester at Harvard Law) with a mini-lecture that synthesizes the material and discussion from the immediately preceding class. The one below recaps voluntary manslaughter:

Voluntary manslaughter.  Last time we looked at voluntary manslaughter.  There are two formulations.  The common law version mitigates murder to manslaughter when an offender who intentionally kills does so in the heat of passion brought on by adequate provocation and without “cooling time.”  The Model Penal Code, in contrast, mitigates when a homicide that would be murder is committed as a result of an extreme emotional or mental disturbance for which there is a “reasonable excuse.”

On the first day of this course, I made the point that disputes about what the law means are frequently disputes about two things: (1) what it ought to mean; and (2) who ought to say what it means.  Our discussion of common law voluntary manslaughter yesterday nicely illustrated this.

What, for example, does “adequate provocation” mean?  Is adultery adequate provocation?  How about a same-sex overture?  The answer can’t be found in the plain meaning of the doctrine.  Rather, it must be constructed according to some theory about what the doctrine is all about.  And because it must be constructed someone must do the constructing.  So what ought the law mean and who ought to say?

We considered a number of specific theories about why the voluntary manslaughter doctrine exists.  I suggested that we call one the voluntarist view: impassioned killers are treated leniently, on this account, because passion compromises their volition, and thus reduces culpability for their acts.  The problem with this hypothesis, though, is that it can’t explain why there is a provocation requirement at all, much less why the provocation must be adequate.  As cases like Anderson illustrate, people don’t experience uncontrollable, homicidal impulses only when provoked.

keep reading

 

Sunday
Nov 5, 2017

Weekend update: does transparency help with this overplotting problem?

Another example of how to use transparency functionality of Stata 15.

Compare this ...

 ... with this:

Which one is better? Why? Other ideas?
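For readers who don't run Stata, here is a minimal Python/matplotlib analogue (simulated data; it does not reproduce the actual figures) of the basic move: the same scatter drawn with and without an alpha value.

```python
# Illustrative only: transparency (alpha) as a remedy for overplotting.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = 0.6 * x + rng.normal(scale=0.8, size=5000)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 4), sharex=True, sharey=True)
ax1.scatter(x, y, s=12, color="steelblue")              # opaque: the dense core is a solid blob
ax1.set_title("No transparency")
ax2.scatter(x, y, s=12, color="steelblue", alpha=0.15)  # alpha lets point density show through
ax2.set_title("alpha = 0.15")
plt.tight_layout()
plt.show()
```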

 

Friday
Nov 3, 2017

Next stop (not counting weekly trips to Cambridge, MA) 

Northwestern University, Evanston, Ill., Nov. 14:

 

Wednesday
Nov 1, 2017

How many talks did I give last yr? And how about yr before that, & yr before that ...

Huh... Well just think of how many more I would have done if I weren't so shy.

 

Tuesday
Oct 31, 2017

#scicomm question: what communicates essential information more effectively--unfilled overlapping pdd's or filled/transparency ones?

Been having more fun with Stata 15's new transparency feature but was wondering if maybe I'm neglecting communication effectiveness in favor of some other aesthetic consideration.

So tell me: Which looks better--this

 or this?

 

Both convey the same info on how "high numeracy" & "low numeracy" study subjects do on a covariance problem, the numbers of which are manipulated to make the right answers either identity-affirming or identity-threatening.  What they are both illustrating, then, is that high numeracy subjects lose nearly all their accuracy edge when they analyze covariance data that contradicts their political presuppositions and thus threatens their cultural identity.

So assume an attentive reader comes across this point in the text and is directed to look at the Figures to make the point even more vivid.  Does one of these graphic reporting methods work better than the other?
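Again for non-Stata readers, a rough matplotlib sketch (hypothetical data, not the study's) of the two options being weighed: unfilled overlapping density curves versus filled curves drawn with transparency.

```python
# Illustrative only: two ways of showing overlapping probability density curves.
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
low_numeracy = rng.normal(0.45, 0.15, 500)   # hypothetical accuracy scores
high_numeracy = rng.normal(0.70, 0.15, 500)

grid = np.linspace(0, 1, 200)
fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 4), sharey=True)
for data, label in [(low_numeracy, "low numeracy"), (high_numeracy, "high numeracy")]:
    density = gaussian_kde(data)(grid)
    ax1.plot(grid, density, label=label)                      # unfilled outlines
    ax2.fill_between(grid, density, alpha=0.4, label=label)   # filled, with transparency
ax1.set_title("Unfilled")
ax2.set_title("Filled, alpha = 0.4")
ax1.legend()
plt.tight_layout()
plt.show()
```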

Monday
Oct 30, 2017

More evidence of AOT's failure to counteract politically motivated reasoning 

Notice 2 things about this Figure:

1st, Stata 15 can now do transparencies!

2nd, this is even more evidence that “Actively open-minded thinking,” as commonly measured, furnishes no meaningful protection against politically motivated reasoning.

The results here are based on the same experimental design featured in the CCP Motivated Numeracy paper (Kahan, Peters et al. 2017). Subjects were asked what inference was supported by data presented in a 2x2 contingency table.  In one condition, the data were described as results of an experiment to test a new skin-rash cream.  In another, the data were described as results of an experiment to determine whether banning the carrying of concealed handguns in public increased or decreased crime.
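The task is harder than it looks. A worked version with illustrative cell counts (my own, chosen for exposition) shows why: the correct inference turns on comparing ratios across rows, not on spotting the largest raw number in the table.

```python
# Illustrative 2x2 "skin cream" table: did patients who used the cream fare better?
cream_better, cream_worse = 223, 75        # hypothetical counts for the "used cream" row
no_cream_better, no_cream_worse = 107, 21  # hypothetical counts for the "no cream" row

cream_rate = cream_better / (cream_better + cream_worse)
no_cream_rate = no_cream_better / (no_cream_better + no_cream_worse)
print(f"used cream: {cream_rate:.2f}, no cream: {no_cream_rate:.2f}")
# The no-cream rate (~0.84) beats the cream rate (~0.75), even though 223 is the
# biggest number in the table, which is the intuitive (and wrong) cue.
```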

In Motivated Numeracy, we found that individuals of opposing ideological orientations were substantially more likely to get the correct answer in the gun-control version if the data, properly interpreted, supported (or “affirmed”) the position associated with their ideology; when the data, properly interpreted, did not support their ideological group's position, individuals were more likely to select the wrong answer.

What’s more, the effect was stronger among the subjects with the highest degree of Numeracy, an aptitude for reasoning well with quantitative information.

The data here are pretty similar to those in Motivated Numeracy, except now it's “Actively Open-minded Thinking” (AOT) that is being shown to interact with ideology.  On the effectiveness of the new skin cream, individuals who score highest on a standard measure of AOT do better than those who score low, regardless of their political outlooks.

In the “gun control” condition, those who score highest on AOT do only slightly better on the version of the problem that presents ideologically congenial data. 

In the version that presents threatening or ideologically uncongenial evidence, however, those who score highest on AOT do no better than those who score the lowest.

This is not what you’d expect.

AOT is supposed to counteract ideologically motivated reasoning along with kindred forms of “myside bias” (e.g., Stanovich 2013; Baron 1995). Accordingly, in the "identity threatened" condition, one would expect those highest in AOT to do just as well as their high-scoring counterparts in the "identity affirmed" condition. One would expect, too, that the performance of those high in AOT would not show a level of degradation (-30%, +/- 14%) comparable to the degradation in performance shown by low-scoring AOT subjects (-23%, +/-10%).

But it didn’t work this way here.
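For anyone who wants to probe that interaction formally, here is a minimal sketch (hypothetical column names and data file; not the actual analysis code) of the sort of model the figure invites:

```python
# Illustrative only: correctness as a function of AOT, condition, and outlook.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("aot_experiment.csv")  # hypothetical data file
# correct:      1 if the subject drew the right inference from the 2x2 table, else 0
# aot:          Actively Open-minded Thinking score (z-scored)
# threat:       1 if the correct answer was uncongenial to the subject's ideology
# conservrepub: political outlook (z-scored)
m = smf.logit("correct ~ aot * threat * conservrepub", data=df).fit()
print(m.summary())
# If AOT protected against identity-protective reasoning, the accuracy boost from
# aot would survive in the threat condition; the figure instead suggests the
# aot-by-threat terms wipe that boost out.
```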

It also didn’t work that way in a study that Jon Corbin and I did last year, in which we showed that those highest in AOT, far from converging, were even more politically polarized on the danger posed by climate change (Kahan & Corbin 2016).

What to make of this?

Well, again, one possibility is that the version of AOT we are using simply is not valid.  I don’t buy that, really, because the measure has been validated in various settings (e.g., Baron et al. 2015).

The other possibility, which I think is more plausible, is that AOT--like Numeracy  (Kahan, Peters et al. 2017), Cognitive Reflection (Kahan 2013), and Ordinary Science Intelligence (Kahan 2016)—magnifies identity-protective reasoning where certain policy-relevant facts have become entangled with group-based identities (Kahan 2015).  Basically, where that’s the case, people use their critical reasoning proficiencies, of which AOT is clearly one, not to figure out the truth but rather to cement their status and relations with other group members (Stanovich & West 2007, 2008; Kahan & Stanovich 2016).

But I don’t want to be closed-minded toward other possibilities. 

So what do you think?

Refs

Baron, J. Myside bias in thinking about abortion. Thinking & Reasoning 1, 221-235 (1995).

Baron, J., Scott, S., Fincher, K. & Emlen Metz, S. Why does the Cognitive Reflection Test (sometimes) predict utilitarian moral judgment (and other things)? Journal of Applied Research in Memory and Cognition, 265-284 (2015).

Kahan, D. & Stanovich, K. Rationality and Belief in Human Evolution (2016), CCP/APPC Working paper available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2838668.

Kahan, D.M. & Corbin, J.C. A note on the perverse effects of actively open-minded thinking on climate-change polarization. Research & Politics 3 (2016).

Kahan, D.M. ‘Ordinary science intelligence’: a science-comprehension measure for study of risk and science communication, with notes on evolution and climate change. J Risk Res, 1-22 (2016).

Kahan, D.M. Climate-Science Communication and the Measurement Problem. Advances in Political Psychology 36, 1-43 (2015).

Kahan, D.M. Ideology, Motivated Reasoning, and Cognitive Reflection. Judgment and Decision Making 8, 407-424 (2013).

Kahan, D.M., Peters, E., Dawson, E.C. & Slovic, P. Motivated numeracy and enlightened self-government. Behavioural Public Policy 1, 54-86 (2017).

Stanovich, K.E. Why humans are (sometimes) less rational than other animals: Cognitive complexity and the axioms of rational choice. Thinking & Reasoning 19, 1-26 (2013).

Stanovich, K. & West, R. On the failure of intelligence to predict myside bias and one-sided bias. Thinking & Reasoning 14, 129-167 (2008).

Stanovich, K.E. & West, R.F. Natural myside bias is independent of cognitive ability. Thinking & Reasoning 13, 225-247 (2007).