
Thursday
Jan 11, 2018

"Kentucky farmer" spotted in Montana

This site's 14 billion regular subscribers know the Kentucky Farmer as one of the types of people whose habits of mind feature cognitive dualism--the tendency to adopt one set of action-enabling beliefs in one setting and another, opposing set in another. For the Kentucky Farmer, this style of reasoning helps him maintain his membership in a cultural group for whom climate-change skepticism is identity-defining while also using scientific information on climate change to be a good farmer.

Well, he was sighted recently, not in Kentucky but in Montana. The reporter for a story on the detrimental impact of climate change on barley farming is the one who spotted him:

In the field, looking at his withering crop, Somerfeld was unequivocal about the cause of his damaged crop – “climate change.” But back at the bar, with his friends, his language changed. He dropped those taboo words in favor of “erratic weather” and “drier, hotter summers” – a not-uncommon conversational tactic in farm country these days.

Great #scicomm by Ari LeVaux, the reporter.

But of course this form of information processing remains tinged with mystery.

Wednesday
Jan 10, 2018

Applying the Science of Science Communication

I’m giving a talk tomorrow on motivated numeracy at the University of Utah.  In the very generous allotment of time they’ve afforded me (30 mins or so), I should be able to make pretty good progress in showing why cultural cognition is not attributable to some defect in individual rationality. 

But I’ll still end up with things that I don’t have time to work in. Like the biased processing of information on whether one’s cultural adversaries process political information in a biased fashion. And the role curiosity can play in buffering the magnification of biased information processing associated with greater cognitive proficiency.

I’m sure many of you have experienced this sort of frustration, too.

Well, here’s how I plan to overcome this obstacle.  Likely you’ve seen salespersons at retail outlets wearing colorful “Ask me about . . .” buttons to promote prospective buyers’ awareness of and interest in some new product or service. 

So why shouldn’t academics do the same thing?

Consider:

 

I won’t be wearing these “buttons”—I didn’t have time to make them before I left home. But I will insert them into my slides at the point at which I allude to the relevant studies. Then, I figure, someone—his or her open-minded curiosity aroused—will surely “ask me!” about these ideas in the Q&A!

See how knowing about the science of science communication helps to promote effective communication of scientific data?

I'll write back tomorrow to report how effective this device was.

Tuesday
Jan 9, 2018

Stupid smart phone or brilliant handgun? You make the call (so to speak)

Who do you think will fear this "smart-phone-disguised" handgun, who won't, & why? 

I have my own hypothesis, of course, but am eager to hear what others think.

Or maybe the existence of this gun/phone is "fake news"?...

 

Monday
Jan 8, 2018

Science communication environment; toxic memes; and politically motivated reasoning paradigm

Some more for Glossary. Arranged conceptually, not alphabetically.

Science communication environment and science communication environment “pollution.” To flourish, individuals and groups need to make use of more scientific insight than they have either the time or the capacity to verify. Rather than become scientific experts on myriad topics, then, individuals become experts at recognizing valid scientific information and distinguishing it from invalid counterfeits of the same. The myriad cues and related influences that individuals use to engage in this form of recognition form their science communication environment. Dynamics that interfere with or corrupt these cues and influences (e.g., toxic memes and politically motivated reasoning) can be viewed as science-communication-environment “pollution.” [Source: Kahan, in Oxford Handbook of the Science of Science Communication (Jamieson, Kahan & Scheufele eds.), pp. 35-50 (2017); Kahan, Science, 332, 53-54 (2013). Added Jan. 8, 2018.]

Toxic memes. Recurring tropes and idioms, the propagation of which (usually at first by conflict entrepreneurs) fuses diverse cultural identities to opposing positions on some form of decision-relevant science. In the contaminated science communication environment that ensues, individuals relying on the opinion of their peers—generally a successful strategy for figuring out what science knows—polarize rather than converge on the best possible evidence. [Source: Kahan, Scheufele & Jamieson, Oxford Handbook on the Science of Science Communication, Introduction (2017); Kahan, Jamieson et al., J. Risk Res., 20, 1-40 (2017). Added: Jan. 7, 2018.]

Politically motivated reasoning paradigm (“PMRP”) and the PMRP design. A model of the tendency of individuals of diverse identities to polarize when exposed to evidence on a disputed policy-relevant science issue. Starting with a truth-seeking Bayesian model of information processing, the PMRP model focuses on the disposition of individuals of diverse identities to attribute opposing likelihood ratios to evidence; this mechanism would assure that such individuals will not converge but rather become more sharply divided as they process information. The PMRP design refers to study designs suited for observing this dynamic if it in fact exists. [Source: Kahan, D. M. in Emerging Trends in the Social and Behavioral Sciences (2016). Added: Jan. 8, 2018.]
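A minimal numerical sketch may make the mechanism concrete. The code below is my own illustration (the values and function are hypothetical, not taken from the Kahan paper): two agents update on the same three pieces of evidence, but identity-protective cognition gives them reciprocal likelihood ratios, so they end up further apart than they started.

```python
# Hypothetical illustration of the PMRP divergence mechanism: same
# evidence, opposing likelihood ratios, sharper division under Bayes' rule.

def update(prior_odds: float, likelihood_ratio: float) -> float:
    """Bayes' theorem in odds form: posterior odds = prior odds x LR."""
    return prior_odds * likelihood_ratio

prior = 1.0            # both agents start at 1:1 odds (probability 0.5)
lr_a, lr_b = 2.0, 0.5  # identity-congenial vs. identity-threatening reading

odds_a = odds_b = prior
for _ in range(3):     # three successive exposures to the *same* evidence
    odds_a = update(odds_a, lr_a)
    odds_b = update(odds_b, lr_b)

to_prob = lambda odds: odds / (1 + odds)
print(f"A: {to_prob(odds_a):.2f}  B: {to_prob(odds_b):.2f}")
# -> A: 0.89  B: 0.11 -- identical evidence, yet greater division
```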

 

 

Sunday
Jan 7, 2018

You guessed it: some more cultural cognition glossary/whatever entries--affect heuristic & conflict entrepreneurs

For the ever-expanding dictionary/glossary. You can actually get a long way in explaining why some science issues provoke cultural polarization and why others don't by examining these dynamics.

Affect heuristic. Describes the role that visceral feelings play in the formation of public perceptions of risks and related facts. Such feelings, research suggests, are not a product but rather a source of the costs and benefits individuals attribute to a putative risk source (e.g., nuclear power, GM foods, climate change). Such feelings likewise shape public perceptions of expert opinion, the trustworthiness of regulators, the efficacy of policy interventions, and the like. Psychometrically, all of these perceptions are properly viewed as indicators of a latent pro- or con-attitude, which varies continuously in the general population. The cultural cognition thesis posits that cultural outlooks determine the valence of such feelings, which can be treated as mediating the impact of cultural worldviews on perceptions of risk and related facts. [Sources: Slovic et al., Risk Analysis, 24, 311-322 (2004); Peters & Slovic, J. Applied Social Psy., 16, 1427-1453 (1996); Peters, Burraston & Mertz, Risk Analysis, 18, 715-27 (1998); Poortinga & Pidgeon, Risk Analysis, 25, 199-209. Date added: Jan. 7, 2018.]

Conflict entrepreneurs. Individuals or groups that profit from filling public discourse with antagonistic memes, thereby entangling diverse cultural identities with opposing positions on some science issue. The benefit conflict entrepreneurs derive—greater monetary contributions to the advocacy groups they head, the opportunity to collect speaking fees, remunerative deals for popular books—doesn’t depend on whether their behavior genuinely promotes the cause they purport to be advancing. On the contrary, they profit most in an atmosphere pervaded by cultural recrimination and contempt, one in which democratic convergence on valid science is decidedly unlikely to occur. Their conduct contributes to that state. [Sources: Kahan, Scheufele & Jamieson, Oxford Handbook on the Science of Science Communication, Introduction (2017); Kahan, Jamieson et al., J. Risk Res., 20, 1-40 (2017); Cultural Cognition blog, passim. Date added: Jan. 7, 2018.]

 

 

Saturday
Jan 6, 2018

Culture, worldviews, & risk perception (glossary entries)

More for this:

Cultural cognition worldviews. A typology of risk-perception predispositions formed by the intersection of two moral orientations—hierarchy-egalitarianism and individualism-communitarianism.  Scales measuring these predispositions figure in empirical inquiries informed by the cultural cognition thesis. [Source: Kahan, in Handbook of Risk Theory: Epistemology, Decision Theory, Ethics and Social Implications of Risk (eds. R. Hillerbrand, P. Sandin, S. Roeser, & M. Peterson), pp. 725-760 (2012). Date added: Jan. 6, 2018.]
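For concreteness, here is a minimal sketch (mine; the function and zero cutoffs are hypothetical, not the actual CCP scoring procedure) of how the two scale scores jointly locate a respondent in one of the four worldview quadrants:

```python
def worldview(hierarchy: float, individualism: float) -> str:
    """Classify a respondent by the two cultural cognition dimensions.

    Scores are assumed mean-centered: positive values mark the
    hierarchical / individualist poles, negative values the
    egalitarian / communitarian poles.
    """
    v = "Hierarchical" if hierarchy > 0 else "Egalitarian"
    h = "Individualist" if individualism > 0 else "Communitarian"
    return f"{v} {h}"

print(worldview(0.8, -0.3))   # -> Hierarchical Communitarian
print(worldview(-0.5, 0.7))   # -> Egalitarian Individualist
```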

Cultural theory of risk. A theory that asserts that individuals can be expected to conform their perceptions of all manner of risk, along with the efficacy of measures to abate the same, to their worldviews, which are based on Mary Douglas’s “group-grid” typology. [Sources: Douglas, Risk Acceptability According to the Social Sciences (1985); Douglas & Wildavsky, Risk and Culture (1982); Rayner, Cultural Theory and Risk Analysis, in Social Theories of Risk (Krimsky & Golding eds.), pp. 83-115 (1992). Date added: Jan. 6, 2018.]

 

Friday
Jan 5, 2018

New entries for CCP "glossary": cognitive dualism and the disentanglement principle

Still more for this dictionary/glossary in progress:

Cognitive dualism. A theoretical account of reasoning that purports to reconcile opposing states of belief and disbelief in fundamental scientific facts. The theory posits that individuals variously endorse and reject such facts depending on which state—belief or disbelief—best enables them to achieve context-specific goals. Thus a science-trained professional might “believe in” human evolution when engaged in professional tasks that depend on the truth of that theory, yet still disbelieve in human evolution when acting as a member of a religious community, in which such disbelief enables him or her both to experience membership in and loyalty to such a community and to express the same. Farmers, too, have been observed to “disbelieve in” human-caused climate change when acting as members of their cultural communities, but to “believe in” it when endorsing farming practices that anticipate human-caused climate change. [Sources: Everhart & Hameed, Evolution: Education and Outreach, 6(1), 1-8 (2013); Prokopy, Morton et al., Climatic Change, 117, 943-50 (2014); Cultural Cognition blog, passim. Date added: Jan. 4, 2018.]

* * *

The disentanglement principle. Label for a normative practice, derived from empirical findings, that supports the self-conscious presentation of scientific information in a manner that effectively severs positions on contested science issues from message recipients’ cultural identities. The effective use of the disentanglement principle has been credited with the successful teaching of evolutionary theory to secondary school students who “disbelieve in” evolution. It also is the basis for science communication in Southeast Florida, where community engagement with climate change science draws together groups and communities that hold opposing beliefs in human-caused climate change. [Sources: Lawson & Worsnop, Journal of Research in Science Teaching, 29, 143-66 (1992); Kahan, Advances in Pol. Psych., 36, 1-43 (2015). Added on Jan. 4, 2018.]

 

Tuesday
Jan 2, 2018

"Science curiosity" and "SCS", plus "Mobility and Stability hypotheses"--latest entries in Cultural Cognition Dictionary/Glossary (Whatever)

I know, I know -- the construction of this document has taken over this blog of late, but that's because the alternative is to grade 85 criminal law exams. . . . 

Science curiosity and “SCS.” Science curiosity is a general disposition that reflects the motivation to seek out and consume scientific information for personal pleasure. Variance in this disposition across persons and groups is measured by the Science Curiosity Scale (“SCS”). Originally developed to facilitate research on engagement with science documentaries, SCS scores have also been shown to predict resistance to politically motivated reasoning, including Motivated System 2 Reasoning (“MS2R”). [Source: Kahan, Landrum et al., Advances in Pol. Psych., 38, 179-199 (2017). Added Jan. 2, 2018.]

* * *

The mobility and stability hypotheses. Competing conjectures about how individuals’ perceptions of risk and related facts can be expected to behave across different settings (e.g., the workplace vs. the home). The “stability hypothesis” predicts that “individuals will seek to homogenize their experience of social structure in different areas of their lives” in a manner that reflects their static cultural worldviews. The “mobility hypothesis,” in contrast, holds that individuals can be expected to form differing perceptions of risk as they move across social contexts, which themselves are understood to embody distinct, and often opposing, cultural worldviews: “according to this view, individuals may flit like butterflies from context to context, changing the nature of their arguments as they do so.” [Source: Rayner, Cultural Theory and Risk Analysis, in Social Theories of Risk (Krimsky & Golding eds.), pp. 83-115, at 105-106 (1992). Added Jan. 2, 2018.]

 

 

Wednesday
Dec 27, 2017

Hey-- still *more* entries for Cultural Cognition Dictionary/Glossary/Whatever

You can read all the entries  (all for now, that is) here.

Expressive rationality. Refers to the tendency of individuals to (unconsciously) form beliefs that signify their membership in, and loyalty to, identity-defining affinity groups. Among opposing groups, expressive rationality does not produce convergence but rather political polarization on the best available scientific evidence. Nevertheless, the strongest basis for treating this type of reasoning as rational is that it intensifies rather than dissipates as ordinary members of the public attain greater proficiency in the styles of reasoning essential to science comprehension (e.g., cognitive reflection, science literacy, and numeracy). [Sources: Kahan, Peters, et al., Nature Climate Change, 2, 732-35 (2012), p. 734; Kahan, Behavioral & Brain Sci., 40, 26-28 (2016); Stanovich, Thinking & Reasoning, 19, 1-26 (2013). Added Dec. 27, 2017.]

The tragedy of the science communications commons. A normative objection to expressive rationality. While it is often rational for an individual to engage in this form of reasoning, it is a disaster when all members of a culturally diverse democratic society do so at once: in that case, members of opposing cultural groups are unlikely to converge (or at least to converge as soon as they should) on what science has to say about the risks their society faces. This consequence of expressive rationality, however, does nothing to reduce the psychic incentives that make it rational for any particular member of the public to engage in identity-protective rather than truth-convergent information processing. [Source: Kahan, Peters, et al., Nature Climate Change, 2, 732-35 (2012), p. 734. Added Dec. 27, 2017.]

Tuesday
Dec 26, 2017

Still more entries in "Cultural Cognition Dictionary/Glossary (whatever)"

This--creating a dictionary/glossary of terms used in the study of cultural cognition--is kind of fun. So I'll add terms whenever the mood strikes me. I've arranged the new entries for today in a sort of thematic order.  In the new page that houses the growing number of dictionary/glossary entries, however, everything is alphabetical (I'll likely add cross-reference links where one term is best understood in relation to one or more other ones).

Secular harm. Refers to a set-back to interest the nature of which is independent of assent to any culturally partisan conception of the best way to live. Principal examples include damage to individuals’ physical security and impediments to their apprehension of collective knowledge. Precisely because such harms can be experienced universally by citizens of diverse cultural identities, protecting citizens from such set-backs is a legitimate end for law in a liberal state. [Sources: Rawls, Political Liberalism, pp. 175, 217-18 (1993); Mill, On Liberty, ch. 1 (1859). Date added: Dec. 26, 2017.]

Sectarian harm. Refers to a set-back to interest the nature of which is dependent on assent to a partisan conception of the best way to live. A principal example is the offense individuals experience when they are exposed to behavior that expresses commitments to values alien to theirs. Precisely because such harms depend on—cannot be defined independently of—adherence to a particular conception of the best life, using law to avert or remedy them is illegitimate in a liberal state. [Source: Mill, On Liberty, ch. 1 (1859). Date added: Dec. 26, 2017.]

Cognitive illiberalism. Refers to a tendency to selectively impute cognizable secular harms to behavior that generates non-cognizable sectarian harms. Such a tendency is unconscious and hence invisible to the actor whose information-processing capabilities have been infected by it. Indeed, the bias that cognitive illiberalism comprises can subvert a decisionmaker’s conscious, genuine intent to exercise legal authority consistent with liberal ideals. [Source: Kahan, Hoffman & Braman, Harv. L. Rev., 122, 837-906 (2009); Kahan, Hoffman, Braman, Evans & Rachlinski, Stan. L. Rev., 64, 851-906 (2012). Date added: Dec. 26, 2017.]

Cognitively illiberal state. Refers to a liberal political regime pervaded—and hence subverted—by institutions and laws that reflect the unconscious tendency of legal and political decisionmakers to impute secular harms to behavior that imposes only sectarian ones. [Source: Kahan, Stan. L. Rev., 60, 115-54 (2007). Date added: Dec. 26, 2017.]

Saturday
Dec 23, 2017

Weekend update: more "Cultural Cognition Dictionary/Glossary"

Cultural Cognition Dictionary (or Glossary, whatever)

Note: this is part of a document under construction. New terms will be added intermittently during periods in which there is nothing else to do or in which there is something else to do and hence an opportunity to engage in creative procrastination.

Cultural cognition thesis. The conjecture that culture is prior to fact in debates over contested societal risks and related facts. Culture is prior not just in the normative sense that cultural values guide action conditional on beliefs about states of affairs; it is also prior in the positive sense that cultural commitments, through a variety of mechanisms, shape what individuals believe the relevant facts to be. [Source: Kahan, Slovic, Braman & Gastil, Harvard Law Review, 119, 1071-1109 (2006), p. 1083. Date added: Dec. 23, 2017.]

Identity-protective reasoning. The tendency of individuals to selectively credit and dismiss factual assertions in a manner that reflects and reinforces their cultural commitments, thereby expressing affective orientations that secure their own status within cultural groups. [Source: Kahan, Slovic et al., J. Empirical Legal Studies, 4, 465-505 (2007). Date added: Dec. 23, 2017.]

The “knowledge deficit fallacy.” A theory (either explicit or implicit, conscious or unconscious) that treats simple unfamiliarity with facts as the cause of the public’s failure to converge on the best available scientific evidence on human-caused climate change, human evolution, the safety of nuclear power generation, etc. The theory also assumes (explicitly or implicitly, consciously or unconsciously) that simple communication of the best available evidence will dispel public conflict over facts.  [Date added Dec. 19, 2017]

The “ ‘knowledge deficit fallacy’.”  A theory (either explicit or implicit, conscious or unconscious) that treats simple unfamiliarity with the “knowledge deficit fallacy” as the cause of science communicators’ failure to converge on the best available scientific evidence on how to communicate human-caused climate change, human evolution, the safety of nuclear power generation, etc. The theory also assumes (explicitly or implicitly, consciously or unconsciously) that simple communication of the best available evidence on science communication will dispel science communicators’ reliance on the knowledge deficit theory. [added Dec. 19, 2017]

“Motivated System 2 reasoning.” A summary of the empirical research finding that as individuals’ cognitive proficiencies (measured by a wide variety of critical thinking assessments, including the Cognitive Reflection Test, Numeracy, Actively Open-minded Thinking, and the Ordinary Science Intelligence assessment) increase, so does their tendency to display identity-protective reasoning in their perception of relevant facts. [Source: Kahan, Landrum et al., Advances in Pol. Psych., 38, 179-199, pp. 181-182 (2017). Date added: Dec. 23, 2017.]

MS2R. An abbreviation for “Motivated System 2 reasoning.” [Source: Kahan, Landrum et al., Advances in Pol. Psych., 38, 179-199, p. 182 (2017); Cultural Cognition blog, passim. Date added: Dec. 23, 2017.]

 

Thursday
Dec 21, 2017

Great stocking stuffers--Politically Motivated Reasoning Paradigm, parts 1 & 2!

Buy now to assure arrival before 12/25!


Wednesday
Dec 20, 2017

Motivated System 2 Reasoning (MS2R) ... a fragment

from something I'm working on 

2. Background

2.1. MS2R in general. Where expert and lay judgments of risk diverge, cultural polarization, not mere confusion, is the most conspicuous feature of public opinion. Any satisfactory explanation of the public’s failure to assent to scientific consensus in these instances, then, must account for public dissensus among individuals of diverse cultural identities (Kahan, Braman, Cohen, Gastil & Slovic 2010).

One widespread account of this kind is rooted in dual process reasoning theory.  According to this view, accurate perception of risk and like facts demands the consistent and sustained use of conscious and effortful “System 2” reasoning, a form of information processing associated with expert judgment.  Members of the public are obviously not experts. Because they lack the time, knowledge, and mental discipline that System 2 reasoning demands, members of the public are forced to resort to a heuristic substitute that is rapid, intuitive, and emotion-laden (Kahneman & Frederick 2005). “What do people like me think?” (myside bias) is one of the unconscious heuristics associated with this type of “system 1” information processing (Baron 1995).  As a result, overreliance on System 1 reasoning not only generates error but also creates a correlation between error and membership in one or another identity-defining group (Sunstein 2003, 2007).

Basic observational data, however, is inconsistent with this account. If over-reliance on heuristic reasoning explained why the average member of the public was out of synch with scientific experts, then we’d expect conflict—between experts and the public, and also between different public factions—to lessen as individuals became more disposed to rely instead on conscious, effortful information processing. But that’s not what we see; indeed we observe the opposite: correlational data consistently show that political polarization, far from abating, increases in lockstep with cognitive reflection (Kahan 2013; Kahan & Stanovich 2016), actively open-minded thinking (Kahan & Corbin 2016), science comprehension (Kahan, Peters et al. 2012), and like capacities and aptitudes.
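The statistical signature of that pattern is an interaction between political outlook and reasoning proficiency. A hedged sketch on simulated data (variable names and coefficients are my own illustration, not estimates from the cited studies):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
outlook = rng.choice([-1.0, 1.0], n)    # -1 = left, +1 = right (illustrative)
proficiency = rng.normal(0.0, 1.0, n)   # standardized reflection/numeracy score

# Build in the observed pattern: the left-right gap in perceived risk
# *widens* as proficiency rises (the negative interaction term below).
risk = 5.0 - 1.0 * outlook - 0.8 * outlook * proficiency + rng.normal(0, 1, n)

df = pd.DataFrame(dict(risk=risk, outlook=outlook, proficiency=proficiency))
model = smf.ols("risk ~ outlook * proficiency", data=df).fit()
print(model.params)
# A reliably negative outlook:proficiency coefficient is what "polarization
# increases in lockstep with cognitive reflection" looks like in a regression.
```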

This pattern suggests an alternative theory of public risk-perception and cultural conflict. On this account, instead of using their cognitive proficiencies to discern the truth, individuals disposed to, and capable of, System 2 reasoning can be expected to use their cognitive proficiencies to conform their beliefs to the ones that have come to signify membership in a particular cultural group (Stanovich 2013; Kahan 2013).

This form of reasoning is perfectly rational at the individual level. Individuals have stakes in both being protected from societal risks and being judged socially competent and trustworthy by their peers. But it is only the latter that is affected by their own personal beliefs. Forced to choose between “getting it right” from a scientific perspective and being who they are from a cultural one (Kahan 2015), individuals can be expected by and large to pick the latter (not consciously, but unconsciously, as a result of habits of mind that conduce to their well-being).

Collectively, however, this distinctively expressive form of information processing is irrational. Where democratic citizens all form their perceptions in this way, they are less likely to converge on the best evidence of the dangers they all face. This prospect, however, does not change the advantages that any individual obtains by forming and persisting in group-affirming beliefs.

This is the tragedy of the science communications commons, the dispelling of which is one of the primary aims of the science of science communication (Kahan, Peters et al. 2012).

2.2. Motivated numeracy in particular. * * *

References

Baron, J. (1995). Myside bias in thinking about abortion. Thinking & Reasoning, 1(3), 221-235. doi: 10.1080/13546789508256909

Kahan, D. M. (2015). Climate-Science Communication and the Measurement Problem. Advances in Political Psychology, 36, 1-43. doi: 10.1111/pops.12244

Kahan, D. M. (2013). Ideology, Motivated Reasoning, and Cognitive Reflection. Judgment and Decision Making, 8, 407-424.

Kahan, D. M., & Corbin, J. C. (2016). A note on the perverse effects of actively open-minded thinking on climate-change polarization. Research & Politics, 3(4). doi: 10.1177/2053168016676705

Kahan, D. M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L. L., Braman, D., & Mandel, G. (2012). The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Clim. Change, 2(10), 732-735.

Kahan, D., Braman, D., Cohen, G., Gastil, J., & Slovic, P. (2010). Who Fears the HPV Vaccine, Who Doesn’t, and Why? An Experimental Study of the Mechanisms of Cultural Cognition. Law and Human Behavior, 34(6), 501-516. doi: 10.1007/s10979-009-9201-0

Kahan, D. M., & Stanovich, K. E. (2016). Rationality and Belief in Human Evolution. Annenberg Public Policy Center Working Paper No. 5. Available at SSRN: https://ssrn.com/abstract=2838668.

Kahneman, D., & Frederick, S. (2005). A model of heuristic judgment. In K. J. Holyoak & R. G. Morrison (Eds.), The Cambridge handbook of thinking and reasoning (pp. 267-293): Cambridge University Press.

Stanovich, K. E. (2013). Why humans are (sometimes) less rational than other animals: Cognitive complexity and the axioms of rational choice. Thinking & Reasoning, 19(1), 1-26.

Sunstein, C. R. (2007). On the Divergent American Reactions to Terrorism and Climate Change. Columbia Law Review, 107, 503-557. 

Tuesday
Dec 19, 2017

"Knowledge deficit theory^2": a definition

From the Cultural Cognition Dictionary (Mockingbird Univ. Press, forthcoming):

The “knowledge deficit fallacy.” A theory (either explicit or implicit, conscious or unconscious) that treats simple unfamiliarity with facts as the cause of the public’s failure to converge on the best available scientific evidence on human-caused climate change, human evolution, the safety of nuclear power generation, etc. The theory also assumes (explicitly or implicitly, consciously or unconsciously) that simple communication of the best available evidence will dispel public conflict over facts. 

* * *

The “ ‘knowledge deficit fallacy’.”  A theory (either explicit or implicit, conscious or unconscious) that treats simple unfamiliarity with the “knowledge deficit fallacy” as the cause of science communicators’ failure to converge on the best available scientific evidence on how to communicate human-caused climate change, human evolution, the safety of nuclear power generation, etc. The theory also assumes (explicitly or implicitly, consciously or unconsciously) that simple communication of the best available evidence on science communication will dispel science communicators’ reliance on the knowledge deficit theory.

Thursday
Dec 14, 2017

"Gateway belief" illusion--published and critiqued (?)

Get your copy now before it sells out!

 

VLFM, minus F, "respond"; 100 CCP points, redeemable in CCP gift shop, to anyone who can explain what cultural cognition has to do with the critique of VLFM for not reporting their control condition data.

Saturday
Dec 9, 2017

A draw in the “asymmetry thesis meta-analysis” steel-cage match? Nope. It’s a KO.

As the 14 billion regular subscribers to this blog know all too well, I’ve been discussing the so-called “asymmetry thesis” (AT) on this site (and in published papers [Kahan 2013]) for approximately 65 years now.

AT posits that the impact of ideologically motivated reasoning is asymmetric in relation to so-called “liberal” and “conservative” orientations. Conservatives, AT proponents maintain, are substantially more vulnerable to this form of biased information processing than are liberals (e.g., Jost et al 2003).

What about AT opponents? What do they say?

Well, I don’t recall any empirical researcher who asserts that liberals are more biased than conservatives (maybe motivated reasoning is causing me to overlook or just not recall such research).

Rather, AT opponents contend politically motivated reasoning is uniform—i.e., symmetric—across the conventional left-right spectrum.  So let’s call this position “ST” for “symmetry thesis.”

The fight between AT and ST looks like the kind of dispute that ought to be adjudicated by meta-analysis.  And in fact, in the last 6 mos. or so, we’ve been treated to two meta-analytic investigations, one by John Jost (2017) and another by Pete Ditto & a large contingent of collaborators (in press).

The problem, however, is that Jost and Ditto et al. appear to strongly disagree with one another about what their massive literature surveys imply.

Jost reports finding approximately 280 studies involving almost 400,000 subjects. From the “need for closure” to “dogmatism” to “self-deception”—the self-report measures featured in these studies support the conclusion that conservatives are more biased than are liberals.

Meanwhile, Ditto et al. report the results from 51 experiments, comprising 18,000 subjects. Their conclusion? That “there was strong support for the symmetry hypothesis: liberals (r = .235) and conservatives (r = .255) showed no difference in mean levels of bias across studies”—a compelling affirmation of ST over AT.***
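Just how small is that gap? A back-of-the-envelope check (my own arithmetic, not part of Ditto et al.'s analysis) using Fisher's r-to-z transformation, the usual scale for comparing correlations:

```python
import math

def fisher_z(r: float) -> float:
    """Fisher's r-to-z transformation: z = atanh(r)."""
    return 0.5 * math.log((1 + r) / (1 - r))

r_lib, r_con = 0.235, 0.255  # mean bias effects reported by Ditto et al.
print(f"{fisher_z(r_con) - fisher_z(r_lib):.3f}")  # ~0.021 -- a negligible gap
```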

So now what? Do we just throw up our hands and give up?

The answer is no. It turns out that Jost’s and Ditto et al.’s results can be reconciled pretty easily. All one has to do is examine what they were measuring and how.

Jost’s meta-analysis was based on survey data correlating conservatism and various measures of cognitive style.  Jost did not present any meta-analytic data on motivated-reasoning experiment results.

That’s what Ditto et al. measured.  They included in their sample, moreover, only experimental studies that conformed to the Politically Motivated Reasoning Paradigm (“PMRP”). PMRP identifies a method specifically crafted to avoid the myriad confounds that can rob a study of politically motivated reasoning of its validity (Flynn et al. 2017; Johnston & Ballard 2016; Kahan 2016a).  Focusing on studies that meet the PMRP standard, Ditto et al. conclude that liberals and conservatives were equally vulnerable to politically motivated reasoning.

More or less as an aside, Jost does refer to several experimental studies in his paper. But he doesn’t say anything about the criteria he used for singling them out, much less about whether they were consistent with PMRP.

Indeed, it’s clear that the main criterion Jost used to flag these particular experimental studies was that they reached a result congenial to his hypothesis.  We can tell that he resorted to cherry-picking of this sort * because he didn’t cite a single one of the myriad experimental studies that suggest that liberals are as prone to ideologically motivated cognition as conservatives.  We know there are many studies like that because plenty of them were featured in Ditto et al., an earlier version of which is in fact cited by Jost.****

There’s no reason, though, to doubt that Jost used appropriate criteria, applied with appropriate impartiality and care, to select studies that report the relationship between liberal-conservative ideology and one or another self-report measure of cognitive style.

But that only makes things worse for AT.  For notwithstanding the preponderance of evidence that conservatism is associated with a closed-minded style based on “epistemic” self-report  measures, Ditto et al. demonstrate that liberals are every bit as likely to succumb to politically motivated reasoning when one tests partisans’ information processing experimentally. This combination of results, then, implies that the self-report measures Jost analyzes are externally invalid indicators of what we actually care about—viz., how individuals of opposing political outlooks actually process information.

The only objective reasoning-style disposition that Jost reports on is the Cognitive Reflection Test (CRT), on which liberals, according to Jost, have a modest performance advantage over conservatives.

But here, too, Jost’s fixation on correlational studies and his resolute disregard for experimental ones undermine his conclusions. MS2R—“motivated system 2 reasoning”—describes the tendency of those who score highest on objective measures of cognitive proficiency (including not only CRT but also Numeracy and Ordinary Science Intelligence) to display more bias, not less, when they process political information (Kahan 2016b).

Thus, if we take Jost’s compilation of studies featuring CRT at face value, his finding that liberals score higher on it is a reason to infer that liberals are more vulnerable, not less, to politically motivated reasoning than are conservatives.

But we shouldn’t do this.

If one is trying to figure out who is more disposed to process political information in a biased manner—conservatives or liberals—one should examine how they actually reason.

Ditto et al. do this.  Jost doesn’t.

Thus, the “meta-analysis steel-cage match” was no tie. 

On the contrary, it was a knock-out victory for ST over AT.

Refs

Ditto, P. H., Liu, B., Clark, C. J., Wojcik, S. P., Chen, E. E., Grady, R. H., & Zinger, J. F. (in press). At Least Bias Is Bipartisan: A Meta-Analytic Comparison of Partisan Bias in Liberals and Conservatives. Perspectives on Psychological Science. Working paper available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2952510.

Flynn, D. J., Nyhan, B., & Reifler, J. (2017). The Nature and Origins of Misperceptions: Understanding False and Unsupported Beliefs About Politics. Political Psychology, 38, 127-150. doi: 10.1111/pops.12394

Johnston, C. D., & Ballard, A. O. (2016). Economists and Public Opinion: Expert Consensus and Economic Policy Judgments. The Journal of Politics, 78(2), 443-456. doi: 10.1086/684629

Jost, J. T. (2017). Ideological Asymmetries and the Essence of Political Psychology. Political Psychology, 38(2), 167-208.

Jost, J. T., Glaser, J., Kruglanski, A. W., & Sulloway, F. J. (2003). Political Conservatism as Motivated Social Cognition. Psych. Bull., 129(3), 339-375.

Kahan, D. M. (2013). "Ideology, Motivated Reasoning, and Cognitive Reflection." Judgment and Decision Making 8: 407-424.

Kahan, D. M. (2016a). The politically motivated reasoning paradigm, part 1: What politically motivated reasoning is and how to measure it. Emerging Trends in the Social and Behavioral Sciences: An Interdisciplinary, Searchable, and Linkable Resource.

Kahan, D. M. (2016b). The politically motivated reasoning paradigm, part 2: Open questions. Emerging Trends in the Social and Behavioral Sciences: An Interdisciplinary, Searchable, and Linkable Resource.

* John convinced me that the stricken language comes across as asserting that he engaged in wrongdoing, which is not what I meant to assert.  My point is that he cites the experiments in question for illustration, not for proof that experimental studies show the asymmetry that he reports for cognitive-disposition measures.

** Not in original post.

*** Revised to reflect "in press" version of Ditto et al.

**** John still (reasonably) objects to the discussion of his treatment of experiments in the paper. I included that discussion only b/c I anticipated John would point out that he did look at experimental evidence too (albeit by non meta-analytic techniques). But the post doesn't require the relevant paragraphs  to make its points--none of which is to imply that John acted in bad faith.

Monday
Dec 4, 2017

Hey, want to know something? Science curiosity is a culturally random variable!

Friday
Dec 1, 2017

Dewey on curiosity & science comprehension

Wow . . . . (downloaded from  here).

How We Think

John Dewey

1910, Boston: D.C. Heath & Co.; selections from Part One, “The Problem of Training Thought,” spelling and grammar modestly modernized

§1. Curiosity

The most vital and significant factor in supplying the primary material whence suggestion may issue is, without doubt, curiosity. The wisest of the Greeks used to say that wonder is the mother of all science. An inert mind waits, as it were, for experiences to be imperiously forced upon it. The pregnant saying of Wordsworth:

“The eye—it cannot choose but see;
We cannot bid the ear be still;
Our bodies feel, where’er they be,
Against or with our will”—

holds good in the degree in which one is naturally possessed by curiosity. The curious mind is constantly alert and exploring, seeking material for thought, as a vigorous and healthy body is on the qui vive for nutriment. Eagerness for experience, for new and varied contacts, is found where wonder is found. Such curiosity is the only sure guarantee of the acquisition of the primary facts upon which inference must base itself.

(a)  In its first manifestations, curiosity is a vital overflow, an expression of an abundant organic energy. A physiological uneasiness leads a child to be “into everything,”—to be reaching, poking, pounding, prying. Observers of animals have noted what one author calls “their inveterate tendency to fool.” “Rats run about, smell, dig, or gnaw, without real reference to the business in hand. In the same way Jack [a dog] scrabbles and jumps, the kitten wanders and picks, the otter slips about everywhere like ground lightning, the elephant fumbles ceaselessly, the monkey pulls things about.” The most casual notice of the activities of a young child reveals a ceaseless display of exploring and testing activity. Objects are sucked, fingered, and thumped; drawn and pushed, handled and thrown; in short, experimented with, till they cease to yield new qualities. Such activities are hardly intellectual, and yet without them intellectual activity would be feeble and intermittent through lack of stuff for its operations.

(b)  A higher stage of curiosity develops under the influence of social stimuli. When the child learns that he can appeal to others to eke out his store of experiences, so that, if objects fail to respond interestingly to his experiments, he may call upon persons to provide interesting material, a new epoch sets in. “What is that?” “Why?” become the unfailing signs of a child’s presence. At first this questioning is hardly more than a projection into social relations of the physical overflow which earlier kept the child pushing and pulling, opening and shutting. He asks in succession what holds up the house, what  holds up the soil that holds the house, what holds up the earth that holds the soil; but his questions are not evidence of any genuine consciousness of rational connections. His why is not a demand for scientific explanation; the motive behind it is simply eagerness for a larger acquaintance with the mysterious world in which he is placed. The search is not for a law or principle, but only for a bigger fact. Yet there is more than a desire to accumulate just information or heap up disconnected items, although sometimes the interrogating habit threatens to degenerate into a mere disease of language. In the feeling, however dim, that the facts which directly meet the senses are not the whole story, that there is more behind them and more to come from them, lies the germ of intellectual curiosity.

(c)  Curiosity rises above the organic and the social planes and becomes intellectual in the degree in which it is transformed into interest in problems provoked by the observation of things and the accumulation of material. When the question is not discharged by being asked of another, when the child continues to entertain it in his own mind and to be alert for whatever will help answer it, curiosity has become a positive intellectual force. To the open mind, nature and social experience are full of varied and subtle challenges to look further. If germinating powers are not used and cultivated at the right moment, they tend to be transitory, to die out, or to wane in intensity.

This general law is peculiarly true of sensitiveness to what is uncertain and questionable; in a few people, intellectual curiosity is so insatiable that nothing will discourage it, but in most its edge is easily dulled and blunted. Bacon’s saying that we must become as little children in order to enter the kingdom of science is at once a reminder of the open-minded and flexible wonder of childhood and of the ease with which this endowment is lost. Some lose it in indifference or carelessness; others in a frivolous flippancy; many escape these evils only to become incased in a hard dogmatism which is equally fatal to the spirit of wonder. Some are so taken up with routine as to be inaccessible to new facts and problems. Others retain curiosity only with reference to what concerns their personal advantage in their chosen career. With many, curiosity is arrested on the plane of interest in local gossip and in the fortunes of their neighbors; indeed, so usual is this result that very often the first association with the word curiosity is a prying inquisitiveness into other people’s business.

With respect then to curiosity, the teacher has usually more to learn than to teach. Rarely can they aspire to the office of kindling or even increasing it. Their task is rather to keep alive the sacred spark of wonder and to fan the flame that already glows. Their problem is to protect the spirit of inquiry, to keep it from becoming blasé from overexcitement, wooden from routine, fossilized through dogmatic instruction, or dissipated by random exercise upon trivial things.

Sunday
Nov 26, 2017

Clarendon Law Lectures 2017: what happened

When I was an infant academic, one of my senior colleagues advised me that if I devoted my first summer to mapping out all the classes for my upcoming fall course, I’d find out that I spent three months preparing for the first one. Each class thereafter, from the second until the last, would have to be planned the night before.

 He was right.                                                               

Now, if any future Clarendon Lecture invitee should happen to consult me, I’d advise her (or him) that if she spends the entire interval between the invitation and the start of the series mapping out each of the three lectures, she will discover that she spent 18 months preparing to deliver the first one. The remaining two lectures, she (or he) will find out, will have to be prepared the night before.

 Or in any case, such was my experience.

After my first lecture, I realized that I had better abandon my plan for the second and prepare a new one to address in depth a theme persistently pursued by the audience questioners. Did I really have sufficient basis, they wanted to know, to infer that the difference between the culturally polarized responses of the general public and the unpolarized ones of judges in the “ ‘Ideology’ or ‘Situation Sense?’ ” (aka “They Saw a Statutory Ambiguity”) study was attributable to the professionalization of the latter? Maybe judges were more disposed to use “System 2” information processing (conscious, effortful, “slow”) rather than rely on “System 1” (intuitive, automatic, “fast”). Or perhaps judges had an advantage over ordinary members of the public in some other form of critical reasoning.

So in the 22-hr interval that separated the first lecture from the second, I fashioned a new presentation addressing this issue.  It featured MS2R (“motivated system 2 reasoning”), a cognitive dynamic that rebuts the conjecture that differences in cognitive proficiency accounted for judges’ domain-specific immunity from identity-protective information processing. Indeed, if anything, before the study was conducted, this line of research might have led one to believe that judges, lawyers, and law students—to the extent that they do score higher on critical reasoning assessments—would actually display more, not less, bias in the “saw a statutory ambiguity” experiment.

I also introduced the audience to the Science Curiosity Scale. High scores on it, research suggests, do constrain polarization on societal risks and related policy-relevant facts.  But there was little reason, it seemed to me, to believe members of the legal profession are more science curious than members of the public generally.

Having made this change in focus for lecture 2, I had to revise the content of the final lecture as well. For that one, I knit together compressed versions of the planned lectures 2 and 3. Accordingly, the audience was exposed to modest amounts of the “evidence rules impossibility theorem” and the “(real) realist program for the science of judging and adjudication.”

Audience questions and insights persisted. But the series had drawn to a close.

So you’ll have to watch for more engagement with the Clarendon Lecture audience here “tomorrow.”™

Lecture slides: No. 1, No. 2, No. 3.

Sunday
Nov 19, 2017

Weekend update: paradox of scientific knowledge dissemination in the liberal state

From The Cognitively Illiberal State, an early formulation of Popper's Revenge:

A popular theme in the history and philosophy of science treats the advancement of human knowledge as conjoined to the adoption of liberal democratic institutions. It is through incessant exposure to challenge that facts establish themselves as worthy of belief under the scientific method. Liberal institutions secure the climate in which such constant challenging is most likely to take place, both by formally protecting the right of persons to espouse views at odds with dominant systems of belief and by informally habituating us to expect, tolerate, and even reward dissent.

But at the same time that liberalism advances science, it also ironically constrains it. The many truths that science has discovered depend on culture for their dissemination: without culture to identify which information purveyors are worthy of trust, we’d be powerless to avail ourselves of the vast stores of empirical knowledge that we did not personally participate in developing. But thanks to liberalism, we don’t all use the same culture to help us figure out what or whom to believe. Our society features a plurality of cultural styles, and hence a plurality of cultural certifiers of credible information.

Again, the belief that science will inevitably pull these cultural authorities into agreement with themselves reflects unwarranted optimism. In accord with its own professional norms and in harmony with the social norms of a liberal regime, the academy tolerates and even encourages competitive dissent. As a result, cultural advocates will always be able to find support from seemingly qualified experts for their perception that what’s ignoble is also dangerous, and what’s noble benign. States of persistent group polarization are thus inevitable—almost mathematically—as beliefs feed on themselves within cultural groups, whose members stubbornly dismiss as unworthy insights originating outside the group.

Because we have the advantage of science, we undoubtedly know more than previous ages about what actions to take to attain our collective wellbeing. But precisely because we tolerate more cultural diversity than they did, we are also confronted with unprecedented societal dissensus on exactly what to do.
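The "beliefs feed on themselves" dynamic in the excerpt is easy to exhibit in a toy model. The simulation below is my own illustration (every parameter is an arbitrary assumption, not drawn from the paper): agents update toward a trust-weighted mix of group opinion, trust is concentrated in-group, and beliefs drift slightly toward each group's identity-defining pole. The groups stabilize far apart no matter how long the process runs.

```python
import random

random.seed(1)
N = 20  # agents per cultural group
# Two groups with nearly identical starting beliefs about a risk (0-1 scale).
beliefs = [0.45 + random.uniform(-0.05, 0.05) for _ in range(N)] + \
          [0.55 + random.uniform(-0.05, 0.05) for _ in range(N)]
group = [0] * N + [1] * N
IN_W, OUT_W = 0.95, 0.05  # trust placed in in-group vs. out-group sources

for _ in range(200):
    means = [sum(b for b, g in zip(beliefs, group) if g == k) / N for k in (0, 1)]
    new_beliefs = []
    for g in group:
        social = IN_W * means[g] + OUT_W * means[1 - g]  # trust-weighted signal
        pole = float(g)  # each group's identity-defining position (0 or 1)
        new_beliefs.append(0.9 * social + 0.1 * pole)
    beliefs = new_beliefs

final = [sum(b for b, g in zip(beliefs, group) if g == k) / N for k in (0, 1)]
print(f"group means: {final[0]:.2f} vs {final[1]:.2f}")
# -> roughly 0.24 vs 0.76: persistent polarization despite a shared
#    evidence environment and some exposure to the other group.
```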