
Tuesday
Feb 6, 2018

A science of science communication manifesto ... a fragment 

From something I'm working on . . . .

Our motivating premise is that the advancement of enlightened policymaking depends on addressing the science communication problem. That problem consists in the failure of valid, compelling, and widely accessible scientific evidence to dispel persistent public conflict over policy-relevant facts to which that evidence directly speaks. As spectacular and admittedly consequential as instances of this problem are, entrenched public confusion about decision-relevant science is in fact quite rare. Such conflicts are not a consequence of constraints on public science comprehension, a creeping “anti-science” sensibility in U.S. society, or the sinister acumen of professional misinformers. Rather, they are the predictable result of a societal failure to integrate two bodies of scientific knowledge: that relating to the effective management of collective resources; and that relating to the effective management of the processes by which ordinary citizens reliably come to know what is known (Kahan 2010, 2012, 2013).

The study of public risk perception and risk communication dates back to the mid-1970s, when Paul Slovic, Daniel Kahneman, Amos Tversky, and Baruch Fischhoff began to apply the methods of cognitive psychology to investigate conflicts between lay and expert opinion on the safety of nuclear power generation and various other hazards (e.g., Slovic, Fischhoff & Lichtenstein 1977, 1979; Kahneman, Slovic & Tversky 1982). In the decades since, these scholars and others building on their research have constructed a vast and integrated system of insights into the mechanisms by which ordinary individuals form their understandings of risk and related facts. This body of knowledge details not merely the vulnerability of human reason to recurring biases, but also the numerous and robust processes that ordinarily steer individuals away from such hazards, the identifiable and recurring influences that can disrupt these processes, and the means by which risk-communication professionals (from public health administrators to public interest groups, from conflict mediators to government regulators) can anticipate and avoid such threats and attack and dissipate them when such preemptive strategies fail (e.g., Fischhoff & Scheufele 2013; Slovic 2010, 2000; Pidgeon, Kasperson & Slovic 2003; Gregory & McDaniels 2005; Gregory, McDaniels & Fields 2001).

Astonishingly, however, the practice of science and science-informed policymaking has remained largely innocent of this work. The persistently uneven success of resource-conservation stakeholder proceedings, the sluggish response of localities to the challenges posed by climate change, and the continuing emergence of new public controversies such as the one over fracking—all are testaments (as are myriad comparable misadventures in the domain of public health) to the persistent failure of government institutions, NGOs, and professional associations to incorporate the science of science communication into their efforts to promote constructive public engagement with the best available evidence on risk.

This disconnect can be attributed to two primary sources.  The first is cultural: the actors most responsible for promoting public acceptance of evidence-based policymaking do not possess a mature comprehension of the necessity of evidence-based practices in their own work.  For many years, the work of policymakers, analysts, and advocates has been distorted by the more general societal misconception that scientific truth is “manifest”—that because science treats empirical observation as the sole valid criterion for ascertaining truth, the truth (or validity) of insights gleaned by scientific methods is readily observable to all, making it unnecessary to acquire and use empirical methods to promote its public comprehension (Popper 1968).

Dispelled to some extent by the shock of persistent public conflict over climate change, this fallacy has now given way to a stubborn misapprehension about what it means for science communication to be genuinely evidence based. In investigating the dynamics of public risk perception, the decision sciences have compiled a deep inventory of highly diverse mechanisms (“availability cascades,” “probability neglect,” “framing effects,” “fast/slow information processing,” etc.). Armed with these as expositional templates, any reasonably thoughtful person can construct a plausible-sounding “scientific” account of the challenges that constrain the communication of decision-relevant science. But because more surmises about the science communication problem are plausible than are true, this form of storytelling cannot produce insight into its causes and cures. Only gathering and testing empirical evidence can.

Some empirical researchers have themselves contributed to the failure of practical communicators to appreciate this point. These scholars purport to treat general opinion surveys and highly stylized lab experiments as sources of concrete guidance for actors involved in promoting public engagement with information relevant to particular risk-regulation or related policy issues. Even when such methods generate insight into general mechanisms of consequence, they do not—because they cannot—yield insight into how those mechanisms can be brought to bear in particular circumstances. Here, too, the number of plausible surmises about how to reproduce in the field results that have been observed in the lab exceeds the number that truly will. Again, empirical observation and testing are necessary—now in the field. The scarcity of researchers willing to engage in field-based research, and the reluctance of many to acknowledge candidly the necessity of doing so, have stifled the emergence of a genuinely evidence-based approach to the promotion of public engagement with decision-relevant science (Kahan 2014).

The second source of the disconnect between the practice of science and science-informed policymaking, on the one hand, and the science of science communication, on the other, is practical: the integration of the two is constrained by a collective action problem.  The generation of information relevant to the effective communication of decision-relevant science—including not only empirical evidence of what works and what does not but practical knowledge of the processes for adapting and extending it in particular circumstances—is a public good.  Its benefits are not confined to those who invest the time and resources to produce it but extend as well to any who thereafter have access to it.  Under these circumstances, it is predictable that producers, constrained by their own limited resources and attentive only to their own particular needs, will not invest as much in producing such information, and in a form amenable to the dissemination and exploitation of it by others, as would be socially desirable.  As a result, instead of progressively building on their successive efforts, each initiative that makes use of evidence-based methods to promote effective public engagement with policy-relevant science will be constrained to struggle anew with the recurring problems.

Fischhoff, B. & Scheufele, D.A. The science of science communication. Proceedings of the National Academy of Sciences 110, 14031-14032 (2013).

Gregory, R. & McDaniels, T. Improving environmental decision processes. In Decision Making for the Environment: Social and Behavioral Science Research Priorities (eds. G.D. Brewer & P.C. Stern) 175-199 (National Academies Press, Washington, DC, 2005).

Gregory, R., McDaniels, T. & Fields, D. Decision aiding, not dispute resolution: Creating insights through structured environmental decisions. Journal of Policy Analysis and Management 20, 415-432 (2001).

Kahan, D. Fixing the communications failure. Nature 463, 296-297 (2010).

Kahan, D. Making climate-science communication evidence based—all the way down. In Culture, Politics and Climate Change: How Information Shapes Our Common Future (eds. M. Boykoff & D. Crow) (Routledge Press, 2014).

Kahan, D. Why we are poles apart on climate change. Nature 488, 255 (2012).

Kahan, D.M. A risky science communication environment for vaccines. Science 342, 53-54 (2013).

Kahneman, D., Slovic, P. & Tversky, A. Judgment Under Uncertainty: Heuristics and Biases (Cambridge University Press, Cambridge, 1982).

Pidgeon, N.F., Kasperson, R.E. & Slovic, P. The Social Amplification of Risk (Cambridge University Press, Cambridge, 2003).

Popper, K.R. Conjectures and Refutations: The Growth of Scientific Knowledge (Harper & Row, New York, 1968).

Slovic, P. The Feeling of Risk: New Perspectives on Risk Perception (Earthscan, London, 2010).

Slovic, P. The Perception of Risk (Earthscan Publications, London, 2000).

Tuesday
Jan 30, 2018

More glossary entries: pattern recognition, professional judgment & situation sense

Again, complete (or more complete) document here.

Pattern recognition. A cognitive dynamic in which a person recognizes some object or state of affairs by matching it (preconsciously) to a rapidly conjured set of prototypes acquired through experience. [Source: Margolis, H. (1987), Patterns, Thinking, and Cognition (Univ. Chicago Press). Date added Jan. 29, 2018.]

Professional judgment. Domain-specific “habits of mind” (most likely specialized forms of pattern recognition) that guide domain experts (e.g., judges). [Source: Margolis, H. (1996), Dealing with Risk: Why the Public and the Experts Disagree on Environmental Issues (University of Chicago Press). Date added Jan. 29, 2018.]

Situation sense. Karl Llewellyn’s description of domain-specific habits of mind, acquired through education and experience, that enable judges and lawyers to rapidly and reliably converge on case outcomes notwithstanding the indeterminacy of formal legal norms. [Source: Llewellyn, K. (1989), The Case Law System in America (M. Ansaldi, Trans.).  Date added Jan. 29, 2018.]

Monday
Jan 29, 2018

Meet the Millennials, part 4: Motivated System 2 reasoning ...

First, this (familiar) result --

Then this --

Do you see what I see? What does it all mean, if anything??

Sunday
Jan 21, 2018

Weekend update: who has more items for "Cultural Cognition Dictionary/Glossary/whatever"?

Okay, loyal listeners:

Version 1.0 of the CCP Dictionary/glossary/whatever ("DGW") can be viewed here.

Nominations for additional entries are now being solicited. Proposed items should be technical terms, terms of art, or idioms that recur with reasonable frequency on this site and that are likely to be unfamiliar to anyone not among the 14 billion regular readers of this blog.

Friday
Jan 19, 2018

Latest entries to CCP glossary thing: Science of #Scicomm; Rope-a-dope; and "From mouth of scientist. . . ."

Here are some more. At some point, I'll post the entire document & invite nominations for additional terms worthy of definition or explication therein.

Science of science communication. A research program that uses science’s own signature methods of disciplined observation and valid causal inference to understand and manage the processes by which citizens come to know what is known by science. [Source: Oxford Handbook of the Science of Science Communication, eds. K.H. Jamieson, D.M. Kahan & D. Scheufele, passim; Kahan, J. Sci. Comm., 14(3) (2015). Added Jan. 19, 2018.]

From mouth of the scientist to ear of the citizen. A fallacious view that treats the words scientists utter as a causal influence on the formation and reform of public opinion on controversial forms of science. The better view recognizes that what science knows is transmitted from scientists to the public via the influence of dense, overlapping networks of intermediaries, which include not just the media but (more decisively) individuals' peers, whose words & actions vouch for the science (or not) through their own use (or non-use) of scientific insights. Where there is a science communication problem, then, the source of it is the corruption of these intermediary networks, not any problem with how scientists themselves talk. [Source: Kahan, Oxford Handbook of the Science of Science Communication, eds. K.H. Jamieson, D.M. Kahan & D. Scheufele. Added: Jan. 19, 2018.]

Rope-a-dope. A tactic of science miscommunication whereby a conflict entrepreneur baits science communicators into fighting him or her in a conspicuous forum. The strength of the arguments advanced by the antagonists, the conflict entrepreneur realizes, is largely irrelevant. What matters is the appearance of a social controversy, which cues onlookers to connect the competing positions with membership in and loyalty to members of their cultural group. Falling for this gambit marks science communicators as the miscommunicators’ “dope.” [Source: Cultural Cognition Project blog here & here. Added: Jan. 19, 2018.]

 

Wednesday
Jan 17, 2018

Comment function restored!

So let's hear what you think.

Wednesday
Jan 17, 2018

"Meet the Millennials!," part 3: climate change, evolution, and generational polarization

This is part 3 of CCP’s hit series, “Meet the Millennials!”

In episode 1, we saw that the Millennials like to go to the zoo more often than do members of the Baby Boom, Generation X, and Silent Generation cohorts.

In episode 2, we observed that Millennials did better than other generational cohorts on a standardized test of science comprehension (the Ordinary Science Intelligence assessment), but were nevertheless no more science-curious than members of those other age brackets.

Now, in what will be either the final or not the final episode of the series, we take a look at how Millennials fare in their beliefs in human-caused climate change and human evolution.

What do we see?  This on climate change,

 

and this on evolution.

Basically, in relation to political outlooks, Millennials are the generation least polarized on climate change. Similarly, in relation to religiosity, Millennials are the least polarized on acceptance of human evolution.

The difference in the degree of polarization, moreover, increases as the age differentials grow. There’s not much difference between Millennials (born 1982-1999) and members of Generation X (1965-1981). But there is a decided difference between Millennials and Baby Boomers (1946-1964), and an even greater difference between Millennials and members of the Silent Generation (born before 1946).
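For the methodologically curious, here is a minimal sketch, in Python with simulated data (the cohort labels are real; the variable names, effect sizes, and data are hypothetical, not CCP's), of how one might test whether polarization shrinks across cohorts: regress belief on political outlook, cohort, and their interaction, and inspect the interaction terms.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical illustration only: simulate survey-like data in which the
# association between political outlook and climate-change belief (i.e.,
# polarization) weakens for the younger cohorts.
rng = np.random.default_rng(0)
n = 4000
cohort = rng.choice(["Silent", "Boomer", "GenX", "Millennial"], size=n)
conserv = rng.normal(size=n)  # left-right outlook, z-scored
slope = {"Silent": -0.9, "Boomer": -0.7, "GenX": -0.45, "Millennial": -0.4}
belief = np.array([slope[c] for c in cohort]) * conserv + rng.normal(size=n)
df = pd.DataFrame({"belief": belief, "conserv": conserv, "cohort": cohort})

# The cohort x outlook interaction terms estimate how much flatter (less
# polarized) each cohort's outlook-belief slope is relative to the Silents'.
model = smf.ols("belief ~ conserv * C(cohort, Treatment(reference='Silent'))",
                data=df).fit()
print(model.summary())

A positive, significant interaction coefficient for the Millennial term would correspond to the flatter (less polarized) slope visible in the figures.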

I can think of two explanations. The first is cohort shift: based on their experience and common exposure to social influences, the Millennials, while far from uniform in their assessments of controversial science issues, are less worked up over them. On this theory, we can expect a gradual, generational abatement of controversy over matters like climate change and evolution.

The second theory, however, is opinion shift: as they age, members of every generational cohort become more partisan and thus more divided on controversy-provoking forms of science.  There’s actually some literature to support this view, which I’ve commented on before.

Normally, at this point I’d say, “What do you think?” But thanks to Squarespace (which has admitted that it views older sites like this one as “low priority”), the CCP blog’s comment function is broken.

So tell you what: If you’d like to comment on this post, send me an email, and I’ll manually insert your comments into the comment field. Use “Meet the Millennials!” in the subject line so I can be sure to spot the messages and take the work-around steps necessary to let you be heard.

Tuesday
Jan 16, 2018

Couple more items for CC dictionary/glossary

Dual process theory/theories. A set of decisionmaking frameworks that posit two discrete modes of information processing: one (often referred to as “System 1”) that is rapid, intuitive, and emotion pervaded; and another (often referred to as “System 2”) that is deliberate, self-conscious, and analytical. [Sources: Kahan, Emerging Trends in the Social and Behavioral Sciences (2016); Kahneman, American Economic Review, 93(5), 1449-1475 (2003); Kahneman & Frederick in Morrison (Ed.), The Cambridge handbook of thinking and reasoning (pp. 267-293), Cambridge University Press. (2005); Stanovich & West, Behavioral and Brain Sciences, 23(5), 645-665 (2000). Added Jan. 12, 2018.]

Motivated reasoning. A form of unconscious information-processing that is characterized by the selective crediting and discrediting of evidence in patterns that advance some goal or interest independent of the apprehension of truth. Cultural cognition—the biased assessment of evidence protective of one’s status in identity-defining affinity groups—is one form of motivated reasoning. But there are many others, including self-serving apprehension of one’s own abilities, and inattention to evidence of one’s own mortality. Accordingly, cultural cognition should not be equated with motivated reasoning but rather be treated as a species of it. [Sources: Kunda, Psychological Bulletin, 108, 480-498 (1990); Kahan, Harv. L. Rev., 126, 1-77 (2011), pp. 19-26. Added Jan. 15, 2018.]

* * *

Check out the entire dictionary/glossary document; it's getting pretty cool.

Monday
Jan 15, 2018

Aren't you curious to know how Millennials rate in science curiosity?!

Okay—so it was easy to pretrodict that Millennials would be more likely than other age cohorts to have visited a zoo in the last year.

Well, try these:

(1) Which age cohort displays the greatest level of science comprehension?

The answer is … the Millennials!

Stands to reason given how often they visit the zoo, right?

Actually, the margin isn’t particularly big—less than a third of a standard deviation separates the Millennials from the Silent Generation, whose members had the lowest OSI_2.0 scores.

(2) Does the edge that the Millennials enjoy in OSI mean that they are more science curious (as measured by the SCS scale) than members of other generations?

Nope:

Surprising?  Well, it shouldn’t be when we recall that Ordinary Science Intelligence is only modestly correlated with Science Curiosity. 

But maybe it should surprise us, given Millennials’ immersion in new communication technologies: they have a greater opportunity to form and nourish the desire to know how the technologies that surround them work....

Or maybe the immersion cuts the other way: given the extraordinary advances in information technologies over the course of their adulthoods, members of the Silent, Boomer, and GenX generations might have been expected to feel a greater degree of awe than the Millennials, who’ve had that technology all around them their whole lives.

Well, "Everythig is obvious--once you know the answer," as they say.

Another thing to ponder here is the platykurtic (flatter peak, thinner tails than normal) distribution of the Millennials’ responses to the OSI assessment. . . . How should that affect our inferences?
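If you want to check that sort of thing yourself, here is a minimal sketch in Python (using made-up scores, not the actual OSI data) of how to compute excess kurtosis, the standard measure of how platykurtic or leptokurtic a distribution is:

import numpy as np
from scipy.stats import kurtosis

# Hypothetical illustration: excess kurtosis below 0 marks a platykurtic
# distribution (flatter peak and lighter tails than a normal distribution).
rng = np.random.default_rng(1)
normal_scores = rng.normal(size=10_000)
flat_scores = rng.uniform(-1.7, 1.7, size=10_000)  # uniform is platykurtic

for label, scores in [("normal", normal_scores), ("platykurtic", flat_scores)]:
    # fisher=True (the default) returns excess kurtosis, i.e., kurtosis - 3
    print(f"{label}: excess kurtosis = {kurtosis(scores, fisher=True):.2f}")

The normal draw comes out near 0; the uniform draw comes out near -1.2.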

How about another:

Before you know the answer, try to guess this one: are Millennials more likely than are other age cohorts to accept that human beings have caused climate change? (Don’t Google to find out results from general public opinion pollsters).

Answer “tomorrow.”™

Saturday
Jan 13, 2018

Introducing ... the Millennials! They like to go to the zoo!

As part of the CCP Science of Science Filmmaking project, I've been digging around in our existing data sets trying to learn more about the propensities of Millennials. Now & again I will share tidbits that seem worthy of note.

So ... here's one thing: the Millennials are more likely to go to the zoo than are members of other age cohorts:

Does this surprise you? Or did you know, of course, that Millennials are more regular zoo-goers (many of the older ones with children in tow, Loyal Listener Gaythia Weis points out)?

In the future, try to predict things like how the Millennials size up, say, on the Science Curiosity Scale or the Ordinary Science Intelligence assessment, and we'll find out how good a sense you really have for cohort effects in relation to science communication.

For purposes of this and future entries, the age cohorts' birth years are as follows:

Millennials: 1982-1999
Generation X:  1965-1981
Boomers: 1946-1964
Silent generation: before 1946

 

Friday
Jan 12, 2018

A few more glossary entries: dual process reasoning; bounded rationality thesis; and C^4

I haven't had time to finish my "postcard" from Salt Lake City, but here are some more entries for the glossary to tide you over:

Dual process theory/theories. A set of decisionmaking frameworks that posit two discrete modes of information processing: one (often referred to as “System 1”) that is rapid, intuitive, and emotion pervaded; and another (often referred to as “System 2”) that is deliberate, self-conscious, and analytical. [Sources: Kahneman, American Economic Review, 93(5), 1449-1475 (2003); Kahneman & Frederick in Morrison (Ed.), The Cambridge handbook of thinking and reasoning (pp. 267-293), Cambridge University Press. (2005); Stanovich & West, Behavioral and Brain Sciences, 23(5), 645-665 (2000). Added Jan. 12, 2018.]

Bounded rationality thesis (“BRT”). Espoused most influentially by Daniel Kahneman, this theory identifies over-reliance on heuristic reasoning as the source of various observed deficiencies (the availability effect; probability neglect; hindsight bias; hyperbolic discounting; the sunk-cost fallacy, etc.) in human reasoning under conditions of uncertainty. Nevertheless, BRT does not appear to be the source of cultural polarization over societal risks. On the contrary, such polarization has in various studies been shown to be greatest among the individuals most disposed to resist the errors associated with heuristic information processing. [Sources: Kahan, Emerging Trends in the Social and Behavioral Sciences (2016); Kahneman, American Economic Review, 93(5), 1449-1475 (2003); Kahneman & Frederick in Morrison (Ed.), The Cambridge Handbook of Thinking and Reasoning (pp. 267-293), Cambridge University Press (2005); Kahneman, Slovic & Tversky, Judgment Under Uncertainty: Heuristics and Biases (Cambridge University Press, 1982). Added Jan. 12, 2018.]

Cross-cultural cultural cognition (“C4”). Describes the use of the Cultural Cognition Worldview Scales to assess risk perceptions outside of the U.S. So far, the scales have been used in at least five nations other than the U.S. (England, Austria, Norway, Slovakia, and Switzerland). [Source: CCP Blog, passim. Added Jan. 12, 2018.]

 

Thursday
Jan 11, 2018

"Kentucky farmer" spotted in Montana

This site's 14 billion regular subscribers know the Kentucky Farmer as one of the types of people whose habits of mind feature cognitive dualism--the tendency to adopt one set of action-enabling beliefs in one setting and another, opposing set of action-enabling beliefs in another. For Kentucky Farmer, this style of reasoning helps him to maintain his membership in a cultural group for whom climate-change skepticism is identity-defining while also using scientific information on climate change to be a good farmer.

Well, he was sighted recently, not in Kentucky but in Montana. The reporter for a story on the detrimental impact of climate change on barley farming is the one who spotted him:

In the field, looking at his withering crop, Somerfeld was unequivocal about the cause of his damaged crop – “climate change.” But back at the bar, with his friends, his language changed. He dropped those taboo words in favor of “erratic weather” and “drier, hotter summers” – a not-uncommon conversational tactic in farm country these days.

Great #scicomm by Ari LeVaux, the reporter.

But of course this form of information processing remains tinged with mystery.

Wednesday
Jan 10, 2018

Applying the Science of Science Communication

I’m giving a talk tomorrow on motivated numeracy at the University of Utah.  In the very generous allotment of time they’ve afforded me (30 mins or so), I should be able to make pretty good progress in showing why cultural cognition is not attributable to some defect in individual rationality. 

But I’ll still end up with things that I don’t have time to work in. Like the biased processing of information on whether one’s cultural adversaries process political information in a biased fashion. And the role curiosity can play in buffering the magnification of biased information processing associated with greater cognitive proficiency.

I’m sure many of you have experienced this sort of frustration, too.

Well, here’s how I plan to overcome this obstacle.  Likely you’ve seen salespersons at retail outlets wearing colorful “Ask me about . . .” buttons to promote prospective buyers’ awareness of and interest in some new product or service. 

So why shouldn’t academics do the same thing?

Consider:

 

I won’t be wearing these “buttons”—I didn’t have time to make them before I left home. But I will insert them into my slides at the point at which I allude to the relevant studies. Then, I figure, someone—his or her open-minded curiosity aroused—will surely “ask me!” about these ideas in the Q&A!

See how knowing about the science of science communication helps to promote effective communication of scientific data?

I'll write back tomorrow to report on how effective this device was.

Tuesday
Jan 9, 2018

Stupid smart phone or brilliant handgun? You make the call (so to speak)

Who do you think will fear this "smart-phone-disguised" handgun, who won't, & why? 

I have my own hypothesis, of course, but am eager to hear what others think.

Or maybe the existence of this gun/phone is "fake news"?...

 

Monday
Jan 8, 2018

Science communication environment; toxic memes; and politically motivated reasoning paradigm

Some more for Glossary. Arranged conceptually, not alphabetically.

Science communication environment and science communication environment “pollution.” To flourish, individuals and groups need to make use of more scientific insight than they have either the time or capacity to verify. Rather than become scientific experts on myriad topics, then, individuals become experts at recognizing valid scientific information and distinguishing it from invalid counterfeits of the same. The myriad cues and related influences that individuals use to engage in this form of recognition form their science communication environment. Dynamics that interfere with or corrupt these cues and influences (e.g., toxic memes and politically motivated reasoning) can be viewed as science-communication-environment “pollution.” [Source: Kahan in Oxford Handbook of the Science of Science Communication (eds. Jamieson, Kahan & Scheufele), pp. 35-50 (2017); Kahan, Science, 342, 53-54 (2013). Added Jan. 8, 2018.]

Toxic memes. Recurring tropes and idioms, the propagation of which (usually at first by conflict entrepreneurs) fuses diverse cultural identities to opposing positions on some form of decision-relevant science. In the contaminated science communication environment that ensues, individuals relying on the opinion of their peers—generally a successful strategy for figuring out what science knows—polarize rather than converge on the best possible evidence. [Source: Kahan, Scheufele & Jamieson, Oxford Handbook of the Science of Science Communication, Introduction (2017); Kahan, Jamieson et al., J. Risk Res., 20, 1-40 (2017). Added: Jan. 7, 2018.]

Politically motivated reasoning paradigm (“PMRP”) and the PMRP design. A model of the tendency of individuals of diverse identities to polarize when exposed to evidence on a disputed policy-relevant science issue. Starting with a truth-seeking Bayesian model of information processing, the PMRP model focuses on the disposition of individuals of diverse identities to attribute opposing likelihood ratios to the same evidence; this mechanism assures that such individuals will not converge but rather become more sharply divided as they process information. The PMRP design refers to study designs suited for observing this dynamic if it in fact exists. [Source: Kahan, D. M. in Emerging Trends in the Social and Behavioral Sciences (2016). Added: Jan. 8, 2018.]
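To make the PMRP mechanism concrete, here is a toy numerical sketch (my own illustration with hypothetical numbers, not a model drawn from the cited source): two Bayesian agents who start from the same prior but attribute reciprocal likelihood ratios to the same evidence diverge rather than converge as the evidence accumulates.

# Toy sketch of the PMRP mechanism (hypothetical numbers): two Bayesian
# agents share a prior but assign opposing likelihood ratios to the same
# evidence, so each round of updating drives them further apart.

def update(prior_prob, likelihood_ratio):
    # Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio.
    prior_odds = prior_prob / (1 - prior_prob)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

p_a = p_b = 0.5                # identical priors on the disputed fact
for item in range(1, 4):       # three successive pieces of evidence
    p_a = update(p_a, 2.0)     # agent A credits each item (LR > 1)
    p_b = update(p_b, 0.5)     # agent B discredits the same item (LR < 1)
    print(f"after item {item}: A = {p_a:.2f}, B = {p_b:.2f}")

A truth-seeking Bayesian account would have both agents attribute the same likelihood ratio to the evidence and thus converge; the identity-protective attribution of opposing likelihood ratios is what generates the divergence.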

 

 

Sunday
Jan 7, 2018

You guessed it: some more cultural cognition glossary/whatever entries--affect heuristic & conflict entrepreneurs

For the ever-expanding dictionary/glossary. You can actually get a long way in explaining why some science issues provoke cultural polarization and why others don't by examining these dynamics.

Affect heuristic. Describes the role that visceral feelings play in the formation of public perceptions of risks and related facts. Such feelings, research suggests, are not a product but rather a source of the costs and benefits individuals attribute to a putative risk source (e.g., nuclear power, GM foods, climate change). Such feelings likewise shape public perceptions of expert opinion, the trustworthiness of regulators, the efficacy of policy interventions, etc. Psychometrically, all of these perceptions are properly viewed as indicators of a latent pro- or con-attitude, which varies continuously in the general population. The cultural cognition thesis posits that cultural outlooks determine the valence of such feelings, which can be treated as mediating the impact of cultural worldviews on perceptions of risks and related facts. [Sources: Slovic et al., Risk Analysis, 24, 311-322 (2004); Peters & Slovic, J. Applied Social Psy., 26, 1427-1453 (1996); Peters, Burraston & Mertz, Risk Analysis, 18, 715-27 (1998); Poortinga & Pidgeon, Risk Analysis, 25, 199-209 (2005). Date added: Jan. 7, 2018.]

Conflict entrepreneurs. Individuals or groups that profit from filling public discourse with antagonistic memes, thereby entangling diverse cultural identities with opposing positions on some science issue. The benefit conflict entrepreneurs derive—greater monetary contributions to the advocacy groups they head, the opportunity to collect speaking fees, remunerative deals for popular books—doesn’t depend on whether their behavior genuinely promotes the cause they purport to be advancing. On the contrary, they profit most in an atmosphere pervaded by cultural recrimination and contempt, one in which democratic convergence on valid science is decidedly unlikely to occur. Their conduct contributes to that state. [Sources: Kahan, Scheufele & Jamieson, Oxford Handbook of the Science of Science Communication, Introduction (2017); Kahan, Jamieson et al., J. Risk Res., 20, 1-40 (2017); Cultural Cognition blog, passim. Date added: Jan. 7, 2018.]
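To put a bit of statistical flesh on the mediation claim in the affect-heuristic entry above, here is a minimal sketch in Python with simulated data (all variable names and coefficients are hypothetical, not taken from the cited studies) of the familiar regression logic: the worldview-risk relationship shrinks once the affective mediator is controlled for.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical illustration of affect mediating worldview -> risk perception.
rng = np.random.default_rng(2)
n = 2000
worldview = rng.normal(size=n)                 # e.g., hierarchy-egalitarianism
affect = 0.6 * worldview + rng.normal(size=n)  # valenced feeling toward hazard
risk = 0.7 * affect + 0.1 * worldview + rng.normal(size=n)
df = pd.DataFrame({"worldview": worldview, "affect": affect, "risk": risk})

total = smf.ols("risk ~ worldview", data=df).fit()            # total effect
direct = smf.ols("risk ~ worldview + affect", data=df).fit()  # with mediator
print("total effect of worldview:", round(total.params["worldview"], 2))
print("direct effect controlling for affect:",
      round(direct.params["worldview"], 2))  # shrinks toward 0.1 => mediation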

 

 

Saturday
Jan 6, 2018

Culture, worldviews, & risk perception (glossary entries)

More for this:

Cultural cognition worldviews. A typology of risk-perception predispositions formed by the intersection of two moral orientations—hierarchy-egalitarianism and individualism-communitarianism.  Scales measuring these predispositions figure in empirical inquiries informed by the cultural cognition thesis. [Source: Kahan, in Handbook of Risk Theory: Epistemology, Decision Theory, Ethics and Social Implications of Risk (eds. R. Hillerbrand, P. Sandin, S. Roeser, & M. Peterson), pp. 725-760 (2012). Date added: Jan. 6, 2018.]

Cultural theory of risk. A theory that asserts that individuals can be expected to conform their perceptions of all manner of risks, and of the efficacy of measures to abate them, to their worldviews, which are based on Mary Douglas’s “group-grid” typology. [Sources: Douglas, Risk Acceptability According to the Social Sciences (1985); Douglas & Wildavsky, Risk and Culture (1982); Rayner, Cultural Theory and Risk Analysis, in Social Theories of Risk (Krimsky & Golding eds.) 83-115 (1992). Date added: Jan. 6, 2018.]

 

Friday
Jan 5, 2018

New entries for CCP "glossary": cognitive dualism and the disentanglement principle

Still more for this dictionary/glossary in progress:

Cognitive dualism. A theoretical account of reasoning that purports to reconcile opposing states of belief and disbelief in fundamental scientific facts. The theory posits that individuals variously endorse and reject such facts depending on which state—belief or disbelief—best enables them to achieve context-specific goals. Thus a science-trained professional might “believe in” human evolution when he or she is engaged in professional tasks that depend on the truth of that theory, yet still disbelieve in human evolution when acting as a member of a religious community, in which such disbelief enables him or her both to experience and to express membership in and loyalty to that community. Farmers, too, have been observed to “disbelieve in” human-caused climate change when acting as members of their cultural communities, but to “believe in” it when endorsing farming practices that anticipate human-caused climate change. [Sources: Everhart & Hameed, Evolution: Education and Outreach, 6(1), 1-8; Prokopy, Morton et al., Climatic Change, 117, 943-50 (2014); Cultural Cognition blog, passim. Date added: Jan. 4, 2018.]

* * *

The disentanglement principle. Label for a normative practice, derived from empirical findings, that supports the self-conscious presentation of scientific information in a manner that effectively severs positions on contested science issues from message recipients’ cultural identities. The effective use of the disentanglement principle has been credited with the successful teaching of evolutionary theory to secondary school students who “disbelieve in” evolution. It also is the basis for science communication in Southeast Florida, where community engagement with climate-change science draws together groups and communities that hold opposing beliefs in human-caused climate change. [Sources: Lawson & Worsnop, Journal of Research in Science Teaching, 29, 143-66 (1992); Kahan, Advances in Pol. Psych., 36, 1-43 (2015). Added Jan. 4, 2018.]

 

Tuesday
Jan 2, 2018

"Science curiosity" and "SCS", plus "Mobility and Stability hypotheses"--latest entries in Cultural Cognition Dictionary/Glossary (Whatever)

I know, I know -- the construction of this document has taken over this blog of late, but that's because the alternative is to grade 85 criminal law exams. . . .

Science curiosity and “SCS.” Science curiosity is a general disposition that reflects the motivation to seek out and consume scientific information for personal pleasure. Variance in this disposition across persons and groups is measured by the Science Curiosity Scale (“SCS”). Intended to facilitate research on engagement with science documentaries, SCS scores have also been shown to predict resistance to politically motivated reasoning, including Motivated System 2 Reasoning (“MS2R”). [Source: Kahan, Landrum et al., Advances in Pol. Psych., 38, 179-199 (2017). Added Jan. 2, 2018.]

* * *

The mobility and stability hypotheses. Competing conjectures about how individuals’ perceptions of risk and related facts can be expected to behave across different settings (e.g., the workplace vs. the home). The “stability hypothesis” predicts that “individuals will seek to homogenize their experience of social structure in different areas of their lives” in a manner that reflects their static cultural worldviews. The “mobility hypothesis,” in contrast, holds that individuals can be expected to form differing perceptions of risk as they move across social contexts, which themselves are understood to embody distinct, and often opposing, cultural worldviews: “according to this view, individuals may flit like butterflies from context to context, changing the nature of their arguments as they do so.” [Source: Rayner, Cultural Theory and Risk Analysis, in Social Theories of Risk (Krimsky & Golding eds.) 83-115 (1992), pp. 105-106. Added Jan. 2, 2018.]

 

 

Wednesday
Dec 27, 2017

Hey-- still *more* entries for Cultural Cognition Dictionary/Glossary/Whatever

You can read all the entries  (all for now, that is) here.

Expressive rationality. Refers to the tendency of individuals to (unconsciously) form beliefs that signify their membership in, and loyalty to, identity-defining affinity groups. Among opposing groups, expressive rationality produces not convergence but political polarization on the best available scientific evidence. Nevertheless, the strongest basis for treating this type of reasoning as rational is that it intensifies rather than dissipates as ordinary members of the public attain greater proficiency in the styles of reasoning essential to science comprehension (e.g., cognitive reflection, science literacy, and numeracy). [Sources: Kahan, Peters, et al., Nature Climate Change, 2, 732-35 (2012), p. 734; Kahan, Behavioral & Brain Sci., 40, 26-28 (2016); Stanovich, Thinking & Reasoning, 19, 1-26 (2013). Added Dec. 27, 2017.]

The tragedy of the science communications commons. A normative objection to expressive rationality. While it is often rational for an individual to engage in this form of reasoning, it is a disaster when all members of a culturally diverse democratic society do so at once: in that case, members of opposing cultural groups are unlikely to converge (or at least to converge as soon as they should) on what science has to say about the risks their society faces. This consequence of expressive rationality, however, does nothing to reduce the psychic incentives that make it rational for any particular member of the public to engage in identity-protective rather than truth-convergent information processing. [Source: Kahan, Peters, et al., Nature Climate Change, 2, 732-35 (2012), p. 734. Added Dec. 27, 2017.]