Confidence in science 2016

I've featured a version of this graphic a couple of times in the past (here & here), but this updates it for 2016, the latest GSS report on responses to GSS's well-known "institutional trust" battery. The battery solicits "confidence" in those who "run" 13 institutions.

One of these is the "science community," which since the beginning of the GSS (1974) has always ranked second, behind either "the military" or "medicine." What's more, it has always been ranked no lower than second regardless of citizens' political outlooks.

No change in 2016:

 Conservatives still rank "those who run" the "science community" ahead of big business and banks. Religious people still see the science community as more worthy of confidence than organized religion.

So why do we see so many conflicts over decision-relevant science?

I have my own answer to that question. But what's yours?



*Now* do you see an effect? CRT & political outlooks

Okay, the use of ggplot's density function was unhelpful.

But this is pretty darn good: a stacked area graph computed along the continuous "Left-right" outlook measure (which is coded so that conservatism increases as the outlook score goes up). One can easily see how the proportion of respondents at every CRT score (a 3-item assessment) varies across the Left-right values of interest in these four large datasets.

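For anyone who wants to build this kind of figure from their own data, here is a stdlib-only Python sketch of the computation behind it. The data, bin edges, and CRT score weights are all simulated assumptions for illustration, not values from the four datasets: within each slice of the outlook measure, tabulate the share of respondents at each CRT score; those shares are the series a stacked-area plot displays.

```python
import random
from collections import Counter

random.seed(0)
# hypothetical respondents: a continuous left-right score plus a CRT score (0-3);
# the weights are arbitrary, chosen only to make the example data plausible
data = [
    (random.uniform(-2, 2), random.choices([0, 1, 2, 3], weights=[35, 30, 20, 15])[0])
    for _ in range(5000)
]

# within each bin of the outlook measure, the share at each CRT score;
# the four shares sum to 1 and are exactly what the stacked areas depict
for lo in (-2, -1, 0, 1):
    hi = lo + 1
    counts = Counter(crt for lr, crt in data if lo <= lr < hi)
    total = sum(counts.values())
    props = [counts[k] / total for k in range(4)]
    print(f"[{lo:+d},{hi:+d}): " + "  ".join(f"{p:.2f}" for p in props))
```

Feeding the per-bin proportion series to a stacked-area routine (e.g., matplotlib's stackplot, or geom_area in ggplot) reproduces the graph; with narrower bins the areas become effectively continuous along the outlook scale.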

I was exchanging views with someone who specializes in the relationship between political ideology and reasoning style. When I described the relationships here as "trivially different from zero," he objected, stating that "p < .01 means that it IS non-trivially different from zero . . . ."

Anyone disagree?...
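One way to see why the two of us might be talking past each other: with samples of the size at issue here, even a correlation that explains a vanishingly small share of variance will clear p < .01. A stdlib-only Python sketch (the sample size and correlation are simulated for illustration, not drawn from the datasets above):

```python
import math
import random
from statistics import NormalDist

random.seed(1)
n = 100_000
x = [random.gauss(0, 1) for _ in range(n)]
# true correlation is ~0.03 -- "trivially different from zero"
y = [0.03 * xi + random.gauss(0, 1) for xi in x]

# Pearson correlation coefficient
mx, my = sum(x) / n, sum(y) / n
sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
sxx = sum((a - mx) ** 2 for a in x)
syy = sum((b - my) ** 2 for b in y)
r = sxy / math.sqrt(sxx * syy)

# two-sided p-value via the normal approximation (fine at this n)
t = r * math.sqrt(n - 2) / math.sqrt(1 - r ** 2)
p = 2 * (1 - NormalDist().cdf(abs(t)))

print(f"r = {r:.3f}, r^2 = {r**2:.4f}, p = {p:.1e}")
```

Whether r ≈ 0.03 is "trivial" is a substantive judgment about effect size; the p-value speaks only to whether the effect is reliably distinguishable from zero, not to whether it is big enough to matter.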


On curiosity as a civic virtue . . . (fragment)

From correspondence with a colleague who expressed despair for the prospects of enlightened self-government in the age of identity-protective cognition:

. . . 1. Curiosity. In How We Think (1910), ch. 3, § 1, Dewey identifies curiosity as the “most vital and significant” of the auxiliary mental “resources” presupposed by reflective thought.  He focuses on its power both to stock the inventory of “primary facts upon which inference must base itself” and to motivate “suggestion”—the unconscious or pre-conscious process by which attention is aroused and critical thought activated.  He is right to emphasize these functions.

But I wonder if we, with the benefit (?!) of our own experience, might identify another form of aid that curiosity supplies: the negation of politically motivated reasoning. PMR consists in an aggressive avoidance of and defensive resistance to evidence that challenges one’s identity-defining preconceptions. But when one is in the grip of curiosity, one is impelled to engage foreign or exotic ideas in pursuit of the anticipated pleasure of discovering that things work differently from what one could have imagined. By hypothesis, then, curiosity disarms the mental sentries that seek to bar engagement with mind-changing forms of evidence. 

If this is right, then we might elevate curiosity as a civic-cognitive virtue even above rational habits of mind. For without curiosity, the societal benefits of the latter are wasted. Indeed, without curiosity, rational habits of mind are themselves conscripted onto the side of identity-protective cognition, on the behalf of which they contribute to the annihilation of the prospects of meaningful civic participation in science-informed and –guided self-government. . . .


Me & my shadow(s)


What can the Cognitive Science of Religion learn from the Science of Science Communication--and vice versa (lecture summary & slides)

These are the basic points that I recall making at the recent New Perspectives on Science & Religion conference in Manchester, England. Slides here.

1.  What CSR can learn from SSC.  The Cognitive Science of Religion (CSR) uses dual process theory to understand religious convictions.  Religious beliefs, according to CSR, reflect “natural” reasoning, which is rapid, intuitive, and affect-laden. Scientific beliefs, in contrast, reflect “unnatural” reasoning, which is conscious, deliberate, and analytical (Seybold 2017).  Because “natural” reasoning is easier than “unnatural,” we should expect to see pervasive conflicts between religious convictions and scientific insights, such as human evolution.

CSR’s natural-unnatural framework is (as many CSR scholars recognize) a conception of the distinction between “System 1” and “System 2” information processing featured in cognitive science generally (Stanovich & West 2000). The Science of Science Communication (SSC) has developed concepts and methods that help identify how these forms of reasoning figure in public conflicts over science (Kahan 2015b). Incorporating these concepts and methods into CSR can enhance its power to explain public rejection of scientific insights that transgress religious convictions.

Two mechanisms are particularly relevant. One is expressive rationality, which refers to the use of reasoning to form identity-congruent rather than truth-congruent beliefs. The other is motivated system 2 reasoning (MS2R), which investigates the role that System 2 information processing plays in factual beliefs that signify one’s identity (Kahan 2016, 2017b).

Both of these mechanisms figure in beliefs about human evolution.  CSR scholars have attributed religious disbelief in human evolution to overreliance on System 1 heuristic reasoning (e.g., Gervais 2015).  But in fact, much like ideological skepticism about climate change, religiously grounded resistance to evidence of human evolution increases as the capacity and disposition to use conscious, effortful System 2 reasoning increases (Kahan 2017a; cf. Kahan & Stanovich 2017).

This is what SSC tells us to expect to see insofar as positions on human evolution symbolize competing cultural styles. It is an example of how incorporating expressive rationality and MS2R into CSR would enhance CSR’s power to explain the distinctive effects of religion on information processing.

2.  What SSC can learn from CSR.  The relationship between SSC and CSR, however, is not a one-way street.  Just as SSC is in a position to enrich CSR, so CSR is in a position to advance the agenda of SSC. 

The primary contribution CSR can make to SSC, I believe, consists of its myriad distinctive real-world examples of information-processing strategies that variously resist and accommodate the tension between truth-seeking and identity-expressive goals.

An example is the phenomenon of cognitive dualism.  As illustrated by Hameed’s “Pakistani Dr.” paradox, this dynamic refers to the harboring of opposing role-specific factual perceptions (Everhart & Hameed 2013).

“I believe in it [human evolution] at work, but disbelieve in it at home,” says the Dr.

No academic, the Dr. nevertheless has a more nuanced and sophisticated view of “beliefs” than do many decision scientists.  He appropriately recognizes “beliefs” not as registers of assent or non-assent to abstract propositions but rather as action-enabling, affective states, the rationality and consistency of which must be judged relative to the goals of the actor.

Because beliefs so understood necessarily exist within clusters of action-enabling intentional states (emotions, moral judgments, desires, etc.), it is a mistake to apply to them a criterion of identity that conceives of them as free-standing states of assent or non-assent to general claims about  how the world works.  Rather, they can be judged for their rationality and consistency only in relation to the actions they enable: if those actions are suited to the Dr.’s goals and are consistent with one another, then there is no psychological contradiction in the cognitive dualistic stance the Dr. adopts toward them.

I am convinced that the Pakistani Dr. has numerous counterparts in the field of risk perception. These include U.S. farmers who (like the Dr.) disbelieve in climate change in order to be members of a cultural community but who believe in it in order to be successful farmers.

The Dr.’s counterparts also include citizens in SE Florida who, despite being polarized on the reality of human-caused climate change, are of one mind about the collective mission to preserve their way of life from the dangers that human-caused climate change poses to it.

We will not understand these complex and consequential phenomena without an account of cognitive dualism.   And the likely most profitable place to look for such accounts is in CSR.

3.  Normative/prescriptive upshot. Finally, the points of contact between CSR and SSC can help inform moral and prescriptive assessments.

Hameed’s work (2013, 2015) suggests that cognitive dualism on evolution is socially contingent.  It can flourish in a natural—indeed, unremarkable—fashion in societies in which competing positions on evolution have not become entangled with social identity. But where such entanglement has occurred (often as a result of the strategic behavior of conflict entrepreneurs), cognitive dualism is less viable; in that situation, individuals will perceive that they are being put to a choice between knowing what science knows and being the kind of person whose identity is defined by holding a particular position on the fact in question (e.g., human evolution).

They are highly likely in that situation to pick the identity-defining position and forgo engagement with the position that is supported by scientific evidence (Kahan 2015b).

This is a highly undesirable outcome. It is productive of needless group conflict; and it obliterates the division between the private domain, in which free and reasoning individuals should be allowed to form their own conception of the good life, and the public domain, in which they are legitimately obliged to be guided by  the best scientific evidence when inhabiting a role (e.g., a medical Dr.) that can be successfully occupied only with the benefit of such insight.

At least one objective of SSC should be to identify practices and norms that preempt this conflation. Because it is rich with conflicts of this sort, CSR can help SSC to sharpen and refine this function (Kahan 2015b).


Everhart, D. & Hameed, S. Muslims and evolution: a study of Pakistani physicians in the United States. Evo Edu Outreach 6, 1-8 (2013).

Gervais, W.M. Override the controversy: Analytic thinking predicts endorsement of evolution. Cognition 142, 312-321 (2015).

Hameed, S. Making sense of Islamic creationism in Europe. Public Understanding of Science 24, 388-399 (2015).

Kahan, D.M. ‘Ordinary science intelligence’: a science-comprehension measure for study of risk and science communication, with notes on evolution and climate change. J Risk Res 20, 995-1016 (2017a).

Kahan, D.M. Climate-Science Communication and the Measurement Problem. Advances in Political Psychology 36, 1-43 (2015a).

Kahan, D.M. The expressive rationality of inaccurate perceptions. Behavioral and Brain Sciences 40 (2017b).

Kahan, D.M. The Politically Motivated Reasoning Paradigm, Part 2: Unanswered Questions. in Emerging Trends in the Social and Behavioral Sciences (John Wiley & Sons, Inc., 2016).

Kahan, D.M. What is the "science of science communication"? J. Sci. Comm, 14, 1-12 (2015b).

Kahan, D.M. & Stanovich, K.E. Rationality and Belief in Evolution. CCP/APPC Working Paper (2017).

Seybold, K.S. Questions in the Psychology of Religion (Cascade Books, 2017).

Stanovich, K.E. & West, R.F. Individual differences in reasoning: Implications for the rationality debate? Behavioral and Brain Sciences 23, 645-665 (2000).


Still *another* study finds "scientific consensus messaging" ineffective

At this point, we are seeing the social-science equivalent of running up the score, but here's yet another study that finds that "scientific consensus messaging" doesn't work.  

So that makes 4 studies that have explicitly found this result --

  • Bolsen, T. & Druckman, J.N. Do Partisanship and Politicization Undermine the Impact of Scientific Consensus on Climate Change Beliefs? Working paper (2017)
  • Deryugina, T. & Shurchkov, O. (2016). The Effect of Information Provision on Public Consensus about Climate Change. PLOS ONE 11, e0151469.
  • Cook, J. & Lewandowsky, S. Rational Irrationality: Modeling Climate Change Belief Polarization Using Bayesian Networks. Topics in Cognitive Science 8, 160-179 (2016).
  • Dixon, et al. Improving Climate Change Acceptance Among US Conservatives Through Value-Based Messages, Science Communication. (2017)
plus one that disguises that it found such a result,
  • van der Linden, S.L., Leiserowitz, A.A., Feinberg, G.D. & Maibach, E.W. The Scientific Consensus on Climate Change as a Gateway Belief: Experimental Evidence. PLoS ONE 10 (2015).
to 1 reasonably sound study that genuinely found that the message did seem to work with an Australian convenience sample:
  • Lewandowsky, S., Gignac, G.E. & Vaughan, S. The pivotal role of perceived scientific consensus in acceptance of science. Nature Climate Change 3, 399-404 (2012).
I don't know if 83% is a "consensus," but I do know that the weight of the evidence is growing stronger in favor of rejection of the proposition that all you have to do to change their minds is tell skeptics that the vast majority of scientists disagree with them.

Do you see an effect here? Some data on correlation of cognitive reflection with political outlooks

It couldn't last. The "asymmetry thesis" is again sucking me into the vortex....

John Jost included one of my papers in his meta-analysis of research on conservatives' cognitive style, including cognitive reflection.  But I have many, many datasets with these data in them, and I would have been happy to furnish the essential details to him had he asked.

Anyway, here are some more findings that support the conclusion that the relationship between CRT scores and ideology is only trivially different from zero:

If one gets a meaningful effect using a convenience sample, then the sample probably is not valid for trying to draw population-level inferences.

Gives me a chance to renew the question, too, about whether probability density distributions of the sort generated by ggplot are a good way to present this sort of info. 


Now "in press": the Gateway Belief Illusion . . .

Coming soon to a newsstand near you . . . .


Which provides more information--probability density distribution or scatter plot with locally weighted regression line? Which is easiest to comprehend?

I don't mean in all contexts, but here, which is better? Probability density distributions or scatter plots with locally weighted regression? Why?

Pair A

Pair B
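For the second option, the smoother itself is simple enough to sketch. Below is a minimal, stdlib-only Python version of a locally weighted regression (a bare-bones lowess: tricube weights over the nearest neighbors, a single pass, none of the robustness iterations of the full algorithm; the data are simulated for illustration):

```python
import math
import random

def lowess(x, y, frac=0.3):
    """Minimal locally weighted linear regression (tricube weights, one pass)."""
    n = len(x)
    k = max(2, int(frac * n))          # neighborhood size
    pts = sorted(zip(x, y))
    xs = [p[0] for p in pts]
    ys = [p[1] for p in pts]
    fitted = []
    for xi in xs:
        # bandwidth = distance to the k-th nearest neighbor of xi
        h = sorted(abs(xj - xi) for xj in xs)[k - 1] or 1e-12
        # tricube weights; points beyond the bandwidth get weight 0
        w = [max(0.0, 1 - (abs(xj - xi) / h) ** 3) ** 3 for xj in xs]
        # weighted least-squares line, evaluated at xi
        sw = sum(w)
        swx = sum(wi * xj for wi, xj in zip(w, xs))
        swy = sum(wi * yj for wi, yj in zip(w, ys))
        swxx = sum(wi * xj * xj for wi, xj in zip(w, xs))
        swxy = sum(wi * xj * yj for wi, xj, yj in zip(w, xs, ys))
        denom = sw * swxx - swx * swx
        if abs(denom) < 1e-12:
            fitted.append(swy / sw)    # degenerate case: weighted mean
        else:
            b = (sw * swxy - swx * swy) / denom
            fitted.append((swy - b * swx) / sw + b * xi)
    return xs, fitted

random.seed(2)
x = [random.uniform(-3, 3) for _ in range(300)]
y = [0.1 * xi + random.gauss(0, 1) for xi in x]   # weak trend buried in noise
xs, smooth = lowess(x, y)
```

Plotting xs against smooth over the raw points gives the "scatter plot with locally weighted regression line" option: the scatter preserves all the data, while the smoothed line summarizes the trend the density plot would obscure.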


Weekend update: bias & unreliability in peer review

Here's a classic on biases of peer reviewers:

The experimental results are pretty cool. But one finding I view as even more alarming than the reviewers' confirmation bias is Mahoney's discovery that there's very little "inter-rater reliability" among peer reviewers, something that I think people who regularly review for & publish in decision-science journals get to observe firsthand. Another instance of the self-measurement paradox, this finding ought to make those in the business pretty nervous about the power of pre-publication peer review to winnow the wheat from the chaff.

One thing  I hadn't had occasion to notice or think about previously is the citation history:


Two possibilities occur to me.  One is that electronic search has made it easier for authors to find this paper.  

The other is that the topic of confirmation bias among reviewers has become much more topical as challenges to the integrity & effectiveness of pre-publication peer review have become more common.  

One way to sort those out would be to see if articles of other sorts--ones unrelated to cognitive bias of scientists as reviewers--display this same pattern, in which case possibility one would seem the stronger explanation. One could also look to see whether other older articles on reviewer confirmation bias (e.g., Koehler 1993) show a recent-citation upsurge that goes against the usual steady decline in citations as papers age.

For what it is worth, with virtually no investigation on my part, I'm willing to bet, oh, $10,000 that papers inquiring into reviewer bias have increased a lot in the last few years in response to anxiety that there are group-conformity influences in the study of politically charged topics (e.g., climate change).

But if anyone else wants to share some back-of-envelope data analysis here, go for it.

It would be interesting, too, to get some color on the sort of thinker Mahoney, who died at age 60 in 2006, was.

BTW, one of the funniest/scariest things in Mahoney's paper is his quotation of this bit of feedback on the study method, which in part involved having some reviewers read papers w/ just methods & w/o results.

"Personally, I don't see how anyone can write the Introduction and Method without first having the Results" (p. 172), the subject stated. 

It's less surprising that an empirical researcher would feel that way than that he or she would so unself-consciously admit to a style of study presentation that is based on post-hoc storytelling.


Koehler, J.J. The Influence of Prior Beliefs on Scientific Judgments of Evidence Quality. Org. Behavior & Human Decision Processes 56, 28-55 (1993).

Mahoney, M. & DeMonbreun, B. Psychology of the scientist: An analysis of problem-solving bias. Cogn Ther Res 1, 229-238 (1977).  


Travel journal entries

Last week & a half . . .

1. American Academy of Arts & Sciences, Cambridge, MA: Talk on misperceptions & misinformation, based on paper related to the same. Slides here.

2. World Science Festival, NYC. No slides for this panel discussion, but video here.

3. Metcalf Institute, Rhode Island. Ran through “good” and “not so good” explanations of public polarization over (certain) forms of decision-relevant science.  Great program. Slides here.

4. Judges Conference, Second Circuit, US Court of Appeals, Mohonk Mtn House, New Paltz, NY. Discussed hazards of video evidence, which not only can excite motivated reasoning but also distinctively stifles decisionmakers’ perception of the same (some [not all] slides here).  Great presentations by co-panelists Jessica Silbey & Hany Farid.

5. Yale bioethics program lecture:   Communicating science in a polluted science communication environment.

On deck . . .

1. New Perspectives on Science & Religion, Manchester, England. Will likely highlight findings from a study on the relationship between religiosity, cognitive reflection & belief in human evolution.

2. Symposium: Gene Drive Modified Organisms and Practical Considerations for Environmental Risk Assessments.  My talk is tentatively titled, "Forecasting Emerging Technology Risk Perceptions: Are We There Yet?"


Science comprehension without curiosity is no virtue, and curiosity without comprehension no vice

For conference talk in Stockholm in Sept. The "without comprehension" part might be a slight exaggeration, but the point is that people normally recognize valid decision-relevant science w/o understanding it. 25 CCP points for the first person to correctly identify the allusion in the title; hopefully it won't perplex Swedes too much.

It has been assumed (very reasonably) for many years that the quality of enlightened self-government demands a science-literate citizenry (e.g., Miller 1998).   Recent research, however, has shown that all manner of reasoning proficiency—from cognitive reflection to numeracy to actively open-minded thinking—magnifies politically motivated reasoning and hence political polarization on policy-relevant science (e.g., Kahan, Peters et al. 2012, 2017; Kahan 2013; Kahan & Corbin 2016).  The one science-comprehension-related disposition that defies this pattern is science curiosity, which has been shown to make citizens more amenable to engaging with evidence that challenges their political predispositions (Kahan, Landrum et al. 2017).  The presentation will review the relevant research and offer conjectures on their significance, both theoretical and practical.


Kahan, D.M. & Corbin, J.C. A note on the perverse effects of actively open-minded thinking on climate-change polarization. Research & Politics 3 (2016).

Kahan, D.M. Ideology, Motivated Reasoning, and Cognitive Reflection. Judgment and Decision Making 8, 407-424 (2013).

Kahan, D.M., Landrum, A., Carpenter, K., Helft, L. & Hall Jamieson, K. Science Curiosity and Political Information Processing. Political Psychology 38, 179-199 (2017).

Kahan, D.M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L.L., Braman, D. & Mandel, G. The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Climate Change 2, 732-735 (2012).

Miller, J.D. The measurement of civic scientific literacy. Public Understanding of Science 7, 203-223 (1998).


Are misconceptions of science & misinformation the *problem* in the Science Communication Problem?...

From Misconceptions, Misinformation, and the Logic of Identity-protective Cognition . . .

This paper investigates the role that “misinformation” and “misconceptions of science” play in political controversies over decision-relevant science (DRS). The surmise that their contribution is large is eminently plausible. Ordinary members of the public, we are regularly reminded (e.g., National Science Foundation 2014, 2016), display only modest familiarity with fundamental scientific findings, and lack proficiency in the forms of critical reasoning essential to science comprehension (Marx et al. 2007; Weber 2006). As a result, they are easily misled by special interest groups, who flood public discourse with scientifically unfounded claims on global warming, genetically modified foods, and other issues (e.g., Hmielowski et al. 2013). I will call this perspective the “public irrationality thesis” (PIT).

The unifying theme of this paper is that PIT itself reflects a misconception of a particular form of science: namely, the science of science communication. One of the major tenets of this emerging body of work is that public controversy over DRS typically originates in identity-protective cognition—a tendency to selectively credit and discredit evidence in patterns that reflect people’s commitments to competing cultural groups (Sherman & Cohen 2002, 2006). Far from evincing irrationality, this pattern of reasoning promotes the interests of individual members of the public, who have a bigger personal stake in fitting in with important affinity groups than in forming correct perceptions of scientific evidence. Indeed, the members of the public who are most polarized over DRS are the ones who have the highest degree of science comprehension, a capacity that they actively employ to form and persist in identity-protective beliefs (Kahan 2015a).

The problem, in short, is not a gullible, manipulated public; it is a polluted science communication environment. The pollution consists of antagonistic social meanings that put individuals in the position of having to choose between using their reason to discern what science knows or using it instead to express their group commitments. Safeguarding the science communication environment from such meanings, and repairing it where protective measures fail, should be the principal aim of those committed to assuring that society makes full use of the vast stock of DRS at its disposal (Kahan 2015b)....


Scicomm-centerism: Another “selecting on the dependent variable” saga

Okay, this is an argument that I tried to make on my Saturday panel at the World Science Festival & that went over like a fourth-dimension lead balloon. Surely, this is b/c of my own limitations as a science communicator (studying and doing are very different things!).

Cultural polarization affects science but is not about science!

This point—which I invite the 14 billion readers of this blog to help reduce to a better, more descriptive, more evocative sentence—is of vital importance to science communication because it forecloses many explanations of public science controversies and prescriptions for how they should be addressed.

As is usually the case with concepts like these, the best way to make the point is to demonstrate it.

So consider the CCP study reported in They Saw a Protest (2012).

There, subjects, instructed to play the role of mock jurors in a civil trial, watched a film of a political protest that the police had broken up.  The protesters claimed they were involved in peaceful if vigorous debate protected by the First Amendment; the police, in contrast, asserted that the demonstrators had crossed the line to intimidation & coercion, which are not “free speech” for purposes of the U.S. Constitution.

There was an experimental component. We told half the subjects that the protest occurred at an abortion clinic, and that the demonstrators opposed the rights established by Roe v. Wade and its progeny. The other half were told that the protest occurred outside a military recruitment center, and that the demonstrators were criticizing the policy of excluding openly gay and lesbian citizens from the military.

We saw big effects.

Subjects with different cultural values reported seeing different things (protesters blocking pedestrians and screaming in their faces vs. vigorous but peaceful exhortations) if they were assigned to the same condition and thus thought they were watching the same sort of protest.

At the same time, subjects with the same values disagreed with one another on those same facts, and on the proper disposition of the case (an order to enjoin police from interfering with future protests), if they were assigned to different experimental conditions and thus thought they were watching different kinds of protests (anti-abortion vs. nondiscriminatory military recruitment).

These are the same groups that are divided over issues like climate change, nuclear power, fracking etc.

This is a powerful demonstration of how cultural cognition can generate culturally polarized reactions to facts.  But obviously the trigger of such conflict had nothing to do with science—nothing to do, that is, with specific science issues or with one or another group’s position on any such issue.

Because the mechanisms at work in the study (identity-protective cognition, in particular) are the same ones at work in debates over what is known about nuclear power, climate change, fracking, the HPV vaccine, etc., the study strongly suggests that scholars and activists who are centering their attention exclusively on comprehension of science are making a grievous error.

That is, if what is clearly not a disputed-science conflict provokes exactly the species of motivated reasoning that divides cultural groups on science, then it is implausible to believe that anything intrinsic to science—e.g., the “uncertainty” of science, “trust in science,” “acceptance of the authority of science” etc.—drives cultural polarization on science issues, or that remedies designed specifically to address those kinds of barriers to comprehension of science will have any effect.

This is another instance of the errors of inference one can make if one selects on the dependent variable—that is, populates his or her set of observations with ones that presuppose the truth of the hypothesis being tested.  Here, science communication scholars and practitioners are formulating their explanations of, and prescriptions for, science conflicts without reference to whether the cognitive and social dynamics in question apply in any non-science setting.

What I’m saying here does not imply there aren't solutions to public conflicts over science—only that the solutions that treat science conflicts as unique or as uniquely focused on mechanisms of comprehension of science are bound to be mistaken.

Indeed, once one recognizes that many non-science cultural conflicts exhibit exactly the same sorts of biases in factual perceptions, then one's stock of potential explanations and remedies is likely to widen, or in any case become much more accurate and effective—because in that case one will have access to the empirical work of scholars who’ve been studying pertinent dynamics of cultural polarization outside the setting of science controversies.

What are those researchers discovering?

That’s open to debate, of course. But in my view, they see particular value in the “law of social proof”—the principle that holds that individuals will generally conform their behavior to that of others with whom they identify and whom they understand to be informed, socially competent actors.

In any case, I’m less concerned with identifying exactly what casting aside the science-controversy blinders will teach us than I am with helping science communicators to understand why they should cast the blinders aside. 

If they did that--if they jettisoned the “scicomm centerism” that is now shaping their work—what they’d be enabled to see would vastly enrich their craft.


Next stop: Metcalf Institute

Next stop . . .


Let's talk about science polarization . . . NY World Science Festival

Looking for something to do on Sat. evening?