I had the tremendous privilege (one that yielded an even larger benefit in the enlargement of my own knowledge) of participating in the SENCER summer institute at Santa Clara University last week.
SENCER—which stands for Science Education for New Civic Engagements and Responsibilities—is an integrated set of practical research initiatives aimed at promoting the development and use of scientific knowledge on how to teach science. It is actually one of a family of programs created to carry out the broader mission of the National Center for Science and Civic Engagement: “to inspire, support, and disseminate campus-based science education reform strategies that strengthen learning and build civic accountability among students in colleges and universities.”
It’s not amusing that those whose job it is to impart knowledge of empirical methods so infrequently even ask themselves whether their own methods for doing so—from the mode of teaching they use in the classroom to the materials and exercises they assign to students to the examinations they administer to test student comprehension—are valid and reliable.
On the contrary, it’s an outright scandal that demeans the culture of science.
SENCER comprises a sprawling, relentless, and expanding array of resources aimed at dissolving this embarrassing contradiction. These include a growing stockpile of empirical research findings; a trove of practical materials designed to enable use of this knowledge to improve science education; the sponsorship of regular events at which such knowledge is shared and plans for enlarging it formulated; a set of regional centers that coordinate efforts to promote evidence-based methods in the teaching of science; and most important of all a critical mass of intelligent and passionate people committed to the program’s ends.
The occasion for SENCER—the peculiar insulation of a craft dedicated to propagating valid empirical methods from empirical evidence bearing on the realization of its own goals—is not unique to science education.
It is at the root, too, of what I have called the science communication problem—the failure of ample, compelling, readily accessible and indeed widely disseminated evidence to quiet persistent public controversy over risks and other facts to which that evidence directly speaks. Climate change is, of course, the most conspicuous example of the science communication problem but it is hardly the only consequential instance of it.
Immense resources are being dedicated to solving this problem and appropriately so.
But the aggressive resistance to evidence-based practice that pervades the climate-change advocacy community and their counterparts on other issues means that the vast majority of these resources are simply wasted.
I’m not kidding: hundreds and hundreds of millions of dollars are foreseeably expended on programs that are certain not to have any positive impact (aside from raising the profile of those who operate the programs)—not so much because the initiatives being sponsored are ill-considered (although many indisputably are!) but because those who are being awarded the money to carry them out aren’t genuinely committed to (or maybe just not genuinely capable of) considering empirical evidence.
They don’t meaningfully engage existing evidence on communication dynamics to determine what psychological and political mechanisms their initiatives presuppose and what is known about those mechanisms.
They don’t carry out their initiatives in a manner that is geared to generating what might be called programmatic evidence in the form of pretest results or early-return data that can be used to refine and calibrate communication efforts as they are unfolding.
And worst of all, they lack any protocols that assure information on the impact of their efforts (including the lack thereof) is collected, preserved, and freely distributed in the manner that enables the progressive accretion of knowledge.
Instead, every surmise from every source—no matter how innocent of the conclusions of those who have previously used scientific methods to test theirs—is created equal in the world of science communication advocacy.
Every day is a new day, to be experienced free of the burden to take seriously what was learned (from failure as well as success) the day before.
Amy Luers has noticed this too, in a perceptive, evidence-informed article in Climatic Change addressed specifically to the foundations that are the primary sources of support for efforts to promote constructive engagement with climate science.
Her article is evidence of a heartening awareness that the evidence-free culture that has characterized science communication in this area of public policy and others is barren of the supportive practices and habits and outlooks that nourish growth of empirical knowledge.
Maybe things will change.
But there are still other science-communication professions that are puzzlingly—unacceptably, intolerably!—innocent of science in their own operations.
Science journalism—including (here) popular science writing and science documentary production as well as science news writing—is one.
I have said before that I regard these professionals with awe—and gratitude, too. Much as the bumblebee defies the calculations of physicists who insist that its capacity for flight defies physical laws, so science journalists seem to defy basic mechanisms of psychology by creating a form of commensurability in understanding that enables the curious nonscientist to participate in—and thus experience the wonder of—what scientists, by applying their highly specialized knowledge, discover about the mysteries of nature.
There is no communication alchemy involved here. Using a form of professional judgment exquisitely tuned by experience, the science journalist mines the fields of common cultural understanding for the resources needed to construct this remarkably engineered bridge of insight.
Yet how to do what they do is a matter that constantly confronts the members of this special profession with factual questions that they themselves do not have confident answers to—or have confident but conflicting opinions about.
Do norms of journalistic neutrality—such as “balanced” coverage of science issues that generate controversy, within science or without—distort public understanding or help inform curious individuals of the nature of competing claims?
Is the segment of the population that experiences wonder and awe at scientific discovery more culturally diverse than the current regular audience for the highest quality science documentaries? If so, do those programs convey meanings collateral to their core scientific content that constrain the size and diversity of their audience?
(These are issues that figured, actually, in two of the sessions of my Science of Science Communication course from last spring; I am delinquent in my promise to report on the nature of those sessions.)
These are empirical questions, ones that journalists could answer better if they had evidence generated specifically to inform the ongoing collective discussion and practice that are the source of their craft knowledge. But instead, we see here, too, the sort of “every-conjecture-created-equal,” “every-day-a-new-day” style of engagement that is the signature of evidence-free, nonscientific thought—thought that by its nature is incapable of creating incremental enlargement of knowledge.
I could go on; not just about science journalism, but about many other science-communication professions that are evidence-free about the nature of their own practices. Like the law, e.g.
But the point is that these professions, too, are ripe for SENCERizing. They need to be fortified with the sorts of resources and programs that SENCER comprises. And to get that fortification they require not only a core of practitioners who embrace this philosophy—I think such practitioners already exist, actually—but also structures of collective action that will, through the dynamics of reciprocity, create the self-reinforcing contributions of those practitioners to those resources and programs.
SENCER itself might well be a vehicle for such developments. Its gracious invitation to me to participate in its summer institute reflects the interest of its members in enlarging the scope of their endeavor to the communication of decision-relevant science.
But it would be a mistake to think that SENCERizing science communication generally means relying on SENCER, or SENCER alone, to facilitate the advent of evidence-based practices within the relevant science-communication professions.
The remarkable founder of SENCER, Wm. David Burns, made this clear to me, in fact.
I asked him if he himself regarded the program as an “engine for” or a “model of” what needs to be done to make science education and science communication generally more evidence based.
He answered that the only appropriate way to think of SENCER is as an “experiment” of a fractal nature: by enabling those who believe science education must be evidence based to continuously form, refine, and test competing conjectures about how to build on and refine their knowledge of how to effectively impart scientific knowledge, SENCER itself tests the hypothesis that the particular mode of organization it is, and will become through such a process, is an effective way to achieve its own ends.
SENCER, then, is surely a model (an iterative, self-updating one at that!) of the style of conjecture and refutation that is the engine that drives scientific discovery.
And such a model is necessarily one that cannot be reduced to a particular form or formula. For the very logic on which its own success is founded consists in the continuous engagement of competing models, whose successive remedies for one another's inevitable imperfections are what continuously make us smarter than we were before.