This is the second in a series that will be between 3 and 14,321 posts on the connection between science and the craft norms of science journalism.
The point of the series, actually, is that there isn’t—ironically—the sort of connection there should be.
I myself revere science journalists. To me, they perform a kind of magic: they make it possible for me, as someone of ordinary science intelligence, to catch a glimpse of what we have come to know about the workings of the universe through science, and to be filled with the genuine wonder and awe that seeing this inspires.
This isn’t really magic, of course, because there’s no such thing as magic, and it would insult anyone who accepts science’s way of knowing as the best—the only valid—way of knowing to say that what he or she is doing amounts to “magic” if the person saying this weren’t being ironic or whimsical (I could imagine describing something as “magic” in a tone of rebuke or contempt: e.g., “Freudian psychoanalysis is a form of magic.”).
But what science journalists do is amazing and hard to fathom. They perform an astonishing task of translation, achieving a practical, workable commensurability between two systems of rational apprehension: the one that ordinary people use to make sense of the phenomena they must recognize and handle appropriately in the domain of everyday life, and the one that scientists in a particular field must use to make sense of the phenomena in their professional domain.
Both systems are stocked with prototypes finely tuned to enable the sort of recognition that negotiating the respective domains requires.
But those prototypes are vastly different; or in any case, the ones the experts use are absent from, and very distinct from anything in, the inventory of patterns and templates of the ordinary, intelligent person.
These special-purpose expert prototypes (acquired through training and professionalization and experience) are what allow the expert to see reliably what others in his or her field see, and thus to participate in the sharing and advancement of knowledge in that expert domain.
But enabling the ordinary nonexpert to see the things that science comes to know through experts’ specialized professional judgment is the whole point of science journalism!
Science journalists must necessarily find some means of bridging the gap between the prototypes of the expert scientist and the everyday ones of the curious nonexpert, so that the latter can form a meaningful apprehension of the amazing, awe-inspiring insights that the former glean through science’s methods of knowing.
This isn’t magic, in fact.
It is craft. Of the most impressive and admirable sort.
It comprises norms that reliably populate the mind of the science journalist with prototypes and patterns of communication practices that achieve the amazing commensurability I’m talking about.
Science journalists generate these craft norms through their collective activity, and acquire them through experience.
But they aren’t static. They evolve.
Moreover, they aren’t invisible. They are matters that science journalists, like any other professionals, become acutely aware of as they do their jobs, and do them in concert with others with whom they discuss, and from whom they learn, their craft.
And like other professionals, science journalists are keenly interested in whether their craft norms are in order.
In the account I’m giving, craft norms are the medium by which professional judgment is formed and through which it operates.
Like a method of scientific measurement, professional judgments need to be reliable: they must enable consistent, replicable, shared apprehension of the phenomena that are of consequence to members of the profession.
But like methods of scientific measurement, they must also be valid. The thing they enable those who possess them reliably, collectively, to apprehend and form judgments about must genuinely be the thing that those in the profession are trying to see.
In the case of the science journalist, that thing that must be seen—not just reliably but accurately—is how to make it possible for the nonexpert of ordinary science intelligence to form the most meaningful, authentic, true picture of the awesome things that are genuinely known to science.
Science journalists, like other professionals, are constantly arguing about whether their norms are valid in this sense. “Are we really doing what we want to do as best we can?” they ask themselves.
Actually, there is no sense of crisis in the profession (as far as I can tell). They know full well that in the main their craft norms are reliably guiding them to ways of communicating that actually work.
But there are plenty of particular matters—ones of genuine consequence—that they worry about, that they have different opinions on, that relate to whether particular things they are doing might actually be working less well than some alternative or maybe even frustrating their goals.
The last post touched on one of those things: In it I discussed Andrew Gelman’s critique of the passivity of science journalists in reporting on “WTF!” social science studies—ones that report remarkable, astonishing, unbelievable results that, in Gelman’s view, almost inevitably are shown to rest on a very basic methodological defect.
It’s not as if science journalists aren’t aware of that issue & filled with views about it!
What’s more, Gelman proposed a solution: interview lots of additional experts besides the study authors and find out if they think the study is valid.
Actually, science journalists talk about this too! The issue isn’t just whether this is a feasible idea but whether it is actually a sound one given what science journalism is trying to do.
Gelman didn’t recognize that his prescription is bound up with the controversy over whether “balanced coverage”—a norm that enjoins science journalists to cover “both sides” and evince a posture of “neutrality” toward disputed scientific claims—actually contravenes the objective of helping the public form an accurate perception of what’s known by science, particularly on controversial issues like, say, evolution or climate change.
Which gets to another thing that I think was missing, not just from Gelman’s (excellent!) essay but from the discussion that science journalists, as a professional community, are constantly having.
The matters they are debating when they reflect on the validity of their craft norms are very often empirical ones.
They admit of empirical investigation. Indeed, they demand it: members of a profession are no more able to determine through simple debate which of multiple plausible accounts of a phenomenon is true than scientists are!
Scientists don’t just debate in that situation. They collect empirical evidence!
That’s what science journalists need to do too.
They need to make their profession evidence-based—they need to create procedures for identifying craft-norm issues that admit of empirical testing, and mechanisms and institutions for collecting that evidence, transmitting it, and reflecting in common on what it reveals.
Not as a substitute for their craft-norm-informed professional judgment—but as a self-consciously managed source of knowledge that they can use as they participate in the process by which their craft norms are formed, evolve, and are transmitted.
The need for an evidence-based culture in science journalism is one of the things I had in mind when I said that the points of connection between science journalism and science itself need to be strengthened.
In fact, it is the most important. But there are other points worth mentioning—ones that it will be easier to explain now that this point is out there.
So I will say more. Later.
But the one last thing I will say is that science journalism is not the only profession that is committed to the transmission of scientific knowledge that, to its disadvantage, fails to use science’s way of knowing to advance its knowledge of how to transmit what science knows.
Indeed, science journalists are in a position to do a tremendous favor for those other professions by showing them how to remedy this problem.
Some might think, after decades of aggressive inattention to the science of science communication by those responsible for transmitting decision-relevant science in our democracy, that nothing short of magic will ever remedy our democracy’s deficit in science communication intelligence.
If so, then science journalists are the ones we need to show us how to pull this trick off.