Tuesday, January 15, 2013

Yale University "Science of Science Communication" course

Am teaching this course this semester:

PSYC 601b. The Science of Science Communication. The simple dissemination of valid scientific knowledge does not guarantee it will be recognized by nonexperts to whom it is of consequence. The science of science communication is an emerging, multidisciplinary field that investigates the processes that enable ordinary citizens to form beliefs consistent with the best available scientific evidence, the conditions that impede the formation of such beliefs, and the strategies that can be employed to avoid or ameliorate such conditions. This seminar will survey, and make a modest attempt to systematize, the growing body of work in this area. Special attention will be paid to identifying the distinctive communication dynamics of the diverse contexts in which nonexperts engage scientific information, including electoral politics, governmental policymaking, and personal health decision making. 

Here's a "manifesto" of sorts, which comes from the course syllabus:

1. Overview. The most effective way to communicate the nature of this course is to identify its motivation.  We live in a place and at a time in which we have ready access to information—scientific information—of unprecedented value for our individual and collective welfare. But the proportion of this information that is effectively used—by individuals and by society—is shockingly small. The evidence for this conclusion is reflected in the manifestly awful decisions people make, and outcomes they suffer as a result, in their personal health and financial planning. It is reflected too not only in the failure of governmental institutions to utilize the best available scientific evidence that bears on the safety, security, and prosperity of their members, but in the inability of citizens and their representatives even to agree on what that evidence is or what it signifies for the policy tradeoffs that acting on it necessarily entails.

This course is about remedying this state of affairs. Its premise is that the effective transmission of consequential scientific knowledge to deliberating individuals and groups is itself a matter that admits of, and indeed demands, scientific study.  The use of empirical methods is necessary to generate an understanding of the social and psychological dynamics that govern how people (members of the public, but experts too) come to know what is known to science. Such methods are also necessary to comprehend the social and political dynamics that determine whether the best evidence we have on how to communicate science becomes integrated into how we do science and how we make decisions, individual and collective, that are or should be informed by science.

Likely you get this already: but this course is not simply about how scientists can avoid speaking in jargony language when addressing the public or how journalists can communicate technical matters in comprehensible ways without mangling the facts.  Those are only two of many "science communication problems," and as important as they are, they are likely not the ones in most urgent need of study (I myself think science journalists have their craft well in hand).  Indeed, in addition to dispelling (assaulting) the fallacy that science communication is not a matter that requires its own science, this course will self-consciously attack the notion that the sort of scientific insight necessary to guide science communication is unitary, or uniform across contexts—as if the same techniques that might help a modestly numerate individual understand the probabilistic elements of a decision to undergo a risky medical procedure were exactly the same ones needed to dispel polarization over climate science! We will try to individuate the separate domains in which a science of science communication is needed, and take stock of what is known, and what isn’t but needs to be, in each.

The primary aim of the course comprises these matters; a secondary aim is to acquire a facility with the empirical methods on which the science of science communication depends.  You will not have to do empirical analyses of any particular sort in this class. But you will have to make sense of many kinds.  No matter what your primary area of study is—even if it is one that doesn’t involve empirical methods—you can do this.  If you don’t yet understand that, then perhaps that is the most important thing you will learn in the course. Accordingly, while we will not approach study of empirical methods in a methodical way, we will always engage critically the sorts of methods that are being used in the studies we examine, and from time to time I will supplement readings with more general ones relating to methods.  Mainly, though, I will try to enable you to see (by seeing yourself and others doing it) that apprehending the significance of empirical work depends on recognizing when and how inferences can be drawn from observation: if you know this, you can learn whatever more is necessary to appreciate how particular empirical methods contribute to insight; if you don’t know this, nothing you understand about methods will furnish you with reliable guidance (just watch how much foolishness empirical methods separated from reflective, grounded inference can involve).

Will post course info & weekly reading lists (not the readings themselves, sadly, since they consist mainly of journal articles that it would violate Yale University's licensing agreements for me to distribute hither & yon; I certainly don't want the feds coming down on me for the horrible crime of making knowledge freely available!)

First session was yesterday & topic was HPV vaccine. It was a great class.  Plan to post some reflections on reading & discussion soon. But have to go running now!


Reader Comments (12)

This sounds like an awesome class! I wish it were distance-delivered, so that I could join you all! But thanks for posting the reading lists and other course materials. I surely will be checking in on it, as my university does not offer anything similar. Thank you!

January 16, 2013 | Unregistered CommenterKristin Timm

Situational science communication.

Though I am somewhat surprised you did not include the difference between comprehension in the general sense and the nuanced understanding(s) with respect to differences among experts, and the resulting confusion that may be misidentified as a failure in the general sense, yet isn't. And my all-time favorite: the intrinsic assumption, and its impact on (mis)understanding at both the general and nuanced levels.

January 16, 2013 | Unregistered CommenterJohn F. Pittman

@Kristin: Thank you! I definitely want to try to simulate parallel class sessions via the blog. Stay tuned!

@John: Oh, just wait. There will be plenty on communication of "what scientific experts know," including how experts communicate what they know to one another.

January 17, 2013 | Unregistered Commenterdmk38

Will one of your suggestions, or perhaps acknowledged pitfalls, be about language and definition? Science, like law, has code words that express complex thoughts and relations but may carry misleading or culturally defined precepts that are incorrect for communication in science or law. An example of this in the climate change war is the use of temperature, which is a poor metric for the actual science, which requires enthalpy; any such claim automatically carries assumptions, stated or not, that can be challenged.

I also wonder: did you include in your notes how "real" communication between computers takes place, and the effect, good and bad, that our use of language, context, body language, etc., has in reducing the time and effort? Which is why definitions and communicating science can run into traps.

I also wonder whether you are going to use Slovic and Fischhoff, or similar, on the Tell and Ask paradigms of communication, especially risk communication?

January 17, 2013 | Unregistered CommenterJohn F. Pittman

@John: these are fascinating points. I'm sure differences in science language & ordinary usage (e.g., "error") will come up. But I think we overestimate the number of contexts in which this is a source of public confusion or conflict. It poses a challenge for those trying to understand the primary sources (including obviously those trying to learn to become scientists). But I don't think the language of science is the source, say, of disputes like the one over climate change (or else we'd have 10^15 as many such conflicts). Maybe you disagree & would show me I should revise my views; that is what a seminar is good for! Also good for learning completely new things, such as the information you are adverting to on computers & communication. Send me citations? Also please keep making points like these, particularly in connection with the blog version of the course session.
On Slovic & Fischhoff: leaving them out would be like purporting to teach physics without mentioning Newton!

January 17, 2013 | Unregistered CommenterDan Kahan

I will look up some citation(s) later today, I hope. It is how a computer does a "handshake." In the introductory electrical engineering class I had to take, it was an exercise for the student, so that we would realize the nature of "certain" communication as exemplified by two computers agreeing to exchange information. This was in the days of the modem and baud. Humans only do about 2 of the six steps, IIRC. Great time saver, but it introduces error. In your paradigm, our cultural cognition reduces this and makes such communication by humans more time- and use-effective than if we used the computer communication model. Humans' ways obviously work, at least somewhat.

On definitions: consider the use by engineers of gain and feedback versus the use in climate science. Or the public use of temperature rather than enthalpy as the metric to explain climate change, which has proven problematic, with skeptics using the same methodology as the IPCC communicators to claim the warming has stopped, or that we are actually cooling. Both of these have caused heated exchanges, and if I understand you and the lit., such encounters should be avoided because heated exchanges increase the entrenchment effect.

But it may be useful in your seminar only as an example of what can go wrong with poor choices, or with choosing words that have an accepted scientific meaning in one field when yours is different. So let me ask you: if water vapor is a positive feedback, what is the effect of increasing temperature at the top of the atmosphere? You can also go to about any climate (war) blog and find all sorts of discussion of miscommunication from word usage. Again, it may only be something to highlight.

January 17, 2013 | Unregistered CommenterJohn F. Pittman

http://en.wikipedia.org/wiki/Handshaking gives the general start of communication. http://www.ehow.com/how_8596746_explain-six-steps-communication-process.html gives the six steps in human communication in the author's opinion.

I can't find one like our EE book, but these should give the idea.

People like myself are often unintentionally impolite. You don't see us unless we want something. I am likely to come in, ask my question, get the answer, and leave. So the verbal steps equal two.

From memory:
a. protocol agreement
1. 1st computer: I will send information
2. 2nd computer: I am ready to receive information
3. 1st computer: I am sending information
4. 2nd computer: I am receiving information
5. 1st computer: I sent this particular (verifiable) information
6. 2nd computer: I confirm I received this particular verified information.
b. end protocol or start another communication.
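For fun, the six steps above can be simulated in a few lines of code. This is purely an illustrative toy, not any real modem protocol; the function names and message wording are invented, and a hash digest stands in for whatever verification scheme a real protocol would use:

```python
# Toy simulation of the six-step handshake sketched above.
# The digest check in steps 5-6 is what makes the exchange "verifiable."

import hashlib

def checksum(data: bytes) -> str:
    """Digest the receiver recomputes to verify the payload (step 6)."""
    return hashlib.sha256(data).hexdigest()

def handshake_exchange(payload: bytes):
    transcript = []
    transcript.append("A: I will send information")            # step 1
    transcript.append("B: I am ready to receive information")  # step 2
    transcript.append("A: I am sending information")           # step 3
    transcript.append("B: I am receiving information")         # step 4
    sent_digest = checksum(payload)                            # step 5
    transcript.append(f"A: sent {len(payload)} bytes, digest {sent_digest[:8]}...")
    ok = checksum(payload) == sent_digest                      # step 6
    transcript.append("B: I confirm I received this verified information"
                      if ok else "B: digest mismatch, please resend")
    return ok, transcript

ok, log = handshake_exchange(b"hello, world")
for line in log:
    print(line)
print("verified:", ok)
```

Every step costs a round trip, which is the time-saving point: humans skip most of them and let shared context absorb the risk of error.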

There sure is a lot of advice on communication on the net, especially the successful bit. But they do not seem to have the elements that yours has.

Your comment on Sunstein

I believe the phenomenon at work in polarized science debates is something more general: identity-protective motivated reasoning. This refers to the tendency of people to conform their processing of information -- whether scientific evidence, policy arguments, the credibility of experts, or even what they see with their own eyes -- to conclusions that reinforce the status of, and their standing in, important social groups.

My question: how would one constrain experience not to be a variable? My favorite example, anecdotal I assume, is the story of the young woman who, after being mugged three times by young black males, became fearful of and prejudiced against young black males, and who was herself black.

I would not assume that one's only knowledge and experience was with one's social group, nor could one's social group know of all the inputs of knowledge and experience that make up one's history. So I guess the operative word is "tendency," and my assumption is that, as the course develops, the ability to detect, direct, or compensate for these tendencies will be demonstrated.

January 17, 2013 | Unregistered CommenterJohn F. Pittman

John,

Interesting analogy. But there are a few more steps needed for real-world communication channels.

1. All participants need to be given a shared background knowledge of the range of possible communication methods. This has to happen outside the communication channel itself.
2. Establish a basic connection using the most robust method available. One participant calls for attention, the other replies to say they can hear the signal.
3. Measure the error characteristics of the channel (including the encoding/decoding machinery). What bandwidth can it take? What sort of noise does it introduce? This is done by sending various test messages and seeing if they come back garbled.
4. One participant suggests a range of communication protocols that it can understand and that will tolerate the noise on the channel. The other participant selects the best one that it too can understand.
5. Switch to the new protocol, and repeat. Establish a connection, measure the noise, decide whether to stick with the protocol or suggest a switch.
6. Send the message using the selected protocol.
7. Test whether the message was received correctly. Resend or change protocol if not.

A protocol has to define the encoding (alphabet), the vocabulary, and the grammar. It has to specify who talks when, and for how long. It has to specify ways to interrupt, confirm reception, notify of errors, recover from errors, request temporary or permanent changes to protocol, and detect and respond to a total loss of the communication channel.

With computers you have to do some fairly careful mathematical analysis ahead of time to make sure all possibilities are covered, and you don't get a permanent deadlock because one side is waiting for a signal the other side isn't going to send. With humans, you don't have to be so prescriptive, because they're usually flexible and intelligent enough to figure out what's gone wrong and how to fix it. But there is a remarkable amount of human-human communication devoted to managing the communication channel itself. Body language and facial expressions, intonation of voice, speed of delivery, standard phrases like "excuse me!" and "excuse me?" and "Huh?". You don't notice it until it goes wrong. People follow standard scripts for many common exchanges, like purchases, apologies, requests, complaints, business meetings, social chatter, and so on. Badly breaking the communications protocol is a social faux pas.

In science communication, particular attention has to be paid to step 4, selecting a level of communication that the recipient can understand. Lower levels are more robust, but less efficient. If you try to explain radiative transfer in baby-talk it will take you weeks. Higher levels are far more efficient, but not as widely known. You can do it in a couple of lines if you use differential equations, but not many will know how to decode them.
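That step-4 negotiation can be made concrete with a toy sketch. The protocol names, expressiveness scores, and noise thresholds here are all invented for illustration; real protocol negotiation is of course far richer:

```python
# Illustrative sketch of step 4: pick the most expressive protocol that
# both parties understand and that tolerates the measured channel noise.
# All names and numbers below are made up for the example.

def negotiate(offered, understood, channel_noise):
    """offered maps protocol name -> (expressiveness, max noise tolerated);
    understood is the set of protocols the other participant knows."""
    candidates = [
        (spec[0], name)
        for name, spec in offered.items()
        if name in understood and spec[1] >= channel_noise
    ]
    if not candidates:
        return "baby-talk"      # fall back to the most robust level
    return max(candidates)[1]   # most expressive survivor

sender = {"differential-equations": (9, 0.01),
          "plain-english": (4, 0.2),
          "baby-talk": (1, 0.9)}
receiver = {"plain-english", "baby-talk"}

print(negotiate(sender, receiver, channel_noise=0.1))  # "plain-english"
print(negotiate(sender, receiver, channel_noise=0.5))  # "baby-talk"
```

The tradeoff is exactly the one described above: differential equations are the most expressive channel but tolerate almost no noise in the audience, so a noisy channel forces you down toward baby-talk.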

And you also have to pay attention to continual error checks. Have they understood? Have they mis-understood, and if so how? Do you notify them of the error and suggest they read it again, or do you try to explain it a different way? How can you do diagnostic tests to identify the cause of the error?

And of course it all gets even more complicated when participants have other motivations than simple communication; when interests conflict. Like, one participant doesn't just want to communicate, they want to persuade - to manipulate the internal state of the other participant in a certain way, that's not necessarily what the other participant wants. The obvious electronic analogy is with hacking. And designing a secure protocol that can't be hacked is several orders of magnitude more difficult than designing a protocol simply to cooperatively communicate.

It's a hard problem anyway, and people are harder than computers. But I would think there were indeed some insights to be shared with the computer communications and security sciences.

January 18, 2013 | Unregistered CommenterNiV

From http://www.bbc.co.uk/news/science-environment-21066534

""In a frank conversation this week a Met Office contact told me: "I must admit to feeling a bit misled myself as to how experimental the seasonal forecasts were. When the team said the forecasts were pretty good they meant that they were scientifically good given the extreme difficulty of the challenge. I accept that from the point of view of people deciding where to take their holiday they were actually pretty useless. We have learned a lesson from that."

But is there a lesson for the media in this? Well, (and this is a forlorn hope) it would be helpful to report on the ongoing debate over science and policy without treating climate change like a rancorous "he said, she said" political debate.""

My question, how should uncertainty be handled in the science of science communication?

The modem protocol was developed because it was known that the communication path was noisy. I would assume the same for communicating science. So, we have the noise from uncertainty, and noise from different levels of education and experience, and noise from democratic conversations of policy.

I will try to find a link tonight or tomorrow for what I think is a must-read paper for discussing climate change science communication. It was commissioned by the UK gov. The consultants advised advertising agencies as to the causes of failed advertising campaigns. They had a list of dos and don'ts, and why. Dan, if you already have this, could you provide a link? I would like to read it again.

January 18, 2013 | Unregistered CommenterJohn F. Pittman

Here is a link to one such article: http://www.ippr.org/publications/55/1529/warm-wordshow-are-we-telling-the-climate-story-and-can-we-tell-it-better . The one I would like to find is a uk.gov article.

January 20, 2013 | Unregistered CommenterJohn F. Pittman

What a fascinating class. I wonder if you'll explore the lack of precision in informal, "pop-science" communication. In blog posts and podcasts, social scientists often make statements such as the following:

"Intuitively, we assume that we're all good at behavior X. But it turns out that we're all really, really bad at X."

"The conventional wisdom holds that humans exhibit behavior X. But it turns out that we almost never do X, but rather do Y all the time."

Social scientists do this all the time. (Kidding!) Watch out for it. The delivery follows a pattern; it's almost lyrical. Indeed, the phrase "it turns out that" has become a linguistic tic in the pop-social-science community.

The point is obvious, but to be clear: In the above example, the underlying study that animated the social scientific statement yielded a statistically significant result and thus suggested that humans exhibit a certain behavioral tendency. Nonetheless, it's unlikely that the study *proved* or *established* anything about human behavior with absolute certainty. (Besides, who knows if we can ever make conclusions about cognition with absolute certainty. Cf. Fodor.)

Perhaps social scientists are impelled to overstate their findings because they are motivated to promote their own views. The motivation could be explicit or implicit. The motivation could be self-interested (more exposure) or noble (spread the truth).

Anyway, neat idea for a course.

January 23, 2013 | Unregistered CommenterAmit Vora

@Amit: I know exactly what you are talking about. My assessment of the causes.

a. 25% jumbled thinking among social scientists. Some social scientists really *believe* that this style of analysis is valid.

b. 25% opportunism among entrepreneurial academics. Some know this is absurd but see the whole enterprise as a game to get publicity & notoriety.

c. 25%. jumbled thinking among science journalists. They think that this style of social science is valid. They have a vulgar sense of science, too, as "proving" things rather than (when valid) supplying evidence that makes one hypothesis more worthy of belief than it otherwise would have been, but that doesn't in itself conclusively demonstrate anything and must be weighed w/ all the other evidence that already exists and remains subject to reassessment in light of any new evidence.

d. 25%. opportunism among entrepreneurial science journalists. They see their enterprise as a competition to rack up attention-grabbing "wow!" studies. They are in a symbiotic relationship with social scientists in (b).

These problems will figure in the course but only intermittently, indirectly, in the course of looking at other things. The course is not really about how to improve *craft* standards of either social science or journalism.

BTW -- & perhaps you already follow Gelman & know about this -- but the theme of your post is very much in the spirit of a recurring one for Gelman, who associates it (I think he underestimates its prevalence) w/ econometricians. E.g., here and here and here

January 27, 2013 | Registered CommenterDan Kahan
