
Five theses on science communication: the public and decision-relevant science, part 2

This is the second part of a two-part series that recaps a talk I gave at a meeting of the National Academy of Sciences' really cool Public Interfaces of the Life Sciences Initiative.

The subject of the talk (slides here) was the public's understanding of what I called "decision relevant science" (DRS)--meaning science that's relevant to the decisions that ordinary members of the public make in the course of their everyday lives as consumers, as parents, as citizens, and the like.

Part 1 recounted a portion of the talk that I invited the audience to imagine came from a reality tv show called "Public comprehension of science--believe it or not!," a program, I said, dedicated to exploring oddities surrounding what the public knows about what's known to science.  The concluding portion of the talk, which I'll reconstruct now, presented five serious points --or points that I at least intend to be serious and be taken seriously--about DRS, each of which in fact could be supported by one of the three "strange but true" stories featured in the just-concluded episode of "Public comprehension of science--believe it or not!"

I. Individuals must accept as known more DRS than they can ever possibly understand

In the first story featured in the show, we learned that individuals belonging to that half of the US population that purports to "believe" in evolution are no more likely to be able to give a cogent account of the "modern synthesis" (natural selection, genetic variance, and random mutation) than those belonging to the half that asserts "disbelief."  In fact, very small proportions of either group can give such an account.

Thus, most of the people who quite properly accept evolution as "scientific fact" (including, I'm confident, the vast majority who view those who disbelieve in it as pitifully ignorant) believe in something they don't understand.

That's actually not a problem, though.  Indeed, it's a necessity!

The number of things known to science that it makes sense for a practical person to accept as true (that a GPS system, exquisitely calibrated in line with Einstein's theories of special and general relativity, will reliably guide him to where he wants to go, for example) far exceeds what such an individual could ever hope to comprehend in any meaningful way on his own. Life is too short.

Indeed, it will be a good deal shorter if, before accepting that it makes sense not to smoke, such a person insists on verifying for himself that smoking causes cancer -- or if, before taking antibiotics, he insists on verifying that they do in fact kill disease-causing bacteria but do not -- as 50% of the U.S. population thinks, "believe it or not!" -- kill viruses.

II. Individuals acquire the insights of DRS by reliably recognizing who has it.

Yet it's okay, really, for a practical, intelligent person not to acquire the knowledge that antibiotics kill only bacteria and not viruses. He doesn't have to have an MD to get the benefits of what's known to medical science.  He only has to know that if he gets sick, the person he should consult and whose advice he should follow is the doctor.  She's the one who knows what science knows there.

That's how, in general, individuals get the benefit of DRS--not by understanding it themselves but by reliably recognizing who knows what about what because they know it in the way that science counts as knowing.  

Why not go to a faith healer or a shaman when one has a sore throat -- or a cancerous lesion or a persistent hacking cough? Actually, some very tiny fraction of the population does. But that underscores only that there really are people out there whose "knowledge" of matters of consequence to ordinary people's lives is not the kind that science would recognize -- and that precious few people (in a modern liberal market society) treat them as reliable sources of knowledge.

Ordinary people reliably make use of all manner of DRS -- medical science is only one of many kinds -- not because they are experts on all the matters to which DRS speaks but because they are themselves experts at discerning who knows what's known to science.

III.  Public conflict over DRS is a recognition problem, not a comprehension problem.

Yet ordinary members of the public do disagree--often quite spectacularly--about certain elements of DRS. These conflicts are not a consequence of defects in public comprehension of science, however. They are a product of the failure of ordinary members of the public to converge in the exercise of their normal and normally reliable expert ability to recognize who knows what about what.

Believe it or not, one can work out this conclusion logically on the basis of information related in the "Public Comprehension of Science--Believe it or Not!" show.  

Members of the public, we learned, are (1) divided on climate science and (2) don't understand it (indeed, the ones who "believe" in it, like the ones who believe in evolution, generally don't have a meaningful understanding of what they believe).

But (2) doesn't cause (1).  If it did, we'd expect members of the public to be divided on zillions of additional forms of DRS on which they in fact are not.  Like the efficacy of antibiotics, which half the population believes (mistakenly) kill viruses.  

Or pasteurized milk.  No genuine cultural conflict over that, at least in the US.  And the reason isn't that people have a better grasp of biology than they do of climate science. Rather it's that there, as with the health benefits of antibiotics, they are reaching the same conclusion when they exercise their rational capacity to recognize who knows what science knows on this matter.  

Indeed, those of you who are leaping out of your seats with excitement to point out the freaky outlier enclaves in which there is a dispute about pasteurization of milk in the US, save yourselves the effort! What makes the spectacle of such conflicts newsworthy is precisely that the advocates of the health benefits of "raw milk" are people whom the media know the vast run of ordinary people (the news media's consumers) will regard as fascinatingly weird.

Because people acquire the insights of DRS by reliably recognizing who knows what science knows, conflicts over DRS must be ones in which they disagree about what those who know what science knows know.

This conclusion has been empirically verified time and again.  

On matters like the risks of climate change, the safety of nuclear waste disposal, the effects of gun control on crime, and the efficacy and side effects of the HPV vaccine, no one (or no one of consequence, if we are trying to understand public conflict as opposed to circus sideshows) is saying "screw the scientists--who cares what they think!"

Rather, everyone is arguing about what "expert scientists" really believe. Using their normal and normally reliable rational powers of recognition, those on both sides are concluding that the view that their side accepts is the one consistent with "scientific consensus."

What distinguishes the small number of issues on which we see cultural polarization over DRS from the vast number on which we don't has nothing to do with how much science the public comprehends. Rather, it has everything to do with the peculiar tendency of the former to evade the common capacity enjoyed by culturally diverse citizens to recognize who knows what is known to science.

IV. The recognition problem reflects a polluted science communication environment.

A feature that these peculiar, recognition-defying issues share is their entanglement in antagonistic cultural meanings. 

For the most part, ordinary people exercise their capacity to recognize who knows what about what by consulting other people "like them."  They are better able to "read" people who share their particular outlooks on life; they enjoy interacting with them more than interacting with people who subscribe to significantly different understandings of the best way to live; and they are less likely to get into squabbles with them as they exchange information. "Cultural communities" -- networks of people connected by intense emotional affinities and like outlooks -- are the natural environment, then, for the exercise of ordinary citizens' rational recognition capacity.

Ordinarily, too, these communities, while plural and diverse, point their respective members in the same direction.  Any such community that consistently misled its members about DRS wouldn't last long, given how critical DRS is to the flourishing -- indeed, the simple survival -- of its members.

But every now and again, for reasons that are not a complete mystery but that are still far from adequately understood, some fact -- like whether the earth is heating up -- comes to be understood as a kind of marker of cultural identity.  

The position one holds on a fact like that will then be experienced by people -- and seen by others (the two are related, of course) -- as a badge of membership in, and loyalty to, one or another cultural group.

At that point, reasonable people become unreasonably resistant to changing their minds--and for reasons that, in a sad and tragic sense, are perfectly rational.  

The stake they have in maintaining group-convergent beliefs will usually be much bigger than any they might have in being "right." Making a "mistake" on the science of climate change, e.g., doesn't affect the risk that any ordinary member of the public -- or any person or thing she cares about -- faces: she just doesn't matter enough as a consumer, a voter, a public deliberator, etc., to make a difference.  But if she forms a view on it that is out of line from the point of view of those who share her cultural allegiances, then she is likely to suffer tremendous costs--psychic, emotional, and material--given the function that positions on climate change perform in identifying to members of such groups who belongs to them and can be trusted.

These antagonistic meanings, then, can be viewed as a form of pollution in the science communication environment.  They enfeeble the normally reliable faculties of recognition that ordinary members of the public use to discern DRS.

People overwhelmingly accept that doctors and public health officials are the authorities to turn to for access to the health benefits of what's known to science, and ordinarily they have little difficulty discerning what those experts believe and are counseling them to do.  But when facts relating to medical treatments become suffused with culturally antagonistic meanings, ordinary members of the public are not able to figure out what such experts actually know.

The US public isn't divided over the risks and benefits of mandatory vaccination of children for Hepatitis B, a sexually transmitted disease that causes a deadly form of cancer.  Consistent with the recommendation of the CDC and pediatricians, well over 90% of children get the HBV vaccination every year.

Americans are culturally divided, however, over whether children should get the HPV vaccine, which likewise confers immunity to a sexually transmitted virus (the human papillomavirus) that causes a deadly form of cancer. For reasons having to do with the ill-advised process by which it was introduced into the US, the HPV vaccine became suffused with antagonistic cultural meanings--ones relating to gender norms, sexuality, religion, and parental sovereignty.

Parents who want to follow the advice of public health experts can't discern what those experts' position is on the HPV vaccine, even though it is exactly the same as their position on the HBV vaccine.  Experimental studies have confirmed that exposure to the antagonistic meanings surrounding the former makes parents unable to form confident judgments about what experts believe about the risks and benefits of the HPV vaccine, even though the CDC and pediatricians support it to the same extent as they do the HBV vaccine and for the same reasons.

The antagonistic cultural meanings that suffuse issues like climate change and the HPV vaccine confront ordinary people with an extraordinary conflict between knowing what's known to science and being who they are. This toxic environment poses a singular threat to their capacity to make use of DRS to live happy and healthy lives. 

V. Protecting the science communication environment from contamination is a critical aim of the science of science communication.

Repelling that threat demands the development of a systematic societal capacity to protect the science communication environment from the pollution of antagonistic cultural meanings.

Technologies for abating the dangers human beings face are not born with antagonistic cultural meanings.  They acquire them through historical contingencies of myriad forms. Strategic behavior plays a role; but sheer accident and misadventure also contribute.

Understanding the dynamics that govern this pathology is a central aim of the science of science communication.  We can learn how to anticipate and avoid them in connection with emerging forms of practical science, such as nanotechnology and synthetic biology. And we can perfect techniques for removing antagonistic meanings in the remaining instances in which intelligent, self-conscious protective action fails to prevent their release into the science communication environment.

The capacity to reliably recognize what is collectively known is not some form of substitute for attainment of scientific knowledge.  It is in fact a condition of it within the practice of science and outside of it.

In discerning DRS, the public is in fact exercising the most elemental form of human rationality.

Securing the political and social conditions in which that faculty can reliably function is the most important aim of the science of science communication. 


Reader Comments (48)

Interesting and useful. Your thoughts fit with a couple of ideas.
The best way for a political party to lose a legislative battle in Congress is for a President of their own party to support the party position. The president comes with too much antagonistic baggage.
Experts are disbelieved when the apparent cost of disbelieving them is low and the experts are heavily supported by people your side considers to be demonstrably irrational and antagonistic to your beliefs.

June 7, 2013 | Unregistered CommenterEric Fairfield

Before I get started, I should note that I agree with a lot of the things you say - and that in many cases I knew what you meant, and intended. This is not about saying it's all wrong, but about looking at the same things from a different point of view, and strengthening an argument by challenging it. While there are aspects I don't agree with, I still think yours is a useful point of view.

I know I don't need to say any of that, that you already know, but I think it helps to re-emphasise it sometimes.


I. Individuals must accept as known more DRS than they can ever possibly understand.

The word "must" is a problem here. It's true they can't understand everything. But there are more alternatives than just accepting decision-relevant science without understanding it.

Separate points are:
1) Do they claim to know the answer?
2) Do they claim to know what science knows?
3) Do they claim their answer to be scientifically supported?
4) Do they claim to understand it?

So you can claim not to know; you can claim not to know, and not know if science knows; you can claim not to know, and that science doesn't know either; you can claim that science might know but that you don't. You can claim to know, without knowing the science; you can claim to know what the science says when you actually don't; you can claim to understand the science when you don't; you can claim to know what you claim science does not know.

What do you mean by "science" here? The consensus opinion of scientists, or the products of the scientific method?

And in labelling all the things people need to know about "science", do you mean that everything people need to know about can be/should be/is subject to the scientific method? Are there not billions of facts people know that have nothing to do with science? How do they know what they know then?

And are all these things people *need* to know about? Why do people *need* to know about evolution? What happens to them if they don't?
What about all the things science doesn't know, either? Do we not "need" to know them too?

"... before accepting that it makes sense not to smoke such a person insists on verifying for himself that smoking causes cancer"

That's a good example of one of the things science doesn't speak on, although some people think it does. Smoking increases the risk of cancer. But that's an 'is', not an 'ought'. Whether it makes sense to smoke depends on whether the pleasures and social advantages of smoking outweigh the future risk of a shortened life. Even if lung cancer was certain - and it's not - it might still make sense to smoke.

Science says that driving cars and lorries causes fatal traffic accidents. Does it make sense not to drive?

Certain of the 'conflicts over science' are conflicts over issues of this sort. Science is read as saying something about a social issue that it does not, in fact, say.

And don't forget that while people can't understand the science on everything, they can understand the science on some things. Don't assume that because they can't always check for themselves, that they can therefore never check for themselves.

II. Individuals acquire the insights of DRS by reliably recognizing who has it.

Argument from Authority is one of the ways people acquire the insights of science, or their beliefs generally, and it's not reliable.

The fundamental problem, as you go on to say, is that you face exactly the same problem trying to figure out who is an Authority. Once you have started down this road, the most obvious answer is to consult an Authority on Authorities - for example, universities hand out qualifications to people they regard as experts, so you could see if they have such a qualification. But then there are good universities and bad ones, and people on the internet who will sell you a qualification for fifty bucks. You need an Authority on Authority Authorities, to tell you which universities and qualifications are best. And so on, ad infinitum.

Some people don't just consult Authorities on who is an Authority. They consult Authorities on what the Authorities know. You don't conduct your own survey of scientists to find out what the consensus opinion is, you find somebody who claims to know. You rely on them to have determined who is an Authority, and then to report an accurate summary of their views. Which - it's funny, you'll have to admit! - usually coincide with the views of the person doing the reporting.
Which gives rise to exactly the same problem, of course.

But authority is only one way of knowing things. Tradition is another. Free choice is a third. Nobody consults an expert to decide what football team they ought to support. Social networks are a fourth. Guessing is a classic method, widely used.

Using technology is a very important way of knowing about it. People buy GPS systems and conclude they work (or don't work) scientifically - by seeing if the thing tells them where they already know they are. They see if they still get lost. If it didn't work, they'd soon know, and the people selling them would soon be out of business.

People acquire the benefits of GPS technology not by understanding general relativity, or even by asking somebody who does. They acquire it by buying a GPS set in a shop, where an ignorant teenager will demonstrate to the sceptical that it works as advertised by taking it out into the street and walking up and down.

People learn that manned flight is possible by going to airports, not by studying aerodynamics. People learn about electricity by plugging appliances into wall sockets, not by studying electromagnetic theory. You don't know that the computer has to be plugged in before it will work because an expert told you so.

People knew that hot ice cream freezes faster than cold ice cream, by making ice cream. Scientists laughed at them and said they were wrong, that it was a common myth. The people making ice cream didn't have a better understanding of thermodynamics than the scientists. They didn't understand why it worked. But there are more ways of knowing things than consulting the experts.

And I think a lot (but not all) of public conflict over DRS arises from conflicts between the different ways of knowing.

June 7, 2013 | Unregistered CommenterNiV

Here is what I think is missing:

"In the case of any person whose judgment is really deserving
of confidence, how has it become so? Because he has kept his
mind open to criticism of his opinions and conduct. Because it
has been his practice to listen to all that could be said against
him; to profit by as much of it as was just, and expound to
himself ... the fallacy of what was fallacious." - J.S. Mill (in "On liberty")

The point, I think, is that we can learn to recognize expertise by knowing how people come to claim it. The study of science should teach people why science works, namely, because (and to the extent that) it encourages exactly this kind of critical reflection, even if done by a group rather than an individual. Also known as actively open-minded thinking.

June 7, 2013 | Unregistered CommenterJon Baron

Do you not think this -- this acquisition of the ability to recognize who has expertise because he or she has acquired knowledge by means that trace back to science's signature way of knowing -- was what made it possible for those who established the authority of science's way of knowing to overcome the authority of science's rivals? If so, is there an appropriate latinism for this? Something to the effect of, "On the word of those whose knowledge is based not on the word of any authority but rather on the faithful transmission of what humans have deduced from observation using reason"?...

June 7, 2013 | Registered CommenterDan Kahan

@Dan and Jon,
There is a word, scientia, and the person who follows this is a scientist in Mill's sense.

What has frustrated me and many other practicing experimental scientists (I am a molecular biochemist) for decades is the willingness of other scientists to carefully prove certain things while taking many other things on faith or on argument from authority.
One of the big holes in experimental conclusions listed in many peer reviewed papers is using statistics to decide what your data means but using the statistics inappropriately because the assumptions on which the statistical method was based do not fit the assumptions of your experiment. The statistical method is often chosen because it was in an undergraduate statistics course or is the recognizable name in the statistics package that the scientist owns not because it is the proper method to analyze the data.
So science done well is great but the actual results often pick up adhesions that make the published science, especially after the primary results have passed through the hands of a number of experts, less solid than the proponents might claim. One journalist does not put much credence in the latest nutrition evidence, for instance, because he has seen the same scientist come out with the opposite conclusion two years ago and with a different conclusion two years before that.
So solid science is foundational, but it is difficult to know whether the science that you believe is solid.

June 7, 2013 | Unregistered CommenterEric Fairfield

"Do you not think this -- this acquisition of the ability to recognize who has expertise because he or she has acquired knowledge by means that trace back to science's signature understanding of knowing"

First, do people know what science's signature way of knowing actually is? Do they, for example, recognise the openness to opposing points of view Mill mentions as part of science's essential signature?

And second, how can they possibly do so when the scientists don't tell the public how they know? When the data and methods are secret, and when their deliberations are held in private where there are no unfriendly eyes?

"On the word of those whose knowledge is based not on the word of any authority but rather on the faithful transmission of what humans have deduced from observation using reason?"

On whose word do you accept that this is deduced from observations using reason? Remember, you're not allowed to actually look at the reasons, because that would be tantamount to not taking anybody's word for it. You have to be able to tell the reasons are there, without seeing them.

June 7, 2013 | Unregistered CommenterNiV


On whose word do you accept that this is deduced from observations using reason? Remember, you're not allowed to actually look at the reasons, because that would be tantamount to not taking anybody's word for it. You have to be able to tell the reasons are there, without seeing them.

Precisely. That's what's so remarkable about human intelligence. And so precarious about it.

June 7, 2013 | Unregistered Commenterdmk38


I strongly agree that statistics have come to be used as a substitute for valid inference rather than as a tool for structuring and disciplining it (my sense of this comes from immersion in social science empirics, however, not from personal observation of this shortcoming in the natural sciences). I think @JB has made it one of his life missions to fix this problem -- along with the cause of it, which is the failure to inculcate the habits of mind on which reliable inference from observation depend.

June 7, 2013 | Unregistered Commenterdmk38

It would be nice if scientists would "tell the public how they know", but they are busy with other things. Teachers could do this.

On the other hand, scientists could at least publish their data, and quit reviewing for journals that do not require this.

But, on the first hand, all this criticism of science and scientists makes me ask, "Compared to what?"

The very fact that we can argue about statistics is part of what makes the process work.

In the short term, it is a mess. I am a journal editor, and I spend a good chunk of my time trying to avoid bad statistics, p-hacking, file-drawer effects, underpowered studies, etc. I'm sure I don't always succeed. But we try to do this right, and trying is more likely to succeed than not trying.

June 7, 2013 | Unregistered CommenterJon Baron

I agree with all your points. I was just trying to say that arguments from authority are more pervasive than they might appear.

June 7, 2013 | Unregistered CommenterEric Fairfield

"It would be nice if scientists would "[tell] the public how they know", but they are busy with other things."

Isn't it part of the scientific method to publish the reasons for how they know?

That's the big argument, you see. Some people think that if you don't show your working, it isn't science. Others see it as an insult to be asked to show their working, and expect people to take them at their word, which scientists in the same profession largely do. They are, in effect, saying that as busy scientists they don't have time to follow the scientific method. Which at the least seems like an odd thing to say. At worst, it appears to defeat the entire object.

"But, on the first hand, all this criticism of science and scientists makes me ask, "Compared to what?""

Well, some examples could be company accountants, whose books are published and audited. Or software engineers, who have developed an entire discipline of verification and validation to do this. Steve McIntyre reminisces fondly about mining geology, where the scientists doing mineral surveys on which a few tens of millions of Other People's Money rides routinely get audited.

"In the short term, it is a mess. I am a journal editor, and I spend a good chunk of my time trying to avoid bad statistics..."

Glad to hear it.

Some of what people say is intended not as a criticism of the process, but a recognition of its limitations.

It is not a journal editor's job to check the science. It is his readers' job. Scientists publish their work in journals for other scientists to check, replicate, debunk, criticise, refine, extend, etc. They used to do it by writing letters to one another. Then when science got bigger, they would write to a central clearing house - a learned society - who would forward it on. People joined societies in order to more efficiently keep up with the news and developments in their own field. Then the letters got collected up and published in bulk in journals, but they were essentially doing the same job. An editor's job is therefore not to certify papers as good science - they don't have anything like enough time and resources for that - it is to certify papers as worth their readers' while to look at, and to make sure they provide enough detail and back-up for their readers to be able to effectively do that.

So papers that don't publish the data are useless. Papers that are vague and incomplete about the method and calculations used are useless. Certainly, papers that have obvious errors that a cursory check can pick up should be fixed or if they can't be fixed then rejected. But even a paper that turns out to be wrong - as most do - can still be interesting and useful. Recognised for what they are and the purpose they serve, some of the criticisms journals have received are unjust; the product of propagandists who have tried to sell peer-reviewed publication to the public as infallible gospel.

That said, some journals have failed notably at even the function they genuinely do have, and are sometimes unwilling to accept criticism of their past failures. It's understandable, as word of that sort of thing is bad for business. Journals don't get audited, either. It is, as you say, a mess.

June 7, 2013 | Unregistered CommenterNiV

I meant "compared to what" as a means of arriving at accurate beliefs in general, about all sorts of things.

And scientists do explain how they reach their conclusions. That is what journal editors should make them do. But this is not sufficient for public comprehension. When I read about the Higgs boson, I have some idea what the discussion is about, but I do not expect to be able to understand the published papers. But I know enough about physics to believe confidently that this is pretty serious science. I do not think that physicists should have to explain stuff to me so that I understand it (unless I go back to college and major in physics).

I believe that my email address is public, and I am happy to continue these discussions by email, but not here.

June 7, 2013 | Unregistered CommenterJon Baron

Understanding the dynamics that govern this pathology [namely, technologies "acquiring" antagonistic cultural meanings] is a central aim of the science of science communication.

But I don't see much actual understanding, beyond "accident", and "strategic behavior", inferentially on the part of bad people, which certainly resembles just a very old type of political paranoia. What, after all, are you left with once you throw away a potentially richer and more workable explanation by assuming from the start that technologies cannot have any inherent cultural meaning?

We can learn how to anticipate and avoid them in connection with emerging forms of practical science, such as nanotechnology and synthetic biology. And we can perfect techniques for removing antagonistic meanings in the remaining instances in which intelligent, self-conscious protective action fails to prevent their release into the science communication environment.

Again, it would be interesting to see some actual means or techniques for such avoidance and removal. Censorship, I suppose, would be one way, but we can't guarantee that the "right people" would be the censors, and anyway that generally just drives dissenters underground. Or, of course, we could just communicate that we mean no harm to anyone's cultural values, and that we only want to help. But doesn't it seem odd that no one would have thought of that before? Do we really need a "science" of science communication to tell us?

Securing the political and social conditions in which that faculty [of discerning who actually knows what science knows, and therefore who to believe] can reliably function is the most important aim of the science of science communication.

Any idea what those "political and social conditions" might actually be? I assume they'd still be compatible with a "Liberal Republic", of course, but ... what could they be? And, in any case, doesn't it sound strange that "securing political and social conditions" would be the most important aim of an actual science, any science?

I don't want to sound overly negative here -- I agree with NiV that much of the concrete work Dan has done is interesting and valuable. But I continue to think that it's founded upon a bad model -- of an unrealistic gulf between an idealized Science and a herd-like public -- that leads him into all kinds of conceptual problems, such as paying lip service to a motto that has virtually no function in real life, abandoning critical intelligence in favor of falling back on an ancient Argument from Authority, relying upon some mystical, magical process for discerning who to accept as Authority, etc. More importantly, it creates a conceptual box that I think seriously hinders if not prevents any further understanding of how real controversies arise and sustain themselves, and what we might do to improve the situation. A start would be to see such controversy as normal rather than pathological.

June 7, 2013 | Unregistered CommenterLarry

@NiV & @Larry:

I don't think I say anything about "argument from authority."

I'm saying reasonable, reasoning people have a rational capacity to discern in those around them the possession of knowledge that counts as such from the point of view of science. The exercise of that faculty, moreover, is distinct from comprehension (much less verification) of what is known to science. That faculty is a reasoned one; it doesn't work through some supernatural force. Moreover, while it isn't perfect -- and is subject to distortions of various sorts; that's one of the central points of this post -- it works extremely well.

Ordinary people, possessed of critical powers of reason, assent to the authority of science's way of knowing b/c they choose to -- if they do. They then use their own reason to figure out who is reliably in possession of that knowledge -- a process that is not the same as "taking orders" from designated officials.

Let me ask the two of you: do antibiotics kill only bacteria or also viruses? Or for that matter, do they kill either of them? Indeed, what is a bacterium? A virus? And can one kill one of them w/ vitamin C?

I take it you believe you know the answers to these questions. If you think you have figured out the answers "on our own," w/o trusting anyone else whose understanding of many many many things you have not verified, you are simply mistaken. If you tell me you have seen a bacterium with your own eyes, e.g., then you know full well the question becomes-- how did you know that's what you were seeing? did you invent the microscope? did you do what was necessary to verify that the things you saw w/ that instrument were living organisms? that they multiply? that they live inside of humans-- and that some of them, when inside humans, damage cells? What's a cell & what observations did you make --what experiments did you run -- so that you wouldn't have to "take the word of anyone" in figuring that out? etc ad infinitum for crying out loud.

And you are above-average. I'm sure you have figured out a great many things that you realize others haven't. What do you propose those people do? Refuse to take antibiotics until they've retraced the steps -- from the very beginning of science -- to the point at which it was discerned that penicillin kills streptococcus? Refuse to do that when in fact they are able to use reason to form reasonable grounds to believe they have identified those whose knowledge of such things has the status of knowledge from science's point of view?

That sort of position would not embody a defense of rationality or individual autonomy. It would just be nonsense.

June 8, 2013 | Unregistered Commenterdmk38


I would suggest resisting the urge to dismiss fringe groups. It is sometimes the case that they are expressing an entirely intelligible cultural viewpoint in a more extreme form than the norm but within the bounds of cultural understanding. (Pro-norm deviance?) This can make their message resonate with people who have a similar cultural understanding and it seems to me can be influential over time. In the case of milk pasteurization and the "raw milk movement," there are shared elements of both anti-government regulation and the innate superiority of closest-to-natural-state forms being more pure forms of food. (Views on purity are never trivial and seem almost contagious.) Items matching this sort of purity slot profitably fill the shelves of health food stores. The health food store I frequent pressed for people to sign petitions to prevent vitamins and supplements from being regulated, and the arguments echoed those that surround support for raw milk: regulation would destroy access to supplements that are vitally helpful but that science doesn't accurately assess, would compromise individuals' health, and would result in heavy-handed restrictions. Few people may indulge in raw milk, but they may "understand" it and empathize with the social construct, and seek to defend its right to exist out of a sense of the moral uprightness of the venture, even if they would never seek it out themselves. Perhaps it is only people observing these happenings from markedly differing/opposing cultural viewpoints who see particular fringe issues as so entirely freakish and outlandish and, as a result, actually inconsequential.

I wonder what you think of Schwarz & Thompson's Divided We Stand (1990), in particular their argument, as I understand it, that the four "ideal cultures" need each other. I can't resist quoting, "Those fringe-dwellers, the cranks, the romantics, the eco-freaks and the lumpenproletariat who, throughout the long years of the industrial age, could safely be ignored can no longer be ignored. That is the crucial, and perhaps none too palatable, message that the engineer now has to receive and respond to; that is the one great change that the transition to the information society (as it is sometimes called) has wrought in his environment . . ." (p. 125)

Is it possible to apply the concepts of "expressive over-determinism" suggested for public policy debates to science?

June 8, 2013 | Unregistered CommenterIsabel Penraeth


"And scientists do explain how they reach their conclusions. That is what journal editors should make them do."

In both cases, they should do, but let us say... some are better at it than others.

"But this is not sufficient for public comprehension."

I'm not saying it has to be.

The point is that the entire argument and reasoning has to be out there, in public, and that in principle anybody can learn what they need to learn and check every step. The point of nullius in verba is that you can, not necessarily that you have done.

If the argument is out there, other people can come along and translate. There are always people around happy to teach what they know. And if you can see this process going on it gives you confidence that any issues would likely get picked up, even if you're not following the debate yourself.

"When I read about the Higgs boson, I have some idea what the discussion is about, but I do not expect to be able to understand the published papers. But I know enough about physics to believe confidently that this is pretty serious science. I do not think that physicists should have to explain stuff to me so that I understand it (unless I go back to college and major in physics)."

The maths behind the Higgs Boson prediction is undoubtedly difficult - I've tried for years to get to grips with it (just out of curiosity), and still struggle. But my point is that I can make the attempt. Every step of the argument is out there. And even following the general shape of it, you can pick up on what the uncertainties and limitations of the argument are.

But there are certain areas of 'science' where they say things like "Why should I make the data available to you, when your aim is to try and find something wrong with it" and “p.s. I know I probably don’t need to mention this, but just to insure absolutely clarify on this, I’m providing these for your own personal use, since you’re a trusted colleague. So please don’t pass this along to others without checking w/ me first. This is the sort of “dirty laundry” one doesn’t want to fall into the hands of those who might potentially try to distort things…” and “The two MMs have been after the CRU station data for years. If they ever hear there is a Freedom of Information Act now in the UK, I think I’ll delete the file rather than send to anyone.”

I find that stuff outrageous! I find that sort of attitude incomprehensible in a scientist, and I find it even more incomprehensible that the scientific community apparently don't find it so, too. The people who said these things are still working in science, still respected, still defended to the hilt.

And I find that stuff a clear indication that this is not science! Fair enough if they've published it and I find it too hard to follow, but what kind of scientist would delete their own data rather than let critics see it?!

I think a lot of the general public get that, too, which is why we have so much scepticism. The mystery for me is why we're still sat here discussing what went wrong. Why do so many people not react the way I reacted? It appears to be correlated with politics and cultural factors, which is why I find Dan's work so interesting.

And Dan's tendency - along with many others in the community - to see the problem as understanding why people don't all take scientists' word for it may be another clue.



"Let me ask the two of you: do antibiotics kill only bacteria or also viruses?"

Only bacteria. But it's the wrong question - since most people don't know which diseases are bacterial and which viral. What they need to know is that antibiotics only work against some diseases. And that's an easy statement to test without knowing anything about viruses or bacteria.

"I take it you believe you know the answers to these questions. If you think you have figured out the answers "on our own," w/o trusting anyone else whose understanding of many many many things you have not verified, you are simply mistaken."

This is the same darn point I made in my first post at this site. Nullius in verba does not mean that everybody has to check everything before believing it. It means that everything has to be checkable, that if you know of any plausible reason for doubt you have to check it and not just assume somebody else has done so, and that you have to at least have good reason to believe many people have already done so before believing it.

There are no statements anywhere that are not routinely open to challenge. There is no science so settled that you are not allowed to argue with it. You can - if you choose to - chase down any step in the chain of reasoning and ask the scientists "how do we know this is so?", and "you have to take my word for it" is never an acceptable answer.

I do, in fact, know most of the steps in the bacteria/virus chain. I can explain how microscopes work. I've grown bacteria in petri dishes, where you can see their growth. I've confirmed they live inside humans. And I've seen human cells. The existence of infectious agents and many of their properties can be inferred from epidemiology, even without seeing them. There are steps I haven't confirmed - that a certain disease is caused by a certain organism - but I know that I could if I wanted to. I can take samples from the people with the disease and see if any of the bacteria grown in culture from them cause the disease in healthy people. I know that I don't have to take anybody's word for it.

Nullius in verba is what "Why should I make the data available to you, when your aim is to try and find something wrong with it" is not. It is the principle Tom Wigley was talking of when he said "No scientist who wishes to maintain respect in the community should ever endorse any statement unless they have examined the issue fully themselves", (and you should go and see the full context for that statement - it's worth it). Nullius in verba is science's immune system, constantly scouring the body of scientific knowledge for potential errors. It is only by knowing that people do still check that we have any warrant for its accuracy.

And when we discover that there is a body of results that nobody has checked - that nobody could have checked because the data has never been made available - and in a critical, policy-relevant area with $trillions at stake, too, we all ought to be concerned. Whether the conclusions are true or not, it isn't science, and doesn't have science's warranty. My question is, why do so many scientists and people respectful of science nevertheless make excuses for it?

June 8, 2013 | Unregistered CommenterNiV


I don't think I say anything about "argument from authority."

No, you don't use that phrase, but what you do say sounds like just a paraphrase, adding in some words like "reasonable", "rational", etc. -- e.g., saying ordinary people "use their own reason to figure out who is reliably in possession of that [scientific] knowledge ." But how do they figure that out? How do you figure it out? Do you really, for example, simply look at others in your cultural cognition quadrant and believe whoever or whatever most of them believe? If so, is that really using your own reason, or is that merely following a herd instinct (which, I'm not doubting, is quite real and widespread)?

I'm not trying to be picky here, but I think the notion that people are engaged in discerning whom to believe (has the authority of science behind them) doesn't capture people's -- e.g., yours and mine -- actual thought processes. Most of the time -- e.g., re: bacteria and viruses -- the question of belief doesn't even arise, any more than it does when we drive a car to get somewhere, or put milk in a refrigerator when we get home from the store. If you're saying that the belief is implicit, then sure, the vast majority of whatever such beliefs get us through a day are derived beliefs -- we "believe" in that implicit or unconscious sense what (or who) we have no reason not to believe. But that's always been the case, as it has to be for any not just rational but sane person, and has nothing particularly to do with science.

The problems come when we do have some reason not to believe something or someone. Now what do we do? One response would be simply to follow the herd, where "herd" is defined as whatever cultural niche we find ourselves in. But another response would be to try to use our own reasoning powers as best we can given our always limited abilities and context (a limitation that applies to everyone, scientists included). That may include using our own reason to figure out who is reliably in possession of scientific knowledge, as you say, but that's more as a side effect than the point -- what we're really trying to do is understand what to believe, not who. And that distinction is important because, alas, anyone, even those stamped with the label "scientist", may, at different points or in different ways, be susceptible to bias when there is reason to doubt. And that susceptibility will not go away with any improved science communication, since it applies to the science communicators themselves, and to the media by which they communicate, and reasonable, reasoning people understand that.

You're right, of course, that the vast majority of our information comes via others rather than through direct experience, and you're right too that as a rule this isn't particularly problematic. But when it is, we have two general options -- finding the right person or people to believe, or finding the right information. If our focus is on the latter, then we'll still need to get that information from others, it's true -- we don't as a rule have microscopes or weather stations, or complex software models of our own, and even if we did, we'd have to get them from others. But now that information is pieced together from a variety of sources, of varying reliability, and filtered through our own critical array of detectors. The heuristics and habits of critical intelligence, in other words, are developed precisely to assess the information we get from others, from whoever we get it, and at whatever level.

June 8, 2013 | Unregistered CommenterLarry


Thank you. Very insightful. Very wise. I will reflect on this & I think do a better job as a result in contributing to what we would both like to understand (and to what we would both like to promote as a form of life that maintains the complex, mutually supporting entanglements between the advancement of scientific knowledge and the distinctively, perpetually, permanently pluralistic & disputatious way of life that characterizes liberal democracy).

June 8, 2013 | Registered CommenterDan Kahan


I really think you are talking in circles. On the one hand:

This is the same darn point I made in my first post at this site. Nullius in verba does not mean that everybody has to check everything before believing it. It means that everything has to be checkable, that if you know of any plausible reason for doubt you have to check it and not just assume somebody else has done so, and that you have to at least have good reason to believe many people have already done so before believing it.

on the other:

On whose word do you accept that this is deduced from observations using reason? Remember, you're not allowed to actually look at the reasons, because that would be tantamount to not taking anybody's word for it. You have to be able to tell the reasons are there, without seeing them.

You say of course you accept that it is just fine to accept what admits of scientific proof w/o checking it yourself so long as you have "good reason to believe many people have already done [such checking] before believing it." But "on whose word do you accept that" such checking has been done; how do "you know" whether you have "any plausible reason for doubt" that such checking has been done given, "remember," that "you're not allowed to actually look at the reasons" when you accept a claim that "admits" of checking; "[y]ou have to be able to tell the reasons are there, without seeing them"--i.e., w/o replicating them.

Either you are enveloped in a web of contradiction -- or more likely you agree w/ me despite your persistent statements to the contrary! You agree that a system of shared knowledge counts as reasoned *&* counts as consistent with the autonomy of individual, critical thought *when* people make use of an independent critical reasoning faculty to determine whether there is reason to accept or trust the representations of what's known that are offered by others whose knowledge derives from science's way of knowing.

If that latter rational faculty exists, it (a) explains how something that would otherwise be impossible (the steady accretion of knowledge within science, the full scope of which far exceeds the capacity of any individual to replicate it on his or her own) happens; (b) explains why we sometimes end up w/ political conflict over what's known -- not because we don't "comprehend" what is known but b/c something is interfering with that rational capacity of recognition; and (c) gives us a program for trying to figure out, through the use of scientific methods, what sorts of influences cause disruption of this rational recognition faculty and what procedures help to protect that faculty from such conditions.

June 8, 2013 | Registered CommenterDan Kahan


Yes, it's a slightly tricky concept.

Perhaps it would be most useful to quote what I said about it before:

Trust in authority in science is based on the knowledge that the reasons for belief are always there and available if we ask, and our confidence that they *have* been sought and checked many times in the past. With a textbook with theorems and proofs, that you know has been used for many years, you might take the theorems on trust without working through the proof. But you're not doing so because "it's a textbook", you're not doing so because it's written by an eminent authority, who you trust, you're doing it because you can see directly that the proof is *there*, and that in the classroom lecturers and students *will* have gone through the proof.

You check to see if the assertions are based on science rather than authority by looking at the reasons given. If you see pages of equations and data analyses and chunks of code, you might well say "My word! That looks impressively scientific!" and skip over it without reading it. Because you know what sort of methods they used to get their result, and you know that with it out there in the open like that, eventually someone with the knowledge to do so is going to get curious and check it, so flaws will have been picked up. Has there been a debate? That's an even better sign. Look for the comments of people who have done such checking. Again, you don't necessarily have to follow the details to get some idea of whether it exists, and whether it found anything.

It's still not entirely reliable - nothing like as good as checking for yourself - but seeing directly that the arguments are open and have been checked by others is a start, and should instil some justified confidence. Your confidence is in direct proportion to the extent you think the working can be and has been checked.

If, on the other hand, you see a reference to "standard results", or assumptions, or vague handwaving, "scientists say...", or references to "expert judgement", or reliance on "peer-review" where the reviews are not shown so you can see what, if anything, they checked, or if you see that questions are not allowed or that people are complaining they can't find the data, you'll know that the author hasn't used the scientific method. They're saying in effect: "take my word for it", which violates nullius in verba. Such arguments are not science.

The nullius in verba principle is not saying you must check every step, it's saying you must accept no 'in verba' arguments, which are essentially uncheckable. Or as Feynman said: "Science is the belief in the ignorance of 'experts'."

As I said previously, it's a paradox - it's trustworthy only because it is not trusted. Scientists can trust authorities precisely because they know that scientists don't. Other people can do the same.

Remember, you're trying to check if the result was generated using the methods of science, the primary method of which is to put the argument and evidence out there for challenge, and then be open to those challenges and questions and other points of view. You don't have to be that much of a scientist to see if they've done that.

June 8, 2013 | Unregistered CommenterNiV


But ordinary people don't need to look at pages of equations to do the tricky thing you describe -- at least not for antibiotics, microwave ovens, safety of flying from Hartford to Las Vegas. Indeed, if they are curious enough/strange enough to care, they can even form a reasonable judgment that science knows something (provisionally, as always; science never knows anything that it isn't prepared to revise) about the Higgs boson or about quantum entanglement -- all w/o looking at the workproduct of anyone involved.

When they figure out that those things are known to science, they follow a strategy of rational observation. It just isn't one that involves checking the work of the scientists involved.

Like you say, "tricky"; I say "indeed, astonishing -- fascinating -- awe-inspiring!"

And like you say, not foolproof.

Actually, I won't be surprising you when I declare that none of the reliable modes of figuring out what's true as an empirical matter is foolproof. Our senses aren't! But neither are the most exquisite devices we've created for making measurements when we deliberately set out to make observations from which we can draw inferences about matters that remain a matter of empirical uncertainty.

So we start with all of that. All of that is realistic -- psychologically, sociologically. Then we try to figure out why we sometimes end up in such persistent, wasteful, demeaning & dangerous controversies about what is known by the means that science uses.

Nullius in Verba means something -- something very important. It is a claim about what way of knowing alone should be credited: the one that depends on human observation & reason.

This is a radical & astonishing claim for those who know their history. Indeed, it is one that will still get your head cut off if you are brave enough & committed enough to the majesty of human reason to say it in many parts of the world today.

But "NiV," NiV, is not a description of how ordinary people (or even scientists) know what is known to science. They come to know those things b/c they know how, rationally, to identify who is in possession of knowledge that nullius in verba says alone is to be trusted.

It follows -- and this is the point of this post; and the point of the very first one that agitated you in a way that has been so persistently stimulating & challenging & enlightening too for me & I'm confident others -- that the source of the wasteful, dangerous, demeaning, deflating controversies over what's known to science that sometimes divide the culturally diverse citizens of the Liberal Republic of Science is something (something unusual; this happens rarely) that is causing the disablement of the rational faculties that citizens use to figure out who has the sort of knowledge that rests not on the word of any authority but on the insights yielded by observation and reason. If we want to fix that, we need to identify the disabling agents and destroy them.

But in that project, just saying "nullius in verba!" won't help us. As beautiful, meaningful, true as it is.

June 8, 2013 | Unregistered Commenterdmk38

"But ordinary people don't need to look at pages of equations to do the tricky thing you describe -- at least not for antibiotics, microwave ovens, safety of flying from Hartford to Las Vegas."

Indeed. They can be even more directly scientific than usual, with those. If you take antibiotics and the persistent infection suddenly clears up, or if you go to Las Vegas airport and see all the scheduled planes fly in, you can see it for yourself directly. You can use a microwave oven and see exactly what happens. They're still not taking anyone's word for it.

"When they figure out that those things are known to science, they follow a strategy of rational observation. It just isn't one that involves checking the work of the scientists involved."

Agreed. But we were talking about the extent to which there is such a thing as genuine 'authority' in science.

"They know that b/c they know how, rationally, to identify who is in possession of knowledge that nullius in verba says alone is to be trusted."

That's where I keep getting stuck. You say "they know how, rationally, to identify who is in possession of knowledge" but what I keep saying is that to be nullius in verba the "who" is irrelevant - they have to identify what arguments constitute scientific evidence. They are the ones that have been debated, challenged, and tested. "Who" has nothing to do with it.

That said, it's probably true that a lot of people believe because they've identified people they trust. It's an effective heuristic, and like correlation-implies-causation and confirming-the-consequent, often correct. But it's not scientific. Science is not the only way of knowing things - but it is the most reliable and effective. Argument ad verecundiam and ad populum are commonly used and not entirely without merit, but they're not science.

If science was that easy and natural, we wouldn't have had to invent it.

June 8, 2013 | Unregistered CommenterNiV


In that case I feel like asking you again to describe to me how you came to know so much about antibiotics, microbes, etc. Again, unless you re-invented all of science on your own, there were many "who's" involved. Many you wisely decided to accept information from b/c you determined (w/o replicating anyone's studies or checking anyone's math!) that the knowledge they were imparting was genuine; and many many many times that many who you wisely disregarded b/c you figured out -- w/o checking their math either -- that they were palming off counterfeit goods.

The rational discernment in that makes me say - achtung! If this faculty weren't natural & rational-- we would never have invented science.

June 8, 2013 | Registered CommenterDan Kahan

"Again, unless you re-invented all of science on your own, there were many "who's" involved."

Re-inventing the science is not required. You don't have to figure it out yourself, you have to check whether somebody else's argument makes sense. There are many people involved in generating and refining those arguments, but it is the arguments that are being judged here, not the people making them.

And there is, of course, a certain amount of human trust involved. I assume the lab assistants didn't sneak in during the night to paint gunk onto my petri dishes to maintain the great global 'bacteria' hoax.

But bacteria are a comparatively familiar and everyday phenomenon, and fairly easy to do experiments on. There are many aspects to bacteriology and virology I haven't seen demonstrated, but their existence is fairly easy to show.

A harder example would be Newton's law of gravity. Now again, I know roughly how Newton did it, but I certainly haven't gone out with a telescope every night for twenty years to gather the data myself. But the skies are open to everyone, and a lot of people do go looking at the planets, so I'm pretty sure that if any of them weren't where they were supposed to be, somebody would have noticed and said so. Again, if it was just one person's word, I'd have to be sceptical, but the predictions have been checked multiple times by many people. It's the checks that give us the confidence.

However, there was an incident about 15 years back when I came across a chap on the internet who talked about the speed of gravity. He pointed out that if gravity propagates at the speed of light or less, the forces between the sun and planets would be unbalanced. The Earth would be pulled towards the sun, but the sun would be pulled towards where the Earth was 8 minutes ago, since it takes sunlight that long to reach the Earth. The unbalanced forces would be big enough to pull the Earth out of its orbit within about 100,000 years.
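The destabilization described here can be illustrated with a toy simulation. This is my own sketch, not the calculation from the argument NiV read: it models a common simplified version of the aberration problem, in which the force on a planet points toward the *apparent* (aberrated) position of the sun rather than its true position, giving the planet a small tangential push along its velocity that pumps energy into the orbit. Units are chosen so GM = 1 and the circular orbit has radius 1, and the speed of gravity c is set artificially low so the drift shows up within a few orbits rather than over millennia.

```python
import math

def simulate(aberration=True, c=100.0, steps=20000, dt=1e-3):
    # Planet on an initially circular orbit (r = 1, v = 1) around a
    # fixed sun at the origin, with GM = 1. Semi-implicit (symplectic)
    # Euler keeps the unperturbed orbit stable over this many steps.
    GM = 1.0
    x, y = 1.0, 0.0
    vx, vy = 0.0, 1.0
    for _ in range(steps):
        r = math.hypot(x, y)
        # unit vector toward the sun's true position
        ux, uy = -x / r, -y / r
        if aberration:
            # the moving planet sees the sun displaced toward its own
            # velocity by an angle ~ v/c, so tilt the force direction
            ux += vx / c
            uy += vy / c
            n = math.hypot(ux, uy)
            ux, uy = ux / n, uy / n
        a = GM / (r * r)
        vx += a * ux * dt
        vy += a * uy * dt
        x += vx * dt
        y += vy * dt
    return math.hypot(x, y)

r_newton = simulate(aberration=False)  # stays near r = 1
r_aberr = simulate(aberration=True)    # spirals outward
```

With instantaneous (Newtonian) gravity the orbit stays circular; with the aberrated force the radius grows steadily, which is the qualitative point of the argument: a naively retarded Newtonian force cannot produce stable orbits on astronomical timescales.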

Now on the one hand we have a lone crazy guy on the internet arguing against the accumulated authority of every astronomer since Newton. Nobody agreed with him. He wasn't any kind of acknowledged authority or expert. On any rational basis of "who knows what about what science knows" he loses hands down. I shouldn't have even listened.

But having read the argument, I couldn't see anything wrong with it. The calculation seemed correct. The physics made sense. The logic of the argument seemed impeccable. The Newtonian version of gravity simply could not work with propagation delays, and yet instantaneous action at a distance broke special relativity. Something was wrong, and I didn't know what.

Now I was still fairly confident the guy was wrong, and the astronomers weren't, but I took his argument seriously. At the least, it was pointing out a flaw in my own understanding. At best, it could be something interesting.

So I spent several weeks investigating the question, dismissed a lot of the many attempts to explain where he was wrong, and eventually found a satisfactory answer in a fairly hairy calculation somebody else put up using general relativity, that explained what was going on to my satisfaction. Yes, the guy was wrong, but now I knew why. And I found it tremendously useful, because it gave me a far deeper appreciation of gravity's subtleties, and Newton's theoretical underpinnings, and how forces and conservation laws work in reality. And yes, my previous understanding of how gravity worked had been wrong, and the confidence I'd gained from reading Newton's arguments misplaced. Gravity cannot work the way Newton said, nor does it propagate faster than light, and it was all very interesting and deep.

This is the sort of thing I'm talking about. Only the arguments matter. The guy had no credibility compared to the astrophysics establishment, but at no point did I even consider accepting any of the hundreds of assertions to the effect that "thousands of astronomers say you're wrong" as a valid argument. In questions of science, the authority of a thousand is not worth the humble reasoning of a single individual. For a period of several weeks, I seriously considered the possibility that a significant part of modern physics was wrong, which is what the scientific method says you must do. And I gained tremendous insight from doing so, that I never would have got if I'd accepted the authority arguments.

And although I'm sure people here will argue I was crazy to do so, I'd argue for anybody else doing the same.

The point I'm trying to make is that it's all about the arguments, not the people. Yes, I'll initially accept some arguments and reject others, often without doing the maths to replicate them, but it's about the arguments and whether they are convincing. I use plausibility heuristics all the time, but they're heuristics that apply to arguments, not people. Yes, it's people that come up with the arguments in the first place, and yes you generally have to rely on a degree of basic honesty that they're reporting evidence reasonably accurately and not trying to deceive you (unless there's a very strong reason to think otherwise), but it's about the arguments they come up with, and whether they have been challenged and debated and tested sufficiently, and whether all the objections have been answered adequately. A lack of personal credibility is reason to check the evidence more carefully; it's not a reason to dismiss it.

I think a model of scientific communication that relies on personal authority and not plausible argument is teaching people ideas and principles that are in opposition to science. But that's just me. Not everyone agrees, it would seem.

June 8, 2013 | Unregistered CommenterNiV


Not "personal authority"; perception & recognition that another has knowledge that counts as such for science.

I think you understand me to be saying "defer to authority" when I say "recognize" b/c I keep stressing that the recognition can't be based on an assessment of the correctness of the information that is being imparted (one accepts w/o checking math, replicating, etc.). Recognition/perception that another *has* knowledge is based on reason but is not based on *that*.

Let's try something concrete. Consider this:

New Cause of Life-Threatening Disease Identified

June 8, 2013 — Danish researchers have just published findings that explain a previously unknown mechanism used by cells to communicate with one another. The research significantly contributes to understanding why some children are born with malformations and why children and adults may develop life-threatening diseases.

Dr. Søren Tvorup Christensen (Department of Biology) and Professor Lars Allan Larsen (Department of Cellular and Molecular Medicine) at the University of Copenhagen, in collaboration with colleagues in Denmark and France, have spearheaded the recent discovery which sheds new light on the causes of a range of debilitating diseases and birth defects.

Antennae-like structures on the surface of cells

Over the years, the research group has been a leader in primary cilium research. Primary cilia are antennae-like structures found on the surface of nearly all cells in the human body. These antennae are designed to receive signals, such as growth factors and hormones, from other cells in the body and then convert these signals to a response within individual cells. Defective formation or function of these antennae can give rise to a range of serious maladies including heart defects, polycystic kidney disease, blindness, cancer, obesity and diabetes. However, there remains a great deal of mystery as to how these antennae capture and convert signals within cells.

"We have identified an entirely new way by which these antennae are able to register signals in their midst, signals that serve to determine how cells divide and move amongst one another. This also serves to explain how a stem cell can develop into heart muscle," explains Søren Tvorup Christensen.

"What we have found is that the antennae don't just capture signals via receptors out in the antennae, but they are also able to transport specific types of receptors down to the base of the antennae -- where they are then activated and might possibly interact with a host of other signalling systems. The receptors include the so-called Transforming Growth Factor beta (TGFβ) receptors which have previously been associated with birth defects and cancer. Therefore, the base of the antennae can serve as a sort of control centre that coordinates the cell's ability to manage fetal development and the maintenance of organ function in adults."

TGFβ signalling and development of the heart

Lars Allan Larsen has numerous years of experience in heart development research. He adds "we know TGFβ signalling is very important during heart development and that a failure in this system can lead to the congenital heart defects that affect roughly 1% of all newborns. Therefore, our discovery is a significant step towards demystifying the causes of congenital heart defects."

The two researchers also point out that defective TGFβ-signalling has been associated with neurodegenerative diseases such as Alzheimer's and Parkinson's disease and mental retardation. Subsequently, the research group has begun studies on how these antennae -- the primary cilia -- regulate TGFβ-signalling during, among other processes, the transformation of stem cells into nerve cells.

"It's definitely an area that will be attracting lots of attention in years to come. Globally, there is a great deal of interest in understanding why the antennae are so important for our health," concludes the pair of researchers.

The groundbreaking results have been published in Cell Reports. The research is supported by the Lundbeck Foundation, the Novo Nordisk Foundation and the Danish Heart Association (in Danish), among others.

Not my area of expertise & I take it not yours either, right?

But are you inclined to credit the report? I am. And in a gesture -- an admittedly rudimentary one, b/c the phenomenon, perception/recognition of someone else's having knowledge, is complex & likely can't be reduced to some set or list of considerations -- I'll try to say why & what that means.

To begin, I think it would be comic for me to say I am inclined to credit this report b/c I am "persuaded by the argument."

What argument? It's not a set of logical propositions. It's a description of a set of observations that have been integrated into a theory that posits various sorts of mechanisms & processes.

It coheres; but there are many times as many accounts that knit together observations into coherent stories of mechanisms and processes as there are ones that are actually true! That vast "false" class of "cohering" stories is populated w/ lots of bullshit but also lots of plausible conjectures & near misses too.

What's more, I have made none of the observations -- of cell antennae detecting one or another sort of signal etc -- on which the strength of the "argument" (if you want to call it that) depends. I could be "persuaded by arguments" involving witches & dragons if I took at face value all observations that I have not made myself!

In discerning that this is something that it makes sense for me to credit, I'm relying on my own observations but of things that are unrelated to the content of the information being imparted.

Things like the source of the story in a science news service that would go out of business if it made things up or accepted bogus or contrived reports of discoveries.

Like the credentials and status of the researchers, who it is unlikely would occupy the positions that they do or be supported in the research that they perform if they were not doing work that those qualified to recognize valid science respect.

Like the publication of the article in a journal that -- I happen to know, but if I didn't would deduce from the context -- is trusted by people who are qualified to do the relevant form of science & assess it & who in fact serve as reviewers.

Like the incentive to refrain from misleading created by the prospect that one will quickly be shown to be dishonorable or stupid if one somehow manages to "put one over" on the journal & its reviewers or on those who -- like the news service & like me -- fell for a clever fabrication of the signs of validity.

All these things & others of a similar character give me reasonable grounds, as a rational & reflective & critical thinking person, to assign weight. To credit this.

But only provisionally. If someone tomorrow comes along & publishes a result -- one that is attended by like signs that vouch that it can be trusted as conveying knowledge that counts as such from the point of view of science & that is presented as refuting this finding -- I will be prepared to revise my assessment. But in the meantime I will tend to view the report here as true.

Indeed, if it were the sort of thing that might inform an important decision, I would likely do more to satisfy myself that it is worth relying on -- but more of the same sort of thing I'm describing, which involves not "assessing an argument" but investigating and corroborating signs that these are people whose claim to know something is one I can trust.

Now, to complete this exercise, I would then point to the 100x as many reports that I could find in 30 seconds on the internet that I would, using my rational recognition faculty, dismiss as absurd. Not b/c the "arguments" -- as logical combinations of claims based on asserted observations -- are "bad arguments." But b/c I can see that the claims are being made by people under conditions that show them to be the sorts it would be unwise to accept as bearers of what science knows.

I go on this way through life. So do hundreds and hundreds of millions of others. And we do very well -- filling our minds w/ many thousands of bits of scientifically valid and validated pieces of information that we'd never have access to otherwise, and avoiding multiples of that in sheer dreck.

This is pretty much how it works.

Except when it doesn't -- in cases where we end up culturally polarized on matters of DRS. B/c I can see that what works, when things are working, is a process not of "seeing for oneself" or "evaluating arguments" etc. but rather a process of rationally discerning who knows what about what, I infer that the problem, in all the cases in which the hundreds of millions of us who normally converge on what is known to science are not converging, is something that disables our rational recognition faculty.

I say: what is that? What can be done?

If the reason that you have been characterizing my position as one counseling passive assent to "authority" is that it wasn't clear what the alternative was to the common exaggerated absurd conception of what nullius in verba means -- one that is very powerful, so powerful that it makes it nearly impossible (or maybe just impossible) to get genuinely thoughtful, intelligent people to see what the science communication problem actually is--well, then here it is.

By all means, tell me what's missing in the account. It's my best current understanding but of something too complex for me possibly to have grasped all that is important. But for you to help me, you have to see what I'm genuinely saying -- which means, too, exerting the will to see; the will to resist making me out as an advocate of something it is easy to dismiss b/c it is shallow & unappealing ("defer to authority -- that's not science!"; "the regime of experts ordering about the passive masses -- that's authoritarianism" etc. -- oh, please!).

Then, once you've helped me improve the account, or improve it enough so that we can proceed in safety to try to act on it, help me figure out what to do.

Figure out what to do to dispel a serious problem that inhibits the people who are lucky enough to live in a free society from enjoying both the bounty of scientific knowledge that that form of society is the only one truly fitted to securing, and that element of freedom which consists in not having to submit to legal obligations founded on the contempt of others for the particular vision of the good that they, using reason, have identified as the one they find most compelling.

June 9, 2013 | Unregistered Commenterdmk38

Thanks for your specific example.
I read it very differently than you do, apparently because I am an expert in the field and have decades of experience not only in the subtleties of the experiments themselves but also in how the scientists and news outlets like Science Daily or New Scientist talk about such results.
In general the actual results stand up over time, the authors' blanket statements about the importance of their discovery fade, and the claims about how these discoveries apply to vast areas of medicine are nonsense from the beginning and are an attempt to get more research funding or, by the news writers, to get more readership.
What these news releases tend to do is take the nugget of new science and coat it with many layers, predictable kinds of adhesions that sell the release, and then print it so that subscribers can see the embellished product.
Scientists like me, who are in the field and need to know what is new and what is real, spot the article by the title and the first few sentences; weigh it by the reputation of the authors, how the claimed result fits into a lot of other data, and the solidity of the field; ignore all the statements that have little scientific support; and see if there are useful nuggets left. If there are (after serious attempts to discern what is real and can be applied to our own work), we incorporate those nuggets into our own world view and move on. If there are not such nuggets, we also move on, but seldom tell others what the misstatements in the article were (there is neither time nor incentive for this telling). We evaluate articles quietly, as Carl Sagan did for the book "Chariots of the Gods," to which he gave the award of more scientific errors per page than any other book that he had read. Since I am not in the business of publishing my conclusions about an article as Sagan was, I just read it or not and move on.
Top on my list of articles to quit reading, at the moment, are the endless number that say 'genes say this', 'genes don't say this,' or 'genes don't exist in any useful sense,' when the author, a few sentences in, reveals that she or he does not understand the workings of biochemistry or genetic selection much at all and so is making statements that are not true and is trying to use these statements in support of his or her current thesis. Trying to support a hypothesis by using statements that I know to be false drives me away from an article very quickly.
Scientific articles, such as your example, can have much more verifiable content than articles woven out of whole cloth using long philosophical words that are not really defined and that do not connect to well controlled experimental data. Even scientific articles become stronger once the reader strips out the standard added verbiage that is not well substantiated by experimental facts. The standard verbiage is usually in support of the current belief system of a set of readers. This verbiage routinely has fewer supportive facts than you might think. If you go back even a few years before the proof of dark matter, continental drift, flavor shifting in neutrinos, or small control RNAs, you will find definitive statements in the science or popular science literature that such things can't exist. The statements were strongly believed but were wrong. The mechanism that fuels the belief in statements that are wrong is one of the reasons that I am trying to understand the workings of cultural cognition.
I am making a detailed set of annotations of the Christensen press release and paying attention to how I assign meaning to it or extract meaning from it. If others want to know how I do this and what I find, contact me offline. I am trying to delineate my own process of cultural cognition through a field in which I think I know a lot.

June 9, 2013 | Unregistered CommenterEric Fairfield

"I think you understand me to be saying "defer to authority" when I say "recognize" b/c I keep stressing that the recognition can't be based on an assessment of the correctness of the information that is being imparted (one accepts w/o checking math, replicating, etc.). Recognition/perception that another *has* knowledge is based on reason but is not based on *that*."

Not necessarily direct perception of the correctness of the argument, sometimes heuristic perception of the quality and circumstances of the argument.

But I keep interpreting it that way because it's what you seem to keep on saying. You have to do more than just "recognise" it if you're going to allow it to influence your views. But we've made some progress; we just need to keep talking.

"Let's try something concrete. Consider this:"

Good idea.

"Not my area of expertise & I take it not yours either, right?"


"But are you inclined to credit the report? I am."

I'm not, at least, not on the basis of what I can see here. This would get put in my "I don't know" bin.

Not that I specifically think it's wrong, there are no definite signs of that, either. It doesn't conflict with what I already know. There's nothing in it that doesn't make sense.

1) It looks like a brand new report in a technical journal. That means it hasn't had a lot of scrutiny.

2) They give conclusions but don't explain how they know. (Presumably they do in the paper itself.) There's no way to assess how strong their knowledge might be.

3) They're vague about the mechanism by which the signalling proteins are transported. Do they know? Did they just release signals near the top and then happen to find one near the bottom? Or do they know the sort of detail that indicates many alternatives eliminated, and a thoroughly tested model? They don't say, which inclines me towards the hypothesis that they don't know either.

4) There's no discussion of the limits of their discovery. What don't they know? How certain are they? What other possibilities have they thought of that might explain their results? Are the results shown in vivo or only in vitro? For all cells or only one type of cell?

5) There's no discussion of what other researchers think, how it fits into the wider debate about how inter-cell signalling works. Are there any arguments against it? Are there alternative proposals it is competing with? Is it controversial? Is anyone even trying to criticise or disprove it?

6) There's a bunch of motivational stuff in there about birth defects and degenerative diseases that seems only peripherally related - that looks like sales pitch to drum up more funding. All they've got (apparently) is some signalling proteins being transported from one part of a structure to another. It's far too early to be talking about specific diseases, or cures for them. It's like saying we've figured out how a spanner works and therefore we can build a space shuttle and go to the moon! :-)

And if they're putting a positive gloss on that bit, maybe they're doing the same for other bits, too? Hmm?

7) I've heard of several cases in the past few years of researchers faking biochemistry results related to cancer research and other big-money projects in order to get the fame and funding. They got caught out when nobody else could replicate them, but it took many years for it to happen. I'm not saying that applies here, but it might.


So, in summary, I would read this as a probable strong hypothesis about signalling proteins being transported from the top to the base of the primary cilia, which is one jigsaw piece in the giant mega-puzzle of how cells signal to one another. It's a new result, not well-developed, probably not very certain yet. And they're hoping to drum up more funding on the backs of dreams of curing all known diseases, which I think is overselling it a bit. It's maybe a tiny step in that direction.

I could be wrong. I don't know. Not my area. But I wouldn't invest money in it. :-)


"I go on this way through life. So do hundreds and hundreds of millions of others."

I don't doubt that. When I used to argue and debate about evolution, the most common argument I saw for it was that "thousands of biologists say so". It used to annoy me no end. Now that I argue in the climate debate, the most common argument appears to be the same: "thousands of climate scientists say so", "all the peer-reviewed literature says so", "97% of scientists say so". The argument obviously carries great weight with a very large fraction of the population.

But there is another portion of the population for which it carries little or no weight - at least, on things they disagree with. I've met ordinary people who are unimpressed with scientists and their air of certain superiority. They're not convinced by experts. They're cynical about the funding and politics. They prefer engineers who just get on and build stuff, who have to get things to work in the real world. I don't know how many there are - I've seen no surveys.

But - and I offer this as a mere hypothesis - I wonder if you are projecting the way you assess science stories onto everyone else? Does everyone assess science stories the same way?

I will admit, I'm not a typical case and you probably shouldn't read too much into my example. It's a combination of training and personal philosophy.

But whether I'm representative or not, I still think the point about looking at "who says so?" rather than "how do you know?" offends against the principle of nullius in verba, and common or not, practical or not, it's not what science is supposed to be about.

June 9, 2013 | Unregistered CommenterNiV

@NiV & @Eric:

I think you are both being evasive. The illustration is supposed to help promote reflection on a general phenomenon: how ordinary people "think" -- how they can & should -- about decision relevant science.

My hypothesis: they use a wealth of cues that are available to a critically reasoning person to inform her judgment of who knows what's known to science. The recognition does not involve comprehending what's known; there can be no checking of the math, no replication. There's not time; not time enough to do it, not time enough to acquire the expertise to do it competently. Yet people do, very reliably (of course not perfectly; showing me mistakes would be evasion, not engagement of my point), identify immense amounts of decision-relevant science and, just as importantly, steer clear of even larger amounts of pablum. To do that, they are forming judgments about *people* -- that this one, on this issue, is in possession of the "real thing," not counterfeit.

What is the alternative hypothesis? I take NiV to be saying that the only "reasoned" way is to perform some sort of verification that involves kicking the tires -- checking the arguments to make sure they are valid. Don't listen to a "person" -- forget who he or she is, what he or she does, what sort of training he or she received, what others think of his or her opinion, etc. Just home in on that information: is it sound?

I say that's just not realistic. Do it if you like; you'll end up knowing much less than you could.

And you'll see around you people who are instead doing what I describe -- millions of them (I am not, as you suggested in the last msg, making an argument from numbers; I'm showing you evidence) -- living lives enriched by their having come to know what is known by science by those means.

I do think at this point the burden is on you to explain how you figured out something that you have not investigated & confirmed for yourself, NiV.

I asked you about antibiotics, viruses & bacteria. Your answer was:

Only bacteria. But it's the wrong question - since most people don't know which diseases are bacterial and which viral. What they need to know is that antibiotics only work against some diseases. And that's an easy statement to test without knowing anything about viruses or bacteria.

Really? Pray tell.

June 9, 2013 | Registered CommenterDan Kahan

We have talked about 'who says so.' I have also started to focus on 'when did someone say so.'
There are a lot of coherent sounding arguments whose coherence disintegrates when you add the time element. For instance, Einstein is invoked in some arguments about dark matter or inflation of the early universe without it being pointed out that Einstein thought that the universe was static and knew nothing of dark matter or inflation, so even Einstein's proofs did not consider these concepts. In biology, anachronisms multiply like rabbits and we forget that DNA as the material of heredity was not demonstrated until 1944, the structure of DNA was not known until 1953, and the body's ability to store bacterial viruses in the intestinal walls to protect the body against disease was not known until last week, so arguments imputing DNA and viral insights into the work of Gregor Mendel in the mid-1800's can easily be wrong.
How do anachronistic 'insights', from any field, color a person's cultural cognition?
How might a person routinely eliminate from their own cognition beliefs that a proper timeline would show cannot be true? How would you even detect such anachronisms to begin with since they are seldom presented in formal arguments? Maybe legal arguments avoid anachronisms because your opponent will use the later case to invalidate the point that one lawyer is trying to make. I don't know.
Thoughts? Insights?

June 9, 2013 | Unregistered CommenterEric Fairfield

"I think you are both being evasive. The illustration is supposed to help promote reflection on a general phenomenon: how ordinary people "think" -- how they can & should -- about decision relevant science."

Evasion wasn't intended. But to some extent we can only say how we think, and the people we talk to. I agree it's probably not a representative sample of the population.

"The recognition does not inovlve comprehending what's known; there can be no checking of the math, no replication. There's not time; not time enough to do it, not time enough to acquire the expertise to do it competently."

My answer was intended to illustrate how I thought people should do it, and how I would do it. I did not, I hope you note, check any maths or replicate any results. I came to a conclusion based on heuristic arguments built around figuring out whether this was knowledge that can be traced back to science's signature way of knowing. To do that, I asked whether they gave their reasons, whether they listed their uncertainties and limitations, whether they showed how it has been checked, challenged, and debated. They haven't. There's no evidence in the article to say this is good science.

I was trying to illustrate how you could tell that without knowing any biochemistry, or very much science. I was trying to show how the nullius in verba principle can be applied by a non-scientist. If we don't take their word for it, and look at the evidence they present, what have we got? Even if you don't know what it means, you can still see if it's there.

"What is the alternative hypothesis? I take NiV to be saying that the only "reasoned" way is to perform some sort of verfication that involves kicking the tires -- checking the arguments to make sure they are valid. Don't listent to a "person" -- forget who he or she is, what he or she does, what sort of training he or she received, what others think of his or her opinion, etc. Just hone in on that information: is it sound?"

Yes, more or less. Although I wouldn't say it was the only "reasoned" way, I would say it was the only scientific way, in conformity with nullius in verba.

I do at least try to be consistent! :-)

"I say that's just not realistic. Do it if you like; you'll end up knowing much less than you could."

I could 'know' all sorts of stuff if I would only open my mind to 'alternative' ways of knowing! Scientists complain endlessly about all the weird stuff non-scientists believe in. They ought to relax and go with the flow, right? :-)

But I will try to clarify one point - I make a distinction between scientific beliefs and ordinary beliefs. Scientific beliefs are those that have the backing of scientific method. Ordinary beliefs can have any reason whatsoever, or even none at all. As a believer in freedom of belief, I will not say that people must or should believe a certain way, that certain reasons for belief are disallowed. Nor do I say such ordinary beliefs are wrong, or useless, or deluded. What I say is that they're not scientific, and I think it's perfectly possible to have unscientific beliefs about science, and scientific matters.

And there's nothing wrong with that, so long as you don't start looking down on people for having different unscientific beliefs about science, as if they were idiots or loonies for not knowing what everybody ought to know. The evolution debate is a particular case in point, but we get it in climate too.

I believe in all sorts of things, but I don't consider them scientific beliefs, and I try my best to accept that other people believe differently.

"Really? Pray tell."

Easy! It's obvious to anyone who has caught a cold when on a course of antibiotics! :-)

But you already knew that, didn't you? Is it a trick question?

June 9, 2013 | Unregistered CommenterNiV

An insight that I hope helps the discussion.

As I read and annotate the article on cellular antennae, I find that I do not yet know which of the authors' statements are supported by solid data so I am still agnostic about their claims or the embellished claims of the journal. I find that most of what I conclude from the article is actually generated by thousands of interrelated articles that I have read and not really by this one.
The authors' contribution to my knowledge base and whether their findings help me to get a little farther in solving my own scientific jigsaw puzzle, turn out not to depend on whether they are 'right' or not but on the work that I do to figure out whether they are right.
I bring a lot of background to figure out whether they are right. I analyze their results in detail because if they are right, their knowledge is of great value to me, and if they are wrong, knowing why they are wrong is also of great value. In this example, dismissing the authors' claims out of hand would carry a large cost, given the other things that I need to know.
I was surprised how much of my evaluation of this article does not depend on 'what experts think' or on whether I think the authors or the journal are reputable but on the added value to me of the potential knowledge contained in the article. Much of my internal valuing of the article or belief in the conclusions is not directly in the article at all. Someone else may claim that I am agreeing with the experts because they are experts. If I agree with the experts, it is because I have understood the data on my own and not because the experts say so.
In many other articles, I remain agnostic about the conclusions until the conclusions have enough value to me that I have to make an informed decision.
I bring much more cultural cognition to the process of evaluating this article than I realized I did.
Thank you all for helping me to make my process more conscious.
I hope that my notes on my process are helpful to the discussion here. If my notes are strongly off topic, let me know and I will keep future insights to myself.

June 9, 2013 | Unregistered CommenterEric Fairfield

Primary cilia and transport

Now that I have done some research on what the article might mean I find that:
1. The results are useful but not as apparently important as the journal would have you think they are.
2. The writing is sloppy and lazy. Words that are in common usage in English, such as 'antenna', have a very different meaning in this subfield of biology. The differences in meaning are not spelled out, so parts of the meaning of the article are hidden in specialist jargon disguised as standard English. Knowing what I do now, I would have rewritten the article to increase its actual communication value.

Any feedback on my notes and cultural cognition would be appreciated.

June 9, 2013 | Unregistered CommenterEric Fairfield


Oh dear...

Easy! It's obvious to anyone who has caught a cold when on a course of antibiotics! :-)

But you already knew that, didn't you? Is it a trick question?

Who is playing tricks here? I think your answer is one; a sort of rhetorical Houdini maneuver to escape not what I mean to be any sort of trick at all but rather a serious challenge. I want to know the answer to the question I actually posed to you.

But help me w/ this first.

I have 2 friends named Alice & Bob.

They came to me once w/a disagreement. Indeed, it involved a set of experiments each had performed. Now that I think of it, I recounted the nature of their experiments and their reporting of the results once before on this site, so I'll go back to that:

1. Alice says she knows antibiotics can treat bacterial infections because she “felt better" after the doctor prescribed them for strep throat. Bob says he knows vitamin C cures a cold because he took some and “felt” better soon thereafter.

2. Alice says that she has “seen with my own eyes” that cigarettes kill people: her great uncle smoked 5 packs a day and died of lung cancer. Bob reports that he has “seen” with his that vaccines cause autism: his niece was diagnosed as autistic after she got inoculated for whooping cough.

3. Alice says that she “personally” has “felt” climate change happening: Sandy destroyed her home. Bob says that he “personally” has “felt” the wrath of God against the people of the US for allowing gay marriage: Sandy destroyed his home. (Cecilia, meanwhile, reports that her house was destroyed by Sandy, too, but she is just not sure whether climate change "caused" her misfortune.)

They each believed the other was wrong in all respects, & wanted me to explain to the other why.

Of course, I told them they both were incorrect. Not because one or the other had reached the wrong conclusion, necessarily. But because neither of them had engaged in valid reasoning.

Bob and Alice are, as it turns out, very well educated. But we know that the US educational system doesn't do as good a job as it should in instilling what we might call "ordinary science intelligence" -- the sort of critical reasoning that Dewey identified as distinctive of science, and as an essential element of education--not because science's way of thinking is a "peculiar" one suited "for highly specialized ends" but rather because it just "is thinking"!

I tried to straighten them out -- to explain why neither of them had grounds -- based on their experiments -- for believing any of the things they were reporting. But because I only study science communication, a matter completely separate from communicating science, I predictably, comically even, failed to make headway w/ either.

I'm curious: What would you have told them, NiV?

But once you do finish w/ that, please do tell me how you figured out that antibiotics effectively treat any sort of illness. How did you do it w/o replicating all the science involved in developing antibiotics, but still evaluating only the strength of the information or arguments or evidence furnished to you & w/o considering anything else, including anything having to do with the people who supplied you with all of that?

That actually can't be done in any manner that involves valid inference. Or at least that is one part of my claim.

The other part is that it can be done -- by anyone -- & is indeed done all the time by people like Bob & Alice & in a manner that has nothing to do w/ doing experiments but that is in fact completely consistent with Dewey's understanding of how to think--which is, as he recognized, all that science is, once we abstract away from all the specific information it has yielded & all the methods it has used to discover it.

June 9, 2013 | Registered CommenterDan Kahan

Many of the public have enough basic knowledge to know when arguments by experts seem flimsy.

In the mid-80s I quit the trade I was in because the amount of travel required made it unsuitable for marriage and a family, so I returned to school to pursue engineering.

One of the views that was current in North American ancient history at the time was "Clovis First" on when the Americas were first colonized. The experts in the field were quite aggressive, and effective, in shouting down anyone who had the audacity to suggest that the Clovis were not first.

"Clovis" is starting to crack as more proof of an earlier migration is becoming accepted. It only took decades, and many retirements, for the paradigm to change.

While very much not an expert in the field, I have read for pleasure many articles and books on ancient history, and even an amateur such as me could see several gaping holes in the research. The biggest hole, over which I had many an argument with the profs in the Anthropology dept, was the fact that the digs all seemed to be in the wrong area.

Sea level was on the order of a hundred meters lower during the last ice age, when the experts were saying the land bridge between Asia and NA was blocked and migration was not possible. They were also saying that no earlier encampments were being found. The place to look for migrations during this period would be where the sea and the land met at the time, not where they meet today. No one was doing underwater research where the encampments could be expected to be found.

With the crack in Clovis First, dredging at locations where estuaries would have been located during the last ice age has found tools and points.

There are many parallels between the climate wars and the Clovis wars. The one item that is different is the amount of money involved over climate. The general public couldn't care less about the argument over who was first to migrate into NA. The fight over climate has trillions of dollars on the line, and money does make a difference in who wins or loses the argument.

June 9, 2013 | Unregistered CommenterEd Forbes

I think that the stories above could use a little embellishment to meet real life circumstances.

Alice's doctor actually hasn't gotten the lab test back yet confirming the diagnosis of her sore throat. The doctor suspects that it is not really due to strep. But since Alice is a lawyer the doctor decides to play it safe and prescribe antibiotics anyway.

High levels of vitamin C in his bloodstream makes Bob thirsty and so he drinks plenty of fluids.

Alice's great uncle worked with asbestos, but because he smoked, his disability claims were denied. The state in which her uncle worked has instituted legislation designed by ALEC to reduce such damage claims against employers.

Bob's brother was careful not to expose his infant daughter to the anti-vaxxers in his extended family, like Bob and Bob's sister's offspring, the one with the autistic daughter. But because Bob's brother thought the problem with whooping cough outbreaks was due to anti-vaxxers, he didn't realize he needed a booster for whooping cough himself. He had a minor cough but didn't think it was important. He exposed his baby to whooping cough and the baby died.

Alice lives in a state in which the state government acknowledges climate change but also one that is adroit at acquiring and utilizing Federal disaster relief funds. Her house will be completely rebuilt in its former seaside location. Because Alice realizes that the climate may be warming she is paying extra to have air conditioning installed in her new home. New neighborhood levees and pumps will be built to attempt to protect the area from future storms. To increase the tax base, a new shopping mall will be added with large paved parking lots to draw customers in from surrounding towns.

Bob worked as a carpenter. He always voted for representatives who recognized that the EPA was putting roadblocks in the way of companies trying to provide jobs. He spent years in a formaldehyde ridden FEMA trailer, miles inland from the coastline. Now he has found a new church home in this town. His new church friends there are pitching in and helping him build a new house. Because he does not trust the government at all, this home will be completely off grid. He also plans to grow his own food.

How to think about this?

June 9, 2013 | Unregistered CommenterGaythia Weis


I have revised downward my assessment of the probability that the finding reported in Cell Reports is true based on @Eric's careful reading and report.

Are you inclined to revise yours -- either by adjusting it upward or downward or, if you view Eric's report as corroborating your own impressions, by increasing your level of confidence in your initial read?

June 10, 2013 | Unregistered Commenterdmk38


I agree that members of the public seem to have this ability to sense who knows what they are talking about.

Do you see, then, why in a case like climate change -- where I know you know that the most science literate members of the population are the most polarized on what the facts of the matter are -- there must be something strange (something that isn't the norm) going on that is interfering with their ability to exercise this sense?

June 10, 2013 | Unregistered Commenterdmk38

Nice examples.
I tend to think out climate change or other discussions in terms of such examples. To get to the kinds of results that pollers report, I add up all the individual cases and come up with a percent belief number. I remember, however, that the percent belief is actually an average over many individual cases which can all be different. If reaching my goal depends only on the current average, e.g. there is a vote on this issue tomorrow, then I can ignore the details that led to that average. If reaching my goal depends on moving that average, then I often have to know about the individual cases in order to know how to speak and act to move the average.
Paying attention to these personal details and averaging over them is called agent-based economics (or political science) and is just starting to be studied by people like Dan Ariely and, at the level of the underlying neuroscience, by Daniel Kahneman.
In interacting with people in my town when I am trying to get something changed (zoning regulations are the current need), I have to remember the details, at the level of your stories, of each person. If they have my beliefs on any issues but exercising those beliefs would cause them to lose FEMA assistance, I should not be surprised when they do not exercise those beliefs.
In my town at the moment, people do not exercise beliefs that would save the town in the long term. They do not exercise these beliefs because, in the short term, to exercise the beliefs would cost them a lot of money, time they feel that they do not have, and, probably, their job.
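The averages-versus-individuals point can be made concrete with a toy numeric example. The towns and numbers here are invented for illustration only: two populations can report the same poll average while requiring entirely different persuasion strategies.

```python
# Two hypothetical towns with the same average "percent belief" (70%)
# but very different distributions of individual belief.
town_a = [0.7] * 10              # ten people, each 70% convinced
town_b = [1.0] * 7 + [0.0] * 3   # seven certain believers, three certain doubters

def average(beliefs):
    """The single number a poll would report."""
    return sum(beliefs) / len(beliefs)

def yes_votes(beliefs, threshold=0.5):
    """Outcome of a yes/no vote: individuals, not the average, decide."""
    return sum(1 for b in beliefs if b > threshold)

# Both towns poll at (numerically) 0.7...
print(average(town_a), average(town_b))
# ...but a vote tomorrow comes out differently, and moving town_b's average
# means persuading certain doubters, a different task from nudging the mildly
# convinced in town_a.
print(yes_votes(town_a), yes_votes(town_b))  # 10 and 7
```

The design point is Eric's: once you care about moving the average rather than reading it off, the underlying individual cases stop being ignorable.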

June 10, 2013 | Unregistered CommenterEric Fairfield

Dan: "there must be something strange (something that isn't the norm) going on that is interfering with their ability to exercise this sense?"

This is where we disagree. I think the public's ability "to have this ability to sense who knows what they are talking about" is working very well.

Climategate, with the Harry Readme file, just put the finishing touches on the credibility of some of the more famous of the climate concerned. Harry's comments on the integrity of the data were more damning, among those of us with an engineering background, than the horrible email comments.

June 10, 2013 | Unregistered CommenterEd Forbes

Dan, I think that the article you cite on human cilium research and TGFβ-signalling highlights a key problem in science communication. Much of the function of the initial presentation of such communication has to do with beating the researchers' publicity drum. As here, where right off the bat we are told that these researchers "have spearheaded the recent discovery which sheds new light on the causes of a range of debilitating diseases and birth defects." This sort of presentation is designed as if territory is being staked out for a patent fight. We were here first!! Our work is central to combating all of the major diseases that we can think of that might possibly be related in any way to our research!

Such self promotion may seem necessary in the battle for research funding. It is hard to say something more to the effect of: this work is a small step among many that shed light on this research area. And much of the work of our laboratory was devoted to several false starts that yielded only negative results with data that is of no further use in moving forward.

But such promotional efforts inhibit communication with others. Scientists with an interest in the field, as was I believe noted someplace above, tend to be able to sort through this hype and focus on the central part of such articles (I haven't read this one) and evaluate the actual research. The body of the research probably does cite key references by others participating in the field and appropriately delineates the actual findings.

But it is not surprising that others, especially those with a vested interest in related outcomes, might respond to a shallow review with a "Bah Humbug!"

Overall one is left with a different impression than if one immediately realized that there is a greater context for this work and other researchers working in similar areas:
The dynamic cilium in human diseases 2008
TGFβ signalling in context 2012

Or even that at least one of these particular researchers has been working in this area for some time:
2007 Christensen ST, et al
Overview of structure and function of mammalian cilia.

June 10, 2013 | Unregistered CommenterGaythia Weis


I had no idea that you were the same "Gaythia" that Bob & Alice know! Small world!

Did you hear what Bob said about drones and GM foods?!

June 10, 2013 | Registered CommenterDan Kahan

"I'm curious: What would you have told them, NiV?"

That would depend on the situation.

In many social situations, I'd have avoided the argument and waved it off. People can believe whatever they want. Tolerance and getting along is often more important than being right - and can potentially defuse the sort of antagonism that entrenches positions.

If I suspected they understood the principle but were simply not applying it, or if I thought they were interested in an argument about it (some people enjoy arguing! ;-)), then I'd probably have simply pointed out the fallacy without explaining, and seen what they said.

If I thought they were genuinely confused, and wanted to know the answer, I'd have started with asking why they thought that, and tried to tease their reasoning out of them. Then I'd have to figure out where they had gone wrong, and try to come up with some way to help them see it. Abstraction is probably the best way, if they can cope.

If they appeared to be staking out a deeply entrenched position that they felt strongly about, that they had probably argued many times before about, I might take the line of suggesting ways they could make their case more strongly by taking precautions against common objections. And then to ask them, if a doubter suggested it was just one case, how could they strengthen the argument? Motivated scepticism is a valuable scientific resource not to be wasted! So don't try to persuade them, instead just make them better at it.

I don't know. I might just make it up as I went along! :-)

I'm not sure I understand the relevance to the original question?

"But once you do finish w/ that, please do tell me how you figured out that antibiotics effectively treat any sort of illness. How did you do it w/o doing replicating all the science involved in developing antibiotics, but still evaluating only the strength of the information or arguments or evidence furnished to you & w/o considering anything else, including anything having to do with the people who supplied you with all of that?"

I'd do it the same way I've suggested for the examples given above. We start off by asking the scientists "How do we know this is so?" When it comes to drug efficacy, I'd expect them to start talking about double-blind trials and statistics and the history of mortality from infections. Then I'd ask if their work was published and scrutinised. They'd presumably talk about the medical standards people auditing their results, although I have heard that some results might not be as open about the data as they should be.

Are there explanatory models predicting effects? Yes, the dosage is calculated by body mass, and different antibiotics are used for different diseases. But not entirely, since some bacteria are unpredictably drug resistant, and quite often a doctor does not know what you've got. There's wiggle-room here.

Next, are there any contrary arguments? Well, as noted, quite often doctors don't know what you've got, so it's hard to tell if the antibiotics are working as predicted. Infections can clear up on their own quite unexpectedly, so any individual case might be coincidence. Sometimes it doesn't work, and it isn't obvious why.

Do these objections amount to anything? Nothing definite, but without analysing the trial statistics in detail it's impossible to say there's no room for doubt. In short, I'd be surprised if they had missed something, but not astonished. Do I know of any plausible reason for doubting the result, one that I would need to dig into the data further to resolve? No. So I'm content to accept it as true with medium-high confidence, subject to the usual caveats about this not being my area of expertise, etc.

What we're trying to do is tell if this has been determined by science's signature way of finding things out. The story of the discovery of antibiotics is widely recounted in schools and in popular culture, and involves petri dishes and experiments and massive mortality reductions, not certification by medical boards and experts. So I'd say, at a casual glance, it is.

I definitely don't believe it because doctors say so. There are quite a lot of things doctors say that I know are wrong.

"That actually can't be done in any manner that involves valid inference. Or at least that is one part of my claim."

Interesting. I'm curious as to why you think that.

It depends what you mean by 'valid inference'. If you mean watertight logic, then as I've said repeatedly, there's no substitute for doing the science other than doing the science. There's no royal road to geometry. In talking about the sort of scientific knowledge accessible to non-scientists (and non-specialists), we have to make some allowances.

But I'm thinking you might mean something else - that you think my approach of asking how people know, checking whether checks have taken place, etc. is somehow less logically valid than believing anything said by a man in a white lab coat. (If you know what I mean. I'm being rhetorical, here.)

Perhaps you mean people are less able to check whether the arguments are of the right form than they are at checking whether their qualifications are in order?

I'm not sure. I think we may have made some progress, but we're probably both getting tired of the argument now. We need to come back to it afresh some other day. It's an interesting question.

"Are you inclined to revise yours -- either by adjusting it upward or downward or, if you view Eric's report as corroborating your own impressions, by increasing your level of confidence in your initial read?"

I would say Eric's thoughts are similar to mine, so I haven't revised them up or down. And Eric only listed a few of the reasons for his view, in the most part relying on unstated background knowledge, so I haven't revised my confidence much, either. I agree they use too much jargon, I agree they're over-selling the 'cure for all diseases' angle. What else did Eric say?

Although I will say that had Eric been more supportive of the paper, and explained why, I might well have revised my opinion upwards. That would be just the sort of scrutiny and debate that I was looking for.

June 10, 2013 | Unregistered CommenterNiV

You implicitly brought up an interesting point. You would only change your view on the paper if I spent hundreds of hours of my time, for free, to convince you and kept changing my method of explaining until you were satisfied.
If you, or anyone else, wants a viewpoint explained in detail to you, you have to make it worth the other person's time to do the explaining. Otherwise they will just interact with someone else.
This 'is it worth the explainer's time' consideration comes up everywhere from climate change to neutrino physics to the workings of politics and law.
Thanks for making your point so clearly.

June 10, 2013 | Unregistered CommenterEric Fairfield

"You would only change your view on the paper if I spent hundreds of hours of my time, for free, to convince you and kept changing my method of explaining until you were satisfied."

Not specifically for me, and I didn't say anything about "for free".

What I mean is that science is founded on sceptical challenge - we accept only those ideas and conclusions that survive it. Scientists publish so that other scientists can verify (or refute) their arguments. We each have our own biases, so we ask people with different biases to see what we cannot. This is how science bypasses the fallibility of individual humans to achieve something better. Thus, scientific confidence arises from the degree to which sceptical scrutiny has been applied. If other scientists have looked at it critically and found nothing wrong, that increases our confidence. If nobody has ever checked it, what makes you think it's right?
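The claim that confidence grows with surviving sceptical scrutiny has a natural Bayesian reading: each independent critical check that finds nothing wrong multiplies the odds that the result is sound. A minimal sketch; the prior and the per-check pass probabilities are invented for illustration, not drawn from any real data:

```python
def posterior_after_scrutiny(prior, n_checks,
                             p_pass_if_sound=0.9, p_pass_if_flawed=0.3):
    """P(result is sound) after it survives n independent critical checks.

    p_pass_if_sound / p_pass_if_flawed are assumed chances that a sound or
    flawed result passes one check; both numbers are purely illustrative.
    """
    odds = prior / (1 - prior)
    # Each survived check multiplies the odds by the likelihood ratio.
    odds *= (p_pass_if_sound / p_pass_if_flawed) ** n_checks
    return odds / (1 + odds)

# A newly published result (no scrutiny yet) vs one that survived 5 checks:
print(posterior_after_scrutiny(0.5, 0))   # 0.5 -- unchanged
print(posterior_after_scrutiny(0.5, 5))   # roughly 0.996
```

This is the same asymmetry NiV describes: a result nobody has checked keeps whatever prior plausibility it started with, while repeated surviving of sceptical challenge is what earns the high confidence we associate with science.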

If you've sat the exam but it hasn't been marked yet, can you tell everyone you've passed?

So if there was evidence that it had been thoroughly scrutinised and passed, that builds confidence. One of the reasons I was dubious was that it appeared to be a newly published result, and therefore is unlikely to have had much scrutiny yet. But you apply such scrutiny, if you choose to, for your own purposes. I'm not asking or expecting anything of you.

(Bear in mind that in this discussion we're supposed to be taking the part of a non-scientist, non-specialist, unable to validate the maths and calculations for ourselves. Dan wants to know how such a person can come to know what science knows, without the specialist knowledge and effort required to check every detail. He wants to know how the principle of nullius in verba can be practically applied.)

But yes, if nobody checks it, and explains their reasoning in a way I can understand and assess, I'm not going to accept it as having that high degree of confidence we associate with science. Do you think I should?

June 10, 2013 | Unregistered CommenterNiV

There appear to be two difficulties of process in your approach.

1. For the listed article, I can provide you with the information necessary to check the result for yourself. That information will take you, as the everyman non-scientist, at least six years to understand and absorb. How do you make a supportable decision if you do not have the time to come to one?
2. Almost all Nobel prizes are given for some new discovery. At the time the discovery was made, there was lots of existing peer-reviewed literature that said the discovery was not possible. The Prize was awarded for showing that the existing literature was wrong. On a related tack, standard economic theory says that economic bubbles can't happen. Yet they do. How might you fit into a decision-making process the cost to a person if the established conclusion is wrong? How do you strip out the cultural adhesions so that you can see that the existing paradigm is wrong? How might you communicate well with someone who believes the current paradigm if you know that the paradigm is incorrect (see, for example, the 'primitive' potato farmers trying to talk effectively to the 'smart' European agronomists in Diamond's "The World Until Yesterday"; the farmers were right, the agronomists weren't)?

June 10, 2013 | Unregistered CommenterEric Fairfield

1. For the listed article, I already did check it for myself, and came to the conclusion shown.

As I've said, there's no substitute for doing the science except doing the science. But there is a wide range of heuristics by which we can approximate that reasoning and make an assessment. Do they explain their reasoning? Do they show their data? Has it been checked and rechecked? Is it controversial? Has anyone raised any unresolved problems? Even a person with virtually no scientific knowledge can do a lot better than taking somebody's word for it as an expert.
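The yes/no heuristics above could be caricatured as a toy checklist score. This is purely illustrative: the question list, the equal weighting, and the labels are invented here, and no real assessment is this mechanical.

```python
# Toy checklist, illustrative only: one point per nullius-in-verba question,
# summed into a crude confidence label.
CHECKLIST = [
    "Do they explain their reasoning?",
    "Do they show their data?",
    "Has it been checked and rechecked by others?",
    "Have raised objections been resolved?",
]

def heuristic_confidence(answers):
    """answers: list of booleans, one per checklist question, in order."""
    score = sum(answers)
    if score == len(CHECKLIST):
        return "medium-high confidence"
    if score >= len(CHECKLIST) // 2:
        return "provisional"
    return "agnostic"

# A freshly published, not-yet-replicated paper might score like this:
print(heuristic_confidence([True, True, False, False]))  # "provisional"
```

The point of the caricature is only that each question is answerable by a non-specialist: you can see whether reasons, data, and scrutiny are present even if you cannot evaluate the biochemistry itself.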

Will that catch the most subtle flaws in a bit of science? No, of course not! But then, neither does trusting experts.

2. That's one reason why I put virtually no weight on something being peer-reviewed, and ignore arguments by consensus. I would make no distinction between primitive farmers and smart European agronomists - I would ask both the same question: what's your evidence?

It's not perfect, but it's the best we've got.

(Incidentally, what standard economic theory says bubbles are impossible? That's a new one on me!)

June 10, 2013 | Unregistered CommenterNiV

On economics.
Stated more carefully, economics does not predict bubbles correctly. So, as one economist has said, "Economists have predicted 9 of the last 4 recessions."

June 10, 2013 | Unregistered CommenterEric Fairfield

The coercive power of an institution is inversely related to the alternatives that are available to people. A business firm can make whatever rules it pleases, but if there are alternative employers, there is a way of escape for anyone who finds the rules oppressive. The potential coercive power of parents is great because a child has no alternatives for responding, other than by misbehaving, becoming emotionally disturbed, or by running away.
Because of the historical and present-day fear of coercive power, the society and government of the emphasize individual freedom. Culture has moved away from the powerful father image that permeated the old-world order of family, church, and state. The image of Revolution as throwing off the authority of a British king has been reflected in extreme sensitivity to the possible abuse of power to the extent that even legitimate parental authority has been undermined in families.
As a result of this anti-authority ethos, many parents are not aware that freedom is the ability to make choices between alternatives, and that it has meaning only in contrast with the restraint necessary so that our freedom does not deprive others of theirs. If we did not have to take into account the effect of our behaviour on the freedom of other people, we would be free to do as we wish. In fact, we cannot avoid facing the effects of our freedom on other people.
The testing I had done so far told me nothing about the inner world. Was it possible that his visual memory and imagination were still intact? I asked him to imagine entering one of our local squares from the north side, to walk through it, in imagination or in memory, and tell me the buildings he might pass as he walked. He listed the buildings on his right side, but none of those on his left. I then asked him to imagine entering the square from the south. Again he mentioned only those buildings that were on the right side, although these were the very buildings he had omitted before. Those he had ‘seen’ internally before were not mentioned now; presumably, they were no longer ‘seen’. It was evident that his difficulties with leftness, his visual field deficits, were as much internal as external, bisecting his visual memory and imagination.
Our materialist science reduces everything to matter. Materialist science in the West says that we are just meat; we’re just our bodies, so when the brain is dead, that’s the end of consciousness. There is no life after death. There is no soul. We just rot and are gone. But actually any honest scientist should admit that consciousness is the greatest mystery of science and that we don’t know exactly how it works. The brain is involved in it in some way, but we’re not sure how. It could be that the brain generates consciousness the way a generator makes electricity. If you hold to that paradigm, then of course you can’t believe in life after death: when the generator is broken, consciousness is gone.
Be over-confident: this makes you optimistic and leads you to make high-risk decisions. As has been said, "Doubt everything or believe everything: these are two equally convenient strategies. With either, we dispense with the need to think for ourselves."
He is convinced that intellectuals underestimate the explanatory power of evolutionary theory, which to his mind is both more powerful within biology than many biologists believe and more relevant to problems outside biology than many social scientists and philosophers pretend. In an analogy that runs through the book, he likens Darwinism to a "universal acid," an allusion to childhood lore about an acid so corrosive that it eats through everything -- including the jar in which you desperately try to contain it.
The universal acid of natural selection can spread both downward from biology, explaining the origins of the universe and of life, and upward from biology, overturning our views of consciousness, cultural change, and the origin of morality. The resulting Science of Everything "eats through every concept, and leaves in its wake a revolutionized world-view." He takes this Science of Everything idea very seriously: "The idea of evolution by selection unifies the realm of life, meaning and purpose with the realm of space and time, cause and effect, mechanism and law."
Coercive decision-making: coercive persuasion consists of mind-control tactics that form part of a brainwashing practice. They are designed to greatly modify a person's self-concept, perception of reality, and interpersonal relations, and they impair the victim's ability to think straight. Brainwashing is a process that consists of two stages:
One is conditioning, used for controlling the mind of the victim, e.g., by inducing manipulative guilt, covert fear, intimidation, and mental and moral confusion, by eliciting confessions to uncommitted crimes, and by propaganda.
The other is persuasion, intended to cause an inability to think independently, e.g., by implanting suggestions in the victim's mind.
Coping with such inhumane treatment by other people requires, first of all, that one never allow the feeling of being a victim, but see oneself instead as a survivor:
Victims ask for pity; survivors look for challenge.
Victims worry about who is to blame; survivors find a way to make a difference.
Victims complain; survivors take action.

The most effective propaganda and indoctrination system is one whose victims do not realize they are being propagandized and indoctrinated. We are all familiar with the "mild" persuasive techniques used in commercial advertising campaigns to influence consumers' buying behaviour: "they" tell us we’ll be healthier, happier, and smarter if only we purchase their products.
Many people are unhappy and neurotic today partly because advertising has given them unrealistic expectations of life, of themselves, and of the products and services that are constantly pushed on them.
In this type of image, time is no longer subordinate to movement: that is, time is no longer viewed in sequential segments of movement. The image is freed from its placement in a sequence; instants need not follow the order determined by movement but may all become available to view, so that a restrictive picture to which the world must conform is no longer possible. There is no more progression of time in which all past moments are seen as contributing to the constitution of the present; rather, past moments begin to assume directions and vitalities of their own, and present and past cease to be a dualism. He acknowledges his predecessor, whose "major theses on time are as follows: the past coexists with the present that it has been; the past is preserved in itself, as past in general (non-chronological); at each moment time splits itself into present and past, present that passes and past which is preserved".
Time becomes available as the time-image, in which "time became out of joint and reversed its dependent relation to movement; temporality showed itself as it really was for the first time, but in the form of a coexistence of large regions to be explored". In this view, the present is constituted through several parallel narratives of the past, not always congruent with one another.
None of these narratives can make any greater claim to reality than the others; all permeate the present and are part of what makes it a complex reality, unbounded by closure (it remains, largely, a floating signifier).
Infiltration serves as a metaphor for the human-machine relations the vision depicts: human beings have allowed machinery to run them; they are pieces hanging on the periphery of mediated experience.

October 10, 2013 | Unregistered CommenterJJ
