Tuesday, December 2, 2014

On (confused, confusing) "belief-fact" distinction -- a fragment

From the revised version of The Measurement Problem:

 As used in this paper, “believe in” just means to “accept as true.” When I use the phrase to characterize a survey item relating to evolution or global warming, “belief in” conveys that the item certifies a respondent’s simple acceptance of, or assent to, the factual status of that process without assessing his or her comprehension of the evidence for, or mechanisms behind, it. I do not use “belief in” to align myself with those who think they are making an important point when they proclaim that evolution and climate change are not “mere” objects of “belief” but rather “scientifically established facts.” While perhaps a fitting retort to the schoolyard brand of relativism that attempts to evade engaging evidence by characterizing an empirical assertion as “just” the “belief” or “opinion” of its proponent,  the “fact”–“belief” distinction breeds only confusion when introduced into grownup discussion. Science neither dispenses with “belief” nor distinguishes “facts” from the considered beliefs of scientists. Rather, science treats as facts those propositions worthy of being believed on the basis of evidence that meets science’s distinctive criteria of validity. From science’s point of view, moreover, it is well understood that what today is appropriately regarded as a “fact” might not be regarded as such tomorrow: people who use science’s way of knowing continuously revise their current beliefs about how the universe works to reflect the accumulation of new, valid evidence (Popper 1959).



Reader Comments (15)

I try to use the word "think" for something I think is true based on my understanding, and that may be modified by new information or understanding, and I use the word "believe" for something that I accept as true without inspection. I think the theory of evolution is on the right track, but I don't believe in evolution.

December 2, 2014 | Unregistered CommenterFrankL

While phrased in a conciliatory manner, this is not strictly valid: a scientific fact is one which is corroborated by all observable evidence, and in the case of a directly-testable phenomenon, one which can be replicated under identical circumstances with identical results regardless of the person conducting the test. The article's first comment immediately demonstrates the inadequacy of the article's understanding of scientific fact by claiming not to "believe in" evolution. While the Theory of Evolution is not entirely complete (and cannot be, since every generation is a step towards the next genetic "leap" between species), evolution itself is an observed fact through laboratory trials and is corroborated by all available evidence. The theory merely serves to expound on HOW it happens, not IF. Were the commenter to conduct the same experiments which have definitively resulted in genetic speciation in the laboratory, they would either have to conclude that they "believe in" evolution or they "disbelieve" the evidence of their own senses and that of every other individual who has conducted the identical experiment with identical results. No matter how hard a person disbelieves in a rock or weight or gravity, it will factually injure them when it collides with their head after falling even a short distance.

December 2, 2014 | Unregistered CommenterThetaSigma

@ThetaSigma:

If you sense confusion in the first comment (I think trying to use rigid rules to constrain the words used to characterize people's intentional states toward asserted states of affairs reflects an unrealistic view of language, but FrankL is free to prove me wrong), the solution is not to compound it with more of the same.

You seem to think there are "facts" out there that science "discovers" & that's that.

No.

There is only one thing in science: valid evidence to support inferences about how things we can't see (causation, mental operations inside people, the natural history of humans, subatomic particles, etc.) work.

The "inferences" are the judgments that people logically make, and beliefs they consequently form, on the basis of evidence.

The process of accumulating evidence in support of one or another surmise, moreover, never ends. It can't. There are no empirical (i.e., factual) claims worthy of being accepted as true (i.e., being "believed") except the ones that remain permanently amenable to re-examination in light of additional evidence.

If you think someone *ought* to "believe" something, then just present the evidence that makes that proposition more worthy of being credited--i.e., "believed"--than the next best alternative, at least for the time being.

If that doesn't persuade your interlocutor, don't resort to metaphysics like "belief-fact" distinctions.

Either keep trying or just shrug your shoulders and return to the business of generating evidence & making valid inferences.

December 2, 2014 | Registered CommenterDan Kahan

We scientists already struggle to teach the public to define "Theory" in a way that does not make it sound like "only a theory <shrug>". "Belief" is already a word strongly loaded with its religion-related meaning for many of the very members of the public to whom we are trying to convey the importance of science. Since the word "believe" is not as deeply embedded in science terminology as "Theory", here I think we (communicators of science) ought to be able to be the ones who find alternative language.

I think that to convey the changing nature of science, "believe" is a poor word choice, in that "belief" to many people implies, in a religious sense, something that is adhered to despite threats and challenges and for which there is no such thing as replacement by better evidence.

December 2, 2014 | Unregistered CommenterGaythia Weis

@Gaythia:

There's just no point even getting in discussion w/ members of public about these things. If they say they don't "believe" evolution, they aren't confused about words or not "getting" that science views it as "fact" etc. They are telling you they aren't using science's way of knowing, at least on that issue.

If one *does* use science's way of knowing, one can use the word "belief" w/o any problem. Scientists say they "believe" or "disbelieve" in connection w/ propositions at issue in science *all the time* (check out IPCC, e.g.); it's just not true to say the word is foreign to the usage of science.

The word doesn't have religious connotation *unless* it is being used in a context in which it's clear that that's what it refers to.

It's just confusing to say "science isn't about belief" & doing so mistakes confusion over words for the source of conflicts much more basic than that.

Don't fight confusion w/ confusion

December 2, 2014 | Registered CommenterDan Kahan

@Dan Kahan - no proof forthcoming. My belief vs. think is just a possibly unsophisticated way that I mentally keep track of what is up in the air and what is not. All science is up in the air. To me, the existence of God is not. I see the word of God in experimental results but everyone's inferences, including my own, on what they mean or imply are up in the air.

When it comes to science communication, I leave that in the hands of the experts. I agree with your analysis in response to Gaythia, it's more than just a confusion of words, and if your recommendations are more effective in clearing the air than my favorite "science is not about belief", then good.

@ThetaSigma - "No matter how hard a person disbelieves in a rock or weight or gravity, it will factually injure them when it collides with their head after falling even a short distance."

Right - unless you are dreaming. Note, you cannot prove that you are not. That sounds facile and disingenuous, but I absolutely don't mean it that way. You have what seems to be a near-religious belief in your rock-solid grip on objective reality. I doubt your belief. I think I have a pretty good grip, but I nervously welcome (non-drug induced) experiences that make me wonder. Like wondering whether my and my opponent's political convictions are a reflection of my deep sense of objective reality and their lack thereof, instead of a clue to what flavor nut jobs we both are.

December 5, 2014 | Unregistered CommenterFrankL

@FrankL:

"All science is up in the air."

Absolutely! If one doesn't get that science treats all "facts" & all "knowledge" as provisional, then one really doesn't get what science's way of knowing is.

I agree it is perfectly sensible, too, to say "science is not about belief" when context makes clear, as it does in your comment, that one is making a point about the limits of what science purports to address & the freedom any reasoning person necessarily has to figure out for him- or herself how to make sense of all the rest (and to reject science's way of knowing altogether for that matter, although that certainly seems like a bad idea to me).

December 6, 2014 | Registered CommenterDan Kahan

The potential for confusion is at the crux of the issue. So we need to think about language, and the fact that, as a society at large, we don't actually have mutually agreed-upon terms within our language with which to communicate. And sometimes, within the boundaries of specialization, we create our own dialects.

So, for example, recently I've been delving into policy-related ecology issues. In this realm, all significant items seem to be described by 3- or 4-letter acronyms. And there actually aren't enough to go around, so sometimes one agency's acronym overlaps with that of another. It is, in fact, a way of creating and accentuating exclusive tribes with barriers to entrance, even for those of us who are professionals in more or less the same field.

If you are religious, and, in particular active in any one of many Judeo-Christian traditions in this country, "belief" is a loaded term. Depending on the orthodoxy of your religious tradition, you've received considerable education (indoctrination) into the idea that one ought to believe, and that to believe means to accept something on faith, without question, and without real world investigation of its credibility in a scientific sense.

A scientist, in communicating with a public audience, is trying to get them to see this science as built upon evidence collected by rigorous methods of physical investigation, and validated by repetition and further investigation over time. Thus, starting out by saying that the scientist "believes" something to be true, can be misinterpreted in the minds of an audience trained to think of that word as highly linked to matters of faith. If a scientist uses the word "belief", some in the audience may even see the presentation of that set of "beliefs" as a direct challenge to their own set of "beliefs". And thus tribal boundaries may be set up and accentuated.

Similarly, the word "theory" has very different meaning to different audiences. To a scientist, a statement starting out with "according to the Theory of Evolution", uses the word "Theory" to denote a large body of evidence that has stood the test of time and is very well substantiated. To many members of the public, the word "theory" means something that is still highly tentative and as of yet unproven. This difference in what is being said and what is being heard can seriously inhibit the credibility of the presentation.

I don't think that there are any easy solutions here; English is not a highly precise language, even though its large numbers of synonyms do make for great prose and poetry.

The point is, in outreach communication to others, we are trying to reach people who are not well versed in our particular dialect. Therefore, I think this statement is not valid:

"The word doesn't have religious connotation *unless* it is being used in a context in which it's clear that that's what it refers to."

I think that the point is that we are trying to convey the "science way of knowing" and to demonstrate its applicability in real world situations. I actually think that many people do use the "science way of knowing" on a regular basis. For matters of public policy, I think that we can agree that it is important that our society is able to do this. Achieving this involves not bashing into, and accentuating, tribal barriers.

So on key issues, such as climate change, we are trying to get people who are not scientists to grasp the significance of findings based on the "science way of knowing" and to, in large measure, be able to evaluate, not the details of the science, but which authorities they ought to consider to be credible sources. In that evaluation, I think that (as Dan has described in earlier posts) statements as to how many scientists *believe* in climate change are not particularly useful. Because the correctness of a belief is not a matter that many would see as determined by the numbers of adherents to that belief. It would be better to discuss how much evidence is accumulating in very diverse fields.

If we are the ones trying to communicate, then we are the ones who have to try to figure out what it is that our audience is hearing when we say stuff. In particular, it is unreasonable to assume that the audience is well indoctrinated in the context of the speaker. Unless that audience is part of the same tribe. In which case, we aren't really doing outreach anyway, we're just having a tribal gathering. Without taking active efforts to consider the context in which the audience is receiving our presentation, we are remaining aloof and isolated. And probably ignored. Or rejected.

Further, in policy debates and legal matters, science quite frequently runs into problems with the concept of "findings of fact," given what Dan describes here:

" If one doesn't get that science treats all "facts" & all "knowledge" as provisional, then one really doesn't get what science's way of knowing is."

In a court of law, or in judgments based in part on concepts set by our legal system, it is very difficult to get past the barrier of "beyond a reasonable doubt".

Religious belief is not provisional. If you are trying to move a jury to rule on the side of the best available science, in my opinion, use of the word "belief" is not going to get you there.

December 6, 2014 | Unregistered CommenterGaythia Weis

Possible investigatory topic for the new Evidence-based Science Communication group?

Certainly audience response to terms used in a presentation can be a subject of surveys and is at least somewhat accurately quantifiable, depending on the ways that the questions regarding that response are measured.

December 6, 2014 | Unregistered CommenterGaythia Weis

"A scientist, in communicating with a public audience, is trying to get them to see this science as built upon evidence collected by rigorous methods of physical investigation, and validated by repetition and further investigation over time. Thus, starting out by saying that the scientist "believes" something to be true, can be misinterpreted in the minds of an audience trained to think of that word as highly linked to matters of faith."

Interesting argument! The idea of using rigorous methods of physical investigation sounds a lot like the "nullius in verba" principle that Dan has discussed previously. Dan described it as "utterly absurd", since of course not even scientists have the time to check and verify everything they rely on in their work, which is why they do take other people's word for it, and have faith in what they say. The trick, of course, is figuring out who to put your faith in.

"So on key issues, such as climate change, we are trying to get people who are not scientists to grasp the significance of findings based on the "science way of knowing" and to, in large measure, be able to evaluate, not the details of the science, but which authorities they ought to consider to be credible sources"

Quite so. "Science's way of knowing" is diametrically opposed to "authorities they ought to consider to be credible sources". Nullius in verba. Science is the belief in the ignorance of experts. In questions of science, the authority of a thousand is not worth the humble reasoning of a single individual. So how do you teach people about science's way of knowing while maintaining their faith in the authority of "credible sources"? Isn't this a paradox? And what does it do to your authority to teach it?

And don't you have the same problem when you ask how they are to know who the credible sources are? Should they judge for themselves by evaluating the scientific methods, arguments, and evidence presented by the experts, or must they go to an authority on recognising authorities, who will tell them which authorities are genuine? Ad infinitum?

"In that evaluation, I think that (as Dan has described in earlier posts) statements as to how many scientists *believe* in climate change are not particularly useful. Because the correctness of a belief is not a matter that many would see as determined by the numbers of adherents to that belief. It would be better to discuss how much evidence is accumulating in very diverse fields."

Yes, agreed! Let's. :-)

December 6, 2014 | Unregistered CommenterNiV

There is indeed a conflict here between the scientific way of knowing (SWOK), and the non-SWOK acceptance of the conclusions of "credible" sources. In the "hard" sciences (e.g. physics), the SWOK employs rigorous logic, and mathematics is the language of description. As an example, climate science is not strictly a hard science; its conclusions are not rigorously developed from a set of fundamental axioms. It involves an enormous number of approximations and assumptions. A person interested in climate science either has to become a climate scientist in order to intelligently discuss the conclusions, or has to use the non-SWOK approach of determining the most credible sources, or a combination of both. One approach is to count up the number of publications on the subject which generally support each other, and go with the biggest group. A more sophisticated approach is to ask, for each researcher, "if they had arrived at the opposite conclusion, how would that likely impact their income and/or reputation?" Research on the effects of tobacco use funded by the R.J. Reynolds company is suspicious to say the least. Funding administered by any group of popularly elected officials will put researchers receiving this funding under pressure to arrive at conclusions which perpetuate the popularity of those officials. Intellectual honesty requires that these effects be taken into account. (That said, and not being a climatologist, I think that anthropogenic climate change is real, but I think the extreme positions on both sides are suspicious.)

December 9, 2014 | Unregistered CommenterFrankL

"Research on the effects of tobacco use funded by the R.J. Reynolds company are suspicious to say the least."

And research by campaigners to ban smoking equally so.

There are always costs on both sides of an issue. It's not unusual for a scare to be started over something that is no significant threat, destroying an industry and putting thousands of people out of work for no good reason. If the scare story is wrong, it's a scientist's duty to say so. And the people whose livelihoods are going to be affected are the only people who are likely to fund them to look. Likewise, a campaign group collects large amounts of money in donations and employs many support staff that will all be put at risk if their pet issue turns out to be bogus, so in response to claims that it is, they'll want to fund scientists to prove that it isn't. On both sides, the scientists know that their work is worthless to their employers if it can be shown to be biased, and most scientists have strong principles, and so they will generally try to do the work honestly. But scientists are human, and when they feel strongly that they are in the right, the ends are sometimes felt to justify leaning a bit to make sure the right side wins. It's rarer - but far from unknown - for what started as a well-intentioned tilt, to make sure the right answer wasn't unfairly junked by spurious measurement errors, to turn into outright fraud when the scientist in question realises they were wrong and have just backed the wrong side - and put their reputation for integrity and their career on the line in the process.

It's a common mistake, as often happens with the tobacco science story, to mistake honest scepticism expressed about a result before there was solid proof for biased denial of an 'obvious' truth that is only available with hindsight. The tobacco health lobby happened to be correct, but when they started their evidence for it wasn't solid. It was the challenge from the tobacco company scientists that forced them to do the work more rigorously, to answer all the uncertainties, so that we could be *justifiably* confident in the result. Right answer plus wrong method equals bad science. And incidentally in the process, the science of statistical epidemiology used today for drug testing was developed massively, since some of the world's best statisticians were (scientifically) arguing for the tobacco company side (e.g. Ronald Fisher).

You can't tell just from looking at motivations since everybody on all sides of contentious issues is motivated. Strong motivations are a reason to check work more thoroughly, but not in themselves reason to suspect the results. Until sufficiently checked, the answer should only be "I don't know" rather than "I think this is probably wrong".

"A person interested in climate science either has to become a climate scientist in order to intelligently discuss the conclusions, or has to use the non-SWOK approach of determining the most credible sources, or a combination of both."

There are lots of methods that are scientific without requiring one to be a highly-trained scientist to apply. You can ask about the methods they use. Do they publish their data and calculations as well as their conclusions? Are they consistent in their claims - or do they 'change with the weather' to fit whatever has just been observed? Are there graphs and equations and hard numbers, or a lot of hand-waving and speculation? Are there error bars and discussion of potential uncertainties, or is only the evidence in favour presented and all mention of the measurement errors omitted? When mistakes are made - as is inevitable - are they corrected promptly? Or do they deny that any mistakes happened, or when forced to concede one, say that it doesn't matter and deny that any of their other results could have been affected in the same way? Do they make predictions that differ consistently from the alternatives, and do the predicted number of them come true? Or do they make lots of inconsistent and wildly varying predictions and only highlight the ones that came true? Or only make predictions that cannot be confirmed until after they're dead? Are both sides of the argument presented? What are the strongest arguments against? Do they mention the limits of the approximations they use, or the assumptions they make? Do they tell you about the places where it doesn't work, or the odd results that don't fit? And so on.

There are lots of ways you can make an assessment of a scientific argument just from the style of the writing, without having to know the detailed meaning. It's not perfect, and misses a lot of problems too (scientists are skilled at writing stuff that looks rigorous to a non-expert and downplays the gaping holes in the argument), but such basic methods of scepticism and critical thinking will get rid of 90% of bogus science. And none of it relies on looking at the authority or motivations of the person making the arguments.

December 9, 2014 | Unregistered CommenterNiV

@NiV - Well I agree with about everything you say, except the first few sentences. I was not impugning the methods of the researchers supported by Reynolds, since I have no knowledge of that, just that I would be very sceptical of their research based on their funding. I agree that research by campaigners to ban smoking is also suspect.

December 9, 2014 | Unregistered CommenterFrankL

"Sceptical" is fine - that's what you should be. "Suspicious" means something slightly different, or at least, has a different connotation. But I may be reading more into it than you intended.

What I'm thinking of is the 'ad hominem' argument which occurs when the argument/conclusion is judged on the basis of characteristics of the person making the argument, rather than the content of the argument/evidence itself. It's arguably a useful heuristic if you can't evaluate the evidence, but it's still a fallacy, and highly unreliable. I'd be more "suspicious" of an ad hominem argument than I would of 'motivated' funding.

I did hear someone once reject a technical argument on the basis that the presenter's eyebrows were too close together! I'm not 100% certain they were joking.

December 10, 2014 | Unregistered CommenterNiV

@FrankL:

SWOK-- I like that. To me it mainly means inference from empirical observation-- as opposed to revealed truth (sans reason) or rationalism (sans observation; just reason)

December 10, 2014 | Registered CommenterDan Kahan
