In our studies, we examine how ordinary persons -- that is, non-experts -- form perceptions of risk & related facts. But I get asked all the time whether I think the same dynamics affect how experts form their perceptions. I dunno -- we haven't studied that.
But of course I have conjectures.
BTW, "conjecture" is a great word when used in the manner Popper had in mind: to describe a position for which one doesn't have the sort of direct evidence one would like and could get from a properly designed study, but which one believes provisionally on the basis of evidence that supports related matters & subject to even better proof of a direct or indirect kind. Of course, every belief should be provisional & subject to more & better proof. But it organizes one's own thoughts & attention to be able to separate the beliefs one feels really do need to be shored up from the ones that seem sufficiently grounded that one needn't spend lots of time on them. Also, if people know which of their beliefs to regard as conjectures & habituate themselves to acknowledge that status in discussion with others who do the same, then they can all speak more freely and expansively, in ways that might help them (maybe by creating excitement or motivation) to obtain better evidence, & without worry that they will mislead or confuse one another.
So -- is expert decisionmaking subject to cultural cognition?
Yes. And No.
Yes, because to start, experts use processes akin to cultural cognition to reason about the matters on which they are experts. Those processes reflect sensitivity to cues that individuals use to orient themselves within groups they depend on for access to reliable information; they are built into the capacity to figure out whom to trust about what.
What is different about experts and lay people in this regard -- what makes the former experts -- is only the domain-specificity of the sensibilities that the expert has acquired in his or her area of expertise, which allow the expert to form an even more reliable apprehension of the content of shared knowledge within his or her group of experts.
The basis of this conjecture is an account of how professionalization works -- as a process that endows practitioners with bridges of meaning across which they transmit shared prototypes to one another that help them to recognize what is true, appropriate & so forth. My favorite account of this is Margolis's in Patterns, Thinking, and Cognition. Llewellyn called the kind of professional insight enjoyed by lawyers & judges "situation sense."
Maybe, then, we should think of this as a kind of professional cultural cognition. Obviously, when experts use it, they are not likely to make mistakes or to fall into conflict. On the contrary, it is by virtue of being able to use this professional cultural cognition -- professional habits of mind, in Margolis's words -- that they are able reliably to converge on expert understanding.
Now a bit of No: Experts, when they are making expert judgments in this way, are not using cultural cognition of the sort that nonexpert lay people are using in our studies. Cultural cognition in this sense is a recognition capacity -- made up of prototypes and bridges of meaning -- that ordinary people who share a way of life use to access and transmit common knowledge. One of the things they use it for is to apprehend the state of expert knowledge in one or another domain; lay people have to use their "cultural situation sense" for that precisely b/c they don't have the experts' professional cultural cognition.
Still, laypersons' cultural situation sense doesn't usually lead to error or conflict either. Ordinary people are experts at figuring out who the experts are and what it is that they know; if ordinary people weren't good at that, they would lead miserable lives, as would the experts.
When lay people do end up in persistent disagreement with experts, though, the reason might well be incommensurabilities in their respective systems of cultural cognition. In that case, the two of them -- experts and lay people -- both lack access to the common bridges of meaning that would allow what experts or professionals see w/ their prototypes to assume a form recognizable in the public's eye as a marker of expert insight. This is another Margolis-based conjecture, one I take from his classic Dealing with Risk: Why the Public and Experts Disagree on Environmental Issues.
Lay people can also fall into conflict as a result of cultural cognition. This happens when the diverse groups that are the sources of cultural cognition assign antagonistic meanings (or prototypes) to matters that admit of expert investigation. When that happens, the sensibilities that ordinarily enable lay people to know whom to trust about what become unreliable; the signals they pick up about who the experts are & what they know are masked and distorted by a sort of interference. This sort of problem is the main thing that I understand our studies of cultural cognition to be about.
More generally, the science of science communication, of which the study of cultural cognition is just one part, refers to the self-conscious development of the specialized habits of mind -- shared prototypes and bridges of meaning -- that will enable expert understanding of lay-person/expert misunderstandings & public conflicts over expert knowledge. The kind of professional cultural cognition we want here will allow those who acquire it not only to understand why these pathologies occur, but also to identify what steps should be taken to treat them, and better yet to prevent them from happening in the first place.
Now some more Yes -- yes scientists do use cultural cognition of the same sort as lay people.
They obviously use it in all the domains in which they aren't experts. What else could they possibly do in those situations? They might not appreciate that they are figuring out what's true by tuning in to the beliefs of those who share their values. Not only is that invisible to most of us, but it is especially likely to evade the notice of those who are intimately familiar with the contribution that their distinctive professional habits of mind make to their powers of understanding in their own domain.
We should thus expect experts -- scientists and other professionals -- to be subject to error and conflict in the same way, to the same extent that lay people are when they use cultural cognition to participate in knowledge (including scientific knowledge) about which they are not themselves experts.
The work of Rachlinski, Wistrich & Guthrie, e.g., suggests this: they find that judges show admirable resistance to familiar cognitive errors, but only when they are doing tasks that are akin to judging, which is to say, only when they are using their domain-specific situation sense for what it is meant for.
But Rachlinski, Wistrich & Guthrie also have shown that judges can be expected systematically to err in judging tasks, too, when something in their decisionmaking environment distorts or turns off their professional habits of mind.
So on that basis, I would conjecture that experts -- scientific & professional ones -- will sometimes err, and likely fall into conflict, in making judgments in their own domains when some influence interferes with their professional cultural cognition, & they lapse, no doubt unconsciously, into reliance on their nonexpert cultural cognition.
In that situation, too, we might see experts divided on cultural lines & about matters in their own fields. This is how I would explain work by Slovic & some of his collaborators (discussed, e.g., here) & by Silva & some of hers (e.g., here & here), on the power of differing worldviews and related values to explain some forms of expert disagreement. But it is notable that they always find that culture explains much less conflict among experts on matters on which they are experts than they & others have found in cases in which there is persistent public disagreement about policy-relevant science.
So these are my conjectures. Am open to others'. And am especially interested in evidence.
The key point is that my post carries on as if who is an "expert" were a perfectly straightforward thing that needs no elaboration. Not only is that not so, Fourcultures points out, but who counts as one will likely depend on criteria that vary across the ways of life associated with cultural theory.
Of course, this is correct. I was using "experts" to refer to a specialized community whose members develop and share reliable craft knowledge -- & ignoring or taking for granted a necessary condition of being an expert, viz., that your status as such be recognized by nonexperts. I don't doubt that people with different values will sometimes fight over this -- not just because they see things differently (in a cognitive sense) but also because what sorts of authority merit deference will be bound up with their conscious commitment to the values that characterize their preferred ways of life.
But I do think it is worth noting that even when culturally diverse people don't fall into disagreement on who the experts are -- that is, even when they accept a common set of criteria of expertise in a particular domain -- they often will still end up divided as a result of cultural cognition over what those experts believe.
I think this is so in many of the conspicuous conflicts over policy-relevant science that we see. In our study of cultural cognition and scientific consensus, e.g., we found that individuals of all cultural outlooks perceived that their group's position on contentious issues like climate change, nuclear power safety, and the impact of permitting citizens to carry concealed handguns was perfectly consistent with scientific consensus. But they disagreed about what scientific consensus was in these areas -- and in fact construed evidence relevant to that in a way that was congenial to their cultural values.
In other words, they were seeing different things even when they agreed on what they were looking for. This is a result, I believe, of the sort of pathological conflict in cultural meanings that interferes with the convergence we usually observe in the systems of certification that diverse cultural groups use to figure out what the "experts" believe.
But I agree that this is all about a particular domain in which members of cultural groups don't feel impelled by their values to assert conflicting claims about who is an expert, or essentially different claims about the nature of authority.
The question then is: how large is that domain relative to the one that Fourcultures is envisioning? Maybe we disagree about that?