How common is it to notice & worry about the influence of cultural cognition on what one knows? If one is worried, what should one do?
Just yesterday, I successfully stopped myself from telling a person that their expressed belief has not a shred of evidence to support it (in case you're wondering, it wasn't a religious belief; it was something that could be demonstrated scientifically but hasn't been). I stopped myself (pat on the head goes here) because, for one thing, I knew it would lead nowhere; and for another, I have my share of beliefs with a similar status: not supported by scientific evidence, but not disproved by it either.
Just like anyone else beyond the age of five or ten, I have a worldview: my own particular blend of education, research, life experiences, internalized beliefs, etc. And by now, this worldview isn't easy to shake, let alone change. That doesn't mean I disregard new scientific evidence, but it does mean that whenever I hear of new findings that seem to be in explicit contradiction with my worldview, I make a point of finding the source and reading it in some detail (going to a university library if need be). In 99 cases out of 100 (at least), it turns out that I don't have to change my worldview after all: sometimes the apparent contradiction results from BBC-style popularization with a healthy dose of exaggeration or downright mistakes on a slow news day; sometimes the original research arrives at a barely significant result based on far too small a sample, prettified to make it publishable; sometimes something else, or both.
But the dangerous thing is that if a reported finding does agree with my worldview, I usually don't go to such lengths to check the original source and the quality of the research (with few exceptions). There is, of course, a certain degree of confirmation bias at work here, but my time on this earth is limited and I cannot spend it all checking and re-checking what is already part of my worldview. What I do try to avoid in such cases is the very tempting assumption that now, finally, this particular belief is knowledge based on scientific evidence (unless I really checked it with at least the same rigor as described above). I am afraid I am not always successful in this... are you?
Here are my questions (feel free to add & answer others):
1. What fraction of people are likely to be this self-reflective about how they know what they know?
2. Would things be better if it were in fact more common for people to reflect on the relationship between who they are & what they know, on how this might lead them to error, and on how it might create conflict between people of different outlooks? If so, how might such reflection be promoted (say, through education, or forms of civic engagement)?
3. Okay: what is the answer to the question that Levin is posing? (I understand her to be asking not merely whether others who use her strategy think they are successful with it, but also whether that strategy is likely to be effective in general & whether there are others that might work better.) What should a person who knows about this do to adjust for the likely tendency to engage in biased search (& assimilation) consistent w/ worldview?
Another graphical model of the occasion for the Levin anxiety.