I've been invited by the University of Minnesota political science department to make a presentation on the "political psychology of misinformation." Am mulling over what to say (have till 2:00 pm tomorrow, so no rush) & was thinking something along the lines of
- misinformation isn't really much of a problem unless antagonistic cultural meanings have become attached to an empirical claim about some fact that admits of scientific investigation;
- when such meanings have taken root, accurate information won't by itself do much good; and
- therefore the kind of misinformation to worry about is public advocacy that needlessly ties policy-relevant factual issues to antagonistic cultural meanings.
Climate change is the obvious example of the third point: hierarchical-individualist activists warn that concerns over it are a smoke screen to conceal a plot to overthrow capitalism, while egalitarian-communitarian ones proffer climate change as evidence of the destructiveness of capitalist greed that necessitates severe restrictions on technology & markets. The positions are reciprocal -- by supplying vivid examples of exactly the mindset the other fears, each one actually advances the other's cause at the same time that it advances its own.
But nanotechnology risk concern furnishes an even nicer example, I think. It is, of course, sensible to investigate whether nanotechnology is hazardous, but at this point at least there's no meaningful scientific evidence that it is. Yet that hasn't stopped some advocacy groups from noisily clanging the alarm bells. Indeed, one sponsored a contest for the "best nano-free zone" symbol, with the winner to be emblazoned on t-shirts, bumper stickers, etc. The contest drew some 482 entrants.
Eighty percent of the public hasn't even heard of nanotechnology yet. This is a great way to make sure that their first exposure connects nanotechnology up with politicized issues like climate change and nuclear power. As CCP found in an experimental study, this strategy for creating cultural polarization has an excellent chance of succeeding. Good to think ahead, too, since eventually climate change, like nuclear, might lose its power to divide -- and then who would need the "public interest" groups dedicated to protecting us from the prospect that our cultural enemies will erect their worldview into a political orthodoxy?!
This might not be "misinformation" in the sense that the symposium sponsors have in mind -- but it is the sort of behavior that makes the public receptive to misinformation and impervious to sound science. It is a toxin, really, in the communication environment that democracies depend on for the reliable transmission of scientific knowledge to their citizens.