
Is cultural cognition an instance of "bounded rationality"? A ten-yr debate

This is basically what I remember saying a couple of weeks ago at William & Mary, in a workshop co-sponsored by the Law School & Political Science Dep't. Slides here.

1. An old but continuing debate.  The paper you read for this workshop—Motivated Numeracy and Enlightened Self Government, Behavioural Policy (in press)—originates in a debate that started 10 yrs ago.

A group of us (me, Paul Slovic, Donald Braman, and John Gastil) had written a critique of Cass Sunstein’s then-latest book Laws of Fear.  In that book, Sunstein had attributed all manner of public conflict over risk to the public’s overreliance on “System 1” heuristic reasoning. The remedy, in Sunstein’s view, was to shift as much risk-regulatory power as possible to politically insulated expert agencies, whose members could be expected to use conscious, effortful “System 2” information processing.

Our response—Fear of Democracy: A Cultural Evaluation of Sunstein on Risk, Harvard L. Rev., 119: 1071-1109—criticized Sunstein for ignoring cultural cognition, which of course attributes a large class of such conflicts to the impact that cultural allegiances play in shaping diverse individuals’ risk perceptions.

The costs of ignoring cultural cognition, we argued, were two-fold. 

Descriptively, without some mechanism that accounts for individual differences in information processing, Sunstein could not explain why so many risk controversies (from climate change to gun control to nuclear power to the HPV vaccine) involve conflicts not between the public and experts but between different segments of the public.

Prescriptively, ignoring cultural cognition undermined Sunstein’s central recommendation to hand over all risk-regulatory decisionmaking to independent expert risk regulators. That recommendation presupposed that all disagreements between the public and experts originated in the public’s bounded rationality, a defect that it was reasonable to assume could not be remedied by any feasible intervention and that generated factual errors unentitled to normative respect in lawmaking.

Cultural cognition, we argued, showed that public risk perceptions on many issues were rooted in diverse citizens’ values.  It wasn’t obvious that expert decisionmaking was “better” than public decisionmaking on risks originating in publicly contested worldviews. Nor was it obvious that conflicts originating in conflicting worldviews could not be resolved by democratic decisionmaking procedures aimed at helping culturally diverse citizens to arrive at shared perceptions of the best available evidence on the dangers that society faces.

In his (very gracious, very intelligent) reply, Cass asserted that cultural cognition could simply be assimilated to his account of the reasoning deficits that distort public decisionmaking: “I argue,” he wrote, “that insofar as it produces factual judgments, ‘cultural cognition’ is largely a result of bounded rationality, not an alternative to it.” “[W]hile it is undemocratic for officials to neglect people’s values, it is hardly undemocratic for them to ignore people’s errors of fact” (Sunstein 2006).

This position—that cultural cognition and affiliated forms of motivated reasoning are rooted in “bounded rationality"—is now the orthodox view in decision science (e.g., Lodge & Taber 2013). 

But we weren’t sure it was right.  As plausible as the claim seemed to be, it hadn’t been empirically tested.  So we set out to determine, empirically, whether the forms of information processing that are characteristic of cultural cognition really are properly attributed to overreliance on heuristic reasoning.

2.  A ten-year research program. The answer we arrived at over a course of a decade of research was that cultural cognition is not appropriately attributed to overreliance on the form of heuristic information processing associated with “System 1” reasoning.  On the contrary, the individuals in whom cultural cognition exerts the strongest effects were those most disposed to use conscious, effortful, “System 2” reasoning.

This conclusion was supported by two testing strategies.

The first was the use of observational or survey methods. In these studies we simply correlated various measures of System 1/System 2 reasoning dispositions with public perceptions of risk and related facts. 

If public conflict over risk is a consequence of “bounded rationality,” then one should expect that individuals who evince the strongest disposition to use System 2 reasoning will form risk perceptions more consistent with experts’ than will individuals who evince the strongest disposition to use System 1 forms of information processing.

In addition, one would expect polarization over contested risks to abate as individuals’ proficiency in System 2 reasoning increases: those individuals can be expected to “go with the evidence” and refrain from “going with their gut,” which is filled with heuristic-reasoning crap like “what do other people like me think?”

But in fact, those predictions are not borne out by the evidence.

In multiple studies, we found that the individuals who scored highest on one or another measure of the disposition to use conscious, effortful “System 2” information processing were in fact the most polarized on contentious risk issues, including the reality of climate change, the hazards of fracking, the danger of allowing citizens to carry concealed handguns etc. (Kahan, Peters et al. 2012; Kahan 2015; Kahan & Corbin 2016).
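The logic of these survey studies can be sketched with simulated data (a hypothetical illustration of my own, not the authors' actual data, measures, or models): under "bounded rationality," the gap between cultural groups should shrink as a numeracy-like System 2 measure increases; under the cultural cognition thesis, it should widen.

```python
# Hypothetical sketch with simulated data (NOT the studies' real data):
# compare between-group gaps in risk perception among low- vs. high-
# proficiency respondents.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
numeracy = rng.uniform(0, 1, n)   # stand-in for a System 2 disposition measure
group = rng.integers(0, 2, n)     # 0/1 cultural-identity indicator

# Simulate the pattern the studies report: the between-group gap GROWS
# with reasoning proficiency.
risk = 0.5 + (group - 0.5) * numeracy + rng.normal(0, 0.1, n)

low, high = numeracy < 0.33, numeracy > 0.67
gap_low = abs(risk[low & (group == 1)].mean() - risk[low & (group == 0)].mean())
gap_high = abs(risk[high & (group == 1)].mean() - risk[high & (group == 0)].mean())

print(f"gap among low-numeracy respondents:  {gap_low:.2f}")
print(f"gap among high-numeracy respondents: {gap_high:.2f}")
```

In real analyses this pattern is typically tested with an interaction term (proficiency × cultural outlook) rather than a simple split, but the split makes the contrast between the two hypotheses easy to see.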

This recurring finding is inconsistent with the “bounded rationality” conception. It is far more consistent with the “cultural cognition thesis,” which posits that individuals can be expected to form identity-protective beliefs and to use all of the cognitive resources at their disposal to do so.

But to nail this inference down, we also conducted a series of experiments, the second type of testing strategy by which we probed Sunstein’s and others’ “bounded rationality” conception of cultural cognition and cognate forms of motivated reasoning.

These experiments consistently showed that individuals highest in the critical reasoning dispositions associated with System 2 information processing were using their cognitive proficiencies to ferret out evidence consistent with their cultural or ideological predispositions and to rationalize the peremptory dismissal of evidence inconsistent with the same (e.g., Kahan 2013).

Motivated Numeracy and Enlightened Self-government (Kahan, Peters et al. in press) reports the results of one of those studies.

3.  So what’s the upshot?  The original debate—over whether cultural cognition is a consequence of overreliance on System 1 heuristic processing—has been resolved, in my opinion.  Insofar as the individuals who demonstrate the greatest disposition to use System 2 reasoning are also the ones who most strongly evince cultural cognition, we can be confident that it is not a “cognitive bias.”

But is it a socially desirable form of information processing on socially contested risks?

That’s a different question, one to which my own answer has been very much reshaped over the course of the “Ten Year Debate.”

It is in fact perfectly rational at the individual level to engage information about societal risks in an identity-protective rather than a truth-convergent manner.  What an individual personally believes about climate change, e.g., won’t affect the risk she or anyone she cares about faces; whether as consumer, voter, public discussant, etc., her personal behavior will be too inconsequential to matter. 

But given what positions on climate change and other societal risk issues have come to signify about who she is and whose side she is on in a perpetual struggle for status among competing cultural groups, a person who forms a position out of line with her cultural peers risks estrangement from the people on whom she depends for emotional and material support.

One doesn’t have to be a science whiz to get this.  But someone endowed with the capacity to make sense of evidence in the manner associated with System 2 information processing can be expected to use those cognitive resources to achieve the everyday personal advantages that come from holding beliefs congruent with those of her cultural peers.
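A toy expected-payoff comparison makes this individual-level rationality concrete. The numbers below are illustrative assumptions of my own, not figures from the post: the chance that any one person's belief changes a societal outcome is vanishingly small, while the social cost of deviating from one's group is immediate.

```python
# Illustrative numbers only (my assumptions, not data from the post).
p_pivotal = 1e-8        # chance one person's belief affects the policy outcome
societal_stake = 1e6    # personal value of society getting the policy right
social_cost = 100.0     # cost of estrangement from one's cultural peers

# Expected payoff of holding the belief one's group disfavors, vs. conforming
truth_payoff = p_pivotal * societal_stake - social_cost
identity_payoff = 0.0

print(truth_payoff)  # far below zero: conforming dominates at the individual level
```

The asymmetry reverses at the collective level, which is exactly the commons structure the post goes on to describe.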

Of course, if everyone does this all at once, we are indeed screwed.  In that situation, diverse citizens and their democratically accountable representatives won’t converge, or won’t converge nearly as quickly as they should, on the best evidence on the risks they genuinely face. 

But sadly, this fact won’t change the psychic incentives that individuals have to use the forms of reasoning that most reliably connect their beliefs to the positions that signify membership in and loyalty to the identity-defining groups to which they belong.

This is the tragedy of the science communications commons.

We should do something to dispel this condition.  But what?

That’s a hard question.  But it’s one for which an answer won't be forthcoming if we rely on accounts of public risk perceptions that attempt to assimilate cultural cognition into the “public uses System 1, experts System 2” framework.

I suspect Cass Sunstein by this point would largely agree with everything I’m saying. 

Or at least I hope he does, for the project to overcome “the tragedy of the science communications commons” is one that demands the fierce attention of the very best scholars of public risk perception and science communication.


Kahan, D.M. & Corbin, J.C. A Note on the Perverse Effects of Actively Open-minded Thinking on Climate Change Polarization. Research & Politics (2016), doi:10.1177/2053168016676705.

Kahan, D.M. Climate-Science Communication and the Measurement Problem. Advances in Political Psychology 36, 1-43 (2015).

Kahan, D.M. Ideology, Motivated Reasoning, and Cognitive Reflection. Judgment and Decision Making 8, 407-424 (2013).

Kahan, D.M., Peters, E., Dawson, E. & Slovic, P. Motivated Numeracy and Enlightened Self Government. Behavioural Policy (in press).

Kahan, D.M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L.L., Braman, D. & Mandel, G. The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Clim. Change 2, 732-735 (2012).

Kahan, D.M., Slovic, P., Braman, D. & Gastil, J. Fear of Democracy: A Cultural Evaluation of Sunstein on Risk. Harvard Law Review 119, 1071-1109 (2006).

Lodge, M. & Taber, C.S. The Rationalizing Voter (Cambridge University Press, Cambridge; New York, 2013).

Sunstein, C.R. Laws of Fear: Beyond the Precautionary Principle (Cambridge University Press, Cambridge, UK; New York, 2005).

Sunstein, C.R. Misfearing: A Reply. Harvard Law Review 119, 1110-1125 (2006).


Reader Comments (12)

Respectfully, there are two potential flaws in the reasoning you offer here:

"It is in fact perfectly rational at the individual level to engage information about societal risks in an identity-protective rather than a truth-convergent manner. What an individual personally believes about climate change, e.g., won’t affect the risk she or anyone she cares about faces; whether as consumer, voter, public discussant, etc., her personal behavior will be too inconsequential to matter."

In many cases, a single person's views, judgments, and, ergo, behaviors regarding a risk do have both direct and indirect impacts on others. A parent who chooses not to vaccinate her child, in order to maintain culturally shared views that prefer natural vs. human-made medicines (which does not show up when interrogated by CC measures) or culturally shared views that government ought not impose such choices on individuals (which does, as one of the defining elements of Individualists), puts not only her child and her family but community members at risk. An individualist motorcycle rider who refuses to obey helmet laws, as a way of asserting cultural identity, costs us all money when he wraps his unprotected brain around a telephone pole but survives and needs lifelong care. That costs us all resources that would do much more to improve health and safety if used in other ways. There are many other examples of how individual behavior, motivated by the rational desire to maintain cultural identity, puts others at risk or indirectly impinges on the health and safety of others.
More profoundly, however, if a whole group of people act in similar ways for similar cultural identity reasons, in the name of those group values they frequently try to assert their view of the risk on society as a whole. This sometimes raises the actual risk for everyone. Denial of climate change by a large group of people, for cultural identity reasons, impedes progress on and leaves us more exposed to the dangers of climate change. Group opposition to nuclear power for reasons tied to cultural identity (among ECs more than HIs) impedes consideration of a valuable source of zero emission energy (both greenhouse gasses and particulates). There are many more examples here too.
Broadly, the whole idea of cultural identity is to maintain solidarity with and loyalty to a group, because together we have more power than we have alone, so group cohesion and acceptance helps keep us safe. The very act of an individual rationally pursuing cultural identity is therefore only the act of one individual participating as a eusocial animal, in the name of the betterment of the group as a way of enhancing his own safety and wellbeing. By definition, then, what one does in the name of the group magnifies up to the behavior of the group, and that certainly does have risk implications for people beyond each individual.

December 2, 2016 | Unregistered CommenterDavid Ropeik

While I very much admire the work you have done on this, you seem to persistently ignore a recurrent criticism of it. People like myself have major scepticism about climate change despite being surrounded by a social group who are good old-fashioned lefty believers!

I believe that I am using all my cognitive skills to search for truth, not to search for cultural approval, indeed I only need to open my mouth and I find myself in a sea of disapproving looks and spoken or unspoken demands to shut up.

I am sure that open-minded thoughtful people can come to either a pro or anti climate change position through an earnest search for truth, but I simply do not believe that their sole motivation is for cultural approval, when all the evidence around me is that I get the opposite of this.

In the same way, Dan, is your entire aim in investigating cultural cognition to obtain the approbation of your fellow psychologists? I would hope that there was more to it than that; and you should allow some of the people you study to have similar, complex motivations.

December 2, 2016 | Unregistered CommenterMichael Lowe

Michael -

Not to speak for Dan, but...

==> People like myself have major scepticism about climate change despite being surrounded by a social group who are good old-fashioned lefty believers! ==>

How representative are you? Would a skeptical person generalize from unrepresentative sampling, or at least sampling that hasn't been tested for how representative it is?

December 2, 2016 | Unregistered CommenterJoshua

Do you mean that the form of motivated reasoning Lodge & Taber present in their book "The Rationalizing Voter" is rooted in bounded rationality? Could you elaborate, please?

December 3, 2016 | Unregistered CommenterMika Sandström

@Mika--Taber & Lodge definitely see motivated reasoning as a consequence of defects in rationality. They attribute politically motivated reasoning to System 1's dominance over System 2. Their "book [is] about rationalizing, rather than rational, citizens" (p. 1). "The central component of our dual-process model and force driving the rationalization of political beliefs and judgments is hot cognition ...." (p. 74). If this were correct, then one would expect individuals who are more disposed to use System 1 or "hot cognition" to be more vulnerable to motivated reasoning than individuals who are disposed to use System 2. But in fact this is not so.

December 3, 2016 | Registered CommenterDan Kahan

@Fearless Dave--

I agree w/ much of what you say: the individual habits of mind that rationally advance individuals' interest in protecting their status w/ their cultural affinity groups often cause collective welfare to suffer.

This mismatch is the "tragedy of the science communications commons."

December 3, 2016 | Registered CommenterDan Kahan


I would say what @Joshua attributes to me. I've spelled this out a bit more in a previous post.

Cultural cognition is not a form of psychoanalysis; it describes broader patterns of reasoning but can't reliably be used to "diagnose" any particular individual's beliefs.

If I were aiming for approbation by psychologists, I'd certainly be doing a lot better if I were insisting that only climate skeptics, or conservatives generally, were succumbing to non-truth-convergent forms of reasoning.

December 3, 2016 | Registered CommenterDan Kahan

Dan, I think you and I are in basic agreement that there is no meaningful separation between "System1" and "System2" reasoning - instead, there's just reasoning, used however people feel best suits them - so I never understood why you were so keen to relate your findings to system 1/system 2 reasoning until you made this post.

Your train of argument, as I understand it, is as follows. Using various proxies for and measures of conscious, system-2 reasoning, you intended to argue against Sunstein with findings that conscious reasoning magnifies instead of suppresses polarization. Do I have that right?

But you say that Sunstein's core claim was that the general public's heuristic reasoning, as opposed to -expert- reasoning, not opposed to either system 1 or system 2 reasoning, was responsible for the general public's controversy and polarization. In that sense, I think your refutation of Sunstein is missing the mark. I never read Sunstein and don't know if he claimed that expert reasoning was better -because- experts used system-2 reasoning. If he did, I disagree with him on that point. But I think Sunstein was correct in blaming "heuristic reasoning" as the culprit.

I actually think your results validate Sunstein's attribution of cultural cognition to overreliance on heuristic reasoning. Indeed, you've shown me quite convincingly that people's heuristics are calibrated against their own identity-image, and your research has powerfully argued for the inherency of identity to knowledge, perspective and expression - i.e., that we know what we know and see what we see and say what we say because of who we are, which identity-mask we are wearing at that moment.

We are ever tempted to find ways to show that the identities we have already acquired are sufficient for the next intellectual task. The people who know the most also have the most deeply developed heuristics. People's tendency to misapply their good heuristics in contexts where they don't apply gets them into intellectual trouble and creates public controversy.

It's really unusual to answer a refutation with "I see validation," so I'm really interested in your comment.

December 3, 2016 | Unregistered Commenterdypoon

@Dan, thank you! Your answer gave me a lot to think about.

December 3, 2016 | Unregistered CommenterMika Sandström

"In many cases, a single person's views, judgments, and, ergo, behaviors regarding a risk do have both direct and indirect impacts on others. [...] An individualist motorcycle rider who refuses to obey helmet laws, as a way of asserting cultural identity, costs us all money when he wraps his unprotected brain around a telephone pole but survives and needs lifelong care. That costs us all resources that would do much more to improve health and safety if used in other ways."

There's a problem with causal attribution with that example.

Consider the following thought experiment: I as a politician pledge to burn $1000 of taxpayers' money for every week you don't attend services at the Church of Scientology, or whatever. Now I propose a new law to make you go to church, and justify it on the basis of the cost to society - you're costing everyone else $1000 a week, that we could spend on better causes, because of your "individualist" choice not to go to church.

Is that a valid justification? Who is really responsible for incurring the $1000 cost to society? Me or you?

In some cases, like vaccination, the cost is imposed on society irrespective of anyone else's decisions. In others, the choices are made in large part by others. Suppose a voter votes against the politicians with the high-spending policies on treating motorcyclists who don't wear helmets. Suppose they're willing to opt out of public provision (and having to pay for it) and pay their own medical insurance, knowing the risks and paying the costs. Do you still get to tell them what risks they're allowed to take with their own life?

"More profoundly, however, if a whole group of people act in similar ways for similar cultural identity reasons, in the name of those group values they frequently try to assert their view of the risk on society as a whole. This sometimes raises the actual risk for everyone."

Sometimes. It depends on which group is right about the true risks. The rules are designed so that society collectively gets to decide what risks to take and who to believe, because all groups are biased and fallible.

There are risks from climate change, and risks from the actions taken to address climate change. (Likewise for nuclear power.) There is disagreement over which risks are greater. Nobody knows for certain which group is closer to the truth. Do we let the believers raise the risks of economic pain and waste, in the face of higher priorities like alleviating poverty, because of their cultural belief in dangerous climate change? Do we let sceptics do the opposite? That's what we have democratic debate for - to decide such matters in a culturally neutral way, with no culture able to shut out the views of any other, and to ensure that if it turns out we did make a mistake we can at least own it. If a minority overrule the majority, because they think they know better, and turn out to be wrong (as anyone can be), the majority may be legitimately annoyed about it.

The problem is that each side is equally convinced that they're right and the other side is wrong, and that this democratic debate is dangerous, risking making the wrong decision just because the other side have these weird cultural beliefs. Things would be so much better if the elite people who knew what they were doing (i.e. experts who agree with them) were in charge unconditionally, and could override the 'ignorant' majority when they were mistaken. Each side are utterly convinced that their own beliefs are objective, and uncontaminated by the cultural effects that delude their cultural opponents. It sounds sensible and convincing. Historically, the consequences have often been horrific.

Unfortunately, it's a lesson we keep having to relearn with every new generation.

December 4, 2016 | Unregistered CommenterNiV

@Mika-- welcome!

December 5, 2016 | Registered CommenterDan Kahan

In my opinion, the paragraph below is an excellent summation. I found it at the end of Dan's review of the Sunstein book above:

"The challenge that risk regulation poses to democracy is more profound than it appears not only upon first inspection but upon second inspection as well. The material well-being of a democratic society depends on its ability to rationally manage a nearly limitless variety of often competing risks. The integrity of such a society’s commitment to self-governance depends on its ability to fashion procedures that are genuinely deliberative, open, and democratic. And its obligation to reconcile popular rule with respect for individual dignity and freedom requires it to find a mode of regulation and a strategy of regulatory discourse that deflect the ambitions of competing cultural groups to claim the law as theirs and theirs alone. "

It has generally been true, in the march forward of “Scientific Progress”, that adept and cunning individuals, and sometimes established corporations, have been able to outpace the ability of society at large to mobilize to assimilate, utilize, and effectively regulate the new science-driven technologies.

IMHO, such is the case today with the science of science communication. I like the two papers that are the subject of Dan Kahan's most recent post, but I also firmly believe that it is well past high time that attention be focused on the powerful forces who already understand how individual identities can be focused, narrowed, and exploited to further their own personal gain, as demonstrated in the battle over gun rights as described here: How is it that we can now use the science of science communication to construct a broader narrative, one that is inclusive of the aspirations of all Americans?

In the discussion above, I believe that the flaws in this statement:

"It is in fact perfectly rational at the individual level to engage information about societal risks in an identity-protective rather than a truth-convergent manner. What an individual personally believes about climate change, e.g., won’t affect the risk she or anyone she cares about faces; whether as consumer, voter, public discussant, etc., her personal behavior will be too inconsequential to matter."

include, as David Ropeik noted, the two ideas that

In many cases, a single person's views, judgments, and, ergo, behaviors regarding a risk do have both direct and indirect impacts on others.


if a whole group of people act in similar ways for similar cultural identity reasons, in the name of those group values they frequently try to assert their view of the risk on society as a whole.

But also that the direct benefit of certain positions (at least if effectively expressed through representative democracy) does work toward individuals' own economic benefit.

This has to do with risk, and how we can collectively institute controls that further the common social good, at the expense of the gains certain individuals may reap by acting alone. It is generally true, for example, that the Ogallala aquifer is being drained. But for a farmer in Eastern Colorado or Kansas, that is not a significant economic reason to individually switch back to dryland wheat or cattle ranching when the actions of others will still cause one's wells to run dry. Better to grow as much (more profitable) corn as possible, pumping as much water as needed to do so, and take the money and run. Western civilization has a long history of resource depletion and migration. Our current problem is that we are now running out of available planet.

Depending on where one is located and one's potential resources for moving elsewhere, even with a global phenomenon like anthropogenic climate change, it may be possible that the most rational individual decision really is business as usual. Many areas in the West are now oil and gas boomtowns. Look at a satellite view of the well pads surrounding Rifle, Colorado, for example. Or look at the coal strip mining that supports Gillette, Wyoming. Getting people to sacrifice their homes and livelihoods for an abstract risk, which might not directly affect them in their lifetimes, is a difficult thing to do. On the other hand, some individuals may get filthy rich and end up with large estates in the hills of Greenland. This evaluation process involves something way more than decision making based on an "identity".

I haven't seen the Lodge and Taber book cited by commenter Mika Sandström above, but I did get to this article, which seems to be based on an earlier election cycle:
I think it will be interesting to see what Lodge and Taber have to say about the election we just experienced.

In my opinion, Dan's counter argument may not apply. The data he cites are based on the entire election. As an election progresses, the common assumption, the key at least to the Democratic Party's "ground game," is that efforts to get out the vote should be targeted largely at reluctant voters, those who vote only infrequently, and at registering new voters, such as young people and minorities, who may not have participated before. The ability to inspire these people to vote can swing elections, which can be both significant and surprising, given that in the US we have such a large reservoir of eligible people who do not participate.

On the other hand, I think that there are two big problems with this. First, Hillary Clinton in particular didn't come up with some overarching inspirational and aspirational reason to vote for her. I think that there was something to be said for creating a movement, what way back used to be called "the bandwagon effect," that would draw reluctant voters in to the general excitement. Secondly, this strategy has now been repeatedly employed. Huge efforts are directed at selected swing precincts within selected swing states. Other areas are neglected. This leads to a lot of intensity directed at people who didn't want to feel intense about politics in the first place. Many of these people seem to me to be sick of being pounded on. They feel resentful to be the targets of so much attention at election time but ignored otherwise. In my personal opinion, in the last couple of weeks before the election, this "ground game" may have done more harm than good.

IMHO, the Republican establishment, because it has a large base of home-owning, stay-in-place voters, directs much of its efforts at voter suppression among those who might otherwise vote against it. The Tea Party, however, brings in another group of people that up to this point has been underrepresented in voter turnout.
So in both parties, the primaries got turnout from people that the party establishments did not expect to see.

Part of our problem is due to the fact that we have the electoral college system. A big part of the reason that less populous states sought some additional leverage had to do with slavery. In my opinion, we seem to be destined to fight battles with the same ideological origins over and over again. I believe that Harry Caudill's 1963 book, Night Comes to the Cumberlands, still offers a good explanation as to how the elite power structure in the South used a divide and conquer strategy that pitted poor whites against minorities. I think we also can look even further back at the election of Andrew Jackson. Blaming immigrants for blue collar job loss that is in reality due more to automation is, I believe, a modern extension of this same strategy.

Republicans tend to be more authoritarian, and vote to the bottom of the ticket. Democrats frequently seem to think that there is something more intelligent about an approach in which they don't vote for people they are unfamiliar with. This, along with heavy applications of campaign financing, has allowed Republicans to dominate at the state and sometimes local levels. The Republicans have used this domination to be clever at gerrymandering districts. I live in Longmont, Colorado. My portion of Boulder County has been attached to the 4th US Congressional District, represented by the far-right, anthropogenic climate change denier Ken Buck. If I lived a couple of miles to the west, my US House Representative would be Jared Polis. Jared is an Internet entrepreneur. He and his husband have 2 children. The 2nd Congressional District now includes both of the large public universities, CU and CSU, thus concentrating the state of Colorado's available Democrats. There is a definite ideological difference between Boulder and the Wyoming, Nebraska, Kansas, and Oklahoma borders. But it is more of a gradation than a Grand Canyon. The makeup of both districts supports candidates at the ideological extremes, at the expense of anyone from either party who might be more moderate.

In other words, I think it is not the people who are polarized as much as it is the process that is polarizing.

And I believe that people are not so much irrational as they are doing their best to navigate in what frequently seems to them to be irrational and difficult to predict situations. Such as, for the crucial swing and/or reluctant voters, what the outcome might be of voting for one or the other of two deeply disliked Presidential candidates. Or not voting at all.

December 11, 2016 | Unregistered CommenterGaythia Weis
