
A Problem in Mind

What connects resistance to new scientific theories, religious persecution, racial prejudice, and political partisanship? The answer is a psychological phenomenon known as cognitive dissonance. Although this topic often comes up in the context of religion, few people seem to appreciate that cognitive dissonance is not a rare event experienced only by those with strong religious beliefs, but rather a known flaw in the operating system of the human mind. We are all subject to its effects, to varying degrees, and a widespread comprehension of this idea has the potential to be the greatest advance for human culture in living memory.

Let us begin by looking at research into the behaviour of people with strong political affiliations, as recently reported in the Washington Post:

Partisans who watch presidential debates invariably think their guy won. When talking heads provide opinions after the debate, partisans regularly feel the people with whom they agree are making careful, reasoned arguments, whereas the people they disagree with sound like they have cloth for brains.

Unvaryingly, partisans also believe that partisans on the other side are far more ideologically extreme than they actually are, said Stanford University psychologist Mark Lepper, who has studied how people watch presidential debates.

The article continues:

The result reflects a larger phenomenon in which people routinely discount information that threatens their preexisting beliefs, said Emory University psychologist Drew Westen, who has conducted brain-scan experiments that show partisans swiftly spot hypocrisy and inconsistencies – but only in the opposing candidate.

When presented with evidence showing the flaws of their candidate, the same brain regions that Kaplan studied lighted up – only this time partisans were unconsciously turning down feelings of aversion and unpleasantness.

Although the brain scans of political partisans are a recent addition to the body of published research, the underlying issue has been an established part of psychology for half a century. 

In 1949, Jerome Bruner and Leo Postman demonstrated that despite the aphorism ‘seeing is believing’, the reverse is often the case – our beliefs can dictate what we perceive. The two psychologists ran a unique experiment for which they created a deck of otherwise normal playing cards with one subtle difference: some of the cards had suit symbols that were colour-reversed, that is, some of the hearts were printed black, some of the spades were printed red, and so forth. These altered cards were shuffled into a normal deck and were then displayed one at a time to the test subjects, who were asked to identify them as fast as possible. Initially, the cards were shown for such a short interval that accurate identification was essentially impossible; the display time was then gradually lengthened until all the cards were identified. Although all of the subjects were eventually able to identify all of the cards, no-one at first noticed that there was anything unusual about the deck.

When facing a black four of hearts, people would see it either as a four of spades or as a perfectly normal red four of hearts – their expectations about what a four of hearts should look like dictated what they actually saw. As the display times lengthened, people did eventually begin to notice that something was amiss, but they could not determine what was wrong. 

Quotes from the transcripts are particularly revealing. One person, gazing at a red six of spades, responded: “That’s the six of spades, but there’s something wrong with it – the black spade has a red border.” Lengthening the display time increased the confusion and hesitation experienced. One exasperated participant reported: “I can’t make the suit out, whatever it is. It didn’t even look like a card that time. I don’t know what colour it is now or whether it’s a spade or a heart. I’m not even sure what a spade looks like. My God!”

In the 1950s, studies of this kind led Leon Festinger and his colleagues at Stanford University to develop the theory of cognitive dissonance. This holds that when a person is facing contradictory cognitions there is a driving force that compels their mind to acquire or invent new beliefs, or to modify existing beliefs, in order to reduce the conflict (or dissonance) between these thoughts. In essence, cognitive dissonance is the uncomfortable feeling that people experience when confronted by things that ‘should not be, but are’.

When the force of the dissonance is sufficiently strong, it leads to intense emotional responses such as anger, fear or hostility. Extreme responses may occur in pathological cases of unresolved cognitive dissonance, such as incidents of people blowing up abortion clinics in the name of Jesus. The drive to avoid cognitive dissonance can be so strong that people sometimes react to disconfirming evidence by strengthening their original beliefs and creating rationalisations to dismiss the disconfirming evidence. This is especially problematic when people have committed to a belief publicly.

There are three basic strategies a mind will employ to reduce cognitive dissonance: 

  • Adopt what other people believe: this is related to what is commonly called peer pressure, and provides an explanation for the apparently irrational need children sometimes display for some item that their peer group has adopted. Even in younger children, the need to conform to social pressures is a powerful drive.
  • Apply pressure to people who believe differently: this is what we can see underlying the case of Wilhelm Reich’s persecution by the FDA, and underlying all manner of religious and other persecutions throughout history.
  • Make the person who believes differently significantly different from oneself: this is the psychological origin of the religious label ‘heretic’ and also the scientific notion of a ‘pseudo-scientist’. It is also the origin of such horrors as ethnic cleansing.

Of course, while we might be able to spot these behaviours in other people, we are less likely to detect them in ourselves. In particular, many scientists, secure in their belief in the objectivity of the scientific process, never consider that science itself might be subject to problems originating from this phenomenon, despite widespread documentation to the contrary.

The celebrated philosopher of science Thomas Kuhn, in his seminal work The Structure of Scientific Revolutions (1962), noted how the effects of cognitive dissonance applied to the scientific endeavour:

...novelty emerges only with difficulty, manifested by resistance, against a background provided by expectation. Initially, only the anticipated and usual are experienced even under circumstances where an anomaly is later to be observed. 

Kuhn illustrated this state of affairs with the case of William Herschel’s discovery of the planet Uranus. The planet had been observed at least seventeen times by different astronomers between 1690 and 1781, yet none of these observations made any sense if the object being observed was a star (the prevailing assumption about most lights in the sky at the time). Following Herschel’s observations in 1781, astronomers suggested that the ‘star’ might instead have a planetary orbit, and it all made sense. After this shift in perception, which was caused by a change in the way astronomers thought about old observations, suddenly everyone was seeing planets!

An equally famous example is the case of Alfred Wegener, who in 1915 published a shocking new theory that the Earth’s continents had once been joined together as a single landmass. He claimed that over millions of years this landmass had split into separate segments which drifted apart into their current arrangement. This theory, dubbed ‘continental drift’, was supported by extensive geological evidence. Still, British and American geologists laughed and called the idea impossible, and Wegener died in 1930 as an intellectual pariah. Today, Wegener’s theory is taught to every schoolchild, and when we look at a map of the world we consider this once-impossible theory to be self-evident.

Knowledge of cognitive dissonance is relatively widespread among modern intellectuals, yet the benefit of this knowledge has been severely limited by partisan effects. People are quite capable of spotting cognitive dissonance in those with opposing beliefs, but seem utterly unable to recognise it within themselves.

But make no mistake: whoever you are, whatever your belief system, you have been affected by your own cognitive dissonance in the past, and you will be affected by your own cognitive dissonance in the future. It occurs with any and all belief systems, whether religious, scientific or otherwise, and no choice of belief system allows you to escape it since all mental states are founded upon beliefs. Even a diehard agnostic still has beliefs locked up in their idiolect, and in their conceptions of self and society. 

The situation is not hopeless, however, as it is possible to control and minimise the effects with experience and practice, or by patching our belief systems with appropriate philosophies. To begin with, however, you will need to observe or recall an instance of you yourself being affected by your own cognitive dissonance.

Watch for situations that cause you to react in an extreme fashion, or that trigger an unexpected fit of rage, or examine cases where you have taken a mental step to make a group of people significantly different from yourself (perhaps an opposing political party, or a contrary religious stance, or even people who like a particular sport or game you hate). Until you perceive an instance of cognitive dissonance in your own life, you may struggle to believe that it affects you, but make no mistake – you are human, and this phenomenon occurs as a consequence of having a human mind. No-one escapes it.

The rise in the diversity of races, cultures, religions and media in our pluralistic societies has arguably caused a corresponding increase in the incidence of cognitive dissonance, expressed as intolerances of all kinds. Fractious attitudes in religious matters and political partisanship only intensify the problem. If we could truly get to grips with the issue of cognitive dissonance in our modern world, we would gain the potential to solve a great many of our global problems. Until we learn to effectively communicate with each other despite the massive variations in our respective belief systems there is little hope of serious social progress. And this communication will doubtless flounder unless we manage to keep our own cognitive dissonance on a tight leash.

Become someone exceptional – tackle this problem within yourself. Debug the operating system of your own mind by working on your responses when encountering dissonant beliefs, and try to avoid mental models that create hostile ‘us’ and ‘them’ divisions. Once we have all won our own private battles, then we can take the fight to a wider stage, and perhaps make a better world.

The opening image is Dissonance by Sungsook Setton, which I found here. As ever, no copyright infringement is intended and I will take the image down if asked.

Comments


Thanks for the interesting article :) For me it nicely bridged On Intelligence and the book I'm currently reading, Influence.

I really enjoyed Hawkins' book (On Intelligence) as it managed to somehow describe a lot of things in one easily digestible package.

For example, according to Hawkins' theory the deck of cards trick would be explained by information flow coming down from the upper layers of the neocortex – where information is stored at a higher level – while the subject is observing the cards.

What your article really got me thinking about is how Cialdini's 'click, whirr' relates to Hawkins' neocortex theory. Maybe the relation is that automatic reactions are built when the same small bit of information is always related to the same higher-level information.

Like when you are walking down the street you can spot a friend from very far away, even from the slightest bit of information. Sometimes this fails too: you thought you saw someone you knew even though it was not them.

Also, my personal experience tells me that I tend to take these shortcuts more when I'm more emotional – when I'm sad or worried, for example.

Do you think Popper, Kuhn and Feyerabend should be required reading (and study) for people who want to learn to be scientists/philosophers/theologians? I know Kuhn feels that this cognitive dissonance serves scientists well since most of them end up just doing the dirty work of filling in the details of some theory; I don't agree with Kuhn.

Intriguingly, physicists have somehow managed to hold on to two conflicting theories for almost a century now: the Theory of Relativity and Quantum Mechanics. Can this phenomenon be adequately explained by Kuhn's theory of scientific revolutions?


Mikko: I find myself with little additional to offer, as I haven't read either of the books you mention. :) Thanks for sharing your viewpoint, though! I'll check out those two books - I'd like to stay abreast of current ideas in this field.

Suyi: I do believe philosophy of science should be an essential part of a scientist's education, yes. I'd like to see philosophy added to the curriculum in general: should we not teach future generations how to think clearly?

I don't think the presence of incompatible theories (cf. quantum mechanics and general relativity) is inconsistent with Kuhn's model. One should not necessarily think of science as consisting of a single paradigm, but rather of collections of paradigms. The paradigm of the "new synthesis" in evolutionary biology does not relate to either of the physics theories mentioned, for instance; it could be replaced with a different paradigm with no significant effect on theories of physics whatsoever.

This was Feyerabend's view too, that while we sometimes mistakenly think of science as a single body of knowledge, it is in fact a collection of disparate and often utterly unconnected models.

I tend to agree with you that Kuhn may have been a touch naive on this point, but if there was not something to act as a brake on the rate of change of ideas it might be difficult for science to make any useful progress at all! :)

Best wishes!

"Until we learn to effectively communicate with each other despite the massive variations in our respective belief systems there is little hope of serious social progress. And this communication will doubtless flounder unless we manage to keep our own cognitive dissonance on a tight leash."

some language analysis:
"doubtless" is a word hard to accept by any sceptic, even those inclined to accept your proposition.

"to keep on a tight leash" seems like a recommendation that may cause a lot of resistance in and by itself. libraries are shock full similar forms of an "ultimate appeal" - most (if not all) people will accept this recommendation only if they've already accepted the original premise i.e. they "believe in" the premise... People are suspicious of any such appeal that does not offers immediate personal benefits - a suspicion not entirely misguided...

So how do you put forward an ethical recommendation while avoiding this type of circularity?

The answer may be: you can't - at least not using the mode of teaching common to the Western tradition of philosophy.

A philosopher who has spent most of his life smashing his head against this paradox, which lies at the foundation of the ethical dilemma you hint at, is Jürgen Habermas from Germany. He gives recommendations similar to yours, but he avoids most of the neuro-psycho metaphors ("cognitive dissonance" is probably missing from his vocabulary, but I'm not sure about that...).

Comparing your rationale with Habermas' may be quite useful for moving forward by finding additional angles on this critical issue.

One more question: Do you discover cognitive dissonance just by yourself, in isolation, or do you need a partner with whom you communicate... as in the experiment you cite: how would any participant find out if there were no scientist to tell him?

And there must be mention of one of the most appealing examples of cognitive dissonance in literature: Lizzy Bennet once again.

"few people seem to appreciate that cognitive dissonance is not a rare event experienced only by those with strong religious beliefs, but rather a known flaw in the operating system of the human mind"

I don't think it should be called a flaw. As Daniel Schacter says in The Seven Sins of Memory*, the 'seven sins' are the flip sides of the features that allow memory to work as well as it does. Similarly, assuming that the human mind strives to be content, cognitive dissonance is a necessary byproduct of that operation, i.e. until there's a preponderance of evidence against, the path of least resistance is adopted and contradictory evidence is ignored or rationalised as neatly as possible.

*http://www.amazon.com/gp/product/0618219196

Chris,

your recommendation on self-observation echoes an earlier proposal that did have some success:

"Dewey felt that Alexander taught him how to stop and think before acting. He said that his study of the Alexander Technique enabled him to hold a philosophical position calmly once he had taken it or to change it if new evidence appeared."

http://en.wikipedia.org/wiki/F._Matthias_Alexander

... and the various (meditational) practices aiming at "mindfulness" seem to be related too ...

There is also a long tradition which tries to alleviate the destructive effects of "cognitive dissonance" rather than trying to "avoid" or "control" it. It's called humour.

So I would propose the following recommendation as a starting point:

to observe your own cognitive dissonance carefully - with the help of friendly (but not necessarily totally like-minded!) people - and then try to laugh at yourself and your shortcomings,

try to laugh away and thus overcome the "intense emotional responses such as anger, fear or hostility"
...it works a lot better than many contemporary rationalists may think ;-)

Many thanks for the feedback!

Gyan: a salient point, to be sure. Still, I feel that my philosophical agenda is not hurt by characterising it as a 'flaw' for the time being. Whilst I agree that such problems can be seen to result from beneficial functional properties, much of the resulting human behaviour strikes me as undesirable. This is why I choose to characterise it as a flaw, I suppose, and doing so (helpfully?) shifts the focus onto a common cause and away from assigning blame.

translucy: another explosion of ideas! :) Your criticisms strike me as valid, but as you observe yourself, perhaps we must indulge in such tricks if we are to assert a helpful opinion. It sounds like I should check out Jürgen Habermas - can you recommend a book? I'm considering 'The Future of Human Nature', but welcome your input!

(As you may have noticed, I'm moving towards ethics as my next philosophical port of call - I just had to put the metaphysics into perspective first, and there are a few issues still to resolve on this front.)

We keep coming back to Pride & Prejudice, don't we? :) Austen was a keen observer of human nature, which is perhaps why her novels have aged so well.

And I am in hearty agreement about the value of humour in alleviating cognitive dissonance! :) (And, for that matter, about the value of meditation/mindfulness training for the same). They have certainly worked wonders in my own life.

Take care!

Chris,

"such tricks" are valid tools if one realizes their "performative", "aesthetic" or simply "game-like" dimension, but that goes far beyond what western tradition has been dwelling on for so long.

Habermas in my view is best understood as someone who tries to find an answer to the "German catastrophe" of 1933-45, including the shocking inability or even unwillingness "to realize what is going on around you".

So my starting point would be his discussions of some of the thinkers he tried to answer, from Heidegger to Arendt, in "Philosophical-Political Profiles" - basically a blog in the form of a book :)

His observations on the practical conditions under which "such tricks" may or may not work on a global scale are widely known under the headline "theory of communicative action" - but his style is famously cryptic, so most of the introductions (e.g. on the web) may be good enough, just try out 2 or 3 and look for the inconsistencies ;)

In order to grasp the ethical dimensions of his attempt to found some form of practical or even pragmatist's rational, deliberative practice of discourse and "ethics after Auschwitz" (which is distinct from over-simplifying utilitarian or "scientific" pragmatics), it is important to understand how he tries to find extensions to Arendt's observations on totalitarianism.

So an introduction (e.g. On Violence) to Arendt's thoughts on power, violence and the communicative dimension of action may help you a lot in accessing Habermas' ethical recommendations. A lot of people in Germany really discuss Heidegger-Arendt-Habermas as a package, with Habermas and his disciples (like A. Honneth) being those who finally brought the analytical-linguistic tradition into the picture.

Habermas' thoughts on bioethics are of course also concerned with the possibility of bio-totalitarianism (which is feared deeply in Germany), and with a suspicion towards the current biotech optimism, which is led by industry rather than democratic institutions ... I can't really tell if that's a good starting point or not.

It's funny that you should mention Arendt; her book 'The Human Condition' is currently at the top of my reading list (it was recommended to me by Anne Galloway); I expect to be ordering it later this week. Should I consider getting 'On Violence' as well, do you think?

Chris,

my reading of "On Violence" is as a more politically focused treatment of the chapter on "action" from "The Human Condition" and some thoughts from "The Origins of Totalitarianism"; therefore "The Human Condition" should be fine if you want to dive right into the whole "Arendt cosmos" rather than focus on the practical/political aspects of ethics and morals.

The article is very nice. Please contact me by mail if there is more information in future.

K.Ramachandran MSc.,
