What connects resistance to new scientific theories,
religious persecution, racial prejudice, and political partisanship? The answer
is a psychological phenomenon known as cognitive dissonance. Although this
topic often comes up in the context of religion, few people seem to appreciate
that cognitive dissonance is not a rare event experienced only by those with
strong religious beliefs, but rather a known flaw in the operating system of
the human mind. We are all subject to its effects, to varying degrees, and a
widespread comprehension of this idea has the potential to be the greatest
advance for human culture in living memory.
Let us begin by looking at recent research
into the behaviour of people with strong political affiliations, as recently
reported:
> Partisans who watch presidential debates invariably think their guy won. When talking heads provide opinions after the debate, partisans regularly feel the people with whom they agree are making careful, reasoned arguments, whereas the people they disagree with sound like they have cloth for brains.
>
> Partisans also invariably believe that partisans on the other side are far more ideologically extreme than they actually are, said Stanford University psychologist Mark Lepper, who has studied how people watch presidential debates.
The article continues:
> The result reflects a larger phenomenon in which people routinely discount information that threatens their preexisting beliefs, said Emory psychologist Drew Westen, who has conducted brain-scan experiments that show partisans swiftly spot hypocrisy and inconsistencies – but only in the opposing candidate.
>
> When presented with evidence showing the flaws of their candidate, the same brain regions that Kaplan studied lighted up – only this time partisans were unconsciously turning down feelings of aversion and unpleasantness.
Although the brain
scans of political partisans are a recent addition to the body of published
research, the underlying issue has been an established part of psychology for
half a century.
In 1949, Jerome Bruner and Leo Postman demonstrated that despite the aphorism ‘seeing is believing’, the reverse is often the case – our beliefs can dictate what we perceive. The two psychologists ran a unique experiment for which they created a deck of normal playing cards with one subtle difference: some of the cards had suit symbols that were colour reversed, that is, some of the hearts were printed black, some of the spades were printed red and so forth. These altered cards were shuffled into a normal deck and were then displayed one at a time to the test subjects, who were asked to identify them as fast as possible. Initially, the cards were shown for such a short interval that accurate identification was essentially impossible; the display time was then gradually lengthened until all the cards were identified. Although all of the subjects were eventually able to identify all of the cards, no-one noticed that there was anything unusual about the deck.
When facing a black four of hearts, people
would see it either as a four of spades or as a perfectly normal red four of
hearts – their expectations about what a four of hearts should look like
dictated what they actually saw. As the display times lengthened, people
did eventually begin to notice that something was amiss, but they could not
determine what was wrong.
Quotes from the transcripts are particularly revealing. One person, gazing at a red six of spades, responded: “That’s the six of spades, but there’s something wrong with it – the black spade has a red border.” Lengthening the display time increased the confusion and hesitation experienced. One exasperated participant reported: “I can’t make the suit out, whatever it is. It didn’t even look like a card that time. I don’t know what colour it is now or whether it’s a spade or a heart. I’m not even sure what a spade looks like. My God!”
In the 1950s, studies of this kind led Leon
Festinger and his colleagues to develop the theory of cognitive dissonance: the mental discomfort we experience when we hold conflicting beliefs, or when new evidence contradicts a belief we already hold.
When the force of the dissonance is sufficiently strong, it leads to intense emotional responses such as anger, fear or hostility. Extreme responses may occur in pathological cases of unresolved cognitive dissonance, such as incidents of people blowing up abortion clinics in the name of Jesus. The drive to avoid cognitive dissonance can be so strong that people sometimes react to disconfirming evidence by strengthening their original beliefs and creating rationalisations to dismiss the disconfirming evidence. This is especially problematic when people have committed to a belief publicly.
There are three basic strategies a mind will employ to reduce cognitive dissonance:
- Adopt what other people believe: this is related to what is commonly called peer pressure, and provides an explanation for the apparently irrational need children sometimes display for some item that their peer group has adopted. Even in younger children, the need to conform to social pressures is a powerful drive.
- Apply pressure to people who believe differently: this is what we can see underlying the case of Wilhelm Reich’s persecution by the FDA, and in all manner of religious and other persecutions throughout history.
- Make the person who believes differently significantly different from oneself: this is the psychological origin of the religious label ‘heretic’ and also the scientific notion of a ‘pseudo-scientist’. It is also the origin of such horrors as ethnic cleansing.
Of course, while we might be able to spot
these behaviours in other people, we are less likely to detect them in
ourselves. In particular, many scientists, secure in their belief in the
objectivity of the scientific process, never consider that science itself might
be subject to problems originating from this phenomenon, despite widespread
documentation to the contrary.
The celebrated philosopher of science Thomas Kuhn, in his seminal work *The Structure of Scientific Revolutions* (1962), noted how the effects of cognitive dissonance applied to the scientific endeavour:
> ...novelty emerges only with difficulty, manifested by resistance, against a background provided by expectation. Initially, only the anticipated and usual are experienced even under circumstances where an anomaly is later to be observed.
Kuhn illustrated this state of affairs with the case of William Herschel’s discovery of the planet Uranus. The object was observed at least seventeen times by different astronomers between 1690 and 1781, yet none of these observations made any sense if it was a star (the prevailing assumption about most lights in the sky at the time). When Herschel suggested that the ‘star’ might have a planetary orbit, it all suddenly made sense. After this shift in perception, caused by a change in the way astronomers thought about old observations, everyone was seeing planets!
An equally famous example is the case of
Alfred Wegener, who in 1915 published a shocking new theory that the Earth’s
continents had once formed a single contiguous landmass. He claimed that over
millions of years this supercontinent had split into separate segments which
drifted apart into their current arrangement. This theory, dubbed ‘continental
drift’, was supported by extensive geological evidence. Still, British and
American geologists laughed and called the idea impossible, and Wegener died
in 1930 as an intellectual pariah.
Today, Wegener’s theory is taught to every schoolchild, and when we look at a
map of the world we consider this once impossible theory to be self-evident.
Knowledge of cognitive dissonance is relatively widespread among modern intellectuals, yet the benefit of this knowledge has been severely limited by partisan effects. People are quite capable of spotting cognitive dissonance in those with opposing beliefs, but seem utterly unable to recognise it within themselves.
But make no mistake: whoever you
are, whatever your belief system, you have been affected by your own cognitive
dissonance in the past, and you will be affected by your own cognitive dissonance
in the future. It occurs with any and all belief systems, whether
religious, scientific or otherwise, and no choice of belief system allows you
to escape it since all mental states are founded upon beliefs. Even a diehard
agnostic still has beliefs locked up in their idiolect, and in their conceptions
of self and society.
The situation is not hopeless, however, as it is possible to control and minimise the effects with experience and practice, or by patching our belief systems with appropriate philosophies. To begin with, though, you will need to observe or recall an instance of being affected by your own cognitive dissonance.
Watch for situations that cause you to
react in an extreme fashion, or that trigger an unexpected fit of rage, or examine
cases where you have taken a mental step to make a group of people
significantly different from yourself (perhaps an opposing political party, or
contrary religious stance, or even people who like a particular sport or game
you hate). Until you perceive an incidence of cognitive dissonance in your own
life, you may struggle to believe that it affects you, but make no mistake –
you are human, and this phenomenon occurs as a consequence of having a human
mind. No-one escapes it.
The rise in the diversity of races, cultures, religions and media in our pluralistic societies has arguably caused a corresponding increase in the incidence of cognitive dissonance, expressed as intolerances of all kinds. Fractious attitudes in religious matters and political partisanship only intensify the problem. If we could truly get to grips with the issue of cognitive dissonance in our modern world, we would gain the potential to solve a great many of our global problems. Until we learn to communicate effectively with each other despite the massive variations in our respective belief systems, there is little hope of serious social progress. And this communication will doubtless flounder unless we manage to keep our own cognitive dissonance on a tight leash.
Become someone exceptional – tackle this problem within yourself. Debug the operating system of your own mind by working on your responses when encountering dissonant beliefs, and try to avoid mental models that create hostile ‘us’ and ‘them’ divisions. Once we have all won our own private battles, then we can take the fight to a wider stage, and perhaps make a better world.
The opening image is Dissonance by Sungsook Setton, which I found here. As ever, no copyright infringement is intended and I will take the image down if asked.