
Moses' Bill of Rights

What would the Ten Commandments say if they were worded as a Bill of Rights rather than a list of prohibitions?

One of the most radical changes in the way Western philosophy has dealt with ethics came with Immanuel Kant, who asserted that it was possible to derive natural rights from reason alone. This opened the door to a re-imagining of teleological ethics into deontological ethics, which in turn was a major force in giving us the modern notion of moral rights. The rights-focussed approach is one of three main schools of thought on ethics (the other two being what I have termed outcome-focussed and agent-focussed).

My general thesis on ethics is that any one form of ethical perspective is in principle transformable into any of the others, and any that are not have a weaker claim to moral force, since they fail to be universal. With this in mind, I present here my rendering of the Ten Commandments, originally composed by Moses and/or God (according to your preferred beliefs), translated into a rights-based language game:

  1. The top deity or absolute transcendent force has the exclusive right to be known as God.
  2. You have the right to avoid the worship of idols.
  3. You have the right to trust sacred oaths.
  4. You have the right to at least one day of rest (and/or worship) out of seven.
  5. You have the right to be respected by your children.
  6. You have the right to be alive, and thus may not be killed intentionally.
  7. You have the right to trust that your spouse will not engage in sexual intercourse with others without your express permission.
  8. You have the right to own possessions.
  9. You have the right to factual testimony in legal proceedings.
  10. You have the right to trust that no-one will scheme to take away your spouse or possessions.

Some commentary may help explain my choices:

  1. I like this formulation as I believe it captures what the first commandment intends, but it's a formulation which has little or no effect on non-believers. "Top deity" may seem a strange choice, but the original wording here is clearly dictating henotheism - devotion to one deity, while recognising others. (If you don't think this makes sense, explain why the stories told in Exodus say the Egyptians are able to transform sticks into snakes as well as Moses and Aaron). The first commandment does not say "there are no other gods", it says "have no other gods before me" (and this is expressly addressed to the Israelites). Maintaining the henotheistic angle makes this formulation compatible with many more varied faiths, which I see as desirable.
  2. The reason why "graven images" are being prohibited is that these were a form of metaphysical con common at the time. In fact, the Greek additions to the book of Daniel contain a wonderful story called "Bel and the Dragon", arguably the first detective story - it exposes a particular idol as a fake by an early application of what we now call the scientific method. (It annoys me that this was cut from Protestant Bibles). So here I have worded this as a protection against being forced to worship idols, and an idol in this case would include any substitute for divinity, such as when certain Christians take the wording of the Bible as more important than following Jesus' teachings. Under this formulation, they are free to choose to do this - but not to force this onto others under any circumstances.
  3. Not "taking the Lord's name in vain" appears to have been a prohibition about invoking God in an oath and then breaking that oath. Here I have used "sacred oath" as this covers a wider range of related issues.
  4. This one is so often lost in translation - the Sabbath wasn't just being promoted as a requirement for worshipping God (for the Israelites), it was being given as a day of rest. In our modern society, where we work people (especially the poorer people) to the bone, this idea of a right to a day of rest has become obscured.
  5. This is a straight inversion of "Honour your father and mother" into rights language.
  6. This is always interesting. While pro-life individuals may well interpret this as precluding abortion, it's not clear at what point this right will apply (from conception or from birth) so the argument for this remains contestable. What is clearer is that it precludes capital punishment, and intentional attempts to kill in war. I have great respect for soldiers, but I have never understood how any Christian could join a military force that is intent upon murder, even on the battlefield. This isn't the only way war could be conducted - it's about time we started to explore new ethics of warfare.
  7. Now this may seem like I'm allowing for an overly permissive interpretation of "adultery" here, but the point is to capture what is meant by this word. Remember that Sarai (Sarah) encouraged Abram (Abraham) to have sex with Hagar when Sarai thought she was barren, so multiple consensual sexual relations are recorded in the Bible. Plus, wording this way allows for polyamorous relationships, which some religious and non-religious people practice.
  8. A prohibition on theft is a right to own possessions.
  9. This one is so often misinterpreted, even by Kant. "False witness" is a legal claim - this commandment doesn't say "don't lie", it says "don't perjure". That's a very different assertion!
  10. Finally, not "coveting your neighbour's wife" et al is concerned with not making plans to take what is promised to or belongs to someone else. This is a subtle extension of (7) and (8) that seems to be intended to exclude conspiracy and conflict arising from jealousy, although the original intent is perhaps rather to encourage people to be content with what they have. The rewording seems to suggest that when a Government exercises "eminent domain" it violates the tenth commandment, which I find particularly interesting since the US Government exercises this power all too often.

What do you think about this transposition of the Ten Commandments? Whatever your beliefs, I'd be interested in your perspective, so let me know your thoughts in the comments!


Fishing for Troughton

[Image: Patrick Troughton]

For some months now, I have been working my way through the old Doctor Who reconstructions. As of today I only have three Patrick Troughton serials to download before I have reached the end of the black and white era of this, the longest-running science fiction television show in the world.

I've been fishing for seeds and peers to download these from for almost a year now, and I'm enormously grateful to the geeks of the internet for sharing these shows over peer-to-peer clients such as μtorrent (that's "mutorrent", not "utorrent", a waggish distinction apparently chosen to guarantee future arguments). The reconstructions are not copyrighted material, being largely fan-made ensembles of surviving footage and still photographs wed to scratchy soundtrack recordings, and thus are entirely legal to download in this manner. It takes a certain dedication to the show to enjoy watching them, but I have found it a delight to experience what the William Hartnell and Patrick Troughton eras of the show were truly like, and each has a charm all of its own.

The reason for the reconstructions in the first place was that the BBC, during the 1960s and 1970s, systematically destroyed or wiped its archive material - including 108 of the first 253 episodes of Doctor Who. So when the odd serial in the sparsely populated Troughton era contains a whole intact episode, and I get to see Frazer Hines' character Jamie running from a ridiculous man-in-a-suit Yeti monster, or enjoy the magnificently expressive face of Troughton's Doctor, it gives me an odd frisson - a thrill born from the scarcity of the surviving video material. When I was able to watch the manifest absurdity of the original tin-foil-suited Cybermen in The Tenth Planet serial, at the end of William Hartnell's run on the show, I couldn't help but relish it and enjoy the innocent imagination of 1960s science fiction.

Thank you Doctor Who fans for taking the time to lovingly piece together these reconstructions, and thank you internet geeks for sharing them. I am in debt to you all.


A Secular Age (7): The Immanent Frame

[Image: Richard Lewontin]

By this term, the immanent frame, Taylor designates the perspective on the universe that has emerged as a consequence of disenchantment, the buffered identity and other changes in our social and cosmic imaginaries. This frame “constitutes a ‘natural’ order, to be contrasted to a ‘supernatural’ one, an ‘immanent’ world over against a possible ‘transcendent’ one.” It is a perspective that, broadly speaking, we all share – although our interpretations of it may differ.

Taylor states in this regard:

And so we come to understand our lives as taking place within a self-sufficient immanent order; or better, a constellation of orders, cosmic, social and moral… these orders are understood as impersonal. This understanding of our predicament has as background a sense of our history: we have advanced to this grasp of our predicament through earlier more primitive stages of society and self-understanding. In this process, we have come of age… The immanent order can thus slough off the transcendent. But it doesn’t necessarily do so. What I have been describing as the immanent frame is common to all of us in the modern West, or at least that is what I am trying to portray. Some of us want to live it as open to something beyond; some live it as closed. It is something which permits closure, without demanding it.

This outlook effectively voids all mystery by splitting nature from supernature. Taylor notes that this provides the modern concept of the “miracle” as “a kind of punctual hole blown in the regular order of things from outside, that is, from the transcendent.” He notes that this is a view “shared between materialists and Christian Fundamentalists. Only for these, it provides proof of ‘miracles’, because certain things are unexplained by the normal course of natural causation. For the materialist, it is a proof that anything transcendent is excluded by ‘science’.” The materialist position is thus that the immanent frame is closed; there is nothing beyond it, while other belief systems allow for transcendence.

There is a certain draw towards treating the immanent frame as closed among certain people (and in particular, the scientific establishment) – Taylor talks of a “sense of being menaced by fanaticism” as being “one great source of the closure of immanence.” As in nineteenth-century France, an anti-clerical movement turns first into a rejection of Christianity, and later into atheism. By adopting a closed interpretation of the immanent frame it seems as if society can shrug off the chief source of fanaticism – but this impression rests on the fallacious idea that fanaticism emerges solely from religious beliefs. It is, of course, perfectly possible to be a scientific fanatic, a Marxist fanatic, or a fanatic of some other position which rests wholly upon the immanent frame. Fanaticism is a facet of humanity, not of religion.

The perspective that Taylor builds, therefore, separates the immanent frame itself, which is not a serious subject of dispute, from the two equally possible “spins” – open versus closed:

Some people will undoubtedly feel that the immanent frame calls out for one reading. True, we can adopt the other view by dint of a determined (and not quite intellectually honest) “spinning”, but one reading is the obvious, the “natural” one. In the nature of things, that claim is made today most often by protagonists of the “closed” reading, those who see immanence as admitting of no beyond. This is an effect of the hegemony of this reading, especially in intellectual and academic milieux… By contrast, my understanding of the immanent frame is that, properly understood, it allows of both readings, without compelling us to either. If you grasp our predicament without ideological distortion, and without blinders, then you see that going one way or another requires what is often called a “leap of faith”.

Mindful of the reluctance of people who have a closed reading of the immanent frame to recognize any aspect of their belief system as requiring faith, let alone a “leap of faith” (because of the religious tenor this term has acquired), Taylor expresses this idea in more neutral language by saying “both open and closed stances involve a step beyond available reasons into the realm of anticipatory confidence.” He states that no-one who can see solely the closed or the open reading as valid is fully lucid about our actual situation – one must recognize that “one’s confidence is at least partly anticipatory”. But it is far easier to fall for one kind of “spin” or the other, to avoid seeing the neutral space between them, because this “spin” is “a way of avoiding entering [neutral] space, a way of convincing oneself that one’s reading is obvious, compelling, allowing of no cavil or demurral.”

Much of his discussion of the immanent frame focuses upon the closed reading, but Taylor notes: 

Of course, there are those who think that the open reading is obvious and inescapable, because, for instance, the existence of God can be “proven”. But such people are perhaps less numerous today than their secularist opposite numbers, and certainly cannot approach the intellectual hegemony their opponents enjoy, and so my arguments here will mainly address the latter.

In footnotes, Taylor explores some examples of the “spin” of the closed stance. He talks about Professor Dawkins’ “reasons for believing that science can sideline religion” and notes that these rest on an “oversimple distinction between ‘faith’ and ‘science’”. He quotes Dawkins as saying: “Faith, being belief that isn’t based on evidence, is the principal vice of any religion,” whereas science “is free of the main vice of religion, which is faith”. Taylor counters this naïve view by observing that “to hold that there are no assumptions in a scientist’s work which aren’t already based on evidence is surely a reflection of blind faith, one that can’t even feel the occasional tremor of doubt. Few religious believers are this untroubled.”

Against Dawkins’ fanaticism, Taylor offers evolutionary biologist Richard Lewontin’s perspective as an example of a proponent of the materialist viewpoint who is “quite lucid about their prior ontological commitments”. Lewontin (who is pictured above) is quoted as follows:

Our willingness to accept scientific claims that are against common sense is the key to understanding the real struggle between science and the supernatural. We take the side of science in spite of the patent absurdity of its constructs, in spite of its failure to fulfill many of its extravagant promises of health and life, in spite of the tolerance of the scientific community for unsubstantiated just-so stories, because we have a prior commitment, a commitment to materialism.

It is not that the methods and institutions of science somehow compel us to accept a material explanation of the phenomenal world but, on the contrary, that we are forced by our a priori allegiance to material causes to create an apparatus of investigation and a set of concepts that produce material explanations, no matter how counterintuitive, no matter how mystifying to the uninitiated. Moreover that materialism is absolute, for we cannot allow a divine foot in the door.

Exploring the force of the closed interpretation, Taylor introduces the idea of a closed world structure (CWS), which is to say “ways of restricting our grasp of things which are not recognized as such.” He stresses that while such closed world structures may be unfounded (they exclude possibilities for which there is no basis for exclusion), this is not the same as saying that the beliefs that people in the grip of closed world structures hold are necessarily wrong: “all CWSs may be illegitimate, and yet there may be nothing beyond the immanent frame. I will not be arguing either for or against an open or closed reading, just trying to dissipate the false aura of the obvious that surrounds one of these.” Of course, there are closed world structures to accompany both the open and the closed “spin” – but these days, most people have little difficulty dismissing the former, while the latter can still seem quite compelling.

We can be held within certain world structures (which are aspects of “the way experience and thought are shaped and cohere”) without awareness of alternatives. A “picture” can “hold us captive”, as Wittgenstein said. Much of the force of the CWS comes from how epistemology is conducted within it – a chain of inferences is constructed that begins with knowledge of the self, before passing on to external reality and other people. This perspective places the transcendent at the end of a chain of inferences, and makes it appear weak because of it.

The philosopher Heidegger presents a rival epistemic position, however, one which upturns the chain of inferences by denying that the most reasonable initial step is to ascertain with confidence our knowledge of the self. Taylor observes in this regard:

The “scandal of philosophy” is not the inability to attain to certainty of the external world, but rather that this should be considered a problem, says Heidegger in Sein und Zeit. We only have knowledge as agents coping with a world, which it makes no sense to doubt, since we are dealing with it. There is no priority of the neutral grasp of things over their value. There is no priority of the individual’s sense of self over the society; our most primordial identity is as a new player being inducted into an old game… The whole sense that [transcendence] comes as a remote and most fragile inference or addition in a long chain is totally undercut by this overturning of epistemology.

Taylor examines other approaches to the closed “spin” on immanence, but the demands of brevity make it difficult to encapsulate these here. However, some of these relate to the position discussed three weeks ago – the epistemic position that sees scientific materialism and related belief systems as a stance of maturity:

This means that this ideal of the courageous acknowledger of unpalatable truths, ready to eschew all easy comfort and consolation, and who by the same token becomes capable of grasping and controlling the world, sits well with us, draws us, that we feel tempted to make it our own. And/or it means that the counter-ideals of belief, devotion, piety, can all-too-easily seem actuated by a still immature desire for consolation, meaning, extra-human sustenance.

This CWS might be even more influential than the chain of inferences discussed above. A related belief is the idea that we have become our own “legislators of meaning”, a view that can exhilarate us or terrify us, depending on how we feel facing an interpretation of the universe as meaningless. But this position too, Taylor notes, is problematic since our ethical concepts are not quite as malleable as they first seem: “…it is clear that, although there are important choices to be made… nevertheless much of what we accept as normative is deeply anchored in our past and identity.” The cultural and historical backdrop of ethics cannot be wholly eliminated.

Something fundamental can seem to be missing when one adopts the closed spin on immanence and discards the open interpretation in its entirety. We are beset by what Taylor terms “malaises of immanence” – a dissatisfaction born of narrowing our world system to exclude any notions of transcendence. There are three such malaises in particular that Taylor identifies: firstly, the sense of the fragility of meaning, and its accompanying search for an over-arching significance; secondly, the flatness felt in the absence of a way to solemnize the crucial moments of passage in our lives; finally, the utter flatness, the emptiness of the ordinary. These are characteristic elements of the modern era, and they each result from believing that the immanent frame is “all that there is”, that there are no sources of transcendence. The yearning for “something more” creates a cultural cross pressure, something we shall explore shortly.

First, however, we must complete our examination of the circumstances surrounding the immanent frame by exploring the narratives that support the closed “spin” on immanence, the subtraction stories that help to make the resulting closed world system seem undeniable, and which Taylor’s account in A Secular Age is intended throughout to challenge.

Next week: Subtraction Stories


Auctions and the Fear of Failure

Last week, the BBC reported on some research about behaviour in auctions that caught my attention. The researchers in question concluded that it was predominantly the fear of losing that drove people to overbid in auctions, and not the joy of winning as had previously been assumed. Here's an extract:

Brain scans of people taking part in an auction showed those "overbidding" had a greater response to losing than to winning, the Science journal reported... A type of scan called functional magnetic resonance imaging (fMRI) showed that in the auction game there was an exaggerated response to loss in the striatum - part of the brain associated with reward - but hardly any response to winning. The greater the tendency to overbid, the stronger the response to loss, suggesting that the prospect of losing the competition caused participants to bid too high, the researchers said.

Now this interests me as a game designer and researcher because the obvious interpretation of the auction game is that the participant wants the emotional reward of fiero (triumph over adversity) from being victorious - something eBay have capitalised upon in their "Shop Victoriously" campaign. But instead, what we see here is that the desire to avoid being beaten by the other players (the other bidders, in auction terminology) is showing up as a prevailing force.
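To make that relationship concrete, here is a minimal toy model (my own illustrative sketch, not the researchers' actual analysis): each simulated bidder picks the bid that maximises their anticipated feeling, weighing the joy of winning against a dread of losing. The function name, the crude win-probability stand-in and the numbers are all assumptions of mine, but the pattern matches the finding - the more heavily a bidder weights the pain of losing, the higher they bid, even past the point of overpaying:

```python
def simulated_bid(value, loss_weight, win_weight=1.0, steps=50):
    """Pick the bid that maximises anticipated feeling for a bidder who values
    winning at win_weight * (value - price paid) and dreads losing at
    loss_weight * value. Entirely a toy model for illustration."""
    best_bid, best_feeling = 0.0, float("-inf")
    for i in range(steps + 1):
        bid = value * 1.5 * (i / steps)   # consider bids from 0 up to 150% of the item's value
        p_win = bid / (value * 1.5)       # crude stand-in: higher bids are more likely to win
        feeling = p_win * win_weight * (value - bid) - (1 - p_win) * loss_weight * value
        if feeling > best_feeling:
            best_bid, best_feeling = bid, feeling
    return best_bid

value = 100.0
for loss_weight in (0.0, 0.5, 1.0, 2.0):
    print(f"fear of losing weighted at {loss_weight:.1f} -> bids about {simulated_bid(value, loss_weight):.0f}")
```

With no fear of losing, the toy bidder stops at around half the item's value; once the dread of losing matches or outweighs the joy of winning, it bids at or above the full value - which is overbidding in exactly the sense the study describes.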

I don't want to jump to conclusions here, because if there's one thing my company's research has shown it's that fiero-seeking isn't a majority pursuit (my best estimate at the moment is that about one fifth of players have this as their primary drive, although this figure is vague at best). This is also something the researchers won't have considered, since the predominant paradigm for this kind of scientific research presumes all participants can be treated as instances of the same archetype (i.e. that all humans are essentially the same) - an assumption which holds far better for some traits (such as enjoying food) than for others (such as enjoying pain).

But it does raise the question: do some or all challenge-oriented players strive to beat the games they are playing because they refuse to admit defeat? Are they seeking fiero, or just striving not to lose? Are they driven by a subconscious fear of failure rather than a desire for the emotional reward of winning? That's not to say that winning isn't fun... just that it might not be the anticipation of winning that drives certain videogame players to push for the win.

The situation could be far more complicated than we previously assumed: we've talked a lot about the carrot (fiero) but not really considered the stick (fear of failure). Alas, finding a way to investigate this further is going to be especially challenging. If anyone has a spare fMRI scanner they want to loan me I'd appreciate it!


GDC Pre-Mortem: This is Your Brain On Games

I have some good news and some bad news. Every year, I submit multiple sessions to GDC, ranging from lectures to round tables to panel discussions. Every year, the lectures are rejected, although usually late enough in the process to get the polite "you almost made it" letter. The bad news is that my lecture for GDC 2009, "This is Your Brain On Games", which I rate as the best submission I've made thus far, was rejected in the initial round of winnowing this year. The good news is that I no longer have to wait until after March to publish this material to my blog - so my players here should enjoy some pretty stellar posts on the neurobiology of play in the weeks to come.

I expect when I post the most tabloid-style post on this subject later this year, it will pull in between 24,000 and 30,000 visitors (based on previous traffic watermarks), so I shouldn't really care about not getting to lecture about it to an audience of a hundred at GDC. But still, I have to wonder: if the material I'm producing is of great interest to the videogames community at large, why isn't it of interest to the GDC organisers? What really confuses me about this is how it was possible to reject this proposal so rapidly. I'm considered to be one of the more interesting lecturers in the videogames space, and universities and conventions love to invite me to talk. GDC, however, has never accepted one of my lectures (although I've been on the GDC faculty many times thanks to my friends in the IGDA and industry inviting me to host or participate in other kinds of session).

So I'm forced to wonder: was this proposal rejected so rapidly because of a flaw with the proposed lecture (perhaps it seemed too complex, or too academic) or because the GDC organisers don't recognise me as a valuable member of the games community? I accept that my research is not as thorough as Nick Yee's, my player models are not as accessible or popular as Nicole Lazzaro's, and my videogames are not as beloved as Shigeru Miyamoto's, but even so, I'm a long way from being a complete unknown. As is so often the case, it's the uncertainty that troubles me: the early rejection form-letter email gives no clue as to why a proposal has been thrown out - so I'm turning to you, my players, to give me some feedback and speculate as to why this might not have seemed as solid a lecture as it appeared to me.


This is Your Brain On Games

Are games addictive? We all know that there is a sense in which videogames are fiendishly addictive, but this does not mean the same as ‘addiction’ in a medical context. This session looks at the biology of play in order to explore the question “are games addictive?” while simultaneously showing the effects videogames have on people’s brains, and how we can leverage these mechanisms to make better games.

A fairer way to describe the addictive properties of games is perhaps ‘habit forming’ – the reason you keep coming back to a game you enjoy is because playing that game forms a habit for you. Things that form habits in this way needn’t be viewed negatively – in fact, good mental health depends upon forming good habits concerning eating and sleeping (for instance). Furthermore, our lives are full of things that are ‘habit forming’ in this weak sense – television shows of all kinds hope to form habits so that they will build an audience, but there is little outcry about this kind of ‘addiction’.

There is a chemical in the brain known as dopamine which lies behind all habit forming behaviours, and examining how this neurotransmitter functions provides a helpful way of understanding what makes games addictive. Other neurotransmitters have other effects on the brain, and understanding how these work (and how they work in concert with dopamine) provides a valuable picture of the biology of play – something that game developers can use to make better games for their audience.

This session provides an easy-to-follow introduction to the biology of play, featuring examples drawn from real gameplay experiences and showing how they relate to the underlying mechanisms in the human brain. Although some technical terms (like dopamine) are used, the language of the presentation has been simplified slightly to make the central ideas accessible to anyone. Packed full of interesting perspectives on play, the talk draws upon the latest neurobiological research to explain why games are addictive.

For everyone interested in how the brain reacts to videogames, this is the presentation for you – this is your brain on games!

Attendee Takeaway

Attendees to this presentation will take away a whole new perspective on how and why people play videogames, how games can be put together to leverage the various mechanisms in the human brain, as well as a crash course in the neurobiology of play. They will learn about how dopamine reinforces behaviours (and causes both habits and addiction), how the fight-or-flight mechanism lies behind the enjoyment of many (but not all) videogames, why curiosity can be as rewarding as victory, and why successful games leverage not just novelty, but also familiarity.
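As a taster of the sort of mechanism the session would have covered, here is a toy sketch in the spirit of the common reward-prediction-error reading of dopamine. To be clear, this is my own simplification for the blog, not a slide from the talk, and the function, the numbers and the "habit strength" variable are all illustrative assumptions:

```python
def play_session(expected_reward, habit_strength, actual_reward, learning_rate=0.1):
    """One play session in a toy habit-forming loop: the prediction error
    (actual minus expected reward) stands in for a phasic dopamine signal;
    pleasant surprises raise future expectations and strengthen the habit."""
    prediction_error = actual_reward - expected_reward           # crude dopamine stand-in
    expected_reward += learning_rate * prediction_error          # expectations drift towards experience
    habit_strength += learning_rate * max(prediction_error, 0)   # habits build on pleasant surprises
    return expected_reward, habit_strength

expected, habit = 0.0, 0.0
for session, reward in enumerate([1.0, 1.0, 1.2, 0.8, 1.5], start=1):
    expected, habit = play_session(expected, habit, reward)
    print(f"session {session}: expectation {expected:.2f}, habit strength {habit:.2f}")
```

The point of the toy is simply that a game which keeps delivering a little more than the player has come to expect keeps generating positive prediction errors, and so keeps reinforcing the habit of returning to it - which is the weak, non-medical sense of 'addictive' described above.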

Please let me know why you think this session was rejected by GDC in the comments. Thanks in advance for your input!


Paper Never Crashes

We are surrounded by electronic solutions to problems, but no matter what technology we develop I doubt we'll ever become a truly paperless society. Frankly, I don't think we would want to be. There is a mystique to the printed word that could be surpassed in convenience by electronic ink, perhaps, but the charm of books remains undiminished more than two thousand years after their invention.

For taking notes, I have rejected electronic devices. No interface is as easy to use as a pen on paper, no file management system is as intuitive as a notebook, and no digital bells and whistles can possibly compensate me for the single most useful aspect of paper: it never crashes. I can carry a sheet of paper in my back pocket through wind, rain and snow, and although it may wear, tear or stain, I can pretty much always read what I have written, and I never have to fear that a fatal software problem will rob me of my efforts, as has distressingly happened all too often with word processors in my life.

We are often so willing to believe in electronic tools as miraculous labour saving advances in technology that perhaps we sometimes forget that there are timeless solutions to everyday problems that surpass their digital equivalents in both their economy and their utility.


A Secular Age (6): Religion Today

[Image: Mikhaïl Epstein]

How do people respond to religion in the modern world? To explore this issue, Taylor draws upon a wealth of studies of religious attitudes in Europe, the United States and former Soviet nations such as Ukraine to construct an impression of the state of religion today. The background to this exploration is that which we saw last week – the nova effect, which has created a near-infinite number of spiritual and religious positions, and the ethic of authenticity, which allows each to establish their own path, with the sole provision that it rings true to the person concerned.

Taylor summarises the general thrust of the moral zeitgeist as follows:

…it is clear that the ideals of fairness, of the mutual respect of each other’s freedom, are as strong among young people today as they ever were. Indeed, precisely the soft relativism that seems to accompany the ethic of authenticity: let each person do their own thing, and we shouldn’t criticise each other’s “values”; this is predicated on a firm ethical base, indeed, demanded by it. One shouldn’t criticise the others’ values, because they have a right to live their own life as you do. The sin which is not tolerated is intolerance. This injunction emerges clearly from the ethic of freedom and mutual benefit, although one might easily cavil at this application of it.

So the background to morality in the modern West has evolved from the process that gave us exclusive humanism (via the modern moral order of mutual benefit), which Taylor also sees as the seed of the nova effect which has multiplied the possible positions to a point beyond measure. He is also noting here that the common appreciation of this falls rather short of the high ideals it has emerged from: there is potential for criticism.

One significant change is that not only has the attitude towards this kind of “soft relativism” shifted dramatically, it is now standing on its own whereas previously it was part of a wider system. Locke, for instance, felt the Law of Nature had to be inculcated in the populace by strong discipline – the goal was individual freedom, but the method was one people today would be unlikely to accept. It was two hundred years before a less rigid formulation was developed: John Stuart Mill’s “harm principle” (‘no-one has a right to interfere with me for my own good, only to prevent harm to others’). This is widely endorsed today – but in Mill’s day it was quite a shocking suggestion, seeming to be “the path to libertinism”.

Today, the harm principle is so prevalent that it serves as a justification to deny the validity of traditional religious authority, and embrace a kind of unlimited pluralism:

For many people today, to set aside their own path in order to conform to some external authority just doesn’t seem comprehensible as a form of spiritual life. The injunction is, in the words of a speaker at a New Age festival: “Only accept what rings true to your own inner Self.” Of course, this understanding of the place and nature of spirituality has pluralism built into it, not just pluralism within a certain doctrinal framework, but unlimited. Or rather, the limits are of another order, they are in a sense political, and flow from the moral order of freedom and mutual benefit. My spiritual path has to respect those of others; it must abide by the harm principle.

Drawing upon sources such as Wade Clark Roof, Paul Heelas, and Linda Woodhead, Taylor paints a picture of today’s spiritual seekers as trying to find “something more”:

…they are seeking a kind of unity and wholeness of the self, a reclaiming of the place of feeling, against the one-sided pre-eminence of reason, and a reclaiming of the body and its pleasures from the inferior and often guilt-ridden place it has been allowed in the disciplined, instrumental identity. The stress is on unity, integrity, holism, individuality; their language often invokes “harmony, balance, flow, integrations, being at one, centred”.

The modern spiritual quest is often contrasted directly with religion (which is generally used as if to mean solely orthodox religion):

This kind of search is often called by its practitioners “spirituality”, and is opposed to “religion”. This contrast reflects the rejection of “institutional religion”, that is, the authority claims made by churches which see it as their mandate to pre-empt the search, or to maintain it within certain definite limits, and above all to dictate a certain code of behaviour.

These features of “spirituality”, its subjectivism, its focus on the self and its wholeness, its emphasis on feeling, has led many to see the new forms of spiritual quest which arise in our society as intrinsically trivial or privatised. I believe that this is part and parcel of [a] common error… the widespread propensity to identify the main phenomena of the Age of Authenticity with their most simple and flattened forms.

Spirituality and religion are thus set up as polar opposites, yet Taylor notes that despite this prior assumption, it is perfectly possible for the spiritual quest to bring someone into a more conventionally religious position:

Again, “finding out about oneself, expressing oneself, discovering one’s own way of becoming all that one can… be” is opposed to “denying or sacrificing oneself for the sake of a super-self order of things, or even… living by reference to such an order.” But this contrast can’t be considered exhaustive. The first term could be seen as a definition of the contemporary ethic of authenticity; the second invokes one view of what is supremely important in life. The question set in the first can initiate a quest, and this can end in the second as an answer. Nothing guarantees this, but nothing ensures its opposite either.

While Taylor is keen to observe that the spiritual quest may end in religion, he is equally keen to stress that pre-empting the spiritual quest (by, for instance, insisting that orthodox religion is the only valid response) is tremendously short sighted:

Some people want, of course, to declare a fundamental opposition between this search for integrity and the transcendent: [Heelas and Woodhead] quote a minister who told his congregation that “wholeness” should matter to them less than “holiness”, but that is what one might expect from a hostile observer for whom religious authority renders this kind of quest useless and dangerous. There is no reason to buy into this kind of myopia.

Another important aspect of the state of religion in the modern West is that there is an increasingly varied set of ways that one can relate to traditional religion. In the context of Christianity (which has dominated the history of the West), Grace Davie speaks of “believing without belonging” – that is, Christians without a church, and those who have faith in God, and even identify with a particular denomination, yet never attend services. Danièle Hervieu-Léger identifies another pattern in Scandinavia, in which people identify with the national church but attend only for the rites of passage (birth, death, marriage) while expressing considerable skepticism concerning that church’s theology.

Mikhaïl Epstein (pictured above) finds even further diversity of Christian beliefs in post-Soviet Russia:

…Epstein introduces the concept of “minimal religion”. He also speaks of an overlapping category, the people who declare themselves “just Christians” in surveys of religious allegiance, as against those who adhere to one or other Christian confession, like Orthodox, or Catholic. This kind of religious position Epstein sees as “post-atheist”; and this in two senses. The people concerned were brought up under a militantly atheist régime, which denied and repressed all religious forms, so that they are equidistant from, and equally ignorant of, all the confessional options. But the position is also post-atheist in the stronger sense that those concerned have reacted against their training: they have acquired in some fashion a sense of God, which however ill-defined places them outside the space of their upbringing.

The situation in the United States in this regard is very different indeed, and Taylor explores the differences from a number of different perspectives. He points to an aspect of this difference from polling data: people in the US tend to exaggerate their religious involvement (they claim to go to church more often than they do) while people in Europe tend to understate it. There seems to be a sense that people have an impression of what is “normal” in their culture, and thus people skew their responses towards their expectations. He wonders if the belief in mainstream secularization theory acts in part as a “self-fulfilling prophecy” in Europe: because the beliefs of intellectual élites can define the “religious imaginary” for the populace at large, the attitude of the academic world in Europe (which is quite hostile towards religion) seems to have created a cultural atmosphere where people are almost embarrassed to admit their connections with religion.

Yet on the other side of the Atlantic, this has not happened. The academic world in the US is as “deeply invested in unbelief as its European counterpart” but here it “seems without effect on large segments of the greater society”. Part of this may lie in the role of religion in the formation of the national identity in the United States:

…the continuing importance of religious identity in national integration keeps a majority of Americans happy in “one nation under God”, even while they are disputing bitterly with others about the supposed entailments of this, in areas like abortion or gay marriage. Lots of voters in “blue” states, who abominate the zealots of the Religious Right, are in their majority members of mainline churches, who will still happily sign on to the hallowed formulae of harmoniously co-existing denominations.

An additional facet Taylor identifies is the tendency for Europeans to feel that churches and religion imply authority and “conformity to society-wide standards, not to speak of hostile divisions between people, and even violence.” This “baggage of submission and conformity” has largely been lost in the United States (despite popular European misconceptions to the contrary), whereas in Europe the echoes of an embarrassingly fractious religious history encourages people to “seek extra-religious forms of meaning.”

A common theme throughout these explorations is the way in which “it is a pluralist world, in which many forms of belief and unbelief jostle, and hence fragilize each other”. Belief is no longer an obvious and unchallengeable position, although there are cases where it may be a “default” solution – but there are also milieux (“including important parts of the academy”) where unbelief is the default solution – and between these conflicting poles, only the most narrow-minded of belief systems (whether founded on belief or unbelief) can resist the increasing currents of fragilization. 

As we saw previously, it is between these polar opposites that the impossibly diverse spiritual landscape of the nova effect lies, but in this space traditional religion suffers from specific disadvantages:

Whatever the level of religious belief and practice, on an uneven but many-sloped playing field, the debate between different forms of belief and unbelief goes on. In this debate, modes of belief are disadvantaged by the memory of their previously dominant forms, which in many ways run athwart the ethos of the times, and which many people are still reacting against. They are even more severely disadvantaged by an unintended by-product of the climate of fragmented search: the fact that the falling off of practice has meant that rising generations have often lost touch with traditional religious languages.

Thus religion today is a complex many-faceted affair, belaboured by the weight of its historical excesses and failures (particularly in Europe), but in the constant process of re-inventing and exploring itself anew. Most believers today are as far from orthodox religious practice as they are from unbelief, and all but the most bellicose bigots accept this vast range of beliefs as a legitimate expression of “the spiritual quest”.

What of unbelief today? Behind all the many different positions in the spectrum of modern unbelief lies a particular idea, a way of looking at our world which we all share (excluding a handful of belief systems near the orthodox religious pole) – although believers and unbelievers interpret this particular concept very differently. To fully understand modern unbelief it is necessary to examine the foundation upon which it is almost universally built.

Next week: The Immanent Frame


On Fanboys

What makes a fanboy tick? I believe fanboy behaviour is driven by cognitive dissonance in a manner strikingly similar to political partisanship, sporting rivalry and the religious cold war between militant atheists and their theistic counterparts.

I’ve written about cognitive dissonance before, and at greater length than I will attempt here, but the potted version for anyone new to the idea is as follows: we all adopt certain beliefs about ourselves and the world around us, and once these beliefs are adopted they dictate, to a fair degree, how we interpret all our experiences. When we come across situations that radically contradict our beliefs, we are filled with an uncomfortable feeling: to lessen this unpleasant experience (which is termed cognitive dissonance) we modify our beliefs in ways that reduce the discomfort.

When cognitive dissonance occurs between groups of people with different beliefs (which is perhaps the most common manner in which this behaviour manifests) there are a handful of common responses: we distance ourselves from people who hold other beliefs by marking them out as different (using pejorative terms such as fanatic, heretic or pseudo-scientist), we apply social pressure to try to make them conform to our beliefs, or – if we are subject to the social pressure ourselves – we may conform to the alternative view (i.e. give in to peer pressure).

Recently, scientists reported brain scans of political partisans which revealed that the parts of the brain activated during the partisan response (for instance, the hostility felt towards an opposing candidate) were the same regions involved in assessing risk and reward in the context of prior experience. According to psychologist Jonas Kaplan of the University of California in Los Angeles, in the political process “people come to decisions early on and then spend the rest of the time making themselves feel good about their decision.”

This connects with the first mechanism for ameliorating cognitive dissonance: demonising people with opposing beliefs. Partisans not only interpret the speech and actions of their candidates more positively, they turn up the negative feelings engendered by the opposing candidates – ensuring a strong antagonism. This is why politics and religion are such explosive topics: having committed to one metaphysical and ethical position (liberal versus conservative, atheist versus theist etc.) partisans are no longer able to see either side of the argument without massive distortion. Independents and agnostics – people with no prior commitment, or a lesser degree of commitment – can generally see flaws and benefits on both sides of the divide.

Now it may seem that committing your loyalty to Sony, Microsoft or Nintendo is a world apart from committing to a political or religious stance – after all, the stakes of politics are the leadership and government of society and the world, and the stakes of metaphysical belief can seem even more serious to both atheists and theists. Why should videogame fanboys be so invested in their loyalty to one platform over another? 

Remember that the parts of the brain activated in partisan response are those involved in assessing risk and reward, and cognitive dissonance is involved in protecting one’s prior decisions against disconfirming evidence. The reward in the context of videogame players is the enjoyment they will earn from playing the games on the various console systems, often in the form of fiero (triumph over adversity) – that hot and addictive emotional reward from overcoming immense challenge – but this is far from the only form of reward to be found in play. The decision each fanboy has made at some point in the past is which console will give them the greatest emotional reward from play – and for loyalists who stick with one console manufacturer from generation to generation, this decision was made a long time ago.

Thus the fanboy experiences cognitive dissonance in the wake of evidence that calls into question whether they made the right choice (i.e. whether they chose the console that would give them the most reward). This most commonly manifests in a partisan conflict between opposing camps – at the moment this is most commonly Microsoft and Sony, as these are the companies fighting hardest over the loyalty of the “hardcore” gamers, but Nintendo fanboys are subject to the same psychological forces and conflicts (often lessened these days, as such people often have to own another console as well, since Nintendo cannot produce their highest quality games with sufficient regularity to keep their fans occupied).

Last week, when I wrote about the battle between Sony and Microsoft for hardcore gamer loyalty, it triggered an avalanche of knee-jerk reactions from Sony fanboys because I assessed that Microsoft has the edge in this struggle at the moment. This was taken as evidence disconfirming that these people had made the right choice – thus triggering cognitive dissonance, and producing all sorts of wild accusations against me, including that I must be a Microsoft fanboy (I guess not a very good one, though, since I don’t actually own an Xbox 360). This in turn triggered the same kind of response from the Microsoft fanboys as they battled with their “enemies” in an effort to justify their prior commitments.

But aside from the fun that can come from friendly rivalry (something we see more commonly between sports fans than between political or religious enemies), this kind of defence of one’s decision to purchase one console or another is rather ridiculous, especially right now when the development climate favours multi-platform releases. Whichever power console you chose to purchase, it is highly likely that it will fulfil your play needs as adequately as its rival. Of course, it doesn’t feel this way to the fanboy, just as the political and religious adversaries cannot accept that there could be merit to aspects of the opposing position. Cognitive dissonance prevents this realisation.

So the fanboys defend their beloved consoles fiercely, because all their prior experience tells them they made the right choice (they have indeed enjoyed their console immensely) and anything that suggests otherwise triggers cognitive dissonance, and thus a need to lessen this uncomfortable feeling – usually by demonising the opposing camp, or indeed anyone not coming from their position. Accepting the idea that people who made the “opposite” decision (who bought a PS3 instead of a 360 or vice versa) also made a good decision is not an allowable response, as it seems to invalidate the choice the fanboy made – we can’t both be right, is the assumption made, but this postulate is quite in error.

We will always have fanboys, and they will always fight vigorously to defend their choice of loyalty. By studying their behaviour we learn something about ourselves, about that unseen aspect of cognitive dissonance that we are all subject to but can rarely catch a glimpse of without exceptional circumstances. Don’t judge the fanboys too harshly, as you and I are all subject to similar forces, in politics, sports, religion, science and other domains. I would much rather see such zeal expressed in the pointless debate over which was the “correct” console to purchase than attempting to disrupt our freedom of belief or the chance of political action on vital issues – if we can’t eliminate cognitive dissonance, perhaps we can at least contain it to the realm of the trivial.