Beyond Space

An open letter to Charles Cameron responding to his blog-letter No Man’s Sky at Zenpundit as part of the Republic of Bloggers. Further replies welcome!

Dear Charles,

The second of my five religions, Zen Buddhism, came about entirely as a consequence of a famous tale you allude to in your wonderful letter. The library at the University of Manchester, where I studied until gaining my Master’s degree, was a marvellous resource for me during my time as a student. Although I do not remember the details, I read something there about the Last Patriarch’s teachings, and it went something like this:

The nun Wu Jincang once asked Huineng to explain passages of the Nirvana Sutra to her. Huineng couldn’t read, and he asked her to read the passages aloud. Astonished that the revered Zen master could neither read nor write, Jincang wondered how Huineng could understand the teachings. Huineng replied: “Words are not truth. Truth is like the moon, and words are like my finger. I can point to the moon with my finger, but my finger is not the moon. Do you need my finger to see the moon?”

I spent a great deal of time that night meditating upon the gloriously full moon, a little about my finger, and a great deal about the space in between. Space. The space between. The space beyond. When I could be any or all of these, I went to bed. I thought to myself: How arbitrary it is that we should see ourselves as the finger, and as not-the-moon, when we might just as well consider ourselves the spaces in between – since without that, we could never be not-anything!

This lunar encounter served me well until, about five years later, I hit a terrifying crisis of identity when I lost faith in any ability to use words to communicate at all. I began to fray at the edges… If everyone’s words were their own symbols, how could we ever manage to communicate? Did we? Or were we just braying at each other at random, each one watching a different play on the stage we had been thrown together upon? I was a practicing Discordian at the time, getting my religious community fix from a cabal of strange and wonderful folks who had come along on the journey into chaos with me. We were all wrapped up in our own strange adventures. That was always the risk of leaving the clearly marked paths behind… of being set adrift, becoming a nomad. And we are all becoming nomads these days.

With a flair for the Biblical inherited from the time when Christianity was my only religion, I spent forty days and forty nights hitchhiking across the country, staying with friends. Upon my return, I left Manchester and moved to London, where I began working as a professional videogame designer. I had the honour of working with Sir Terry Pratchett – although not a knight of the realm in those days! – and indeed spent a launch party sat next to him and his agent, Colin Smythe, having a marvellous chat about writing and publishing. Alas, I was young and cocky, too arrogant to truly appreciate how much that night was to come to mean to me. My first book came into print soon afterwards.

Years later, Wittgenstein helped me make sense of my problem with words. He was long gone, of course, but he left his words behind, which meant I could listen to him even if he could not hear me. He made clear how words could be understood as belonging to the many different games of language: the meaning of a word was its use within the game in which it was deployed. (“I don’t buy that” means one thing in a courtroom; quite another in a shop.) That meant if you wanted to be sure you were using the words correctly, you had to know which game you were playing. That’s precisely the problem with what you call the God NoGod argument: two very different games are being played that just happen to have identical rules. But once you realise that, once you take that idea aboard, you risk being set adrift from living life in one particular way – you risk becoming a nomad.

Peter Lamarque, perhaps Britain’s greatest living aesthetician, awoke in me a whole new way of riding Wittgenstein’s thought when he expressed how beneath it all was the concept of a practice, of lived practices. At which point, Isabelle Stengers’ idea of an ecology of practices, as a manifold of games, or (as I put it in Chaos Ethics) a multiverse, was the only way to understand our mutual predicament. This multiverse, or pluriverse as William James put it, is an idea I develop from Michael Moorcock, who – rather amazingly – is also the origin of the use of ‘multiverse’ to mark the physicist’s imagined plurality of universes, a quaintly nontheological reverie if ever there was one. Yet at least one of my Discordian friends speaks of having personally experienced this physical multiverse… Should I treat him differently from those who speak of God, or the Goddess, or even of the Universe? What does a nomad do when confronted with any singular way of being? What kind of reply is: there are other ways?

Thank you for the letter, and your continued friendship, albeit of the nomadic, disembodied kind where we have never met in the flesh. I place more stock in flesh these days, but then, I also have a great deal of faith in words.

With unlimited love,

Chris.

Any and all replies are welcomed, whether in the comments, or via a blog.


Disbelieving Just One More God

Contains ideas some atheists may find offensive.

A well-known joke states that since every religion involves disbelieving the gods of other religions, atheism merely involves disbelieving just one more god. The profound truth upon which this joke relies is that atheism is necessarily theology, and as such, does not involve an escape from religious practices, but merely their transformation into yet another form.

This is one of the great oddities of the European diaspora: the presumption that the atheist is the person who has successfully freed themselves of religion. For many, this is essentially their definition of the term, and it is thus the source of the valorisation that makes atheism an appealing identity to adopt. Those who count themselves atheists (of the Christian kind, at least) tend to underestimate the extent to which their thought depends upon the thought practices of the very religion they wish most fervently to distance themselves from. This does not make atheism a religion as such; it utterly fails to sustain a community of care, for a start. Rather, it draws a historical connection between individual atheists and the religions they are rejecting – the most common form of atheism today being a rejection of all things Christian, with all other religions taken as mere variations on Christianity.

This odd qualification – the idea of a Christian-flavoured atheist – reads strangely for precisely the reason that the joke works: the concept of religion it relies upon is dependent upon the form of theology that emerges out of European Christianity, and thus from the philosophy of Plato that influenced it. Not coincidentally, this is also the theology that gave rise to the contemporary sciences (and also their valorisation as the nebulous omnicapable ungod Science). The concept in question is the equation of religion with belief, understood as the acceptance of propositions without evidence – and this is a very particular and peculiar kind of theology.

We will find no such propositional theology among Buddhist atheists, such as the Dalai Lama, nor among those of the Hindu traditions, whether or not their path involves bhakti, or devotional worship. And it is not entirely clear what a Shinto practitioner would make of any of this. Similarly, if we look at the traditions we now term religions around the Mediterranean prior to Christianity, we find that the different gods were not competing propositions, but merely a pool of different names for the same entities, and this never quite managed to generate a contradiction until emperors made themselves gods-on-earth and spoiled the game for everyone.

It is the peculiar legacy of the core traditions of the Abrahamic faiths (and I exclude here traditions like Sufi Islam, which straddles the God of Abraham and the Dharmic faiths) to risk founding theological thought upon the assumption that our god is the only real god. The story of Bel and the Dragon in the Judaic sacred texts (apocryphal to Christians) is precisely a forensic investigation as to why Bel (or Baal) is not a god. This scripture is the first detective story, the Sherlock Holmes mythos millennia before its time. Yet to equate this kind of exclusionary theology with all religions is terribly misleading. What’s more, the success of Christianity – or better yet, of what Kierkegaard called Christendom – is precisely an artefact of the sheer power of this theology, and a reminder of precisely why our sciences were able to grow out of it.

The question of what kind of atheist someone might be when they claim this identity is thus far more complex than it first appears, in part because of the sheer historical influence of theology in European culture. For many atheists, the rationality of their atheological position depends upon whether god (and even more so God) is a proposition (equivalently: a hypothesis), and therefore whether that proposition is true – meaning, whether God exists in reality (itself a perverse understanding of theology). In such cases there can be little doubt that the people in question are still practicing a variation on the kind of deistic theology well-known to the men and women of the Enlightenment and the centuries thereafter.

The kind of critique I am advancing here entails an uncovering of the practices of thought entailed in personal identities that thrive on distancing from religion – and this is almost completely obscured by the idea that you must be either a theist, an atheist, or an agnostic. It is in no way natural from within, say, Hindu theology to understand matters this way (nor is it in any way accurate to consider those particular traditions to be polytheistic, i.e. as comprising a set of gods instead of one God). The three-way split is only the false choice between the theist, presupposing a certain theology and thus requiring theodicy (i.e. the problem of how God allows evil); the atheist, presupposing the failure of theodicy and thus requiring atheology; and the agnostic, playing this game without conclusion. The entire framework here is Christian theology.

Thus anyone for whom the joke about disbelieving just one more god works, not just as a joke but as a mission statement, is necessarily engaged in theological practices that are resolutely and inescapably Christian in their origin and nature. Christian-atheist would be a misleading term, but perhaps achristian atheist is not far from the mark. To reject theology entirely requires a very different capacity, and is what I suspect motivates so many deep thinkers today to focus instead upon ontology, which is effectively non-theology. There is no complete rejection to be found here, only various kinds of righteousness to be generated by different kinds of allegiance or conversion, and various forms of non-participation, whether secular or otherwise.

Over a century ago, Nietzsche remarked that “the complete and definitive victory of atheism might free mankind of this whole feeling of guilty indebtedness toward its origin” – and I suppose he was right, but not in the way that he intended. For what has emerged instead, which Nietzsche would have reviled, is a kind of widespread willed ignorance concerning how most atheological thought comes to reach any kind of conclusion about god-concepts. Disbelieving ‘just one more god’ is not rejecting theology: it is just another version of Christendom’s insistence upon a single mandatory theology. Both the religious and the non-religious can do better.


The Scientific Age?

Do we live in a ‘Scientific Age’? What would that phrase mean, and how could we judge – scientifically – if it were true?

I recently read a piece in The Atlantic on free will that disappointed me. I’d already been checking up on the state of the art for this topic (see Is Free Will Too Cheap?), which has become particularly interesting in recent years. But nothing of that could be found in the piece in The Atlantic, which felt suspiciously like a poor excuse for a Sam Harris interview. The article closed by casually declaring that we live in “the scientific age” – and that rather amused and annoyed me. Because if that were a fair characterisation of our time, would it not amount to blaming scientists for our rather dreadful global predicament? My sense, as a scientist by training, is that there is no empirical basis for such an attribution, and that rhetoric (rather than evidence) is what motivates such an assertion.

To properly explore this, we must first ask: what does it mean to characterise an Age? For the most part, the practice of defining Ages has entailed a historical or mythological assessment. Hence, for instance, the attribution of a Golden Age in ancient Greece, a mythic time before humanity messed everything up. The Age of Sail and the Age of Steam were likewise retroactive attributions, albeit in these cases based on historical rather than mythological considerations. It actually makes more sense to make these kinds of assessment after the fact, since only then can the competing factors be weighed carefully against each other – although even then, the choice to assign an ‘Age’ shows a bias in focus at the very least.

It is only with the twentieth century that we see attempts to characterise history in the present tense – and even these seem relatively dubious upon later reflection. The Atomic Age built upon fantasies about the future born of the New York World’s Fair of 1939, but as it happened nuclear energy did not characterise much of an Age, since it gave way within decades to the Space Age, which was equally short-lived. The frequent use of ‘space-age’ as a marketing adjective links both these science fiction tales to the flourishing capitalist production line – and indeed to their rhetorical deployment against the Soviet production line that was almost indistinguishable apart from its overriding mythology. Industry, either side of the Iron Curtain, was much more important than science, which was (and is) industry’s bitch.

This analysis is not scientific, of course, but rather historical and political. Our second line of enquiry must then ask: what do we mean by ‘scientific’? The usual invocation here is ‘the scientific method’, the cycle of observation, hypothesis, prediction, testing, and eventual theory. However, empirical observation of scientists at work has not validated this as a general method applied by researchers, and it appears to operate more as a catechism than as a practical methodology. (You would not get very far using solely this method as stated for a research project!) A key problem is that observations are themselves theory-laden and, as the historian Thomas Kuhn observed, it is never the case that observations alone determine how one theory replaces another. The wider philosophical questions here are not vital to the current discussion, however; what matters is that ‘the scientific method’ is not a means of distinguishing what is scientific from its alternatives, regardless of its uses as an educational dogma.

Central to what is deemed ‘scientific’ is evidential reasoning, the process of taking evidence (observations, measurements) and then drawing conclusions from it. Theories form an indispensable element of such reasoning: the theoretical apparatus provided by the periodic table guides evidential reasoning in chemistry, for instance. But by itself, evidential reasoning can only exclude things that are clearly not scientific (such as divine revelation, or faith in free markets), it cannot positively identify a science. It’s notable, for instance, that evidential reasoning is core to the skills of historians, who are not often called scientists, and every branch of the humanities uses evidential reasoning in some role.

What distinguishes most things that are called ‘scientific’ from other disciplines that deploy evidential reasoning is the possibility of verifying judgements, a point discussed at length by Karl Popper. Evidential reasoning in the humanities invites a relationship between propositions and conclusions, yet the propositions themselves entail an element of judgement but not of measurement. Conversely, ethology (the study of animal behaviour) entails judgements that are open to verification by further observation. This field, which does not resemble the archetypal ‘scientific method’ at all, nonetheless entails a substantial element of verifiable judgement.

Yet grey areas remain. Some physicists insist, for instance, upon a quantum multiverse – the existence of which is essentially impossible to verify. (Indeed, the word ‘existence’ has a questionable meaning in contexts of this kind.) We then might be tempted to extend ‘scientific’ to mean ‘asserted by scientists’, at which point the phrase will cease to distinguish anything useful. Many scientists will assert that George W. Bush was a fool, but that should not be mistaken for a scientific claim: that would require some means of verifying the judgement that was not merely anecdotal. We ought to be careful about this distinction if we value the work of scientists, since the credibility of the term ‘scientific’ is all too easily strained when we start deploying ‘Science says...’ as a form of prophetic persuasion.

Suppose we accept my provisional criteria for determining something as ‘scientific’. We can then ask: what would be required to scientifically judge our time as a ‘Scientific Age’? Immediately it should be clear that our era must fail to qualify for this accolade; firstly because ‘scientific’ is not a criterion that could be applied on a scale beyond specific observations, methods, or practices, and secondly because the characterisation of an ‘Age’ is necessarily a historical judgement, and not one open to verification in the required sense. Of course, this doesn’t rule out the historical judgement in question – but it cannot be a scientific claim in any conventional sense of the term.

So what about the historical judgement? Here, we still have to meet the requirements of evidential reasoning, and the evidence is not very convincing. We would presumably expect to see evidence of widespread evidential reasoning in culture at large – something that would be very difficult to produce. Where we do find it – in law, for instance – the trend goes back to before the aforementioned Age of Steam, indeed before the Age of Enlightenment, so using this to characterise our time seems extremely misleading.

If it is neither a scientific judgement nor a credible historical judgement, what is the basis for claiming we live in “the scientific age”? Like the Atomic Age and the Space Age, this appears to be a purely rhetorical move, presumably one intended to contrast our time with an ‘Age of Faith’. But characterising even the Middle Ages as an ‘Age of Faith’ would be a struggle for any honest historian, and until the late nineteenth century the development of the sciences was a quintessentially Christian endeavour (although it was also underwritten by earlier Islamic scholarship, which in turn carried on the work of the ancient Greeks).

The point of claiming that we live in a ‘Scientific Age’ appears to be to continue asserting the alleged war between ‘Science’ and ‘Religion’, and to further imply that ‘Science has won’. But this is simply bad evidential reasoning. As I explore in The Mythology of Evolution, the cultural conflicts that are being spun within this rhetoric occur both within the sciences (e.g. over different evolutionary theories) and between religion and non-religion (e.g. over the theological and atheological implications of said theories). Frankly, it is a hopeless task to treat the terms ‘Science’ or ‘Religion’ as unifying in anything beyond the sketchiest of senses, and even if these generalisations are accepted we ought to take note of Stephen Jay Gould’s objection that there cannot be a conflict between two almost entirely disjunct concepts.

I can find no evidence that positivists, those whose non-religious faith is invested in the sciences, are better or worse people than religious folks. But I can provide evidence that they are alike in many ways, including the example that I have discussed here. Rhetorical tactics such as asserting that we live in “the scientific age” are essentially self-betraying; they do not uphold the evidential values that positivists justifiably venerate. We can gainfully compare this to the reprehensible tendency of some Christians to endorse torture and war against Muslims, thus betraying the moral values of Jesus’ teachings, which they are supposed to venerate, or for a small minority of Muslims to betray Mohammad’s teachings by murdering innocents. If the latter cases are notably more extreme, it’s worth remembering that some positivists have also supported this kind of horrific brutality, it’s just that they are not being overtly hypocritical in doing so, ‘just’ morally repugnant. Every tradition, alas, has its darker side.

What positivists, Christians, and Muslims all have in common is that they are all human. As Charles Taylor argues in his epic tome A Secular Age, one of the most distinctive characteristics of our time is the sheer range of beliefs and practices on offer, having fractured and diversified in the wake of what he calls ‘the Nova effect’, forming an (all-too-real) phenomenal multiverse. However, as the examples I have given above demonstrate, we could rhetorically dub our time an ‘Age of Confusion’, an era when faithful adherence to the values of any tradition has become increasingly hard to find, while our critical faculties are frequently numbed by the easy appeal of emotive rhetoric – especially when we get to valorise ourselves while denigrating others. If, like me, you think the practices of the sciences deserve our respect, you owe it to yourself to uphold their core values concerning evidential reasoning and not slip into the cognitive biases that flourish as much today as in any other era of human history.

The opening image is Density of States by Dr Regina Valluzzi AKA ‘the Nerdly Painter’, which I found here on her Wordpress site, Nerdly Painter (used here with permission).


Ontology as Non-Theology

To speak of ontology is to speak of being, to say what exists, or how it exists, or how the things that exist are related, while to speak of gods or God is what is called theology. Every theology is necessarily a form of ontology – it takes a specific position on what exists – but not every ontology is a form of theology. However, every ontology is and must be, at the very least, a non-theology. Which is to say, you can’t talk about being or existence without at some point crossing into religious territory, however tangentially.

Let’s get some helpful maps for entering this rather contentious territory.

Either you have an image of God, or of gods, in your mind, and it matters to you; or you have an image of the impossibility of deities, and that matters to you; or you do not find that images of divine entities have any particular relevance for how you think about existence. These three stances are the religious and non-religious positions of theism (e.g. Christianity, Islam), atheism (e.g. secular humanism, Marxism), and non-theism (e.g. Theravada Buddhism, Jainism). That makes it sound as if you cannot get by without positioning your view of existence (your ontology) in respect to images of God or gods, even though you quite obviously can – it is just that when you are confronted about theology, whatever stand you take must be positioned somewhere within the space of this particular game.

But it is not just theology that everyone is forced to take a position upon; ontology is equally inescapable: everybody who speaks has an ontology – even if it is just all the names of all the things that they know about. Your ontology is the set of things you can say exist, and this capacity is well established in us long before language gives us names for those things. Some philosophers develop quite intricate systems for describing how things are, or for setting the limits of what can be known about how things are, in what could be called technical ontologies. However, despite the care with which it is practiced, ontology is not a subject prone to widespread agreement: the number and kinds of ontology are limitless, and all of the more sophisticated ontologies come with a recognition of the limitations of this kind of thinking.

Kant’s Critique of Pure Reason (1781) sets the pattern here. While ontology in one form or another goes back at least three millennia to the Sanskrit scriptures known as the Vedas and to the ancient Greek philosophers, the way we think about technical ontologies remains coloured by the work of Kant during the Enlightenment. As Theodor Adorno has commented, Kant recognised that an ontology “exists only for others” and thus has no meaning outside of lived experience, while he simultaneously tried to conduct a “salvaging of ontology” as something beyond experience. The tension between these two elements of Kant’s ontological work has never gone away.

Today, philosophers can be broadly divided into two camps. Firstly, there are those who have continued to pursue Kant’s project of ontological rescue, and who are engaged in trying to construct ontologies that can be claimed to go beyond experience. My personal favourite of these is Alain Badiou, who identifies ontology with mathematics (set theory in particular), and then reasons about ontology by using maths as his foundation. Secondly, there are those engaged in Kant’s project of ontological critique, who are primarily occupied with situating ontologies (including theologies) as elements of a plurality. Here I have a fondness for Paul Feyerabend, who found technical ontology less than useful, and was dismissive of what Terence Blake calls “the detour through ontology”.

What ontology and theology have in common, what binds them together as conceptual sisters, is that both are about how we imagine existence. If we did not live in a world so heavily conditioned by theistic traditions, it might not even be necessary to distinguish between these two practices – but our intellectual inheritance is inescapably coloured by the Judaic concept of history, the Islamic reinterpretation of Greek philosophy and mathematics, and the Christian faith in truth, which descends from the earlier monotheistic practices and has given rise to the tradition of scientific investigation. The core danger of trying to paint our time as one where religion and science ‘fight’ is that the key battlegrounds are within the sciences and within religious (and non-religious) thought, as I drew out of the discussion within The Mythology of Evolution. Despite the ‘official story’, the majority of those who believe that the sciences uncover the truth about the world are Christian, and the most vociferously asserted theologies in the public sphere are atheologies that insist upon crossing out all gods. 

Theology, including atheology, always possesses a moral element (or an aesthetic element – the distinction is not always important). Take any of the atheologies being deployed today and you will always find behind it a (moral) commitment to truth. Since gods clearly do not exist (the logic goes), we must commit ourselves to an atheology where gods are not an acceptable part of our thought. This position is undergirded by a prior commitment to the importance of truth. It is because gods are not true that we must reject them. The theological positions are generally more varied, and include those that are a direct inversion of the standard atheology (starting once more with the commitment to truth), as well as others in which God serves as a distant moral guarantor (which was broadly Kant’s position), or in which a moral order is otherwise given a divine foundation.

Now in the case of contemporary technical ontologies, the moral element may appear to be absent, and this could be taken as a justification for not linking these systems of thought with theology at all. However, this is not as straightforward as it might appear. Many non-theological ontologies begin with the same (moral) commitment to truth as other theologies/atheologies, even if that prior moral claim is sometimes obscured by a claim to nihilism, usually developed with (or perhaps against) Nietzsche. But nihilism is essentially a self-negating position for philosophers: if it were plausible to void all truth and meaning, there would be no viable topics for any philosophy to address. Only the rather limited claim that ‘the universe in itself is devoid of value or meaning’ is available, and this is a terribly uninteresting observation until it enters theology, where it becomes a rather straightforward atheological claim.

Even those technical ontologies that do not begin with the moral commitment to truth cannot avoid entering into moral territory later. Once you make a claim for how existence is organised or can be understood, it is hard to avoid this becoming a demand to understand it in this way (or something like it) on pain of error. If the ontologist did not have this kind of commitment to truth before theorising, and managed to avoid acquiring it afterwards, then what motive would they have for sharing their ontology? There is always a moral value here, even if it is concealed behind epistemic justifications. We should expect this: no-one is going to pursue ontology or theology without a motive, and that motive will always contain a moral (or aesthetic) element.

Tolstoy claimed that it was impossible for a person to have no religion, in the sense that this word means a relationship to the universe. This statement no longer seems as self-evident as it did a century and a half ago because the meaning of ‘religion’ has become mangled through its deployment as a caricatured ‘enemy’ to be fought… those whose self-image is founded upon ‘fighting religion’ are effectively barred from considering how this practice might also seem like a religion when viewed from the outside. It was for this reason that I began to talk of non-religions, and for equivalent but opposite reasons that others talk of ‘worldviews’. Technical ontologies scrupulously avoid overt religious elements, but they cannot entirely avoid operating as non-religions, because you simply cannot talk about existence without taking some kind of moral (or aesthetic) stand upon it.

Thus ontology can be understood as non-theology, as a means of conducting the same kind of how-and-why-things-are-this-way discussions that occur within theology – the ‘Queen of the sciences’, as it was once known – without having to take any particular positive or negative view on the existence or otherwise of divine forces. Except, of course, they always do. How can they not! You can’t have a system for summing up existence and yet never be required to take a theological stand when the vast majority of the planet confines its ontological concerns to those of theology. These two practices are twinned; they are distinct, but they can never be separated while theology is still being practiced. Accepting this proposition doesn’t mean that everyone has to be a theologian – but it does mean that you can’t practice ontology without at least brushing up against theology. And good fences, as they say, make good neighbours.

The opening image is David Chidgey’s Music of the Spheres, which I found here on his website Art Glass Mosaics. As ever, no copyright infringement is implied and I will take the image down if asked.


Think for Yourself?

An extremely common demand made by non-religious folks is that you ought to ‘think for yourself’. On the surface, this seems like a reasonable request – certainly, the people who make this claim believe it is morally exemplary to do so! But what does it mean to ‘think for yourself’, and what moral weight can this directive bear?

It is worth observing that the demand to ‘think for yourself’ is often made against the background assumption that if you are part of a religious tradition you do not think for yourself. This appears to be based on two separate but related assumptions: that religious folks do not think for themselves because a centralised autocratic institution dictates norms of behaviour, and that ‘thinking for yourself’ is necessarily a mode of freedom. The latter claim is largely the converse of the first, based on the logical connection that says ‘either you think for yourself, or an institution thinks for you’ – and the problem with this is that it is solely by drawing against traditions and institutions that any kind of thinking or language is possible. The trouble with the former assumption is that, outside of purely fictional narratives, it doesn’t describe contemporary religious institutions very accurately.

Of all the world religions, only Catholicism has a central bureaucracy and a single leader, and yet Catholics (if you actually talk to a reasonable number of such people) are generally far more independently minded about their theology and religious practice than Protestant Christians who – despite their branch of that religious tradition having expressly broken away for the purpose of ‘thinking for themselves’ – all too often align around congealed interpretations of their scriptures. (I personally find it fascinating that an education system that purportedly discourages independent thought creates so many independent thinkers, and suspect this says more about schools than religions.) If we move away from Christianity, the situation is even less one of outsourced thinking: the norm for global religion is distributed religious practice, with no centralised elements whatsoever.

There is, however, at least one way that religious folks can be said to not ‘think for themselves’, which is that when they face moral crises they will turn to their co-religionists or (local) community leaders for advice. This is, however, exactly what is required in such a situation, since the best philosophical and scientific evidence suggests that you can only operate in a moral context when you are embedded in a common moral community and can engage with others in what in Chaos Ethics I have termed moral representation. A person who solely ‘thinks for themselves’, i.e. who never checks their reasoning, ethics, or assumptions against another person, cannot be relied upon to think reasonably, or to reason morally, because humans naturally skew their reasoning towards their own benefit.

This question of ‘thinking for yourself’ also takes a strange turn when we bring in psychologists and psychiatrists who provide life advice and moral representation to their patients. This relationship is parallel to that between a religious community leader and their congregation; only the framework of reason and morality is distinct, as it is when we move between religious traditions. Is a person who is exhorted to ‘think for themselves’ prohibited from seeking psychological assistance? I doubt this is the intent of the phrase. Indeed, I suspect that this kind of secular (and allegedly scientifically grounded) advice-seeking is something that would normally be encouraged. Similarly, the advice to ‘think for yourself’ is turned on its head in the context of scientific consensus: advocates of this phrase typically align with the consensus while opposing those (such as people who dispute the reports of climate scientists) who ‘think for themselves’ on empirical issues – a context in which independent thinking is mocked and disdained. In this regard, ‘think and verify’ might be a better phrase to bandy about.

So the demand to ‘think for yourself’ transpires not to be advice to think independently – which would anyway be both impossible (since our linguistic concepts are maintained collectively) and undesirable (since thought without cross-checks is self-serving and apt to mislead). Rather, it risks becoming a demand that you think within the same framework of reason and ethics that the person making the assertion holds. Which at this point means that it has become simply a non-religious version of the very complaint being levelled against the religious alternatives, i.e. an insistence on adopting specific norms of reason and ethics, namely the secular descendant of the Enlightenment tradition. This is a great tradition – one that both religious and non-religious people are participating in – but it cannot be elevated to the sole source of norms without transgressing its own values of freedom and autonomy.

The one good thing I can say about a demand to ‘think for yourself’ is that at least it is a positive claim. I would rather hear this than incoherent fantasies like ‘the world would be a better place without religion’, which is only a secular version of an all-too-familiar religious bigotry that insists everyone who isn’t like me is necessarily inferior. There are authentic moral values being espoused in the claim to ‘think for yourself’ – it is just that in its most basic form, ‘thinking for yourself’ is also likely to lead to terrible vices. In this, as in all ethical affairs, we need to be part of a community if we hope to live up to our chosen moral standards. Besides, we now have plenty of people who ‘think for themselves’ and it isn’t helping any more: what we need is not a greater supply of autonomous thinkers, but better forms of collective reasoning. And this requires co-operation between everyone, whether they ‘think for themselves’ or not.


Galileo the Hero and Other Mythos Histories

Suggest Jesus was just another human and you horrify orthodox Christians – suggest Galileo wasn’t heroic, and you horrify orthodox Positivists. How do disputes over historical facts possess this power to induce horror?

The inability to bear contradictory conceptions is called cognitive dissonance by psychologists. Recently, we have made watching other people endure dissonance into entertainment – amateur ‘singers’ who become enraged when their lack of talent is exposed, or low-income lovers reacting violently when a lying partner is revealed. The experience is disorientating, and can provoke rage in certain cases, yet we all experience minor dissonance on a daily basis in pursuit of a consistent sense of self: the story we tell about ourselves has to be maintained against the ambiguities of life. We expect to encounter a single consistent story about the world – history – and when this is threatened by rival accounts, dissonance occurs.

In Chaos Ethics, I use the term moral horror to describe cognitive dissonance in the context of ethics – the unsettling or fury-inducing response to incompatible ethical conceptions. Moral horror can be seen in the context of abortion, gay marriage, and many more cases of contemporary political disagreement. My additional claim in this piece is that because we possess moral values concerning truth, clashes over historical questions also evoke moral horror, and this is the reason that contrary historical claims can bring about dissonance. When positivists express outrage at the idea of creationism, for instance, it is because this suggestion transgresses their deeply held moral values concerning truth (see The Mythology of Evolution for this discussion). What on the face of it seems to be a factual dispute becomes a moral conflict: ‘you should not believe this (because it is not true)’.

We need moral horror – it is not something we should wish to eliminate. It is one of the few things that will motivate us to take action against that which we judge as morally wrong. But there is also severe danger any time cognitive dissonance is involved, because we are at the greatest risk of acting unreasonably whenever it affects us (just recall the poor victims of those ‘shocking’ daytime talk shows). In the grip of moral horror, we are certain we are right, and cannot – quite literally – imagine how the other view of the world that horrifies us could be in any way reasonable. At most, we can tolerate the other perspective, which is a polite way of saying that we look down on these foolish others and patiently endure their being so obviously wrong. A key part of my purpose in exploring moral horror in Chaos Ethics is precisely to move past this intolerant tolerance, and achieving this requires a deeper understanding of the role of imagination in morality – and history.

To unravel the moral horror of clashing histories we need to appreciate that our access to the world is mediated by certain imaginative patterns. Joseph Campbell referred to the mythic systems that are tied to lived practices as ‘living mythologies’, and it is the nature of such things that they are indeed lived. Often, this entails a relationship between the practitioners’ ethics and the stories of their mythos (i.e. a specific cultural vantage point; see the chapter on ethics in Imaginary Games for more on this). We cannot, as Jean-François Lyotard and others have suggested, break out of seeing the world through these ‘grand narratives’ – judging them as if we could get completely outside them is simply, as Charles Taylor observed, yet another mythic point of view (what might be called the postmodern or relativist mythology). No, I’m afraid we all must imagine in specific ways if we are to imagine anything at all (whether fact or fiction), but as both Campbell and Raimon Panikkar drew attention to, we all have great difficulty in understanding our own mythologies as anything other than truth – and this is the root of the problem to be explored here, because it is this that sets up inevitable cognitive dissonance.

For the purpose of explaining the phenomena under consideration, let us treat any mythos as composed of two elements – mythos stories that are recognised as stories by those who share them, and mythos histories that are taken as factual. Mythos stories have their moral content as their focus – Jesus’ parables are a great example, or Homer’s Odyssey as a guide to how a Greek warrior must be tempered before he can become a good husband. Conversely, mythos histories are read as informing chronology rather than morality, a key archetype being the Jewish genealogies in the Torah (“Abraham begat Isaac” and so forth) that organise the passage of time. Indeed, the Abrahamic traditions are sometimes taken as having ‘invented history’ in the way it is often understood – perceiving time as both passing and consecutive, and also as heading somewhere (see, for instance, Jacob Neusner’s The Christian and Judaic Invention of History). Homer and (later) Herodotus developed a concept of recording the past narratively, but it is only after Christianity brought Jewish practices to Rome that the mythic dimensions of histories became fully fledged.

Now the problematic part of mythos histories is that the transition from ‘story’ to ‘history’ implies a move from an infinite space of possibility to a finite space of definite facts. There can be (it is assumed) only one history, or rather there can be only one true history. In those traditions partly descended from Plato’s Greek philosophy (especially Christianity and its offshoot atheisms) this is an especially likely habit, but via the sciences (which grow out of Christianity and Islam, and hence Platonic thought) the trend is now everywhere. What is more, wherever the prevailing assumption is to demand a single true history, there is a temptation for people to treat mythos stories as mythos histories. For example, orthodox Christian sects may recognise parables as ‘just stories’ but take the Garden of Eden as history. This is by no means a given, of course – the majority of Christian groups draw their lines here very differently – but the point remains that the presumed line between fact and fiction becomes blurred within an individual’s mythos.

This phenomenon is not confined to religious traditions, as is usually assumed, but happens just as readily within non-religious contexts. For example, for many positivists Galileo is presented as having defended a suppressed yet true description of the arrangement of the planets (heliocentrism) against the erroneous dogma of the church. However, the records of the same event offer multiple alternative accounts – including that the clergy at the time were the sober scientists in this affair, and that Galileo’s techniques were not sufficient to prove what he had nonetheless correctly intuited. Similarly, the usual positivistic mythos history requires Galileo as a valiant hero maintaining the truth against the errors of the church – but this account is somewhat undermined by the cynic’s observation that Galileo’s offending manuscript brought trouble for him primarily by portraying his then-ally, Pope Urban VIII, as a simpleton. Woe betide anyone who suggests to an orthodox positivist that Galileo’s downfall was his own arrogance!

We can see in this example why a mythos history is more than just a neutral chronicle of events, and why it is sometimes difficult to separate ‘story’ from ‘history’. To hail Galileo as a scientific ‘martyr’ requires a mythos history that presents him as heroically resisting religious oppression, and bringing forth the world-changing power of empirical observation that is the ‘sacred’ value of positivistic non-religion. This particular episode comes across radically differently from the viewpoint of (for instance) the Chinese, who were never so invested in any specific cosmological arrangement, and who readily adopted Copernicus’ heliocentric cosmos when exposed to it by the Jesuits – and without the seismic upheavals attributed to Galileo’s ‘heroism’. This is directly contrary to what is claimed by, say, Luciano Floridi, whose mythos history (presented at a TEDx talk in Oxford) essentially requires Galileo, along with Darwin and Freud, to acquire the grand status of epoch-making scientific iconoclasts fighting religion (a mythos Bertrand Russell helped lay the groundwork for). It is not that this role does not match ‘the facts’ in each case, but rather that the account of these individuals purely as revolutionaries is radically incomplete – as would be the case for any history presented solely from a single point of view. As the philosophers of the twentieth century never tired of emphasising, all history is mythic history.

Rather than taking this situation to mark the ‘end of history’, I want to offer a slightly different approach. In Chaos Ethics, I draw against William James’ and (later, and independently) Michael Moorcock’s image of a multiverse, rather than a universe. This is not the multiverse of quantum physics, however (although Moorcock also helped inspire that), but rather the idea that beings and things experience their own separate worlds, and that none of these worlds can claim to be ‘the true world’ when taken alone. Thus while in an (imagined) universe there is only one true version of events, in an (also imagined) multiverse the facts depend upon the world you are in: it is false in most Christian worlds that ‘Jesus was an ordinary human’, but true in any positivist world. Crucially, no world-independent account is available in a multiverse, even though there is substantial agreement (at least between humans) about all manner of things. All facts always depend upon the world they are perceived from, but these diverse worlds are congruent in the majority of cases for any given species or entity, provided the necessary translations can be performed accurately. Where they diverge, however, is precisely at the fault lines between contrary mythos histories – and these thus become a locus for unresolved cognitive dissonance.

This multiversal perspective is not something that can be expected to attain widespread acceptance, since it requires a strong imagination to envisage. But it may only take a sufficient number of intellectuals adopting it (or something like it) to radically enhance our diplomatic power, and thus our capacity for effective, peaceful action on a whole host of pressing issues. Orthodox theists and positivists are unlikely to be able to talk to each other effectively – but their moderate colleagues could cross this bridge, and securing that dialogue would go a long way towards motivating substantial moral action in the developed world. By substituting a mythos superset for a singular and exclusive mythos history, the possibility of harnessing moral horror as a transformative influence can begin to emerge in earnest. This is a powerful option since, as mentioned previously, it is moral horror that helps motivate reform on ethical matters – but only when it is properly aligned. Up until now, the potentialities of the multiverse have been used mostly for ‘spin’ – to obfuscate and deceive by using the gap between events and the mythic histories that record them solely for partisan gain. We can only speculate as to what might be achieved if we began to use it instead as a tool for peace.

For more on moral horror, intolerant tolerance, and how to be a traveller in an ethical multiverse, check out my latest book Chaos Ethics.


Haught on Theology

In August 2011, I ran a two-part interview with Catholic theologian John F. Haught. An active voice in attempting to reconcile theology and evolutionary theory, Haught has also worked to reform Christian attitudes towards ecology and the environment.

The two parts are as follows:

  1. Evolution vs. Religion
  2. Science, Values and Ecology

If you enjoyed this interview, please leave a comment. Thank you!


Are Atheists Moral?

One of the recurrent themes I encounter while discussing freedom of belief is the distress or anger that many non-religious people feel when Christians (principally in the US) suggest that “atheists can’t be moral”. This is at heart a very strange claim, although the confusion stems in part from what it means to root part of one’s identity in a negative claim, as atheists by definition do.

Any Christian who believes that atheists are inherently immoral is on very unstable ground. If, as Christians believe, humanity was made in God’s image, and God is a moral being (indeed, the supreme moral being on Christian metaphysics), then the moral nature of humanity must be inherent to humanity, not to Christianity. This is not to ignore the implications of the Christian concept of humanity as “fallen”, which concerns our weakness to temptation, not our capacity for compassion or morality. Everyone, according to Christian belief, is vulnerable to sin – even Christians! – and everyone is equally capable of moral behaviour. Accepting Jesus, after all, is to be forgiven for sinfulness, not to be rendered immune to it.

It is helpful to recognise that the Christian faith in Christ as a saviour never entailed the belief that only the “saved” could be moral, although it may entail the belief that one can be a better (i.e. more moral) person by following the teachings of Jesus – that is to suggest that Christian ethics are moral, but it does not preclude virtues of other kinds. Expecting only Christians to be moral is the kind of tribal elitism Jesus argued against in the parable of the Good Samaritan, so to dismiss atheists as inherently immoral is profoundly un-Christian. The sort of lazy partisan cheerleading of ‘Christians’ found in parts of the United States and elsewhere is what Kierkegaard so gainfully critiqued more than a century ago – what he disparagingly called ‘Christendom’. Christians have to try much harder if they want to represent the best their religion has to offer.

The belief that morality can only be secured in God (as a supreme moral being) is, oddly, a view shared between certain Christians and certain Nihilists – since the latter perspective, following the kind of reasoning Jean-Paul Sartre explored, claims that if there is no supreme moral being there is no secure morality. These are both Law ethics positions. We don’t usually think of Nihilism as espousing a moral law, but logically this is what follows from this position – it just happens to be an empty moral law. The fact that some who identify as atheists espouse a form of nihilism only furthers the confusion about this issue, though, since the majority of non-believers are not nihilists and indeed there seem to be more Humanists than Nihilists among the atheists of the world.

I do frequently encounter odd claims about the connection between ethics and atheism, though, such as the idea that “being an atheist has made me more moral”. I’m at a loss to understand how not holding particular metaphysical beliefs leads to improved moral beliefs, to be honest, and suspect that claims like these are actually prejudice in disguise. What is perhaps being felt is that belief in God leads to worse moral beliefs, therefore it is more moral to reject God. Although it may be an innocent form of bigotry, this is nonetheless a form of racism – as indeed is Christian condemnation of “godless atheists”. However, we are all prejudiced in one way or another, and these trivial discriminations are perhaps best ignored.

The Canadian philosopher Charles Taylor, a Catholic by faith, noted the following about our “secular age”:

I may find it inconceivable that I would abandon my faith, but there are others, including possibly some very close to me, whose way of living I cannot in all honesty just dismiss as depraved, or blind, or unworthy, who have no faith (at least not in God, or the transcendent). Belief in God is no longer axiomatic. There are alternatives.

Christians today need to take this argument seriously – and also recognise that if atheists were inherently immoral this would be an indictment of God, in whose image (Christians believe) humanity is wrought. Conversely, atheists might want to ponder whether identifying their beliefs by the negative element of not having a workable God concept is productive: you do have faith in something, be it science, humanity, or the sense of personal identity inherent in the word “I”. No-one gets through life without faith in something, however slender. Recognising this might be a positive step forward for the public face of non-belief.


Jesus as an Agent of Chaos

Following the perspective of Søren Kierkegaard, should we understand Christianity not as a force of law, but as a source of moral teachings that are essentially anarchistic?

Charles Taylor remarked in passing that: “It is not only Machiavelli who has thought that believing Christians make bad citizens.” This remark struck a chord with me, since when one takes solely Jesus’ teachings as a point of reference, the prevailing thrust of his message appears to be anti-doctrinal – particularly his collapsing of all Jewish law into two key tenets, love of God and love of humanity. Although as a Rabbi he was in support of the traditions of Judaism, Jesus was in direct conflict with the understanding of that path as being shackled by a dogmatic legalism, as his subversive overturning of the tables of the money changers in the temple at Jerusalem demonstrates. This bold act of protest led directly to his execution.

The Danish philosopher Søren Kierkegaard, writing in the nineteenth century, saw clearly the inimical relationship between the core tenets of Jesus’ ministry and the institutionalised poisoning of that message when it becomes shackled to national politics. Kierkegaard used the term ‘Christendom’ to derisively refer to the social and political entity descended from early Christianity. It was his view that although many European citizens were officially “Christians”, they had absolutely no understanding of their religion, and were essentially lazily following doctrines which were not faithful to Jesus’ message. (This critique could equally be applied to many ‘Christians’ today). Kierkegaard similarly baulked at the presumption that morality had to be understood as universal (an idea that had flourished in part thanks to Kant), and suggested that the individual sometimes faced situations that required them to deny the norms of morality in order to do what only they themselves could determine would be right.

Whereas Nietzsche (writing in the same century) had also savaged the monstrous perversion of Jesus’ teaching into the Church of that era, his goal had been a permanent end to Christianity (something he naively believed he would see in his lifetime). Conversely, Kierkegaard sought to restore Christianity to something closer to its spiritual roots through a process of insightful re-examination of Biblical writings. His vocal opposition to ‘Christendom’ – a kind of political revisiting of the overturning of the money changers – placed him in conflict with the Danish establishment of his time, and ultimately caused him to become a pariah in his native Copenhagen. Ironically, Kierkegaard’s influence as a philosopher was to be felt more strongly outside of the Christian tradition, in particular because the French writers Jean-Paul Sartre and Albert Camus were significantly influenced by his work.

Kierkegaard’s understanding of Christianity is, I’m claiming, either close to Jesus’ original teachings or at the very least a recapturing of the spirit of the early Christian church – and it is focussed on the role of the individual against the conformity of the masses. This passage typifies Kierkegaard’s attitude in this regard:

There is a view of life which holds that where the crowd is, the truth is also, that it is a need in truth itself, that it must have the crowd on its side. There is another view of life, which holds that wherever the crowd is, there is untruth, so that, for a moment to carry the matter out to its farthest conclusion, even if every individual possessed the truth in private, yet if they came together into a crowd (so that “the crowd” received any decisive, voting, noisy, audible importance), untruth would at once be let in.

We do not usually think of Christianity as anarchistic, but insomuch as Jesus’ focus was on harmonious relationships between individuals in a spirit of love, he was offering an approach to community rooted not in legal formality but in the natural chaos of everyday life. As such, Jesus could be seen as an agent of chaos – not the negative disorder of carnage and destruction, but the positive discord of unpredictable human relations. If Christians had managed to maintain this theme from Jesus’ ministry instead of corrupting it into ‘Christendom’, the religion would not have suffered the disastrous public relations fiasco that has tarnished its image for so many today.


Grey Wethers

During my time on holiday in Devon with my family, my wife and I hiked up into the mists of Dartmoor, our trusty dog beside us and our baby strapped into a harness like Yoda to Luke. Our destination was a pair of prehistoric stone circles high in the moors known as Grey Wethers, and after one failed attempt we did eventually make it there.

I’ve noticed recently that some enthusiastic positivists tend to spin grand narratives around such early astronomical constructions – to wax lyrical about the crude understanding of the world that the makers of these circles must have had (often specifically in terms of their “supernatural” beliefs) and to exude a certain smugness about just how much we know about astrophysics and astronomy today. I find this kind of attitude somewhere between repugnant and hilarious.

In the first place, we know very little about the culture of the men and women who built and used Grey Wethers. We can only guess at their beliefs, but we certainly don’t need to patronise their early astronomical skills – they were mapping the heavens using just stone tools, often with considerable accuracy. The sites were almost certainly ceremonial, but why should we look down on calendar festivals? It’s not like we don’t continue to celebrate the seasons… I rather suspect the winter festival that took place at Grey Wethers was something truly memorable, which most of us cannot claim of our last Christmas et al.

Behind the dismissal of prehistoric monuments there is still something of the condescending attitude that the British and other empires held towards “primitive” cultures. I find these monuments more impressive than the glass and steel monstrosities we build today, and have great respect for the people who built them, whatever their beliefs. We are no more separated from these early settlers than we are from remote tribes today, and there are few if any reasons to believe our contemporary cultures are inherently superior to these other forms of human life.