A Magisterium for Science

Pope Peter Rabbit

"I believe in free speech," goes an archetypal conversation I sometimes have in the pub with people largely outside of any religious tradition, "but people shouldn't reject vaccination/evolution/science etc." Oh dear, I think to myself... how do I unpick this knot without offending them? Because these apparently innocuous statements run perilously close to saying "democracy is great, but I prefer theocracy". How can this possibly be...?

For more or less anyone reading this, theocracy will seem like the worst possible form of government. Indeed, a great deal of the tacit hostility that some today hold for the Catholic Church lies in the rejection of the idea that anyone should be placed in a position of arbiter of the truth, and thus in hostility towards theocracy, broadly construed. This is sometimes expressed in a conceptual rejection of the Catholic pope, as Trey Parker and Matt Stone did in their hilariously blasphemous South Park episode "Fantastic Easter Special", which ends with a rabbit being appointed pope and the claim that this was what God had always intended. This episode really interested me, as I believe it successfully captures Protestant Christians' prejudice against Catholics, from which the so-called "New Atheist" movement descended (all the prominent New Atheists began by rejecting Protestant Christianity, then expanded their dismissal until it encompassed all world religions). A similar theme also manifests in the Principia Discordia, the sacred book of the Discordian Society, which also aligns with elements of Protestant theology against Catholic theology, while softening the hard edges by cross-breeding (rather productively) with those schools of Zen Buddhism that embrace absurdity as a path to wisdom.

In Catholic parlance, the term 'theocracy' is seldom if ever used (although Vatican City can be understood as a theocratic nation - albeit a rather small one!). The official term for the authority of the Vatican is 'magisterium', meaning 'power of the office of magister', where 'magister' is Latin for 'master'. The magisterium is understood as the capacity of the pope and the bishops to render a judgement on the authentic interpretation of the Word of God, taken both in the sense of establishing an official interpretation of scriptural texts and, more importantly, in terms of guiding traditional practice, which encompasses an extremely broad range of human activities. Thus, when Pope Francis endorsed the idea in 2016 that Catholic churches could offer the sacraments to divorced Catholics (a controversial suggestion in some quarters!), he was exercising his official role in the magisterium.

The term was applied outside of Catholicism in 1997, when the brilliant evolutionary essayist Stephen Jay Gould proposed that the alleged conflict between science and religion could be resolved by asserting 'non-overlapping magisteria' (NOMA). His proposal was to assign dominion over facts to the sciences, while religions (and non-religions) would have separate dominion over values. Gould's use of the term 'magisterium' was borrowed directly from Catholic parlance; as a palaeontologist coming from a Jewish family in New York, where 60% of the population is Catholic (and only 10% Protestant), he would have been quite familiar with Catholic terminology and thought. This proposal did not go down particularly well - those who would go on to align with the New Atheist movement uniformly replied that there was no need for such a principle, because science could claim authority over everything and there was thus no need to make concessions. In such situations, 'religion' is being rejected not over the idea of asserting a magisterium, but merely because the wrong magisterium is being asserted.

This is roughly where I see the problem with my friends in the pub claiming to support free speech, but placing its limit on topics that they consider to have been scientifically resolved, and thus beyond dispute. This position implies a magisterium of science, and in the same way that the presence of a magisterium in Catholic tradition implies a theocracy, any time someone unthinkingly singles out a scientific topic for special status (vaccination and evolution are by far the most common), they are also requesting or expecting something similar, if only implicitly. For it is solely in the presence of a magisterium of some kind that there could be authority to adjudicate what is or is not permitted as an absolute matter. The law, after all, is free to change when the people require it; only a magisterium is beyond dispute.

I think back to a very good friend of mine venting his outrage over a Creationist Museum somewhere in the United States that he had heard about. And I found this odd, because it sounded very amusing to me, and I could not see any legitimate cause for indignation here. "But it's not true," was his retort. But so what? It's not true that there is an inherent goodness to humanity, but I still choose to believe it, and if we are defenders of the so-called free society (and always assuming such a thing still exists) we should be able to accept that at least some of the things others believe are 'not true'. It seemed to me that a Creationist Museum was hardly likely to change anyone's opinion about anything, which means even if we accept that it's 'not true', the expected harms of such an establishment are rather limited.

Besides, who are we to call out other people's nonsense and not our own...? I have yet to meet a human who does not harbour strange non-testable beliefs of some kind, and I am perfectly happy for this situation to persist - because the alternative can only be a theocracy of some kind, or rather, I suppose, an atheocracy, since it has become entirely possible to construct this kind of metaphysically-justified autocracy without any concept of deities. Moreover, we make a mistake when we associate religions exclusively with positive theology (a concept of God or gods), since we are singling out just one aspect of the immense diversity of religious experiences and making it central. This is not entirely surprising, however. Christianity and Islam were all too successful at making a theology of truth versus falsehood central to our thought via the creation and maintenance of libraries over a span of millennia. We never lost this habit of thought; we just switched from theology to atheology and from libraries to corporate-managed online repositories, while ceasing to notice the change entirely.

The idea that truth is singular and that deviation from it is abhorrent is an artefact of the monotheistic religions that has been inherited by those who place their faith in 'science', as Nietzsche shrewdly pointed out in 1882. This desire for a magisterium for science is far more widespread than we tend to admit. Consider the political questions that have continued to erupt over gender and sex for the entire duration of the feminist movement (which is to say, since at least Mary Wollstonecraft in the latter half of the 18th century). Feminists largely managed to avoid calls for anything like a magisterium on gender up until the end of the twentieth century. After this, the number of non-testable beliefs about gender required to meet everyone's emotional needs ballooned beyond any reasonable expectation. Problematically, tolerance between these different metaphysical conceptions has been extremely limited - remind you of anything? I can't be the only person who thinks that offensive labels like 'TERF' or the 'trans cult' are oddly resonant of older insults like 'heretic' and 'infidel' that came from others who were rather too certain about their beliefs...

No-one should be subjected to the arbitrary beliefs of others, and it does not matter to me one jot whether those beliefs are about God, or gender, or flying spaghetti monsters. However, quite unlike the South Park creators, I don't find a rabbit a desirable replacement for a human Catholic pope, especially one with such an uncommon passion for the oft-forgotten humility of the Christian tradition as Pope Francis. It rather seems to me that the problems with the Protestant Christian traditions I was raised in emerge precisely from the consequences of deciding that a rabbit would indeed be better than a human as a pope. As a Discordian, the absurdity amuses me; as a Christian, I am less convinced. The Catholic church may lag behind the western zeitgeist by about a century but it does eventually change its mind, whereas certain Protestant Christians seem to have an almost negligible possibility of changing their mind under any circumstances! Frankly, they are far from alone in this...

Here I should like to note that the Discordians have a different and altogether more hilarious conception of what it is to be a pope. In a move clearly inspired by the Protestant philosophy of the Enlightenment, Discordians claim that every human is a pope, and some Discordians like to give out 'pope cards' to certify people as such. Indeed, I was ordained as a Discordian pope by Robert Anton Wilson in the late 1990s, which sounds impressive but really is not, since there is no Discordian magisterium and if there were, Wilson would have excommunicated himself. The followers of this religion are almost universally anti-magisterium, and I hope that most if not all of my fellow Discordians would have the sense to never be caught arguing for a magisterium of science, although the golden rule of the followers of Eris is that "we Discordians should stick apart". As such, Paul Feyerabend's adage for capturing the realities of scientific practice, "anything goes!", applies far better to Discordians than to scientists, where suggesting that 'anything goes' is widely considered tantamount to blasphemy.

And here lies the awful truth of the idea that we can claim to be a supporter of free speech but place its limits at some scientific point of reference: the emotional framework that makes this possible is directly parallel with that of blasphemy. This word, after all, is only the name we have given cognitive dissonance when it occurs within a religious context. We must already have determined a necessary truth in order to wish to prevent dissemination of alternative views, and this implies that we secretly desire a scientific magisterium, the rejection of which would be tantamount to blaspheming. Yet free speech depends upon an absence of limitations, with the sole exception being the one proposed by Immanuel Kant: that we should only enact limitations upon freedom where they are necessary to protect a like freedom for others. It requires a real commitment to liberty for people to negotiate how to achieve such a balance, and alas for the most part we have decided not to bother.

Besides, why worry about being free to speak when the social media giants of Facebook, Twitter, et al. and the search engine giants of Google, Baidu, et al. have conveniently provided their own magisterium of thought for us? Watch them with amazement as they merrily adjust search results and the relative visibility of what different people are saying, choosing on the one hand what should be read first, and on the other hand pronouncing which blasphemies must never be heard. Habemus papam, Cyberpope Google I...? I shudder to think. It is situations like this that ought to provoke the outrage my friend felt for a mostly harmless 'museum of ignorance'. Yet we apparently accept this gerrymandering of information without concern - some of our neighbours even advocate for this censorship, as the example from the pub at the opening of this piece foreshadowed!

When I first started thinking about 'science popes' and a magisterium for science - always in opposition to any such concept! - my concern was that there would come some kind of attempt to create a Council of Scientists that could attempt to act as magister and offer declarations of what was or was not scientifically valid and therefore permitted to be enforced. Then came 2020, when the World Health Organisation - to its own great distress! - found itself unwittingly appointed to this role in a new and disturbing medical magisterium that spread into almost every world government and swiftly ran out of anyone's control. The topics upon which adjudication was demanded rapidly and inevitably fell into that state of pseudoscience whereby disagreement was not permitted (and thus the sciences cease to function), and the medical magisterium that we collectively instituted immediately undermined its own credentials in a manner rather parallel to the idea that a rabbit would make a better pope. I will not say that a rabbit would have made a good replacement for the WHO, only that the WHO struggled to fulfil its role scientifically while it was also expected to act as magister.

It is an admirable thing to stand up for public health and say "what can I do to help?" It is far more problematic to stand up and demand that everyone must satisfy your chosen vision of public health. In a democracy, any such claim is valid solely while it has the support of the people, and if we propose any intervention on scientific grounds (where the people may have to take it on trust that we are not mistaken), those acting must at all times be ready both to absorb any new evidence, and to remain open to even the most difficult debates about what it all means (which is never a scientific judgement). We are no doubt emotionally secure in our support for science, but it remains to be seen if we are intellectually secure in our support for the sciences. The test for this is whether we are indeed open to new evidence... whenever you no longer need to test your own claims because you know in advance that you are correct, it makes little sense to assert that what you are doing is 'scientific': you are just playing at magister.

We have a choice to make, between free scientific enquiry and a magisterium of science. Yet there is no viable magisterium of science that will not swiftly become an abomination, since it is only because scientists are free in their enquiries that they gain their cybernetically-enhanced power to secure whatever limited truths are available via various research methods. Without debate there can be no legitimate science of any kind, and since a magisterium of science necessarily declares an authorised truth to enforce it could never be scientific. Rather, these kinds of atheology (like so many brutal monotheisms before them) rest upon an ideological claim to a complete and final knowledge - a rather ugly conception that might truly deserve the name 'anti-science'. Inevitably, every attempt at a magisterium of science prevents discussion and therefore unleashes the state of pseudoscience where free research is impossible, thus destroying the very conditions for enquiry that make scientific methods effective.

Perhaps, if you have read this far, you have nodded sagely to everything I've said and thought it obvious. Yet the moment you step away from this train of thought, you will encounter the anti-vaxxers or the pro-maskers, the Creationists or the militant atheists, the 'TERFs' or the 'trans cult' or whatever else it might happen to be that throws you into a blind seething rage. And when you do, won't you still experience that powerful emotional upheaval that comes with encountering blasphemy...? Underneath it all, our desire for a truth that we can depend upon is entirely human, and the Catholic church's stumbling towards hopes of a good life via its magisterium is only one expression of our unquenchable desire for secure foundations to truth. The Catholic pope these days always has the moral defence that the magisterium of the Vatican applies solely to those of the Catholic faith. A magisterium for science demands more: it demands obedience from us all, and a silencing of all objections. This is not, and can never be, scientific, no matter what claims it defends, no matter what motives it invokes. This is the ghastly confusion at the heart of the terrible events of 2020. Yet all it would take to bring this mistake to an end is the restoration of free scientific discourse, of listening once more to all we have refused to listen to. If only any of us knew how we could go back to doing that.

The Power of No

The Four Oxford Moral Philosophers

Perhaps more than any other twentieth century philosopher, the late Mary Midgley understood that there were great conceptual misunderstandings emerging out of the deep commitments to the power of scientific thinking that began in the nineteenth century. She remarked that this confusion had permitted bias to be "smuggled in as if it were a technical matter only accessible to experts" - a warning written in 2003 that captures a great deal of what went wrong in 2020.

This extended role for the sciences where they are taken to possess an authority that could in itself never be scientific flows from the mythos of magical science I discussed back in January, and it is an entirely contemporary problem - indeed, I rather suspect it is the essential contemporary problem, of which others such as the environmental crisis (euphemistically concealed by talk of 'climate change') are only shadows and reflections. It is because so many have emotionally invested in the power of science, while so few have intellectually invested in understanding the intricacies of the actual workings of the sciences, that we find ourselves in strange places where nobody can say 'no' to even a ghastly mistake provided it is made in the 'name' of science.

To be responsible in relation to the work of the sciences, then, requires a certain vigilance to ensure that what we are doing when we invoke terms like 'scientific research' or phrases like 'the science says' or 'scientists increasingly believe' is accurately reporting the state of research programmes, and not mistaking the findings of scientific research (which are always provisional) for specific doctrines that are neither a requirement for, nor a part of, the work of the sciences. As Midgley warned of such philosophical smuggling, a great deal of so-called 'anti-science' positions are reacting to these imported ideologies rather than the research programmes that are truly the activities central to scientific work. This is not only problematic because of the way it complicates all attempts to share scientific perspectives more widely, but also because these unnoticed doctrines "import irrelevant, inhuman standards into non-scientific aspects of life and lead people to neglect the relevant ones."

No philosopher has had greater influence upon me than Midgley, and this is not only because she was my first (and for a long time, my only) philosophical correspondent. I have always aspired towards - and all too often failed to reach - the clarity of language that Midgley was able to bring to bear on quite complex problems in her always-excellent books. For her, as she said to me in the interview I ran back in 2010, specialist terminology such as can be found in the work of the German philosopher Martin Heidegger "runs counter to my deep identification with everyday speech", and the essence of her work in philosophy was always written in a way that makes her far easier to read than Kant, or Wittgenstein, or any of the other staples of modern philosophy. Since I believe any philosophy excised from a general readership has sealed its own doom, I have always considered Midgley's philosophical methods exemplars for what a philosopher ought to be.

When it comes to my own work in philosophy of science, Midgley is more than just an influence, she is the foundation of my thinking. Again and again, she was able to return to the problems springing from the persistent illusion that a pronouncement made by a scientist carries with it an almost magical power to authorise (or de-authorise) certain actions. Yet this authority is not scientific, but metaphysical (non-testable); it is a faith of a very specific kind, and one that tends towards the same blindness that all human thinking is prone to: of excluding ourselves from consideration when it comes to the errors of thought we can see quite clearly in others. It is the clarity of her understanding of this point, and many others related to it, that makes Midgley the essential guide to the broader philosophical problems of the sciences.

Midgley was part of an exceptional group of four female philosophers who studied at Oxford University during the Second World War - from left to right in the photo above, Philippa Foot, Mary Midgley, Elizabeth Anscombe, and Iris Murdoch. Indeed, in my correspondence with Midgley while she was with us, she still could not resist calling one of her former classmates 'P Foot' in a way that felt like an in-joke from decades past. Recently, attention has finally been paid to these four astonishing women philosophers, a recognition that was long overdue - and there is even a certain gathering momentum to recognise them as collectively representing a movement. Rachael Wiseman tentatively suggests 'uncommon sense realism' or 'depictive metaphysics' as names for this philosophical school... I do not think these names will stick, but they are a solid attempt to capture the commonalities of these four remarkable thinkers.

Midgley associated the perspective shared by these four philosophers with the collective issuing of a resounding "No!" to the ethical currents of the early twentieth century. Writing in 2017, she remarked:

Did that make us four into a Philosophical School?

This is a loose term, but the point is worth discussing. We did not at once become a 4-headed unanimous squad of prophets. We each followed our own diverging paths in various directions. But what, for me, makes the unanimity-story still important is a persisting memory of the four of us sitting in Philippa’s front room and doing our collective best to answer the orthodoxies of the day, which we all saw as disastrous. As with many philosophical schools, the starting-point was a joint 'NO!'. No (that is) at once to divorcing Facts from Values, and – after a bit more preparation – also No to splitting mind off from matter. From this, a lot of metaphysical consequences would follow.

These two elements of the prevailing dogma that Midgley singled out are important because they do indeed frame both the significance of these four philosophers and the disastrous trajectory of the twentieth century, from which we are still reaping ever more grotesque fruit.

The split of mind from matter is taken as rejection of Descartes' philosophy, but as I've discussed many times previously, it is actually a perverse commitment to it. Whereas Descartes sought to demonstrate the necessity of treating mind as distinct from matter, today we are committed to the same framework yet inverted, such that we now agree wholeheartedly with Descartes' cleaving of existence - but only in so much as it allows us to make 'matter' (and not mind) the important side of the equation. The power of No had to be placed against this because in reducing who we are as beings to merely the action of molecules, we simultaneously created an illusion of humanity (indeed, of all animals) as merely mechanistic machines, and fatally undermined the conditions for understanding what a good life might be.

Relatedly, and perhaps more importantly, the attempt to divorce facts from values - a mistake that the Scottish philosopher David Hume made in 1740 and then unsuccessfully retracted in 1758 - is perhaps the pivotal error of the twentieth century, from which so many other philosophical disasters have flowed. This is a mistake imported from moral philosophy, but the consequences of making it have bled out into everything, and especially into philosophy of science. Indeed, as the opening piece to this current philosophical 'campaign' already observed, the confused idea that the sciences can be 'value-free', from which the wilfully ignorant state of pseudoscience flows, emanates precisely from this horrible misunderstanding of both facts and values, to which the power of No had to stand up in opposition.

It is worth reflecting upon the contributions of Midgley's three friends at Oxford, and how they relate to this resistance against the orthodox philosophy of mid-twentieth century Europe. Elizabeth Anscombe is sadly most famous for being Wittgenstein's student, and although that connection with Wittgenstein is important (as Midgley attests) it is worth noting that the four philosophers rarely saw him, even though his ground-breaking philosophical work was instrumental to developing their ways of thinking about the world. Midgley speaks of how Anscombe handed out loose-leaf bundles of papers containing Wittgenstein's notes (what are now called 'the Blue and Brown Books', preliminary studies for what would become Philosophical Investigations)... it is hard not to be slightly envious of this chance to be personally connected to what would prove to be the most influential work of twentieth century philosophy. (Heidegger's Being and Time - the likely contender to this dubious crown - is read solely by philosophers; Wittgenstein, on the other hand, was also taken up by psychologists, a rare honour for any philosopher!)

Anscombe, alas, had the misfortune of being too overtly influenced by her Catholicism, and thus was too easily dismissed by the philosophical establishment. Academic philosophy has tended to treat religious commitments as something that can be overlooked as an indulgence in men (it is routinely overlooked for Wittgenstein, for instance, despite its central place in his work as a philosopher), yet it is almost always perceived as a fatal flaw in women, for whom the path to being taken seriously was (and perhaps still is) to act and think as much like a man as possible. Nonetheless, Anscombe's paper "Modern Moral Philosophy" is perhaps the single most important critique of consequentialism ever written. This term 'consequentialism' marks the belief that what matters beyond everything else in our ethics are outcomes; the name was invented by Anscombe and is still widely used by philosophers today - generally without even a passing reference to her work! She remarks that this narrow focus upon outcomes...

...leads to its being quite impossible to estimate the badness of an action except in the light of expected consequences. But if so, then you must estimate the badness in the light of the consequences you expect; and so it will follow that you can exculpate yourself from the actual consequences of the most disgraceful actions, as long as you can make out a case for not having foreseen them.

A more apposite summary of the disaster that was 2020 I have not seen.

Philippa Foot is known for being the creator of so-called 'trolley problems', and in an irony that is now so common that it will not raise any eyebrows whatsoever, her purpose in using these thought experiments was diametrically opposed to how they are used today. As I have written about in Chaos Ethics and elsewhere, trolley problems have come to be deployed as a convenient way of fooling people into accepting consequentialist thinking as necessary, by making the truths of mathematics seem to possess moral rather than merely logical force, and thus a means of luring people into acting and thinking atrociously (as Anscombe clearly warned would happen, and as was already happening in the early years of the twenty-first century). Foot could not possibly be blamed for this absurd misuse of her toolbox, and the lack of attention to her work in moral philosophy is unfortunate, since she insightfully engaged with the attempt to divorce facts from values in ways that are still well worth reading. I have been particularly struck by her concept that injustice can be understood as a kind of injury, and therefore we have rational reasons to avoid injustice - another idea that bears gainfully upon many of the grim events of 2020.

Iris Murdoch had the fortune or misfortune (depending upon how you wish to view the matter) of having succeeded in writing excellent novels, and therefore of enjoying critical and indeed commercial success - she even scored a 'Dame' in the Queen's 1987 honours list, something no other philosopher has ever achieved. This has allowed Murdoch's philosophical thinking to be roundly ignored, since novels are deemed too frivolous a form for philosophising; even Jean-Paul Sartre and Albert Camus are only grudgingly accepted as philosophers these days, having chosen novels over treatises. The idea that a contemporary academic philosopher would choose a novel as their preferred medium is one that doesn't even need scoffing at - despite the self-evident fact that writing a novel allows philosophical concepts to propagate far more effectively than writing a series of impenetrable arguments for echo chamber journals.

Since she never developed any explicit moral theory, attempts to summarise her position will always be a simplification - although as the case of Philippa Foot demonstrates, explicit theorising is in fact no defence against this either. However, her collection of essays, The Sovereignty of Good, does provide an outstanding skeleton key for understanding Murdoch's moral philosophy:

The self, the place where we live, is a place of illusion. Goodness is connected with the attempt to see the unself, to see and to respond to the real world in the light of a virtuous consciousness. This is the non-metaphysical meaning of the idea of transcendence to which philosophers have so constantly resorted in their explanations of goodness. 'Good is a transcendent reality' means that virtue is the attempt to pierce the veil of selfish consciousness and join the world as it really is. It is an empirical fact about human nature that this attempt cannot be entirely successful.

Murdoch's intuition that there was an irreducible plurality to the moral 'fields of force' hinged upon the idea that there is a sense of unity haunting our thinking about ethics, and followed Plato in naming this unity 'good'. She too used the power of No that the four philosophers unleashed to tackle the crises they jointly perceived. But she did so through the methods of the arts because she believed, with good reason, that this was an approach with the capacity to wield the greatest influence. I hope and trust that through her novels she did in fact attain precisely what she set out to achieve.

Since the end of the twentieth century, the power of No has been increasingly taken up as a feminine power. The phrase "No means no" has become a commonplace, the meaning of which is that consent must necessarily be explicit because the risks of implying consent are too terrible to permit. Sadly, this is often meant to apply solely in the context of rape, yet in truth the confused idea that we can consent to that which we are vocally objecting to has equal application in politics, where consent is all too often assumed to have been implied merely by election, as if democracy were solely about choosing who should wield power and not about perpetually solving the recurring problem of how we should all live together.

But if I leave this discussion of the power of No and the four extraordinary women who chose to wield their philosophy under this unstated banner at this point, it will only further the stereotype that feminists want merely to undermine and destroy. And this is a terrible misrepresentation not only of women and feminists, but also of what Midgley, Anscombe, Foot, and Murdoch strove towards. It will foreground their opposition to philosophical orthodoxy and not what they were seeking to defend. For the true power of No is not in denial but in resistance, and all effective resistance is grounded in defence of the good, however that needs to be construed.

The four Oxford moral philosophers represented a defence of the good life as it had been articulated by the ghosts of philosophy past, and as it had yet to be articulated in the spirit of philosophy yet to come. They unleashed the power of No in order to resist attempts to simplify, obfuscate, and distort the nature of human existence and the moral decisions consequent upon it. Of the four, only Midgley took on this vital struggle upon the battleground of philosophy of science, and if I have tended to favour her work it is almost certainly because of my own conflicted and tortured relationship with the sciences, which I love (and therefore recoil in horror when they are distorted), yet also fear, because the vision we have unthinkingly chosen for them flirts so blatantly with catastrophe.

Midgley saw with immense clarity the way that non-scientific dogmas and orthodoxies could corrupt and undermine the work of the sciences, and sited that discussion directly in the alleged conflict between science and religion because she understood, with an insight that far outstripped that of her contemporaries, the immense danger of this artificial split, which mirrors the misguided split between facts and values or between mind and matter. To claim facts and matter 'for science' is to attempt to subjugate values and mind, and therefore to gut democracy, freedom, and indeed truth. It reduces the sciences to dogmatic caricatures of their true beauty and worth, and affords to anyone willing to wield this tainted sceptre of "following the science" an authority they neither truly possess nor justly deserve. Tied up in this mistake is the demonisation of religion as 'anti-science' - rather than as the tradition that, as history shows us, gave birth to every science as we now understand the term. It is perhaps the quintessential mistake made by those who claim to love science, yet who deceive themselves by falling prey to what Murdoch warns we are all deceived by: ourselves.

Against any and all such attempts to flatten and oppress the beauty of human existence we can and must join together in raising up the power of No, again, again, and ever again, unceasingly, if we are to have any hope of defending what is good in this world. When we stop, when we decide that we should let injustice misrepresented as necessity, or bias misrepresented as unchallengeable truth, happen without resistance - whatever good reason we may claim for doing so - we betray the efforts of all those who came before us and cleared the way for us all to try to make a good life together. In this never-ending project, these four women will be our invaluable allies, if only we are willing to listen to the immense and incalculable wisdom they have bequeathed to us.

Unattributed Mary Midgley quotes in this piece are from her 2003 book, The Myths We Live By.

A Case Study in Pseudoscience

Contains ideas some people may find distressing.

Microscopic Crystals

The science is clear! Masks save lives/don't work! But which is it, and even more importantly how can we know? To answer this wildly contentious question - one which so many on either side are utterly convinced is entirely settled - we first have to understand why this topic has not yet even been adequately debated, much less resolved beyond dispute. Join me, if you dare, on a disturbing journey through a scientific story from the United Kingdom in 2020, a tale that centres upon the world's second oldest university, Oxford...

First, however, a polite warning. This is a hot-button issue, and therefore one with a high risk of triggering cognitive dissonance in those who have committed to a specific side... But if we care about the sciences, we cannot simply consent to keeping our mouths shut rather than debating the ambiguities of a live research question, regardless of how much of a minefield it becomes. Insofar as the truth about this topic is currently known, the only two certain claims I can ascertain are that there is not enough good quality evidence to settle the debate definitively, and that there is no longer even anything that might be called a debate, since both sides are now intractably locked into their beliefs. This kind of situation is a paradigm case of what I have called pseudoscience: the collapse of even the possibility of productive scientific work occurring.

Our story begins relatively early last year, as thousands of armchair epidemiologists took to social media to declare what was or wasn’t true on a great many topics that were far more complicated than anyone seemed to realise. A great deal of that complexity comes from the fact explored last week, namely that the sciences are discourses, series of conversations via written texts. This has the unfortunate consequence that the act of interpreting the evidence is seldom as simple (as the armchair epidemiologists apparently believed) as sifting out the ‘good evidence’ and discounting the ‘bad evidence’ - and doubly so since the evidence that is rejected in such a procedure is very frequently cast out as a result of confirmation bias rather than for any sound reason.

Not long after the social media platforms began to descend even further into a verbal war zone, severe disagreements broke out in the United Kingdom between medical researchers and practitioners about a newly proposed medical intervention for COVID-19, namely community masking. It's important to make a distinction here: use of personal protective equipment in hospitals is radically different from asking the population as a whole to deploy face masks; there are disagreements about the former as well as the latter, but since our interest in this case study is not in resolving these disputes but rather in examining them, it will be helpful to recall that the question being debated in the UK was not 'are face masks ever effective?' but 'should we require the general population to wear face masks to help stop the spread of the SARS-CoV-2 virus?' It was over this discussion specifically that medical scientific practice almost entirely collapsed in the UK.

The crisis point can be traced to a pivotal moment in June. Two months earlier, Trisha Greenhalgh of Oxford University and half a dozen other medical professionals had argued in a piece for the British Medical Journal that while “direct, experimental evidence for benefit is not clear cut”, we should follow the precautionary principle and recommend face masks for the public all the same. Intriguingly (and this will be important later), they also made the following remarks:

...trials have shown that people are unlikely to wear them properly or consistently, which is important since prevention depends on people not repeatedly touching their mask, and on all or most people wearing them most of the time.... the trials cited above have also shown that wearing a mask might make people feel safe and hence disregard other important public health advice such as hand washing and social distancing...[these] arguments may have been internally valid in the trials that produced them, but we have no evidence that they are externally valid in the context of covid-19. “The public” here are not volunteers in someone else’s experiment in a flu outbreak—they are people the world over who are trying to stay alive in a deadly pandemic. They may be highly motivated to learn techniques for most effective mask use.

In June, Professor Greenhalgh and her colleagues returned to follow up on their original piece. There had been enormous swathes of comments in the meantime, and heated arguments about the risks that might potentially be involved, not to mention how this proposal could be justified in terms of the precautionary principle, which cautions doctors not to use unproven interventions about which there is a potential risk of harm. Surprisingly, in responding to their critics the authors did not engage with any of the concerns that had been raised. Rather, they declared the myriad objections colleagues had presented as “straw men” (misusing the term, incidentally) and announced that the UK ought to do what they had suggested anyway. A week later, the UK government mandated community masking by law, with escalating fines for non-compliance. This led the Centre for Evidence-based Medicine (like Greenhalgh, also based at Oxford University) to run an unprecedented opinion piece denouncing the decision as politically motivated and scientifically unsound. From that point on, the outbreak of pseudoscience corrupted the discourse and little productive discussion on this topic has yet re-emerged.

An interesting aspect of the CEBM’s rebuttal was that it was entirely couched in terms of how the research had been conducted up until the year before, and the lack of strong supporting evidence - including mentioning the calls that had been made for further research on the efficacy of different kinds of face mask after previous epidemics that had never been followed up. Even if the CEBM's response was marred by the kind of righteous outrage that also corrupted discussion on social media, it is clear that (at the very least) they understood the role of the discourse in validating scientific claims, and saw the risks involved in pretending there was no prior understanding on the topic that might have made certain advocates of community masking more cautious than they were. In the sciences, scepticism can be both a blessing and a curse, but the absence of adequate scepticism - or the refusal to listen to it - almost always heralds mistakes, and sometimes disastrous errors. It is why allowing disagreements is essential to the work of the sciences, and every attempt to prevent such arguments from taking place fosters pseudoscience.

It is worth pausing briefly to point out that when I claim the medical discourse in the UK devolved into pseudoscience over this issue (and a parallel argument can almost certainly be constructed for the US, but I have spent less time examining the discourse there) I am not making any kind of claim about the truth of the competing claims about community masking. From the UK perspective, one side came to the table with a hypothesis that this intervention would be effective at preventing the spread of a respiratory virus, acknowledged the evidence they had at the time was inconclusive, recognised some of the specific risks involved in pursuing this intervention, but claimed that - as a precaution - we ought to adopt community masking anyway.

The positive argument made for the intervention was essentially ‘it might save lives and we might avoid the known harms so we must do it’. Yet as a purely logical matter, this is poor reasoning, and as a medical question the precautionary principle could not plausibly be applied on this basis (as some pointed out at the time, it cautions the exact opposite of what was done). Thus right from the outset, the necessary discussion on the topic was on dangerous ground. But this certainly does not exclude the possible benefits of community masking; rather, what was indicated was an urgent need for trials to establish the balance of benefits to risks. In ignoring the ambiguous state of knowledge regarding the potential harms, the discourse failed and we entered the condition of pseudoscience.

If we had remained in a state of productive scientific discourse, what should have happened next was commissioning studies to gather evidence to resolve the ambiguities. Yet this did not happen, and still has not happened, and it is incorrect, as British evidence-based medicine practitioner Margaret McCartney shrewdly observed, to claim that the evidence could not be gathered because it would be unethical to do so:

Another argument is that large scale trials, say of face mask use in schools, are impossible, because of the belief that every child would need a guardian to consent, making recruitment practically impossible. But this is deeply problematic. This suggests that the government can choose and implement any policy, without requiring any individual consent, as long as it is not called a trial. For as long as this double standard is allowed to persist, giving less powerful results and unnecessary uncertainty, people may come to avoidable harm. Nor does valuable information come only from randomised controlled trials. Complex interventions require multiple disciplines and types of research for assessment. But where are they? [Emphasis added]

Furthermore, it is rather strange that Greenhalgh and her colleagues specifically identified a key risk associated with mask use (touching an infected mask - see the quotation above), but set this aside by claiming that the public would be “highly motivated to learn techniques for most effective mask use.” Yet the British government provided negligible guidance on effective mask use to the public. Considerable expense was put towards promoting the idea on television and other media that the British public should wear masks, but almost none at all towards what good mask technique ought to consist of. Notes on the government website, however, did provide numerous important warnings - about not re-wearing used masks, about storing used masks in plastic bags, and so on - none of which I have seen practised by anyone but myself in months and months of government-enforced mask wearing. Nor were any studies conducted even to check the quality of the mask technique that was occurring in the community! Once the law was passed to mandate face masks, even those concerns openly acknowledged by the medical professionals who had called for community masking in the UK were simply ignored.

If you had suggested to me in 2019 that the British government was going to mandate a medical intervention on weak evidence and then commission no studies to verify either the efficacy or the safety of that intervention, I would have at the least raised an eyebrow, and at the worst asked what you were smoking. Yet this is precisely what happened. The entire affair has caused me quite considerable distress, not because I know the truth of the matter (community actions are far more complex research subjects than most people seem to realise), but because I would never have believed in 2019 that it would take just eight weeks to disrupt the capacity of the medical networks of the United Kingdom to act as scientists, nor that anyone would propose to use the force of law to compel everyone into a medical intervention the case for which had never even been adequately debated, let alone investigated. It is doubly amazing to me that anyone can use phrases like “following the science” or, worse, “the science is clear!” in a situation where the truth is that the required scientific work has not yet been adequately conducted.

The concern I am raising here is rather independent of what transpires to be the truth about community masking if and when scientific discourse is restored. Even if future evidence did eventually validate the hypothesis, it would not change the fact that the British government acted improperly by enforcing penalties by law for non-compliance with an intervention it apparently had no intention of confirming was effective, nor indeed of ruling out the possible health risks suggested by earlier mask studies - perhaps most significantly that cloth face masks, improperly used, might increase the rate of infection (as the CEBM commentary points out, and as Greenhalgh and colleagues acknowledged was a risk). There was more than enough evidence in April to formulate a hypothesis, but nowhere near enough to settle the issue unequivocally - as indicated by the fact that evidence-based medical practitioners in both England and Scotland publicly spoke out against both the lack of good evidence and the abject failure of the British government to commission any new studies to gather it.

I can think of no better name for this depressing collapse of the medical discourse in the UK than pseudoscience. This condition destroys the ability of the sciences to operate by undermining our capacity to disagree, which is fundamental to the pursuit of scientific truth. What's more, once this situation occurs, the problem is no longer constrained to the topic that initiated it, and alas creates ample opportunities for unscrupulous people to manipulate the truth for personal profit while the scientific networks are effectively disabled. Thus in November 2020, the British Medical Journal's Executive Editor Kamran Abbasi issued an unprecedented editorial about the suppression of scientific research in the UK's most respected medical forum declaring:

Science is being suppressed for political and financial gain. Covid-19 has unleashed state corruption on a grand scale, and it is harmful to public health. Politicians and industry are responsible for this opportunistic embezzlement. So too are scientists and health experts. The pandemic has revealed how the medical-political complex can be manipulated in an emergency—a time when it is even more important to safeguard science.

This is not some off-the-cuff remark by an armchair epidemiologist on social media, this is the Executive Editor of a major British journal issuing an editorial for the express purpose of lambasting the British government for "state corruption on a grand scale" and "opportunistic embezzlement", this latter point relating to the news story (reported in October by the BMJ) that the government had handed out contracts without tender for face masks and other protective equipment, some of which was not even fit for purpose. (I note for context that Abbasi appears to have remained agnostic about community masking - although not about Facebook censorship over the issue). How curious that this serious breakdown in scientific discourse did not even warrant a mention in any British news source! But then, each of the channels, each of the newspapers had already picked a side on the face mask issue, so they simply ignored and discredited any and all contrary viewpoints... thus the journalists followed the scientists into pseudoscience too, if they did not in fact lead them into it.

Logically, if the US medical community had not descended onto this crooked path immediately beforehand, we would be hard pressed to explain how this could have happened in the UK at all (it is exceptionally unusual to argue for undertaking a precautionary measure while admitting the evidence for it is still inconclusive, for obvious reasons). However, since I have not examined these earlier discussions in any great depth, I leave it open whether there might be some other explanation besides the most obvious one, namely that the UK's pseudoscience outbreak was caused by a metaphorical infection of human thought that spread from the other side of the Atlantic, where political partisanship had already destroyed any possibility of clear scientific thinking at a time when it was most needed.

Hence the epidemic of armchair epidemiologists who dealt with every contrasting perspective by the expedient means of summarily discounting the views of anyone who disagreed with them. Yet for their chosen position to be in any way credible, these partisans still have to explain why they have needed to discredit so many people who are well-versed in the medical sciences. As this UK case study hopefully makes clear, whichever stance is taken in 'masks save lives/don't work', at least one senior academic at the prestigious Oxford University, plus hundreds more academics at other faculties around the world, will be on the other side. How far are you willing to go in your crusade of denouncements and discreditings just to uphold a specific interpretation of the still-ambiguous evidence as being both clear and irrefutable? Will you say that their political beliefs misled them, while yours miraculously had no effect on your truth-finding powers...?

Accepting this as an outbreak of pseudoscience, on the other hand, provides both an explanation for this otherwise incomprehensible lack of collective discernment, and a potential solution as well: restore debate over the key disagreements, and either conduct the required research or entirely withdraw the legal requirement for community masking in the UK (or wherever you happen to live). Without embracing dissent, there can be no legitimate scientific position on community masking at all, only the counter-productive war of bias-against-bias I have named pseudoscience. The sooner we accept this, the fewer lives we will lose to these two infections - the deadly SARS-CoV-2, and the even deadlier outbreak of pseudoscience it has fostered.

The longer we pretend that this issue is resolved beyond further dispute, rather than trapped in a limbo where such resolution is impossible to reach, the more people will die who did not need to. Not because some people wouldn't wear masks, but because we have collectively destroyed the ability of the sciences to do what they do best: to investigate ambiguous situations and explore all the possible explanations for the evidence gathered thus far. The science is clear? No, it almost never is. But our guilt in undermining the work of the sciences is all too clear, and for this I fear we should all feel greatly ashamed.

Comments welcome, but please don't comment angry! If this piece enrages you, please wait a short while before replying.

Every Science is a Discourse

We celebrate Albert Einstein as the greatest scientific genius of the preceding century, yet we tend to focus solely upon his theories in physics when we do so. In the decades since his death, we have continuously taken steps to place greater importance upon science and mathematics and to downplay the importance of the humanities. Yet Einstein himself would have cautioned against taking this path. He remarked, in a piece for the New York Times in 1952 (and please forgive his exclusive use of male pronouns, which at the time was entirely usual in English):

It is not enough to teach a man a specialty. Through it he may become a kind of useful machine but not a harmoniously developed personality. It is essential that the student acquire an understanding of and a lively feeling for values. He must acquire a vivid sense of the beautiful and of the morally good. Otherwise he – with his specialized knowledge – more closely resembles a well-trained dog than a harmoniously developed person. He must learn to understand the motives of human beings, their illusions and their sufferings, in order to acquire a proper relationship to individual fellow men and to the community. These precious things are conveyed to the younger generation through personal contact with those who teach, not – or at least not in the main – through textbooks. It is this that primarily constitutes and preserves culture. This is what I have in mind when I recommend the ‘humanities’ as important, not just dry specialized knowledge in the fields of history and philosophy.

How fascinating that at the time he was writing, the danger Einstein saw was that only history and philosophy would be taught in the humanities! Today, neither subject is a priority at most universities, and the humanities as a whole have been relegated to a lesser status next to so-called STEM (Science Technology Engineering Mathematics) subjects. Einstein, as this quote and others like it attest, was against this elevation of the sciences above the humanities, against the specialisation that has become the hallmark of contemporary higher education... he saw great danger on the path that we were already upon in the 1950s. We did not listen.

Today, even those of us who value both the humanities and the sciences for their unique contributions to human flourishing will tend to treat the former as worthy and the latter as useful. The impression is thus that the humanities are an optional extra, while the sciences are doing the real work in advancing human knowledge. Indeed, it sometimes seems that what distinguishes the humanities from the sciences is that humanities scholars merely ‘talk’ while scientists ‘do’. But this is an illusion brought about by the impoverished state of our philosophy of science. In actuality, every science is also a discourse. Not understanding this subtle point leads to a great many errors.

Giant Shoulders

The story we like to tell about Einstein's scientific work, and the tales we tell of Galileo and Newton as well, have a nasty habit of valorising these theoreticians and natural philosophers as lone heroes fighting for truth against the Church or some other orthodoxy (e.g. the ether, in Einstein’s folk history). Almost always, these tales are mythically exaggerated - even to the extent of falling into magical science, as previously discussed. Regarding Galileo, Paul Feyerabend is not the only historically-inclined philosopher of science to observe that it was the Church at that time who was more “faithful to reason” in the famous dispute. As Charles Taylor puts the matter: “If we look at the period we’re examining, we see that the mantle of sober scientists was often seized by the defenders of orthodoxy.” In each and every case, looking at what scientists came to accept afterwards is an inadequate way of understanding how they reached these new understandings, which always entailed disagreements being worked through by a community.

What I find particularly fascinating about the relationship between the sciences and their discourses is that contemporary scientists - quite unlike Einstein and natural philosophers like Newton - typically do not understand themselves as being in a discourse at all. I would suggest this shortcoming happens precisely because scientists today are trained in blinkered specialist degrees and do not receive a university education in the sense that Galileo or Newton would have understood, and that Einstein championed. For the natural philosophers, to go to university was to be prepared to understand the world as a coherent whole - a universe, hence ‘university’ (both terms coming from the Latin, ‘universus’ - whole, complete). There was no concept of humanities vs sciences for these scholars, and although there was for Einstein, he urged us to pursue both and considered the humanities to be so important that a good education ought to revolve around them.

A university education in the classical sense required you to understand, for instance, that Newton’s laws of motion spring from Newton’s writings, which were part of a mathematical discourse with his predecessors and peers. Not without good reason did Newton famously claim to be “standing on the shoulders of giants.” By contrast, while I was studying physics at the world-class physics department at the Schuster Laboratory in Manchester, every theory was presented to the undergraduates as if it had come from nowhere, just a magical free-standing edifice, a roof without walls to support it. Humanities scholars broadly understand their fields as sustained by their texts, while contemporary science students are taught misleading nonsense like ‘the scientific method’ instead (see the earlier discussion for why this is incoherent), although I note that, to their credit, no professor at the University of Manchester ever suggested any such thing to me. Alas, a great many people today seem to foolishly believe that ‘the science’ speaks for itself, yet that it does so through them, as indeed oracles claimed of the gods that spoke through them (another manifestation of magical science, perhaps...?).

Every scientist is part of a discourse - and they ignore this to their (and sometimes our) peril, most especially because training in one field does not automatically give you expertise in all fields. Newton is not the only one who stood on the shoulders of giants, every scientist (every scholar in every discipline, in fact) necessarily does so, and every mythic image that conceals this poses risks to scientific practice. As much as I have dabbled with being a polymath since graduating, I have only ever managed this by committing to learning new discourses and being willing to both listen and talk to practitioners in those other fields - as I had to do in 2011 with aesthetics and 2012 with the evolutionary sciences in order to write about them for my first two philosophy books. To conceive of the sciences as uncovering truth without borrowing those giant shoulders is to deceive yourself. The sciences are community practices, and have always been so.

Einstein's Hope for the Future

We take Einstein as a scientific hero with good cause, but like his natural philosopher predecessors he did not associate knowledge with intense specialisation, but rather with co-operation within and between disciplines. Remember that Einstein performed no experiments to verify his theories (although he designed one experimental instrument, his “little machine”, which does not appear to have worked) - he didn't need to conduct his own practical research; he could count on the physicist community to be curious enough to want to consider all the possibilities with care, because of their shared commitment to determining the truth of each situation.

As much as I admire the sheer elegance of his mathematical derivation of special relativity, which I studied in high school, there is an Einstein quote that for me sums up his genius more than anything else:

Perfection of means and confusion of goals seem – in my opinion – to characterize our age. If we desire sincerely and passionately for the safety, the welfare, and the free development of the talents of all men, we shall not be in want of the means to approach such a state. Even if only a small part of mankind strives for such goals, their superiority will prove itself in the long run.

The message here may not be immediately clear: it is not enough for scientists - nor indeed anyone seeking to serve humanity as a whole - to be siloed away in a specialism ‘perfecting means’. Yet because we have become so good at doing this, because our means (our technology) have become so powerful, we could easily achieve a state of near-universal human flourishing if only that were the goal we chose to pursue. It was Einstein’s hope that we would. Yet we did not, and still do not, in part because Einstein’s generation of scientists was the last to learn their science as a discourse, and thus the last not to look down upon the humanities as somehow lesser - a dismissal requiring the self-deceit that the sciences transcend human discourse to speak directly with the universe, or as Einstein would say, with God. Einstein would not have said that the moral truth was given by God, however, but discovered by us, through pursuing our disagreements in the humanities, which are at least as important as the sciences when properly understood.

The mission statement I take Einstein to be laying out here is not one I associate with spreading high technology indiscriminately around the world, thus bringing the community-rich ‘Third World’ down to the impoverished social state of our so-called ‘First World’, nor with dictating for all what a technological good life should (or worse, must) be. On the contrary, the safety, welfare, and the free development of the talents of all humanity will be quite seriously threatened by our technology if we do not change how we think about it, a topic I have explored in The Virtuous Cyborg. Rather, I take Einstein as participating in a prior discourse (a lowly humanities discourse...), that of the Enlightenment philosophers such as Immanuel Kant, whose major works Einstein had already read at age 16. I take it, therefore, that Einstein was proposing to work towards what Kant suggested was the “merely possible” future state where we can support everyone in pursuing their own chosen ends provided they do not prevent others from pursuing their own ends. Both pseudoscience and magical science disrupt our ability to do this, in part by obscuring the truth that both the humanities and the sciences are vital discourses we cannot afford to disrupt, a fact that has alas become obfuscated by this very division of human thought.

This schism in knowledge - a grenade whose pin was accidentally pulled by Kant in his rethinking of the university system - now threatens everything the Enlightenment strived towards. For me, the best reason to pursue philosophy of science - to take part in the discourse about the discourses of the sciences - is to help fulfil Einstein’s dream of ensuring the safety, the welfare, and the free development of the talents of all humanity, an ideal originally espoused by Kant, Mary Wollstonecraft, and others like them. In so doing, I join Einstein, Wollstonecraft, and Kant’s discourse, without of course ever speaking to them. It is my hope, vain though it might be, that more might still follow us - but I fear this will not happen without a seismic shift in our understanding of the contributions both the humanities and the sciences make towards our collective knowledge, and with it a vast and long overdue improvement to our philosophies of science.

Comments always welcome.

Magical Science

Elementary Dear Data

Arthur C. Clarke famously suggested that any sufficiently advanced technology would be indistinguishable from magic. This suggests another maxim: any insufficiently developed philosophy of science is incapable of distinguishing between science and magic.

We all have our own philosophy of science, our conceptual framework for understanding scientific topics. In the best case, our personal philosophy of science informs us of the limitations of scientific knowledge, allows us to put research into a wider context, and ensures we remember that the work of the sciences is still at heart an entirely human endeavour. Alas, few of us have such a clear view of the sciences. Far more widespread is a kind of pervasive mythos we might call ‘magical science’, which affords to the image of science unlimited future power, and to scientists an awesome capacity to divine the truth through singular experiments, like a Roman haruspex reading animal entrails to predict the future.

Magical science has the dubious honour of being the only superstition widely encouraged today. We are all too frequently adamant that science has all the answers, that science is the royal road to truth, that we can trust in ‘the science’... I notice that even the British Prime Minister has taken to invoking magical science in his speeches these days to validate his increasingly dubious actions. At heart, magical science may seem harmless, a mere rose-tinted vision of the work of scientists, one that tries to account for all the successes of our various research networks without any attempt at balance or insight. We typically overlook this kind of naive enthusiasm for scientific achievement on the basis that it's at least ‘supporting the right team’. Yet it becomes increasingly clear that blind support for science can manifest in ugly ways, even in ways that can prevent the sciences from working, plunging research into the debilitating condition of pseudoscience, as previously discussed.

The perceived infallibility of the sciences as truth-seeking procedures clashes worryingly with the necessity of scientists making mistakes, and thus magical science leads to anger at scientists when the actual scientific work is not as wondrous as it is imagined it should be (as with the ugly trial that followed the 2009 L'Aquila earthquake in Italy, where scientists were blamed for failing to predict the deadly tremors), or when any scientist speaks out against a claim that has been proclaimed unshakably true by its advocates. It is precisely because magical science is incapable of distinguishing science from magic that it represents a far greater danger to scientific endeavours than other philosophies, perhaps even so-called ‘anti-science’ philosophies. What deceives us here, what elevates scientists to their misguided role as flawless augurs rather than researchers struggling with ambiguous data, are the bad habits we have learned from the manifestations of science in fiction, where magical science is the norm. If we wish to see the work of the sciences with clearer eyes, we may have to start by putting some of the most iconic characters in fiction on philosophical trial.

Sherlock Holmes and the Flawless Investigation

It is sometimes remarked that in creating Sherlock Holmes, Sir Arthur Conan Doyle produced the first hero of ‘the scientific age’. The Victorians were the ones who coined the term ‘scientist’ and it was their obsession with the sciences that set the scene for the unfolding technological transformation of the world over the next century and a half. We tend to treat the character of Holmes as significant mainly for crime fiction, as the archetype from which all whodunits descend - but Holmes, quite unlike a Raymond Chandler or Agatha Christie detective, is always a practitioner of magical science. Partly, this proceeds from the inherent parsimony of storytelling whereby all questions will eventually be answered because everything is there by the author’s design. Partly, however, it proceeds from Holmes’ essential power - which upon closer inspection is not deductive reasoning at all, but rather the infinite convenience possible solely in literature.

Doyle gives Holmes a quite impossible access to every conceivable fact as a starting point, such that a berry stain or the smell of a particular tobacco can certainly be identified, and then (to pile on the absurdity) Holmes by purest chance always encounters a set of circumstances that allow for only one viable interpretation. This particular brand of tobacco, for instance, is sold in exactly one place in London... We thus end up admiring Holmes’ purportedly scientific form of investigation while what we ought to admire is the way Doyle effortlessly conceals the magical science entailed in this depiction by making it seem as if all of Sherlock’s deductions (and inductions) were strictly logical. Doyle has contrived a set of circumstances that Holmes, with his unlimited catalogue of facts, can be certain to solve. This makes Holmes a disastrous role model for scientists (or indeed, detectives!) since it is only through the meticulous construction of literary contrivance that he possesses any investigative power at all. This becomes clearest when Holmes relies upon facts we know are false - such as the ludicrous snake plot device in The Speckled Band, which entails behaviour implausible to coax out of any reptile. Holmes’ claims to be a man of science are rather fraudulent behind the scenes: he is simply the locus of a mythic depiction of magical science.

Neither is Holmes the only such character. Both Spock and Data in the worlds of Star Trek share this power of magical science - also manifested in these shows by the tricorder, which like Holmes spits out every required fact on demand and without error. Or consider Doctor Who from the third Doctor onwards: anything necessary is certainly known by the Time Lord, except when the story requires a convenient (and often temporary) amnesia for dramatic effect. That both Data and the Doctor had a spin at being Baker Street’s most eligible bachelor is not accidental, nor perhaps is Steven Moffat’s concurrent time as showrunner for both Doctor Who and Sherlock... Magical science heroes seem to reaffirm our faith in the power of scientific knowledge, while also playfully exposing the quirky personalities of scientists. House, The Big Bang Theory, and much more besides all participate in a literary tradition that stems from the Sherlock Holmes tales, and is now seemingly dominated by his science fiction proteges.

Yet these are not scientific heroes, but magical science heroes. They have exactly the facts and the circumstances to answer perfectly every time, without ever having to confront the ambiguity, indeterminacy, and incompleteness of an authentic scientific problem. They are to science what Superman is to police officers: naively idealized caricatures. They find the answers solely because they live in stories where uncovering the truth is possible by design. This is a wildly misleading template for scientific truth, and although we know these are ‘just’ stories, we somehow import our wilder beliefs about the sciences into our everyday thinking unless we are extremely careful. If we are to break this spell, we need a philosophy capable of distinguishing science and magic - and for this, we need a clearer understanding of ‘scientific truth’.

Desperately Seeking Truth

Even if we start with the acknowledgement that the sciences are capable of discovering or affirming truth, the question of what might qualify as a ‘scientific truth’ is far trickier than it seems. As the preceding discussion on pseudoscience made clear, we cannot simply append ‘scientific’ to known truths without distorting the essential ambiguities of the research process where we cannot in practice know if the apparent truth of a researched claim will hold in the future. In fact, we have a choice. We could align ‘scientific truth’ with the unshakeable deep truth of reality and thus admit that the claims asserted by scientists cannot be known as truth at all (effectively contracting the domain of scientific truth to concluded research programmes like optics). Or else we can align scientific truth with the body of beliefs held by scientists, with the inevitable consequence that such truths can be later revealed as false - or even abominable. We don’t even have to go back a century to find all manner of racist, sexist nonsense asserted as truth by those who identified as scientists.

Now those who buy into magical science have an easier job here, but only by being wildly dishonest about both truth and scientific methods. According to magical science, scientists uncover truth infallibly so all claims asserted by scientists are scientific truth. Thus if and when the circumstances shift we can ‘debunk’ or ‘discredit’ those responsible and say they were not really scientists at all, or even exclude their claims from consideration in the first place! This is where ‘pseudoscience’ has been used as a label, although as I have argued previously it is not a terribly viable way of using the term. Babette Babich has made even stronger - and oft misunderstood - claims about the way the discrediting associated with the term ‘pseudoscience’ serves as a dogmatic attempt to demarcate legitimate science, while all too frequently preventing any scientific enquiry from even beginning. Thus when this particular word comes out, it narrows scientific knowledge by declaring certain topics forbidden and out of bounds - and woe betide the researcher who goes on to try to report experimental results from such verboten fields...

The highly problematic implication of every attempt to discredit and thus demarcate ‘science’ from ‘pseudoscience’ must be that we cannot know when scientists assert a claim whether it will later need to be ‘debunked’. Thus faith in magical science is inevitably a distortion of the truth - for things we will say are scientific truths on this philosophy may later be ‘discredited’, or even discredited before they are considered at all. The alleged truths of magical science are thus only defended by ignoring the inevitable consequences of the inherent revisionism of scientific practice and pretending that the current consensus among researchers is ‘more true’ than it was yesterday and thus that now (and by implication, only now) we can trust everything scientists say as long as we are standing guard for those pernicious pseudoscientists who ruin it for everyone. To say that this is dangerous nonsense is easy; to replace it with a more sound philosophy of science will be much harder.

There might be a way out of this maze, but it would require us to think differently about the relationship between truth and the sciences. Part of what deceives us here is our desire to understand the truth in terms of a set of valid statements. Since we can point to scientific concepts we abandoned, like phlogiston (which was a hypothetical substance that made combustion possible), we want to assert a gradual improvement in the accuracy or scope of our ‘book of facts’. “We would not be fooled by phlogiston today,” we might think. Yet phlogiston was an important - and arguably entirely scientific - proposal that was merely discarded when our understanding of chemistry shifted such that combustion could be thought of in terms of a chemical reaction with oxygen.

The brutal truth of the ‘book of facts’ is that such a collection of statements today would theoretically contain far more ultimately false claims than it would in the 1770s. Simply because the number of scientists and the diversity of research fields have increased so dramatically, we are now paradoxically more wrong than researchers in the 18th century (in terms of sheer numbers of errors made) - the inescapable consequence of asking both more and more difficult questions. What makes it feel as if we are now more right is knowing that phlogiston would be replaced by a new understanding of chemical reactions and thus combustion and so forth. But this is largely an illusion caused by examining successful research programmes in hindsight.

Similarly, when I say phlogiston was ‘scientific’, I am projecting with hindsight since the term ‘scientist’ was not coined until 1834... researchers in the 1770s would not have described anything they were doing as ‘scientific’ - it is our desire to paint the sciences as something with a history of more than two centuries that makes us ‘claim’ both phlogiston and oxygen (not to mention Copernicus, Galileo, Newton and so forth) as part of the story of ‘science’, rather than the natural philosophy that those involved would have stated they were pursuing. Thus our ‘book of facts’ not only contains more errors than our predecessors two and a half centuries ago, it is not even entirely honest about its relationship with its own past. Add to this the unavoidable truth that this imagined ‘book of facts’ does not exist (for all that encyclopedias and their successors have wished to fulfil this role) and it begins to feel uncomfortably like we are deceiving ourselves - as if we have all fallen for the seductive confusions of magical science.

Legitimate Practices

We want to defend our intuitive impression of the sciences as truth-seeking, and also (in some nebulous sense) successful at doing so. How do we do it?

One option we can consider is that which I proposed in Wikipedia Knows Nothing: to switch our focus from facts (true statements) to practices (skills and equipment). To know how to use something - a polymerase chain reaction, an interferometer, a fractional distillator - is more a matter of knowing what to do than it is a ‘book of facts’, even though that knowledge also produces facts related to the equipment used (and any theories deployed to give a context to the reading of the instruments). Thus an astronomer armed with geometric theorems can use an interferometer to measure the diameter of stars, while an engineer can use an interferometer and the wave theories of light to measure very small objects precisely. The practices associated with both the equipment (the interferometer) and the theories associated with each specific usage give rise to facts - in this case, distances. The difference lies in what legitimizes the activity in question: on the usual conception of knowledge, facts counted as legitimate knowledge only if they were true and justified for the right reasons - which actually provides no means of knowing what is or is not legitimate, since this criterion for legitimacy requires an appeal to something beyond the situation (the truth) that we cannot access directly. Conversely, when we view knowledge as a practice, what makes the facts legitimate is that we are using the tools correctly. In this context, anyone with the relevant knowledge of the tools entailed can verify the legitimacy of the practices used and hence the facts reported.

On this understanding of knowledge, unlike an appeal to the truth, we can construct a viable understanding of ‘scientific truth’, since certain equipment, certain theories can be uncontroversially attributed to the sciences, and their correct usage can be judged by anyone else with access to the same knowledge practices. On this path we can therefore distinguish between scientific truth (facts emerging from legitimate research practices) and errors, provided we allow the disagreements to be properly explored in any given research community. However, as Babich warns, this cannot happen if we rush in with a dogmatic cry of ‘pseudoscience’, since every attempt to discredit something a priori entails an outright refusal to think about a given topic at all. Ironically, such attempts to discredit effectively cause an outbreak of the condition of pseudoscience, in my sense (a state of disrupted communication where scientific work can no longer be pursued), since whomsoever speaks this word with the intent to discredit (and thus ignore something) signals the very breakdown of legitimate scientific disagreement required to understand whatever is (not) being discussed.

The deeper problem we encounter when we look more clearly at how scientists discover or verify truths is that the claims that are asserted soon exceed simple assertions of facts. Once they do, it requires another set of knowledge practices to disentangle the relationships between facts and conclusions - and these are not strictly scientific at all, for all that scientists engage (unknowingly) in these kinds of interpretative philosophical practices every time they assert anything but the most trivial of claims. Indeed, precisely the crisis of contemporary sciences is that their application is not a scientific practice, but a philosophical one - and Einstein’s generation may have been the last where scientists spanned these disciplines rather than retreating behind specializations that narrow, rather than widen, the scope of our collective understanding.

It is small wonder that we seem to have arrived in a “post-truth” world: the attempt to make the only acceptable truths those that flow from scientific endeavours renders a great many of the truths that matter impossible to adequately discuss, precisely because the important truths (those that pertain to what we ought to do, for instance) could never be scientific and thus cannot be established solely by an appeal to the facts. Yet we keep looking to scientists to give us a certainty that is not in any way available through scientific methods - and as the L'Aquila trial in Italy demonstrated, we will turn upon those who do not live up to our insanely unrealistic expectations and even accuse them of committing crimes when they, inevitably, make mistakes. But it is we that have failed, by falling for such an impoverished understanding of the complexity of scientific research as that of magical science.

Breaking the Spell

The needs of a narrative require magical science for the very same role as arcane magic - as a plot device limited solely by our imagination - and the two are (in more ways than we tend to acknowledge) equivalent, exactly as Clarke foreshadowed. The problem is, the actual work of the sciences, the global cybernetic collaboration of scientists that began under that name in the 1800s and continues today, is magical solely in its lustre and not in its details. Yes, the collective technological achievements facilitated by the work of countless scientists are now already indistinguishable from magic in a great many situations. But the work of scientists is not magic, and is certainly nothing like the magical science of a Sherlock Holmes fable. When we mistake the two, when we treat a human who conducts scientific work as someone wielding all the sorcery of magical science to know, automatically, everything that needs to be known, we are not supporting scientific truth-finding at all, but making it far, far harder, and in the worst cases, rendering it entirely impossible.

I will not say we must stop enjoying the fantasy of magical science in our stories - escapism is mostly harmless, after all, even if it is not entirely blameless - but is it not perhaps about time we stopped pretending that our scientists are superheroes with magical powers to determine truth? Scientific truths are extremely specific, and much narrower than we want them to be - they are at their most precise precisely when their claims are most limited. The heroism of actual researchers is of a patient, humble kind, that requires time and substantial disagreements to bring about. It is neither as spell-binding as Holmes’ contrived deductions, nor as charmingly detached from human fallibility as the inhuman resourcefulness of Data or Spock. Nor has any living scientist access to the unquenchable moral certainty of the later incarnations of the iconic Time Lord to guide them. These role models all imply a role that is impossible to bring to life: we should be careful not to buy too deeply into such implausible exemplars, without dismissing entirely the hopes and ideals that they embody.

Actual scientific practice is amazing, but it is neither miraculous nor supernatural. It is rather mundane in its details, which never entail perfectly prophetic experiments, and always require a great deal more arguing about the possible interpretations of the facts than literature has ever depicted. When we cannot distinguish science from magic, we obscure scientific truth and the immense and heroic efforts required to produce and understand it. We do all our scientists a disservice when we mistake them for sorceresses and wizards, and we entirely dishonour the scientific traditions when we censor or revile researchers for not living up to our hopelessly elevated expectations of their truth-discovering powers.

If we cannot distinguish science from magic, we need to either improve our philosophy of science or else remain silent on scientific topics. As Holmes remarks, Watson’s grand gift of silence makes him quite invaluable as a companion - and scientists, much like Holmes, often need us to pay close attention to their work and their disagreements, so that together we can eventually reveal true claims about our world. When we work to silence and discredit those we disagree with, rather than remaining silent so we might hear the disagreements we are denying, we have destroyed the very conditions for any kind of legitimate scientific investigation to occur. If we truly wish to be friends of the sciences, perhaps we too ought to know how to hold our tongue and try to listen to the quiet whispers of the truth when the game is afoot.

Comments always welcome, especially the polite ones!

What is Pseudoscience?

Phrenology

When we talk about something being ‘pseudoscience’ what we tend to mean is that it’s ‘not true’, and we reach that conclusion because what we mean by pseudoscience is something that is ‘not scientific’, and we associate the sciences with truth. Yet the alternative to truth is not automatically falsehood; there is also ambiguity, indeterminacy, and incompleteness to consider. What’s more, if we call things scientific only if they are true, we are admitting that we don’t actually know what is or isn’t scientific until some future time when the arguments about some given topic are finally resolved. There is a confusion here worth examining closely.

Implausible Methods

Ask someone to explain how the sciences work and chances are they will tell you about the scientific method:

1. Observe a situation
2. Come up with a hypothesis (an untested theory) to explain a phenomenon
3. Devise an experiment to test whether the hypothesis is valid
4. If the experiment is successful, the hypothesis becomes a theory. Congratulations, you’ve discovered scientific truth!

This description is so far from adequate that it is a wonder that so many university students are taught it! Quite apart from the way it sets aside the most difficult aspect of scientific practice (the interrelationships of existing knowledge on any subject) it fancifully imagines that scientists determine truth simply by performing just one experiment, as if scientific truth were as simple as revealing a scratch card – three microscopes, we have a winner! Rather than an adequate description of how contemporary scientific processes operate, this is more akin to a catechism recited in order to bolster faith in the ability of the sciences to reveal truth – and as such, it obfuscates the complexity of the relationships between experiments, theories, and truth, and prescribes a method almost certain to lead to error every time.

If a hypothesis and experiments are indeed the necessary elements of a claim that a certain activity is ‘scientific’, then anthropology, economics, almost all of the evolutionary sciences, and a fair amount of biology and medicine are all doomed to be ‘unscientific’. These kinds of accusation are indeed sometimes advanced – a furore occurred in 2010 when the American Anthropological Association decided to remove the word ‘science’ from its mission statement, despite many of its members feeling this was a consequence of a narrow and reductionist description of the sciences. There are also questions here about concluded research programmes: no-one has needed to perform further experiments in optics, for instance... has it ceased to be scientific? Or did it earn its place in scientific heaven by being a good research field while it was still alive...?

Tied up with this confusion is the idea that the sciences are ‘value free’, i.e. that scientific research is inherently unbiased. This is a naive mistake to make, and on two counts. Firstly, as Nietzsche warned back in 1882, we are “still pious” when it comes to scientific truth – all scientific research rests on a core moral value, a commitment to the pursuit of truth. Without this, the endeavours we undertake in the name of science make no sense; ‘valueless science’ is entirely implausible. Secondly, and even more importantly, scientists are still human, and as such they have their own values. The attempt to purge the sciences of values is nonsensical and indeed impossible! No matter how much you try to present scientific research as a purely rational, emotionless, valueless activity, scientists will continue to pursue research motivated by their own moral values (to save lives or to save the planet, to advance knowledge or technology, to win fame or wealth etc.). The idea that having these values is somehow unscientific is to doom all the sciences to oblivion! The values and the facts are intimately related or, as Hilary Putnam described it, entangled. The idea of a science without values is pure nonsense.

At this point, you have a choice in how you respond to this critique of ‘scientific method’, and this in itself may be illuminating. On the one hand (and especially if you’ve spent any time at all thinking about philosophy of science), you can happily cast off this quite ridiculous dogma and still maintain a viable understanding of the sciences without it. That’s the easy way... but it still has some hard consequences. Or alternatively you can dig in your heels and try to cast out the demons of those that don’t follow ‘the method’, attempting to purify research of pseudoscience, meaning in this case ‘not following the scientific method’, but usually playing out by simply deriding counter-claims against whatever dogmatic position has been adopted on any given point. That path is so misguided it’s a wonder that plenty of otherwise intelligent people seem to fall for it.

As it happens, the sciences themselves show us why this purported ‘scientific method’ is unworkable. Psychology – which has been staunchly dedicated to ‘the method’ yet still gets cast out as ‘soft science’ – has provided a lot of neat titles for the various kinds of human bias. Defenders of ‘the method’ like to evoke hindsight bias to defend the need for hypotheses – “if you don’t make a hypothesis, you’ll just end up seeming to expect the result you get!” But these cognitive biases cut both ways: if you do make a hypothesis, you are now prone to confirmation bias – cherry picking your data and references to support the position you have chosen. This is why medical sciences insist on good quality evidence from randomized trials where even the experimenters don’t know what’s going on until all the data is in. We know from bitter experience that when you set out to prove some specific claim, you are more likely to find (and report) the evidence that supports what you have chosen. In other words, not having a hypothesis condemns you to bias, and having a hypothesis condemns you to bias! What makes something legitimately scientific cannot be the elimination of bias, or else nothing could ever be sufficiently purified of values to qualify. There has to be another way of conceptualising the difference between ‘science’ and ‘pseudoscience’ if either is going to have any legitimate meaning.

Ghosts of Science Past

The celebrated historian of science, Thomas Kuhn, lays out the question of pseudoscience at the very outset of his project to understand the nature of scientific change. The problem as he presents it is that if we judge the historical precedents to our scientific practices as pseudoscientific (he talks of them being ‘myths’), then we have to acknowledge that pseudoscience can be pursued and justified by the same methods and reasons we now use to defend science against its alternatives. Yet if we call these artefacts of older research ‘science’, then we have to accept that the sciences were beset by wild beliefs that today we would find unthinkable (even abominable). He argues very persuasively that from a historical perspective we have no choice but to accept that “out-of-date theories are not in principle unscientific because they have been discarded.”

Kuhn’s position is widely accepted today – yet it runs directly contrary to the view of Sir Karl Popper that the boundary of legitimate science is falsification – the ability to have a theory proven false. Amazingly, this viewpoint is also widely accepted today, even though the two approaches are essentially incompatible, and indeed were the basis for an unresolved dispute between the two academics. Kuhn saw Popper’s falsification as applying solely to those rare periods of scientific upheaval (paradigm shifts) where one way of thinking replaces another. His view was that ‘normal science’ never dabbles in big theoretical changes at all, but is always about solving problems using the current theoretical apparatus. Again, these two viewpoints are entirely incompatible, yet both are widely supported views on the sciences.

Popper suggested that Kuhn’s approach committed him to saying that astrology is a science because it entails problem solving within its own paradigm. Kuhn denied this, and argued that in the context of astrology “particular failures did not give rise to research puzzles” and thus astrology was never a science. Both men died without resolving their disagreement; I think it clear, however, that both are wrong about astrology. We cannot – as Kuhn himself warns – back-project our current scientific judgements upon prior practices that were claimed as sciences at earlier times without distorting what we are trying to assert. To do so is to deny the very capacity for scientific revolutions that Kuhn’s account provides. The suffix ‘-ology’ by itself is a clue that the practices of astrology had at one point in its history a claim to knowledge, and the question of whether astrology was ever a science in Kuhn’s terms is a historical investigation requiring far more application to the task than either Popper or Kuhn were willing to commit. As such, this question is in fact still very much open to debate! But nobody wants to do so, because everybody with any skin in this game wants to show that astrology isn’t a science and never was – thus again preempting any possible research except that which will prove this one tenuous point.

If Kuhn’s historical theory (albeit not Kuhn himself) is able to defend against Popper’s attack, Popper’s falsification criterion has no equivalent defence against Kuhn’s criticisms. Indeed, Kuhn expressly doubted that falsifying experiences ever really happen. He did not need the psychologist’s label ‘confirmation bias’ to realise that giving up a scientific paradigm is a major conversion for anyone (comparison with religious conversion is quite justified here), made all the less likely by the problem that if every failure of a theory in the face of contradictory evidence were sufficient grounds for rejecting it, all theories ought to be rejected at all times! That’s because the very reason that Kuhn’s ‘normal science’ has problems to solve is precisely that no theory is capable of fitting all the observations it seeks to explain. As the French science studies scholar Bruno Latour puts it, the theories are all under-determined with respect to the evidence – and this conclusion is unavoidable if you spend time examining what scientists actually do rather than merely reciting the catechism.

But this does not mean there is no way of distinguishing science from pseudoscience, even though we have to accept a certain amount of historical contingency after Kuhn (or Foucault – he gets to the same place via a different route). What we might reasonably suggest as a provisional criterion for calling something ‘pseudoscience’ is a combination of Popper and Kuhn’s claims: when even the possibility of falsification is removed, or when the investigative practices cease to produce further enquiries in response to the questions the previous research implies, the claim to be scientific evaporates. As chemist-turned-philosopher Isabelle Stengers attests, successful experiments in the sciences give rise to new research questions. When they no longer do so, it is because the field has managed a complete description of its subject matter (as with optics). The difference here is that such ‘completed’ fields have produced theories capable of making unfailing predictions. And such cases are vanishingly rare.

The Condition of Pseudoscience

What tied us up in conceptual knots here, and kept Popper and Kuhn from reaching an accord, is that we want to level the accusation ‘pseudoscience’ at fields like astrology or phrenology. But understanding the sciences as an ecology of practices, as Stengers has brilliantly discussed, shows that this is not the only way we might identify a breakdown of Kuhn’s ‘normal science’. We could (indeed must) give up the idea that ‘pseudoscience’ is a way of trashing any theory, research, or evidential claims we don’t agree with. On the contrary, I propose that the clearest way of understanding pseudoscience is as a condition within a scientific discourse that undermines or destroys its power to investigate.

Thus, to continue with phrenology’s original models of mental function after animal experiments began to show that its suggested brain regions did not hold up to scrutiny would have been to enter into a condition of pseudoscience, because its practices could not produce viable new research questions in the light of this new evidence. It would, however, be wildly unfair to suggest it was always in this condition: it is from phrenology, after all, that the idea of the brain being the organ of the mind originated, and while most of its specific claims did not pan out, it remains an important part of the backstory of neuroscience. If phrenology had not spread as working-class ‘popular science’ (thus earning the enmity of Victorian cultural elites), we might well have kept the name ‘phrenology’ (science of the mind) rather than renaming brain research ‘neurobiology’. It’s not at all clear to me that phrenology was ever in the condition of pseudoscience, except perhaps at the very end – although anyone practising it today would be behaving very oddly indeed.

Pseudoscience is thus akin to an ailment afflicting scientific practices that have become shorn of the logic of legitimacy provided by their current paradigm. The sign that a field has fallen into pseudoscience is not the truth or falsehood of its claims as such. Indeed, these will frequently not be in any way settled, forcing us into highly suspect retrospective accusations, such as that levelled routinely at phrenology. Rather, you can see the condition of pseudoscience occurring whenever scientists give up the values that motivate their enquiry - when they purposefully falsify data, or conceal it ‘to defend the truth’, or give up experiments and data gathering entirely in order to maintain a status quo based upon whatever happens to have been previously claimed. And once we see this, we are forced into the realisation that we are currently in the condition of pseudoscience in several entirely legitimate research fields, and over the last year we have had the audacity to defend the breakdown in the medical discourses that has put us into a state of collective pseudoscience as “following the science”!

The truth is, we cannot ‘follow the science’; it is the science that must follow us. For the values of science are those of discovery and verification, and these have a purpose only in so much as they serve to resolve those questions our other values compel us into exploring. Thus, while medicine commits to ‘first, do no harm’ as a supreme value governing its own practice, that particular principle sets no positive goal at all. The medical practitioners and the cybernetic networks supporting them take on the objectives that we have collectively given to them. If the circumstances that follow from that pursuit make falsification of a medical claim impossible, or provide no means to reliably answer the relevant medical questions, those medical practitioners affected (and anyone trusting their judgements) enter into the condition of pseudoscience, a (temporary) renunciation of the values of scientific practice, capable of precisely the great harm doctors are sworn to avoid. For the collective medical power we exercise cybernetically always causes some degree of harm along with the pursuit of its goals – requiring medical practitioners, on pain of becoming (temporary) pseudodoctors, to commit to studying the impact of any procedure or intervention attempted or else risk violating all the values of contemporary medical science. This is an extreme example, but it is also an extremely important one.

Now whether the values of discovery and verification have always conditioned the work of scientists, and whether they always will isn’t the point, for they are our moral requirements for the sciences now and on this point we quite miraculously do not disagree. In so much as pseudoscience is a phenomenon, it is merely a consequence of recognising that scientists are human, and what makes them seem otherwise is the remarkable power that they bring to bear when cybernetically linked into singular networks, working together – not just by co-operating but just as importantly by disagreeing, refining the research questions by honing the essential ambiguities into points sharp enough to penetrate our ignorance by pursuing further investigations and experiments. Pseudoscience prevents that dialogue from happening, and breaks up the network connections, making research harder or preventing it entirely, setting bias against bias and thus blocking the communication essential to verification, which is necessarily a distributed activity.

When verification stops, pseudoscience has begun... it goes away when we can go back to listening to those objections that our human bias prevented us from hearing. The ugly truth of it all is that fear, anger, and self-righteousness spread pseudoscience all too easily, yet banishing it is as easy – or as impossible – as going back and listening to the objections in order to work out where in the maze of ambiguity, indeterminacy, and incompleteness the truth of each disagreement can be found.

More philosophy of science soon.

How To Be Yourself

Perhaps the first mistake we all make as individuals is to think that we know how to be ourselves. When we object to someone else that "nobody can be me but me" we're being entirely truthful, but we should not deduce from this that being yourself is easy.

The Danish philosopher, Søren Kierkegaard, puts it beautifully:

There is a fear of letting people loose, a fear that the worst will happen once the individual enjoys carrying on like an individual. Moreover, living as the individual is thought to be the easiest thing of all, and it is the ethical that people must be coerced into becoming. I can share neither this fear nor this opinion, and for the same reason. No person who has learned that to exist as the individual is the most terrifying thing of all will be afraid of saying it is the greatest.

The individual person isn't a loner survivalist cut off from society, but one being among the many others amidst whom they live. When we angrily desire our individuality, what we are hungering for is an escape from the ties that bind us to these other beings that intersect our lives – but this we cannot achieve except through the self-destructive intervention of breaking these ties one-by-one. Every time you resort to this drastic step, you sever yourself from another piece of your individuality, for it is all these random, circumstantial connections to other beings and things, places and people, that are the raw materials from which your life is built. Without them, you are not an individual; you are nothing, both because it is these circumstances that brought you to life and kept you alive ever since, and also because who you are flows from where you are coming from.

Now it is difficult for me to speak about this question of becoming yourself, because I do not want it to sound as though I am claiming that I know how to be you better than you do. Obviously, I don't even know who you are as I write this! Rather, what I am trying to do is offer a warning that being yourself is much harder than it sounds. It is always a dangerous game, giving advice, and often disastrous when advice is given in anger or haste, and the last thing I would ever want to do is interfere with anyone's exploration of how to be themselves. Besides, as Kierkegaard warns, whenever we try to tell others how to be themselves we "betray ourselves by our instantly acquired proficiency, and fail to grasp the point that if another individual is to walk the same path, they have to be just as much the individual and can, therefore, be in no need of guidance, least of all from anyone anxious to press their services upon others…"

However, I can see little harm in pointing out that whatever being yourself is going to entail, it might help to understand what you are...

What You Are

We tend to assume we know what kind of thing we are – yet there are many different choices for understanding what you are, all of which can work out for certain people and any of which can lead to disaster when undertaken thoughtlessly.

Take the case of disbelieving in the reality of your existence. If you come to think that you don't really exist because you are just an illusion brought about by an elaborate hoax of your biology, then there is no possibility of being yourself because there is no you to be. This seems like a terrible start to any process of self-discovery! Yet this self-negating way of understanding what you are could also be illuminating, as it is to Buddhists and Hindus whose conception of appearances as essentially illusory offers a way of discovering yourself through a denial that your thoughts and desires are the most important part of your existence. In this, as in so much in life, the same assumptions can lead to radically different conclusions.

Most likely, you view yourself as a consciousness inhabiting a body, with the latter generating the former via the biology of neuron connections that grants you free will and powers of imagination. In which case, your view is not terribly different from that of people who lived hundreds or even thousands of years ago, apart from the name given to the kind of thing you are. As the British philosopher Mary Midgley made clear:

When the sages of the Enlightenment deposed god and demystified Mother Nature, they did not leave us without an object of reverence. The human soul, renamed as the individual – free, autonomous, and creative – succeeded to that post, and has been confirmed in it with increased confidence ever since. Though it is not now considered immortal, it is still our pearl of great price.

The danger in buying into a purely individual conception of who you are is that it will make your existence appear to be something emanating solely from inside your mind. But that's not the case – who you are and what you are may have its locus of experience inside your mind, but it is constituted and sustained by the network of connections and situations I mentioned above, the raw materials from which you make yourself. We take great risks with our selfhood, therefore, if we think of what we are as something wholly sealed inside our heads.

Inside Out

Whatever way you settle upon for understanding what you are, you then have to negotiate the tension between what is apparently inside (your mind, your memories) and what is apparently outside (your social connections, your lived environment). Psychologists have finally started to come around to the idea that your mind is partly constituted by this exterior environment. Compelling recent concepts like 'enactivism' and 'embodied cognition' explore a path cleared by philosophers, especially the German philosopher Martin Heidegger. Heidegger saw our situation as one of being thrown into a world, the circumstances we are born into being the very condition for discovering what we mean by ourselves.

But how do we distinguish between inside and outside? Many teenagers try to break ties with their family or the traditions of their birth culture as an act of asserting their individuality... but the rejection of these relationships becomes in itself an act of participation – participation in exile, if you will. Active rejection of family or tradition still defines the inner self in these cases precisely by that rejection. Rather than severing that connection, we simply take up a different form of connection – that of opposition or withdrawal.

To navigate this problem requires that we have access to some concept of what is good or right for us, but this cannot simply be to act on our hunches – that would risk removing ourselves from any viable standards of judgement. Our ability to make accurate judgements depends, after all, upon our tools for thinking (our languages and terminology) which are sustained by communities of practice. It is for this reason that the Canadian philosopher Charles Taylor explored an "ethic of authenticity" that emerged in the last century or so:

To know who I am is a species of knowing where I stand. My identity is defined by the commitments and identifications which provide the frame or horizon within which I can try to determine from case to case what is good, or valuable, or what ought to be done, or what I endorse or oppose. In other words, it is the horizon within which I am capable of taking a stand.

This is part of the reason why encounters with new communities of practice can be so transformative – whether it is a religious tradition from outside of our prior experience, a community of care based around a sexuality or gender identity we had not previously considered as applying to us, a medical diagnosis that connects us to other people with whom we share a commonality of experience, or a political faction that speaks to us from outside of our prior assumptions, the discovery of who you are frequently involves a voyage outside of your mind and into revelatory new connections with others.

Yet each encounter of this kind also risks deceiving us – especially when we have actively broken ties to our previous communities. The discovery of a new network of care that we can see ourselves belonging to is alluring, because as social creatures we crave belonging even though other humans fundamentally annoy us (as the Prussian philosopher Immanuel Kant remarked, we are "sociably unsociable"). But this inherent appeal of belonging to something cannot resolve the question of whether the identity we are trying on is an authentic solution to the problem of ourselves. Yet by the same token, nobody watching 'from the outside' is going to be able to decisively determine what is and isn't authentic on our behalf. We are all inside and outside the same boats in this regard.

The danger of treating the dizzying array of possible identities presented to us as merely a buffet or a shopping catalogue to choose from is that we fail to notice how each encounter with every possibility of understanding ourselves is going to have an effect on who we are becoming. If we think of who we are as just a single identity where we simply have to browse the shelves until we find "the right one", we will end up reducing ourselves to a mere caricature of who we could be if we took the time to discover authentic connections with all the many facets of who we are and might be.

Paradoxically, discovering how to be yourself requires other people, both as examples to understand, and as a sounding board as we work through the challenges of understanding how the different shards of who we are fit together into a coherent whole. Even if you were "born this way", you still needed to learn about 'this way' by seeing these possibilities for existence acted out in others. Identities are sustained by their communities – and counter-intuitively, they are strengthened by the opposition of other communities that deny their legitimacy, for we are never bolder than when we feel threatened.

The problem of being yourself has no quick fix, and certainly cannot be solved by ordering your new self online. It requires you to do the work, thinking and feeling through your existing connections and communities, taking on new potential aspects of yourself with care and not rushing the process of discovery by letting your enthusiasm for the new lure you away from parts of who you are that are far more important than their humdrum familiarity might suggest.

How do you discover how to be yourself? The same way we learn anything: you watch other people become themselves, and then try to make some of what you encounter work for yourself. Sometimes it will. Sometimes it won't. Sometimes it will seem impossible that this could be you, but you may still later come to see how it all fits together. It's a mystery to solve, and only you can solve it – but you will have a much greater chance of success the more you listen to others and recognise that you can only be yourself with others. Alone, you are trapped 'inside' with your fears and your anger – only together can we find ourselves.

Prepare yourself for the adventure of a lifetime.

The opening image is an untitled painting by KwangHo Shin, which I found here. As ever, no copyright infringement is intended and I will take the image down if asked.

Scorsese vs Marvel Studios

Veteran film director Martin Scorsese could scarcely ask for better publicity for his new film, The Irishman, than picking a fight with the box office powerhouse that is Disney's Marvel Studios. In an interview for Empire magazine, Scorsese was asked about Marvel movies and replied:

I don’t see them. I tried, you know? But that’s not cinema. Honestly, the closest I can think of them, as well made as they are, with actors doing the best they can under the circumstances, is theme parks. It isn’t the cinema of human beings trying to convey emotional, psychological experiences to another human being.

This is a much more interesting statement than it might first appear. Before delving into it, however, it is worth acknowledging that Scorsese would never have had anywhere near as much coverage for his new feature if he had not decided to position himself against one of Disney's two big-ticket purchases, both of which were acquired to fill a gap in the media corporation's portfolio, which was always lacking in action franchises. I don't think it greatly matters whether this is a planned PR manoeuvre from the 76-year-old director or a lucky striking of gold by one of Empire's writers; either way it's a win for both parties, since the battle line it draws guarantees more attention for both of them, and mobilises the legions of Marvel fans for free publicity: negative reactions online – especially those guaranteed to travel far – have nearly the same effect as ploughing millions of dollars into marketing.

But I do not mean to suggest that Scorsese is disingenuous in his remarks – indeed, as critic Jed Pressgrove remarked to me on Twitter, there really is nothing enormously surprising about these comments in terms of the discourse surrounding films. That's because it has long been a tenet of what might be called 'serious' cinema that there are two competing forces in the movie theatres. This 2016 blog post by filmmaker Rob Hardy poses this divide in terms of 'films' (Scorsese's 'cinema') and 'movies' (Scorsese's 'not cinema'), and there are hundreds of similar claims spanning decades. At the heart of this are implicit aesthetic values and the practices that those aesthetic values belong to. Representatives of cinema or film are claiming the artistic high ground – often falling just short of outright saying "we are art, you are not", but always implying it – and contrasting their craft against 'movies', which are not actively represented by anyone in this argument but merely the mass market shadow of the practice that Hardy calls 'filmmaking'.

When film critic Roger Ebert declared that videogames could not be art, or when disgruntled gamers declared Dear Esther was 'not a game', these claims were undergirded by specific aesthetic values and, along with this, participation in the practices that sustain and embed those values. Dear Esther was 'not a game' to anyone for whom 'games' were either the aesthetic pursuit of victory or of problem-solving, an aesthetic camp explored beautifully by game scholar Jesper Juul in his book The Art of Failure: An Essay on the Pain of Playing Video Games. Coming at the matter from this territory in the aesthetic landscape all but requires the erection of a barrier: The Chinese Room's ingenious usurpation of the components of first person shooters for something radically novel had to be 'kept out' of games because of a felt need to valorise different aesthetic values, those associating games with challenge where something like Shadow of the Colossus might be pointed to as an exemplar. This is the videogame mirror of Scorsese's 'not cinema', which is also Hardy's 'film versus movies'.

Writing centuries before either films or videogames, the Enlightenment philosopher Immanuel Kant made a crucial point about our aesthetic values: that when we assert them, it is because we expect our judgements to have universal assent, or rather we behave as if they should be capable of garnering such agreement. As a result, when something transgresses our aesthetic values – when a Marvel movie is claimed to be cinema (for Scorsese) or Dear Esther is claimed to be a game (for certain gamers) – there is an aesthetic transgression, and just as we would baulk at a moral transgression, there is potential for outcry, opposition, and argument. The disagreement, however, is usually hollow since two positions divided by distinct values never connect in any meaningful way. As Kant observed: it is a 'commonplace' that everyone has their own taste, and also that 'there is no disputing of taste'.

Thus there is no need or purpose in Marvel Studios' myriad fans stepping up to the plate to try to defend the Marvel Cinematic Universe by pointing to examples of movies in that corporate megatext that meet Scorsese's apparent definition of cinema in terms of conveying psychological experiences... as Hardy puts it, the question goes to intention not outcome, and I would further suggest that what lies at the root here is participation in a particular tradition, a distinct practice of making and engaging with films that is not rooted in entertainment, for all that it is frequently marketed successfully as that. Besides all this, Scorsese is surely correct to compare everything that comes out of Disney's corporate process to theme parks, since this is the practice that the House of Mouse pioneered and is still engaged in: an applied psychology of commercial entertainment rooted in meticulous brand management. In this regard, Scorsese's point is nearly impossible to rebuff and comes down to a claim about the limits of authorial intent: whatever filmmakers might achieve in a Marvel Studios movie cannot change the fact that what has been made is the result of a tightly-managed corporate process of engineering both brand and entertainment value on an industrial scale. Our only choice is whether this matters for our enjoyment of what results – and this depends upon which practices we ourselves are engaging in when we go to the cinema.

Silk is About... The Designer's Notes

Silk Notes

Silk is About... was a Designer's Notes serial in five parts that ran here at Only a Game from August 27th to September 24th 2019. It examined the thematic influences behind the game Silk, and pondered the game from a historical, personal, and political perspective. Each of the parts ends with a link to the next one, so to read the entire serial, simply click on the first link below, and then follow the “next” links to read on.

Here are the five parts:

  1. Silk is about... 200AD
  2. Silk is about... 1984
  3. Silk is about... Glorantha
  4. Silk is about... Religion
  5. Silk is about... Brexit

Silk is out on Switch, Windows, Mac, and Linux in October 2019.

Silk is About... Brexit

Silk is my Brexit game. There, I said it.

Silk is about Brexit because Silk is about how people live together and, perhaps even more so, how they fail to live together. I see in 200AD an allegory of 2000AD, lessons we can learn and did not learn, and are still not learning.

I am not committed to either side of the Brexit ‘debate’ (‘battle’ is perhaps more accurate, since a debate assumes a conversation entirely absent in this matter). I understand the argument that sees in leaving the European Union an opportunity for national self-determination, even if I myself could not vote for leaving because of my suspicion – now amply proven correct – that voting to leave would not spark the essential political dialogue required for the United Kingdom to acquire a viable, shared national identity. Instead, it deepened a previously ignored divide. Knee-jerk racism lines up on one side alongside those who had more honourable reasons for desiring a departure from the EU, while political one-upmanship and the certainty that everyone has it wrong except those who agree with you overwhelm all sides and leave us no closer to having a sense of what our country could or should be.

In Silk, the desire for self-determination is echoed in the imperial battles the game makes central to the Warlord and the Rebel. Settlements defend themselves in Silk when they feel threatened... today, nations do the same. The potential for military power to be abused was always present, and has little to do with the reasons people desire to defend themselves from threats from the outside. Then as now, what starts as defence ends as empire-building. Many Brits still feel like they are part of the British Empire even though in truth we are only offered the choice of being a neighbour to the European Empire or a vassal of the US Empire. But that desire to make your own nation everything it can be is not as morally wrong as liberal opponents of national pride make out. As Mary Midgley observed, we are entitled to put our own interests first; every species does this, and doing so need not – and indeed usually does not – devolve into utter selfishness, even if that is an ever-present risk.

What risks getting lost in this perspective, however, is that co-operation is almost always in our best interest. In Silk, this is represented by the Caravan itself, where a hugely diverse range of cultures and ethnicities come together to try and succeed in the challenge of surviving in the wilderness in the Traveller, or striving to profit from trade in the Noble. The game intentionally has a little casual racism in some of the Advisor’s responses to the world they are travelling through... the unfamiliar culture will always provoke a suspicious reaction, after all. I learned so much about the complexities of racism reading Michael Moorcock’s astonishing Between the Wars quartet, and Isabelle Stengers’ “The Curse of Tolerance” deepened my understanding of this even further. Racism and opposition to racism both block co-operation in their own ways, but the lesson of the Caravan in Silk is that we gain more from co-operating than from going it alone. That’s not an argument for staying in the EU as such: it’s an argument for not letting a fight about whether we should endorse one ideology or another tear us apart as a nation. And that’s just as true in the United States as it is in the United Kingdom.

So when I say that Silk is my Brexit game, I’m not saying that Silk is offering an answer to the problems of Brexit, but rather that in this game I am reflecting on the cultural problems – in the UK and elsewhere – that led us to Brexit, and that are not solved by leaving Europe, nor by remaining. We have lost our sense of the benefits of co-operating, either because we demonise those from other cultures we see as ‘different’ (especially Muslims), or because we have lost respect for our fellow citizens and are no longer willing to let them participate in democracy because we are so convinced that they are ‘wrong’. I see disaster on both paths. Silk is, in a way that is woven into the tapestry of every game of it that anyone plays, an opportunity to reflect upon our interdependence with those around us, and to consider different paths.

We can be more than divided nations squabbling against each other, if that’s what we wish. The question, as Silk asks every player to decide at every juncture, is always: what will we choose...?

Silk is out on Switch, Windows, Mac, and Linux in October 2019.