Magical Science

Arthur C. Clarke famously suggested that any sufficiently advanced technology would be indistinguishable from magic. This suggests another maxim: any insufficiently developed philosophy of science is incapable of distinguishing between science and magic.

We all have our own philosophy of science, our conceptual framework for understanding scientific topics. In the best case, our personal philosophy of science informs us of the limitations of scientific knowledge, allows us to put research into a wider context, and ensures we remember that the work of the sciences is still at heart an entirely human endeavour. Alas, few of us have such a clear view of the sciences. Far more widespread is a kind of pervasive mythos we might call ‘magical science’, which affords to the image of science unlimited future power, and to scientists an awesome capacity to divine the truth through singular experiments, like a Roman haruspex reading animal entrails to predict the future.

Magical science has the dubious honour of being the only superstition widely encouraged today. We are all too frequently adamant that science has all the answers, that science is the royal road to truth, that we can trust in the science... I notice that even the British Prime Minister has taken to invoking magical science in his speeches these days to validate his increasingly dubious actions. At heart, magical science may seem harmless, a mere rose-tinted vision of the work of scientists, one that tries to account for all the successes of our various research networks without any attempt at balance or insight. We typically overlook this kind of naive enthusiasm for scientific achievement on the basis that it's at least ‘supporting the right team’. Yet it becomes increasingly clear that blind support for science can manifest in ugly ways, even in ways that can prevent the sciences from working, plunging research into the debilitating condition of pseudoscience, as previously discussed.

The perceived infallibility of the sciences as truth-seeking procedures clashes worryingly with the necessity of scientists making mistakes, and thus magical science leads to anger at scientists when the actual scientific work is not as wondrous as it is imagined to be (as with the ugly trial that followed the 2009 L'Aquila earthquakes in Italy, where scientists were blamed for failing to predict them), or when any scientist speaks out against a claim that has been proclaimed unshakably true by its advocates. It is precisely because magical science is incapable of distinguishing science from magic that it represents a far greater danger to scientific endeavours than other philosophies, perhaps even so-called ‘anti-science’ philosophies. What deceives us here, what elevates scientists to their misguided role as flawless augurs rather than researchers struggling with ambiguous data, are the bad habits we have learned from the manifestations of science in fiction, where magical science is the norm. If we wish to see the work of the sciences with clearer eyes, we may have to start by putting some of the most iconic characters in fiction on philosophical trial.

Sherlock Holmes and the Flawless Investigation

It is sometimes remarked that in creating Sherlock Holmes, Sir Arthur Conan Doyle produced the first hero of ‘the scientific age’. The Victorians were the ones who coined the term ‘scientist’ and it was their obsession with the sciences that set the scene for the unfolding technological transformation of the world over the next century and a half. We tend to treat the character of Holmes as significant mainly for crime fiction, as the archetype from which all whodunits descend - but Holmes, quite unlike a Raymond Chandler or Agatha Christie detective, is always a practitioner of magical science. Partly, this proceeds from the inherent parsimony of storytelling whereby all questions will eventually be answered because everything is there by the author’s design. Partly, however, it proceeds from Holmes’ essential power - which upon closer inspection is not deductive reasoning at all, but rather the infinite convenience possible solely in literature.

Doyle gives Holmes a quite impossible access to every conceivable fact as a starting point, such that a berry stain or the smell of a particular tobacco can certainly be identified, and then (to pile on the absurdity) Holmes by purest chance always encounters a set of circumstances that allow for only one viable interpretation. This particular brand of tobacco, for instance, is sold in exactly one place in London... We thus end up admiring Holmes’ purportedly scientific form of investigation, while what we ought to admire is the way Doyle effortlessly conceals the magical science entailed in this depiction by making it seem as if all of Sherlock’s deductions (and inductions) were strictly logical. Doyle has contrived a set of circumstances that Holmes, with his unlimited catalogue of facts, can be certain to solve. This makes Holmes a disastrous role model for scientists (or indeed, detectives!) since it is only through the meticulous construction of literary contrivance that he possesses any investigative power at all. This becomes clearest when Holmes relies upon facts we know are false - such as the ludicrous snake plot device in The Speckled Band, which entails behaviour that no real reptile could plausibly be coaxed into. Holmes’ claims to be a man of science are rather fraudulent behind the scenes: he is simply the locus of a mythic depiction of magical science.

Neither is Holmes the only such character. Both Spock and Data in the worlds of Star Trek share this power of magical science - also manifested in these shows by the tricorder, which like Holmes spits out every required fact on demand and without error. Or consider Doctor Who from the third Doctor onwards: anything necessary is certainly known by the Time Lord, except when the story requires a convenient (and often temporary) amnesia for dramatic effect. That both Data and the Doctor had a spin at being Baker Street’s most eligible bachelor is not accidental, nor perhaps is Steven Moffat’s concurrent time as showrunner for both Doctor Who and Sherlock... Magical science heroes seem to reaffirm our faith in the power of scientific knowledge, while also playfully exposing the quirky personalities of scientists. House, The Big Bang Theory, and much more besides all participate in a literary tradition that stems from the Sherlock Holmes tales, and is now seemingly dominated by his science fiction protégés.

Yet these are not scientific heroes, but magical science heroes. They have exactly the facts and the circumstances to answer perfectly every time, without ever having to confront the ambiguity, indeterminacy, and incompleteness of an authentic scientific problem. They are to science what Superman is to police officers: naively idealized caricatures. They find the answers solely because they live in stories where uncovering the truth is possible by design. This is a wildly misleading template for scientific truth, and although we know these are ‘just’ stories, we somehow import our wilder beliefs about the sciences into our everyday thinking unless we are extremely careful. If we are to break this spell, we need a philosophy capable of distinguishing science and magic - and for this, we need a clearer understanding of ‘scientific truth’.

Desperately Seeking Truth

Even if we start with the acknowledgement that the sciences are capable of discovering or affirming truth, the question of what might qualify as a ‘scientific truth’ is far trickier than it seems. As the preceding discussion on pseudoscience made clear, we cannot simply append ‘scientific’ to known truths without distorting the essential ambiguities of the research process, in which we cannot in practice know if the apparent truth of a researched claim will hold in the future. In fact, we have a choice. We could align ‘scientific truth’ with the unshakeable deep truth of reality and thus admit that the claims asserted by scientists cannot be known as truth at all (effectively contracting the domain of scientific truth to concluded research programmes like optics). Or else we can align scientific truth with the body of beliefs held by scientists, with the inevitable consequence that such truths can be later revealed as false - or even abominable. We don’t even have to go back a century to find all manner of racist, sexist nonsense asserted as truth by those who identified as scientists.

Now those who buy into magical science have an easier job here, but only by being wildly dishonest about both truth and scientific methods. According to magical science, scientists uncover truth infallibly so all claims asserted by scientists are scientific truth. Thus if and when the circumstances shift we can ‘debunk’ or ‘discredit’ those responsible and say they were not really scientists at all, or even exclude their claims from consideration in the first place! This is where ‘pseudoscience’ has been used as a label, although as I have argued previously it is not a terribly viable way of using the term. Babette Babich has made even stronger - and oft misunderstood - claims about the way the discrediting associated with the term ‘pseudoscience’ serves as a dogmatic attempt to demarcate legitimate science, while all too frequently preventing any scientific enquiry from even beginning. Thus when this particular word comes out, it narrows scientific knowledge by declaring certain topics forbidden and out of bounds - and woe betide the researcher who goes on to try to report experimental results from such verboten fields...

The highly problematic implication of every attempt to discredit and thus demarcate ‘science’ from ‘pseudoscience’ must be that we cannot know, when scientists assert a claim, whether it will later need to be ‘debunked’. Thus faith in magical science is inevitably a distortion of the truth - for things this philosophy declares to be scientific truths may later be ‘discredited’, or even discredited before they are considered at all. The alleged truths of magical science are thus only defended by ignoring the inevitable consequences of the inherent revisionism of scientific practice, and by pretending that the current consensus among researchers is ‘more true’ than it was yesterday - and thus that now (and by implication, only now) we can trust everything scientists say, as long as we are standing guard for those pernicious pseudoscientists who ruin it for everyone. To say that this is dangerous nonsense is easy; to replace it with a more sound philosophy of science will be much harder.

There might be a way out of this maze, but it would require us to think differently about the relationship between truth and the sciences. Part of what deceives us here is our desire to understand the truth in terms of a set of valid statements. Since we can point to scientific concepts we abandoned, like phlogiston (a hypothetical substance once thought to make combustion possible), we want to assert a gradual improvement in the accuracy or scope of our ‘book of facts’. “We would not be fooled by phlogiston today,” we might think. Yet phlogiston was an important - and arguably entirely scientific - proposal that was merely discarded when our understanding of chemistry shifted such that combustion could be thought of in terms of a chemical reaction with oxygen.

The brutal truth of the ‘book of facts’ is that such a collection of statements today would theoretically contain far more ultimately false claims than it would have in the 1770s. Simply because the number of scientists and the diversity of research fields have increased dramatically, we are now paradoxically more wrong than researchers in the 18th century (in terms of sheer numbers of errors made) - the inescapable consequence of asking both more and more difficult questions. What makes it feel as if we are now more right is knowing that phlogiston would eventually be replaced by a new understanding of chemical reactions and thus combustion and so forth. But this is largely an illusion caused by examining successful research programmes in hindsight.

Similarly, when I say phlogiston was ‘scientific’, I am projecting with hindsight since the term ‘scientist’ was not coined until 1834... researchers in the 1770s would not have described anything they were doing as ‘scientific’ - it is our desire to paint the sciences as something with a history of more than two centuries that makes us ‘claim’ both phlogiston and oxygen (not to mention Copernicus, Galileo, Newton and so forth) as part of the story of ‘science’, rather than the natural philosophy that those involved would have stated they were pursuing. Thus our ‘book of facts’ not only contains more errors than our predecessors two and a half centuries ago, it is not even entirely honest about its relationship with its own past. Add to this the unavoidable truth that this imagined ‘book of facts’ does not exist (for all that encyclopedias and their successors have wished to fulfil this role) and it begins to feel uncomfortably like we are deceiving ourselves - as if we have all fallen for the seductive confusions of magical science.

Legitimate Practices

We want to defend our intuitive impression of the sciences as truth-seeking, and also (in some nebulous sense) successful at doing so. How do we do it?

One option we can consider is that which I proposed in Wikipedia Knows Nothing: to switch our focus from facts (true statements) to practices (skills and equipment). To know how to use something - a polymerase chain reaction, an interferometer, a fractional distillation apparatus - is more a matter of knowing what to do than it is a ‘book of facts’, even though that knowledge also produces facts related to the equipment used (and any theories deployed to give a context to the reading of the instruments). Thus an astronomer armed with geometric theorems can use an interferometer to measure the diameter of stars, while an engineer can use an interferometer and the wave theories of light to measure very small objects precisely. The practices associated with both the equipment (the interferometer) and the theories associated with each specific usage give rise to facts - in this case, distances. The difference lies in what legitimizes the activity in question: on the usual conception of knowledge, facts counted as legitimate knowledge if they were true and the reasons justifying them were correct - which actually provides no means of knowing what is or is not legitimate, since this criterion for legitimacy requires an appeal to something beyond the situation (the truth) that we cannot access directly. Conversely, when we view knowledge as a practice, what makes the facts legitimate is that we are using the tools correctly. In this context, we have recourse to everyone with the relevant knowledge of the tools entailed to verify the legitimacy of the practices used and hence the facts reported.

On this understanding of knowledge, unlike an appeal to the truth, we can construct a viable understanding of ‘scientific truth’, since certain equipment and certain theories can be uncontroversially attributed to the sciences, and their correct usage can be judged by anyone else with access to the same knowledge practices. On this path we can therefore distinguish between scientific truth (facts emerging from legitimate research practices) and errors, provided we allow the disagreements to be properly explored in any given research community. However, as Babich warns, this cannot happen if we rush in with a dogmatic cry of ‘pseudoscience’, since every attempt to discredit something a priori entails an outright refusal to think about a given topic at all. Ironically, such attempts to discredit effectively cause an outbreak of the condition of pseudoscience, in my sense (a state of disrupted communication where scientific work can no longer be pursued), since whosoever speaks this word with the intent to discredit (and thus ignore something) signals the very breakdown of legitimate scientific disagreement required to understand whatever is (not) being discussed.

The deeper problem we encounter when we look more clearly at how scientists discover or verify truths is that the claims that are asserted soon exceed simple assertions of facts. Once they do, it requires another set of knowledge practices to disentangle the relationships between facts and conclusions - and these are not strictly scientific at all, for all that scientists engage (unknowingly) in these kinds of interpretative philosophical practices every time they assert anything but the most trivial of claims. Indeed, the crisis of the contemporary sciences is precisely that their application is not a scientific practice, but a philosophical one - and Einstein’s generation may have been the last where scientists spanned these disciplines rather than retreating behind specializations that narrow, rather than widen, the scope of our collective understanding.

It is small wonder that we seem to have arrived in a “post-truth” world: the attempt to make the only acceptable truths those that flow from scientific endeavours renders a great many of the truths that matter impossible to adequately discuss, precisely because the important truths (those that pertain to what we ought to do, for instance) could never be scientific and thus cannot be established solely by an appeal to the facts. Yet we keep looking to scientists to give us a certainty that is not in any way available through scientific methods - and as the L'Aquila trial in Italy demonstrated, we will turn upon those who do not live up to our insanely unrealistic expectations and even accuse them of committing crimes when they, inevitably, make mistakes. But it is we that have failed, by falling for such an impoverished understanding of the complexity of scientific research as that of magical science.

Breaking the Spell

The needs of a narrative require magical science for the very same role as arcane magic - as a plot device limited solely by our imagination - and the two are (in more ways than we tend to acknowledge) equivalent, exactly as Clarke foreshadowed. The problem is, the actual work of the sciences, the global cybernetic collaboration of scientists that began under that name in the 1800s and continues today, is magical solely in its lustre and not in its details. Yes, the collective technological achievements facilitated by the work of countless scientists are now already indistinguishable from magic in a great many situations. But the work of scientists is not magic, and is certainly nothing like the magical science of a Sherlock Holmes fable. When we mistake the two, when we treat a human who conducts scientific work as someone wielding all the sorcery of magical science to know, automatically, everything that needs to be known, we are not supporting scientific truth-finding at all, but making it far, far harder, and in the worst cases, rendering it entirely impossible.

I will not say we must stop enjoying the fantasy of magical science in our stories - escapism is mostly harmless, after all, even if it is not entirely blameless - but is it not perhaps about time we stopped pretending that our scientists are superheroes with magical powers to determine truth? Scientific truths are extremely specific, and much narrower than we want them to be - they are at their most precise precisely when their claims are most limited. The heroism of actual researchers is of a patient, humble kind, one that requires time and substantial disagreements to bring about. It is neither as spell-binding as Holmes’ contrived deductions, nor as charmingly detached from human fallibility as Data or Spock’s inhuman resourcefulness suggests. Nor does any living scientist have access to the unquenchable moral certainty of the later incarnations of the iconic Time Lord to guide them. These role models all imply a role that is impossible to bring to life: we should be careful not to buy too deeply into such implausible exemplars, without dismissing entirely the hopes and ideals that they embody.

Actual scientific practice is amazing, but it is neither miraculous nor supernatural. It is rather mundane in its details, which never entail perfectly prophetic experiments, and always require a great deal more arguing about the possible interpretations of the facts than literature has ever depicted. When we cannot distinguish science from magic, we obscure scientific truth and the immense and heroic efforts required to produce and understand it. We do all our scientists a disservice when we mistake them for sorceresses and wizards, and we entirely dishonour the scientific traditions when we censor or revile researchers for not living up to our hopelessly elevated expectations of their truth-discovering powers.

If we cannot distinguish science from magic, we need to either improve our philosophy of science or else remain silent on scientific topics. As Holmes remarks, it is Watson’s grand gift of silence that makes him quite invaluable as a companion - and scientists, much like Holmes, often need us to pay close attention to their work and their disagreements, so that together we can eventually reveal true claims about our world. When we work to silence and discredit others we disagree with, rather than remaining silent so we might hear those disagreements we are denying, we have destroyed the very conditions for any kind of legitimate scientific investigation to occur. If we truly wish to be friends of the sciences, perhaps we too ought to know how to hold our tongue and try to listen to the quiet whispers of the truth when the game is afoot.

Comments always welcome, especially the polite ones!


What is Pseudoscience?

When we talk about something being ‘pseudoscience’ what we tend to mean is that it’s ‘not true’, and we reach that conclusion because what we mean by pseudoscience is something that is ‘not scientific’, and we associate the sciences with truth. Yet the alternative to truth is not automatically falsehood; there is also ambiguity, indeterminacy, and incompleteness to consider. What’s more, if we call things scientific only if they are true, we are admitting that we don’t actually know what is or isn’t scientific until some future time when the arguments about some given topic are finally resolved. There is a confusion here worth examining closely.

Implausible Methods

Ask someone to explain how the sciences work and chances are they will tell you about the scientific method:

1. Observe a situation
2. Come up with a hypothesis (an untested theory) to explain a phenomenon
3. Devise an experiment to test whether the hypothesis is valid
4. If the experiment is successful, the hypothesis becomes a theory. Congratulations, you’ve discovered scientific truth!

This description is so far from adequate that it is a wonder that so many university students are taught it! Quite apart from the way it sets aside the most difficult aspect of scientific practice (the interrelationships of existing knowledge on any subject) it fancifully imagines that scientists determine truth simply by performing just one experiment, as if scientific truth were as simple as revealing a scratch card – three microscopes, we have a winner! Rather than an adequate description of how contemporary scientific processes operate, this is more akin to a catechism recited in order to bolster faith in the ability of the sciences to reveal truth – and as such, it obfuscates the complexity of the relationships between experiments, theories, and truth, and prescribes a method almost certain to lead to error every time.

If a hypothesis and experiments are indeed the necessary elements of a claim that a certain activity is ‘scientific’, then anthropology, economics, almost all of the evolutionary sciences, and a fair amount of biology and medicine are all doomed to be ‘unscientific’. These kinds of accusation are indeed sometimes advanced – a furore occurred in 2010 when the American Anthropological Association decided to remove the word ‘science’ from its mission statement, despite many of its members feeling this was a consequence of a narrow and reductionist description of the sciences. There are also questions here about concluded research programmes: no-one has needed to perform further experiments in optics, for instance... has it ceased to be scientific? Or did it earn its place in scientific heaven by being a good research field while it was still alive...?

Tied up with this confusion is the idea that the sciences are ‘value free’, i.e. that scientific research is inherently unbiased. This is a naive mistake to make, and on two counts. Firstly, as Nietzsche warned back in 1882, we are “still pious” when it comes to scientific truth – all scientific research rests on a core moral value, a commitment to the pursuit of truth. Without this, the endeavours we undertake in the name of science make no sense; ‘valueless science’ is entirely implausible. Secondly, and even more importantly, scientists are still human, and as such they have their own values. The attempt to purge the sciences of values is nonsensical and indeed impossible! No matter how much you try to present scientific research as a purely rational, emotionless, valueless activity, scientists will continue to pursue research motivated by their own moral values (to save lives or to save the planet, to advance knowledge or technology, to win fame or wealth, etc.). To treat having these values as somehow unscientific is to doom all the sciences to oblivion! The values and the facts are intimately related or, as Hilary Putnam described it, entangled. The idea of a science without values is pure nonsense.

At this point, you have a choice in how you respond to this critique of ‘scientific method’, and this in itself may be illuminating. On the one hand (and especially if you’ve spent any time at all thinking about philosophy of science), you can happily cast off this quite ridiculous dogma and still maintain a viable understanding of the sciences without it. That’s the easy way... but it still has some hard consequences. Or alternatively you can dig in your heels and try to cast out the demons of those that don’t follow ‘the method’, attempting to purify research of pseudoscience, meaning in this case ‘not following the scientific method’, but usually playing out by simply deriding counter-claims against whatever dogmatic position has been adopted on any given point. That path is so misguided it’s a wonder that plenty of otherwise intelligent people seem to fall for it.

As it happens, the sciences themselves show us why this purported ‘scientific method’ is unworkable. Psychology – which has been staunchly dedicated to ‘the method’ yet still gets cast out as ‘soft science’ – has provided a lot of neat labels for the various kinds of human bias. Defenders of ‘the method’ like to invoke hindsight bias to defend the need for hypotheses – “if you don’t make a hypothesis, you’ll just end up seeming to expect the result you get!” But these cognitive biases cut both ways: if you do make a hypothesis, you are now prone to confirmation bias – cherry-picking your data and references to support the position you have chosen. This is why the medical sciences insist on good quality evidence from randomized trials where even the experimenters don’t know what’s going on until all the data is in. We know from bitter experience that when you set out to prove some specific claim, you are more likely to find (and report) the evidence that supports what you have chosen. In other words, not having a hypothesis condemns you to bias, and having a hypothesis condemns you to bias! What makes something legitimately scientific cannot be the elimination of bias, or else nothing could ever be sufficiently purified of values to qualify. There has to be another way of conceptualising the difference between ‘science’ and ‘pseudoscience’ if either is going to have any legitimate meaning.

Ghosts of Science Past

The celebrated historian of science, Thomas Kuhn, lays out the question of pseudoscience at the very outset of his project to understand the nature of scientific change. The problem as he presents it is that if we judge the historical precedents to our scientific practices as pseudoscientific (he talks of them being ‘myths’), then we have to acknowledge that pseudoscience can be pursued and justified by the same methods and reasons we now use to defend science against its alternatives. Yet if we call these artefacts of older research ‘science’, then we have to accept that the sciences were beset by wild beliefs that today we would find unthinkable (even abominable). He argues very persuasively that from a historical perspective we have no choice but to accept that “out-of-date theories are not in principle unscientific because they have been discarded.”

Kuhn’s position is widely accepted today – yet it runs directly contrary to the view of Sir Karl Popper that the boundary of legitimate science is falsification – the ability to have a theory proven false. Amazingly, this viewpoint is also widely accepted today, even though the two approaches are essentially incompatible, and indeed were the basis for an unresolved dispute between the two academics. Kuhn saw Popper’s falsification as applying solely to those rare periods of scientific upheaval (paradigm shifts) where one way of thinking replaces another. His view was that ‘normal science’ never dabbles in big theoretical changes at all, but is always about solving problems using the current theoretical apparatus. Again, these two viewpoints are entirely incompatible, yet both remain widely supported views on the sciences.

Popper suggested that Kuhn’s approach committed him to saying that astrology is a science because it entails problem solving within its own paradigm. Kuhn denied this, and argued that in the context of astrology “particular failures did not give rise to research puzzles” and thus astrology was never a science. Both men died without resolving their disagreement; I think it clear, however, that both are wrong about astrology. We cannot – as Kuhn himself warns – back-project our current scientific judgements upon prior practices that were claimed as sciences at earlier times without distorting what we are trying to assert. To do so is to deny the very capacity for scientific revolutions that Kuhn’s account provides. The suffix ‘-ology’ by itself is a clue that the practices of astrology had at one point in its history a claim to knowledge, and the question of whether astrology was ever a science in Kuhn’s terms is a historical investigation requiring far more application to the task than either Popper or Kuhn were willing to commit. As such, this question is in fact still very much open to debate! But nobody wants to do so, because everybody with any skin in this game wants to show that astrology isn’t a science and never was – thus again preempting any possible research except that which will prove this one tenuous point.

If Kuhn’s historical theory (albeit not Kuhn himself) is able to defend against Popper’s attack, Popper’s falsification criterion has no equivalent defence against Kuhn’s criticisms. Indeed, Kuhn expressly doubted that falsifying experiences ever really happen. He did not need the psychologist’s label ‘confirmation bias’ to realise that giving up a scientific paradigm is a major conversion for anyone (comparison with religious conversion is quite justified here), made all the less likely by the problem that if every failure of a theory in the face of contradictory evidence were sufficient grounds for rejecting it, all theories ought to be rejected at all times! That’s because the very reason that Kuhn’s ‘normal science’ has problems to solve is precisely that no theory is capable of fitting all the observations it seeks to explain. As the French science studies scholar Bruno Latour puts it, the theories are all under-determined with respect to the evidence – and this conclusion is unavoidable if you spend time examining what scientists actually do rather than merely reciting the catechism.

But this does not mean there is no way of distinguishing science from pseudoscience, even though we have to accept a certain amount of historical contingency after Kuhn (or Foucault – he gets to the same place via a different route). What we might reasonably suggest as a provisional criterion for calling something ‘pseudoscience’ is a combination of Popper’s and Kuhn’s claims: when even the possibility of falsification is removed, or when the investigative practices cease to produce further enquiries in response to the questions the previous research implies, the claim to be scientific evaporates. As chemist-turned-philosopher Isabelle Stengers attests, successful experiments in the sciences give rise to new research questions. When they do not produce any more, it is because the field has managed a complete description of its subject matter (as with optics). The difference here is that such ‘completed’ fields have produced theories capable of making unfailing predictions. And such cases are vanishingly rare.

The Condition of Pseudoscience

What ties us up in conceptual knots here, and what kept Popper and Kuhn from reaching an accord, is that we want to level the accusation ‘pseudoscience’ at fields like astrology or phrenology. But understanding the sciences as an ecology of practices, as Stengers has brilliantly discussed, shows that this is not the only way we might identify a breakdown of Kuhn’s ‘normal science’. We could (indeed must) give up the idea that ‘pseudoscience’ is a way of trashing any theory, research, or evidential claims we don’t agree with. On the contrary, I propose that the clearest way of understanding pseudoscience is as a condition within a scientific discourse that undermines or destroys its power to investigate.

Thus, to continue with phrenology’s original models of mental function after animal experiments began to show that its suggested brain regions did not hold up to scrutiny would have been to enter into a condition of pseudoscience, because its practices could not produce viable new research questions in the light of this new evidence. It would, however, be wildly unfair to suggest it was always in this condition: it is from phrenology, after all, that the idea of the brain being the organ of the mind originated, and while most of its specific claims did not pan out, it remains an important part of the backstory of neuroscience. If phrenology had not spread as working class ‘popular science’ (thus earning the enmity of Victorian cultural elites), we might well have kept the name ‘phrenology’ (science of the mind) rather than renaming brain research ‘neurobiology’. It’s not at all clear to me that phrenology was ever in the condition of pseudoscience, except perhaps at the very end – although anyone practicing it today would be behaving very oddly indeed.

Pseudoscience is thus akin to an ailment afflicting scientific practices that have become severed from the logic of legitimacy provided by their current paradigm. The sign that a field has fallen into pseudoscience is not the truth or falsehood of its claims as such. Indeed, these will frequently not be in any way settled, forcing us into highly suspect retrospective accusations, such as that levelled routinely at phrenology. Rather, you can see the condition of pseudoscience occurring whenever scientists give up the values that motivate their enquiry - when they purposefully falsify data, or conceal it ‘to defend the truth’, or give up experiments and data gathering entirely in order to maintain a status quo based upon whatever happens to have been previously claimed. And once we see this, we are forced into the realisation that we are currently in the condition of pseudoscience in several entirely legitimate research fields - and over the last year we have had the audacity to defend the breakdown in the medical discourses that has put us into a state of collective pseudoscience as “following the science”!

The truth is, we cannot ‘follow the science’, it is the science that must follow us. For the values of science are those of discovery and verification, and this only has a purpose in so much as it serves to resolve those questions our other values compel us into exploring. Thus, while medicine commits to ‘first, do no harm’ as a supreme value governing its own practice, that particular principle sets no positive goal at all. The medical practitioners and the cybernetic networks supporting them take on the objectives that we have collectively given to them. If the circumstances that follow from that pursuit make falsification of a medical claim impossible, or provide no means to reliably answer the relevant medical questions, those medical practitioners affected (and anyone trusting their judgements) enter into the condition of pseudoscience, a (temporary) renunciation of the values of scientific practice, capable of precisely the great harm doctors are sworn to avoid. For the collective medical power we exercise cybernetically always causes some degree of harm along with the pursuit of its goals – requiring medical practitioners, on pain of becoming (temporary) pseudodoctors, to commit to studying the impact of any procedure or intervention attempted or else risk violating all the values of contemporary medical science. This is an extreme example, but it is also an extremely important one.

Now whether the values of discovery and verification have always conditioned the work of scientists, and whether they always will, isn't the point, for they are our moral requirements for the sciences now, and on this point we quite miraculously do not disagree. In so much as pseudoscience is a phenomenon, it is merely a consequence of recognising that scientists are human, and what makes them seem otherwise is the remarkable power that they bring to bear when cybernetically linked into singular networks, working together – not just by co-operating but just as importantly by disagreeing, refining the research questions by honing the essential ambiguities into points sharp enough to penetrate our ignorance by pursuing further investigations and experiments. Pseudoscience prevents that dialogue from happening, and breaks up the network connections, making research harder or preventing it entirely, setting bias against bias and thus blocking the communication essential to verification, which is necessarily a distributed activity.

When verification stops, pseudoscience has begun... it goes away when we can go back to listening to those objections that our human bias prevented us from hearing. The ugly truth of it all is that fear, anger, and self-righteousness spread pseudoscience all too easily, yet banishing it is as easy – or as impossible – as going back and listening to the objections in order to work out where in the maze of ambiguity, indeterminacy, and incompleteness the truth of each disagreement can be found.

More philosophy of science soon.


How To Be Yourself

Perhaps the first mistake we all make as individuals is to think that we know how to be ourselves. When we object to someone else that "nobody can be me but me" we're being entirely truthful, but we should not deduce from this that being yourself is easy.

The Danish philosopher, Søren Kierkegaard, puts it beautifully:

There is a fear of letting people loose, a fear that the worst will happen once the individual enjoys carrying on like an individual. Moreover, living as the individual is thought to be the easiest thing of all, and it is the ethical that people must be coerced into becoming. I can share neither this fear nor this opinion, and for the same reason. No person who has learned that to exist as the individual is the most terrifying thing of all will be afraid of saying it is the greatest.

The individual person isn't a loner survivalist cut off from society, but one being among others whom they live amidst. When we angrily desire our individuality, what we are hungering for is an escape from the ties that bind us to these other beings that intersect our lives – but this we cannot achieve except through the self-destructive intervention of breaking these ties one-by-one. Every time you resort to this drastic step, you sever yourself from another piece of your individuality, for it is all these random, circumstantial connections to other beings and things, places and people, that are the raw materials from which your life is built. Without them, you are not an individual; you are nothing, both because it is these circumstances that brought you to life and kept you alive ever since, and also because who you are flows from where you are coming from.

Now it is difficult for me to speak about this question of becoming yourself, because I do not want it to sound as though I am claiming that I know how to be you better than you do. Obviously, I don't even know who you are as I write this! Rather, what I am trying to do is offer a warning that being yourself is much harder than it sounds. It is always a dangerous game, giving advice, and often disastrous when advice is given in anger or haste, and the last thing I would ever want to do is interfere with anyone's exploration of how to be themselves. Besides, as Kierkegaard warns, whenever we try to tell others how to be themselves we "betray ourselves by our instantly acquired proficiency, and fail to grasp the point that if another individual is to walk the same path, they have to be just as much the individual and can, therefore, be in no need of guidance, least of all from anyone anxious to press their services upon others…"

However, I can see little harm in pointing out that whatever being yourself is going to entail, it might help to understand what you are...

What You Are

We tend to assume we know what kind of thing we are – yet there are many different choices for understanding what you are, all of which can work out for certain people and any of which can lead to disaster when undertaken thoughtlessly.

Take the case of disbelieving in the reality of your existence. If you come to think that you don't really exist because you are just an illusion brought about by an elaborate hoax of your biology, then there is no possibility of being yourself because there is no you to be. This seems like a terrible start to any process of self discovery! Yet this self-negating way of understanding what you are could also be illuminating, as it is to Buddhists and Hindus, whose conception of appearances as essentially illusory offers a way of discovering yourself through a denial that your thoughts and desires are the most important part of your existence. In this, as in so much in life, the same assumptions can lead to radically different conclusions.

Most likely, you view yourself as a consciousness inhabiting a body, with the latter generating the former via the biology of neuron connections that grants you free will and powers of imagination. In which case, your view is not terribly different from that of people who lived hundreds or even thousands of years ago, apart from the name given to the kind of thing you are. As the British philosopher Mary Midgley made clear:

When the sages of the Enlightenment deposed god and demystified Mother Nature, they did not leave us without an object of reverence. The human soul, renamed as the individual – free, autonomous, and creative – succeeded to that post, and has been confirmed in it with increased confidence ever since. Though it is not now considered immortal, it is still our pearl of great price.

The danger in buying into a purely individual conception of who you are is that it will make your existence appear to be something emanating solely from inside your mind. But that's not the case – who you are and what you are may have their locus of experience inside your mind, but they are constituted and sustained by the network of connections and situations I mentioned above, the raw materials from which you make yourself. We take great risks with our selfhood, therefore, if we think of what we are as something wholly sealed inside our heads.

Inside Out

Whatever way you settle upon for understanding what you are, you then have to negotiate the tension between what is apparently inside (your mind, your memories) and what is apparently outside (your social connections, your lived environment). Psychologists have finally started to come around to the idea that your mind is partly constituted by this exterior environment. Compelling recent concepts like 'enactivism' and 'embodied cognition' explore a path cleared by philosophers, especially the German philosopher Martin Heidegger. Heidegger saw our situation as one of being thrown into a world, the circumstances we are born into being the very condition for discovering what we mean by ourselves.

But how do we distinguish between inside and outside? Many teenagers try to break ties with their family or the traditions of their birth culture as an act of asserting their individuality... but the rejection of these relationships becomes in itself an act of participation, participating in exile, if you will. Active rejection of family or tradition still defines the inner self in these cases precisely by that rejection. Rather than severing that connection, we simply take on a different form of connection – that of opposition or withdrawal.

To navigate this problem requires that we have access to some concept of what is good or right for us, but this cannot simply be to act on our hunches – that would risk removing ourselves from any viable standards of judgement. Our ability to make accurate judgements depends, after all, upon our tools for thinking (our languages and terminology), which are sustained by communities of practice. It is for this reason that the Canadian philosopher Charles Taylor explored an "ethic of authenticity" that emerged in the last century or so:

To know who I am is a species of knowing where I stand. My identity is defined by the commitments and identifications which provide the frame or horizon within which I can try to determine from case to case what is good, or valuable, or what ought to be done, or what I endorse or oppose. In other words, it is the horizon within which I am capable of taking a stand.

This is part of the reason why encounters with new communities of practice can be so transformative – whether it is a religious tradition from outside of our prior experience, a community of care based around a sexuality or gender identity we had not previously considered as applying to us, a medical diagnosis that connects us to other people with whom we share a commonality of experience, or a political faction that speaks to us from outside of our prior assumptions, the discovery of who you are frequently involves a voyage outside of your mind and into revelatory new connections with others.

Yet each encounter of this kind also risks deceiving us – especially when we have actively broken ties to our previous communities. The discovery of a new network of care that we can see ourselves belonging to is alluring, because as social creatures we crave belonging even though other humans fundamentally annoy us (what the Prussian philosopher Immanuel Kant called our "unsociable sociability"). But this inherent appeal of belonging to something cannot resolve the question of whether the identity we are trying on is an authentic solution to the problem of ourselves. By the same token, nobody watching 'from the outside' is going to be able to decisively determine what is and isn't authentic on our behalf. We are all inside and outside the same boats in this regard.

The danger of treating the dizzying array of possible identities presented to us as merely a buffet or a shopping catalogue to choose from is the risk of failing to notice how each encounter with every possibility of understanding ourselves is going to have an effect on who we are becoming. If we think of who we are as just a single identity where we simply have to browse the shelves until we find "the right one", we will end up reducing ourselves to a mere caricature of who we could be if we took the time to discover authentic connections with all the many facets of who we are and might be.

Paradoxically, discovering how to be yourself requires other people, both as examples to understand, and as a sounding board as we work through the challenges of understanding how the different shards of who we are fit together into a coherent whole. Even if you were "born this way", you still needed to learn about 'this way' by seeing these possibilities for existence acted out in others. Identities are sustained by their communities – and counter-intuitively, they are strengthened by the opposition of other communities that deny their legitimacy, for we are never bolder than when we feel threatened.

The problem of being yourself has no quick fix, and certainly cannot be solved by ordering your new self online. It requires you to do the work, thinking and feeling through your existing connections and communities, taking on new potential aspects of yourself with care, and not rushing the process of discovery by letting your enthusiasm for the new lure you away from parts of who you are that are far more important than their humdrum familiarity might suggest.

How do you discover how to be yourself? The same way we learn anything: you watch other people become themselves, and then try to make some of what you encounter work for yourself. Sometimes it will. Sometimes it won't. Sometimes it will seem impossible that this could be you, but you may still later come to see how it all fits together. It's a mystery to solve, and only you can solve it – but you will have a much greater chance of success the more you listen to others and recognise that you can only be yourself with others. Alone, you are trapped 'inside' with your fears and your anger – only together can we find ourselves.

Prepare yourself for the adventure of a lifetime.

The opening image is an untitled painting by KwangHo Shin, which I found here. As ever, no copyright infringement is intended and I will take the image down if asked.


Scorsese vs Marvel Studios

Veteran film director Martin Scorsese could scarcely ask for better publicity for his new film, The Irishman, than picking a fight with the box office powerhouse that is Disney's Marvel Studios. In an interview for Empire magazine, Scorsese was asked about Marvel movies and replied:

I don’t see them. I tried, you know? But that’s not cinema. Honestly, the closest I can think of them, as well made as they are, with actors doing the best they can under the circumstances, is theme parks. It isn’t the cinema of human beings trying to convey emotional, psychological experiences to another human being.

This is a much more interesting statement than it might first appear. Before delving into it, however, it is worth acknowledging that Scorsese would never have had anywhere near as much coverage for his new feature if he had not decided to position himself against one of Disney's two big-ticket purchases, both of which were acquired to fill a gap in the media corporation's portfolio, which was always lacking in action franchises. I don't think it greatly matters if this is a planned PR manoeuvre from the 76-year-old director, or a lucky striking of gold by one of Empire's writers; either way it's a win for both parties, since the battle line it draws guarantees more attention for both of them, and mobilises the legions of Marvel fans for free publicity, since negative reactions online – especially those guaranteed to travel far – have nearly the same effect as ploughing millions of dollars into marketing.

But I do not mean to suggest that Scorsese is disingenuous in his remarks – indeed, as critic Jed Pressgrove remarked to me on Twitter, there really is nothing enormously surprising about these comments in terms of the discourse surrounding films. That's because it has long been a tenet of what might be called 'serious' cinema that there are two competing forces in the movie theatres. This 2016 blog post by filmmaker Rob Hardy poses this divide in terms of 'films' (Scorsese's 'cinema') and 'movies' (Scorsese's 'not cinema'), and there are hundreds of similar claims spanning decades. What is at heart here are implicit aesthetic values and the practices that those aesthetic values belong to. Representatives of cinema or film are claiming the artistic high ground – often falling just short of outright saying "we are art, you are not", but always implying it – and contrasting their craft against 'movies', which are not actively represented by anyone in this argument but are merely the mass market shadow of the practice that Hardy calls 'filmmaking'.

When film critic Roger Ebert declared that videogames could not be art, or when disgruntled gamers declared Dear Esther was 'not a game', these claims were undergirded by specific aesthetic values and, along with this, participation in the practices that sustain and embed those values. Dear Esther was 'not a game' to anyone for whom 'games' were either the aesthetic pursuit of victory or of problem-solving, an aesthetic camp explored beautifully by game scholar Jesper Juul in his book The Art of Failure: An Essay on the Pain of Playing Video Games. Coming at the matter from this territory in the aesthetic landscape all but requires the erection of a barrier: The Chinese Room's ingenious usurpation of the components of first person shooters for something radically novel had to be 'kept out' of games because of a felt need to valorise different aesthetic values, those associating games with challenge, where something like Shadow of the Colossus might be pointed to as an exemplar. This is the videogame mirror of Scorsese's 'not cinema', which is also Hardy's 'film versus movies'.

Writing centuries before either films or videogames, the Enlightenment philosopher Immanuel Kant made a crucial point about our aesthetic values: that when we assert them, it is because we expect our judgements to have universal assent, or rather we behave as if they should be capable of garnering such agreement. As a result, when something transgresses our aesthetic values – when a Marvel movie is claimed to be cinema (for Scorsese) or Dear Esther is claimed to be a game (for certain gamers) – there is an aesthetic transgression, and just as we would baulk at a moral transgression, there is potential for outcry, opposition, and argument. The disagreement, however, is usually hollow, since two positions divided by distinct values never connect in any meaningful way. As Kant observed, it is a 'commonplace' that everyone has their own taste, and also that 'there is no disputing of taste'.

Thus there is no need or purpose in Marvel Studios' myriad fans stepping up to the plate to try to defend the Marvel Cinematic Universe by pointing to examples of movies in that corporate megatext that meet Scorsese's apparent definition of cinema in terms of conveying psychological experiences... as Hardy puts it, the question goes to intention, not outcome, and I would further suggest that what lies at root here is participation in a particular tradition, a distinct practice of making and engaging with films that is not rooted in entertainment, for all that it is frequently marketed successfully as that. Besides all this, Scorsese is surely correct to compare everything that comes out of Disney's corporate process to theme parks, since this is the practice that the House of Mouse pioneered and is still engaged in: an applied psychology of commercial entertainment rooted in meticulous brand management. In this regard, Scorsese's point is nearly impossible to rebuff and comes down to a claim about the limits of authorial intent: whatever filmmakers might achieve in a Marvel Studios movie cannot change the fact that what has been made is the result of a tightly-managed corporate process of engineering both brand and entertainment value on an industrial scale. Our only choice is whether this matters for our enjoyment of what results – and this depends upon which practices we ourselves are engaging in when we go to the cinema.


Silk is About... The Designer's Notes


Silk is About... was a Designer's Notes serial in five parts that ran here at Only a Game from August 27th to September 24th 2019. It examined the thematic influences behind the game Silk, and pondered the game from a historical, personal, and political perspective. Each of the parts ends with a link to the next one, so to read the entire serial, simply click on the first link below, and then follow the “next” links to read on.

Here are the five parts:

  1. Silk is about... 200AD
  2. Silk is about... 1984
  3. Silk is about... Glorantha
  4. Silk is about... Religion
  5. Silk is about... Brexit

Silk is out on Switch, Windows, Mac, and Linux in October 2019.


Silk is About... Brexit

Silk is my Brexit game. There, I said it.

Silk is about Brexit because Silk is about how people live together and, perhaps even more so, how they fail to live together. I see in 200AD an allegory of 2000AD, lessons we can learn and did not learn, and are still not learning.

I am not committed to either side of the Brexit ‘debate’ (‘battle’ is perhaps more accurate, since a debate assumes a conversation entirely absent in this matter). I understand the argument that sees in leaving the European Union an opportunity for national self-determination, even if I myself could not vote for leaving because of my suspicion – now amply proven correct – that voting to leave would not spark the essential political dialogue required for the United Kingdom to acquire a viable, shared national identity. Instead, it deepened a previously ignored divide. Knee-jerk racism lines up on one side alongside those who had more honourable reasons for desiring a departure from the EU, while political one-upmanship and the certainty that everyone has it wrong except those who agree with you overwhelm all sides and leave us no closer to having a sense of what our country could or should be.

In Silk, the desire for self-determination is echoed in the imperial battles the game makes central to the Warlord and the Rebel. Settlements defend themselves in Silk when they feel threatened... today, nations do the same. The potential for military power to be abused was always present, and has little to do with the reasons people desire to defend themselves from threats from the outside. Then as now, what starts as defence ends as empire-building. Many Brits still feel like they are part of the British Empire, even though in truth we are only offered the choice of being a neighbour to the European Empire or a vassal of the US Empire. But that desire to make your own nation everything it can be is not as morally wrong as liberal opponents of national pride make out. As Mary Midgley observed, we are entitled to put our own interests first; every species does this, and doing so need not – and indeed usually does not – devolve into utter selfishness, even if that is an ever-present risk.

What risks getting lost in this perspective, however, is that co-operation is almost always in our best interest. In Silk, this is represented by the Caravan itself, where a hugely diverse range of cultures and ethnicities come together, whether to try to succeed in the challenge of surviving in the wilderness in the Traveller, or to strive to profit from trade in the Noble. The game intentionally has a little casual racism in some of the Advisors’ responses to the world they are travelling through... the unfamiliar culture will always provoke a suspicious reaction, after all. I learned so much about the complexities of racism reading Michael Moorcock’s astonishing Between the Wars quartet, and Isabelle Stengers’ “The Curse of Tolerance” deepened my understanding of this even further. Racism and opposition to racism both block co-operation in their own ways, but the lesson of the Caravan in Silk is that we gain more from co-operating than from going it alone. That’s not an argument for staying in the EU as such: it’s an argument for not letting a fight about whether we should endorse one ideology or another tear us apart as a nation. And that’s just as true in the United States as it is in the United Kingdom.

So when I say that Silk is my Brexit game, I’m not saying that Silk is offering an answer to the problems of Brexit, but rather that in this game I am reflecting on the cultural problems – in the UK and elsewhere – that led us to Brexit, and that are not solved by leaving Europe, nor by remaining. We have lost our sense of the benefits of co-operating, either because we demonise those from other cultures we see as ‘different’ (especially Muslims), or because we have lost respect for our fellow citizens and are no longer willing to let them participate in democracy because we are so convinced that they are ‘wrong’. I see disaster on both paths. Silk is, in a way that is woven into the tapestry of every game of it that anyone plays, an opportunity to reflect upon our interdependence with those around us, and to consider different paths.

We can be more than divided nations squabbling with one another, if that’s what we wish. The question, as Silk asks every player to decide at every juncture, is always: what will we choose...?

Silk is out on Switch, Windows, Mac, and Linux in October 2019.


Silk is About... Religion

We don’t talk about religion, right? That’s what ‘secular’ has come to mean... we don’t talk about religion. Unless of course you want to make criticisms against religion, which are still fair game – indeed, are all but encouraged among the intellectually respectable. Liberals are only credible if they are willing to speak out against Christian nonsense, while conservatives positively thrive upon their distrust of Muslims... So we end up in this strange situation where ‘not talking about religion’ becomes a blanket cover for racism because religion and non-religion are intimate elements of culture, and so if all you’re permitted to do is to speak ill of religion, you have created an environment where racism not only festers, it achieves a kind of illusion of intellectual honesty that, in my lifetime at least, distrust of skin colour has always been mercifully denied.

Because it’s set in 200AD, Silk can be about religion without dealing with the immense baggage of contemporary religions. Islam has yet to be founded, while Christianity and Judaism are a very small part of the world of 200AD, which is dominated by what we tend to unjustly collect under the term ‘Pagan religion’ or, perhaps even more misleadingly, ‘polytheism’. The civic religion of Rome and ancient Greece spreads throughout more than half of the Ancient Silk Road, and collides in the Kushan Empire with eastern Buddhism, which is still a very young religion at this time. It’s also worth noting that the very term ‘religion’ has no real analogue at this time: our capacity to talk about cultural mythos as a package deal emerged via the Enlightenment... the Romans had no equivalent term at all. ‘Religio’, the root of the word, carries the meaning of a sense of duty or responsibility in 200AD, and mostly in the sense of social obligations.

Religions that are huge today are minorities in 200AD. What we call today the Hindu traditions are not entirely absent from the game, but what we usually associate with these spiritual paths is definitely on the fringes – you can sacrifice to Shiva in the Kushan Empire, for instance, but most temples there are dedicated to the Lion Goddess Nana, whom nobody remembers today. In the Parthian Empire, Zoroastrian fire temples are the core of civic religion, and although Islam is still several centuries away you can feel the connections between the Parthian Empire and Islamic culture in many ways... like everything else in life, religions have a history; they are not as isolated and static as we tend to imagine, and in 200AD this is far more evident than it is today.

Not that long ago, I was interviewed about the portrayal of religion in videogames by a PhD student, in part because my game Kult: Heretic Kingdoms had, on the surface, a vehemently anti-religious stance. (The actual story in that regard is much more nuanced, but this isn’t the place to explore it...) One of the things I took from that discussion was the manner in which a huge aspect of the portrayal of religion in videogames is the priest or priestess as the healer – a debt from Dungeons & Dragons that seems to have been tangentially influenced by Hammer horror movies of all things! I became interested in finding another approach to this issue. I didn’t set out to make a game about religion, but once I knew I was making a game about 200AD, I knew that it was inevitably going to be about religion in addition to whatever else it was about.

As I came to develop the class of Ritegiver in Silk, I began to see them as an opportunity for a different way of approaching religion in games. The Ritegiver is, in effect, the diplomat: by being able to perform rites at different shrines and temples, the Ritegiver allows the player to make friends with people in new areas, to stave off rebellion by performing sacrifices that, through civic religion, help bind people to their captured citadels, or simply to ask for aid from strangers. I leave it to the player how they interpret this – cynically, as social manipulation, or idealistically, as a marker for what religions do best when they do not lose their way: binding people together into communities of care. Both ways of understanding religion have some truth to them, and always have.

Silk isn’t a game about religion as most people understand the term. That’s because it’s about the religions of 200AD. I happen to believe that this could tell us more about religion today than it might first appear.

Next, the final part: Brexit


Silk is About... Glorantha

Knowing I wanted to make a game in tribute to The Lords of Midnight, the question was: how? Making a direct spiritual successor to it was clearly not going to work – Legions of Ashworld had already tried, and it had struggled because only fans of the original could truly appreciate it. No, if I was going to create a game that spoke to why The Lords of Midnight was important to me, I was really going to be making a game about a square-based map. It was mapping, and using other people’s maps, that made those early game experiences for me, and this was especially so for The Bard’s Tale, which I painstakingly mapped by hand with graph paper, and then took great pleasure in watching my friends use my maps to complete the game after me.

So I knew I wanted to make this tribute game about exploration, but I also didn’t want to pass up the opportunity to experiment with radical, unexplored possibilities in narrative design, and for this I had another influence: King of Dragon Pass. I had always regretted ‘missing out’ on RuneQuest, possibly the only classic 1980s RPG that I never got to play. King of Dragon Pass let me participate in Greg Stafford’s extraordinary game by being set in the world of Glorantha and being, in a very tangible sense, about Glorantha. To play King of Dragon Pass is to enter into a fantasy world that’s not like any other out there... it’s more Bronze Age than Medieval, a world where gods and spirits are tangible, pressing in on mortal life. David Dunham’s game is an incredible achievement, one that came to my attention because my colleague at International Hobo in the 2000s, Ernest Adams, waxed lyrical about its achievements in narrative design.

But what I really fell in love with in King of Dragon Pass was the Clan Ring, the set of people who advise you as to what decisions you could be taking as the game progresses. I became obsessed with how this worked, and dug into its designed systems and internal language (OSL), becoming ever more convinced that what was ‘just’ another clever extra feature in that particular game could become the central element of a narrative design based upon an entirely different kind of play. Perhaps the kind of play that would see the player striking out across three million square miles of wilderness...

The Clan Ring in King of Dragon Pass became the Advisors in Silk. They’re your party: you hire them to your Caravan, and once you hire them they’re with you until the end of the game. That wasn’t how the design began – for a while, the paper design allowed the Advisors to die if they failed a skill check spectacularly. But as time went on, I came to realise that what I was doing with Silk in terms of letting the player explore the cultures of 200AD (just as King of Dragon Pass lets you explore the culture of Glorantha) was stronger in some ineffable sense if your Caravan was more than just a set of interchangeable pawns. The Caravan is your family in the game... and by necessity, it’s going to be a family of misfits, just like every party of adventurers in RuneQuest. That’s something that speaks to me as a player of games, and a lover of the strange. It’s why even though Silk is set in 200AD, it’s also, in a strange but understandable way, about Glorantha.

Next: Religion


Silk is About... 1984

In 1984 and 1985, amazing things were happening in the British videogames industry. The following year, Japan would overshadow this with titles like Metroid and The Legend of Zelda, which transformed videogames forever by preserving player progression (the genesis of save games), but for these two years nobody anywhere in the world could match the inventiveness of British bedroom coders.

One of these stories is well known... David Braben and Ian Bell made Elite, which with its vast feeling of player freedom would go on to directly influence Grand Theft Auto, and thus give birth to the open world genre as we now know it. But even that’s not the whole story, because Elite is a descendant of tabletop role-playing games, specifically Traveller and Space Opera, and it was the infinite agency of the tabletop RPG that inspired Elite’s radical approach to digital agency. It’s always a mistake to think videogames sprang into life from nowhere... they flowed down the river of artworks like everything else.

Two other great precursors to the open world game came out of these two years, both from 1985: Andrew Braybrook’s Paradroid – which I still suspect was an influence upon Grand Theft Auto’s car stealing (although I have not yet proved it) – and Paul Woakes and Bruce Jordan’s Mercenary, which took Elite’s wireframe world and made a fantastic story out of it (surely the faction system in the original GTA was inspired by this game...?). Paradroid is actually my favourite game of the last century, but I don’t feel quite the same sense of debt towards it as I do to another 1984 classic, perhaps because I got to work with Andrew Braybrook and Steve Turner in the waning years of Graftgold, and so our stories already intersected in some way.

The last of the four harbingers of the open world is Mike Singleton’s The Lords of Midnight, the best adaptation of The Lord of the Rings to never have had the license. Singleton was not influenced by tabletop RPGs as far as I can tell, but was just interested in how to take the two threads of Tolkien’s epic – the adventure story and the epic war – and represent them in the 48K of the ZX Spectrum, Europe’s most iconic home computer. I was spellbound by The Lords of Midnight, even though it was actually terribly difficult to play, and even more difficult to play well. My appreciation of what it achieved grew when I started giving talks about the history of games, and peaked when I finally sacked Ushgarak (let’s not call it the Dark Tower of Mordor...) in Chris Wild’s outstanding port of the game.

Singleton did not rest on his laurels. The open-world-before-open-worlds concept was revisited in a sequel, Doomdark’s Revenge (which also has a fantastic port by Chris Wild), and later in Midwinter and its sequel, games that moved into polygonal 3D and were equally astounding, perhaps even more so, since they attempted the immersive presence we now expect from first person games before the hardware was in any way up to the task of rendering them. But there was just something about that square-based world in The Lords of Midnight that maintained its magic. It’s a mystical wonder that can also be found in Eye of the Beholder and The Bard’s Tale, which also built their worlds on squares, although both had far more computational resources available, and so cannot possibly count as the technical achievement that Mike Singleton’s classic was.

I felt a debt of honour to him. I don’t really know why, but I always have. In the 1990s, when I was working on the Discworld games, I tried to make a game in that style, but it was impossible to make the argument for it then. It’s not that much easier now, to be honest! But at least now we have a thriving indie community who sometimes welcome the strange and wonderful into their hearts. So I made Silk, to pay off that debt to Mike Singleton. It’s why even though the game is set in 200AD, it’s also inextricably about 1984.

Next: Glorantha


Silk is About... 200AD

Silk is about 200AD.

Silk is about 1984.

Silk is about Glorantha.

Silk is about religion.

Silk is about Brexit.

Five seemingly contradictory statements, all absolutely true. The fact that all these claims are true doesn’t spring from any conceptual gymnastics; it flows naturally from the way I came to design and ultimately implement Silk, with the incredible help of Nathan (the programmer) and Jamie (the artist), and many others (like Becky, the portrait artist; Chris, the composer; and Patrick and Sean, the producers).

That games are about things doesn’t sound controversial, but in an odd way it is. That’s because the entertainment value of a game (or a film, for that matter) is the value we elevate above all others for such works – provided a game entertains, all other priorities are rescinded. That’s why games are a multi-billion dollar industry today: not because they are a vibrant, extravagant, hugely inventive artform (although they are), but because they entertain. And who doesn’t like being entertained? By definition, it’s something we all want.

But it’s not enough of a reason to make a game like Silk, because the people who might be entertained by a game like this are not the same people who are going to be entertained by, say, Grand Theft Auto, even though the GTA franchise and Silk have their roots in exactly the same places: the British games of 1984 and 1985 that invented the open world before anyone had thought of calling it that. No, Silk is a niche game... it’s a game for players who are looking to be more than just entertained, who are willing to be challenged to take up a new way of thinking, one quite different from those that most games present us with today.

We should start by acknowledging that this is a game about 200AD. This is a time period I’ve always been enraptured by... the Roman Republic has mutated into the Roman Empire, bringing the seeds of its eventual downfall. Thousands of miles east, the Han Empire is about to lose control of China as it slips into the vicious civil war known as the Three Kingdoms. And in between these two ends of the Ancient Silk Road are two other empires that people just don’t talk that much about – the Parthians, who are Rome’s bitter enemy (and whom Rome never convincingly defeated), and the Kushan Empire, which rules what we now call India with a cosmopolitanism that is quite astonishing for a time two millennia before our own. To play Silk is to visit 200AD. That’s the player experience we’re offering, over and above any other themes I might have woven into its narrative design.

I’ve been writing Designer’s Notes since 1993 for every game that I can definitively call ‘mine’ (without denying my immense dependence upon those who work alongside me). I was inspired to do so by Sandy Petersen, the designer of Call of Cthulhu (and a level designer on Quake), who first made it clear to me that pretending you’re not influenced by other people’s designs is pure arrogance and folly. In this five-part series of Designer’s Notes, I want to look at five things Silk is about. The first, as I’ve just discussed, is 200AD. I’m not going to say too much about that, because if you want to know about 200AD you should play Silk – short of a time machine, there’s no other way of experiencing it! But the other four thematic influences upon Silk – 1984, Glorantha, religion, and Brexit – those are things you probably aren’t going to get out of just playing Silk. They require me to tell something of the story behind the game, and that’s what Designer’s Notes are ultimately about. These are the notes I want to make about the most personal game I’ve ever made.

I hope you’ll join me for this journey.

Next: 1984