
Technological Cowardice

What do internet trolls and drone assassinations have in common? An absence of courage brought about by creating the most grotesque kinds of cyborg.

In the heroic ages of ancient Greece, the Norse Vikings, and the Celtic warriors, courage was the central virtue around which society revolved. This was not just in battle, but everywhere in life: to lack the courage to do what was required of you was to bring shame upon yourself and your kin. Fidelity was an important part of this, and Alasdair MacIntyre suggests this was the primary virtue expected of women during this time, but that is not to say it only affected them; indeed, in feudal China, fidelity was more central to a virtuous man than courage. To be alive in the heroic age was to be bound to blood kin that you were expected to defend in both body and honour – and in so doing, sooner or later you would meet your death. To die was everyone’s fate, and this awareness – which we have lost sight of today – provided the backdrop against which courage gained its meaning.

Today, we are inclined to view such situations negatively, emphasising not the culture of valour that mattered to the people of that time, but the ways these stifling strictures of honour suppressed individual liberty. Yet there is a danger here, one entangled with the moral disaster of individualism and brought into focus by the problems with fidelity. For without a common bond against which the exercise of courage acquires its meaning, we either lose sight of it completely or mistakenly identify our outrage with valour. The ease with which our digital public spaces permit us to scratch this itch only deepens the crisis. How do we even know if we are brave when all measure of courage has been lost to us?

A robot cannot show cyber-courage in any personal manner, for it fears nothing and is thus incapable of valour as we understand it. This very absence of both fearfulness and courageousness is precisely why robots are such appealing ‘soldiers’ when war is conditioned solely by the moral disaster of consequentialism. But before we consider the abject failure of the battlefield we ought to consider whether cyber-courage is even a plausible concept – for the evidence of technology’s effects upon this virtue points primarily in the exact opposite direction.

For Alasdair MacIntyre, courage was not only the primary virtue of the heroic age, but a central virtue in any situation. Virtues are qualities that acquire their meaning from the practices that people pursue together, since only in a shared context do qualitative judgements possess a common ground. MacIntyre suggests three virtues are indispensable to any tradition, since without them even maintaining a practice becomes implausible. Truthfulness and a sense of justice are the two virtues required to maintain a viable community; courage is required to do the right thing even when it is difficult – indeed, the most basic understanding of courageousness is as the capacity to act when others would not, and this is vanishingly far from the mere willingness to display outrage, which need not be virtuous.

For a cyborg to display cyber-courage, a robot would need to be capable of encouraging its human to assert themselves virtuously: but how would it know? The failure of Artificial Intelligence has been precisely the discovery, slow to be accepted, that computational intelligence is divorced from the practices of beings. All animals understand their situation through being able to coordinate their memories and habits within their own imagination, which ‘fills in the blanks’ of every circumstance through means so familiar to us that we take them for granted. Yet no robot can do this. The computational efficiency of silicon chips creates an impression of greater mental power because complex calculations are hard for us yet easy for robots. But calculation is a very small aspect of our cognitive capabilities – and for computers, it is all they have. To exist as a being is to live within a world, and this capacity is something no robot possesses, nor is it likely that any will on the current design principles for software.

Rather than cyber-courage, what we have seen in the growing presence of computers in all aspects of human life is an erosion of courage as robots become the point of confrontation, and humans are able to distance themselves from their actions. The internet troll – the 21st century’s resident bully – is emboldened to make verbal attacks on strangers precisely because it is only a computer that is in personal contact with their victim. Bullying has long been associated with cowardice, its psychological appeal resting on the illusion of power created by picking on those who are powerless to stop you. In the playground or workplace, the bully chose to target only those who could be successfully intimidated. The cyber-cowardice engendered by our digital public spaces so successfully isolates trolls from their actions that the risk of reprisal falls to almost nothing. The virtual mask stokes the confidence of trolls, but courage is more than blind assertiveness, and there is nothing courageous about skulking in the shadows and preying upon others who have no capacity for reprisal or restitution.

In the heroic age, the fundamental display of courage was upon the battlefield. There, warriors braved death to defend their brothers in arms, and their families and clans, for whom defeat could mean slavery, rape, or death. There is still courage to be found among today’s soldiers, but it is threatened by the cyber-cowardice that offers the capacity to kill without any risk of injury in return. Armed drones, a grotesque modification of equipment originally intended merely for surveillance, allow missile strikes on distant lands without any risk of personal harm to the operator. Here is the ultimate example of cyber-cowardice, a technology that extinguishes the flame of valour that burns in all those who serve in armed forces and dishonours entire nations, such as the United Kingdom and the United States, that have turned to these robotic weapons as a means of assassination.

Bradley Strawser is the ethicist who has made the strongest case for the moral permissibility of drones. He points to the psychological stress upon drone pilots, and the terrible post-traumatic stress caused by watching people die on a screen. He suggests it takes “intellectual bravery and perhaps some moral courage” to fly drones... but is this not the cyber-cowardice of the internet troll elevated to its most extreme degree? Laurie Calhoun draws exactly the opposite conclusion from the psychological impact of being a killer drone pilot: it demonstrates that they do feel remorse for taking the lives of their victims. Perhaps the most that can be said in defence of the armed drone pilot is that unlike the troll, they suffer for what they do.

I have respect for Strawser, who has engaged with the moral problems of armed drones in a way that is honourable, for all that I radically disagree with his conclusions. He has suggested that the perceived problems with armed drones spring from the intuitive asymmetry of a battlefield where one side can kill without risk. His claim is that this imbalance was already present when jet fighters faced off against guerrillas armed with shoulder-mounted missiles, who could not be deemed remotely equivalent in power. Yet the fighter pilot still put themselves at risk in this scenario: there is not just a difference of degree involved in the use of armed drones; the ratio of risk between combatants has become infinite. Courage cannot survive this asymptotic chasm, and the psychological cost of being part of an armed drone cyborg is evidence of the depravity of this technology, not of any form of courage.

What makes the armed drone seem acceptable is the moral disaster of consequentialism, which sees morality as reducible to calculation. Thus Strawser’s view is that the capacity to complete a mission without risking a soldier is morally obligatory – provided, he repeatedly stresses, that the cause is just. But good ends cannot justify despicable means, and the battlefield emptied of valour ceases to be a site of anything honourable. Indeed, it is no longer a battlefield, but merely the place where extermination takes place in pursuit of a victory that moves further out of reach whenever such robotic weapons are deployed. Every civilian killed or injured in a drone strike sees nothing but the horror of death brought about by a cyborg enemy too cowardly even to show its face.

More cybervirtues next week.



I'm afraid you are conflating courage and "sportsmanship" with morality. War is not a game. The morality of firing a rocket into a compound that acts as the command and control of a terrorist operation and is also the home of the wives and children of said terrorists does not change for better or worse when the pilot is removed from the vehicle of war.

Hi Lewis,
It is not because armed drones are 'unsporting' that I oppose their use, but because they remove the conditions that make 'just war' - the only recognised moral justification for pursuing war at this time - possible. Jus in bello, the proportionate use of means, is essential to the claim of 'just war'. When this is absent, something has gone horribly wrong.

I have two basic objections to your stated counter-argument. Firstly, the way that armed drones have been deployed does not match your hypothetical situation, and is far more problematic than you appear to assume. Secondly, it is an impoverished view of morality that equates equivalent outcomes with equivalent moral assessment. Even contemporary consequentialist ethicists have accepted that the weighing of outcomes, without any inclusion of circumstances, is an inadequate understanding of morality.

Regarding the first point, I encourage you to examine this report from 2012 based on US drone activities in Pakistan. It is a far cry from your hypothetical strike on a 'terrorist command and control centre'. I am open to the possibility that drone strikes are sometimes used as airstrikes, in situations that military ethicists would consider acceptable. Nonetheless, this is an inadequate description of the deployment of armed drones at this time, which have been used to pursue assassinations of convenience that would not have been actioned in a situation that required the deployment of an aircraft with a pilot. See the report at the link below for detailed discussions:

Regarding the second point, you are welcome to adopt a strict outcome-focussed ethical system as your basis for morality, but this is only one of three basic approaches to morality, and has no essential priority over its alternatives. In both a duty ethics (deontological) and a virtue ethics perspective, the removal of the pilot has significant moral impact that you have opted to ignore. To be honest, even from a purely outcome-focussed perspective, the very convenience of the armed drone as a weapon of assassination changes the decision process that leads to its deployment. This moral effect applies even if the other perspectives on morality were discounted for some reason.

I do, however, agree with you that war is not a game. In a game, you are free to pursue any strategy within the rules that will lead to your desired outcome, with no consequences once the game is concluded. In war, nations are bound to adhere to the rules of engagement that they have vowed to honour in international treaties. When they do not do so, they open themselves up to moral criticism, and potential future consequences that are anything but a game.

Thanks for sharing your objection to this piece,



