Human vulnerability is our strength

by Kris Dobie | Published on 27 May 2019 for The Ethics Institute monthly newsletter

Recently, at The Ethics Institute’s 9th Annual Conference, Prof Christof Heyns gave an insightful presentation on ‘Programming ethical conduct in robots’. More specifically (and provocatively), ‘killer’ robots. Prof Heyns, in his role as Special Rapporteur for the United Nations, has done a great deal of work exploring the ethical dimensions of such technology, and the subject made for a fascinating and surprising presentation. Most surprising was my realisation that our vulnerability is perhaps our best hope.

[Image: Storm trooper picking poppies]

"The golden rule of ethics is to treat others as you want to be treated. We understand that we are

vulnerable to suffering and that we should not inflict that on others." 

 

‘Killer robots’ refers to fully autonomous weapons; essentially, weaponised drones that can identify and kill people without human intervention. In other words, technology that uses a pre-programmed algorithm to decide who lives and dies.

The cases for and against such technology are intriguing. One argument in favour is that robots could be more accurate and less emotional when making such decisions, which might lead to fewer deaths of innocent civilians. Also, human drone operators would be spared the post-traumatic stress of being directly involved in the killing of others. (Consider the emotional dissonance experienced by a remote drone operator who lives somewhere in suburban America, who greets her children in the morning, goes to work, kills some people remotely, heads back home for supper and asks about the kids’ school day…)

Many of the arguments against fully autonomous weaponry are rooted in the fact that it involves handing over a significant, high-stakes human decision to something that isn’t human. This isn’t merely an academic question: artificial intelligence is becoming ever more capable, and we will be able to transfer more and more of our messy human judgement to it in future.

A more immediate concern is what happens to us as a society when we no longer have any ‘skin in the game’ in war. Prof Heyns pointed out that the United Nations was born from war-weary nations coming together after the Second World War to ensure that such human devastation would not happen again. This response was likely not purely a calculation of human lives lost, but was grounded in the emotional, lived suffering of their people. Interestingly, the UN initiative was led by the Allies, who had ‘won’ the war. Such an outcome would have been unlikely had they been fighting with fully autonomous weapons.

The subtext was clear – our human vulnerability is what led us to search for lasting peace. And artificial intelligence is unable to experience human vulnerability. While it can be infinitely better at computing, it cannot compute the innate vulnerability that flows from our biological make-up.

Vulnerability is an inherent part of the human condition and a core dimension of our moral make-up. The golden rule of ethics is to treat others as you want to be treated. We understand that we are vulnerable to suffering and that we should not inflict that on others. We recognise our own humanity in others. 

While our vulnerability makes us moral creatures, it also leads to much of the pain and division in the world. Wars are fought over resources, or because one group is subjugating another. We loathe ‘others’ and the impact they have on us, or fear them for the impact they could have. Vulnerability is at the heart of this hatred, this fear.

Take our unequal South Africa, for example, where we have many ‘others’. Black and white, rich and poor, radical and moderate. Pick your binary. It is interesting, and disturbing, to observe how our politics is taking shape around vulnerability. There are those who want more for vulnerable, marginalised parts of society, and who would obtain it preferably by taking from those who have more. And those who have more see themselves as vulnerable to such taking. These are real forces at play and, while we can disagree with the methods used, it is easy to understand the vulnerability experienced by both sides. It is also easy to see that polarisation of these ‘others’ will lead to a messy outcome. To retain some stability, it is critical that the radical expression of these forces does not become the dominant one in our society. And yet our recent election outcomes show that discontent is growing at both extremes.

To move forward we need to draw on the positive side of our human vulnerability and recognise the dire need to fix our unequal society. While the urgency for the discussion comes from radical elements, the solutions will probably have to come from more moderate sources. All of us in the ‘middle’ have to work harder to look beyond our own vulnerability and see that of others. As a start we need to see the ‘others’ as part of ourselves.

But let us not fool ourselves that finding solutions will be easy. We can imagine that a dispassionate artificial intelligence, programmed to pursue a fair and prosperous society, might come up with smart solutions that actually work. It might even be useful to think about what some of those solutions would be.

At the same time, I would prefer that we humans come up with these solutions ourselves. In solving problems of our common vulnerability, by using our vulnerability as a ‘resource’, we develop our humanity. We could of course transfer these decisions to artificial intelligence, but then we would not get the chance to become better people.


Kris Dobie is Senior Manager: Organisational Ethics Development at The Ethics Institute. He holds a Master of Workplace Ethics from the University of Pretoria.