By Miriam Struyk

PAX, a co-founder of the Campaign to Stop Killer Robots, is working towards a ban on autonomous weapons that operate without meaningful human control. Last week we were in New York at the United Nations, where this issue was discussed at the First Committee. At a well-attended side event at the UN, I had the privilege of speaking on this issue together with Jody Williams of the Nobel Women’s Initiative and Ryan Gariepy of the Canadian company Clearpath, the first robotics company to pledge never to be involved in the development of these weapons. I emphasized that while technology rushes forward, we need to take a time out, to pause.
A time out to debate the trend towards the automation of warfare, and to ensure that not only lives but also the value of human life and dignity are preserved in the long term. Here is a summary of what I said.
Fortunately, discussions on autonomous weapons systems are continuing, from both an International Humanitarian Law and a Human Rights perspective. For PAX, the issue of killer robots is first and foremost an ethical one, perhaps even more so than other disarmament issues. We believe humanity should not surrender meaningful human control over decisions of life and death to machines. Humanity cannot in good conscience outsource moral responsibility; for us this is an issue of human dignity.
Depth and breadth
At the beginning of this year PAX issued a report called ‘Deadly decisions, 8 objections to killer robots’. Fortunately, many of these objections were raised by states and other actors during this First Committee and at other meetings. It is remarkable how many states have devoted time and thought to the issue over the last two years. The four-day meeting at the Convention on Certain Conventional Weapons (CCW) in May, excellently chaired by French ambassador Jean-Hugues Simon-Michel, was enriched by the depth and breadth of states’ interventions. At times, though, there was a little too much emphasis on the legal framework, which risks neglecting the larger and far more important realm of moral and ethical issues.
Legal is not always moral
Michael Ignatieff once said that there are habits of mind that encourage the view that if you have legal coverage, you have moral coverage; but what is legal is, of course, not necessarily moral. Legality also represents only a minimum standard for the behavior of soldiers in the chaos of combat. It is not only about doing what is required, but also about doing less than is permitted: restraining oneself. Human beings are frail, flawed and, indeed, can be “inhumane”; but they also have the potential to rise above the minimum legal standards for killing. Soldiers have the ability to refuse orders. We should not reduce complex issues of morality to purely technical issues of legality.
How much time do we have?
The issue of killer robots is an urgent one. The trouble is, we don’t know how much time we have before, as with most disarmament work, we are too late: the genie is out of the bottle and facts on the ground have been created that are too hard to reverse.
What we know from previous successful disarmament treaties is that they start with the acknowledgement of humanitarian harm or foreseeable harm, a sense of urgency, solid research, an effective civil society coalition, and the work of the ICRC and other multilateral actors. Most crucial of all, they need the willingness of states. Successful treaties had states that were able and willing to devote time and capital to working towards a multilateral solution, and some states that were willing to champion the issue. States were also crucial outside international forums: states developing policies, positions or even moratoria boosted international processes.
Work to be done
To achieve a ban on killer robots, there is work to be done:
- For civil society: to analyse and write policy papers, to organise public debates, to work in capitals and to raise awareness among the general public.
- For CCW states: to adopt a formal mandate in November to continue discussions next year. We believe the four-day meeting in May this year clearly showed there is a will, an appetite, to continue this work in 2015. As many states expressed in May, there is an urgent need for more substantive work in 2015, and we hope this urgency will be reflected in the new mandate.
- For individual states: there are many steps states can take: make statements on the need for meaningful human control, develop political or policy positions, and so on. A comprehensive policy might seem too difficult at this stage, but that should not stop states from taking other steps. Building acceptance and outlining parameters in a position or policy paper will put the debate on concrete terms that can provide a basis for future work. Incremental steps are necessary for both national debates and global discussions.
- For companies: to start debates within and between themselves and to develop Memoranda of Understanding, or even to adopt public policies, as the Canadian company Clearpath has done, which I think is great.
Because, at its very heart, the issue of killer robots is about human dignity. Machines that kill autonomously go against the principles of human dignity: not only the dignity and right to life of those who will be directly affected, but also the dignity of the soldiers and civilians in whose name killer robots will be deployed.
For more information on what happened at the UN First Committee on killer robots, see:
- What the United Nations has to say about “Killer Robots” – Clearpath