Drones are already doing a lot of killing on behalf of certain governments, but a human being still has to make a conscious decision somewhere and press a button. What if killing machines were programmed to take such decisions all by themselves?
“Killer robots” may sound like fodder for dystopian fiction, but they are exactly what weapons technology experts, human rights groups and United Nations member states are meeting in Geneva this week to discuss.
The five-day gathering, taking place within the context of the UN’s Convention on Certain Conventional Weapons (CCW), is looking at the legal and moral implications of Lethal Autonomous Weapons Systems, LAWS for short, and assessing how advanced their development already is.
IRIN examines the key issues.
What are LAWS?
Answering that question is one of the things this meeting aims to do. LAWS are still only in the development stage; they have never been used on the battlefield. Many countries are working towards more autonomous, even fully autonomous, weapons systems, but exactly what they will be capable of remains unclear, and no legal or commercial definitions for LAWS have yet been agreed. LAWS should not be confused with drones, which, despite flying without pilots, remain controlled by humans on the ground. A truly autonomous weapon is a machine programmed well in advance to seek out certain people or objects and destroy them. The final decision to attack is taken by the machine itself.
Who has LAWS and who wants them?
Again, it is hard to answer that question, since countries are traditionally fairly quiet about their development of advanced weapons technology. But the United States, Russia, Britain, and Israel are all thought to be developing autonomous weapons systems. The United States has been working on the “Crusher,” an unmanned ground combat vehicle, and Britain has tested an unmanned fighter plane called “Taranis.” Meanwhile, South Korea has deployed autonomous “sentries” in the demilitarised zone. These are equipped with machine guns that have the capability to lock on to human targets and shoot them, but have never been used to do so.
Are there advantages to LAWS?
The idea of wars fought by machines, clinically destroying targets, with no human error, can seem attractive. Some military leaders believe autonomous weapons could significantly reduce the number of human soldiers required on the battlefield. Others suggest their use would reduce civilian suffering during conflict.
And what about disadvantages?
Human rights groups say autonomous weapons raise serious moral and ethical questions. Ahead of this week’s meeting in Geneva, Human Rights Watch (HRW) published a report arguing that autonomous weapons should be banned before they are even fully developed. The report, called "The Accountability Gap", raises concerns about liability for war crimes in conflicts in which autonomous weapons are used.
“A fully autonomous weapon could commit acts that would rise to the level of war crimes if a person carried them out, but victims would see no one punished for these crimes,” said Bonnie Docherty, HRW’s senior Arms Division researcher.
The United Nations Human Rights Council has also raised questions about autonomous weapons. A report published in May 2013 by Christof Heyns, the UN special rapporteur on extrajudicial, summary or arbitrary executions, argued that “machines lack morality and mortality, and as a result should not have life and death powers over humans.” Heyns’ report called for a moratorium on the development of autonomous weapons until legal and moral concerns are addressed.
What about the Geneva Conventions?
The International Committee of the Red Cross (ICRC), the guardian of the Geneva Conventions, has also raised concerns about lethal autonomous weapons. The ICRC fears their use would undermine the key principles of distinction, proportionality and precaution aimed at protecting civilians during warfare.
Highlighting the complex nature of modern warfare, the ICRC’s report on autonomous weapons says: “it is not clear how such weapons could discriminate between a civilian and a combatant, as required by the rule of distinction.
“Indeed, such a weapon might also have to distinguish between active combatants and those hors de (out of) combat or surrendering, and between civilians taking a direct part in hostilities and armed civilians.”
“An autonomous weapon system will also have to operate in compliance with the rule of proportionality, which requires that the incidental civilian casualties expected from an attack on a military target not be excessive when weighed against the anticipated concrete and direct military advantage,” the report adds.
However, the ICRC has not joined calls for a pre-emptive ban. Instead, it is contributing to this week’s debate and urging UN member states to address the legal and ethical implications of LAWS before they are further developed or deployed.
How likely is a pre-emptive ban?
There is a precedent for banning a weapon before it is fully developed and used: in 1995 the UN banned blinding lasers.
But other attempts at agreeing UN conventions to ban weapons have been less successful.
After years of campaigning against landmines, supporters of a ban finally abandoned the UN disarmament structure and set up the Ottawa process, which led to the 1997 Convention on the Prohibition of Anti-Personnel Mines.
The same happened with cluster munitions: stalemate within the UN led Norway to invite other interested parties to join the Oslo process, and in 2010 the Convention on Cluster Munitions came into force.
It is more than likely that the debate on LAWS will be long, and progress towards restrictions or a ban is expected to be slow.
NGOs would prefer the debate to stay inside the United Nations because it would ensure the participation of the world’s key military powers. Don’t forget, the United States, Russia and China have all declined to join the conventions against landmines and cluster munitions.
So what can we expect from this meeting?
Lots of discussion, but no major decisions. The diplomat leading the talks, German ambassador Michael Biontino, has sent member states a background “food for thought” document.
Biontino asks them to reflect on a number of issues: the possible consequences of LAWS on regional stability; the ethics of leaving a decision over life and death to a machine; and under what scenarios autonomous weapons would be likely to be deployed.
By the end of the meeting, he hopes there will be a much clearer picture of the legal and moral questions surrounding LAWS and an accurate assessment of their current state of technological development.
From there, the debate is expected to become more formal, ahead of the Convention on Certain Conventional Weapons review conference in 2016, at which concrete proposals for restrictions or a complete ban can be expected.
The whole process sounds slow, laborious, and bureaucratic, but in fact the discussions in Geneva are groundbreaking. Countries, weapons developers, and humanitarians are meeting together to identify the rights and wrongs of a new weapon - a weapon with the potential to fundamentally change the nature of war - before that weapon even exists.