Explainer: The automation of war


More than 100 high-profile business leaders have signed a letter urging the United Nations to prevent escalation of threats purportedly posed by autonomous weapons. The letter warns that such weapons development could lead to warfare at a greater-than-ever scale and at a speed “faster than humans can comprehend.”

Autonomous weapons systems, also known as “killer robots,” are designed to identify and engage enemy targets without human involvement. They are intended to be more “efficient” than conventional warfare, which requires humans to identify targets and assess risks before pulling a trigger.


The letter, signed by 116 leaders in artificial intelligence technology, was addressed to the UN's Convention on Certain Conventional Weapons. Signatories include Tesla's Elon Musk and Google DeepMind's Mustafa Suleyman, though the letter does not explicitly call for the UN to ban these weapons.

However, the group said it felt “especially responsible in raising this alarm,” as technology from their companies is likely to contribute to the development of autonomous weapons.

At a conference on the Convention on Certain Conventional Weapons last year, experts agreed to establish a new Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS). The group was due to meet on 21 August to begin discussing potential international regulation of autonomous weapons, but the meeting was delayed as some member states had not paid their UN contributions.

Toby Walsh, professor of artificial intelligence at the University of New South Wales in Sydney, coordinated the letter and published it at the International Joint Conference on Artificial Intelligence, which took place in Melbourne in August.

The letter encourages the GGE to “prevent an arms race in these weapons, to protect civilians from their misuse, and to avoid the destabilizing effects of these technologies.”

The business leaders warned that autonomous weapons could lead to armed conflict fought “at a greater scale than ever and at timescales faster than humans can comprehend.”

The letter further warned that such weapons could be misused by despots and terrorists and hacked by opponents, posing great risk to civilians.

“We do not have long to act,” they wrote. “Once this Pandora’s box is opened, it will be hard to close.”

The tech

The development of artificial intelligence (AI) in weaponry builds on the use of unmanned aircraft (drones), which have the advantage of keeping their operators far from danger.

In practical terms, a drone could, without any human intervention, identify a target and calculate the risk of collateral damage before launching a strike. Such technology could advance to become the overarching means of controlling weapons in war.

According to Human Rights Watch, South Korea has deployed automated gun towers in the demilitarized zone (DMZ) with North Korea. The system uses a laser rangefinder and infrared technology to seek targets in the DMZ, and can be operated manually or autonomously.

The risk

Campaigners against the development of autonomous weapons argue that artificial intelligence is unlikely to be able to replace human judgment and reliably assess risks, raising the prospect of breaches of humanitarian law.

Humanitarian law (the international standards that were created to govern combat) rests on the “cardinal principles” of proportionality and distinction.

These principles, relying on an assumption that any war can only be legally justified as defensive, require that any combative action must be proportionate to the defensive aim of the war, and any attack must distinguish between civilians and combatants.

A later addition to the canon of humanitarian law, known as the Martens Clause and established in 1899, requires governments to be mindful of the “public conscience.”

In a 2016 report, Human Rights Watch said, “Although progress is likely in the development of sensory and processing capabilities, distinguishing an active combatant from a civilian or an injured or surrendering soldier requires more than such capabilities.”

In a 2013 report for the UN’s Human Rights Council, Christof Heyns, the special rapporteur on extrajudicial, summary or arbitrary executions, wrote that “Proportionality is widely understood to involve distinctively human judgement.”

“The prevailing legal interpretations of the rule explicitly rely on notions such as ‘common sense,’ ‘good faith’ and the ‘reasonable military commander standard,’” Heyns wrote. “It remains to be seen to what extent these concepts can be translated into computer programmes, now or in the future.”

The Stop Killer Robots campaign group warned earlier this year that “low-cost sensors and advances in artificial intelligence are making it increasingly practical to design weapons systems that would target and attack without any meaningful human control.”

According to the campaign group, 19 countries have endorsed calls for a preemptive ban on fully autonomous weapons systems.

In 2015, Walsh, the AI professor, organized another letter signed by over 1,000 tech experts and scientists, which also warned against starting a military arms race powered by artificial intelligence.

Read more

How will humanity go extinct?

United Nations Institute for Disarmament Research (UNIDIR) “Framing discussions on the weaponization of increasingly autonomous technology”

Jack Barton is a WikiTribune journalist. He has an LLM in Human Rights and a background reporting on law and international development. Follow @jackbarton91



