Explainer: The automation of war

Inside the intensifying debate over the ethics of fully automated weapons
<strong>More than 100 high-profile business leaders have signed a letter urging the United Nations to prevent escalation of the supposed threats posed by autonomous weapons. The letter warns that such weapon development could lead to warfare at a greater scale than ever and at speeds "faster than humans can comprehend".</strong>
Autonomous weapons, or “killer robots”, are designed to identify enemy targets and engage them <a href="https://www.law.upenn.edu/institutes/cerl/conferences/ethicsofweapons/">without human involvement</a>. They are intended to be more efficient than conventional weapons, which require humans to identify targets and assess risks before pulling a trigger.
<b>Breakdown</b>
<span style="font-weight: 400;">The </span><a href="https://futureoflife.org/autonomous-weapons-open-letter-2017"><span style="font-weight: 400;">letter</span></a><span style="font-weight: 400;">, signed by 116 leaders in artificial intelligence technology, was addressed to the UN’s Conference of the Convention on Certain Conventional Weapons.</span>  <span style="font-weight: 400">The </span><a href="https://futureoflife.org/autonomous-weapons-open-letter-2017"><span style="font-weight: 400">letter</span></a><span style="font-weight: 400">, signed by 116 leaders in artificial intelligence technology, was addressed to the UN’s Conference of the Convention on Certain Conventional Weapons.</span>
<span style="font-weight: 400;">With signatures including Tesla’s Elon Musk and Google Deepmind’s Mustafa Suleyman, the letter does not explicitly call for the UN to ban these weapons.</span>  <span style="font-weight: 400">With signatures including Tesla’s Elon Musk and Google Deepmind’s Mustafa Suleyman, the letter does not explicitly call for the UN to ban these weapons.</span>
<span style="font-weight: 400;">However the business leaders said that they felt “especially responsible in raising this alarm” as the technology from their companies is likely to contribute to the development of autonomous weapons.</span>  <span style="font-weight: 400">However the business leaders said that they felt “especially responsible in raising this alarm” as the technology from their companies is likely to contribute to the development of autonomous weapons.</span>
<span style="font-weight: 400;">At a conference on the Convention on Certain Conventional Weapons last year, experts <a href="https://www.unog.ch/80256EE600585943/(httpPages)/F027DAA4966EB9C7C12580CD0039D7B5?OpenDocument">agreed </a>to establish a new Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS).</span>  <span style="font-weight: 400">At a conference on the Convention on Certain Conventional Weapons last year, experts <a href="https://www.unog.ch/80256EE600585943/(httpPages)/F027DAA4966EB9C7C12580CD0039D7B5?OpenDocument">agreed </a>to establish a new Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS).</span>
<span style="font-weight: 400;">The GGE was </span><a href="https://www.stopkillerrobots.org/2017/05/diplomatsfalter/"><span style="font-weight: 400;">due to meet on 21 August</span></a><span style="font-weight: 400;"> to begin discussing potential international regulation of autonomous weapons, but the meeting was delayed because some member states had not paid UN contributions.</span>  <span style="font-weight: 400">The GGE was </span><a href="https://www.stopkillerrobots.org/2017/05/diplomatsfalter/"><span style="font-weight: 400">due to meet on 21 August</span></a><span style="font-weight: 400"> to begin discussing potential international regulation of autonomous weapons, but the meeting was delayed because some member states had not paid UN contributions.</span>
<span style="font-weight: 400;">Toby Walsh, professor of artificial intelligence at the University of New South Wales in Sydney, coordinated the letter and published it at the International Conference on Artificial Intelligence, currently taking place in Melbourne [19-25 August]. </span>  <span style="font-weight: 400">Toby Walsh, professor of artificial intelligence at the University of New South Wales in Sydney, coordinated the letter and published it at the International Conference on Artificial Intelligence, currently taking place in Melbourne [19-25 August]. </span>
<span style="font-weight: 400;">The letter encourages the GGE to “prevent an arms race in these weapons, to protect civilians from their misuse, and to avoid the destabilizing effects of these technologies.”</span>  <span style="font-weight: 400">The letter encourages the GGE to “prevent an arms race in these weapons, to protect civilians from their misuse, and to avoid the destabilizing effects of these technologies.”</span>
<span style="font-weight: 400;">The business leaders warned that autonomous weapons could lead to armed conflict fought “at a greater scale than ever and at timescales faster than humans can comprehend.”</span>  <span style="font-weight: 400">The business leaders warned that autonomous weapons could lead to armed conflict fought “at a greater scale than ever and at timescales faster than humans can comprehend.”</span>
<span style="font-weight: 400;">The letter further warned that such weapons could be misused by despots and terrorists and hacked by opponents, at great risk to civilians.</span>  <span style="font-weight: 400">The letter further warned that such weapons could be misused by despots and terrorists and hacked by opponents, at great risk to civilians.</span>
<span style="font-weight: 400;">“We do not have long to act,” they wrote, “Once this Pandora’s box is opened, it will be hard to close.”</span>  <span style="font-weight: 400">“We do not have long to act,” they wrote, “Once this Pandora’s box is opened, it will be hard to close.”</span>
<b>The tech</b>
<span style="font-weight: 400;">The development of artificial intelligence (AI) in arms builds on the use of unmanned aircraft (drones), which have the advantage of keeping the targeting combatants far from danger.</span>  <span style="font-weight: 400">The development of artificial intelligence (AI) in arms builds on the use of unmanned aircraft (drones), which have the advantage of keeping the targeting combatants far from danger.</span>
<span style="font-weight: 400;">In practical terms, the initial difference would be that the drone identifies the target itself, and makes a calculation on the risk of collateral damage before deciding whether to launch a strike, without any human intervention. Such technology </span><a href="https://www.law.upenn.edu/live/files/4003-20141120---wagner-markus-dehumanizationpdf"><span style="font-weight: 400;">could be advanced to become the overarching means of controlling weapons in war more broadly</span></a><span style="font-weight: 400;">.</span>  <span style="font-weight: 400">In practical terms, the initial difference would be that the drone identifies the target itself, and makes a calculation on the risk of collateral damage before deciding whether to launch a strike, without any human intervention. Such technology </span><a href="https://www.law.upenn.edu/live/files/4003-20141120---wagner-markus-dehumanizationpdf"><span style="font-weight: 400">could be advanced to become the overarching means of controlling weapons in war</span></a><span style="font-weight: 400">.</span>
<a href="https://www.hrw.org/news/2017/01/05/growing-international-movement-against-killer-robots"><span style="font-weight: 400;">According to</span></a><span style="font-weight: 400;"> Human Rights Watch, South Korea has deployed automated gun towers in the demilitarised zone (DMZ) with North Korea.  This system, possibly deployed due to the fact that North Korea's military is <a href="https://www.defense.gov/Portals/1/Documents/pubs/Military_and_Security_Developments_Involving_the_Democratic_Peoples_Republic_of_Korea_2015.PDF">larger in number </a>than that of the South's, <a href="https://en.wikipedia.org/wiki/Samsung_SGR-A1">uses a laser rangefinder and infrared technology</a> to seek targets in the DMZ, and can be delpoyed manually or autonomously.</span>  <a href="https://www.hrw.org/news/2017/01/05/growing-international-movement-against-killer-robots"><span style="font-weight: 400">According to</span></a><span style="font-weight: 400"> Human Rights Watch, South Korea has deployed automated gun towers in the demilitarised zone (DMZ) with North Korea.  This system, possibly deployed due to the fact that North Korea's military is <a href="https://www.defense.gov/Portals/1/Documents/pubs/Military_and_Security_Developments_Involving_the_Democratic_Peoples_Republic_of_Korea_2015.PDF">larger in number </a>than that of the South's, <a href="https://en.wikipedia.org/wiki/Samsung_SGR-A1">uses a laser rangefinder and infrared technology</a> to seek targets in the DMZ, and can be delpoyed manually or autonomously.</span>
<b>The risk</b>
<span style="font-weight: 400;">Campaigners against the development of autonomous weapons </span><a href="https://www.hrw.org/topic/arms/killer-robots"><span style="font-weight: 400;">argue</span></a><span style="font-weight: 400;"> that it is unlikely that AI could replace human judgment and reliably assess the risk of causing a breach of humanitarian law.</span>  <span style="font-weight: 400">Campaigners against the development of autonomous weapons </span><a href="https://www.hrw.org/topic/arms/killer-robots"><span style="font-weight: 400">argue</span></a><span style="font-weight: 400"> that it is unlikely that AI could replace human judgment and reliably assess the risk of causing a breach of humanitarian law.</span>
<span style="font-weight: 400;">Humanitarian law (the international standards that are supposed to govern combat) rests on the “</span><a href="https://www.loc.gov/rr/frd/Military_Law/pdf/Cust-Intl-Hum-Law_Vol-I.pdf"><span style="font-weight: 400;">cardinal principles”</span></a><span style="font-weight: 400;"> of proportionality and distinction.</span>  <span style="font-weight: 400">Humanitarian law (the international standards that are supposed to govern combat) rests on the “</span><a href="https://www.loc.gov/rr/frd/Military_Law/pdf/Cust-Intl-Hum-Law_Vol-I.pdf"><span style="font-weight: 400">cardinal principles”</span></a><span style="font-weight: 400"> of proportionality and distinction.</span>
<span style="font-weight: 400;">These principles, relying on an assumption that any war can only be legally justified as defensive, mean that any combative action must be proportionate to the defensive aim of the war, and any attack must distinguish between civilians and combatants.</span>  <span style="font-weight: 400">These principles, relying on an assumption that any war can only be legally justified as defensive, mean that any combative action must be proportionate to the defensive aim of the war, and any attack must distinguish between civilians and combatants.</span>
<span style="font-weight: 400;">An additional addendum to the canon of humanitarian law, known as the <em><a href="https://www.icrc.org/eng/resources/documents/article/other/57jnhy.htm">Martens Clause</a></em>, established in 1899, requires actors to be mindful of the "public conscience".</span>  <span style="font-weight: 400">An additional addendum to the canon of humanitarian law, known as the <em><a href="https://www.icrc.org/eng/resources/documents/article/other/57jnhy.htm">Martens Clause</a></em>, established in 1899, requires actors to be mindful of the "public conscience".</span>
In a 2016 report, Human Rights Watch said that "Although progress is likely in the development of sensory and processing capabilities, distinguishing an active combatant from a civilian or an injured or surrendering soldier requires more than such capabilities."
In a 2013 <a href="http://www.ohchr.org/Documents/HRBodies/HRCouncil/RegularSession/Session23/A-HRC-23-47_en.pdf">report</a> for the UN's Human Rights Council, Christof Heyns, the special rapporteur on extrajudicial, summary or arbitrary executions, wrote that "Proportionality is widely understood to involve distinctively human judgement."
"The prevailing legal interpretations of the rule explicitly rely on notions such as “common sense”, “good faith” and the “reasonable military commander standard,” Heyns went on, "It remains to be seen to what extent these concepts can be translated into computer programmes, now or in the future." "The prevailing legal interpretations of the rule explicitly rely on notions such as “common sense”, “good faith” and the “reasonable military commander standard,” Heyns went on, "It remains to be seen to what extent these concepts can be translated into computer programmes, now or in the future."
<span style="font-weight: 400;">The Stop Killer Robots campaign group </span><a href="http://www.stopkillerrobots.org/wp-content/uploads/2013/03/KRC_PR_CCW_30May2017fnl.pdf"><span style="font-weight: 400;">warned</span></a><span style="font-weight: 400;"> earlier this year that “low-cost sensors and advances in artificial intelligence are making it increasingly practical to design weapons systems that would target and attack without any meaningful human control.”</span>  <span style="font-weight: 400">The Stop Killer Robots campaign group </span><a href="http://www.stopkillerrobots.org/wp-content/uploads/2013/03/KRC_PR_CCW_30May2017fnl.pdf"><span style="font-weight: 400">warned</span></a><span style="font-weight: 400"> earlier this year that “low-cost sensors and advances in artificial intelligence are making it increasingly practical to design weapons systems that would target and attack without any meaningful human control.”</span>
<a href="http://www.stopkillerrobots.org/wp-content/uploads/2013/03/KRC_CountryViews_May2017.pdf"><span style="font-weight: 400;">According to the campaign group</span></a><span style="font-weight: 400;">, 19 countries have endorsed calls for a preemptive ban on fully autonomous weapons systems.</span>  <a href="http://www.stopkillerrobots.org/wp-content/uploads/2013/03/KRC_CountryViews_May2017.pdf"><span style="font-weight: 400">According to the campaign group</span></a><span style="font-weight: 400">, 19 countries have endorsed calls for a preemptive ban on fully autonomous weapons systems.</span>
<span style="font-weight: 400;">In 2015, Walsh organised </span><a href="https://futureoflife.org/open-letter-autonomous-weapons"><span style="font-weight: 400;">another letter</span></a><span style="font-weight: 400;"> signed by over 1000 tech experts and scientists, which also warned against starting an AI military arms race.</span>  <span style="font-weight: 400">In 2015, Walsh organised </span><a href="https://futureoflife.org/open-letter-autonomous-weapons"><span style="font-weight: 400">another letter</span></a><span style="font-weight: 400"> signed by over 1000 tech experts and scientists, which also warned against starting an AI military arms race.</span>
<span style="font-weight: 400;"><strong>Read more</strong></span>  <span style="font-weight: 400"><strong>Read more</strong></span>
<span style="font-weight: 400;">“</span><a href="https://www.wikitribune.com/stories/how-will-humanity-go-extinct/"><span style="font-weight: 400;">How will humanity go extinct?</span></a><span style="font-weight: 400;">”</span>  <span style="font-weight: 400">“</span><a href="https://www.wikitribune.com/stories/how-will-humanity-go-extinct/"><span style="font-weight: 400">How will humanity go extinct?</span></a><span style="font-weight: 400">”</span>
<span style="font-weight: 400;">United Nations Institute for Disarmament Research (UNIDIR) “</span><a href="http://www.unidir.org/files/publications/pdfs/framing-discussions-on-the-weaponization-of-increasingly-autonomous-technologies-en-606.pdf"><span style="font-weight: 400;">Framing discussions on the weaponization of increasingly autonomous technology”</span></a>  <span style="font-weight: 400">United Nations Institute for Disarmament Research (UNIDIR) “</span><a href="http://www.unidir.org/files/publications/pdfs/framing-discussions-on-the-weaponization-of-increasingly-autonomous-technologies-en-606.pdf"><span style="font-weight: 400">Framing discussions on the weaponization of increasingly autonomous technology”</span></a>
<em>Jack Barton is a WikiTribune journalist. He has an LLM in Human Rights and a background reporting on law and international development. Follow @jackbarton91</em>