Robots once belonged to the realm of fiction but are now becoming a practical issue for the disarmament community. While some believe that military robots could act more ethically than human soldiers on the battlefield, others counter that such a scenario is highly unlikely and that the technology in question should be banned. Autonomous weapon systems, they argue, will be unable to discriminate between soldiers and civilians, and their use will lower the threshold for resorting to force. In this article, I take a bird's-eye view of the international humanitarian law (IHL) pertaining to autonomous weapon systems. My argument is twofold. First, I argue that it is indeed difficult to imagine how IHL could be implemented by algorithm: the rules of distinction, proportionality, and precautions all call for what are arguably unquantifiable decisions. Second, I argue that existing humanitarian law in many ways presupposes responsible human agency.