Noel Sharkey on Autonomous Weapons Systems (Podcast)

Noel Sharkey is a long-time AI scientist and researcher who has more recently turned his attention to autonomous weapons systems.  He is sharply critical of them and is a leading member of a British-based campaign to ban them from the outset.  He discusses these issues in a very interesting podcast on “SuchThatCast – the Philosopher’s Podcast.”  Professor Sharkey calls for a ban on autonomous weapons, drawing on a range of concerns, from his belief that those developing such systems, or contemplating deploying them, do not understand the limitations of AI, to fundamental moral objections.

I don’t share this view, but want to say that Professor Sharkey is one of the most generous scholars and advocates I’ve corresponded with (we haven’t met in person) over the years.  He has offered Matthew Waxman and me cogent comments and advice on these issues, despite our disagreements of position, and we have been very grateful.  Matt and I have a short essay on the right way to approach regulation of (gradually) automating and perhaps finally autonomous weapons systems, appearing at the end of the year in Policy Review.  A working version – not the final draft, but one with footnotes – is up at SSRN, Law and Ethics for Robot Soldiers.  Our view is that although Professor Sharkey might turn out to be right about the possibilities and limits of AI in controlling weapons systems, it is better to see where the technology goes and regulate what emerges, as it emerges – particularly since the kinds of technologies involved will have applications far beyond weapons systems.

More fundamentally, our view is that much of the focus in today’s debates over autonomous weapons systems is on some hypothesized autonomous technology, whereas we think the real issues lie in the gradual accumulation of automation in these systems – much of it a function of the rest of the military machine, such as the automation of a drone aircraft’s flight functions, rather than of the weapons systems directly.  Regulation of these systems needs to take place in the course of development, including the setting of design requirements that incorporate concerns of applicable law and ethics – at the front end and throughout the course of gradual automation.  Thinking about what a fully autonomous weapons system would have to be in order to satisfy the law and ethics of war is useful for considering basic principles – but the real regulatory work lies in addressing the gradual accumulation of automation.

Moreover, we argue that none of this is well suited to the tools that international lawyers frequently reach for – treaties or appeals to customary international law, still less some kind of grand-bargain multilateral treaty.  Far more desirable, not to mention achievable, is for states to adopt a kind of “best practices” approach in which they relax some of the secrecy surrounding weapons development in order to address the standards that evolving systems have to meet – if not as accepted international law as such, then as the ethical framework that a state such as the US believes should apply to its own weapons development as well as that of others.

These best practices are evolutionary and not set in stone; nor are they international law, whether for the state putting them forward or for others – they are instead approaches to an area in which international law does not dictate specific answers but for which cautiously evolving standards make sense.  They, or something derived from them, might eventually turn into some form of international law – though possibly not, given the rapid evolution of the technology.  The US military already conducts this kind of weapons review across the board, in both formal international legal terms and its own sense of where law and ethics in weapons systems should go, and we think this is the right kind of approach for the US as well as other states in the autonomous weapons area.  (The same basic policy approach applies, we think, to other areas of rapid technological development with implications for weapons – cyber, for example, and remotely piloted drones.)
