Article by law professor looks at implications of lethal autonomous military robots


June 30, 2011

An article by leading scholars of 21st-century military technologies, addressing the potential development of robotic weapons capable of using lethal force on their own initiative and the legal and ethical ramifications of such technologies, has been published in the June 2011 issue of the Columbia Science and Technology Law Review.

Gary Marchant, Lincoln Professor of Emerging Technologies, Law and Ethics, and Executive Director of the Center for Law, Science & Innovation at the Sandra Day O’Connor College of Law at Arizona State University, is the lead author of “International Governance of Autonomous Military Robots.”

“There’s a lot of pressure to push forward with using autonomous robotic systems in the military,” Marchant said, “but there are many operational, legal, ethical and policy issues surrounding this next generation of war weapons that need to be considered first. We’re not saying that military robots should be banned; we’re suggesting that there is a range of possible governance mechanisms that should be considered sooner rather than later.”

Co-author Ronald Arkin, Regents’ Professor, Director of the Mobile Robot Laboratory, and Associate Dean for Research & Space Planning, College of Computing at the Georgia Institute of Technology, said it is incumbent upon international society to determine effective means to govern the deployment of autonomous robotic systems, given the rapid pace of development of and heavy investment in them.

“Without critical guidance at this stage in this new revolution in military affairs, we may indeed find ourselves in a situation whereby we rue the haste with which we rushed these systems into the fray,” Arkin said.

The article was produced by the autonomous robotics group of the Consortium on Emerging Technologies, Military Operations, and National Security (CETMONS), a multi-institutional research consortium based at ASU’s Lincoln Center for Applied Ethics. CETMONS studies the national security, ethical, legal and policy implications of emerging military technologies.

Other CETMONS experts joining Marchant and Arkin on the article were Braden Allenby, Lincoln Professor of Engineering and Ethics and Founding Chair of CETMONS, ASU; Edward T. Barrett, Director of Research, U.S. Naval Academy’s Stockdale Center for Ethical Leadership, and Ethics Professor, Department of Leadership, Ethics, and Law; Jason Borenstein, Director of Graduate Research Ethics Programs and Co-Director, Center for Ethics and Technology, Georgia Institute of Technology; Lyn M. Gaudet, Research Director, Center for Law, Science & Innovation, Sandra Day O’Connor College of Law, ASU; Orde Kittrie, Professor and Director, Washington, D.C., Legal Externship Program, Sandra Day O’Connor College of Law, ASU; Patrick Lin, Assistant Professor, Department of Philosophy, California Polytechnic State University, San Luis Obispo; George R. Lucas, Professor of Ethics & Public Policy, Graduate School of Public Policy, Naval Postgraduate School, and Class of 1984 Distinguished Chair in Ethics, Vice Admiral James B. Stockdale Center for Ethical Leadership, U.S. Naval Academy; Richard O’Meara, retired Brigadier General, former Fellow, Stockdale Center for Ethical Leadership, U.S. Naval Academy, and Adjunct Faculty in Global Affairs, Rutgers University; and Jared Silberman, Associate Counsel for Arms Control & International Law, U.S. Navy Office of Strategic Systems Programs, Washington, D.C.

The authors write, “Many technological changes have occurred under the radar, in military labs and private test fields, with the majority of citizens unaware of the leaps and bounds of progress. Robotics is one such modern military technology advancement that has largely escaped public attention to date.”

Built with a combination of the most advanced electronic, computer, surveillance and weapons technologies, these robots have extraordinary capabilities, perhaps soon even the ability to decide to kill, according to the article.

“There have recently been growing calls for the potential risks and impacts of Lethal Autonomous Robots (LARs) to be considered and addressed in an anticipatory and preemptive manner,” the authors note, pointing out such requests by a United Nations human-rights investigator and by experts on unmanned military systems, meeting last fall in Berlin.

The article provides background on the development of autonomous military robotics, beginning with a U.S. Navy project on unpiloted aircraft at the end of World War I and including the unmanned (but human-controlled) aerial vehicles (“drones”) used by the U.S. military in Afghanistan and elsewhere.

"We know that, throughout history, technological evolution and military activity has been linked,” said ASU’s Allenby. “The existential challenge to society represented by warfare, combined with the immediate advantage that new technology often delivers, accelerates technology innovation and diffusion.

“As this article makes clear, however, the relationships between the resulting technology systems, and consequent social and ethical issues and changes, are subtle, profound and complex.”

Allenby said CETMONS was established to help better understand and explain these relationships, so that the advantage and security of the military can be enhanced for the long term.

The authors address the ethical, policy and legal issues related to the creation and development of LARs, in order to raise awareness of the issues and encourage relevant stakeholders to consider appropriate legal and regulatory responses.

“We are quite clear on what we are able to do, from the standpoint of feasibility, both now and in the near-term future,” said Lucas of the Naval Postgraduate School. “We are likewise clear that the ‘military manpower resources’ and ‘force-mix’ questions will plague us unless we can deploy an array of unmanned systems with varying degrees of autonomy and, possibly, lethality.

“What remains to be clarified is what we are permitted under law to do, and what we ought, from the standpoint of ethics, to be trying to accomplish with unmanned systems.”

Current technological advances in robotics undergo a robust arms control review before they are permitted for use in the battle space. That process is expected to continue, so that new developments, including those dedicated to autonomous functions, receive well-developed and comprehensive legal reviews.

The authors delve into existing governance mechanisms for military robots, noting that no laws or treaties currently pertain specifically to them; rather, they are covered, if at all, by a piecemeal patchwork of legislation.

“Technology often outruns policy and ethics, and it’s no surprise that we continually run into this gap with military technologies, since they are on the frontlines of research,” said Cal Poly’s Lin. “This law article is one of the first to comprehensively analyze our policy options, which is made possible by the diversity of experts in CETMONS. We expect this uncommon collaboration to generate new practical findings in other technology areas, too, such as military human enhancements and cyber warfare.”

The experts then evaluate the possibility of legally binding international agreements, a more formal and traditional approach to oversight, such as an arms control treaty. They also examine soft-law governance strategies, which tend to be more flexible, can be launched relatively quickly, and can be adapted easily to changing technological, political and security landscapes.

The authors do not prescribe a specific action or form of oversight, but rather seek to identify the range of possible governance mechanisms on which discussions at the national and international levels must advance promptly.


Janie Magruder, Jane.Magruder@asu.edu
(480) 727-9052
Sandra Day O'Connor College of Law