New consortium explores how war weapons impact society

October 15, 2009

In 2009, when astronauts live aboard a space station, people send messages around the world with the touch of a finger and diseases such as polio have been virtually eradicated from Earth, warfare between nations still comes down to men with rifles shooting at each other.

Thanks to technology, the boots-on-the-ground scenario is on the verge of enormous change. But innovations that take soldiers off the battlefield carry implications for society that we may overlook as long as we think only of "old-fashioned" wars in which humans toss grenades.

For, as Peter French says, what impacts the military will eventually impact all of society. New weapons and means of fighting wars could bring profound changes to all aspects of society – and introduce ethical dilemmas that have not even been thought about yet.

A new organization, the Consortium on Emerging Technologies, Military Operations and National Security, chaired by Brad Allenby and managed by the Lincoln Center for Applied Ethics, will explore the meaning of those new weapons and their ethical ramifications.

Consider the possibility, for example, of robots that are programmed to go into combat and kill humans; telepathic helmets that enable members of military units to “think” to each other; miniature chips that can be implanted in bugs that are then sent out to spy; and non-lethal rays that make the recipient feel as though he or she is on fire.

Though some of this technology is already in use, such as drones that are ordered to attack targets in Iraq from military command posts near Las Vegas, much is still the subject of research, French says.

“We are studying the ethical implications of this technology in military and national security operations and its movement into the commercial world. For example: Is it ethical to use some of this equipment outside of war? Is it ethical to use it in war? Does it change the very nature of war? What are the implications, e.g., with respect to privacy, in civilian implementation?”

Allenby notes, for example, that insect-sized robots could be capable of both surveillance and killing.

“When that technology flows back into civic society, which it will, our privacy may be threatened," he says. "And if there’s a backlash in civic society that stops the military from using it, that could be difficult because it is an important technology for their mission.”

The ASU-directed consortium is leading the way in the study of the future of military technology, society and ethics, Allenby says. “There is no place in government that is focused on these issues. We’re filling a niche. The military tends to focus on the warfighting implications of their technology, not the social and cultural issues that arise when it flows back to civil society. We want to look farther out.”

ASU’s partners in the study are the Inamori International Center for Ethics and Excellence at Case Western Reserve University, the Stockdale Center at the U.S. Naval Academy and the Mobile Robot Lab at the Georgia Institute of Technology. Researchers from California Polytechnic State University, General Dynamics, the Brookings Institution and the Department of Defense also are on the team.

The research areas include nanotechnology, biotechnology, robotics, applied cognitive science, information and communication technologies, neurotechnology and law.

Researchers are divided into 10 “thrust groups,” which include such topics as “International Law, Ethics, and Governance on Robotics and National Security”; “New Military Technologies: Implications for State/Society Relationships and Questions of Democracy and Political Theory”; “Enhanced Warfighters”; “Emerging Genetic Technologies in the Military and Issues with Human Subjects Research in the Military”; and “Nonlethal Weapons.”

Participating ASU faculty members include Elizabeth Corley, Jason Robert, Gary Marchant, Christopher Buneo, Steve Helms Tillery, Orde Kittrie, Clark Miller, Veronica Santos and Brian Smith.

The thrust groups will develop grant proposals in their various areas and create scenarios and case studies that will serve as the focus of their research. Meanwhile, Allenby; Joel Garreau, Lincoln Professor of Law, Culture and Values; and others will seek to use that research to understand the broader implications of emerging technologies for both military operations and, more generally, national security.

Representatives of the consortium universities met at Case Western Reserve University in October, and will meet next February at SkySong in Scottsdale, Ariz.

Research in some of the areas sounds like pure science fiction. Under the “Enhanced Warfighters” banner, for example, the scholars are studying ways to keep soldiers from aging in order to maintain a “young” military force, and ways to keep them awake and alert for longer periods of time.

But these ideas carry ethical implications. “If we can build the perfect transhuman warfighter, what happens when the war is over – do we ‘de-enhance’ them?” asks Allenby. Robotic soldiers also bring ethical dilemmas. “Should a robot on the battlefield be allowed to kill a soldier in battle without a human being involved in that decision? Is that legal under the laws that govern warfare?” he asks.

It may seem as though there is plenty of time before society has to worry about bugs that spy or perpetually youthful soldiers, but Allenby says that future is closer than we know.

“Society as a whole doesn’t have a clue about what’s going on in this research. For example, we may be on the verge of being able to keep people alive for 120 years,” he says.

"The research and the implementation of the new technologies in military operations and national security, about which most people seem to be willing to allow free rein as long as they are persuaded they are being protected from external threats, will give us a fairly good picture of what the global society will look like, indeed what humans will look like. It's the future of the world. We have to decide how we want it to look," French says.