By David McNally, RDECOM Public Affairs
Scientists and engineers from the U.S. Army Research Laboratory gathered Sept. 10, 2014 to discuss ethical robots.
Dr. Ronald C. Arkin, a professor from Georgia Tech, roboticist and author, challenged Army researchers to consider the implications of future autonomous robots.
“The bottom line for my talk here and elsewhere is concern for noncombatant casualties on the battlefield,” Arkin said. “I believe there is a fundamental responsibility as scientists and technologists to consider this problem. I do believe that we can, must and should apply this technology in this particular space.”
Arkin said he believes lethal autonomous weapons systems, if properly developed and deployed, could save noncombatant lives.
“I wish these systems never had to be built. I wish we never had wars,” Arkin said. “But clearly there is a need for this technology to protect our national interests. But at what cost?”
Arkin encouraged researchers to build ethical considerations into the design of robotic systems as a way to reduce collateral damage.
Lethal autonomy is inevitable, he said. As examples, he listed cruise missiles, the U.S. Navy’s Aegis-equipped cruisers, Patriot missiles, fire-and-forget systems and even land mines by certain definitions.
“Could we create autonomous systems to potentially outperform human warfighters with respect to compliance to international humanitarian law or the Geneva Convention?” Arkin asked. “I’m not saying it’s easy. I’m not saying it’s around the corner. What I am saying is that it should be a topic of research.”
Human failings abound throughout history when it comes to war crimes and atrocities, he said.
“Don’t we have a responsibility as scientists to look for effective ways to reduce man’s inhumanity to man through technology?” Arkin asked. “Research in ethical military robotics could and should be applied toward achieving this end. I believe we can make a difference in this.”
Arkin suggested that the solution would require risk taking on the part of management, dedication and consideration of “very hard problems.”
“How can we make it in the last few seconds when it’s approaching a target and it sees something that doesn’t look right, a school bus, or whatever, where you don’t have time to call home and make that particular decision?” he asked.
Arkin said the events of Sept. 11, 2001, still disturb him.
“There is no reason on earth that we should have ever allowed an aircraft to fly into a building,” he said. “It doesn’t take that much technology if you’re on a collision course and alarms are going off…to be able to make the aircraft gain altitude and usurp authority from the human. Sometimes, machines know better.”
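The override Arkin describes can be pictured as a simple decision rule: when a collision alarm is active and the crew has not responded in time, automation takes authority and commands a climb. The sketch below is purely illustrative; the function name, thresholds and inputs are hypothetical, not drawn from any real avionics system.

```python
# Hypothetical sketch of an automation override on a collision course.
# All names and thresholds here are illustrative assumptions.

def autopilot_command(collision_alarm_active: bool,
                      seconds_to_impact: float,
                      crew_responded: bool) -> str:
    """Decide whether automation should usurp authority from the human."""
    # If impact is imminent and the crew has not reacted, command a climb.
    if collision_alarm_active and seconds_to_impact < 10 and not crew_responded:
        return "CLIMB"   # automation overrides the human and gains altitude
    return "MANUAL"      # otherwise, leave the crew in control

print(autopilot_command(True, 5, False))   # imminent impact, no response
print(autopilot_command(True, 5, True))    # crew already responded
```

The point of the sketch is the asymmetry Arkin highlights: authority shifts to the machine only in the narrow case where the human has demonstrably failed to act.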
One of the professor’s objectives is to ensure robots possess an ethical code. Arkin said robots should be provided with the right of refusal for unethical orders.
“These systems are embedded with troops as organic assets and will work alongside the warfighter,” he said. “I don’t know how to program a robot to be good, but I do know how to put constraints into these systems, which will enable them to refuse under certain circumstances.”
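The constraint-based refusal Arkin describes can be sketched as a vetting step: a proposed action is permitted only if it violates none of a set of hard constraints, and a single violation is grounds for refusal. The sketch below is a minimal illustration under assumed names and rules; it is not Arkin’s actual architecture, and every constraint shown is hypothetical.

```python
# Illustrative sketch of constraint-based refusal of an action.
# All constraint names and action fields are hypothetical assumptions.

def make_governor(constraints):
    """Return a function that vets a proposed action against hard constraints."""
    def permit(action):
        # Collect every constraint the action fails to satisfy.
        violated = [name for name, ok in constraints.items() if not ok(action)]
        # Permit only when nothing is violated; otherwise refuse with reasons.
        return (len(violated) == 0, violated)
    return permit

# Hypothetical hard constraints, each a predicate over a proposed action.
constraints = {
    "target_confirmed_combatant": lambda a: a.get("target_type") == "combatant",
    "no_protected_site_nearby": lambda a: not a.get("protected_site_nearby", True),
    "engagement_authorized": lambda a: a.get("authorized", False),
}

vet = make_governor(constraints)

allowed, reasons = vet({"target_type": "combatant",
                        "protected_site_nearby": False,
                        "authorized": True})

# An unconfirmed target fails one constraint, which is enough to refuse.
refused, reasons2 = vet({"target_type": "unknown",
                         "protected_site_nearby": False,
                         "authorized": True})
```

The design choice the sketch illustrates is the one Arkin emphasizes: the system does not reason about what is good, it only enforces conditions under which it must say no.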
Arkin expressed optimism for the future; however, he said, “There is no way we are going to be able to inculcate these systems with the moral reasoning of human beings any time soon. Not possible.”
Instead, Arkin advocated finding narrow, limited circumstances where roboticists can define the appropriate actions, such as room clearing, counter sniper operations and perimeter protection.
“For a whole bunch of reasons, we should not create robot armies even if it were possible,” Arkin said. “We need humans in the battlespace to understand how horrible war can be.”
Arkin began researching ethical robots in 2006 with a grant from the Department of Defense. He has published multiple papers on the topic, as well as a book, “Governing Lethal Behavior in Autonomous Robots.”
“The introduction of new technologies to society often comes with challenging ethical questions,” said Army researcher Christopher Kroninger, who attended the colloquium. “This was a great exploration of a particularly thorny issue. It was certainly a provocative topic. I’d say the primary line of discussions in the conversations I participated in regarded what sorts of behaviors might prompt more ethical decision making on the battlefield and what governs the role of acceptance of intelligent systems in society.”
Roboticist Philip Osteen, an ARL contractor working in the Autonomous Systems Division, also attended the presentation.
“Researchers have a tendency to focus on specific goals to the point of having tunnel vision, so while it is important to hear talks from researchers working in the same domain,” Osteen said, “it is also essential to hear talks that make us think about the broader implications of our work.”
Osteen said Arkin’s presentation achieved its goal.
“It was thought-provoking and refreshing in its inclusion of results from an array of different research fields,” Osteen said.
ARL hosts frequent colloquiums, academic seminars led by a different lecturer on a different topic at each meeting. The events are streamed live over the Internet to interested researchers across the U.S. Army Research, Development and Engineering Command.
The Army Research Laboratory is part of the U.S. Army Research, Development and Engineering Command, which has the mission to develop technology and engineering solutions for America’s Soldiers.
RDECOM is a major subordinate command of the U.S. Army Materiel Command. AMC is the Army’s premier provider of materiel readiness–technology, acquisition support, materiel development, logistics power projection and sustainment–to the total force, across the spectrum of joint military operations. If a Soldier shoots it, drives it, flies it, wears it, eats it or communicates with it, AMC provides it.