[Infowarrior] - DOD wants to build ethical Terminators?

Richard Forno rforno at infowarrior.org
Mon Dec 1 15:18:23 UTC 2008


Pentagon hires British scientist to help build robot soldiers that  
'won't commit war crimes'
The American military is planning to build robot soldiers that will  
not be able to commit war crimes like their human comrades in arms.

By Tim Shipman in Washington
Last Updated: 7:36AM GMT 01 Dec 2008

http://www.telegraph.co.uk/news/worldnews/northamerica/usa/3536943/Pentagon-hires-British-scientist-to-help-build-robot-soldiers-that-wont-commit-war-crimes.html

The US Army and Navy have both hired experts in the ethics of building  
machines to prevent the creation of an amoral Terminator-style killing  
machine that murders indiscriminately.

By 2010 the US will have invested $4 billion in a research programme  
into "autonomous systems", the military jargon for robots, on the  
basis that they would not succumb to fear or the desire for vengeance  
that afflicts frontline soldiers.

A British robotics expert has been recruited by the US Navy to advise  
them on building robots that do not violate the Geneva Conventions.

Colin Allen, a scientific philosopher at Indiana University, has just
published a book summarising his views entitled Moral Machines:
Teaching Robots Right From Wrong.

He told The Daily Telegraph: "The question they want answered is  
whether we can build automated weapons that would conform to the laws  
of war. Can we use ethical theory to help design these machines?"

Pentagon chiefs are concerned by studies of combat stress in Iraq that  
show high proportions of frontline troops supporting torture and  
retribution against enemy combatants.

Ronald Arkin, a computer scientist at Georgia Tech University who is
working on software for the US Army, has written a report which
concludes that robots, while not "perfectly ethical in the
battlefield", can "perform more ethically than human soldiers".

He says that robots "do not need to protect themselves" and "they can  
be designed without emotions that cloud their judgment or result in  
anger and frustration with ongoing battlefield events".

Airborne drones are already used in Iraq and Afghanistan to launch air
strikes against militant targets, and robotic vehicles are used to
disable roadside bombs and other improvised explosive devices.

Last month the US Army took delivery of a new robot built by an  
American subsidiary of the British defence company QinetiQ, which can  
fire everything from bean bags and pepper spray to high-explosive  
grenades and a 7.62mm machine gun.

But this generation of robots is all remotely operated by humans.
Researchers are now working on "soldier bots" that would be able to
identify targets and weapons, and distinguish between enemy forces,
such as tanks or armed men, and soft targets such as ambulances or
civilians.

Their software would be embedded with rules of engagement conforming  
with the Geneva Conventions to tell the robot when to open fire.
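
Neither the article nor the researchers it quotes describe how such
rules of engagement would actually be encoded. Purely as a hypothetical
sketch, a rule-based check of that kind might look something like the
following Python fragment; every name, category and threshold here is
an invented illustration, not any real military system.

    from dataclasses import dataclass

    # Illustrative target categories for a hypothetical engagement check.
    PROTECTED = {"civilian", "ambulance", "medic", "surrendering combatant"}
    MILITARY = {"tank", "armed combatant", "artillery"}

    @dataclass
    class Contact:
        category: str        # classifier output, e.g. "tank" or "ambulance"
        confidence: float    # classifier confidence in [0, 1]
        hostile_act: bool    # has the contact shown hostile intent?

    def may_engage(contact: Contact, human_authorised: bool) -> bool:
        """Return True only if every (illustrative) rule is satisfied."""
        if contact.category in PROTECTED:
            return False                 # never engage protected categories
        if contact.category not in MILITARY:
            return False                 # unknown object: default to no fire
        if contact.confidence < 0.95:
            return False                 # require high classification confidence
        if not contact.hostile_act:
            return False                 # require positive hostile identification
        return human_authorised          # keep a human in the decision loop

    # Example: a confidently identified tank that has fired, with human sign-off.
    print(may_engage(Contact("tank", 0.98, True), human_authorised=True))       # True
    print(may_engage(Contact("ambulance", 0.99, True), human_authorised=True))  # False

The point of such a sketch is that the rules default to not firing
unless every condition is met, which is the design philosophy the
researchers describe.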

Dr Allen applauded the decision to tackle the ethical dilemmas at an  
early stage. "It's time we started thinking about the issues of how to  
take ethical theory and build it into the software that will ensure  
robots act correctly rather than wait until it's too late," he said.

"We already have computers out there that are making decisions that  
affect people's lives but they do it in an ethically blind way.  
Computers decide on credit card approvals without any human  
involvement and we're seeing it in some situations regarding medical  
care for the elderly," he said, a reference to hospitals in the US
that use computer programmes to help decide which patients should not
be resuscitated if they fall unconscious.

Dr Allen said the US military wants fully autonomous robots because  
they currently use highly trained manpower to operate them. "The  
really expensive robots are under the most human control because they  
can't afford to lose them," he said.

"It takes six people to operate a Predator drone round the clock. I  
know the Air Force has developed software, which they claim is to  
train Predator operators. But if the computer can train the human it  
could also ultimately fly the drone itself."

Some are concerned that it will be impossible to devise robots that  
avoid mistakes, conjuring up visions of machines killing  
indiscriminately when they malfunction, like the robot in the film
RoboCop.

Noel Sharkey, a computer scientist at Sheffield University, best known  
for his involvement with the cult television show Robot Wars, is the  
leading critic of the US plans.

He says: "It sends a cold shiver down my spine. I have worked in  
artificial intelligence for decades, and the idea of a robot making  
decisions about human termination is terrifying."

