[Infowarrior] - Automated killer robots 'threat to humanity': expert
Richard Forno
rforno at infowarrior.org
Wed Feb 27 13:50:19 UTC 2008
Automated killer robots 'threat to humanity': expert
Feb 27 06:18 AM US/Eastern
http://www.breitbart.com/article.php?id=080227111811.y9syyq8p&show_article=1
Increasingly autonomous, gun-toting robots developed for warfare
could easily fall into the hands of terrorists and may one day unleash a
robot arms race, a top expert on artificial intelligence told AFP.
"They pose a threat to humanity," said University of Sheffield professor
Noel Sharkey ahead of a keynote address Wednesday before Britain's Royal
United Services Institute.
Intelligent machines deployed on battlefields around the world -- from
mobile grenade launchers to rocket-firing drones -- can already identify and
lock onto targets without human help.
There are more than 4,000 US military robots on the ground in Iraq, as well
as unmanned aircraft that have clocked hundreds of thousands of flight
hours.
The first three armed combat robots fitted with large-caliber machine guns,
manufactured by US arms maker Foster-Miller and deployed to Iraq last summer,
proved so successful that 80 more are on order, said Sharkey.
But up to now, a human hand has always been required to push the button or
pull the trigger.
If we are not careful, he said, that could change.
Military leaders "are quite clear that they want autonomous robots as soon
as possible, because they are more cost-effective and give a risk-free war,"
he said.
Several countries, led by the United States, have already invested heavily
in robot warriors developed for use on the battlefield.
South Korea and Israel both deploy armed robot border guards, while China,
India, Russia and Britain have all increased the use of military robots.
Washington plans to spend four billion dollars by 2010 on unmanned
technology systems, with total spending expected to rise to 24 billion dollars,
according to the Department of Defense's Unmanned Systems Roadmap 2007-2032,
released in December.
James Canton, an expert on technology innovation and CEO of the Institute
for Global Futures, predicts the deployment within a decade of detachments
that will include 150 soldiers and 2,000 robots.
The use of such devices by terrorists should be a serious concern, said
Sharkey.
Captured robots would not be difficult to reverse engineer, and could easily
replace suicide bombers as the weapon of choice. "I don't know why that has
not happened already," he said.
But even more worrisome, he continued, is the subtle progression from the
semi-autonomous military robots deployed today to fully independent killing
machines.
"I have worked in artificial intelligence for decades, and the idea of a
robot making decisions about human termination terrifies me," Sharkey said.
Ronald Arkin of Georgia Institute of Technology, who has worked closely with
the US military on robotics, agrees that the shift towards autonomy will be
gradual.
But he is not convinced that robots have no place on the front line.
"Robotics systems may have the potential to out-perform humans from a
perspective of the laws of war and the rules of engagement," he told a
conference on technology in warfare at Stanford University last month.
The sensors of intelligent machines, he argued, may ultimately be better
equipped to understand an environment and to process information. "And there
are no emotions that can cloud judgement, such as anger," he added.
Nor is there any inherent right to self-defence.
For now, however, there remain several barriers to the creation and
deployment of Terminator-like killing machines.
Some are technical. Teaching a computer-driven machine -- even an
intelligent one -- how to distinguish between civilians and combatants, or
how to gauge a proportional response as mandated by the Geneva Conventions,
is simply beyond the reach of artificial intelligence today.
But even if technical barriers are overcome, the prospect of armies
increasingly dependent on remotely-controlled or autonomous robots raises a
host of ethical issues that have barely been addressed.
Arkin points out that the US Department of Defense's 230 billion dollar
Future Combat Systems programme -- the largest military contract in US
history -- provides for three classes of aerial and three land-based
robotics systems.
"But nowhere is there any consideration of the ethical implications of the
weaponisation of these systems," he said.
For Sharkey, the best solution may be an outright ban on autonomous weapons
systems. "We have to say where we want to draw the line and what we want to
do -- and then get an international agreement," he said.
Copyright AFP 2008. AFP stories and photos shall not be published, broadcast,
rewritten for broadcast or publication, or redistributed directly or indirectly
in any medium.