Post by Steve Gardner on Feb 27, 2008 14:47:56 GMT
This extract sums up the thrust of the article:
...up to now, a human hand has always been required to push the button or pull the trigger.
If we were not careful, that could change.
Also, just take a look at the US DOD's planned spending on unmanned systems - simply staggering.
Source: News.com.au
By Marlowe Hood in Paris
February 27, 2008 02:35pm
GUN-toting robots developed for warfare could easily fall into the hands of terrorists and may one day unleash a robot arms race, a top expert on artificial intelligence says.
"They pose a threat to humanity,'' said University of Sheffield professor Noel Sharkey ahead of a keynote address today before Britain's Royal United Services Institute.
Intelligent machines deployed on battlefields around the world - from mobile grenade launchers to rocket-firing drones - can already identify and lock onto targets without human help.
There are more than 4000 US military robots on the ground in Iraq, as well as unmanned aircraft that have clocked hundreds of thousands of flight hours.
The first three armed combat robots fitted with large-calibre machine guns, manufactured by US arms maker Foster-Miller, were deployed to Iraq last year and proved so successful that 80 more are on order, said Sharkey.
But up to now, a human hand has always been required to push the button or pull the trigger.
If we were not careful, that could change, Prof Sharkey said.
Military leaders "are quite clear that they want autonomous robots as soon as possible, because they are more cost-effective and give a risk-free war", he said.
Several countries, led by the US, have already invested heavily in robot warriors developed for use on the battlefield.
South Korea and Israel both deploy armed robot border guards, while China, India, Russia and Britain have all increased the use of military robots.
Washington plans to spend $US4 billion ($4.3bn) by 2010 on unmanned technology systems, with total spending expected to rise to $US24bn, according to the Department of Defence's Unmanned Systems Roadmap 2007-2032, released in December.
James Canton, an expert on technology innovation and chief executive of the Institute for Global Futures, predicted the deployment within a decade of detachments that would include 150 soldiers and 2000 robots.
The use of such devices by terrorists should be a serious concern, said Prof Sharkey.
Captured robots would not be difficult to reverse engineer, and could easily replace suicide bombers as the weapon of choice.
"I don't know why that has not happened already,'' he said.
But even more worrisome, he said, was the subtle progression from the semi-autonomous military robots deployed today to fully independent killing machines.
"I have worked in artificial intelligence for decades, and the idea of a robot making decisions about human termination terrifies me,'' Prof Sharkey said.
Ronald Arkin of Georgia Institute of Technology, who has worked closely with the US military on robotics, agreed that the shift towards autonomy would be gradual.
But he was not convinced that robots have no place on the front line.
"Robotics systems may have the potential to out-perform humans from a perspective of the laws of war and the rules of engagement,'' he told a conference on technology in warfare at Stanford University last month.
The sensors of intelligent machines, he argued, may ultimately be better equipped to understand an environment and to process information.
"And there are no emotions that can cloud judgement, such as anger,'' he said.
Nor would such machines act out of any instinct for self-defence, he said.
For now, however, there remain several barriers to the creation and deployment of Terminator-like killing machines.
Some are technical.
Teaching a computer-driven machine - even an intelligent one - how to distinguish between civilians and combatants, or how to gauge a proportional response as mandated by the Geneva Conventions, is simply beyond the reach of artificial intelligence today.
But even if technical barriers are overcome, the prospect of armies increasingly dependent on remotely-controlled or autonomous robots raises a host of ethical issues that have barely been addressed.
Mr Arkin pointed out that the US Department of Defence's $US230bn Future Combat Systems program - the largest military contract in US history - provided for three classes of aerial and three land-based robotics systems.
"But nowhere is there any consideration of the ethical implications of the weaponisation of these systems,'' he said.
For Prof Sharkey, the best solution may be an outright ban on autonomous weapons systems.
"We have to say where we want to draw the line and what we want to do -- and then get an international agreement,'' he said.