The US Army and Navy have both hired experts in machine ethics to prevent the creation of amoral, Terminator-style killing machines that murder indiscriminately.
By 2010 the US will have invested $4 billion in a research programme into "autonomous systems", the military jargon for robots, on the basis that they would not succumb to fear or the desire for vengeance that afflicts frontline soldiers.
A British robotics expert has been recruited by the US Navy to advise them on building robots that do not violate the Geneva Conventions.
Colin Allen, a scientific philosopher at Indiana University, has just published a book summarising his views, entitled Moral Machines: Teaching Robots Right From Wrong.
He told The Daily Telegraph: "The question they want answered is whether we can build automated weapons that would conform to the laws of war. Can we use ethical theory to help design these machines?"
Pentagon chiefs are concerned by studies of combat stress in Iraq showing that a high proportion of frontline troops support torture and retribution against enemy combatants.
Ronald Arkin, a computer scientist at Georgia Tech who is working on software for the US Army, has written a report concluding that robots, while not "perfectly ethical in the battlefield", can "perform more ethically than human soldiers".
He says that robots "do not need to protect themselves" and "they can be designed without emotions that cloud their judgment or result in anger and frustration with ongoing battlefield events".
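To make that design principle concrete, here is a minimal, hypothetical sketch in Python of an engagement decision gated purely by rules derived from the laws of war, with no self-preservation input and no emotional state. The Target fields, the proportionality threshold, and the rule set are illustrative assumptions for this article, not Arkin's actual software.

```python
# Hypothetical sketch, not Arkin's system: a rule-based check that must
# pass before a weapon release is authorised. There is deliberately no
# self-preservation input and no emotional state; the decision depends
# only on the legal and ethical constraints supplied.

from dataclasses import dataclass

@dataclass
class Target:
    is_combatant: bool           # positively identified as a lawful target
    near_protected_site: bool    # e.g. hospital, school, place of worship
    expected_civilian_harm: int  # estimated civilian casualties
    military_value: int          # estimated military advantage

def engagement_permitted(t: Target, proportionality_limit: float = 0.5) -> bool:
    """Return True only if every constraint holds; refuse otherwise."""
    if not t.is_combatant:        # distinction: never target non-combatants
        return False
    if t.near_protected_site:     # protected objects are off limits
        return False
    if t.military_value == 0:     # no military advantage, no strike
        return False
    # Proportionality: expected civilian harm must not be excessive
    # relative to the anticipated military advantage.
    return t.expected_civilian_harm / t.military_value <= proportionality_limit

print(engagement_permitted(Target(True, False, 1, 10)))  # True: all rules pass
print(engagement_permitted(Target(True, True, 0, 10)))   # False: protected site
```

Note that a target failing any single rule is refused regardless of any threat to the platform itself, which is the force of the "do not need to protect themselves" argument.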
Airborne drones are already used in Iraq and Afghanistan to launch air strikes against militant targets, and robotic vehicles are used to disable roadside bombs and other improvised explosive devices.
~ more... ~