You wish to kill a human. Cancel or Allow?

I find, to my embarrassment, that I am utterly unable to top this. The Reg reports on a notional set of rules of engagement for autonomous killing machines. Boiled down, it's “Let machines target other machines, and let men target men.” But these quotes are priceless:

Many Reg readers will be familiar with the old-school Asimov Laws of Robotics, but these are clearly unsuitable for war robots – too restrictive. However, the new Canning Laws are certainly not a carte blanche for homicidal droids to obliterate fleshies without limit; au contraire.

It isn't really made clear how the ask-permission-to-kill-meatsacks rule could be applied in these cases.

Which seems to suggest that a robot could decide, under Mr Canning's rules, to target a weapon system such as an AK47 for destruction on its own initiative, requiring no permission from a human. If the person holding it was thereby killed, that would be collateral damage and the killer droid would be in the clear. Effectively the robot is allowed to disarm enemies by prying their guns from their cold dead hands.

As clever as Mr. Canning is in devising these rules for our lethal robotic servants, in the end the three rules are going to add up to one thing: if it is human, kill it.

Posted by Buckethead
