New Laws of Robotics proposed for US kill-bots

Droid-on-droid mayhem OK'd; machines to ask before snuffing humans


A new set of laws has been proposed to govern operations by killer robots. The ideas were floated by John S Canning, an engineer at the Naval Surface Warfare Centre, Dahlgren Division – an American weapons-research and test establishment. Mr Canning's “Concept of Operations for Armed Autonomous Systems” presentation can be downloaded here (pdf).

Many Reg readers will be familiar with the old-school Asimov Laws of Robotics, but these are clearly unsuitable for war robots – too restrictive. However, the new Canning Laws are certainly not a carte blanche for homicidal droids to obliterate fleshies without limit; au contraire.

Canning proposes that robot warriors should be allowed to mix it up among themselves freely, autonomously deciding to blast enemy weapon systems. Many enemy “systems” would, of course, be themselves robots, so it's clear that machine-on-machine violence isn't a problem. The difficulty comes when the automatic battlers need to target humans. In such cases Mr Canning says that permission from a human operator should be sought.

“Let machines target other machines,” he writes, “and let men target men.”
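
As a very rough sketch (ours, not Canning's), the proposed rule of engagement boils down to a one-line decision function. The target classification and operator sign-off below are hypothetical placeholders, not anything from the concept document:

    from enum import Enum, auto

    class TargetType(Enum):
        MACHINE = auto()   # enemy weapon system, vehicle or robot
        HUMAN = auto()     # a person, or a system with people aboard

    def may_engage(target, operator_approves):
        """Machines may be engaged autonomously; a human target needs
        explicit permission from a human operator first."""
        if target is TargetType.MACHINE:
            return True                    # droid-on-droid: fire at will
        return operator_approves()         # human in the crosshairs: ask a human

    # Hypothetical usage -- the callback stands in for whatever
    # command-and-control link a real system would use.
    print(may_engage(TargetType.MACHINE, lambda: False))   # True
    print(may_engage(TargetType.HUMAN, lambda: False))     # False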

The concept document makes the point that various kinds of automated death-tech have been allowed to destroy machinery or even people for years. Canning cites anti-shipping missiles which are sometimes sent off over the horizon and told to look around for a target. Other examples include automatic air-defence systems such as Phalanx or Aegis, which blast anything that comes at them too fast, and the “Captor” seabed system, which torpedoes passing submarines but leaves surface ships alone.

It isn't made clear how the ask-permission-to-kill-meatsacks rule could realistically be applied in these cases. Doppler radar is going to have trouble distinguishing between attacking manned jets and incoming missiles, for instance. Even if the two could be swiftly and reliably differentiated, adding a human reaction and decision period to an air-defence engagement may not be a survivable thing to do.

Mr Canning also says that the emphasis should be on destroying enemy weaponry rather than people.

“We can equip our machines with non-lethal technologies for the purpose of convincing the enemy to abandon their weapons prior to our machines destroying the weapons, and lethal weapons to kill their weapons,” he suggests.

This raises the prospect of American robot enforcers packing the crowd-cookers, strobe pacifier cannons or Star Trek puke blasters already reported by El Reg, and also some conventional exploding stuff. Once enemy troops had been partially grilled, rendered epileptic or incapacitated by vomit beams, presumably fleeing as a result, the droid assailants could blow up their abandoned tanks, artillery, ships or whatnot.

Of course, this might not work so well with personal enemy weaponry such as the ubiquitous AK47 or RPG. Interestingly, though, Mr Canning quotes US Air Force major R Craig Burton of the Judge Advocate General's Legal Centre:

“If people or property isn't a military objective, we don't target it. It might be destroyed as collateral damage, but we don't target it. Thus in many situations, we could target the individual holding the gun and/or the gun and legally there's no difference.”

Which seems to suggest that a robot could decide, under Mr Canning's rules, to target a weapon system such as an AK47 for destruction on its own initiative, requiring no permission from a human. If the person holding it was thereby killed, that would be collateral damage and the killer droid would be in the clear. Effectively the robot is allowed to disarm enemies by prying their guns from their cold dead hands.

El Reg's advice? Do what the droids say. They are our friends. ®

