The UK's Ministry of Defence is "actively" trying to create fully autonomous killer drones, according to a report (PDF) by a campaign group.
"Powered by advances in artificial intelligence (AI), machine learning, and computing, we are likely to see the development not only of drones that are able to fly themselves – staying aloft for extended periods – but those which may also be able to select, identify, and destroy targets without human intervention," said campaign group Drone Wars UK in its latest report, titled "Off The Leash: The development of autonomous military drones in the UK".
According to the group, whose raison d'être is to advocate against the use of armed drones on the basis that they "encourage and lower the threshold for the use of lethal force", the British military is using academic and practical research to enhance military drone technology.
Key to that is the MoD's £80m/year innovation fund, which aims to tip cash into the pockets of companies carrying out R&D that might be useful on the battlefield one day. In addition, the Defence Science and Technology Laboratory (DSTL) is looking long and hard at AI and big data and running public data science challenges to tap up the private sector's problem-solvers. Drone Wars UK, whose report draws on Freedom of Information requests and public data, reckons a significant chunk of this research is leading us towards the rise of autonomous war machines.
"DSTL is also heavily involved in the MoD's work to develop the use of unmanned systems for use in the maritime environment (a number of which were tested and highlighted as part of 'Unmanned Warrior' – a special part of the 'Joint Warrior' naval exercise in 2016)," said Drone Wars UK.
The campaigners went on to highlight a drone swarming project funded by DSTL as being of particular concern. Swarming, the art of sending a large number of drones to carry out the same task, has a handful of applications – the most obvious of which is swamping enemy radars and air defence systems by giving them too many contacts to handle at once.
"A project investigating 'Autonomous Swarm-Based Mission Planning and Management Systems' aimed to develop a swarm-based mission management and mission planning system capable of handling multiple fleets of drones involved in multiple missions simultaneously," said the report, which added that the project aimed to let one human supervise four simultaneous drone missions.
Of greater concern to the report's authors was the Taranis drone, jointly developed by the MoD and BAE Systems. Taranis was described as being capable of taxiing out to a runway, taking off, flying to an operational area, and "self-plotting a route" around that area while "searching for targets". The report also warned that current rules of engagement, under which only humans can authorise firing a weapon, "could change".
All of this comes against the backdrop of US-based tech workers revolting against their bosses' eagerness to jump aboard the US military AI gravy train. Earlier this year Google was forced by public pressure to drop its Project Maven work, which was billed as using AI to analyse drone footage. Microsoft, meanwhile, has decided that its bid for the monster $10bn US military cloud project, among other things, is worth more than bowing to similar pressure, telling critics a couple of weeks ago that it would continue to work on military tech contracts.
Public opinion has more sway than some would believe in this area. A couple of years ago the MoD bought a fresh batch of Reaper drones - which it unconvincingly tried to rename Protector, in the hope of sparing itself future blushes.

Responding to Drone Wars UK, an MoD spokesman said: "There is no intent within the MoD to develop weapon systems that operate entirely without human input. Our weapons will always be under human control as an absolute guarantee of oversight, authority and accountability." ®