
London's Metropolitan Police flip the switch: Smile, fellow citizens... you're undergoing Live Facial Recognition

This is not a test

The Metropolitan Police are using live facial recognition (LFR) in various locations in central London today after spending two years testing the technology.

Most recently spotted at Oxford Circus, the vans are equipped with NeoFace's recognition software, which runs captured images against a pre-specified watchlist of wanted suspects.

The use of the technology was announced online in advance, and police placed signs around the van to warn the public what was going on. The Met's Twitter announcement included the line: "There is no legal requirement for you to pass through the LFR system."

Which stands rather at odds with the Met's arrest and fining of a gent in Romford who hid his face from their cameras while they were being tested.


Privacy rights group Big Brother Watch (BBW) decried the decision to use a technology it considers grossly invasive of privacy and hopelessly inaccurate.

Silkie Carlo, director of BBW, said: "It's alarming to see biometric mass surveillance being rolled out in London. Never before have citizens been subjected to identity checks without suspicion, let alone on a mass scale.

"All the evidence shows this tech makes us less free and no safer. The 93 per cent misidentification rate poses a serious threat to innocent members of the public. The cost to our liberties, let alone the public purse, is unacceptably high.

"We're appalled that [city mayor] Sadiq Khan has approved such useless, dangerous and authoritarian surveillance technology for London. This undemocratic expansion of the surveillance state must be reversed."

The system uses NEC's NeoFace system to compare faces against a "bespoke watchlist" apparently created for every individual deployment.

The Met released some information on the results of its 10 previous test deployments of the technology, run between August 2016 and February 2019. It said reports that the system was racially biased were not borne out by its findings, although there is a gender bias – men are more likely to be flagged as false positives.

The Met chose to compare the system's success against "Manhunt" tactics, where officers are sent to multiple locations in an attempt to find an offender.

The report, available as a PDF here, also compared facial recognition with famously brilliant stop and search tactics – which resulted in an arrest in 13.3 per cent of cases versus 30 per cent for people stopped as a result of the cameras. So seven in 10 people flagged for arrest were false positives then.
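The arithmetic behind that "seven in 10" figure is easy to check. A minimal sketch – the percentages are from the report, everything else here is illustrative:

```python
# Check the rates quoted in the Met's report. Only the percentages come
# from the report; the function and names are illustrative.

def false_positive_share(arrest_rate: float) -> float:
    """If only `arrest_rate` of people stopped end up arrested,
    the remainder of those flagged were stopped in error."""
    return 1.0 - arrest_rate

lfr_arrest_rate = 0.30     # 30% of camera-flagged stops led to arrest
stop_search_rate = 0.133   # 13.3% for conventional stop and search

# Roughly 0.7 of camera-flagged stops did not lead to arrest,
# i.e. "seven in 10" flags were false positives.
print(round(false_positive_share(lfr_arrest_rate), 2))
```

By that same yardstick, conventional stop and search misfires closer to nine times in 10 – which is presumably why the Met picked it as the comparison.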


But the figures are hardly Minority Report. Ten test deployments, lasting 69 hours in total, resulted in nine arrests. The report noted: "Additional arrests were also made as a result of proactive policing by officers attached to the LFR deployment." Which sounds a bit like nabbing people who baulk at the signs warning of the cameras rather than a successful use of the system.

The tests used NeoFace's Rapid Deployment Alienware laptop version, which can handle two concurrent camera streams from 2 or 5 megapixel cameras. The operator can tweak the system to adjust the number of faces detected per frame and the decision threshold score for an alert. For test purposes, this was limited to five faces per frame – any more resulted in too much lag – and threshold scores were left as preset by NEC.
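Put together, the operator-facing knobs described in the report look something like the following. This is a hypothetical sketch – the parameter names are invented for illustration and bear no relation to NEC's actual, non-public configuration interface:

```python
# Hypothetical operator settings for a test LFR deployment, based on the
# constraints described in the Met's report. All names are invented for
# illustration; NEC's real configuration interface is not public.
lfr_test_settings = {
    "camera_streams": 2,        # laptop build handles two concurrent feeds
    "camera_megapixels": 5,     # report says 2 MP or 5 MP cameras supported
    "max_faces_per_frame": 5,   # capped during tests: more caused lag
    "alert_threshold": None,    # None = left at NEC's factory preset
}

# The trade-off the testers hit: raising max_faces_per_frame increases
# coverage of a crowd but adds processing lag per frame.
assert lfr_test_settings["max_faces_per_frame"] <= 5
```

The notable detail is the threshold being left at the vendor preset: the one knob that directly trades false positives against missed matches was never tuned for the deployment environment.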

Several cities around the world, notably San Francisco, have banned police and other agencies from using the technology. The European Commission has also voiced concerns about its use. ®
