Cops told live facial recog needs oversight, rigorous trial design, protections against bias

How about only using face-scan tech if it, er, actually works and is the only option, eh?

Cops should only use facial recognition tech if it is proven to be effective at identifying people, can be used without bias and is the only method available, a UK government advisory group has said.

The Biometrics and Forensics Ethics Group – set up to shadow and advise on the Home Office's activities in those areas – has published a report into the controversial use of facial recognition tech by police.

It looked at the use of automated facial recognition tools in real time: live images from cameras trained on areas where people are on the move – entrances or exits, for instance – are matched against a curated watchlist so police can approach a flagged person there and then.

The overall tone was that the design of live facial recognition trials needed to be more rigorous, and the tech should be used with a large dollop of caution, plenty of independent oversight and only if it actually works. Crucially, police were also advised to take its intrusiveness seriously.

This seems to jar with what has been seen as an ad hoc approach – and a sometimes cavalier attitude – to the use of facial recognition technology by forces. Even London police commissioner Cressida Dick said last year she didn't think the use of facial recog would result in lots of arrests, but claimed the public "expected" cops to trial nascent technologies.

The report was commissioned by the Home Office following a series of highly scrutinised deployments by South Wales Police and London's Metropolitan Police at sports events and in shopping areas.

However, such deployment has taken place without a legal framework, leading to widespread criticism from pressure groups, MPs and watchdogs.

Police have since reframed the rollouts as "trials" – but have typically been unwilling to discuss them in detail, with forces often releasing information at the very last minute, sometimes just the day before an event.

The new report echoed these concerns, and said there was an "inherent ambiguity" in the use of the tech, in that deployments are both operational exercises and "trial-like" experiments.

"It is difficult to discern the purpose of the recent police field trials; were they police operations or experiments?" it asked, saying this raised questions about trial design, consent for participation, and whether the pilots risked undermining public confidence.

The document – drawn up by a working group of three academics, Nina Hallowell, Louise Amoore and Simon Caney, and the lead of IBM's emerging tech unit, Peter Waggett – set out nine ethical principles for the government's policymaking and the police's use of the kit.

These said the use of facial recognition should be judged against measures of necessity, effectiveness, impartiality, proportionality, cost-effectiveness and public trust.

It emphasised that the use of the technology interferes with people's right to conduct their lives without being monitored. As such, it should only be used when less invasive techniques aren't available.

And its use "can be justified only if it is an effective tool for identifying people" – a point campaign groups are likely to seize upon, given that responses to Freedom of Information requests have revealed the Met's use of the tech had a 98 per cent false positive rate and had led to zero arrests.

The report also referenced work by Cardiff University (PDF) late last year that questioned the technology's performance in low light and crowds – conditions that have featured in most of the roll-outs of the kit to date.

A non-exhaustive list of questions the group said could be used to assess effectiveness covered the training given to human operators, the trade-offs between speed and accuracy, and details of image quality, system set-up and criteria for success.

The group also urged the government to ensure its use of the kit avoids bias and "algorithmic injustice", saying the technology had the potential to be unjust in two ways.

First, some kinds of mis-recognition are "inherently demeaning"; second, biased tech can result in unequal and discriminatory treatment of some individuals, with some groups more likely to be detained or asked to identify themselves.

"Automated biometric recognition systems (including data training sets) that will be used in public places should be to be open to scrutiny and effective oversight," it said.

Another principle stressed the need for proper oversight of the way watchlists are drawn up – who produces them, and how accurate the included images actually are. This too should be subject to scrutiny by an independent body, it said.

There should be clarity on who has oversight of the deployment and how it will be evaluated after the fact. In addition, the group said it was important that facial recog should not be used in ways that disproportionately target certain events but not others "without compelling justification".

In their conclusions, the authors noted the lack of independent oversight and governance of the use of live facial recognition and said that, in lieu of this, trials should comply with "the usual standards of experimental trials, including rigorous and ethical scientific design".

The ethics group said it would continue to monitor developments in this field and advise Home Office ministers "as appropriate". ®