UK police's face recognition tech breaks human rights laws. Outlaw it, civil rights group urges Court of Appeal

Appeal starts over Cardiff creepycam deployment

Automated facial recognition (AFR) use by British police forces breaches human rights laws, according to lawyers for a man whose face was scanned by the creepycam tech in Cardiff.

"Put simply, connected to a database with the right information, AFR could be used to identify very large numbers of people in a given place at a given time," Dan Squires QC told the Court of Appeal of England and Wales in written arguments this morning.

Squires is barrister for one Ed Bridges, who, backed by human rights pressure group Liberty, wants to overturn a judicial review ruling from 2019 which failed to halt facial recognition tech use against him by South Wales Police.

The force had set up cameras on Queen Street in the city centre in June 2017, to coincide with the UEFA Champions League Final, and outside a defence technology expo the following year.

The Divisional Court in the Welsh capital of Cardiff said it was satisfied police were complying with the Human Rights Act as well as "the data protection legislation", a ruling Bridges and Liberty now hope to overturn.

"Essentially the use of AFR is analogous to taking the fingerprints or DNA of thousands of persons (if it could be done without their knowledge, cooperation or consent) and instantaneously comparing such biometric data to that of persons whose location is being sought," continued Bridges in written submissions. He argued that the tech breaches Article 8 of the Human Rights Act, the right to privacy.

Squires was questioned repeatedly by all three Court of Appeal judges on his legal arguments, with the judges at times checking whether they had correctly understood him. Lord Justice Singh told the barrister at one point: "I thought you were making a different point [in your written arguments]… I had understood, maybe wrongly, you did complain about the original equality assessment and the only thing you should be on the lookout for is direct discrimination."

Bridges' original case argued that an equality impact assessment of AFR carried out by South Wales Police was not enough to satisfy the force's public sector equality duty. Addressing the judges, Squires replied: "You see, when we look at the impact assessment we see that it only looks at direct discrimination," explaining that indirect discrimination on grounds of race or sex – two important parts of the public sector equality duty – therefore could not have been covered.

The case, due to continue over the next few days, sees three of Britain's most senior civil judges hearing from Liberty, South Wales Police, the Home Office, the Information Commissioner’s Office, the Surveillance Camera Commissioner and the Police and Crime Commissioner for South Wales.

The case continues. None of the various sides are seeking legal costs against the others. ®

Bootnote

The hearing began as a YouTube livestream which rapidly collapsed into chaos, with Sir Terence Etherton, Master of the Rolls and head of the Court of Appeal's Civil Division, phoning judicial tech support for help only to be put through to O2 voicemail. Red-faced court admins quickly deleted the saved livestream from YouTube while the court went on hiatus for half an hour. It eventually abandoned YouTube and switched to a Skype-only hearing.
