UK's GDPR replacement could wipe out oversight of live facial recognition
Question not whether UK police should use facial recog, but how, says surveillance chief
Biometrics and surveillance camera commissioner Professor Fraser Sampson has warned that independent oversight of facial recognition is at risk just as the policing minister plans to "embed" it into the force.
He said this week that the widely slated use of facial recognition at the recent crowning of Charles III was "a glimpse into the future of policing," but noted that new data protection legislation now before Parliament could scrap both his role and the rules governing the use of public space surveillance systems by police and local authorities.
Sampson's job, if you were wondering, is to encourage "compliance with the Surveillance Camera Code of Practice" – the only legal instrument that addresses police use of live facial recognition directly. His office is independent of the government.
Speaking to The Reg, he said: "All the indications are that it'll go through as currently drafted."
The warning lands a day after Sampson, a solicitor specializing in policing law, wrote to the committee overseeing the second take on the bill [PDF] the government hopes will replace the UK's implementation of GDPR.
With UK policing minister Chris Philp planning "to embed facial recognition technology in policing and ... considering what more the government can do to support the police on this," the question of oversight becomes all the more pressing, as Sampson described in a post yesterday.
Sampson doesn't appear to be against the deployment of the tech in principle, saying he is "convinced that modern facial recognition, and other AI-driven biometric surveillance technologies in the pipeline, are potentially too useful an advance in the fight against crime and terrorism for us to turn our noses up at."
He warned The Reg two years back that "biometric and surveillance capabilities" such as facial recog were so "ethically fraught that they can only be acceptably carried out under licence in the future," saying this called for "a minimum single set of clear principles by which those using the biometric and surveillance camera systems will be held to account, transparently and auditably."
How's that going?
New data protection law
You may have read some of our previous coverage on the proposed replacement for UK GDPR, the "Brexit dividend" that has experts worried about EU data adequacy rulings.
According to Sampson, the latest iteration of the Data Protection and Digital Information (DPDI) Bill (version 1 was withdrawn) has two clauses – 104 and 105 – that look to abolish "the office of Commissioner for the Retention and Use of Biometrics" and repeal "both the duty on the government to publish a Surveillance Camera Code of Practice governing the use of public space surveillance systems by police and local authorities and the requirement for a Surveillance Camera Commissioner to oversee it."
Sampson wrote to the DPDI bill committee about the issue this week, concerned not only for the oversight his own role provides, but that "at this stage in the Bill's Parliamentary passage ... there is no provision for these non-casework biometrics functions and 'non-data protection' issues in relation to public space surveillance."
The commissioner said this "remains" his "principal concern," adding: "I am not aware of any meaningful plan to address them once the statutory offices are abolished."
He said in his email to the committee, which you can read here:
It is worth noting that police accountability in their use of new technology such as facial recognition, voice pattern analysis and other AI-driven capabilities is one of the most contentious aspects of biometric surveillance yet remains unaddressed, either in the Bill (the focus of which remains solely the regulation of DNA and fingerprints in certain, limited circumstances) or at all.
As an advocate of the accountable and proportionate use of new technology by the police I think this lacuna is problematic as much for the police themselves as for the communities they serve.
In the background, legal experts are also concerned about the bill's potential effect on data protection if it retains its current drafting.
Legal eagle Chris Pounder at HawkTalk Training has also been watching the new bill, and recently wrote he had "come to the conclusion that the new definition of personal data in the Data Protection and Digital Information No.2 Bill only applies to facial recognition CCTV if the data subject is on a watch-list," adding that the effect would be that many facial recognition systems "will process personal data in total secrecy (i.e. no transparency)."
Pounder believes this means "many facial recognition CCTV systems can be installed without any transparency obligations and used more or less in secret," thus entering what he termed "the data protection twilight zone."
Commenting on this possibility, Sampson told The Reg that this "bit would be of particular concern given the removal of the only instrument for the regulation of public space surveillance," adding that "if you then dilute by changing the definition" of personal data, the protection for data subjects would become "even less."
Pounder, meanwhile, has written that he expects "the number of Facial Recognition systems to mushroom exponentially after the enactment of the No2 Bill."
The Public Law Project, meanwhile, says [PDF]:
The Data Protection and Digital Information (No.2) Bill would weaken important data protection rights and safeguards, making it more difficult for people to know how their data is being used, how decisions about them are being made, and weakening requirements on those who process data to consider the rights and interests of those their actions will affect.
Privacy activists at Big Brother Watch, which regularly gives evidence on civil liberties to the UK government and its regulators, have warned that Met Police facial recognition was found to be "85 percent inaccurate" between 2016 and 2023.
Giving evidence [PDF] in September 2021, the group also expressed concern that several UK police forces had collaborated with "private companies using facial recognition surveillance."
Susannah Copson, Legal and Policy Officer at Big Brother Watch, told The Reg: "Abolishing the Surveillance Camera Commissioner and Surveillance Code of Practice will gut oversight of public surveillance activities and exacerbate the regulatory lacuna of surveillance technologies.
"This will allow for more information about the public to be processed with fewer safeguards and without much-needed oversight. At a time when intrusive technologies are rapidly expanding, the government is taking a big step in the wrong direction."
We have asked the newly formed Department for Science, Innovation and Technology (which took over the digital brief from the Department for Digital, Culture, Media and Sport) for comment. ®