UK's Surveillance Commissioner warns of 'ethically fraught' facial recognition tech concerns
How about being an anonymous face in a crowd? Is that not allowed anymore?
Facial recognition technology (FRT) may need to be regulated in much the same way as some ethically sensitive medical techniques to ensure there are sufficient safeguards in place to protect people's privacy and freedoms.
That’s according to Professor Fraser Sampson, the UK government’s Surveillance Camera Commissioner (SCC), who works with the Home Office to oversee the use of surveillance technology in the UK.
He was responding to last week’s report by the Geneva-based Human Rights Council (HRC), which argued that the protection of human rights should be at the heart of the development of AI-based systems, including in areas such as law enforcement.
The report went on to say that unless sufficient safeguards are in place to protect human rights, there should be a moratorium on the sale of AI systems and those that fail to meet international human rights laws should be banned.
Now, the SCC has added his voice to the debate as lawmakers around the world attempt to create a workable legal framework in the face of growing calls for human rights protections.
“This is a fast-evolving area and the evidence is elusive, but it may be that the aspects currently left to self-determination present the greatest risk to communities, or simply give rise to the greatest concern among citizens,” he told The Register.
“It may even be the case that some technological biometric and surveillance capabilities such as FRT are so ethically fraught that they can only be acceptably carried out under licence in the future – perhaps akin to the regulatory arrangements for human fertilisation and embryology.
“That is a matter of policy for others," he said.
“But we need, as a minimum, a single set of clear principles by which those using biometric and surveillance camera systems will be held to account, transparently and auditably,” Sampson added.
- Plot twist: Google's not spying on King's Cross with facial recognition tech, but its landlord is
- Metropolitan Police's facial recognition tech not only crap, but also of dubious legality – report
- Those facial recognition trials in the UK? They should be banned, warns Parliamentary committee
- Zero arrests, 2 correct matches, no criminals: London cops' facial recog tech slammed
- Teen turned away from roller rink after AI wrongly identifies her as banned troublemaker
- Let's check in with our friends in England and, oh good, bloke fined after hiding face from police mug-recog cam
Asked to comment further on the HRC’s report he told us: “Where biometric surveillance systems are being bought with public money and deployed in the public interest then there is surely a legitimate expectation that all parties will adopt an ethical and human rights compliant approach.”
“I agree that, if used without sufficient regard to how they affect people’s human rights, the emerging technological capabilities in the area of surveillance and biometrics can be negative and potentially catastrophic,” he added.
The use of AI and technologies such as FRT has recently been the subject of governmental scrutiny both in the UK and the US.
In 2019, London's Metropolitan Police deployed a system that was not only extremely inaccurate but also led officers to arrest people on the basis of dodgy matches anyway.
In May of that year, Met officers fined a man for covering his face while they were conducting a trial of the technology in Romford, London.
In August 2020, the Court of Appeal found that South Wales Police's use of facial recognition technology had been unlawful.
In April, the EU published its own proposals for harmonised rules on artificial intelligence (Artificial Intelligence Act) where it too recognised the benefits while acknowledging the “new risks or negative consequences for individuals or the society.” ®