'Deeply concerned' UK privacy watchdog thrusts probe into King's Cross face-recognizing snoop cam brouhaha

ICO wants to know if AI surveillance systems in central London are legal

The UK's privacy watchdog last night launched a probe into the use of facial-recognition technology in the busy King's Cross corner of central London.

It emerged earlier this week that hundreds of thousands of Britons passing through the 67-acre area were being secretly spied on by face-recognizing systems. King's Cross includes Google's UK HQ, Central Saint Martins college, shops and schools, as well as the bustling eponymous railway station.

"I remain deeply concerned about the growing use of facial recognition technology in public spaces, not only by law enforcement agencies but also increasingly by the private sector," said Information Commissioner Elizabeth Denham in a statement on Thursday.

"We have launched an investigation following concerns reported in the media regarding the use of live facial recognition in the King's Cross area of central London, which thousands of people pass through every day."

The commissioner added her watchdog will look into whether the AI systems in use at King's Cross are on the right side of Blighty's data protection rules, and whether the law as a whole has kept up with the pace of change in surveillance technology. She highlighted that "scanning" people's faces as they go about their daily business is a "potential threat to privacy that should concern us all. That is especially the case if it is done without people's knowledge or understanding."

Earlier this week, technology lawyer Neil Brown of decoded.legal told us that businesses must have a legal basis under the GDPR to deploy the cameras, as doing so involves the processing of personal data. And given that the system encodes biometric data – someone's face – a business must also satisfy additional conditions for processing special category data.

"Put simply, any organisations wanting to use facial recognition technology must comply with the law – and they must do so in a fair, transparent and accountable way," Denham continued.

"They must have documented how and why they believe their use of the technology is legal, proportionate and justified. We support keeping people safe but new technologies and new uses of sensitive personal data must always be balanced against people’s legal rights.”

This comes after London Mayor Sadiq Khan demanded more information on the use of the camera systems, and rights warriors at Liberty branded the deployment "a disturbing expansion of mass surveillance."

Argent, the developer that installed the CCTV cameras, admitted it uses the tech, and insisted it is there to "ensure public safety." It is not exactly clear how or why the consortium is using facial recognition, though.

A Parliamentary body, the Science and Technology Select Committee, called in mid-July for a "moratorium on the current use of facial recognition" tech, and "no further trials" until there is a legal framework in place. And privacy campaign groups have tried to disrupt police trials. The effectiveness of these tests has also proved dubious.

Early last month, researchers from the Human Rights, Big Data & Technology Project at the University of Essex Human Rights Centre found that use of the creepy cams is likely illegal, and that their success rates are highly questionable.

Use of the technology in the US has also been contentious, and on Wednesday this week, the American Civil Liberties Union said tests showed Amazon's Rekognition system incorrectly matched one in five California politicians with images of 25,000 criminals held in a database. ®
