Updated Britons working for Google at its London HQ are being secretly spied on by creepy facial recognition cameras – but these ones aren't operated by the ad-tech company.
Instead it's the private landlord for most of the King's Cross area doing the snooping, according to today's Financial Times.
"The 67-acre King's Cross area, which has been recently redeveloped and houses several office buildings including Google's UK headquarters, Central Saint Martins college, schools and a range of retailers, has multiple cameras set up to surveil visitors," reported the Pink 'Un.
King's Cross is no longer just a London railway terminus and notoriously seedy neighbourhood. The area around the station, once infamous for the types of activities featured in Irvine Welsh novels, has been extensively redeveloped – with tenants now including Google (and YouTube), various other trendy offices, eateries and so on, to the point where it apparently has its own unique postcode.
None of this, however, excuses the reported deployment of creepycams by the developers. They told the FT (paywalled): "These cameras use a number of detection and tracking methods, including facial recognition, but also have sophisticated systems in place to protect the privacy of the general public."
The Register has contacted the King's Cross developers' PR tentacle separately and will update this article if their promised response to our detailed questions is forthcoming.
Tech lawyer Neil Brown of decoded.legal told El Reg that any company running facial recognition cameras "needs to have not only a lawful basis under the GDPR, as is required for any processing of personal data, but also to have met one of the additional conditions for the processing of 'special category' data". He based this, he said, on the assumption that creepycams' encoding of people's faces would probably count as biometric data under current data protection laws.
Broadly, he told us, whoever's running the King's Cross creepycams needs to be certain their use is legal under section 10 of the Data Protection Act 2018, which permits non-consensual data processing for the "prevention or detection of an unlawful act".
Indiscriminate use of facial recognition technology in the UK is largely believed to be illegal, though nobody's quite sure. So far the public conversation has focused upon the antics of police forces, which are very eager to deploy creepycams against the public, arguably in lieu of doing actual policing work. London's Metropolitan Police deployed a system that was extremely inaccurate, not that it stopped them indiscriminately arresting people based on dodgy matches anyway.
Even though cross-party committees of MPs have called for creepycams to be banned until the risks and pitfalls are properly examined, British police forces have decided that Parliament can be safely ignored without any consequences, with the public forced to rely on controls and safeguards designed in the paper-and-ink days of the 1980s.
Rights group Privacy International commented: "The use of facial recognition technology can function as a panopticon, where no one can know whether, when, where and how they are under surveillance.
"In London the creep of pseudo-public spaces, owned by corporations that can deploy facial recognition in what we believe are public streets, squares and parks, presents a new challenge to the ability of individuals to go about their daily lives without fear that they are being watched.
"The police are subject to increasing scrutiny about the legality of their deployment of facial recognition, but the use in the commercial and retail sector has received insufficient attention and scrutiny."
It added: "There is a lack of transparency not only about the use of this technology, but why it is being done, when it is being done, what happens to the images of people going to work, travelling to see family and generally going about their daily lives... These privacy intrusions are problematic regardless of whether or not you believe you have nothing to hide."
So far, private sector use of creepycams hasn't been part of the British national conversation about facial recognition tech. Thus, the developers of King's Cross are about to become the first test case in the court of UK public opinion.
San Francisco became the first major city in the world to ban facial recog, back in May this year. ®
Updated to add at 1547 UTC, 12 August
When The Register asked how many cameras there were, who supplies them and exactly what the safeguards were, a spokesperson for King's Cross instead decided to emit this quote: "In the interest of public safety and to ensure everyone who visits King's Cross has the best possible experience, we use cameras around the site, as do many other developments and shopping centres, as well as transport nodes, sports clubs and other areas where large numbers of people gather. These cameras use a number of detection and tracking methods, including facial recognition, but also have sophisticated systems in place to protect the privacy of the general public."