UK tribunal agrees with Clearview AI – Brit data regulator has no jurisdiction
American selfie-scraper shakes off $9M privacy fine as the 'actions of a foreign state are out of scope'
A British tribunal yesterday ruled US selfie-scraper Clearview AI would not have to pay a £7.5 million ($9 million) privacy fine.
The tribunal held that the Information Commissioner's Office (ICO), the UK's data regulator, didn't have the authority to fine Clearview, which scours the public web to collect images on which it trains its AI products.
The watchdog dished out the fine back in May last year, claiming the American company had breached the UK's GDPR by "failing to meet the higher data protection standards required for biometric data"; failing to have a lawful reason for collecting it; and failing to have a process in place to stop the data being kept "indefinitely."
But although the three-member tribunal accepted it was a reasonable inference that there are images of UK residents in Clearview's database, given the country's size and the extent of internet and social media usage within the UK "as compared to other countries where internet usage is not so prevalent," it ruled that the AI company's data processing was "outside the territorial scope of the Regulations, with the consequence that the ICO had no jurisdiction to issue the notices."
When appealing the fine, Clearview had argued that its service is "an Internet Search Engine service which is offered exclusively to foreign (i.e. non-UK/EU) criminal law enforcement and national security agencies, and their contractors, in support of the discharge of their respective criminal law enforcement and national security functions, which are functions outside the material scope of the Regulations, pursuant to Article 2 of those Regulations."
Article 2 refers to Article 2(2)(a) of the UK GDPR (still in current British data protection legislation, albeit by the skin of its teeth), which states that the actions of a foreign state – and specifically "processing of personal data by a competent authority for any of the law enforcement purposes" – are out of scope.
This essentially means if Australian law enforcement is hunting down a Brit for breaking an Australian law, for example, it can process the UK person's data to do so outside of the "scope" of the UK GDPR.
The trio were careful to say that they didn't know whether the selfie-scraper had infringed the UK GDPR or the EU-flavored one, and that this ruling purely addressed the "jurisdictional challenge to the notices."
Data privacy lawyer James Castro-Edwards of Arnold & Porter told us: "Clearview only provided services to non-UK/EU law enforcement or national security bodies and their contractors. The UK GDPR provides that acts of foreign governments fall outside its scope; it is not for one government to seek to bind or control the activities of another sovereign state."
He added: "The decision explores the extra-territorial reach of the UK GDPR, in particular the extent to which a company established outside the UK, but which is involved in monitoring the behavior of individuals in the UK."
He noted: "While in this instance, the Tribunal found that the UK GDPR did not apply, non-UK organizations carrying out similar activities for commercial purposes should still consider their obligations under applicable data protection law."
Why haven't UK cops spent money on this yet?
The question of why there was no take-up is an interesting one.
The UK's law enforcement maintains large biometric databases, spending £54 million ($65 million) with IBM just a few months back on a system that also includes fingerprint-matching.
Plus UK policing minister Chris Philp said earlier this year he was planning "to embed facial recognition technology in policing and ... considering what more the government can do to support the police on this."
The tribunal judges noted that Clearview offered its service "on a trial basis to law enforcement/government" organizations in the UK between June 2019 and March 2020. It added that there were 721 searches made in that trial phase. The "UK Test Phase," it added, took place before the end of the incredibly protracted transition period when the United Kingdom Brexited from the European Union. And it never came back, as there "is no suggestion that the service has been offered to customers established within the UK since that time."
The ICO told the tribunal about the UK Test Phase to establish that there were images of UK residents held within the Clearview database – it wasn't part of the watchdog's case alleging infringements.
The trio noted:
We were not told the reason the trial ended, nor whether it was unsuccessful and if so, the reason why, nor whether the trial was terminated by [Clearview] or the potential clients trialling the service. The possible reasons include that the database did not include sufficient images of UK residents to make it of use to UK law enforcement, but we simply do not know, and we do not speculate.
In a statement sent to The Reg, Jack Mulcaire, General Counsel for Clearview AI, said: “We are pleased with the tribunal's decision to reverse the UK ICO's unlawful order against Clearview AI.”
An ICO spokesperson said: "The ICO will take stock of today's judgment and carefully consider next steps. It is important to note that this judgment does not remove the ICO's ability to act against companies based internationally who process data of people in the UK, particularly businesses scraping data of people in the UK, and instead covers a specific exemption around foreign law enforcement."
Now, run along and replace that profile pic with one of your dog, and set the rest of them to "private." ®