Clearview AI, the controversial startup known for scraping billions of selfies from people's public social network profiles to train a facial-recognition system, may be fined just over £17m ($22.6m) by the UK’s Information Commissioner’s Office (ICO).
The watchdog on Monday publicly mulled punishing Clearview following an investigation launched last year with the Australian Information Commissioner. The ICO believes the US biz broke Britain's data-protection rules by, among other things, failing to have a “lawful reason” for collecting people’s personal photos and info, and not being transparent about how the data was used and stored for its facial-recognition applications.
Clearview harvests people's photos – 10 billion or more, it's thought – from their public social media profiles, and then builds a face-matching system so that if, say, the police upload a picture of someone from a CCTV still, the software can locate that person in its database and provide officers with the corresponding name and online profiles.
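Conceptually, systems like this reduce each face to a numeric embedding vector and then run a nearest-neighbor search over the database. The sketch below is a toy illustration of that general technique, not Clearview's actual implementation – the profile names, vectors, and threshold are all made up for demonstration:

```python
import numpy as np

# Hypothetical toy database: each enrolled face is an embedding vector
# (real systems derive these from a neural network) keyed by a profile.
database = {
    "profile_a": np.array([0.9, 0.1, 0.3]),
    "profile_b": np.array([0.2, 0.8, 0.5]),
}

def cosine_similarity(a, b):
    # Similarity of two embeddings, in [-1, 1]; higher means more alike.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match(query, threshold=0.9):
    # Return the best-matching profile, or None if nothing clears the bar.
    best_name, best_score = None, threshold
    for name, emb in database.items():
        score = cosine_similarity(query, emb)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name

# A query embedding close to profile_a's vector matches it;
# a dissimilar one matches nothing.
print(match(np.array([0.88, 0.12, 0.31])))
```

At a scale of ten billion images, a linear scan like this is impractical; production systems use approximate nearest-neighbor indexes to keep lookups fast.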
As the regulator put it:
The images in Clearview AI Inc’s database are likely to include the data of a substantial number of people from the UK and may have been gathered without people’s knowledge from publicly available information online, including social media platforms. The ICO also understands that the service provided by Clearview AI Inc was used on a free trial basis by a number of UK law enforcement agencies, but that this trial was discontinued and Clearview AI Inc’s services are no longer being offered in the UK.
“I have significant concerns that personal data was processed in a way that nobody in the UK will have expected,” Elizabeth Denham, Blighty's Information Commissioner, added in a statement.
“It is therefore only right that the ICO alerts people to the scale of this potential breach and the proposed action we’re taking."
According to leaked documents, Clearview's software was tested in the United Kingdom by the Metropolitan Police, the Ministry of Defence, and the National Crime Agency, as well as police in North Yorkshire, Northamptonshire, Suffolk, and Surrey. The University of Birmingham also tested the technology. Although its algorithms are no longer being used in the UK, Denham warned that the upstart may still be downloading images from people's social media pages.
“The evidence we’ve gathered and analysed suggests Clearview AI Inc were and may be continuing to process significant volumes of UK people’s information without their knowledge,” she said. “We therefore want to assure the UK public that we are considering these alleged breaches and taking them very seriously.”
- Joint UK-Oz probe finds face-recognition upstart Clearview AI is rubbish at privacy
- Leaked: List of police, govt, uni orgs in Clearview AI's facial-recognition trials
- Facebook ditches its creepy, controversial robot – yes, its facial-recognition AI
- Mounties messed up by using Clearview AI, says Canadian Privacy Commissioner
The multi-million-pound fine is just a proposal. Clearview has time to discuss this little issue with the watchdog, and the amount may be varied; a final decision is expected to be made by mid-2022.
“The UK commissioner's assertions are factually and legally incorrect,” Clearview's British lawyer Kelly Hagedorn told The Register.
“The company is considering an appeal and further action. Clearview AI provides publicly available information from the internet to law enforcement agencies. To be clear, Clearview AI does not do business in the UK, and does not have any UK customers at this time."
Clearview's CEO Hoan Ton-That said he was disappointed with the proposed fine, and reiterated his startup collected “public data from the open internet.”
“My company and I have acted in the best interests of the UK and their people by assisting law enforcement in solving heinous crimes against children, seniors, and other victims of unscrupulous acts,” he said in a statement.
"It breaks my heart that Clearview AI has been unable to assist when receiving urgent requests from UK law enforcement agencies seeking to use this technology to investigate cases of severe sexual abuse of children in the UK." ®