Facial recog firm Clearview hit with complaints in France, Austria, Italy, Greece and the UK
Privacy groups claim images are stored 'indefinitely', even after deletion, in GDPR breach
Updated Data rights groups have filed complaints in the UK, France, Austria, Greece and Italy against Clearview AI, claiming its scraped and searchable database of biometric profiles breaches both the EU and UK General Data Protection Regulation (GDPR).
The facial recognition company, which is based in the US, claims to have "the largest known database of 3+ billion facial images". Clearview AI's facial recognition tool is trained on images harvested from YouTube, Facebook, and Twitter, and attempts to match faces fed into its machine-learning software against that multi-billion-picture database. The business then provides a link to the place where it found the "match".
Although Clearview AI lists mostly US law enforcement agencies as customers on its website and in public comments, documents cited in the complaint to the UK data regulator, the Information Commissioner's Office [PDF], allege that the UK National Crime Agency, the Ministry of Defence, and several police forces across England all have registered users with Clearview AI.
The complaint also alleges that "images and metadata collected through the scraping process are stored on Clearview’s servers... indefinitely, i.e. even after a previously collected photograph or hosting webpage has been removed or made private."
Google, Twitter, Facebook and even Venmo all sent cease and desist letters to Clearview AI last year asking that it stop scraping people's photos from their websites. The firm's CEO defended its business model at the time by saying: "Google can pull in information from all different websites. So if it's public and it's out there and could be inside Google's search engine, it can be inside ours as well."
The US firm was sued last year by the American Civil Liberties Union. The ACLU also sued the US Department of Homeland Security and its law enforcement agencies last month for failing to respond to Freedom of Information Act requests about their use of Clearview's tech.
One at a time
Back in January this year [PDF], Chaos Computer Club member Matthias Marx managed to get Clearview to delete the hash value representing his biometric profile - although not the actual images or metadata - after filing a complaint with the Hamburg data protection authorities.
The Hamburg DPA found that Clearview AI had added his biometric profile to its searchable database without his knowledge or consent. It did not order the deletion of the photographs, however.
"It is long known that Clearview AI has not only me, but many, probably thousands of Europeans in its illegal face database. An order by the European data protection authorities to remove the faces of all Europeans is long overdue," Marx told The Reg via email. "It is not a solution that every person has to file [their] own complaint."
“Clearview seems to misunderstand the internet as a homogeneous and fully public forum where everything is up for grabs,” commented Lucie Audibert, legal officer at Privacy International, one of a group of four rights groups bringing the complaints. “This is plainly wrong. Such practices threaten the open character of the Internet and the numerous rights and freedoms it enables.”
The other campaign groups include the Hermes Center for Transparency and Digital Human Rights, Homo Digitalis and noyb - the European Center for Digital Rights.
Regulators, including the UK's ICO and France's CNIL, now have three months to respond.
We have asked Clearview for comment. ®
Updated at 1053 UTC to add
Clearview AI told The Reg it "has never had any contracts with any EU customer and is not currently available to EU customers.
"We have voluntarily processed the five Data Access Requests in question, which only contain publicly available information, just like thousands of others we have processed.
"Clearview AI has helped thousands of law enforcement agencies across America," it added. "National governments have expressed a dire need for our technology because they know it can help investigate crimes like money laundering and human trafficking, which know no borders."