Clearview AI promises not to sell face-recognition database to most US businesses
Caveats apply, your privacy may vary
Clearview AI has promised to stop selling its controversial face-recognizing tech to most private US companies in a settlement proposed this week with the ACLU.
The New York-based startup made headlines in 2020 for scraping billions of images from people's public social media pages. These photographs were used to build a facial-recognition database system, allowing the biz to link future snaps of people to their past and current online profiles.
Clearview's software can, for example, be shown a face from a CCTV still, and if it recognizes the person from its database, it can return not only the URLs of that person's social networking pages, from which their photos were first scraped, but also copies of those photos, allowing that person to be identified, traced, and contacted.
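For a sense of how this kind of lookup works in general terms: systems like this boil each scraped photo down to a numeric embedding, then compare a fresh face against those stored vectors to find the closest matches. The Python sketch below is purely illustrative, using NumPy, made-up data, and an arbitrary similarity threshold; it is not Clearview's code, and the embedding model, dimensions, and URLs are all assumptions.

```python
# Illustrative sketch only: a generic face-search lookup over precomputed
# embeddings ("facial vectors"), not Clearview's actual implementation.
import numpy as np

# Pretend database: one 128-dimensional embedding per scraped photo,
# each mapped to the profile URL it was collected from (all hypothetical).
db_vectors = np.random.rand(1000, 128).astype(np.float32)
db_urls = [f"https://social.example/profile/{i}" for i in range(1000)]

def search(query_vector: np.ndarray, threshold: float = 0.6) -> list[str]:
    """Return profile URLs whose stored face embedding resembles the query."""
    # Cosine similarity between the query and every stored vector.
    q = query_vector / np.linalg.norm(query_vector)
    db = db_vectors / np.linalg.norm(db_vectors, axis=1, keepdims=True)
    scores = db @ q
    matches = np.where(scores >= threshold)[0]
    # Best matches first.
    return [db_urls[i] for i in matches[np.argsort(-scores[matches])]]

# A query embedding would normally come from running a face-detection and
# embedding model on, say, a CCTV still; here it's random for illustration.
print(search(np.random.rand(128).astype(np.float32)))
```

Those stored embeddings are, in effect, the "facial vectors" the settlement talks about further down.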
That same year, the ACLU sued the biz, claiming it violated Illinois' Biometric Information Privacy Act (BIPA), which requires organizations operating in the US state to obtain explicit consent from residents before collecting their biometric data, which covers faceprints derived from their photographs.
Now, both parties have reached a draft settlement [PDF] to end the legal standoff. As part of that proposed deal, Clearview has agreed to stop giving or selling access to its database system to most private companies and organizations across the US. We say most because there are caveats. Also, the deal has to be accepted by the courts.
As per the proposed settlement, Clearview cannot share its database with any state or local government entity in Illinois for five years, nor with any private entity in the state, and will allow residents to opt out of the database. They can submit a photograph to the company, and it will block its facial recognition software from finding matches for their face. On top of this, Clearview will work on filtering out images that were taken in or uploaded from Illinois. The company will fork out $50,000 to pay for online adverts notifying residents of their ability to opt out.
Beyond Illinois, the settlement permanently blunts Clearview's ability to do business with private companies and organizations across America: it can, theoretically, sell customers a version of its facial recognition software not trained on the database, but it cannot supply them its huge database. This self-imposed ban does not extend to public entities, meaning law enforcement and local and federal government agencies and their contractors can use its giant database, except in the state of Illinois over the next five years.
- Ukraine uses Clearview AI to identify slain Russian soldiers
- Clearview AI plans tech to ID faces as they age, seek big government deals
- Clearview's selfie-scraping AI facial recognition technology set to be patented
- UK privacy watchdog may fine selfie-hoarding Clearview AI £17m... eventually, perhaps
- Joint UK-Oz probe finds face-recognition upstart Clearview AI is rubbish at privacy
Interestingly, Clearview has also agreed to "delete all facial vectors in the Clearview App that existed before Clearview ceased providing or selling access to the Clearview App to private individuals and entities." These so-called "Old Facial Vectors" are encoded from the billions of images the company scraped. Clearview, however, is allowed to create or recreate facial vectors subject to the new restrictions.
Clearview will also no longer be allowed to offer free trials of its facial recognition software to individual police officers without the approval of their bosses. Under the settlement, the biz does not admit any liability. It claimed it had already limited its dealings in America to law enforcement, so in its view the agreement is little more than a formality.
"Clearview AI's posture regarding sales to private entities remains unchanged," the upstart's CEO Hoan Thon-That told The Register in a statement.
"We would only sell to private entities in a manner that complies with BIPA. Our database is only provided to government agencies for the purpose of solving crimes. We have let the courts know about our intention to provide our bias-free facial-recognition algorithm to other commercial customers, without the database, in a consent-based manner.
"Today, facial recognition is used to unlock your phone, verify your identity, board an airplane, access a building, and even for payments. This settlement does not preclude Clearview AI selling its bias-free algorithm, without its database, to commercial entities on a consent basis, which is compliant with BIPA."
Nathan Freed Wessler, a deputy director of the ACLU's Speech, Privacy, and Technology Project, praised the strong privacy protections established in the state of Illinois, which is where this legal action unfolded.
"By requiring Clearview to comply with Illinois' path-breaking biometric privacy law not just in the state, but across the country, this settlement demonstrates that strong privacy laws can provide real protections against abuse," he said in a canned statement.
"Clearview can no longer treat people's unique biometric identifiers as an unrestricted source of profit. Other companies would be wise to take note, and other states should follow Illinois' lead in enacting strong biometric privacy laws." ®