Privacy pilfering project punished by FTC purge penalty: AI upstart told to delete data and algorithms
Face-recognition biz hammered after harvesting people's pics, videos without permission
A California-based facial recognition biz has been directed by the US Federal Trade Commission to delete the AI models and algorithms that it developed by harvesting people's photos and videos without permission, a remedy that suggests privacy violators may no longer be allowed to benefit from ill-gotten data.
Everalbum, a consumer photo app maker whose Ever app shut down on August 31, 2020, and which has since relaunched as a facial recognition provider under the name Paravision, on Monday reached a settlement with the FTC over "Friends," a feature introduced in 2017 in the now-discontinued app. The watchdog agency claims the app deployed facial recognition code to organize users' photos by default, without permission.
According to the FTC, between July 2018 and April 2019, Everalbum told people that it would not employ facial recognition on users' content without consent. The company allegedly let users in certain regions – Illinois, Texas, Washington, and the EU – make that choice, but automatically activated the feature for those located elsewhere.
The agency further claims that Everalbum's use of facial recognition went beyond supporting the Friends feature. The company is alleged to have combined users' facial images with facial images from other sources to create four datasets that informed its facial recognition technology, which became the basis of a face detection service for enterprise customers.
The company also is said to have told consumers using its app that it would delete their data if they deactivated their accounts, but didn't do so until at least October 2019.
The FTC, in announcing the case and its settlement, said Everalbum/Paravision will be required to delete: photos and videos belonging to Ever app users who deactivated their accounts; all face embeddings – vector representations of facial features – from users who did not grant consent; and "any facial recognition models or algorithms developed with Ever users’ photos or videos."
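To make the deletion order concrete: a face embedding is just a fixed-length vector of numbers derived from a facial image, and systems decide whether two images show the same person by measuring how close their vectors are. The sketch below illustrates the idea with cosine similarity; the toy four-dimensional vectors and the 0.8 threshold are illustrative assumptions (production models use embeddings with hundreds of dimensions and tuned thresholds).

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 4-dimensional embeddings; real face embeddings are typically 128-512 dimensions.
emb_person_a_photo1 = np.array([0.90, 0.10, 0.30, 0.20])
emb_person_a_photo2 = np.array([0.88, 0.12, 0.28, 0.22])  # same person, similar vector
emb_person_b        = np.array([0.10, 0.90, 0.20, 0.70])  # different person

match_score    = cosine_similarity(emb_person_a_photo1, emb_person_a_photo2)
mismatch_score = cosine_similarity(emb_person_a_photo1, emb_person_b)

# A threshold (0.8 here, purely illustrative) decides whether two embeddings
# are treated as the same face.
THRESHOLD = 0.8
print(match_score > THRESHOLD)     # high similarity: likely the same person
print(mismatch_score > THRESHOLD)  # low similarity: likely different people
```

Because each embedding is computed from a specific person's photo, deleting the embeddings (and any model trained on them) is what removes the derived value of the harvested images, which is the point of the FTC's remedy.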
The FTC has not imposed this kind of algorithm-deletion requirement in past privacy cases against technology companies. According to FTC Commissioner Rohit Chopra, when Google and YouTube agreed to pay $170m over allegations the companies had collected data from children without parental consent, the FTC settlement "allowed Google and YouTube to profit from its conduct, even after paying a civil penalty."
Likewise, when the FTC voted to approve a settlement with Facebook over claims it had violated its 2012 privacy settlement agreement, he said, Facebook did not have to give up any of its facial recognition technology or data.
"Commissioners have previously voted to allow data protection law violators to retain algorithms and technologies that derive much of their value from ill-gotten data," said Chopra in a statement [PDF]. "This is an important course correction."
In response to an inquiry from The Register, an FTC spokesperson said while the agency has previously issued orders that require businesses to delete data, "This is the FTC’s first case that focuses exclusively on facial recognition technology, and the particular data deletion requirements are tailored to the factual allegations in the case."
In a phone interview with The Register, Adam Schwartz, senior staff attorney for the Electronic Frontier Foundation, said the EFF is generally supportive of this FTC remedy.
"We think that what the company did here was very bad," he said. "They told consumers they'd do one thing with their photography and turned around and used it for a facial surveillance technology," he said.
"Part of the way the FTC should be solving these problems is by making the wrongdoing company disgorge all the benefit they obtained. If they build a facial recognition algorithm illegally, then the remedy is to delete the system."
A spokesperson for Paravision AI told The Register in an email that the FTC Consent Order reflects changes already implemented by the company.
"The Ever service was closed in August 2020 and the company has no plans to run a consumer business moving forward," the spokesperson said. "In September 2020, Paravision released its latest-generation face recognition model which does not use any Ever users’ data. The consent order mirrors the course we had already set and reinforces a mindful tone as we look ahead, and we will of course fully comply with it." ®