Roundup Hello, and welcome to this week’s roundup of news in the ever-encroaching world of AI and machine learning. We’ll be talking about everyone’s favorite topic at the moment: facial recognition.
First San Francisco, Somerville ... now Oakland: California's Oakland has become the third US city to ban its local government from using facial recognition technology, after its council passed an ordinance this week.
Council member Rebecca Kaplan submitted the ordinance for city officials to consider in June. The document describes the shortcomings of the technology and why it should be banned.
“The City of Oakland should reject the use of this flawed technology on the following basis: 1) systems rely on biased datasets with high levels of inaccuracy; 2) a lack of standards around the use and sharing of this technology; 3) the invasive nature of the technology; 4) and the potential abuses of data by our government that could lead to persecution of minority groups,” according to the ordinance.
It is well known that facial recognition models struggle to identify women and people with darker skin, since both groups are often underrepresented in the data used to train these systems. Oakland is now the third city to ban the emergent technology, after San Francisco, California, and Somerville, Massachusetts.
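As a minimal sketch of how that kind of bias gets measured in practice (the data and group labels here are entirely made up for illustration): researchers typically break a model's predictions down by demographic group and compare accuracy across groups, rather than looking at one overall number.

```python
from collections import defaultdict

# Hypothetical evaluation results: (prediction_was_correct, demographic_group).
# In a real audit these would come from running the model on a labeled test set.
results = [
    (True,  "group_a"), (True,  "group_a"), (True,  "group_a"), (False, "group_a"),
    (True,  "group_b"), (False, "group_b"), (False, "group_b"), (False, "group_b"),
]

totals = defaultdict(int)
correct = defaultdict(int)
for ok, group in results:
    totals[group] += 1
    if ok:
        correct[group] += 1

# Per-group accuracy exposes gaps that a single aggregate accuracy would hide:
# overall accuracy here is 50%, but group_a sits at 75% and group_b at 25%.
for group in sorted(totals):
    print(group, correct[group] / totals[group])
```

An overall accuracy figure averages these gaps away, which is why audits such as the ones cited in debates like Oakland's report results per group.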
All the facial surveillance hotspots: Where are those pesky AI cameras scanning people’s faces? Fight for the Future, an activist non-profit group, has mapped out known regions where facial recognition is being used by law enforcement and where it’s been banned. Find out if there are cameras near you here.
“We are on the verge of an unprecedented increase in state and private spying that will be built in plain sight,” said Evan Greer, deputy director of Fight for the Future. “People are alarmed, and this map and the toolkit arms people everywhere with the resources to both fight back and learn from how others are doing it. It’s going to take all of us to rid this country of this most dangerous technology.”
Some of these regions are where law enforcement have reportedly scanned people’s driving licenses and mug shots, or have partnered with companies like Amazon to trial the technology, or where the issue is being discussed by local politicians, who are considering banning it.
“Facial recognition surveillance is like nuclear or biological weapons,” Greer previously told El Reg. "It's technology that is fundamentally too dangerous to be in the government's hands. Even if there are helpful applications, the risks are simply too great. Surveillance technology suffers from severe 'mission creep.' If we allow this dangerous biometric spying to spread and become ubiquitous, it won't be used to keep us safe – it will be used to control us."
Oops... can you give that back? Lawyers accidentally handed over confidential information about New York City Police Department’s facial recognition program to academic researchers from the Georgetown Center on Privacy and Technology, a think tank in Washington DC.
Now they are pleading with a judge to order the researchers to return the documents. It’s not the first time such a mistake has occurred, either: the same slip-up happened in April, apparently due to a software error.
“In the course of using an automated tool for redacting information from documents produced in response to a FOIL request, an error was made that resulted in portions of documents remaining visible that were intended to be blanked," Law Department spokesman Nick Paolucci said, according to the New York Daily News. "While we strive to avoid such errors, they occur occasionally in litigation, and there exists a legal process for securing the return of documents when such inadvertent disclosures occur."
It looks like the researchers had to return the confidential documents the first time round, but may not have to this time. At a legal hearing, Judge Shlomo Hager said: “One was too much. Two is more than I can tolerate at this time.”
Where do you get training data from? Massive companies like Google and Facebook have no trouble scraping together large datasets to train their facial recognition models; their users readily hand the data over.
But what if you don’t have that luxury? Well then, you take photos from wherever you can get them, whether it’s dating sites, photo-sharing platforms, or, hell, a camera you’ve installed somewhere yourself.
It shouldn’t come as too much of a shock, really. Even researchers probing facial recognition models have trouble obtaining training data, and often just cobble it together from websites without asking first. Some go ahead and collect their own datasets by putting up cameras around campuses.
So smaller companies just do that, too. Clarifai, a computer vision startup based in San Francisco, took images from OkCupid, a popular dating site. It’s not clear whether the whole thing was done with the site’s consent, however. An OkCupid spokesperson didn’t confirm either way, but did say that it “did not enter into any commercial agreement then and have no relationship with [Clarifai] now.”
Some universities and companies are getting spooked over the privacy concerns of compiling training datasets from people’s photos willy-nilly. Microsoft and Stanford University have since deleted some of their datasets, according to the New York Times. ®