NYC rights groups say no to grocery store spycams and snooping landlords
Letter to City Council supports measures to ban biometric tech from public spaces
"New Yorkers should not be forced to accept biometric surveillance as part of simple activities like buying groceries or taking their kids to a baseball game," more than 30 civil and digital rights organizations said yesterday in a letter backing new privacy laws in the city.
The New York Civil Liberties Union, the Surveillance Technology Oversight Project, Amnesty International and others wrote a memo of support for two pending City Council bills (1014-2023 and 1024-2023) that aim to ban facial recognition and other biometric tech both in public spaces, such as shops and arenas, and in residential buildings.
In the letter [PDF], the groups say the City Council is "long overdue in taking action" to pass the legislation. The Register has asked the council for comment on the holdup.
The groups point out that biometric technology, including facial recognition, can be "biased, error-prone, and harmful to marginalized communities."
The first bill, 1014-2023, would stop any place or provider of public accommodation from using "any biometric recognition technology to verify or identify a customer," prohibit businesses from barring entry to customers based on facial recognition tech, and prohibit companies from selling customers' biometric data.
The second, 1024-2023, focuses on the use of facial recog and other biometric surveillance in "residential settings." The groups claim that "landlords will abuse this tech to justify evicting tenants from rent-stabilized units because the facial recognition technology system determines they were not at home often enough." They add: "In fact, vendors have already begun to advertise this technology. In public housing, its use has led to residents being evicted for minor violations of policy."
The orgs went on to cite the example of a single mother who "was targeted after she started night classes and asked her ex-husband to spend more time at her home watching their children, causing her to be flagged for potentially violating the housing authority's visitor policy."
The bills have come under particular attention after an incident in December in which lawyer Kelly Conlon was turned away at the door of Radio City Music Hall after a facial recognition system pinned her as a "prohibited person." Conlon was there as a chaperone for a Girl Scout troop trip to see the Rockettes. The reason she'd been marked as persona non grata by the face-matching cams, it soon emerged, was that the New Jersey law firm she worked for – Davis, Saperstein and Solomon – had been involved in years-long personal injury litigation against a restaurant now owned by Madison Square Garden Entertainment, which operates the venue. Creepily, Conlon told a local NBC outlet: "They knew my name before I told them."
MSG Entertainment, though, said Conlon's firm had been warned about its "policy" and that use of facial recognition was "clearly advertised to guests" – hopefully before they show up at the venue.
While the tech clearly worked in Conlon's case, facial recognition critics say it is often highly inaccurate. Research into the systems' inherent biases has previously shown that matching accuracy is lower for women and people of color, and Amnesty International has said that in a criminal justice setting, face recognition technologies that are "inherently biased in their accuracy can misidentify suspects, incarcerating innocent Black Americans" [PDF].
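To illustrate how a single accuracy figure can hide that kind of disparity, here is a minimal, hypothetical sketch – the threshold, score distributions, and group labels are all illustrative assumptions, not figures from Amnesty or any vendor. It models a matcher that declares two faces the same person whenever their similarity score clears one global threshold; if the underlying model assigns systematically higher similarity to pairs of different people in one group, that group sees far more false matches:

```python
# Hypothetical sketch of threshold-based face matching, not any vendor's system.
# A matcher flags a "match" when the similarity score between two face
# embeddings clears a single global threshold. If the model - say, through
# skewed training data - assigns higher similarity to pairs of *different*
# people in one group, that group suffers more false matches.
import numpy as np

rng = np.random.default_rng(42)
THRESHOLD = 0.5  # one global match cutoff for everyone (an assumption)

def false_match_rate(impostor_mean: float, trials: int = 100_000) -> float:
    """Fraction of different-person pairs wrongly flagged as a match.

    `impostor_mean` is the average similarity our hypothetical model
    assigns to pairs of distinct people in a given demographic group."""
    scores = rng.normal(loc=impostor_mean, scale=0.1, size=trials)
    return float((scores >= THRESHOLD).mean())

# Same threshold, different impostor-score distribution per group:
print(f"group A false-match rate: {false_match_rate(0.20):.2%}")  # ~0.1%
print(f"group B false-match rate: {false_match_rate(0.42):.2%}")  # ~21%
```

The numbers are invented, but the mechanism matches the research cited above: when the similarity scores between distinct people shift upward for one group, a threshold tuned on aggregate data misfires disproportionately against that group.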
Meanwhile, some facial recog businesses have fallen foul of the law. Clearview AI, for example, had to promise not to sell facial recognition databases to most US businesses after settling a lawsuit with the ACLU last year. The outfit famously scrapes images from Facebook and the like to build its facial ID database.
Over in the UK, Clearview was fined millions by the Information Commissioner's Office (ICO) for scraping the web for face data. The privacy watchdog said there was no "lawful reason" to collect Brits' images, while Clearview has said in its defense that the data subjects make these images publicly available themselves.
However, the ICO has been less explicit about how it intends to regulate the field from now on, as the latest UK government tries to overhaul EU-era law. The UK's Biometrics and Surveillance Camera Commissioner, Professor Fraser Sampson, said [PDF] earlier this month that an ICO consultation on draft biometric data guidance had some large gaps.
"This guidance is a missed opportunity to start plugging the oversight and guidance gap caused by the abolition of the biometrics oversight role I currently fulfil, and the government's silence on where responsibility will subsequently lie accompanying those changes. If the decision is not to develop this guidance, then the ICO must stand ready to provide practical, timely advice when called upon by developers and deployers," Sampson wrote.
Sampson told us earlier this year that the UK's would-be GDPR replacement – the DPDI bill (now on take two) – could wipe out oversight of live facial recognition by abolishing his role, despite recent moves to "embed" facial recog in the country's police forces.
As for the European Union, a draft text of the upcoming AI Act, adopted earlier this year, includes a complete ban on the use of AI for biometric surveillance, emotion recognition, and predictive policing. ®