Police should be banned from using blanket facial-recognition surveillance to identify people not suspected of crimes. Certain private databases of people’s faces for identification systems ought to be outlawed, too.
That's the feeling of the majority of members in the European Parliament this week. In a vote on Wednesday, 377 MEPs backed a resolution restricting law enforcement’s use of facial recognition, 248 voted against, and 62 abstained.
“AI-based identification systems already misidentify minority ethnic groups, LGBTI people, seniors and women at higher rates, which is particularly concerning in the context of law enforcement and the judiciary,” reads a statement from the parliament.
“To ensure that fundamental rights are upheld when using these technologies, algorithms should be transparent, traceable and sufficiently documented, MEPs ask. Where possible, public authorities should use open-source software in order to be more transparent.”
As well as this, most of the representatives believe facial-recognition tech should not be used by the police in automatic mass surveillance of people in public, and monitoring should be restricted to only those thought to have broken the law. Datasets amassed by private companies, such as Clearview AI, for identifying citizens should also be prohibited along with systems that allow cops to predict crime from people's behavior and backgrounds. Here's specifically what the parliament stated:
To respect privacy and human dignity, MEPs ask for a permanent ban on the automated recognition of individuals in public spaces, noting that citizens should only be monitored when suspected of a crime. Parliament calls for the use of private facial recognition databases (like the Clearview AI system, which is already in use) and predictive policing based on behavioural data to be forbidden.
Ranking people with social scores, assigned by their personality, behavior, and what have you, was also given the thumbs down.
“Fundamental rights are unconditional”, said Peter Vitanov, an MEP representing the Bulgarian Socialist Party. “For the first time ever, we are calling for a moratorium on the deployment of facial recognition systems for law enforcement purposes, as the technology has proven to be ineffective and often leads to discriminatory results.
“We are clearly opposed to predictive policing based on the use of AI as well as any processing of biometric data that leads to mass surveillance. This is a huge win for all European citizens.”
The vote is non-binding, meaning it cannot directly lead to any legislative change. Instead, it was cast to gauge whether members would support upcoming bills such as the AI Act, a spokesperson for the EU parliament told The Register.
“The resolution is a non-exhaustive list of AI uses that MEPs within the home affairs field find problematic. They ask for a moratorium on deploying new facial recognition systems for law enforcement, and a ban on the narrower category of private facial recognition databases,” the spokesperson added.
It also called for border control systems to stop using biometric data to track travelers across the EU. ®
A Black Uber driver in the UK, who lost his job after he was locked out of the ride-hailing app when its facial-recognition system failed to identify him, is taking legal action against the tech giant. He will be supported by the Independent Workers' Union of Great Britain (IWGB).
It's alleged that people of color are five times more likely to be misidentified by Uber's facial-recognition authentication mechanism. The union has called for a 24-hour boycott of Uber to pressure the ride-hailing biz into changing its ways.
“Uber’s continued use of a facial recognition algorithm that is ineffective on people of color is discriminatory," said Henry Chango Lopez, general secretary of the IWGB, on Tuesday.