
French lawmakers say oui to Olympic video surveillance, but non for faces

Not the face, thank you

A bill to allow real-time video surveillance of visitors to the 2024 Paris Summer Olympics was approved by France's Senate on Tuesday and now advances to the National Assembly – but crucially bans the use of facial recognition technology.

The legislation would allow security cameras and drones in and around the stadiums hosting the events as well as in public transportation and on city streets, beginning this spring and continuing until June 30, 2025.

Surveillance footage collected would be processed using an algorithm "whose sole purpose is to detect, in real time, predetermined events likely to present or reveal" security risks – such as terrorist acts or other "serious threats to the safety of persons," according to the proposal.

This data would then be automatically sent to the police and/or security services so emergency responders can take action, if needed.

The lawmakers, however, rejected the use of facial recognition technology – or indeed any kind of biometrics that could be used for data analytics.

"This processing does not use any biometric identification system, does not process any biometric data and does not implement any facial recognition technique," the bill says. "They cannot carry out any reconciliation, interconnection or automated linking with other processing of personal data."

The proposal also calls for "human control measures" to prevent and/or correct any biases in the AI or misuse of the surveillance system.

Although the video monitoring is intended to be temporary and "experimental," according to the legislation, some data privacy and human rights organizations fear that the government is using the Olympics as an excuse to set up a permanent surveillance system.

"Once all these algorithms have been tested for two years … [and] tens of thousands of agents will have been trained in the use of these algorithms, it seems unlikely that the VSA will be abandoned at the end of 2024," argued NGO La Quadrature du Net, about the Olympic algorithmic video surveillance (VSA). 

Amnesty International France's Katia Roux told The Guardian that the surveillance program raises several human rights red flags. 

"We're deeply worried by the fact that these algorithms will be able to analyze images from fixed CCTV cameras or drones to detect 'abnormal or suspect' behavior," Roux said. "First, there is the issue of defining abnormal or suspect behavior – who will decide what behavior is the norm or not?"

The proposal could also have a chilling effect on freedom of expression, she added.

Additionally, even though the legislation rules out facial recognition technology and says the security monitoring won't use biometric data, "in reality the algorithms will analyze behavior, and physical data, which is data that must be protected." ®
