Canada’s Office of the Privacy Commissioner (OPC) announced yesterday that the Royal Canadian Mounted Police (RCMP) broke the law by using Clearview AI facial-recognition software.
An OPC investigation launched in July 2020 concluded in February this year that Clearview AI violated the country’s federal private sector privacy law when it created a three-billion-image databank by scraping social media accounts without user consent. Now the OPC has decided the RCMP’s use of the database to match images violated the country’s Privacy Act.
Federal Privacy Commissioner Daniel Therrien said in a canned statement:
The use of facial recognition technology by the RCMP to search through massive repositories of Canadians who are innocent of any suspicion of crime presents a serious violation of privacy.
A government institution cannot collect personal information from a third-party agent if that third-party agent collected the information unlawfully.
The RCMP initially denied using the software, both publicly and to Therrien, who is also an independent officer of Parliament. However, journalists from the Toronto Star and BuzzFeed found that the RCMP had purchased the software, forcing it to walk back that denial.
The RCMP then claimed it was only using Clearview AI to rescue children who were victims of online sexual exploitation — an explanation the organisation maintains to this day.
The OPC said the vast majority of the RCMP's hundreds of searches could not be explained. The RCMP attributes this discrepancy to a difference in how the two parties tracked searches — the RCMP counted by investigative file, while the OPC counted the number of searches made in the software platform. The RCMP also admits that other units used Clearview AI on a "test basis".
The RCMP was Clearview AI's last remaining client in Canada, and stopped using the software when Clearview AI withdrew from the country during the investigation.
Meanwhile, Clearview AI claims Canadian privacy law shouldn't apply to US-based companies. The RCMP, for its part, argued it cannot be held accountable for how third-party tools collect their data behind the scenes, saying such an obligation would be unreasonable and is not explicitly required by law.
Nonetheless, the RCMP said it has "accepted all of the recommendations of the OPC and [has] already begun effort towards their implementation".
These recommendations include the RCMP conducting privacy assessments of third parties' data collection practices to determine their legality, and creating a new oversight function focused on technology and privacy.

The OPC, for its part, will issue draft guidance on law enforcement use of facial recognition technology, to help police comply with the law and respect privacy.
The OPC would also like to see formal law in place. Therrien said:
We encourage Parliament to amend the Privacy Act to clarify that federal institutions have an obligation to ensure that third-party agents it collects personal information from have acted lawfully.
Last month, data rights groups in the UK, France, Austria, Greece, and Italy filed complaints against Clearview AI, alleging the software violates the EU and UK General Data Protection Regulation (GDPR). In response, Clearview AI told The Register that it does not have contracts with, nor provide access to, customers in the European Union.
Clearview AI is used by some law enforcement agencies in the USA. In April, four civil and immigration rights non-profits sued the US Department of Homeland Security and its law enforcement agencies, Immigration and Customs Enforcement (ICE) and Customs and Border Protection (CBP), for failing to respond to a Freedom of Information Act request regarding their use of the technology. ®