GATINEAU, QC, June 10, 2021 – The RCMP’s use of facial recognition technology (FRT) to conduct hundreds of searches of a database compiled illegally by a commercial enterprise is a violation of the Privacy Act, an investigation has found.
In a Special Report to Parliament, the Office of the Privacy Commissioner of Canada (OPC) shared its findings regarding the RCMP’s use of Clearview AI, a technology company that was itself the subject of a previous OPC investigation. The OPC, along with its provincial and territorial counterparts, also announced the launch of a public consultation to help establish clearer rules and consider whether new laws are desirable.
Clearview AI was found to have violated Canada’s federal private sector privacy law by creating a databank of more than three billion images scraped from internet websites without users’ consent. Clearview users, such as the RCMP, could match photographs of people against the photographs in the databank.
“The use of FRT by the RCMP to search through massive repositories of Canadians who are innocent of any suspicion of crime presents a serious violation of privacy,” Commissioner Daniel Therrien says. “A government institution cannot collect personal information from a third party agent if that third party agent collected the information unlawfully.”
The RCMP is no longer using Clearview AI, as the company ceased to offer its services in Canada last summer in the wake of the OPC’s investigation into the company. However, the OPC remains concerned that the RCMP did not agree with its conclusion that it contravened the Privacy Act.
While the OPC maintains the onus was on the RCMP to ensure the database it was using had been compiled legally, the RCMP argued that doing so would create an unreasonable obligation and that the law does not expressly impose a duty to confirm the legal basis for the collection of personal information by its private sector partners.
Commissioner Therrien says this is just another example of how public-private partnerships and contracting relationships involving digital technologies are creating new complexities and risks for privacy.
“Activities of federal institutions must be limited to those that fall within their legal authority and respect the general rule of law,” he says. “We encourage Parliament to amend the Privacy Act to clarify that federal institutions have an obligation to ensure that the third party agents they collect personal information from have acted lawfully.”
In the end, the RCMP agreed to implement the OPC’s recommendations to improve its policies, systems and training. This includes conducting comprehensive privacy assessments of third party data collection practices to ensure any personal information is collected and used in accordance with Canadian privacy legislation.
The RCMP is also creating a new oversight function intended to ensure new technologies are adopted in a manner that respects individuals’ privacy rights.
“The data involved in FRT speaks to the very core of individual identity and as both commercial and government use of the technology expands, it raises important questions about the kind of society we want to live in,” Commissioner Therrien says.
Draft guidance for police agencies and consultation
In an effort to provide clarity to police agencies that are increasingly looking to FRT to solve crimes or find missing persons, the OPC, along with its provincial and territorial privacy counterparts, has also issued draft guidance to assist police in ensuring that any use of FRT complies with the law, minimizes privacy risks and respects privacy rights.
The draft guidance emphasizes that police agencies must have lawful authority for their proposed use of the technology, and that they must apply privacy-protective standards proportionate to the potential harms involved.
“FRT is a powerful tool that has the potential to offer great benefits to society but it can also be a highly invasive surveillance technology fraught with many risks,” Commissioner Therrien says.
“It can provide racially biased results, erode privacy and undermine other rights such as freedom of peaceful assembly. FRT must be used responsibly and very carefully.”
Regulators will be consulting with police forces and other stakeholders on the guidance in the weeks and months ahead. It will be important to have a public discussion on how this technology should be used.
The process of establishing appropriate limits on FRT use remains incomplete. Unlike other forms of biometrics collected by law enforcement, facial recognition is not subject to a clear and comprehensive set of rules. Its use is regulated through a patchwork of statutes and case law that, for the most part, do not specifically address the risks posed by the technology. This creates room for uncertainty concerning what uses of facial recognition may be acceptable, and under what circumstances.
“The question of where acceptable FRT use begins and ends is in part a question of the expectations we set now for the future protection of privacy in the face of ever-increasing technological capabilities to intrude on Canadians’ reasonable expectations of privacy,” Commissioner Therrien says.
“The nature of the risks posed by FRT calls for collective reflection on the limits of acceptable use of the technology.”
While the focus of this report is on the application of privacy laws and best practices for the use of FRT by police, the broader deployment of FRT also warrants closer examination to determine whether our laws adequately protect Canadians from potential misuses of the technology.