On February 3, 2021, the Office of the Privacy Commissioner of Canada (OPC), the Commission d’accès à l’information du Québec (CAI), the Office of the Information and Privacy Commissioner for British Columbia (BC OIPC), and the Office of the Information and Privacy Commissioner of Alberta (AB OIPC, and together with the BC OIPC, the CAI and the OPC, the Regulators) published a joint Report of Findings (the Report) following an investigation into whether Clearview AI, Inc.’s (Clearview) collection, use and disclosure of personal information by means of its facial recognition tool complied with federal and provincial private sector privacy laws (Privacy Laws).

In the wide-ranging Report, the Regulators characterize Clearview’s activities as mass surveillance and as an affront to the privacy rights of individuals,1 and cover such issues as the scope of the consent exception for publicly available information, whether Clearview’s collection was for an appropriate purpose, and obligations relating to biometrics.

Background

Clearview, a technology company based in the United States, created a facial recognition software system incorporating a database that links images obtained from a variety of online sources with (i) facial recognition data derived from those images and (ii) hyperlinks to the online source. The Clearview system allows clients to upload a digital image of an individual’s face and run a search against it. The Clearview system then applies its facial recognition algorithm to the digital image, and runs the result against Clearview’s database to identify and display likely matches and associated source information.2
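For illustration only, the short sketch below (in Python) shows the general shape of the workflow described above: facial recognition data derived from a probe image is compared against a database of previously indexed images, and likely matches are returned together with their source links. The names, the cosine-similarity comparison and the threshold are assumptions made for this example; the Report does not describe Clearview’s actual algorithm or data structures.

    # Hypothetical sketch of the kind of match-and-return-source pipeline
    # described above; illustrative only, not Clearview's implementation.
    from dataclasses import dataclass

    import numpy as np


    @dataclass
    class IndexedImage:
        embedding: np.ndarray  # facial recognition data derived from the image
        source_url: str        # hyperlink to the online source of the image


    def search(probe_embedding: np.ndarray,
               database: list[IndexedImage],
               threshold: float = 0.8,
               top_k: int = 10) -> list[tuple[float, IndexedImage]]:
        """Compare a probe face embedding against the indexed database and
        return the most similar entries with their source links."""
        scored = []
        for entry in database:
            # Cosine similarity between the probe and the stored embedding.
            score = float(
                np.dot(probe_embedding, entry.embedding)
                / (np.linalg.norm(probe_embedding) * np.linalg.norm(entry.embedding))
            )
            if score >= threshold:
                scored.append((score, entry))
        # Highest-scoring (most likely) matches first.
        scored.sort(key=lambda pair: pair[0], reverse=True)
        return scored[:top_k]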

In January and February 2020, news media reported that Clearview was populating its facial recognition database with digital images collected from a variety of public websites (predominantly social media),3 and that a number of Canadian law enforcement agencies and private organizations had used Clearview’s services in order to identify individuals.4

The Regulators opened a joint investigation into Clearview in February 2020.

Decision

The core aspects of the Report’s findings are unsurprising and align with existing jurisprudence and Regulator guidance.

The Regulators affirmed that both federal and provincial Privacy Laws apply to Clearview’s activities,5 and that information posted on public social media profiles does not qualify for the “publicly available” or “which by law is public” consent exceptions provided for in Canadian private sector privacy laws.6 Early in its discussion of the definition of publicly available information, the Report refers to a previous finding of the OPC, which concluded that information available on social media sites does not fall within the “publicly available” consent exception under the Personal Information Protection and Electronic Documents Act (PIPEDA).7 The Report concludes that Clearview should have obtained consent.

Consistent with earlier findings,8 the Report also concluded that facial biometrics are particularly sensitive personal information and that therefore Clearview should have obtained express, opt-in consent before it collected the images of any individual in Canada.9

Furthermore, the Report found that Clearview’s purpose in creating its system is not a purpose that a reasonable person would consider appropriate, reasonable, or legitimate in the circumstances.10 As such, even if Clearview had obtained consent (whether express or implied), that consent would have been invalid.

The Report also sets out some general remarks in relation to appropriate purposes, including the known potential for systems similar to Clearview’s to generate false positives,11 the possibility that Clearview collected the personal information in breach of the terms of service for the various social media platforms,12 and the risk of harm engendered by the creation of a massive centralized database of sensitive facial biometric data.13

Finally, the Report found that in Quebec, Clearview’s system falls within the scope of legislation requiring express consent for the collection of biometric information. Moreover, as a creator of a database of biometric characteristics or measurements operating in Quebec, Clearview ought to have reported the existence of that database to the CAI.14

Commentary

Consent

As mentioned, the Regulators affirmed that information posted on public social media profiles does not qualify for the “publicly available” or “which by law is public” consent exceptions provided for in Privacy Laws. As such, Clearview should have obtained consent.

Since Clearview had stated from the outset that it relied on such exceptions and did not seek consent, the Report could have treated that conclusion as dispositive and ended its analysis there. Nevertheless, a considerable portion of the Report is devoted to dissecting whether Clearview’s purposes are appropriate.

It is likely that the Regulators were concerned to ensure that this decision would be future-proofed in relation to Bill C-11, which sets out new privacy legislation that would replace PIPEDA, as well as in relation to legislative changes that may eventually come to pass at the provincial level.

For example, the Bill duplicates PIPEDA’s language for consent exceptions for publicly available information,15 leaving the specifics to regulation. Concerns have been raised16 about the pressure to expand the definition of publicly available information to include situations where individuals decide to post personal information on a public website.17 If Bill C-11 passes into law, a new regulation would be drafted and, if the definition were expanded, the analysis provided in the Report might be weakened or eviscerated.

Moreover, even if the new definition of publicly available information provided for in a new regulation excluded social media and other public websites, another provision of Bill C-11 could be construed as permitting activities that Clearview engages in, such as scraping personal information from such sites. Section 18(2)(e) of Bill C-11 permits the collection and use of personal information without an individual’s knowledge or consent in respect of a business activity “in the course of which obtaining the individual’s consent would be impracticable because the organization does not have a direct relationship with the individual”.18

Given, however, that Bill C-11 also replicates PIPEDA’s “appropriate purposes” provision, this analysis could still be applied even where a consent exception is available.19 As such, it seems likely that the Regulators thought it prudent to address the appropriateness of purpose in case Bill C-11 and potential reforms of the provincial privacy laws materially alter the Report’s analysis with respect to consent and the exceptions to it.

Organizations should therefore bear in mind that Canadian privacy regulators will likely approach future investigations and analyses with an eye to ensuring that their findings remain durable under Bill C-11.

Appropriate purposes

The section of the Report that discusses appropriate purposes concludes that “continual mass surveillance by Clearview based on its indiscriminate scraping and processing of their facial images” is an “affront to individuals’ privacy rights”.20

While that may be something of an overstatement, there is no question that the system created by Clearview facilitates surveillance, and can improve the surveillance capabilities of its clients. Depending on the other resources available to those client organizations, such use could rise to the level of mass surveillance in some cases.

In concluding that Clearview’s activities represented mass surveillance, the Report characterized them as follows: the mass scraping of images of individuals, including children; the development of facial recognition arrays based on those images; and the collection of source links, all for commercial purposes unrelated to the original purpose for which the images were posted, and in a manner that may have detrimental effects on, or create a risk of significant harm to, those individuals.21 Taken together, the Regulators found, these activities do not constitute a purpose that a reasonable person would consider appropriate, reasonable, or legitimate in the circumstances.22

This analysis, however, presents a problem…
