

An Affront to Privacy: Clearview AI in Canada

  • February 14, 2021
  • Heather Ferg

Clearview AI scraped billions of photos from the internet and turned them into a powerful facial recognition tool used by at least 30 Canadian law enforcement agencies. Earlier this month, Canadian privacy officials released a report of their investigation into Clearview AI’s activities. The message from Daniel Therrien, the Privacy Commissioner of Canada, was clear: what Clearview does is mass surveillance and it is illegal.

1. What is Clearview AI?

Clearview AI (“Clearview”) is an American company that provides facial recognition software worldwide. The company was relatively unknown until Kashmir Hill’s 2020 New York Times piece, “The Secretive Company That Might End Privacy as We Know It” (here). In it, Ms. Hill explained the genesis of the company as a small New York start-up founded by Australian Hoan Ton-That and former Rudolph Giuliani aide, Richard Schwartz. Clearview began marketing to law enforcement in 2017, and Ms. Hill’s article reported that the company had provided its app to at least 600 law enforcement agencies and had licensed it to “at least a handful of companies for security purposes.”

While Clearview now claims to provide its software only to law enforcement, an investigative report published by BuzzFeed in February 2020 (here) showed that Clearview’s technology has historically been accessed by users in many sectors across the world. Reporters described search logs revealing that the app had been used in at least 27 countries and that its users included numerous law enforcement agencies as well as private security companies, banks and financial institutions, 50 educational institutions (two of which were high schools), a sovereign wealth fund in the UAE, retail stores (including Walmart, Best Buy and Home Depot), casinos and entertainment companies (such as Madison Square Garden and Eventbrite), the NBA, and a Saudi Arabian research centre. A majority of users accessed the software through the company’s 30-day free trial, and when BuzzFeed made inquiries, many employers had no idea that their employees were using the app or had done so in the past.

The customer list (available in full on Clearview’s Wikipedia page, here) included 30 law enforcement agencies in Canada, and the Royal Canadian Mounted Police was listed as a paying customer. The Toronto Police Service (which was using free-trial accounts) ran more than 3,400 searches across 150 accounts (here and here). York Regional Police, who had previously stated they were not using the technology, turned out to have run approximately 500 searches without internal approval (here). In recent days, the British Columbia Privacy Commissioner has confirmed that five police officers and one civilian used the software (here).

2. How it Works

As with all facial recognition technologies (discussed in detail in my post, “Facial Recognition: The People Push Back”), Clearview’s software relies on comparing target photos to a known data set. While some systems use official data banks for comparison (such as mug shots or driver’s license photos), Clearview has populated its database with billions of photos from all over the internet using a process known as scraping. Scraping works by having an automated image crawler visit public web pages (e.g., Facebook, Instagram and Twitter) and collect any images that contain faces. In addition to saving the images themselves, the crawler also saves the associated metadata, such as the title, source link and description (here at para. 13).
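For readers who want a concrete sense of what scraping involves, the simplified sketch below shows how a crawler might record images and their metadata from a single public page. It is purely illustrative and assumes common open-source tools (the requests and BeautifulSoup libraries); it is not Clearview’s actual code, and the field names are my own.

    # Illustrative sketch only; not Clearview's code. Assumes the third-party
    # "requests" and "beautifulsoup4" libraries are installed.
    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin

    def scrape_page(url):
        """Fetch one public page and record every image plus its metadata."""
        page = requests.get(url, timeout=10)
        soup = BeautifulSoup(page.text, "html.parser")
        records = []
        for img in soup.find_all("img"):
            records.append({
                "image_url": urljoin(url, img.get("src", "")),    # where the photo lives
                "source_page": url,                               # link back to the post or profile
                "title": soup.title.string if soup.title else "", # page title
                "description": img.get("alt", ""),                # caption / alt text
            })
        return records

A real crawler would then download each image, keep only those containing faces, and store the photos alongside the metadata, repeating the process across millions of pages.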

3. The Canadian Privacy Commissioners’ Investigation

After public reports such as Hill’s New York Times piece and this article in the Harvard Journal of Law and Technology, a number of Canadian government privacy offices launched an investigation into the legality of Clearview’s operations in Canada.

On February 2, 2021, the federal Office of the Privacy Commissioner of Canada, the Commission d’accès à l’information du Québec, the Information and Privacy Commissioner for British Columbia, and the Information and Privacy Commissioner of Alberta (together, “the Canadian Privacy Commissioners”) released the report of their joint investigation into Clearview (“the report,” here). The objectives of the investigation were to determine whether Clearview had obtained the requisite consent to collect, use and disclose personal information, and whether it had done so for a legitimate purpose. A legitimate purpose was defined as one that a reasonable person would consider appropriate in the circumstances and that fulfilled a legitimate need (para. 6).

The report considered the “four key sequential steps” of Clearview’s facial recognition process: (1) scraping and storing images of faces and associated data from publicly accessible online sources; (2) creating numerical biometric identifiers for each image; (3) allowing users to upload an image for comparison to the database; and (4) providing a list of results that connects users to the original source page of any matches. The report noted that Clearview had amassed a database of over 3 billion faces, a vast number of them belonging to Canadians (including children).
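To make steps (2) through (4) more concrete, here is a highly simplified sketch of how a search of this kind can work: each scraped face is reduced to a numerical vector (the “biometric identifier”), and an uploaded probe photo is matched by finding the most similar stored vectors. The code is hypothetical; the vectors stand in for the output of a real face-recognition model, and nothing here is drawn from Clearview’s actual system.

    # Hypothetical illustration of steps (2) to (4); not Clearview's system.
    import numpy as np

    def cosine_similarity(a, b):
        # How alike two face vectors are (1.0 means identical direction).
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def search(probe_vector, database, top_k=5):
        """Rank stored faces by similarity to the probe and return their source links."""
        scored = [
            (cosine_similarity(probe_vector, entry["vector"]), entry["source_page"])
            for entry in database
        ]
        scored.sort(reverse=True)
        return scored[:top_k]   # step (4): results linking back to source pages

Each entry in the hypothetical database would hold the vector computed from a scraped photo together with the metadata saved by the crawler, which is what allows a match to lead investigators back to the original web page.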

(a) Clearview’s Position

With respect to consent, Clearview took the position that the information it collected was “publicly available” and that it was therefore exempt from the consent requirements (para. 13). On the issue of purpose, Clearview initially told the Canadian Privacy Commissioners that its app “was intended to be for the sole and exclusive use of law enforcement” (para. 18). It argued that its services provide “substantial, concrete benefits to public safety by dramatically increasing law enforcement’s ability to identify and investigate suspects, victims and witnesses” (para. 19) and allow law enforcement to “obtain information quickly and accurately in the course of an ongoing investigation” (para. 80). Clearview argued that a reasonable person would find such assistance appropriate, reasonable and legitimate, and that “limiting such a service would arguably be at the expense of the public interest” (paras. 80-82). It characterized any potential risk of harm to Canadians as “hypothetical” and akin to the level of harm associated with a regular Google search (para. 20). The Canadian Privacy Commissioners disagreed with all of these submissions.

(b) Findings on Consent

Because the people who uploaded the images would not have expected them to be collected and used for identification purposes, express consent was required under Canadian law. With respect to the “publicly available” exception, there is a difference between “publicly accessible” as that term might be used in everyday conversation and “publicly available,” a term with specific definitions in law.

The Canadian Privacy Commissioners held that “information collected from public websites, such as social media or professional profiles, and then used for an unrelated purpose, does not fall under the ‘publicly available’ exceptions” of the relevant legislation (para. 45). They found that the company was operating under an “erroneous interpretation of Canadian privacy law” and, contrary to its submissions, was required to obtain express opt-in consent from individuals to collect, use and disclose the information as it had (paras. 36-37).

(c) Findings on Reasonable Purpose

The Canadian Privacy Commissioners rejected Clearview’s claim that its purpose was to provide a service to law enforcement. Rather, its activities were characterized as “mass identification and surveillance of individuals by a private entity in the course of commercial activity” (para. 70). Specifically, the report found that Clearview did not have an appropriate purpose for the following activities:

    1. the mass and indiscriminate scraping of images from millions of individuals across Canada, including children, amongst over 3 billion images scraped world-wide;
    2. the development of biometric facial recognition arrays based on these images, and the retention of this information even after the source image or link has been removed from the Internet; or
    3. the subsequent use and disclosure of that information for its own commercial purposes;

where such purposes:

    1. are unrelated to the purposes for which the images were originally posted (for example, social media or professional networking);
    2. are often to the detriment of the individual (for example, investigation, potential prosecution, embarrassment, etc.); and
    3. create the risk of significant harm to individuals whose images are captured by Clearview (including harms associated with misidentification or exposure to potential data breaches), where the vast majority of those individuals have never been and will never be implicated in a crime, or identified to assist in the resolution of a serious crime (para. 76).

They concluded that Clearview’s purpose was “neither appropriate nor legitimate” (para. 68). While Clearview took the position that Canada had no jurisdiction over its activities, in March 2020 the company stated that it had suspended access for all users in Canada other than the RCMP, and in July 2020 it voluntarily exited the Canadian market (para. 22).

4. Conclusion: An Affront to Individual Privacy

As Clearview exits the Canadian scene, one can be sure another service provider will take its place. Moving forward, the Clearview report will remain an important touchstone because it formally recognized the harm to individuals caused by “the myriad of instances where false, or misapplied matches could result in reputational damage” and “the affront to individuals’ privacy rights and broad-based harm inflicted on all members of society, who find themselves under continual mass surveillance by Clearview based on its indiscriminate scraping and processing of their facial images” (para. 89).

In his statement of February 3, 2021 (here), Privacy Commissioner of Canada Daniel Therrien stated that Clearview’s activities were illegal and completely unacceptable. While Clearview has been held to account, it remains to be seen whether Canadian courts will find the widespread deployment of such a problematic evidence-gathering tool by police similarly unacceptable.
