Technology company Clearview AI’s scraping of billions of images of people from across the Internet represented mass surveillance and was a clear violation of the privacy rights of Canadians, the Office of the Privacy Commissioner of Canada has found after conducting an investigation into the matter.
The joint investigation by the Office of the Privacy Commissioner of Canada, the Commission d'accès à l'information du Québec, the Office of the Information and Privacy Commissioner for British Columbia and the Office of the Information and Privacy Commissioner of Alberta concluded that the New York-based technology company violated federal and provincial privacy laws.
“What Clearview does is mass surveillance and it is illegal. It is completely unacceptable for millions of people who will never be implicated in any crime to find themselves continually in a police lineup. Yet the company continues to claim its purposes were appropriate, citing the requirement under federal privacy law that its business needs be balanced against privacy rights. Parliamentarians reviewing Bill C-11 may wish to send a clear message, through that bill, that where there is a conflict between commercial objectives and privacy protection, Canadians’ privacy rights should prevail,” said Daniel Therrien, Privacy Commissioner of Canada.
Clearview AI’s technology allowed law enforcement and commercial organizations to match photographs of unknown people against the company’s databank of more than 3 billion images, including images of Canadians and children, for investigation purposes. Commissioners found that this creates the risk of significant harm to individuals, the vast majority of whom have never been and will never be implicated in a crime.
The investigation found that Clearview had collected highly sensitive biometric information without the knowledge or consent of individuals. Furthermore, Clearview collected, used and disclosed Canadians’ personal information for inappropriate purposes, which cannot be rendered appropriate via consent, the Office of the Privacy Commissioner said.
“Clearview's massive collection of millions of images without the consent or knowledge of individuals for the purpose of marketing facial recognition services does not comply with Quebec's privacy or biometric legislation. The stance taken by Clearview that it is in compliance with the laws that apply to it, underscores the need for greater oversight of the use of this technology as well as providing regulatory authorities with additional tools of deterrence like those proposed in Bill 64,” said Diane Poitras, President of the Commission d'accès à l'information du Québec.
When presented with the investigative findings, Clearview argued that:
- Canadian privacy laws do not apply to its activities because the company does not have a “real and substantial connection” to Canada;
- Consent was not required because the information was publicly available;
- Individuals who placed or permitted their images to be placed on websites that were scraped did not have substantial privacy concerns justifying an infringement of the company’s freedom of expression;
- Given the significant potential benefit of Clearview's services to law enforcement and national security, and the fact that significant harm is unlikely to occur to individuals, the balancing of privacy rights and Clearview’s business needs favored the company’s entirely appropriate purposes; and
- Clearview cannot be held responsible for offering services to law enforcement or any other entity that subsequently makes an error in its assessment of the person being investigated.
Commissioners rejected these arguments. They were particularly concerned that the organization did not recognize that the mass collection of biometric information from billions of people, without express consent, violated individuals’ reasonable expectation of privacy, and that the company was of the view that its business interests outweighed privacy rights.
On the applicability of Canadian laws, they noted that Clearview collected the images of Canadians and actively marketed its services to law enforcement agencies in Canada. The RCMP became a paying customer and a total of 48 accounts were created for law enforcement and other organizations across the country.
The investigation also noted potential risks to individuals whose images were captured and included in Clearview’s biometric database, including the risk of misidentification and exposure to data breaches.
The privacy authorities recommended that Clearview stop offering its facial recognition services to Canadian clients; stop collecting images of individuals in Canada; and delete all previously collected images and biometric facial arrays of individuals in Canada.
“As the use of facial recognition technology expands, significant issues around accuracy, automated decision making, proportionality and ethics persist. The Clearview investigation shows that across Canada we need to be discussing acceptable uses and regulation of facial recognition. Regulation would not only assist in upholding privacy rights, it would provide much needed certainty to all organizations thinking about using or developing the technology,” said Jill Clayton, Information and Privacy Commissioner of Alberta.
Shortly after the investigation began, Clearview agreed to stop providing its services in the Canadian market. It stopped offering trial accounts to Canadian organizations and, in July 2020, discontinued services to its only remaining Canadian subscriber, the RCMP.
However, Clearview disagreed with the findings of the investigation and did not demonstrate a willingness to follow the other recommendations. Should Clearview maintain its refusal, the four authorities will pursue other actions available under their respective Acts to bring Clearview into compliance with Canadian laws.
A related investigation by the Office of the Privacy Commissioner of Canada into the RCMP’s use of Clearview AI’s facial recognition technology remains ongoing. The federal Commissioner's office, along with its provincial counterparts, is currently developing guidance for law enforcement agencies on the use of facial recognition technologies, with guidelines expected to be published for stakeholder consultation in the spring.