Mass Surveillance Is Bad News for Privacy — and Democracy

Canada is reviewing its privacy legislation, and facial recognition technologies are under scrutiny. It’s well past time to strictly regulate their use by both public and private actors.

A screen demonstrates facial recognition technology at the 2019 World Artificial Intelligence Conference. (Qilai Shen / Bloomberg via Getty Images)

It’s not paranoia if they’re really out to get you — or so goes the old line. In a recent report on the panoptic eye in the sky, Canada’s Standing Committee on Access to Information, Privacy and Ethics hasn’t gone quite so far as to drop that old chestnut. But it has recommended a handful of measures to prepare a framework for the use of surveillance technology in Canada, particularly its use by law enforcement.

The Royal Canadian Mounted Police (RCMP) paused its use of artificial intelligence and facial recognition software in 2020 after technology from Clearview AI (one of the RCMP’s contractors) was deemed by the Office of the Privacy Commissioner of Canada to constitute illegal mass surveillance. As Maura Forrest reports for Politico, the RCMP is keen to resume using the technology. The force is also cagey about the technologies it relies upon and the extent to which it plans to use these surveillance systems in the future. That’s bad news for privacy — and democracy.

AI and facial recognition technology are important tools for policing crimes such as human trafficking and child sexual exploitation. However, Canada lacks sufficient legislation for controlling these technologies and holding those who use them to account for potential overreach and abuses. And they are indeed ripe for abuse. As the Canadian Press notes, members of Parliament expressed concern about privacy protection issues “including accuracy, retention of data and transparency in facial recognition initiatives, as well as a comprehensive strategy around informed consent by Canadians for the use of their private information.” Those concerns extend to both state use of surveillance technologies and the private sector’s collection of individuals’ images online and in public. So far, Canada is flying blind on facial recognition surveillance, and that’s unacceptable.

The Right to Privacy

Democracy and privacy go hand in hand. While so much of democratic life is lived in the open — in the public sphere — plenty more of it occurs, by necessity, beyond the gaze of the public eye, and especially that of the state. Healthy public lives require healthy private lives in which citizens can go about their business confident they aren’t being surveilled by state or private interests. It is vital that people can freely spend time sorting out who they are, what they want, and how they wish to live without the pressures that come with being watched. Moreover, privacy is an essential check on state power and its abuse. Without privacy, individuals and groups cannot freely and fully exercise the rights liberal democrats are so quick to insist belong to all who live within the liberal democratic framework.

Philippe Dufresne, the Privacy Commissioner of Canada, welcomed the Standing Committee’s report, saying it “reiterates the pressing necessity of ensuring the appropriate regulation of privacy impactful technologies such as facial recognition and artificial intelligence in a way that protects and promotes Canadians’ fundamental right to privacy.”

Parliament is reviewing Bill C-27, which would enact three privacy-related acts and amend a handful of others. The government is under pressure to strike a balance between responding to the new realities brought about by digital technologies and upholding the age-old right to individual privacy.

OpenMedia, an open-internet advocacy organization, has been critical of the legislation under review, calling parts of it an “absolute bare minimum for privacy protections in Canada” and warning that “in some cases it will make things actually worse.” That’s without diving into the Artificial Intelligence and Data Act, the part of Bill C-27 that pertains to a number of surveillance issues. But the other elements of the bill don’t bode well for a high privacy score on regulating facial recognition. Writing in the Toronto Star, Teresa Scassa, a research chair in information law and policy at the University of Ottawa, critiques Bill C-27, arguing its measures favor industry over individuals and noting that there “is no level playing field between individuals and organizations when it comes to data.”

The same may be true when it comes to policing and, more broadly, to the state. Scassa focuses on imbalances between the individual and industry, noting, “In order to access many of the services we depend upon (including banking and telecommunications) and increasingly across every human activity, our data are collected and we are tracked and monitored.” We risk the same imbalance between the state and citizens, who have no choice but to use government services and to be out and about in public. The key difference is that unlike industry, governments are meant to be accountable to citizens and to reflect the will of the people — or so goes the theory of representative democracy.

Privacy Is Essential to Democracy

Scassa concludes that we cannot simply take the government’s word that it will protect us — that trust must be earned. When it comes to surveillance technologies, there is no reason to trust the state at all. We must insist that the state slow down its deployment of surveillance technologies and consult broadly and deeply with the public, experts, and parliamentarians on what is to be done. This is particularly important because once powers are given — or taken — they tend to be difficult to rescind. The laws and norms of acceptable use we adopt now may lock us into a long-term surveillance arrangement. We ought to take our time to get that right.

The Ottawa-based International Civil Liberties Monitoring Group highlights five of its recommendations that were picked up by the privacy and ethics committee and included in its report. Those measures include “the establishment of ‘no-go zones’ where facial recognition is prohibited” and “a moratorium on the use of [facial recognition technology, or FRT] by federal law enforcement until new legislation to regulate police use of FRT is put in place.” Furthermore, greater public consultation and specific FRT governance legislation are needed, alongside a beefier role for the Office of the Privacy Commissioner in addressing “both public and private sector violations of privacy laws.” These recommendations ought to be nonnegotiable elements of Canada’s approach to regulating facial recognition technologies.

Privacy is essential for democracy. A free and democratic society can only function as such if people can trust they are not being surveilled by either public or private actors without due cause or freely given consent. And consent cannot be freely given if there’s a significant imbalance of power. Canada’s government ought to heed the recommendations of the privacy and ethics committee, slow down and take its time, consult with the electorate, and ensure it thoroughly protects the privacy of Canadians and restrains the use of current and future surveillance technologies.