Guillermo Federico Ibarrola was waiting for a train in the Buenos Aires metro when he was picked up by two police officers, booked and detained for six days. Ibarrola hadn’t committed a crime, nor was he a witness to one. Rather, Ibarrola was the unfortunate victim of a glitch in the city’s newfangled, artificial intelligence–based facial recognition system, installed citywide in 2019 and now covering 75 percent of the Argentinian capital.
Years after such biometric systems first became available, most European capitals still haven't followed Argentina's example. But that could now change. As large swaths of France take to the streets in protest against a proposed rise in the pension age, the National Assembly has been quietly pushing forward a bill tied to the upcoming 2024 Olympics that privacy experts say will drastically expand the state's policing and surveillance capacities.
The so-called Olympics and Paralympics law is being fast-tracked through Parliament after being proposed by Emmanuel Macron's ruling Renaissance party (formerly En Marche!) last December. If adopted, it will pave the way for the installation of a controversial AI-based biometric categorization system, billed as merely "experimental" by its proponents, aimed at detecting "predetermined events," such as crowd movements, terrorist attacks, or acts of violence. In short, the bill is named after the Olympics, but it's about much more.
“The government’s intention through the bill is to increase the social acceptability in France of generalized surveillance,” Elisa Martin, a deputy from the left-wing New Ecological and Social Popular Union (NUPES) coalition, told Jacobin.
An “Experimental” System
At the heart of the controversy surrounding the proposed Olympics and Paralympics bill is Article 7. This article states that “images collected by means of video protection systems,” including surveillance cameras around stadiums, metros, and other areas, “may be subject to processing, including through an artificial intelligence system.”
The proposed AI-based system is "experimental," the authors write, and was initially planned to last through June 2025 — well beyond the Olympic Games, which are slated for August 2024. (This was later amended to last through the end of December 2024.) According to the authors of the law, the system would not include facial or biometric identification, but rather would use AI to flag "atypical" or "risky" behavior, such as standing in one place for long periods or speaking in raised voices.
Privacy advocates say that while these systems are billed as not including facial recognition, for all intents and purposes they can be used in the same way.
“There’s a whole range of things you can do with that data,” including grouping people together by hair or skin color, recognizing emotions and making inferences about behavior based on these categories, Daniel Leufer, a senior policy analyst at Access Now, told me. “You’re capturing people’s behavior, the way they walk, the way that they’re shouting. The types of data that you’re gathering or processing about people will allow for them to be identified.”
In recent weeks, civil society groups have joined the chorus of mostly left-wing politicians opposing the law. On March 7, thirty-eight organizations, including Access Now, Amnesty International, and AlgorithmWatch, a Berlin-based AI watchdog group, signed an open letter asking for Article 7 to be removed from the law.
Katia Roux, who heads advocacy at Amnesty International France, told Jacobin that the law, in addition to infringing on antidiscrimination and privacy standards, "can have an extremely dissuasive effect on the freedom of expression, the freedom of peaceful assembly." "When we know we're being surveilled, or when we think we're under surveillance, obviously we don't act the same way," she said. "There can be a form of self-censorship."
That the system would be deployed in Saint-Denis, the largely minority, working-class suburb north of Paris where much of the Olympics infrastructure is being built, she added, only adds to the potential for misuse.
Martin, from NUPES, noted that Article 7 is not the only concerning element of the Olympics and Paralympics law. It also provides for US Transportation Security Administration (TSA)–style body scanners at stadium entrances and the deployment of tens of thousands of police during the upcoming Games.
The proposed law, she added, is one of several laws and directives passed in recent months that aim to criminalize dissent in France, including a directive passed by the Interior Ministry that greatly expands the scope of misdemeanor fines. "Who is this aimed at? It's aimed at poor people, it's aimed at activists," she said. "It's part of a continuum of threats to civil liberties that's being slowly and steadily reinforced."
Surveillance Shock Doctrine
This wouldn't be the first time the Olympics have been used to push through extensive security measures, Jules Boykoff, the author of the book Power Games: A Political History of the Olympics, told me.
“When it comes to policing and the Olympics, local and national police forces in the host city and country use the Olympics like their own private cash machine, getting all the funding, special weapons and even laws that it would be very difficult to get during normal political times,” Boykoff told Jacobin. “And so, what we’re seeing transpire in Paris is what we see transpire in numerous cities.”
The 2004 Summer Olympics in Athens were a case in point.
Throughout those Games — the first to take place after 9/11 — a large blimp fitted with cameras surveilled Athens. Seventy thousand police and military personnel patrolled the streets. In computer centers hidden from public view, security agents monitored thousands of surveillance cameras that had been set up to identify "suspicious behavior."
The security was so intensive that one Greek scholar referred to that year's Olympics as a "super-panopticon," in reference to an eighteenth-century prison design that allowed a single guard to monitor all of a penitentiary's inmates at once, without the inmates being able to tell whether they were being watched.
Other host cities, including Rio de Janeiro and Tokyo, saw similarly expansive security measures passed, Boykoff writes in Power Games.
In France, activists worry that the “state of exception” created by the Olympics will lead to a long-term buildup of surveillance technology, much as it did in Greece — which is currently in the midst of a spying scandal nicknamed Predator-gate, brought about by allegations that the ruling party used a spyware tool called Predator to spy on journalists and politicians.
“The Olympics are an accelerator of surveillance,” “Alouette,” a privacy activist at La Quadrature du Net, a digital rights group, who preferred to remain anonymous for security reasons, told Jacobin. “For the security industry, especially the French industry, they are a huge opportunity.” French, and international, surveillance companies “really see the Olympic Games as a security showcase that allows them to show off their know-how,” she added.
While it’s unclear which company would provide the technology for AI-based surveillance for the Olympics, one company, the Israeli firm BriefCam, already deploys video surveillance technologies in more than two hundred municipalities across France, La Quadrature has noted.
To Boykoff, the idea of using the Olympics to push through these types of measures is common because of the popular nature of the Games. “You get people kind of used to it in that happy face, celebration capitalism environment of the Olympics, and then that new technology that was injected during the Games in that state of exception becomes the norm for policing moving forward.”
A European Precedent
The French law also comes as the European Union is preparing to pass an AI Act that would regulate artificial intelligence. This legislation, proposed in April 2021, could be adopted as soon as next month and would place significant restrictions on algorithmic video surveillance.
According to Roux, passage of the Olympics law would position France as Europe's "champion of surveillance" and could be used to chip away at the AI Act from below, weakening future EU restrictions on surveillance.
“The precedent that it creates and the signal it sends to member states is extremely dangerous,” she said.
Experts have also questioned the effectiveness of algorithmic surveillance itself.
“You have an inherent problem if you’re using machine learning to predict rare events,” Leufer said. “If that’s what you’re doing with your system, that should be a red flag already. That’s precisely the type of thing that machine learning is not good at. The less data you have on the type of thing that you’re trying to predict, the worse the system is going to be at it.”