Russian authorities continue to expand their use of facial recognition technology across the country, with no regulation, oversight, or data protection, and despite reports of misidentification, Human Rights Watch said today. The unregulated use of this technology has serious implications for human rights and fundamental freedoms and has already facilitated the authorities’ targeting of political opponents.
On August 27, 2021, Moscow’s government published a call for bids to upgrade the system law enforcement uses to access the facial images and video footage collected by the city’s vast CCTV network. The upgrade would enable the police to track the movements of targets and of those who frequently associate with them, and to identify people who repeatedly appear at a particular location. The authorities also plan to expand the filters for data searches, for example by gender, age, or race.
“Russian authorities should stop expanding their irresponsible and unregulated use of highly intrusive facial recognition technology,” said Hugh Williamson, Europe and Central Asia director at Human Rights Watch. “Weighty privacy concerns outweigh any alleged public safety benefits, and there is a need to protect fundamental rights such as liberty and freedom of association against the Kremlin’s misuse of this technology.”
Moscow authorities also plan to introduce silhouette recognition. This technology creates a purportedly unique “silhouette” of a person based on height, size, clothing, and other factors to track movements when a subject’s face is not visible. New video surveillance systems allowing for silhouette and vehicle recognition have already been introduced in five regions of Russia.
The Moscow IT Department said the planned upgrade aims to create a safer environment for Moscow residents and to improve data protection. According to the authorities, in July, the Moscow metro facial recognition system detected 221 people from a list of wanted criminal suspects.
Privacy lawyers and digital rights groups have already expressed concerns over the human rights implications of the upgrade, including its potential to enhance the Kremlin’s capacity to surveil and harass anyone it perceives as an opponent.
The analytical agency Telecom Daily ranked Russia second in the world for CCTV network growth this year. According to the Moscow government, video analysis algorithms process video streams from the city’s 125,000 cameras. Facial recognition has a wide range of uses in Moscow, from monitoring payments on public transportation to tracking traffic and running a school pass system. More than 5,000 cameras with facial recognition software already operate in other regions of Russia.
Despite the widespread use of facial recognition technology and vast amounts of data gathered by the authorities, Russian law still does not regulate the use of such technology, except in banking.
Russia’s Law on Personal Data protects information relating to an identifiable person. However, Moscow’s IT Department claims that the data gathered by the city’s cameras cannot be considered personal because it is collected and stored anonymously. Identification, the officials argue, is carried out by the law enforcement agencies that have access to the system. Moscow courts have upheld this position. At the same time, law enforcement procedures for processing data received from the facial recognition surveillance systems are not open to public scrutiny.
In April 2020, the Russian parliament adopted a law “On Experimenting with Artificial Intelligence,” which allows Moscow authorities to test new technology, including facial recognition, free of most of the restrictions imposed by personal data legislation.
“The federal law on experimenting with artificial intelligence made Moscow authorities feel that they can do whatever they want,” said Kirill Koroteyev, the head of the international justice program at Agora, a leading Russian network of human rights lawyers. “Thus, they not only neglect the legislation on personal data but also disregard international standards.”
The lack of regulation and accountability for the use of facial recognition technology at the federal level has resulted in local policies that lack transparency and data protection, Human Rights Watch said.
Roskomsvoboda, a prominent Russian digital rights organization, has documented multiple data leaks from the Moscow facial recognition system. Publicly available policies governing the processing of Moscow’s video surveillance data do not regulate the use of facial recognition data, and the way the data is reportedly used diverges from those policies.
In September 2020, an activist with Roskomsvoboda filed a lawsuit against the Moscow government after she was able to purchase, on the dark web, data leaked from the city’s facial recognition-equipped CCTV system that revealed her whereabouts over the course of one month. At the trial, Roskomsvoboda lawyers pointed out that while Moscow’s data processing policies require that data not be stored for more than two to five days, the plaintiff was able to acquire data covering a longer period.
The data from Moscow’s video surveillance system with facial recognition is stored at the United Center for Data Processing and Storage, managed by Moscow’s IT Department. The Russian government plans to expand facial recognition-enabled video surveillance across the country and potentially to combine all the data gathered countrywide.
While the police increasingly rely on facial recognition in their daily work, reports show that the technology is far from flawless.
In October 2020, Sergey Mezhuyev reported to Roskomsvoboda that he had been mistakenly detained by the police in the Moscow metro. The facial recognition system falsely matched his image with that of a person on a wanted list and instantly notified the police. Mezhuyev said the police officers detained, searched, and fingerprinted him and, while acknowledging the error, warned that his “troubles won’t be over” until they found the man they were looking for. Mezhuyev said the system had compared his image to a rough composite sketch of the criminal suspect.
In November, a security officer stopped Anton Leushin at a mall, claiming the facial recognition system had identified him as a thief, and called the police. After six hours of interrogation at the police station and threats of an eight-year prison sentence, Leushin managed to prove his alibi and was released, he said.
The authorities already use the facial recognition technology at their disposal to prosecute members of the political opposition and peaceful protesters.
Following the January protests over government corruption and the arrest of the political opposition leader Alexey Navalny, media reported the detention and prosecution of more than a dozen protest participants and passers-by based on facial recognition data. Some reported being stopped by the police days after, or even before, the protest because they were on a list of “repeat participants in unsanctioned protests.”
In April, Moscow authorities continued to use the facial recognition system to identify and prosecute participants in peaceful protests.
“The use of facial recognition technology to curtail the freedom of expression and association of people in Russia shows the repressive potential of this technology,” Williamson said. “The government should stop using this intrusive technology in public spaces instead of continuing to expand its use without minimal safeguards in place.”