Social media and messaging service companies need to do much more to meet their human rights responsibilities in Ukraine and other crises and conflicts around the world, Human Rights Watch said today.
Since Russia’s invasion of Ukraine on February 24, 2022, companies such as Meta, Google, TikTok, and Twitter have announced numerous measures, most aimed at countering harmful disinformation. They have added labels to or blocked state-sponsored or state-affiliated media, and introduced extra safety measures. While it is too early to fully assess the adequacy of these steps, early reports indicate that efforts to counter harmful disinformation and misinformation are falling short, raising serious questions and concerns about whether these companies are meeting their human rights responsibilities.
“While most large social media platforms rushed to issue statements and enact emergency measures, the war in Ukraine brought to the surface what Human Rights Watch and others have documented for years,” said Deborah Brown, senior technology researcher at Human Rights Watch. “Companies have chronically underinvested in addressing the human rights issues in many countries where people rely on their products and services, and Ukraine is no exception.”
To explain the human rights responsibilities of technology companies in crises, Human Rights Watch released a detailed question-and-answer document, “Russia, Ukraine, and Social Media and Messaging Apps.” It explains what the companies have done regarding Ukraine before and during the crisis, and the extent to which they are meeting their responsibility to respect human rights. It provides background on these companies’ actions and inactions in other conflicts and offers recommendations for how they should meet their human rights responsibilities in wars and crises.
One clear finding is that companies should exhibit more clarity, consistency, and transparency so their actions can be assessed against their human rights responsibilities, which apply wherever their services are used.
Russia’s invasion laid bare the companies’ earlier lack of investment and transparency in Ukraine, Human Rights Watch said. As early as 2015, Ukraine’s then-president reportedly asked Facebook to stop the Kremlin from spreading misinformation on the social network, which he said was mobilizing support for Russia’s occupation of parts of Ukraine. In September 2021, the Ukrainian minister of digital transformation asked Google officials in Silicon Valley to conduct their content moderation for Ukraine from within the country, rather than from Russia.
Human Rights Watch wrote to Google on March 9, inquiring whether its content moderators for Ukraine are still based in Russia and whether they have an office in Ukraine. Google has not yet responded.
In a submission to an April 2021 report of the UN special rapporteur on freedom of expression, Ukraine said that “measures taken by the social media companies, the practices of blocking fake profiles and activities of fact-checkers were only partly effective.” Ukraine noted that the success of the companies’ efforts to combat disinformation also required a “higher level of transparency.”
Under the UN Guiding Principles on Business and Human Rights, companies have a responsibility to respect human rights and to remedy abuses. This means they should avoid infringing on human rights and take steps to address adverse human rights impacts directly linked to their practices or operations. The actions companies take should meet international human rights standards, be carried out in a transparent and accountable way, and be enforced in a consistent and nonarbitrary manner.
To fully assess whether companies are respecting human rights, there is an urgent need to provide independent researchers with access to data, including researchers in the fields of human rights, disinformation, hate speech, and incitement to violence, among others, while protecting user privacy, Human Rights Watch said.
Many of the social media companies’ actions during the war in Ukraine, such as taking down accounts, geo-blocking state-affiliated media channels, and removing and demoting content, have implications for freedom of expression. Companies need to be able to demonstrate how these actions comply with human rights standards, including whether restrictions on expression are necessary and proportionate to a legitimate aim, and whether they are procedurally fair.
To avoid arbitrary, biased, or selective decisions, companies should only take steps based on clear, established, and transparent processes, Human Rights Watch said.
Social media platforms and other content hosts that remove content, especially during a crisis, should also preserve and archive material that may serve as evidence of human rights abuses, while ensuring the privacy and security of vulnerable people.
“Tech companies have had to deal with conflict playing out on and through their platforms for years in many countries around the world, including in Ukraine,” Brown said. “The conflict in Ukraine highlights the life-or-death importance of taking their responsibilities seriously and devoting the resources needed to ensure that their products don’t facilitate or contribute to abuse and harm, wherever people use them.”