- The paper calls for the development of a new policy framework to effectively address issues of justice that arise in a range of digital contexts.
- It recommends increasing the judicial system’s capacity to adjudicate more claims, and creating a resource guide that helps victims successfully navigate the system.
- Advances in technology, particularly “deepfakes,” have highlighted gaps in legal and justice systems; protecting digital rights must encompass all forms of self-determination and personhood.
- Read the paper here.
San Francisco, USA, 29 September 2021 – The World Economic Forum has published the Pathways to Digital Justice report to address systemic legal and judicial gaps, and help guide law and policy efforts towards combating data-driven harms. This is particularly important with the increase in online activities and digitization of services, which – when misused – can present new types of risk.
“Global legal systems are set up in a way that disempowers those who have been victims of digital harm. The concept of digital rights extends beyond the online realm and has implications across the entire judicial system in terms of how we can rectify the power imbalances that currently exist and provide more agency to victims to ensure that the redress provided to them is relevant, meaningful and trauma-informed,” said Sheila Warren, Deputy Head of the Centre for the Fourth Industrial Revolution, World Economic Forum.
The Forum’s Global Future Council on Data Policy worked with the Global Future Council on Media, Entertainment and Sport and the Global Future Council on Artificial Intelligence for Humanity, in collaboration with an advisory committee consisting of experts from around the world, to make the case that a new policy framework is needed to effectively address issues of justice that arise in a range of digital contexts.
It is critical to note that this report does not claim that technology itself is the sole source of harm, or that regulating technology is the only viable response to data-driven and predictive technologies. Technology will continue to advance and has the potential to bring real benefits to society. Rather, this report calls attention to the inadequacy of legal and judicial systems – and of the quasi-legal and judicial processes that platforms offer – in addressing the types of harms that arise from these technologies.
In one example, Gibi, an American YouTuber and ASMR artist, has been repeatedly targeted by deepfakes and online harassment. The problem got so bad that she had to change her name, move out of her home and be extremely vigilant when revealing any potentially identifying information about herself. She has since discovered several online businesses in which others profited from selling her image without her consent in deepfakes and other fake, often pornographic, material.
“I can’t think of a single organization that is equipped to deal with this; lawmakers and different governments just let it go. I’d like to see perpetrators brought to justice and know that it’s a crime,” said Gibi.
She is far from alone. According to 2019 research from Sensity.ai, of the 85,000 deepfakes circulating online, 96% are pornographic, and over 99% of those pornographic deepfakes depict women. Yet victims of deepfakes and other forms of digital harm lack meaningful ways to obtain justice.
What is digital justice?
Digital justice provides a space through which people can investigate community problems and generate solutions. Owing to fragmented and under-resourced legal and judicial systems, and the jurisdictional challenges of regulating international communications platforms, technology-enabled harms have flourished. Compounding the problem, these systems default to inadequate privacy-based protections, limited legal remedies and a lack of fair process in automated decision-making.
Most concerning is that legal and judicial systems currently fall short of protecting their citizens, particularly women, LGBTQI+, BIPOC and other historically marginalized communities. Providing individuals with robust pathways to recourse and redress should be vital aspects of how organizations and governments contend with the ethics and governance of data-driven technologies.
What can the legal and judicial system do?
The report recommends two multistakeholder pathways to digital justice:
- Increase the judicial system’s capacity to adjudicate more claims
- Create a victim resource guide containing, at minimum, the 10 core victim-centered components outlined in the World Economic Forum’s digital justice paper
There is an opportunity to centre digital transformation initiatives and system design on strengthening and protecting specific, articulable rights. Systems that prioritize this frame would design for many of the same agency and redress rights, but would measure success by impact rather than by volume.
To ensure that these systems maintain their integrity and independence, it is essential to provide accessible dispute resolution and professional stewardship infrastructure for those who cannot represent their own interests.
Duty of care is another crucial legal construct that helps convey the criticality of effective recourse measures and digital due process rights. Duty of care is the common law term describing the responsibilities that individuals owe one another, and while the nature of business and markets has evolved in the digital era, these underlying responsibilities have not. The concept of duties remains a useful tool, prompting the questions that allow people to define responsible governance in digital ecosystems.