22 Oct Coding Justice? The Tradeoffs of Using Technology for Documenting Crimes in Ukraine
[Raquel Vazquez Llorente is the Head of Law and Policy, Technology Threats & Opportunities, at WITNESS. Wendy Betts is the Executive Director of eyeWitness, initiated by the International Bar Association.]
Traditionally, international justice has adjudicated abuses and crimes that occurred well in the past. For the first time in history, we are seeing active and cooperative investigations from the very beginning of a conflict. Not only Ukraine itself, but also multiple other countries, such as Germany, the UK and Sweden, and international institutions like the International Criminal Court, are already gathering information for future accountability processes.
The speed with which this has been set in motion, and the abundance of actors and accompanying resources, are due in large part to an effort to leverage technology to collect information that can help advance justice and accountability more efficiently, from criminal cases to human rights reporting. Various Ukrainian ministries have built portals to enable citizens to upload information, and organizations are adapting existing technology to document international crimes. Photos and videos in particular have received a lot of attention for their potential to uncover abuses, and multiple initiatives are underway to provide technology to record visual evidence. Yet, when trying to build new tools or adapt existing technology for a conflict zone, it is important to acknowledge the work that other actors have been conducting for years in the technology and human rights space. When moving fast endangers lives, the stakes are simply too high for the international community to ignore what past experience has taught us.
We know that using technology to collect data for evidence in accountability processes involves trade-offs. While people in conflict zones film events for many reasons other than sharing the footage with the International Criminal Court and other judicial bodies, these institutions have standards for information collection that can be difficult to meet once the footage has been captured. Coding decisions must balance competing priorities from the design phase, particularly concerning users' anonymity, their ownership, now and in the future, over the information they have collected, and the process for authenticating the photos and videos according to legal standards.
The Clash Between Anonymity, Verifiability and Control Over the Data
A key characteristic of software designed for gathering potential evidence of war crimes is that it must meet the needs of two constituencies: those collecting the information and those using it for analysis. Documentation teams, activists and other individuals capturing information want a tool that is easy and secure to use, protects their privacy, and allows them to maintain control over the information they collect. Investigators, prosecutors, and other organizations taking in the information to build cases or write reports want as much detail as possible about the data that was collected and the person who recorded the footage. They also want to limit access to the information and its circulation.
Addressing the needs of these two constituencies so war crimes footage is effectively used for justice and accountability is one of the principal challenges when developing this type of documentation technology.
For instance, anonymity of the individuals collecting information is a difficult feature to balance with the demands of justice. Individuals recording the footage may want to stay anonymous because of genuine threats to their personal safety. They may not want their IP addresses or other identifiable information collected. Unfortunately, applying lessons learned from other sectors that also deal with sensitive data is not straightforward. As an example, healthcare data can be anonymized for research purposes without losing its value. Personal details such as names or contact information are not relevant to understanding how cancer can be diagnosed earlier. However, such details are highly relevant for criminal justice investigators, who want as much information as possible about the source. Knowing who has accused you, and of what, is a fundamental tenet of fair trials that we should not forgo, lest we create a Kafkaesque international justice system.
People using tools for documenting human rights abuses and the investigators making use of that information need to have full and transparent knowledge of what identifying information is being collected, where it is stored, and who has access to it. Documenters, survivors and witnesses need to be informed that for certain fact-finding processes, such as criminal justice, any details they supply may be publicly released or shared with other parties, including the defense. Anything collected can be subpoenaed by governments with ulterior motives, and it is open to disclosure if the organization preserving the data is not able to resist or narrow down requests for information that are overbroad or put individuals at risk. If personally identifiable information has been collected, the anonymity of the source can never be guaranteed.
Relatedly, if people using a tool to collect information prefer to stay anonymous, this will inevitably interfere with the control they have over their information. Once they have uploaded the footage to a portal or a server managed by an institution, they are at the mercy of the organization that now hosts the information. Even if the entity's mandate is exclusively to use the data for accountability for human rights violations, the concept of justice may vary across cultures and peoples. Some people may agree to their information being used for United Nations reporting, but may want to stay away from criminal justice processes.
Individuals recording footage understandably want a certain level of agency over the footage they collect. Users' consent must not only be fully informed, but also agile, given as close as possible to the point when their data is shared with a third party. For these conversations to happen, the user would have to provide enough information to be reachable, or receive sufficient details in advance to ensure their agreement covers a vast array of data-sharing scenarios. As with anonymity decisions, what is crucial is that people using these documentation tools understand what control and ownership they have over the data collected, and that investigators know what level of consent the users have given.
Media Authenticity and Justice: The Blockchain Fallacy
Technology designers will also need to balance how effortless it is to capture or upload footage with the verification practices required for accountability processes. At a minimum, institutions receiving the footage will want to know not only where it came from and when it was taken, but also that it has not been edited. Tools developed for these purposes will require compromises. For instance, controlled-capture technologies, which lock in provenance details when photos and videos are recorded, make it easier for investigators to verify the footage. However, current tools require the software to be installed on the device. This offers less flexibility to the user, who would need to have identified, downloaded and used that specific software to take the photo or video. Having a specialized tool at the ready may not be feasible, and even when one is available, citizens in a conflict zone often default to their phone's built-in camera.
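The core idea behind controlled capture can be illustrated with a minimal sketch: at the moment of recording, the app binds a hash of the image to provenance metadata and signs the bundle, so any later edit to either becomes detectable. This is a simplified illustration only; the function names and the shared-secret scheme are hypothetical, and real controlled-capture tools rely on public-key signatures with hardware-backed keys rather than a secret embedded in code.

```python
import hashlib
import hmac
import json
import time

# Hypothetical device-held secret for illustration. Real controlled-capture
# apps use hardware-backed asymmetric keys, not a shared secret like this.
DEVICE_KEY = b"device-secret-key"

def seal_capture(image_bytes: bytes, latitude: float, longitude: float) -> dict:
    """Bind provenance metadata to the image at the moment of capture."""
    record = {
        "sha256": hashlib.sha256(image_bytes).hexdigest(),
        "captured_at": time.time(),
        "lat": latitude,
        "lon": longitude,
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return record

def verify_capture(image_bytes: bytes, record: dict) -> bool:
    """Check that neither the image nor its metadata changed after sealing."""
    claimed = dict(record)
    signature = claimed.pop("signature")
    payload = json.dumps(claimed, sort_keys=True).encode()
    expected = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(signature, expected)
            and claimed["sha256"] == hashlib.sha256(image_bytes).hexdigest())
```

The crucial property is that the seal is created at capture time, inside software the investigator trusts; footage taken with an ordinary camera app and sealed afterwards offers no such guarantee.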
On the other hand, some data collection tools may allow people to upload footage they have taken with their regular phone camera. But footage recorded with a standard phone camera can be edited, as can any information about where and when that footage was captured. Even if some tracking details are embedded at the point the footage is submitted, often relying on blockchain technology, there will be shortcomings: uploading photos and videos to a blockchain does little for verification if the footage has not been authenticated first.
Blockchain technology can offer a misleading appearance of authenticity. While it can prove that the footage has not been edited while in the system, blockchain cannot tell you whether the photo or video was manually altered prior to upload or generated by artificial intelligence. Malicious actors can easily submit photos and videos with edited metadata, or even deepfakes. Developers of software for war crimes documentation should not overstate what their technology can verify and authenticate, and need to be upfront about what metadata and content may require additional verification techniques.
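The limitation can be made concrete with a toy model, assuming we stand in for a blockchain with a simple append-only list of content hashes (the function names here are illustrative, not from any real system). Anchoring a hash proves only that the footage has not changed since the anchor was recorded; footage doctored before upload anchors just as cleanly as authentic footage.

```python
import hashlib

# Stand-in for a blockchain: an append-only list of content hashes.
ledger: list = []

def anchor(footage: bytes) -> str:
    """Record the footage's hash; proves integrity only from this point on."""
    digest = hashlib.sha256(footage).hexdigest()
    ledger.append(digest)
    return digest

def unchanged_since_anchor(footage: bytes, digest: str) -> bool:
    """True if this exact footage was anchored and has not changed since."""
    return digest in ledger and hashlib.sha256(footage).hexdigest() == digest

authentic = b"raw camera frame"
doctored = b"frame edited before upload"

# Both anchor successfully: the ledger cannot tell them apart, because
# edits made before upload happened outside the system entirely.
a1 = anchor(authentic)
a2 = anchor(doctored)
```

In this model, `unchanged_since_anchor(doctored, a2)` returns `True`: the ledger faithfully confirms the doctored file is unchanged since anchoring, which is precisely why anchoring alone is not authentication.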
What’s Ahead for Documentation Technologies
With the war between Russia and Ukraine, we are witnessing a shift in the investigation of war crimes and widespread human rights violations in conflict. Some of these changes may be specific to this conflict, but other developments may signal how accountability for mass-scale violations will unfold in the future. Inescapably, technology will play a bigger role in the documentation of coming conflicts. Rather than quick and ad hoc fixes as each new situation arises, we should invest in the responsible evolution of once-niche media authentication tools, many of them pioneered by human rights organizations, as well as of mainstream applications driven largely by commercial companies.
As resources pour in and money is channeled into developing software to visually document war crimes and other international violations, institutions funding technology solutions for justice and accountability must be cautious about tools that promise both extensive user control and thorough verifiability. They should also be aware of the difficult balance of different human rights at play and how this may affect specific communities, paying particular attention to their vulnerabilities and the technical knowledge they may need to understand the trade-offs of a documentation tool.
If users want anonymity, they will relinquish their opportunity to provide informed consent to some uses. It is difficult to consent meaningfully at the time of recording when the future options for justice and accountability are not yet fully defined. If the premise is that consent cannot be traded away, then the software must collect personally identifying information, which in turn runs up against privacy rights.
The potential of technology for recording war crimes and human rights abuses is unquestionable, but if we want meaningful justice and accountability, we owe it to the communities affected by conflict to think carefully about how documentation tools are developed and deployed.