Symposium on Fairness, Equality, and Diversity in Open Source Investigations: “Fair Game”? Rethinking Ethics and Harm in the Age of Digital Investigations
[Vidhya Ramalingam is the Founder and CEO of Moonshot.
Raquel Vazquez Llorente is the Head of Law and Policy, Technology Threats & Opportunities, at WITNESS. The views and opinions expressed in this article are those of the authors alone.]
This piece is a conversation between two professional communities that use open source information in overlapping contexts, yet are rarely in dialogue. The first contributor is Raquel Vazquez Llorente, a lawyer specializing in the application of technology to the documentation and investigation of international crimes. The second contributor is Vidhya Ramalingam, the founder of Moonshot, a company developing innovative investigative techniques to counter online harms worldwide. In the piece, the authors reflect on why a lack of representation and diversity has led to unethical and discriminatory applications of OSINT methodologies; whether there is a way to navigate the ethical boundaries of conducting digital research on individuals who themselves do not operate within ethical constraints; and whether online investigations force us to rethink what “fair game” means in the digital age.
Raquel: I started incorporating open source investigative techniques early on in my career, when some of the teams I was part of didn’t have access to relevant sites such as protest locations or secret prisons. It was a time when companies had not yet started cracking down on the use of virtual identities, there was not yet much debate about certain practices or uses of this information, and UN-endorsed guidance like the Berkeley Protocol did not exist. I believe we are now in a strong position as a community to discuss whether some of this work may imperil fundamental human rights such as privacy or the presumption of innocence, and how some of these practices by private investigators and civil society may be replicating the very power structures we are trying to hold accountable.
Vidhya: A lot of these debates are rooted in representation in online investigations. A lack of diversity in the field has all too often led to unethical and discriminatory applications of OSINT methodologies. Open source investigations require contextual knowledge, which can only come from a diverse group of investigators; yet they often rely on a range of assumptions about the audiences under inquiry. Too often, sociodemographic data such as ethnicity, religion, or gender are assumed to constitute risk or to be proxies for harm. One need look no further than how OSINT methodologies have historically been applied by governments to minority communities in the name of “public safety” or “national security”, namely in the counter-terrorism field, where OSINT practices were used to surveil and target Muslim communities. For these reasons, Moonshot never uses sociodemographic information to identify audiences at risk of perpetrating harm online; what constitutes risk is online behavior. It is the act of posting, engaging with, sharing, or actively seeking out hate content and violent extremist or terrorist content. Even so, and even with diverse teams, this work must be subjected to internal and external ethical and human rights reviews. We do this at multiple levels: all new OSINT projects go through a series of internal and external reviews so that there is a systematic, traceable approach to ethical review of our methodologies and their unintended impacts. This can also serve as an early warning system for uncovering possible breaches of our ethics guidelines, which can then be the basis for proactive solutions.
Raquel: You hit on an important point when you said that online open source investigations rely on a number of assumptions. In our field, there has been an important push to diversify the pool of online investigators, but we can certainly do more to bring the experience of affected communities to the center of this investigative work and put the emphasis on the “human rights” of human rights investigations. For instance, when civil society organizations try to feed into the news cycle, they risk publishing investigations that are based on circumstantial evidence, do not convey the full complexity of a situation, or misinterpret events, especially if the organization or researcher doesn’t have contextual expertise. Even with the technical knowledge to verify a piece of footage or confirm the use of a weapon system, rushing to conclusions can do more harm than good to the communities we are trying to support, for example by contributing, however unintentionally, to mis- and disinformation.
Relatedly, a matter that preoccupies me is how we draw ethical boundaries when conducting digital research on individuals who themselves do not operate within ethical constraints. This requires a delicate balance between protecting the rights of victims and the rights of suspected perpetrators. Ethics should not be optional, particularly when organizations carry out work similar to investigations conducted by law enforcement. A practice cannot be ethical if it considers the potential harms to only one of the parties affected by the investigation. While investigations by civil society can play an important role in exposing human rights abuses and war crimes, they should also respect the principles of due process and the presumption of innocence. Open source investigations conducted by certain private groups may reflect a particular agenda or bias and fail to take alternative perspectives or explanations into account. We need to look at these investigations with a critical eye. Civil society organizations are not subject to the same legal frameworks as law enforcement agencies, and may not have comprehensive standards in place to conduct their work. This could lead to practices that do not respect the rights of the people being investigated. For instance, publicly releasing the names and personal information of individuals accused of committing crimes or human rights abuses, particularly before any legal proceedings have taken place, clearly interferes with the right to be presumed innocent until proven guilty in a court of law. This presumption applies to everyone, no matter who they are or where they are from.
Vidhya: This is a complex issue. Navigating the ethical boundaries of conducting digital research on communities and individuals who are themselves not restricted by ethics is far from straightforward. Two of the principles that should guide this work are necessity and proportionality. In our case at Moonshot, we assess why the work is necessary to help safeguard the online space, and how our solution is proportional to the public safety threat in question. Is there a risk of threat amplification if the work were carried out? Could we amplify one threat at the expense of another, more pressing one, leading to the stigmatization of particular communities? Are we gathering more data than we need? And what are our obligations around a duty of care for potential perpetrators of violence? We apply human rights assessments to any work with potential perpetrators, assessing whether our methods will ensure respect for their rights to freedom of expression, belief, and privacy. However, I believe there are limits to freedom of expression. Incitement to violence is one of them.
In the earlier years of our work, I was more open to pushing ethical boundaries when we were engaging potential perpetrators. I remember using what we called “trojan” advertising, where we aimed to divert people at risk of extremism by giving them something that looked like it might be extremist content but would actually lead them towards a counter-narrative. Today, I no longer support this approach, and in fact we banned it at Moonshot several years ago. It’s not because it wasn’t effective–it was! Our engagement rates would increase substantially when we took this approach. The reason we no longer do this is that it is ethically fraught. Potential perpetrators of violence are often deeply vulnerable individuals themselves. I’ve learned through my work about the humanity that sits behind a person who may do harm, and the possibilities for change. Consent is critical to standing any chance of changing someone’s path. When we engage with an at-risk person online, we prioritize consent and voluntary engagement with an intervention. We ask ourselves: “Is the service offered clear?” “Are the terms and limitations of the service offered clear to the user?” “Is there a privacy policy visible on the site?” “Is there a plan in place to obtain immediate consent to engage?” It’s a welcome evolution in our ethical guidelines and principles as an organization.
Raquel: This example is a great illustration of how much our understanding of the harms that open source information can cause has evolved, especially when we connect and draw inferences across data points, even when that information is out in the open. I am reading “Freedom of Thought” by Susie Alegre, who makes a strong case for how the infrastructure of our digital information ecosystem can undermine our right to freedom of thought.
I have worked with all parties to proceedings in international crimes cases, in or alongside the prosecution, the defense, and victims’ lawyers. Over the years I have seen how the internet has enabled new actors to take part in accountability processes, although so far this has been disproportionately concentrated among European and US-based organizations, individuals, and networks. This takes us back to our initial point on the urgent need for diversification, not only geographically but across all areas where we may have blind spots, which may carry important consequences for this type of investigation. Civil society as a whole needs stronger practices that take into account the particularities of the online environment; many people, like Gabriela Ivens, Zara Rahman, Hadi Al Khatib, Libby McAvoy, Lindsay Freeman, Sam Dubberley, Alexa Koenig, Jackie Geis, Daragh Murray and Fred Abrahams, have been pushing for years for ethical approaches to open source investigations. I believe the burden of this push should be distributed, though. Funders should be more discerning about who they are supporting and interrogate whether certain practices should have a place in the human rights community; the public should demand media that do not play into narratives that sensationalize this work or take away the story or agency of the communities affected by the crimes; and law enforcement, prosecutors, United Nations mechanisms, and other fact-finding institutions taking in information and reports from civil society organizations should take a closer look at how an online investigation has been carried out.
This conversation took place online in January 2023, and has been edited for clarity.