20 Sep Q & A with Facebook on Myanmar
The International Commission of Jurists (ICJ) has been in discussions with Facebook regarding the human rights impact of Facebook’s operations around the world, with a particular emphasis on situations such as Myanmar, where there are credible reports of international crimes. This Q&A is a result of the ongoing deliberations and is focused on Myanmar and Facebook’s approach to the multiple legal issues that have arisen in that context.
Responses have been provided by Miranda Sissons, the inaugural Director of Human Rights Product Policy at Facebook and a long-time human rights defender.
What steps has Facebook taken to implement the UN Fact-Finding Mission on Myanmar (FFM)’s recommendations of 17 September 2018, including to “…allow for an independent and thorough examination of the use of their platform to spread messages inciting to violence and discrimination in Myanmar”?
Facebook commissioned an independent human rights impact assessment on the role of its services in Myanmar and published the findings, in full, in November 2018. The assessment was conducted and completed by BSR (Business for Social Responsibility), an independent non-profit organization with expertise in human rights practices and policies, in accordance with the UN Guiding Principles on Business and Human Rights and as part of our pledge as a member of the Global Network Initiative. It's the only example of the complete disclosure of a full HRIA in the technology field, possibly in any business field.
The findings and recommendations from the HRIA have had strong, ongoing impacts. One of them was the creation of the human rights director role that I now occupy. Another is the very focused work Facebook is now doing to prevent the misuse of social media in countries with high risk of conflict. We have set up a dedicated, multidisciplinary team focused on understanding the historical, political and technological contexts of countries in conflict, and on inviting local input and expertise into country-specific solutions that span the breadth of Facebook, including policy, product, operations and programs.
We have also continued to conduct human rights due diligence in countries around the world, and have issued Facebook responses to recent HRIAs that detail the resulting mitigations we’ve adopted. You can find more information about those work streams here. I’d like to note the very specific details provided in Facebook’s response to the Sri Lanka HRIA, disclosed in May 2020 and available in full here. While Sri Lanka is a different country context, the details of mitigations are very relevant to understanding the kinds of work we do in countries at risk of conflict.
Which recommendations did Facebook adopt from the November 2018 Human Rights Impact Assessment, including to “[e]xplore the co-creation of a system to preserve removed content for use as evidence later” and “[i]f established, provide evidence to international mechanisms created to investigate violations of international human rights” and how have they been implemented?
Facebook has preserved data related to account takedowns completed between August and October 2018; data related to multiple takedowns for Coordinated Inauthentic Behavior; and data preserved at the request of the UN Fact-Finding Mission on Myanmar (FFM) and the Independent Investigative Mechanism for Myanmar (IIMM).
We’ve begun the process of lawfully providing the data to the IIMM. In doing so, we’re aware of the IIMM’s mandate to “prepare files in order to facilitate and expedite fair and independent criminal proceedings, in accordance with international law standards, in national, regional or international courts or tribunals that have or may in the future have jurisdiction over these crimes, in accordance with international law.” (A/HRC/RES/39/2, para 22).
What criteria is Facebook applying to identify content that may have evidential value in legal proceedings and is it seeking to identify both potentially inculpatory and exculpatory content?
It’s important to know that in our preservations and disclosures, we don’t make evidentiary assessments for any legal proceedings. Rather, we respond to requests for data consistent with our terms and applicable law. Our disclosure to the IIMM will be consistent with this framework.
Once relevant content has been identified, what information is Facebook preserving and how is it being preserved, to ensure it meets the requirements to be produced as evidence in legal proceedings? Given the volume of data at hand, what challenges does Facebook foresee will emerge in the preservation process?
Disclosure is a different process from preservation. All disclosure requests must identify requested records with particularity, including the specific data categories requested, and date limitations for the request. (See more in our answer to question 5, below).
With social media accounts, data volume is always an issue. We regularly find that even well-equipped law enforcement or judicial mechanisms struggle with disclosure volumes, including when dealing with a single account.
With whom will Facebook share preserved content that may have evidential value in legal proceedings and what criteria is being applied to this decision? Does this assessment differ, depending on the institution in question (domestic or international court, or investigative mechanism) that is meant to receive the information?
U.S. law distinguishes three data types: subscriber information, other account records, and content data. These data types have different levels of protection and sensitivity, with content being the most sensitive and the most protected.
As our guidelines explain: a search warrant issued under the procedures described in the Federal Rules of Criminal Procedure or equivalent state warrant procedures upon a showing of probable cause is required to compel the disclosure of the stored contents of any account. In addition, a Mutual Legal Assistance Treaty request or letter rogatory may be required to compel the disclosure of the contents of an account.
Our guidelines ensure we comply with our terms of service and applicable law, including the Stored Communications Act, Europe's GDPR, and Irish law. Guidelines related to civil matters are available on our help center.
It’s going to be increasingly important for accountability advocates to become adept in this complex landscape.
Why has Facebook chosen to oppose the request of The Gambia for information that may be critical to its case before the International Court of Justice?
First, it's important to note that we have begun the process of lawfully providing data to the UN Independent Investigative Mechanism for Myanmar (IIMM). The IIMM is mandated to assist all fair and independent criminal proceedings in accordance with international law standards.
The IIMM's broad mandate is significant because disclosure to the IIMM potentially makes the data available to many accountability efforts, including those at the International Court of Justice.
Second, U.S. social media companies, including Facebook, are generally prohibited by U.S. law under the Stored Communications Act (SCA) from disclosing the content of communications on their platforms to third parties, absent specific statutory exceptions.
The SCA is the foundation of the entire regime that protects user data and user privacy from arbitrary government demands worldwide.
The SCA ensures that we give all of our users the same level of privacy protections that Americans are guaranteed under U.S. law and the Constitution. In that sense, it is a shield for human rights, protecting our users from requests that would violate their fundamental rights, like freedoms of expression and opinion, association, and assembly.
We cannot violate it in response to any government's data requests without significant risk to the privacy of our billions of users.
Permitted exceptions to the SCA include clear consent from the user, emergencies where disclosure could prevent imminent harm, and a request under the newly passed CLOUD Act. The SCA does not include any exception for international justice efforts.
In addition, the long-established framework for foreign governments to obtain content from U.S. companies has been Mutual Legal Assistance Treaties (MLATs), a framework consistent with U.S. law, including the SCA. The Gambia has neither an MLAT nor a CLOUD Act agreement with the United States.
Thus, disclosing to The Gambia would present extremely high legal and precedential risks. We would be piercing our global privacy protection framework for a government that is not party to any of the agreements that are usually a prerequisite for lawful data sharing under U.S. law. Other countries in that position include Russia and Vietnam.
While some have argued this seems arcane and irrelevant, especially given the gravity of the charges in the ICJ case, it's not. These are essential protections we use every hour of every day to defend users against arbitrary or overbroad government requests for user data. We regularly push back against government demands that we consider to be unlawful or in violation of international human rights standards, as shown by our transparency reports. (As members of the Global Network Initiative, we have committed to doing so.)
If U.S. companies lower the SCA shield to respond to governments in violation of our legal obligations in instances like The Gambia’s investigation, it will be impossible to effectively defend our refusal to produce data in response to other governments’ requests, or in response to requirements that would raise more significant human rights concerns.
That said, we underscore that we recognize the extraordinary gravity of the atrocities in Myanmar. Indeed, that is why we are making a voluntary disclosure to the IIMM.
Regarding the Stored Communications Act argument in U.S. federal court: this does not seem to be a bar to sharing information, given that the accounts violated Facebook's own rules and were also publicly disclosed before being taken down by Facebook. Why use this argument to oppose The Gambia's motion for information?
This is a subject of current litigation, so it’s difficult for me to comment. Your audience may find it most useful to read the legal briefs directly.
Why did Facebook indicate it was cooperating with the Independent Investigative Mechanism for Myanmar (IIMM) when, as the IIMM clarified, it was not? And more recently, Facebook shared a first data set that only "partially complies" with the IIMM's requests. Why this reluctance to share information, if Facebook is serious about accountability and justice for the Rohingya, as its spokesperson has indicated?
In the first article, dated August 10, 2020, the author wrote that Facebook "has not released evidence." On August 10, we had not yet produced data, although we were actively preparing to do so. Many readers appear to have interpreted the lack of release as a lack of cooperation.
In the second article, dated August 25, 2020, the reporter stated the head of the IIMM confirmed the body had received a “first data set which partially complies with our previous requests”.
That’s accurate: the first data set was simply an initial disclosure. We’re setting out on a process of rolling disclosure, with multiple steps and discussion along the way.
As your readers know, international justice processes are complex. Social media data is voluminous. By making a voluntary disclosure to the IIMM, Facebook is opening a potential pathway to assist all global accountability efforts that meet international standards, whether under universal jurisdiction, at the ICJ, or by other means.