‘The Role of Social Media is Significant’: Facebook and the Fact Finding Mission on Myanmar

[Emma Irving is an Assistant Professor of Public International Law at Leiden University.]

The OHCHR Fact Finding Mission on Myanmar drew the world’s attention last week by issuing a report finding that genocide, war crimes, and crimes against humanity against the Rohingya and other ethnic groups took place in Myanmar last year. The Mission went further, drawing up a non-exhaustive list of perpetrators and pointing the finger directly at six members of the Myanmar military whom the Mission found to have exercised effective control over the perpetrators. In addition to these features of the report, there is another development that is worthy of discussion: the attention paid to the role of Facebook in the violence in Myanmar.

Facebook and the Fact Finding Mission Report

On a general note, the Fact Finding Mission describes ‘the role of social media’ as ‘significant’ in the spreading of hate speech in Myanmar (§74). The report describes in detail the ratcheting up of ethnic tensions and systematic discrimination and abuse over decades; a key part of which was the spreading of hate speech by non-state actors and the military (and the civilian government’s failure to address this). In recent years, Facebook’s prominence in the country – the report notes that Facebook is the internet in Myanmar – means that it has become a ‘useful instrument for those seeking to spread hate’ (§74). The preponderance of hate speech is one of the factors that is said to have created a climate in which the catastrophe inflicted on the Rohingya and other ethnic groups was entirely predictable.

Beyond commenting on Facebook’s general role in facilitating the spreading of hate speech, the report uses specific Facebook posts to support its findings. In one instance, the report cites a Facebook post from the Myanmar State Counsellor Office as its source for the Government’s position on when the operations in Rakhine State ended (§49). In other instances, the use of Facebook posts is more interesting. One particular post by the military’s Commander-in-Chief attracted attention: ‘The Bengali problem was a long-standing one which has become an unfinished job despite the efforts of previous governments to solve it. The government in office is taking great care in solving the problem.’ (§35) This statement is used in the report to support two findings. First, it supports the Mission’s finding that the ‘nature, scale and organisation of the operations suggests a level of preplanning and design on the part of the’ military leadership (§35). This is an important consideration for establishing criminal responsibility. Second, it supports the Mission’s finding of the existence of genocidal intent (§86). Without genocidal intent, the acts perpetrated against the Rohingya cannot be labelled as genocide, making this element of particular importance for the claim that genocide took place. While the Facebook statement is not the only piece of information on which the two findings are based, it is the only one that is expressly set out in the report.

The Fact Finding Mission’s inclusion of Facebook posts as an important source of information fits within a broader trend among international accountability mechanisms. In August 2017, the ICC issued its first arrest warrant to be based largely on seven videos obtained from social media (commentary on Al-Werfalli arrest warrant). In the second arrest warrant for Al-Werfalli, issued in July 2018, the Pre-Trial Chamber reaffirmed its position that the videos were sufficient to establish ‘reasonable grounds to believe’. The IIIM, for its part, has formed partnerships with organisations that specialise in the collection of open source digital information (including social media) in order to pursue its mandate of accountability for Syria (here). The report of the OPCW Fact Finding Mission in Syria also lists open source information among the types of information it relied upon, and the annex to the report lists Twitter, Facebook, and YouTube posts. This trend is a consequence of technological developments and the digitisation of conflict facilitating remote investigations, and it is new ground into which international accountability mechanisms are carefully but surely treading.

Facebook’s Response

The same day that the Fact Finding Mission report was released, Facebook announced a number of measures that the platform was taking in response to the report. Admitting that it was ‘too slow to act’, Facebook claimed that it is now making progress and has developed better technology to identify hate speech. As part of its response in relation to Myanmar, Facebook has removed a total of 18 Facebook accounts, one Instagram account, and 52 Facebook pages. Among the removed accounts is that of the Commander-in-Chief, Senior-General Min Aung Hlaing, the same individual whose posts were used in the Fact Finding Mission report to support the finding of genocidal intent. In its response, Facebook does not mention whether it will address the Mission’s ‘regret’ at its unwillingness to provide data about the spread of hate speech in Myanmar, suggesting that Facebook’s uncooperative tendencies may continue in this regard.

There is one element of Facebook’s response which is particularly noteworthy and welcome. In relation to the accounts and pages that have been removed, Facebook has stated that it has preserved the data and content associated with them. This is significant, as concerns have emerged over the last year regarding the take-down policies of prominent social media platforms (here and here). In the summer of last year, many organisations that use social media to document conflicts, including those in Syria, Yemen, and Myanmar, noticed that either their social media accounts had been shut down or large amounts of their content had been removed. These take-downs were part of a new strategy to remove extremist and terrorist content from social media platforms, but the algorithms behind the process lacked nuance, and insufficient account was taken of the fact that certain content related to documentation and accountability efforts. While some content has been restored, much remains unavailable, and it is not clear whether the removed content has been preserved or destroyed. If the latter, this could have a detrimental impact on accountability efforts. In light of this, Facebook’s assurance that the content from the removed pages and accounts in Myanmar is being preserved is important.

Facebook’s response of taking accounts and pages offline, while an improvement on the inaction of the past, is unlikely to be sufficient to rein in hate speech. Removing accounts and improving systems for reporting hate speech are not enough when the very algorithms that power Facebook are promoting hateful content. Reports show that Facebook posts drawing on negative, primal emotions such as anger, fear, and tribalism perform better on the platform and are made more visible. A study looking at anti-refugee violence in Germany made similar findings, positing that Facebook made ordinary individuals more prone to xenophobic violence. This problem is not unique to Facebook, with similar concerns being raised about YouTube.

Conclusion

The Fact Finding Mission on Myanmar did not shy away from using an important source of information about the violence in Rakhine State: Facebook posts. It relied on Facebook posts (among other sources) to support significant findings on the planning of the violence and on genocidal intent. In doing so the Mission remained cautious, indicating that only corroborated and verified information had been relied upon, and thereby pre-empting the criticisms that justly accompany the use of open source information. Beyond Facebook, the Mission also relied on another form of digital technology: satellite imagery. Before and after images of burned and razed villages were used to establish a pattern of widespread and systematic violence. Technologically-derived information therefore has a clear place in the work of the Fact Finding Mission, and in this sense the Mission joins the other international accountability mechanisms that are incorporating technology into their work.

One of the main take-aways from the Fact Finding Mission report, apart from the important findings on events and crimes committed in Rakhine State, is the signal that international accountability mechanisms do not have a blind spot when it comes to social media. Where possible and appropriate social media will be used to support findings of fact and intention; and where social media platforms are part of the problem, these mechanisms can generate international pressure to do better.
