17 Dec — Digital Accountability Symposium: Weapons of Mass Media – Facebook News and a Call to Accountability
[Shannon Raj Singh is an attorney specialized in international criminal law and human rights; she is currently a Visiting Fellow of Practice at Oxford’s Institute for Ethics, Law & Armed Conflict and an Associate Legal Officer at the Special Tribunal for Lebanon in The Hague. The views expressed herein are those of the author and do not necessarily reflect the views of the STL.]
For years now, human rights activists have been sounding the alarm about the role of social media in fueling mass atrocities, both by driving extremism on their platforms and by spreading disinformation like wildfire. These calls reached a fever pitch with the 2018 release of the Report of the Independent International Fact-Finding Mission on Myanmar, which found that social media played a “significant role” in driving the extraordinary violence against the Rohingya, and that Facebook in particular was a “useful instrument for those seeking to spread hate”. Following its release, many analogized social media to traditional news companies in calling for accountability, recalling the prosecution of RTLM executives at the International Criminal Tribunal for Rwanda and of the publisher of Der Stürmer at the International Military Tribunal at Nuremberg.
Facebook, for its part, resisted these comparisons. “I consider us to be a technology company”, proclaimed Mark Zuckerberg in his testimony before Congress – and for a long time, the distinction made sense. After all, Facebook’s mission and core features bore little relation to those of traditional news companies, which disseminate information in the public interest and profess a commitment to the truth.
That distinction has now eroded. This fall, Facebook began rolling out a feature called Facebook News: a dedicated news section aimed, for now, at a subset of US-based users. At its core, Facebook News contains a feature called Today’s Stories, managed by a “curation team” which selects stories for users based on “publicly available guidelines”. In the run-up to the launch of Facebook News, the platform offered media companies, including The Washington Post, Dow Jones, and Bloomberg, millions of dollars a year to license their content. Not to worry: Facebook has made clear that its news content will only come from publishers that abide by the platform’s Publisher Guidelines, which include a range of “integrity signals” for determining eligibility, such as misinformation (as identified by third-party fact-checkers) and community standards violations (e.g. hate speech). Facebook has also committed to checking the integrity status of Pages on a continual basis to ensure that eligibility criteria are consistently met.
If you’re wondering how stringent these integrity standards are, consider this: as of October 2019, Facebook’s approved publishers included Breitbart, the toxic alt-right platform with close affiliations to the Trump administration. Breitbart’s storied reputation for promoting extremism, divisiveness, and violence within American politics bodes ill for the possibility that Facebook News will have an ameliorating effect on the ways in which the platform has been used to foment violence in years past.
Whatever one may have previously believed about Facebook’s responsibility for the content disseminated on its platform, that responsibility has fundamentally changed with the launch of its news feature. While social media has been weaponized before, Facebook News carries the potential to be a weapon with particularly acute capabilities and range.
Let’s take a hypothetical example. At the time of writing, an article is available on Breitbart, entitled “Does Islam Convert Social Losers into Time Bombs?” The article examines the inherent “danger” posed by Muslims living in the West, and posits that “introducing Islam into free societies based on meritocracy and freedom is tantamount to throwing a lit match into a barrel of gasoline”. While Facebook has long been used to share such content, its news feature will enable the “curation team” to, if it so wishes, select this article and highlight it as one of Today’s Stories. The impact of such a decision would be nothing short of momentous: posted on Breitbart alone, the story could reach about 5 million readers – but if highlighted on Facebook News, the story could reach the platform’s 183 million users in the US alone, blowing its earlier reach out of the water.
No one suggests that a single Breitbart article is likely to incite atrocities against minority groups. But when such articles are given extraordinary amplification over time and specifically targeted at users receptive to their message, they could play a pivotal role in fueling violence. Indeed, this is precisely what the FFM found in Myanmar, where social media’s role in the atrocities was deemed “significant”. In that case, Facebook merely enabled content to be disseminated on its platform and employed algorithms that had the effect of fostering extremism. Today, its conduct goes a step further – not only is Facebook amplifying dangerous content posted by others, but its “curation team” is making fundamental decisions about what news has enough “integrity” to warrant being hand-selected and highlighted for hundreds of millions of people. While limited to US-based users today, it is entirely conceivable that the feature will ultimately be rolled out to Facebook’s 2.4 billion-user global audience. The international law community would do well to begin considering how Facebook’s legal exposure has changed with the adoption of this feature, and how the contours of international criminal law (ICL) might be used to shape its incentives going forward.
In a prior publication, I argued that the (limited) ICL jurisprudence on media accountability is ill-equipped to cover Facebook’s conduct, partly because of (now eroded) factual distinctions between social media and traditional news media, and partly because of the intent requirement: the RTLM and Der Stürmer executives themselves sought to further the commission of mass atrocities, whereas no such intent can be attributed to Facebook executives. For this reason, I suggested that a better analogy might be to arms suppliers, using principles of aiding and abetting liability under customary international law. That analogy is even stronger today. Despite its awareness of the disinformation circulating on its platform, Facebook has decided to assume direct responsibility for the integrity and dissemination of news there. Where it excels, the company should receive all due credit – but when it fails, accountability must ensue.
Freedom of expression, online and in print, is at the core of what keeps a society free. But freedom to express oneself is not the same as being given the ability to reach an audience of millions, to have your views highlighted and amplified, and to have your words target precisely those within a society most vulnerable to their message. For anyone already concerned about the weaponization of social media to further atrocity crimes around the world, the capabilities of Facebook News are unprecedented. Consider this a call to action to evaluate and regulate its use.