International Law and Popular Culture Symposium: ‘Westworld’ and Human Rights

[Sarah Zarmsky is a PhD Candidate with the Human Rights Centre at the University of Essex with a focus on human rights, international law, and new technologies. She received her LLM in Public International Law from Leiden University and her BA in Psychology from Brandeis University. Sarah has completed internships with the International Criminal Court, the International Bar Association, and the International Court of Justice. She has also served as a volunteer digital verification researcher for the Cameroon Database of Atrocities and Bellingcat.]

Warning: This piece contains spoilers for the HBO series Westworld

‘The right information at the right time is deadlier than any weapon.’ – Martin Connells, Westworld Season 3, Episode 5

Introduction

At the end of season 2 of HBO’s sci-fi drama Westworld, it became clear that the true purpose of Westworld, an adult ‘theme park’ in which guests could interact with extremely realistic AI robots, was to collect data on those who visited the park. The use of this data was revealed in season 3, when the show was set, for the first time, outside the bounds of Westworld and in real, futuristic society. Even at a surface level, it is clear that the park is a metaphor for today’s social media companies, which have admitted to harnessing user data for a variety of purposes. This piece highlights the international legal and human rights issues raised by Westworld’s data collection, including concerns of consent, arbitrariness, and discrimination with regard to privacy, as well as the right to freedom of expression and personal identity.

Setting the Scene: the Westworld Universe

In the most recent third season of Westworld, the protagonists have escaped from the park and entered the real world of Los Angeles in the year 2053. Westworld’s vision of the future is similar in many ways to the usual depictions found in science fiction: very modern buildings, self-driving cars, and robots working at reception desks. However, its unique (and unsettling) features quickly become apparent. One of the most notable is the ‘Rico’ app, which one of the main characters, Caleb, uses to earn money committing crimes, in much the same way people use Uber today. When the camera zooms in on Caleb’s phone, his mobile carrier is displayed as Incite, a large data collection company in the Westworld universe. It has been suggested that one of the ways Incite collects data is by acting as a mobile carrier, which could be a nod to the fact that present-day carriers, such as AT&T, T-Mobile, and Sprint, have been caught selling customer location data.

Where Westworld clearly becomes an analogy to today’s social media companies is when we are introduced to Engerraund Serac, the richest man in the world, owner of Incite Inc., and co-creator of Rehoboam, the world’s most advanced artificial intelligence. Speaking in a way that is unnervingly similar to how Mark Zuckerberg has gushed about Facebook, Serac claims he created Rehoboam, a predictive software built on mass amounts of personal data, only for good: to ‘prevent humanity from destroying itself’. Yet the dark undertones of Rehoboam become clear as the series progresses. For instance, the protagonists learn that the AI deliberately restricts opportunities for people after predicting their futures, controlling every aspect of their daily lives. Rehoboam identifies individuals it deems ‘high risk’ based on their data, who may then even be incarcerated. When given access to the Rehoboam database, Caleb learns that he has been shut out of jobs and prevented from succeeding in life because the software predicted he would commit suicide within a decade.

Season 3 ends after the show’s AI robot protagonist, Dolores, hacks into the Rehoboam system and shares the software’s individual predictions with everyone on the planet, sending the world into chaos. Rehoboam is finally destroyed, and it is presumed that in the next season, people will at last be able to choose their own paths.

In short, the Westworld universe is not an entirely far-fetched prediction of what the future could look like if tech companies continue to harvest user data and end up using it to shape society in malicious ways. However, a key difference between our world and the one created by HBO is the existence of the law, specifically international human rights frameworks. Throughout the series, there is no mention of how this extreme data collection violates human rights, leading to the assumption that these laws may not exist (or are simply ignored) in the Westworld universe. The following section aims to highlight some of the human rights frameworks from our world that are implicated by the corporate data collection and the deployment of predictive and restrictive AI software in Westworld.

Human Rights Issues

The Right to Privacy – Arbitrary and Discriminatory Practices

The right to privacy is enshrined in Article 12 of the Universal Declaration of Human Rights (UDHR) and Article 17 of the International Covenant on Civil and Political Rights (ICCPR). The use of ‘big data’ and AI routinely puts the right to privacy at risk, and Westworld represents the extreme scenario. From watching the show, it seems that people are extremely reliant on technology, including their mobile phones and smart watches, which continuously collect their personal data. Aside from the Westworld park, which collected data on its guests, everyday mobile and social media usage is presumably one of the main ways Rehoboam gathers information for its predictive and oppressive functions.

The first step in this data collection is consent, which in the real world most users gloss over or do not fully understand, meaning that consent is often ill-informed. This is because it is extremely difficult to anticipate how data might be shared with third parties, and the intimidatingly long and complex ‘fine print’ is not something most social media users can realistically read and comprehend. In Westworld, the concept of consent for data collection seems to have been eradicated entirely, as guests are never shown discussing it upon entering the park. Moreover, after all of Rehoboam’s data is sent out to the world, people’s violent reactions indicate that they had no idea their data was being used.

Under the UDHR and the ICCPR, interferences with privacy must not be arbitrary, and to prevent the arbitrary use of personal information, its processing must be ‘based on the free, specific, informed and unambiguous consent of the individuals concerned, or another legitimate basis laid down in law’. The practices in Westworld clearly violate these frameworks: data is collected without consent, and the collection does not appear to pursue any ‘legitimate aim’ as required by international human rights texts. While Serac might argue that the ‘legitimate aim’ of the data collection and the creation of Rehoboam was to ‘save humanity’, it is unlikely that this argument would hold up before a human rights court, as the purpose is extremely vague.

Further, the International Principles on the Application of Human Rights Law to Communications Surveillance expressly prohibit discrimination ‘based on national or social origin, birth, or other status’, a prohibition that is also standard across human rights texts. The practice in Westworld, where individuals’ opportunities are restricted by predictive AI software whose decisions turn on factors such as social class, is blatantly arbitrary and discriminatory. This is especially the case when algorithms are in use, as their decisions are often stereotypical, drawing on group-level characteristics and relying on correlation instead of causation. Algorithmic models therefore fail to account for individual agency and the relevance of individual choice. The interference with the right to privacy in Westworld is thus unlawful: it pursues no legitimate aim and is both arbitrary and discriminatory.

That being said, in today’s world, the data company Incite and the owners of the Westworld park would have obligations under the UN Guiding Principles on Business and Human Rights (UNGPs). Principle 17 of the UNGPs requires businesses to exercise human rights due diligence in order to ‘identify, prevent, mitigate and account for how they address their adverse human rights impacts’. However, in Westworld, the corporate executives are never seen discussing business and human rights or considering how their excessive infringements on the right to privacy might be mitigated.

Freedom of Expression, Agency, and Personal Identity

The violations of the right to privacy discussed in the previous section also lead to interference with the rights to freedom of expression and the free development of one’s personal identity. Freedom of expression is enshrined in Article 19 of both the UDHR and the ICCPR. Further, Article 22 of the UDHR provides the right to realization of the ‘economic, social and cultural rights indispensable for [one’s] dignity’ and the ‘free development of [one’s] personality’.

Both of these fundamental rights are infringed by the oppressive nature of Rehoboam’s predictive software, which limits everything a person can do in life based on how likely the AI deems them to succeed. If someone’s score is not high enough, a score essentially built on a stereotype of who they might be, they cannot apply for certain jobs or go to certain places, and may even be incarcerated. This directly violates the right to express oneself freely and the agency to seek out opportunities to develop one’s individual personality.

On another note, in the Westworld universe it does not seem possible to be anonymous, as everyone is registered within the Rehoboam system, which presumably monitors all devices. It has been demonstrated that the inability to be anonymous and excessive surveillance can have a chilling effect on speech even in today’s society, as people (especially in areas of political unrest or conflict) may feel unsafe voicing their opinions for fear of retaliation. It is therefore clear that the inability to remain anonymous and the level of surveillance in Westworld inhibit individuals’ freedom of expression, amounting to a violation of human rights.

Conclusion

To conclude, the practices of data collection and the development of the AI software Rehoboam in Westworld raise serious concerns for the right to privacy, freedom of expression, and the ability to realize one’s personality. Admittedly, this piece may read as a pessimistic take on how today’s social media companies could ultimately lead our world down a path like the one seen in Westworld. However, the law may serve as a glimmer of hope, precisely because it seemingly ceases to exist in the Westworld universe yet remains very much alive in ours. It is also important to note that international human rights law, while valuable, is not and cannot be the sole solution to avoiding the catastrophes of Westworld; national jurisdictions and independent oversight mechanisms also play essential roles in the fight to protect privacy in a digital age. Nonetheless, existing human rights frameworks can still do what they were meant to do and protect us from this type of data mining, but only if states and companies are willing to respect them. Time will tell.
