Fifth Annual Symposium on Pop Culture and International Law: ‘War Never Changes’ – Fallout, Restraint and the Enduring Nature of Warfare

[Professor Luke Moffett is chair of human rights and international humanitarian law at Queen’s University Belfast and a lone wanderer through Fallout 3, New Vegas, 4 and 76]

“War. War never changes. 

The Romans waged war to gather slaves and wealth. 

Spain built an empire from its lust for gold and territory. 

Hitler shaped a battered Germany into an economic superpower. 

But war never changes.”

Ron Perlman – Fallout 1 (1997)

The game franchise Fallout, now an Amazon TV series, is an action role-playing game set in a post-apocalyptic nuclear wasteland that encourages players to explore and survive. The tagline of Fallout – ‘war never changes’ – signifies that even after nuclear apocalypse, the human condition remains predisposed to violence. This post explores these recurrent themes alongside law’s attempt to restrain such violence through international humanitarian law, in the face of the integration of AI into warfare. Drawing on Fallout’s recurring themes of survival, the legitimacy of violence, and dehumanisation, this post considers how law grapples with algorithmically enabled violence, the extent to which it can change our behaviour and war, and whether current frameworks are capable of restraining the risks posed by algorithms in war and in an increasingly fragmented multilateral world.

War and the Human Condition

The notion that ‘war never changes’ exhibits a Clausewitzian view of armed conflict as ‘an act of violence intended to compel our opponent to fulfil our will’. Current international politics is witnessing a schism in the multilateral legal order. States are regularly using violence that violates international law to further their own political ends. Rather than international law serving as the medium to build consensus and change minds through diplomacy, war is again in the ascendancy as a key tool of statecraft. From the wars in Gaza and Ukraine to the US strikes on Venezuelan drug boats, law is constantly being reinterpreted and co-opted to give legal cover to actions which violate the UN Charter, the Geneva Conventions and international human rights law. War is becoming normalised, with law being bent to the new moral and political outlook of a world in which each state is trying to survive in competition with others.

This resonates with sentiments in the Fallout games and TV series, where “everyone wants to save the world, they just disagree on how”, and each belligerent is steeped in moral ambiguity. Here war is not prohibited but is a return to the natural order of human nature, where no Hobbesian leviathan of the state has arisen to monopolise violence and maintain order and governance. In Fallout the player is immersed in a struggle for survival against both rival factions and the deadly fauna of the nuclear wasteland. This gives the player a free hand to navigate the socio-political environment, allying with or fighting against different factions, supporting those who align with their own ethics and worldview, or simply going rogue. Either way, the player has to live with the consequences of their actions, losing support or facing hostility from factions they cross.

Fallout is clearly a work of fiction, but its dystopian world came about through the place of technology in war and through human nature. War and technology have often been sold as means to solve social problems. Today we are witnessing war being sold as a means to solve migration, drug trafficking and organised crime. At the same time there is an increasing algorithmic turn in warfare, commercial involvement in militarisation (‘move fast and break things’), and the resort to force to compel our adversaries to our will. What is deeply disturbing about such developments is how they increasingly detach us psychologically from those who are subjected to violence – whether as datapoints fed into algorithms or by branding whole peoples as terrorists. We have already seen human operators rubber-stamping the outputs of AI decision-support systems in Gaza without exercising sufficient cognition or ensuring legal compliance, which bodes ill for maintaining any human control over more developed lethal autonomous weapons systems (LAWS). Algorithms only stretch the reach and magnitude of human violence, rather than restrain it.

In Fallout, where war is normalised, the killing of those seen as less than human is permitted, whether they be irradiated humans (ghouls), synths (robotic humanoids), super-mutants (enhanced super-soldiers) or raiders (bandits), in a bid for survival and ultimately supremacy over the wasteland. This echoes justifications during the Global War on Terror, which shifted the binary of war from civilian/combatant to innocent civilian/terrorist, justifying algorithmic killing by association and proximity. In Fallout, the dilution of the distinction between those deserving of immunity from attack and those subjected to collective violence – from unleashing nuclear weapons to using robots or synths against enemy factions – demonstrates just how slippery that slope is. Though unlike some states, Fallout also humanises such monsters, with storylines for ghouls (Doc Barrows), synths (Nick Valentine) and super-mutants (Strong).

War and Technology

Historian Ian Morris argues that throughout history war has been a crucible for the development of human civilisation. WWII saw the development of radar, missile technology, trauma care and helicopters. In turn these had civilian applications, from microwaves and jet planes to the commercial production of penicillin. With the integration of machine-learning algorithms and AI, the dual use of such technology is deeply entangled across military and civilian spheres. Despite this, the road to progress on the battlefield comes at a high human cost, enabling ‘new vistas of harm’ for civilians and those hors de combat and contributing to over half of civilian casualties in Western military operations being misidentified. AI blurs the principle of distinction: the diffuse dual-use infrastructure of AI spread across continents can form part of the ‘kill cloud’, risking data centres, compute and data labellers being targeted as potentially taking a direct part in hostilities.

In Fallout the backstory to the 2077 nuclear war was that defence companies, like Vault-Tec and West-Tek, made products that would only be useful in a nuclear holocaust (nuclear bunkers/vaults) and so precipitated the war. Today we are not being sold nuclear vaults, but the omnipotence of AI: the promise of AI providing a military edge over competitors through speed, of autonomous weapon systems improving mass and scale, and of ending the uncertainty, fog and friction of war through full situational awareness. War is increasingly being seen as inevitable as countries increase their defence budgets and invest in AI for use in the military domain. We have had warnings ranging from Russia threatening to expand its war into eastern Europe and provoke NATO, to China’s threats of reunification with Taiwan. There is increasing pushback against the hype of AI, where the explosion in the value of AI and military tech companies is fuelling a business model that will need a ‘major war’ to make a profit. Investment signs point to the AI hype bubble soon bursting, which may deflate continued efforts to seek a return on military investment in AI.

Leaving aside companies selling us the end of the world, tech companies are intimately involved in armed conflict, whether through Palestine being used as a ‘laboratory’ for Israel to develop weapons and technology to export to the world, or Ukraine becoming an ‘AI War Lab’ for Western companies like Anduril and Palantir. While technological terms like AI-enabled, data fusion and future warfighters seduce us into sterilising war as clean and bloodless, where innovation can turn violence into something technical, they obscure the reality that war is hell. It also enables Anduril founder Palmer Luckey to make attention-seeking statements like “I love killer robots”. As Professor Elke Schwarz says, these companies live in ‘a land of make-believe and unicorns’ where such industrialists, promising to defend democracy and ensure world peace through technological deterrence, are set to make massive profits whether or not the technology works or is legally compliant. These risks are not futuristic or confined to some distant warzone. Our personal data is likely being collected and used to train algorithms in peacetime for use in any future conflict: the mobile company Telenor allegedly provided data to the Myanmar military junta to target civilians.

Law in Regulating Technology in War

In Fallout the cause of the nuclear war was overconsumption that led to resource shortages. In the wasteland some factions, such as the Brotherhood of Steel and the Institute, pursue technology as a means of mastery and control. Today mass consumption has exacerbated climate change, but also conflict over resources. The pursuit of AI is promised as a way to solve our human problems, but it will only add to climate and resource pressures through data centres’ hungry consumption of electricity and water. We are seeing increasing calls for regulation of AI in the military domain beyond lethal autonomous weapons systems. These calls echo those made for regulation before the death and destruction of the World Wars. Henry Dunant, the founder of what became the International Committee of the Red Cross, argued in 1862 that,

‘If the new and frightful weapons of destruction which are now at the disposal of the nations seem destined to abridge the duration of future wars, it appears likely, on the other hand, that future battles will only become more and more murderous.’

In 1898 Tsar Nicholas II called on states’ leaders to join him in negotiating what would become the first of the Hague Peace Conventions and Regulations in 1899, to limit war and regulate its conduct. In his words, such efforts were aimed at averting the cataclysm threatened by investment in new weaponry:

‘The maintenance of general peace, and a possible reduction of the excessive armaments which weigh upon all nations … Hundreds of millions are devoted to acquiring terrible engines of destruction, which, though today regarded as the last word of science, are destined tomorrow to lose all value in consequence of some fresh discovery in the same field.

… the continual danger which lies in this massing of war material, are transforming the armed peace of our days into a crushing burden, which the peoples have more and more difficulty in bearing. It appears evident, then, that if this state of things were prolonged, it would inevitably lead to the very cataclysm which it is desired to avert, and the horrors of which make every thinking man shudder in advance.

To put an end to these incessant armaments and to seek the means of warding off the calamities which are threatening the whole world – such is the supreme duty which is today imposed on all States … [to] focus the efforts of all States which are sincerely seeking to make the great idea of universal peace triumph over the elements of trouble and discord.’

These sentiments were echoed over a century and a quarter later by Ukrainian President Volodymyr Zelensky before the UN General Assembly last month,

‘We are now living through the most destructive arms race in human history – because this time, it includes artificial intelligence… We need global rules – now – for how AI can be used in weapons. And this is just as urgent as preventing the spread of nuclear weapons. We need to restore international cooperation – real, working cooperation – for peace and for security. A few years from now might already be too late…’

Similarly, the 1938 Amsterdam draft convention on the protection of civilians was framed as protecting civilians from ‘new engines of war’. Perhaps such law cannot restrain the impulse for total war. The 1923 Hague Rules of Air Warfare were ignored in the 1930s-1940s as ‘merely another example of the delusion that it was possible for people sitting around a conference table to dream up rules that would make war more humane’ (p. 20). Law creates its own illusions of the valour of ideals for a better tomorrow, while blinding itself to some of the realities of the human condition – ‘war’s ruin, and the wreck of chivalry’ (Oscar Wilde, Queen Henrietta Maria). Being indifferent to human suffering in war diminishes our humanity; as one character remarks in Fallout, ‘people die all the time. It’s the law of the Wasteland, so who cares, right?’

Conclusion

Law, war and technology are human endeavours. The wasteland of Fallout paints a dystopian future where law has been abandoned in the exploitation of resources for technological and military advantage. This is not to say that law can halt technological exploitation in war, but it does try to limit its worst excesses. Cyclical calls to better regulate war, and to resist the gravitational pull of rearmament towards war, predate the world wars and are something we are witnessing again. None of us wants to ‘set the world on fire’, but in a time of uncertainty we need to use every legal tool we have to struggle against and restrain our worst human impulses for war.
