Taming the Lions: The Role of Industry in the Debate on Autonomous Weapon Systems (AWS)

[Dr. Elisabeth Hoffberger-Pippan is a research associate at the Peace Research Institute Frankfurt (PRIF) with a research focus on biological and chemical weapons within the CBWNet Project as well as AI-enabled technology in the military domain. Prior to joining PRIF, she was a researcher and head of project of “iPRAW”, the International Panel on the Regulation of Autonomous Weapons, a project funded by the German Foreign Office.

Vanessa Vohs, LL.M., is a research associate and doctoral candidate at the University of the Bundeswehr Munich (Chair Prof. Dr. Carlo Masala), working on legal and ethical elements for Trustworthy AI in the EU-funded project AI4DEF. Previously, Vanessa worked as a research assistant for iPRAW, the International Panel on the Regulation of Autonomous Weapons.]

Artificial intelligence (AI) researcher Haydn Belfield from the University of Cambridge warns that companies in the AI market “aren’t building with the level of robustness and security one needs for defence systems in safety-critical/adversarial settings”.

The robustness of AI-enabled weapons systems – understood as a system’s capability to sustain or fend off adversarial attacks – is only one of various challenges private companies working in the defense industry have to overcome. At the same time, industry has gained prominence in the manufacture of weapons systems over the last years and plays a pivotal role in ensuring that force is used in compliance with military, legal, and ethical considerations. The United Nations Group of Governmental Experts (UN GGE) in the area of (lethal) autonomous weapons systems (AWS) has hitherto failed to adequately address the role of companies in the context of AWS.
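
To make the notion of robustness more concrete, the following minimal Python sketch probes a toy classifier’s stability under small, bounded input perturbations – one crude way of asking whether a system would “sustain or fend off” manipulated inputs. The model, weights, and threshold are purely hypothetical illustrations and are not drawn from any actual review standard.

```python
# Minimal sketch of a robustness probe: does the prediction stay stable
# when the input is perturbed within a small bound? All values hypothetical.
import numpy as np

def predict(x: np.ndarray, w: np.ndarray) -> int:
    """Toy linear classifier standing in for an arbitrary AI model."""
    return int(x @ w > 0)

def perturbation_stability(x, w, epsilon=0.05, trials=1000, seed=0):
    """Fraction of bounded random perturbations that leave the prediction unchanged."""
    rng = np.random.default_rng(seed)
    baseline = predict(x, w)
    noise = rng.uniform(-epsilon, epsilon, size=(trials, x.size))
    return sum(predict(x + n, w) == baseline for n in noise) / trials

w = np.array([0.8, -0.3, 0.5])   # hypothetical model weights
x = np.array([1.0, 2.0, -0.5])   # hypothetical sensor input
print(f"stability under perturbation: {perturbation_stability(x, w):.2%}")
```

A real robustness assessment would of course involve adversarially chosen (not random) perturbations and a far richer model, but the underlying question – how much manipulation a system tolerates before its output flips – is the same one Belfield raises.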

The next GGE will take place from May 15 to 19, 2023. With a view to this important and yet challenging event, this blog post examines the role of industry in the debate on AWS in the UN GGE and provides concrete recommendations on how best to address industry stakeholders.

There seems to be a lack of clarity as to how private companies could be addressed and whether the existing rules of international humanitarian law (IHL) are sufficient for this purpose or whether new rules or adjustments to existing law are needed.

The concrete role of private industry, and how it is addressed, depends heavily on the regulatory outcome of the UN GGE. In our view, three scenarios are possible: 1) States Parties agree on legally non-binding standards; 2) States Parties agree on a legally binding document; and 3) States Parties cannot find consensus, but private industry decides to regulate itself by establishing voluntary commitments.

1. A Non-binding Document on AWS and the Role of Weapons Reviews 

Australia, Canada, Japan, the Republic of Korea, the U.K., and the U.S. clearly favor the adoption of non-binding standards in which weapons reviews take center stage (for the last common working paper submitted at the UN GGE in March 2023 see here; see also the UN GGE Report of 2019). Even though the U.S. has not ratified Additional Protocol I to the Geneva Conventions, it has established one of the most rigorous and detailed weapons review processes, which has already served as an example of best practice in the UN GGE. Even though weapons reviews seem to be an ideal mechanism to address private companies, three main challenges remain which merit further consideration.

a. Different Standards for Different Types of AWS – The Scope of Application of Article 36 AP I GC 

The first challenge relates to the question of when a review process should commence and what role private companies should play in it. According to Article 36 AP I GC, States Parties are obliged to undertake weapons reviews “[i]n the study, development, acquisition or adoption of a new weapon, means or method of warfare”. In practice, private companies play a central role in both the study and development of AWS. Thus, industry would be directly affected by a weapons review (for more information see here). Companies would, for example, be confronted with additional requirements on how to manufacture an AWS and would be compelled to implement these principles in the design of a particular weapons system. However, a distinction is often made between so-called off-the-shelf products – understood as products manufactured solely by the private sector without any involvement by the government – and products developed with government funding. Some authors argue that off-the-shelf products would likely not be part of a weapons review at the stage of study and development. But what happens if an off-the-shelf AWS is exported to another country? And what if that country does not undertake an adequate weapons review at the stage of acquisition? In our view, States should consider their obligations under Article 36 AP I GC in conjunction with their obligation under Article 1 AP I GC to ensure respect for IHL, together with their national export control laws. By the same token, Damian Copeland, Rain Liivoja and Lauren Sanders argue that

“[t]he increased prominence of private industry in weapon manufacture may give States cause to consider the temporal application of the weapons review obligation in relation to their national weapons review directives and export control obligations” (p. 23).

b. What is the Object of Review? 

The second challenge relates to the question of what should be the object of a weapons review. In practice, different types of companies are involved in the process of developing AWS, ranging from the company creating the relevant algorithm to the company building the software into the hardware up to the point where an AWS is completed. As adumbrated above, a weapons review extends to the stage of study and development of an AWS. Solely focusing on the final manufactured product would therefore be short-sighted. Rather, States Parties to AP I GC are legally obliged to undertake weapons reviews at the stage of study and development, which clearly affects not only those companies which complete weapons by putting software and hardware together, but also the companies responsible for the individual algorithm destined to become an integral part of an AWS.

Article 36 AP I GC obligates States Parties to undertake reviews with regard to weapons, means and methods of warfare without, however, providing a definition of these terms (for a detailed overview of the various national interpretations of Article 36 AP I GC see here; for the US position see here). The Tallinn Manual 2.0 might provide useful guidance in this regard. According to Rule No. 103, “[c]yber means of warfare” includes both cyber weapons and related systems, including “(…) software used, designed, or intended to be used to conduct a cyber-attack”. In light of this, software components of an AWS, such as algorithms, do not qualify as weapons per se, as they still need to be built into the hardware of the weapons system, but they certainly do qualify as means of warfare. It is important to note, however, that only software intended to become an integral part of a weapons system must be subjected to a weapons review within the meaning of Article 36 AP I GC, whereas algorithms which will be integrated into weapons systems for law enforcement are not covered (for more information see here).

Companies providing algorithms certainly have the best knowledge of the software, know whether and to what extent the technology could be susceptible to errors, and are best placed to assess an AI system’s robustness. Thus, the object of a weapons review – especially at the stage of study and development – clearly encompasses not only hardware components but also software, which has direct implications for companies providing such technology.

c. Follow-up Reports

The third challenge with regard to weapons reviews is the question of how long review processes should continue and what role private industry should play in them. Even though the wording of Article 36 AP I GC lends support to the assumption that States Parties are merely required to undertake weapons reviews in the pre-deployment phase, several States, including the Netherlands and the U.S., have already engaged in some form of post-deployment review (for more information see here, page 128). AI-enabled AWS based on machine learning may change over time and adapt their behavior. Mandatory “periodic post-deployment reviews” or so-called “runtime verifications” would help ensure that the continuous evolution of self-learning algorithms is constantly monitored, thereby assuring that AWS are deployed in line with military, legal, and ethical considerations. Such follow-up reports would not necessarily have to take the form of an official legal review but could be limited to exercising some form of due diligence. For example, instead of reviewing the entire weapons system, particular emphasis could be placed on the algorithm itself, how it has evolved, and whether it can still meet operational and humanitarian demands. Most importantly, private companies could play a pivotal role at this stage of the process. As most weapons systems are developed by the private sector, industry representatives could and should be consulted when analyzing AWS in the post-deployment phase. They could provide information on the data that have been used and on the robustness of the algorithm. They could share insights into their work and explain how they created the respective algorithm (provided that no intellectual property rights are violated). Only if an algorithm is substantially modified does a new review process have to be initiated.
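
To illustrate what such a periodic post-deployment check could look like in practice, the following minimal Python sketch compares metrics gathered from a deployed self-learning system against the baseline recorded at the time of the original review and flags when drift amounts to a “substantial modification” triggering a fresh review. The metric names, baseline values, and drift threshold are hypothetical assumptions for illustration only; they are not taken from any national review procedure.

```python
# Hypothetical baseline recorded at the time of the original weapons review,
# and a drift threshold beyond which a change counts as "substantial".
REVIEWED_BASELINE = {"target_precision": 0.97, "false_positive_rate": 0.01}
DRIFT_THRESHOLD = 0.05  # 5% relative change (illustrative assumption)

def needs_new_review(current_metrics: dict[str, float]) -> bool:
    """Return True if the self-learning system has drifted far enough
    from its reviewed baseline to warrant a new review process."""
    for name, baseline in REVIEWED_BASELINE.items():
        drift = abs(current_metrics[name] - baseline) / baseline
        if drift > DRIFT_THRESHOLD:
            return True
    return False

# Example: metrics gathered from the deployed system during an audit window.
audit = {"target_precision": 0.91, "false_positive_rate": 0.012}
if needs_new_review(audit):
    print("Substantial modification detected: initiate a new review.")
else:
    print("Within reviewed envelope: log result and continue monitoring.")
```

In this sketch the precision metric has drifted by roughly six percent, so the check would flag the system for a new review, while smaller fluctuations would merely be logged as part of ongoing due diligence.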

The U.S. has already acknowledged the necessity for post-deployment review mechanisms with regard to AI-enabled technology. In its new Political Declaration on Responsible Military Use of Artificial Intelligence and Autonomy presented in February 2023, the U.S. emphasizes that “[s]elf-learning or continuously updating military AI capabilities should also be subject to a monitoring process to ensure that critical safety features have not been degraded”. 

2. A CCW Protocol or an Independent Treaty 

The second approach in the UN GGE is to favor a legally binding treaty on AWS. Most States from the Global South, most prominently Argentina, Costa Rica, Palestine, and Mexico (for more information see here), but also several European countries (see for example here; for a comprehensive overview of country positions see here), support this position. It is rather improbable that all States Parties to the Convention on Certain Conventional Weapons (CCW) would agree on a Protocol on AWS. It is far more likely that an independent process would be initiated, even though such a process would certainly not be inclusive. A legally binding document would probably obligate States to enact new laws, or adjust existing ones, prohibiting the development, deployment, acquisition, and proliferation of AWS which cannot be used with a sufficient level of human control or judgment. In order to ensure adherence to such legal obligations, national laws would have to provide for penal sanctions and/or options for civil litigation in case companies do not comply with their obligations. But a legally binding treaty would certainly have its drawbacks when it comes to addressing private companies. First, given that major military powers would not be on board, a regulatory gap might ensue, creating different legal standards for companies around the world depending on where they are located and potentially producing disruptive economic effects. Second, the terminology employed in a treaty would need to be sufficiently precise to ensure compliance, and a watering-down of terminology must be avoided at all costs. Yet reaching consensus seems to be a challenging undertaking even among States principally in favor of a legally binding document on AWS. On the other hand, a legally binding treaty could send a strong signal to the international community that at least some States are willing to further regulate the development, use, and transfer of AWS.

3. “Ethical Defense Engineering”: Self-regulation as a Way Forward?

A third option would be for companies themselves to establish voluntary standards when producing AWS and AI-related technology respectively. The Future Combat Air System (FCAS) forum is a good example in this regard. FCAS – a German-Franco-Spanish defense project – is a network of various already-existing weapons systems and new platforms, including the Next Generation Weapon System with substantial degrees of autonomy. The forum serves as a platform where industry, scientific experts, and civil society representatives meet to discuss how FCAS can be used responsibly and in line with legal and ethical standards.

The NATO Principles on the Responsible Use of AI, adopted in 2021, could serve as a valuable source of inspiration for industry self-regulation. The principles encompass lawfulness, responsibility and accountability, explainability and traceability, reliability, governability, and bias mitigation. Obviously, the principles would have to be adjusted and slightly re-interpreted in order to meet the needs of the private sector. Moreover, the NATO Principles refer to AI rather than AWS, which would also justify further adjustments. Even though the NATO Principles do not explicitly encompass transparency standards, such standards do play an important role when it comes to managing data in the private sector and should thus be added to the list. Companies developing algorithms or entire weapons systems often retain substantial control over the data that have been used, transferred, and processed (for more information on AWS and data issues in general see here). In case States acquire such weapons systems, they might not have adequate access to the underlying data and might thus be incapable of sufficiently understanding the technology employed. Companies regulating themselves could decide to share certain amounts of data so that customers understand the technology they use.
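
As a purely illustrative example of what such voluntary transparency could look like, the following Python sketch defines a machine-readable provenance record that a company might hand over alongside a delivered system. All field names and values are hypothetical assumptions; they do not reflect any existing industry, national, or NATO schema.

```python
# Hypothetical provenance record a company could voluntarily share with a
# customer State so that it understands the technology it is acquiring.
from dataclasses import dataclass, asdict
import json

@dataclass
class ModelProvenanceRecord:
    model_name: str
    version: str
    training_data_sources: list[str]
    known_limitations: list[str]
    robustness_tests_passed: list[str]
    ip_notice: str = "details withheld where protected by intellectual property"

record = ModelProvenanceRecord(
    model_name="target-recognition-module",  # hypothetical component
    version="2.3.1",
    training_data_sources=["synthetic imagery set A", "licensed sensor logs B"],
    known_limitations=["degraded accuracy in low-visibility conditions"],
    robustness_tests_passed=["bounded-perturbation stability", "bias audit"],
)
print(json.dumps(asdict(record), indent=2))
```

Sharing even such a lightweight record would address the access problem described above: the acquiring State learns what data the system was trained on and what its known limitations are, without the company disclosing protected source code.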

In light of these challenges, we suggest the following five points. First, States Parties at the UN GGE should clarify the relation between their obligations under IHL, especially Article 36 in conjunction with Article 1 AP I GC, and national export control laws, with a view to ensuring that AWS are not proliferated in an uncontrolled manner. In doing so, companies manufacturing off-the-shelf AWS would also be subject to a certain degree of governmental oversight. Second, States Parties at the UN GGE should pay particular attention to companies providing software components. Those States which do not already consider software to be an integral part of their weapons review obligations should adjust their national review processes. Third, States Parties at the UN GGE should agree on mandatory post-deployment reviews for AI-enabled AWS able to adapt their behavior to their surrounding environment. In the course of such reviews, all States Parties should consult the companies that provided the algorithms and other relevant technical data. Fourth, States favoring a legally binding document should consider the effects of a non-inclusive approach, such as market distortions for private companies. The fact that major military powers will not be on board should certainly not prevent supporters of a legally binding document from pursuing their goals, but continued co-operation with States in favor of non-binding rules remains paramount. Fifth, private industry stakeholders should consider self-regulation in order to foster trust among the international community, especially the wider public, and to make sure that AWS are used in compliance with military, legal, and ethical considerations. Only a comprehensive and inclusive effort will allow States Parties to adequately address private industry, whose role in weapons manufacture is ever increasing, in the debate on AWS.
