Warning! Obstacles Ahead! The Regulation of Autonomous Weapons Systems in the GGE LAWS
[Jeroen van den Boogaard is a legal counsel for the Dutch Ministry of Foreign Affairs and a lecturer in international humanitarian law at the University of Amsterdam. He writes this post in his personal capacity.]
In the coming weeks, Opinio Juris will host a symposium on “Responsible Military AI and the Law of Armed Conflict.” The purpose of the symposium is to examine the challenges in regulating autonomous weapons systems. There is no forum where these challenges become more apparent than during the meetings of the GGE LAWS. The symposium takes place as the GGE LAWS reconvenes in Geneva (4-8 March) for the first meeting of 2024; the second meeting is planned to take place in August. Since becoming a member of the Netherlands’ delegation four years ago, I have had the privilege of witnessing the discussions in the GGE LAWS first-hand. This post examines some of the challenges in regulating autonomous weapons systems in the GGE LAWS in more detail.
In this post, I first reflect on the state of the debate in the GGE LAWS. Second, I attempt to offer some ideas on how the GGE LAWS may proceed with the challenge of giving further detail to the so-called two-tier approach. I end this post with some reflections on how the role of human judgement and control may be addressed in the discussions in the GGE LAWS.
The state of the debate in the GGE LAWS
The extensive discussions on the regulation of autonomous weapons systems in the GGE LAWS over the past ten years have shown active and constructive participation by many of the States Parties to the Convention on Certain Conventional Weapons (CCW). This is a very positive development and gives hope that States may be willing and able to agree on how to address the challenges in the development and use of autonomous weapons systems.
On the other hand, some States and civil society organizations are far from satisfied with the pace of the regulation of autonomous weapons systems. For example, in 2022, the Campaign to Stop Killer Robots voiced a very different perspective on the state of progress of the GGE LAWS:
“After 9 years of international discussions, the CCW has failed to deliver any form of international limits to the use of autonomy in weapons systems. It has become clear that certain states are deliberately preventing advancements through abuse of procedural rules, using the consensus principle as a veto to progress. The resistance from these states is now deeply entrenched, and discussions at the CCW have become mired in procedural deadlock, preventing certain meetings from taking place at all.”
Indeed, reading the 2023 Report of the GGE LAWS, one may argue that regulation is still in its infancy. After ten years of discussing the challenges of autonomy in weapons systems, there is still no agreed definition of autonomous weapons systems, no emerging consensus on what types of weapons systems should be prohibited, and no agreement on what an instrument regulating them should look like. However, the discussion in the GGE LAWS is still quite lively, at times very constructive, and benefits from the active input of a wide variety of States, not least the P5.
It should be noted that the GGE LAWS does not operate in a vacuum. There are many other places where the regulation of autonomous weapons systems is being discussed, including the First Committee of the United Nations General Assembly. In 2023, the UNGA adopted Resolution 78/241, which requests the UN Secretary-General to submit a report to the UNGA reflecting States’ views on the challenges related to autonomous weapons systems. In regional meetings, sometimes (co)organized by civil society organizations, these views are discussed and developed. Furthermore, there are numerous fora where the broader regulation of Artificial Intelligence is discussed, such as the REAIM Conferences: the first of these took place in the Netherlands in February 2023, and the next is planned for September 2024 in Seoul, Republic of Korea. Civil society and academia also play a very important role in fueling the GGE LAWS discussions with their input, including the ICRC, UNIDIR, the Future of Life Institute, SIPRI, the T.M.C. Asser Institute, and the organizations that form part of the Campaign to Stop Killer Robots, among many others.
The GGE LAWS in 2024
The current mandate of the GGE LAWS is to
“further consider and formulate, by consensus, a set of elements of an instrument, without prejudging its nature, and other possible measures to address emerging technologies in the area of lethal autonomous weapon systems, taking into account the example of existing Protocols within the Convention, proposals presented by High Contracting Parties and other options related to the normative and operational framework on emerging technologies in the area of lethal autonomous weapon systems, building upon the recommendations and conclusions of the Group, and bringing in expertise on legal, military, and technological aspects.”
The current mandate does not spell out which elements should be formulated, or whether the GGE LAWS is expected to produce a legally binding instrument, such as a new Protocol to the CCW. In fact, a group of States already submitted a draft Protocol VI to the GGE LAWS in 2022. And while many States, including the Netherlands, support the drafting of a legally binding instrument, other States hold the view that it is too early to negotiate the text of a Protocol. One could argue that, as a result, the GGE LAWS’ progress is hampered because some States are in no hurry to regulate autonomous weapons systems. At the same time, some States seem to advocate a blanket prohibition of autonomous weapons systems as soon as possible, one that other States may never be willing to accept.
Implementing the two-tier approach
One concept that has gained some traction in recent years is the so-called two-tier (or two-track) approach to structuring the debate in the GGE LAWS. This approach was first introduced by France and Germany in the GGE LAWS in 2021 and was resubmitted in 2022 by a larger group of States. The aim of the two-tier approach is to structure the debate into, on one level, discussions about a prohibition on certain types of autonomous weapons systems and, on another, discussions about regulations for those types of autonomous weapons systems that would not be covered by the prohibition. It remains to be seen whether this approach will be successful, given that it may be difficult to find consensus on the definition of the types of autonomous weapons systems covered by the prohibition. Luckily, some guidance can be drawn from the 2019 Guiding Principles of the GGE, which state, in part, that “International humanitarian law continues to apply fully to all weapons systems, including the potential development and use of lethal autonomous weapons systems.” Nonetheless, it may be challenging to formulate a prohibition that can stand the test of time in an area where technology is developing so quickly. In this context, the GGE LAWS may end up prohibiting expressis verbis only what is already prohibited by international humanitarian law. The first-tier prohibition could then be limited to one sentence: Autonomous weapons systems that cannot be used in accordance with international humanitarian law are prohibited. Although such a short prohibition follows the example of other very short Protocols to the CCW, it does not address all the issues. The real challenge will then be how to regulate all other autonomous weapons systems.
What to expect in 2024 and beyond
There are many obstacles ahead for the Chair of the GGE LAWS, Ambassador Robert in den Bosch of the Netherlands, and his team. Ahead of the first meeting, the Chair circulated several guiding questions that will be discussed in Geneva this week.
Although much discussion about a definition or the characteristics of autonomous weapons systems has already taken place, one set of guiding questions that the members of the GGE LAWS were asked to consider in preparation for the meeting asks them to “provide a concrete explanation or characterization of what is considered an ‘emerging technology in the area of LAWS’” and, based on that explanation or characterization, to consider “what functions of LAWS would be ‘autonomous’? How could ‘autonomy’ be described or explained?”
It is crucial to advance on this issue because, over the past year, in my view, there were instances where States that considered themselves at odds might have been able to find common ground on what should be prohibited and regulated if they had had a common understanding of the exact characteristics of the systems under discussion. But because of the lack of a common language and an agreed, clear definition of the types of systems under discussion, consensus was never reached. It has to be said that much work has already been done on this issue in recent years (see, for example, here and here). It remains to be seen whether States will finally be able to overcome this definitional obstacle in the 2024 GGE LAWS.
Human judgement and control
Another obstacle to progress has been the delineation of the level of human involvement in the development and use of autonomous weapons systems, a question also on the agenda this week. The guiding questions proposed by the Chair are: “which elements/characteristics would make a LAWS incompatible with international humanitarian law (IHL)? Does compliance with IHL depend on the use of LAWS in a specific context and if so, in which manner?” These questions are framed in reference to the issue of human control, judgement and involvement. It seems that, in formulating an answer to these questions, the members of the GGE LAWS are in effect being asked to set out their views on what a prohibition or regulation of certain types of autonomous weapons systems could look like.
In this context, much effort has been devoted since 2013 to how ‘meaningful human control’ should be defined, although consensus on the use of this phrase was never found. In my view, the term ‘meaningful human control’ was used by States in this context primarily as a lightning conductor, because they were unsuccessful in agreeing on a definition of autonomous weapons systems. Instead, States attempted to find agreement on a definition of the degree of human control required for the lawful use of autonomous weapons systems; for now, however, such agreement is still pending.
In my view, the key problematic issue for weapons systems with autonomous features is that there are some decisions in the use of certain weapons systems, in certain contexts, that should not be delegated to algorithms. This may be because, in that context, human judgement cannot be excluded from the use of that weapon under the legal regime applicable during armed conflict. A concrete example is the obligation to take precautionary measures and to apply the IHL proportionality rule in a complex and dynamic environment where civilians are present. But it may also be that sufficient control over the weapons system cannot be guaranteed because the course of action taken by a self-learning AI algorithm cannot be predicted. Therefore, human judgement and control must be preserved for those systems that are otherwise capable of being used in accordance with the applicable legal regime. At the same time, there are also many contexts in which the use of certain types of weapons systems with autonomous features is unproblematic. For example, using autonomous anti-missile weapons systems in unpopulated areas seems unproblematic in most cases, not least because the elements of human judgement and control have been addressed in phases long before the actual launch of such a system.
Conclusion
As States’ experts and many experts from civil society reconvene for the 2024 GGE LAWS in Geneva this week, there is a lot to discuss. It is to be hoped that the political sensitivities related to the armed conflicts raging in the world today will not form yet another obstacle to a productive week of discussions in the GGE LAWS. Because there is no Report due in 2024, the participants of the GGE LAWS have a unique opportunity to focus on making substantive progress in overcoming the challenges of regulating autonomous weapons systems.