March 2016

Here’s your weekly selection of international law and international relations headlines from around the world: Africa The European Union plans to cut back its funding for Burundi's lucrative peacekeeping contingent in Somalia to try to force President Pierre Nkurunziza into talks with opponents and away from the brink of ethnic conflict, diplomatic sources said. Burundi's ruling party has accused Rwandan President Paul Kagame...

[Jeroen van den Boogaard is assistant professor of military law at the Netherlands Defence Academy and a lecturer and associate researcher at the Amsterdam Center for International Law.] Despite Chris Borgen's plea that “the immediate legal issues may have to do more with international business transactions than international humanitarian law”, the International Committee of the Red Cross (ICRC) hosted its second expert meeting on autonomous weapons systems last week. The meeting brought together a number of legal and technical experts on the subject as well as governmental representatives (the Report of the first expert meeting in 2014 is here). Autonomous weapons systems, or ‘killer robots’ as they are referred to by others, are sophisticated weapons systems that, once activated, can select and attack targets without further human intervention. The ICRC's definition of autonomous weapons systems (AWS) focuses on systems with a high degree of autonomy in their ‘critical functions’, namely selecting and attacking targets autonomously. The ICRC has in the past called on States to ensure that AWS are not employed if compliance with international humanitarian law (IHL) cannot be guaranteed. The Campaign to Stop Killer Robots has called for a pre-emptive and comprehensive ban on AWS and for a prohibition on taking the human ‘out of the loop’ with respect to targeting and attack decisions on the battlefield.

It is important to realise that professional militaries around the globe already possess and use scores of weapon systems with varying levels of autonomy. The use of artificial intelligence in future AWS may, however, enable them to learn from earlier operations, which enhances their effectiveness. It is feared that this will lead to scenarios in which AWS go astray and decide in an unpredictable way which targets to attack. Concerns about the use of AWS rest on a number of grounds, for example the moral question of whether life-or-death decisions may be left to machines. Another concern is that the protection of civilians during armed conflict would be adversely affected by the use of AWS. In legal terms, this means it is unclear whether AWS can be used in compliance with IHL, particularly the principles of distinction, proportionality and precautionary measures.

The main focus of the ICRC expert meeting was to establish what may be understood by retaining ‘adequate, meaningful, or appropriate human control over the use of force’ by AWS. This is important because, although there is by definition always a human actor who deploys the AWS, the question is what the consequences are if the AWS independently makes the decisions that IHL requires. For example, it is unclear whether an AWS would be able to comply with the obligation to verify that its target is a legitimate military objective. In technical terms, it may be expected that complex algorithms will enable AWS to reliably identify the military advantage of attacking a certain target. Recent history has revealed the exponential speed of developments in computers, data storage, and communications systems. There is no reason to assume that this would be any different for the development of self-adapting AWS whose algorithms rely on artificial intelligence to independently assess what the destruction of a certain military objective would contribute to the military advantage of an operation. This assessment is necessary to attack an object in compliance with IHL.
Especially in environments without any civilian presence, such as underwater on the high seas, IHL seems to pose no obstacle to deploying AWS. The picture changes as soon as

I've been slowly working on a post that points out Ted "Carpet Bombing" Cruz is no less scary than Donald "Torture Everyone" Trump when it comes to foreign policy. (Schadenfreude isn't a strong enough word for how much I am enjoying the implosion of the Republican party under the combined weight of their insanity.) To tide you over, I will simply offer this doozy of...

[Patrick Wall is studying for an LL.M. in International Law at the Graduate Institute of International and Development Studies, Geneva, as the Sir Ninian Stephen Menzies Scholar in International Law.] Last Monday, the US House of Representatives overwhelmingly passed—by 392 votes to 3—a resolution ‘[e]xpressing the sense of the Congress condemning the gross violations of international law amounting to war crimes...

Thanks to Steve Vladeck for the thoughtful post over at Just Security about his take on Garland’s record on Guantanamo cases and related matters. Steve, like Charlie Savage in the Times, is in one sense far more critical of Garland than I. I say “in one sense” because, before jumping back into the details here, it seems apparent we’re all applying somewhat different metrics in assessing that record, some I fear more problematic than others.

Here’s your weekly selection of international law and international relations headlines from around the world: Africa A former Congolese vice-president becomes the most senior political leader ever to face judgment before the International Criminal Court on Monday, when judges rule on whether he committed war crimes in the Central African Republic (CAR). The judgment will be handed down at 14:00 (CET) and...

Sponsored Announcements Admissions to the Seminar “Public Health and Human Rights – Current Challenges and Possible Solutions” (19 May 2016), organised by the European Inter-University Centre for Human Rights and Democratisation (EIUC), are open until 25 April 2016; early bird registration with a 10% discount closes 30 March 2016. The issue of global health governance, which deals with the question of how to efficiently regulate a panoply...

AJIL Unbound has just published a fantastic symposium entitled "TWAIL Perspectives on ICL, IHL, and Intervention." The symposium includes an introduction by James Gathii (Loyola-Chicago) and essays by Asad Kiyani (Western), Parvathi Menon (Max Planck), Ntina Tzouvala (Durham), and Corri Zoli (Syracuse). All of the essays are excellent and worth a read, but I want to call special attention to Ntina's essay, which is...

On the hopeful assumption the Senate will come to its senses and consider President Obama’s nomination of Merrick Garland to the U.S. Supreme Court on its merits, I wanted to respond to what appears to be some skepticism among progressives that Garland is indeed a good choice for the Court. The Huffington Post, for instance, published an article following the nomination headlined (ominously) that Garland once sided with the Bush Administration on Guantanamo. I was curious, so I decided to look up the cases.

Over the years a few of us have written about issues concerning battlefield robots. (See, for example: 1, 2, 3, 4, 5.)  Sometimes, we had links to remarkable videos of quadruped robots stomping through forests. Those robots and videos were made by Boston Dynamics, a company that started from an MIT research group. Besides designing quadruped robots, Boston Dynamics gained further...

[Craig H. Allen is the Judson Falknor Professor of Law at the University of Washington, where he directs the university’s Arctic Law and Policy Institute.] In a March 10, 2016, op-ed in the Wall Street Journal, Canadian professor Michael Byers (along with U.S. co-author Scott Borgerson), reprises an earlier suggestion aimed at bringing legitimacy to Canada’s claim of sovereignty over the...