Another Warbot Metaphor: Nanobot Swarms and Regulatory Challenges

by Chris Borgen

My previous post mentioned battlefield robot analogs of dogs, cheetahs, pack animals, even humans. Now behold the synchronized nanobot swarm.

Here’s what national security analyst John Robb had to say about the tactical benefits of a battlefield drone swarm:

• It cuts the enemy target off from supply and communications.
• It adversely impacts the morale of the target.
• It makes a coordinated defense extremely difficult (resource allocation is intensely difficult).
• It radically increases the potential of surprise.

Things start to get really interesting when the confluence of two technologies causes even more radical changes. Take, for example, how fabrication technology and micro-drone technology may one day allow new drones to essentially be printed out by fabbing machines. We are not there yet, but perhaps someday.

The underlying issue is that technology is changing so fast that it *may* be thwarting legal regulation from adequately responding to the implications of technological change. I italicized “may” because I am not certain that this is the case.

Law (and perhaps especially the common law) is propelled by metaphors. Its timely adaptation to a new technology partially relies on whether an apt metaphor can first orient the regulatory perspective, providing a basic frame for the problem, so that a combination of legislation and judicial interpretation can then fill in more precise details.

For example, there were the arguments in the 1990s (and still today…) over whether the internet is more like a broadcast medium, a mail service, or a phone service. In part, the regulation of activities on the internet has been based on applying various metaphors to different fact patterns, trying to apply old rules and, with some new legislation and interpretation, make them do new tricks. Perhaps this is all that is needed, and technology has not left law in the dust.

If that is the case, while battlefield robots may present some new risks, do they actually overturn IHL as we know it? (Similarly, do some of the other topics mentioned in the links, such as the implications of DNA hacking, raze pre-existing rules?) Are these areas where whole new bodies of substantive rules are needed, or are these examples of areas where regulatory enforcement just got a lot harder?

At least regarding IHL, is technological change affecting primarily the substance of law, the enforceability of law, or both equally? I look forward to any comments from others in the Opinio Juris community…

http://opiniojuris.org/2012/02/21/another-warbot-metaphor-nanobot-swarms-and-regulatory-challenges/

2 Responses

  1. I am afraid I don’t have any comments on the IHL aspects of this, but I do have some thoughts on the science and technology aspects. One of my concerns (if I were a planner for the US military) would be proliferation. Ten years ago, sequencing the human genome took years, cost tens of millions of dollars, and large parts of it were done by hand. Today, you can buy rack-mounted machines that will do it in a couple of hours for a thousand dollars. Ten years from now, doctors will most likely have them in their offices; sequencing will take 15 minutes and probably cost less than $100. Similar things have happened with lots of other technologies (e.g., personal computing).

    The battlefield drone swarm may sound like a great idea when only the US has it. But within a couple of years after the US first fields one, most industrialized nations will be able to field such a thing. A couple of years after that, virtually any state could field one. Some time after that, the technology will be available to non-state actors with an axe to grind. That sounds like it has some scary IHL implications to me. On the other hand, it is not clear to me that non-proliferation as a strategy could work (or at least it has not been that successful for nuclear, chemical, or biological weapons).

    One other thought. It seems to me that one of the advantages of our (US) military is its superior training vis-à-vis most of those it is in conflict with. But if weaponized drones really do become prevalent in conflicts, and such drones can be mass-produced and are all virtually identical, it seems we will have lost one of our comparative advantages. The end result of such a conflict might come down to manufacturing capacity (i.e., which state can produce drones faster).

    Ok, enough speculative fiction for now.

  2. Chris,

    Very interesting posts of late!  My initial take is that the basic principles of international humanitarian law (IHL) will remain and are timeless.  Think of all of the advancements in weaponry that they have already survived!  The devil is in the details of their application, as you indicate, and that has always been and will remain a problem.  In cases where states have been somewhat uncertain, they have clarified what the law may already have required.  The clearest example of this, to me anyway, is the blinding weapons protocol (Protocol IV to the 1980 Convention on Certain Conventional Weapons).

    For me, the real issue all of this raises is one of accountability: establishing that an IHL violation occurred and establishing responsibility for it, both national and individual. That is a problem now, and one that will only get worse to the extent we automate aspects of the targeting and attacking process. What the international criminal tribunal experiences (and the recent U.S. experience, and my own) appear to be showing us is that, except in cases of clear and substantial violations (and sometimes not even then), establishing whether a state or a given individual has violated IHL is no easy task. Evidentiary issues, establishing intent, accounting for the “fog of war,” etc. have been and will remain very real practical enforcement problems. Automating the battlefield makes it more complex (as if it weren’t complex enough already) and will therefore exacerbate these practical difficulties.
