Rise, Robots, Rise!


NPR’s Fresh Air with Terry Gross has a great interview with P.W. Singer of Brookings (and coordinator of the Obama campaign’s Defense Policy Task Force) about his new book on battlefield robots, Wired for War. Ken and others have written extensively about the use of battlefield robots on this blog and elsewhere, so I won’t re-hash the various legal, moral, and strategic issues here. (But do take a moment to look at this creepy video of a four-legged robot.) Instead, I want to highlight a few interesting points from the Singer interview.

What struck me the most was Singer’s answer to a question about whether there is any unifying theme to his three books, one of which examines the use of child soldiers, a second the rise of private military contractors, and the most recent battlefield robots. His response was essentially that he is concerned that our assumptions about “what war is” are no longer accurate. In previous eras, if the U.S. was going to enter into a prolonged military conflict, we had a declaration of war. We don’t do that anymore. Our image of the soldier was a uniformed man (typically) fighting on behalf of our country against other uniformed men (typically) in the armed forces of other states. But, as his PMC book has shown, the Bush Administration has actually cut back on what the government itself handles in combat zones and increasingly outsourced that work to private companies. So rather than the citizens of a nation thinking about the possibility of a draft if we engage in war, we maintain an all-volunteer force and contract out for the rest.

Now we are not even using people, but rather machines (some remotely controlled, others actually autonomous), for a variety of missions. We currently have about 5,000 aerial drones in Iraq (some armed, others not) and something like 12,000 ground robots (we started the war with none) that do things ranging from shooting down incoming missiles and mortar fire (!) to sweeping for IEDs (using a robot based on the Roomba household vacuum, believe it or not). If we are not even sending people to war but machines, how will this affect our decision to go to war?

And, finally, he notes that the people that we are fighting are less and less uniformed soldiers and more and more irregular fighters, including children.

Altogether, he explains that his concern is that the traditional barriers to using force (formal declarations of war, political checks arising from the prospect of sending large numbers of people off to fight, etc.) are being lowered to the point where they are essentially just lying on the ground. He is worried about what this implies for our proclivity to use military force in the future. My summary doesn’t do his argument justice; you really should listen to the interview.

He also notes that, unlike nuclear weapons, the use of robotics does not require a large industrial base. It is essentially “open source” technology built from commodity components. While U.S. soldiers sit in Nevada remotely piloting aerial drones on fire missions, people around the world are logging into jihadi websites that stream webcam feeds from Iraqi streets and allow remote detonation, via the Internet, of IEDs placed near those webcams; one need only wait for a target of opportunity. He also points out that the Israel-Hezbollah conflict in Lebanon was perhaps the first conflict in which both sides (a state and, importantly, a non-state actor) fielded battlefield robots.

However, the most chilling section for me came in the second half of the interview, when he described what some people call “oops moments” and the reaction of some scientists to them. “Oops moments” are when, due to software glitches, heavily armed robots do things like turn their sights on their own soldiers (in one case killing nine South African soldiers) or (in another case, where the robot did not have live ammo) on a group of dignitaries who were watching a demonstration. Singer described the troubling disconnect shown by some of the scientists he interviewed when asked whether, in light of this, there are any legal or ethical issues in using battlefield robots. One scientist said he could not think of any such legal or ethical issues. When pressed on what would happen if a robot consistently killed soldiers on its own side or innocent civilians, he answered: “That’s not an ethical issue; that’s a product recall issue.”

(Sounds like a Ford Motor Company memo concerning the Pinto. I half expected Singer to say that the scientist then said: “Praise your new robotic overlords, you glorified monkeys! Praise them!” But, no.)

A troubling look at what may be the future of warfare. I plan on reading the book.

And my apologies to these guys for the title of this post.

Kenneth Anderson

Yes, it was a very interesting interview, and I should add that I’ll be joining an online discussion at Complex Terrain Lab on the book sometime in the next few weeks, so I’ll hold my comments until then and post a link.

trackback

[…] Opinio Juris has a nice summary of points raised during an NPR interview with Singer, particularly his theme that the assumptions by which we define “war” are less and less valid. This is critical, and it seemed to be a point overlooked in a Washington Post opinion piece that caromed around a few blogs recently.  […]