AlphaDogs, Cheetah-bots, and Mecha-Avatars

Three quick updates from the “robots and warfare” side of things (largely culled from recent Danger Room posts that caught my eye and that I wanted to flag for Opinio Juris readers).

I have previously posted about BigDog, the four-legged beast of burden being developed for use by the U.S. military. DARPA (the Defense Advanced Research Projects Agency), working with Boston Dynamics, is now developing AlphaDog, a larger, more advanced version of the robot. See this new video of AlphaDog walking around, carrying stuff. It is getting closer to actual use in the field…

While the BigDog and AlphaDog videos are interesting for the weird, surreal sensation of watching something that sounds like a lawnmower but walks like a young horse, as a matter of international humanitarian law they are probably not a big story in and of themselves, in the same way that supply trucks are not a “big story” with regard to IHL. However, they do point to advances that may lead to new weapons systems at some point. Quadruped robotic hunters, perhaps running in packs, perhaps autonomous, have been hypothesized. I had previously referred to such robots using William Gibson’s term “slamhounds.” According to Danger Room, DARPA is now working to develop quadruped hunter robots, but is going for a different animal metaphor, calling the project “Cheetah.” Boston Dynamics, the developer of BigDog/AlphaDog, is running the Cheetah project. Adam Rawnsley of Danger Room writes:

As the name implies, Cheetah is designed to be a four-legged robot with a flexible spine and articulated head (and potentially a tail) that runs faster than the fastest human. In addition to raw speed, Cheetah’s makers promise that it will have the agility to make tight turns so that it can “zigzag to chase and evade” and be able to stop on a dime.

This does have IHL implications. It’s not clear whether the cheetah-bot will be remotely controlled, in which case the legal issues will be akin to those raised by UAVs, or whether it will detect, hunt, and possibly attack autonomously. The latter possibility raises the knottier question of how you code IHL parameters into software and what types of liability ensue when something goes wrong.
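To make that knottiness concrete, here is a purely illustrative sketch of what a naive attempt to encode IHL targeting rules might look like. Everything in it is my own invention for illustration (not anything drawn from the Cheetah program or any real system), and the point it makes is that the predicates the law requires, such as distinction and proportionality, are precisely the judgments that resist reduction to computable tests:

    # Hypothetical sketch: IHL targeting constraints as software checks.
    # Every field and threshold below is an invented placeholder -- the
    # legal difficulty is that these judgments are contextual and
    # contested, not mechanically computable.

    from dataclasses import dataclass

    @dataclass
    class Contact:
        is_directly_participating: bool  # distinction: lawful target or civilian?
        expected_civilian_harm: float    # proportionality inputs
        expected_military_advantage: float

    def may_engage(c: Contact) -> bool:
        # Distinction (Additional Protocol I, art. 48): civilians may
        # never be made the object of attack.
        if not c.is_directly_participating:
            return False
        # Proportionality (art. 51(5)(b)): incidental civilian harm must
        # not be excessive relative to the anticipated military advantage.
        return c.expected_civilian_harm <= c.expected_military_advantage

Even granting the caricature, the liability question remains: if the values fed into those fields are wrong, is the fault the programmer’s, the commander’s, or the machine’s?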

But here’s the pièce de résistance: DARPA has

allotted $7 million for a project titled “Avatar.” The project’s ultimate goal, not surprisingly, sounds a lot like the plot of the same-named (but much more expensive) flick.

According [to] the agency, “the Avatar program will develop interfaces and algorithms to enable a soldier to effectively partner with a semi-autonomous bi-pedal machine and allow it to act as the soldier’s surrogate.”

These robots should be smart and agile enough to do the dirty work of war, Darpa notes. That includes the “room clearing, sentry control [and] combat casualty recovery.” And all at the bidding of their human partner.

[Emphasis added.]

You can place a Terminator or Avatar joke here (two James Cameron movies, huh?), but I think this is a better film metaphor (and it’s by “District 9” director Neill Blomkamp).

So, while these mecha-avatars will not be autonomous, they will be remotely controlled, armed, bipedal drones. As a legal matter, one could say that this is similar to our current use of UAVs. But I don’t think so. These things would be interacting with humans in close-up situations (“room clearing”), but the operators, perhaps half a world away, would be physically and possibly emotionally removed from the situation. Cues might be missed. How do instincts translate via a video link? The possibility of increased loss of life is very real. This is one development I will want to track.

Topics: Featured, Foreign Relations Law, General, National Security Law
Mihai Martoiu Ticu

Not only is the possibility of increased loss of life very real; the possibility of an increased number of wars is very real as well. Since no Americans will die in wars anymore, there will be less public opposition to wars altogether.

Kenneth Anderson

Really interesting – thanks for flagging this!  Luckily for everyone, Matt Waxman and I have a short legal policy piece in the works on the regulation of autonomous lethal weapons systems as they are gradually developed. Up on SSRN soon.  But this was also the subject of a panel at Santa Clara’s terrific conference a couple of weeks ago, in which Ashley Deeks and I commented on Oren Gross’s draft paper – he is also working on the legal questions here.  Lots of interest, and with good reason.