Search: battlefield robots

and ran into Buford's cavalry unit, the little town of Gettysburg was not a battlefield. Yet both forces immediately attacked each other and a new battlefield was created. When the Japanese tried to track down individual Coastwatchers who were radioing information about Japanese ship movements in the Solomons, there was no "battlefield". They were targeted individuals attacked by military force, frequently by large units of searching infantry. They were located deep in the jungle specifically to be as far away from the enemy and any battle as possible. Yet they were...

the US could simply drop bombs on the offending robots and, poof. Problem goes away. No special technology response needed. But I have a feeling it’s not that simple. There may be collateral damage issues in simply dropping a bigger bomb or any kind of bomb. But the far more important worry is that autonomously firing weapons robots with impure ethical circuits might very well not be big, vehicle-sized battlefield robots – easily identified and destroyed on the battlefield. Rather, they might be whole swarms...

...justice. Killer robots cannot learn from an unjust crime they have committed. Therefore, by prosecuting an object that does not possess any mental capacity, the objective of justice will not be fully served. Another existing legal gap is that “neither criminal law nor civil law” provides any source for an efficient procedure to address the liability and accountability issues of robots. This goes beyond the partial use of AWS by humans (Sharkey’s spectrum 1, 2, and 3). It extends to accountability issues for the use of fully autonomous killer robots. The legal vacuum...

One of my favorite issues of the New York Times Magazine is its “year in ideas” issue, which comes annually in December. Because OJ is a repository of things related to battlefield robotics and law and ethics, I wanted to flag for your attention the item by Dara Kerr, “Guilty Robots.” [I]magine robots that obey injunctions like Immanuel Kant’s categorical imperative — acting rationally and with a sense of moral duty. This July, the roboticist Ronald Arkin of Georgia Tech finished a three-year project with the U.S. Army designing prototype...

A busy week of grading prevented me from addressing any sooner Ken’s May 6 post on battlefield geography, along with the May 6 news that the US conducted a drone attack in Yemen, but there should be an important takeaway on the boundaries of the battlefield from the bin Laden operation. An often-heard complaint about the US conduct of the “war on terror” is that it treats “the whole world as a battlefield.” Many contend that such a conception of the battlefield, particularly in the context of a...

My previous post mentioned battlefield robot analogs of dogs, cheetahs, pack animals, even humans. Now behold the synchronized nanobot swarm! Here’s what national security analyst John Robb had to say about the tactical benefits of a battlefield drone swarm:
• It cuts the enemy target off from supply and communications.
• It adversely impacts the morale of the target.
• It makes a coordinated defense extremely difficult (resource allocation is intensely difficult).
• It radically increases the potential for surprise.
Things start to get really interesting when the confluence of two technologies causes even...

...suggestion and blog in 2013 about robots, technology, and their interrelationships with international law, international organizations, and globalization more broadly. I think Chris is right to say that we reflexively think about domestic law when it comes to these areas, but there are many issues in international economic law, as well as the economics of globalization more generally. Here’s a question I plan to explore across all the places I blog in 2013, including OJ – I’m thinking about offering a course in robots and the law, a research seminar, in...

...Gal also serves as a coach for the Israeli team participating in the 2024 Jessup International Law Moot Court Competition.] Israel is unique in its willingness to openly discuss its use of AI-based tools on the battlefield. Recently, high-ranking officers in the Israel Defense Forces (IDF) acknowledged the growing use of AI-based tools as part of Israel’s military arsenal. This trend is also evident in the Israel-Gaza war of 2023-2024, in which the IDF has deployed AI-based systems for defensive needs, command and control, collection,...

El roam: Thanks for the post, just two reservations, with your permission: First, you insist on the gap between robots and human beings in exercising complex discretion (tactical and strategic), and indeed so!! Yet, if so, the very basic or fundamental definition of robots, as brought in the post: "Autonomous weapons systems, or ‘killer robots’ as they are referred to by others, are sophisticated weapons systems that, once they have been activated, can select and attack targets without further human...

...One of the most significant, though rarely discussed, aspects of WWII courts-martial was the court-martial and sentencing to death of black GIs in the European theater. I would encourage you to read Alice Kaplan's The Interpreter and learn about Plot E in an American battlefield cemetery outside of Paris, where black soldiers convicted and executed in dubious court-martial proceedings are buried in numbered graves. (Leon Jaworski was a famous lawyer who cut his teeth on those types of prosecutions.) The plot is not mentioned on the battlefield cemetery website...

...timely, then, that I have just posted a short co-written article on SSRN entitled “Beyond the Ban: Comparing the Ability of ‘Killer Robots’ and Human Soldiers to Comply with IHL.” The article is co-authored with Lena Trabucco, a brilliant postdoctoral researcher at the Centre for Military Studies who is an expert in emerging weapons technologies. (She has a PhD in international law from the University of Copenhagen and a PhD in political science from Northwestern.) Here is the abstract of the article, which will appear relatively soon in a Fletcher...

certain firing decisions, rather than an emotionless computer, might be regarded as a possible war crime. It is a profound mistake to prejudge the humanitarian possibilities created in the future by technology. That is about autonomy, but it is still well down the road. The more immediate reason why robotics is important to OJ discussions is that it is not just about sci-fi autonomous robots. It is about other issues that are taking place on the battlefield now, and transforming it in all the ways that Singer discusses so well....