Lethal Battlefield Robots: Sci-Fi or the Future of War?

Warbots don’t exist yet, and the Campaign to Stop Killer Robots hopes to keep it that way.

We're not quite there yet. 20th Century Fox/Entertainment Pictures/ZUMAPRESS.com



“We are not talking about things that will look like an army of Terminators,” Steve Goose, a spokesman for the Campaign to Stop Killer Robots, tells me. “Stealth bombers and armored vehicles—not Terminators.” Goose, the director of Human Rights Watch’s arms division, has been working with activists and other experts to demand an international ban on robotic military weapons capable of eliminating targets without human intervention, i.e., killer robots.

The bluntly titled campaign, which sounds like something out of a Michael Bay flick or Austin Powers, involves nine organizations, including the International Committee for Robot Arms Control. The campaign is spearheading a preemptive push against efforts to develop and potentially deploy fully autonomous killer robots—a form of high-tech weaponry that doesn’t actually exist yet.

“I’m not against autonomous robots—my vacuum is an autonomous robot,” says Noel Sharkey, a professor of artificial intelligence and robotics at the University of Sheffield and chair of the International Committee for Robot Arms Control (and a fixture on British television). “We are simply calling for a prohibition on the kill function on such robots. A robot doesn’t have moral agency, and can’t be held accountable for crimes. There’s no way to punish a robot.”

The real-life equivalent of Isaac Asimov’s Three Laws of Robotics (which posit that robots may not harm humans, even if instructed to do so) is, like killer-robot technology itself, a ways off. In April, the United Nations released a report (PDF) that recommended suspending the development of autonomous weapons until their function and application are discussed more thoroughly. Last December, the Department of Defense issued a directive on weapon systems autonomy, calling for the establishment of “guidelines designed to minimize the probability and consequences of failures in autonomous and semi-autonomous weapon systems that could lead to unintended engagements.”

Though the Pentagon document stresses the need for human supervision of military robots, critics claim it leaves the door open for the development of autonomous lethal robots that aren’t subject to meaningful human oversight. “We already don’t understand Microsoft Windows; we’re certainly not going to understand something as complex as a humanlike intelligence,” says Mark Gubrud, a research associate working on robotic and space weapons arms control at Princeton. “Why would we create something like that and then arm it?” Killer-robot foes also note that, according to the Pentagon directive, it only takes signatures from two department undersecretaries and the chair of the Joint Chiefs of Staff to green-light the development and use of lethal autonomous technology that targets humans.


Militaries and contractors are already working on combat systems that surpass our current fleet of killer drones by requiring less human control. The US Navy commissioned Northrop Grumman’s X-47B (as yet unarmed) to demonstrate the takeoff and landing capabilities of autonomous unmanned aircraft. Researchers at Carnegie Mellon University have developed a trucklike combat vehicle called the “Crusher,” designed for fire support and medevac, for the Defense Advanced Research Projects Agency. (“This vehicle can go into places where, if you were following in a Humvee, you’d come out with spinal injuries,” said the director of DARPA’s Tactical Technology Office.) The $220 million Taranis warplane, developed by BAE Systems for the United Kingdom, could one day conduct fully autonomous intercontinental missions. And China has been developing its Invisible Sword unmanned stealth aircraft for years.

Yet the technology required to make an advanced fighting robot is still far from complete. “Our vision and sensing systems on robots are not that good,” Sharkey says. “They might be able to tell the difference between a human and a car, but they can be fooled by a statue or a dog dancing on its hind legs, even.” Experts also say that the technology is nowhere near being able to make crucial distinctions between combatants and noncombatants—in other words, whom it’s okay to kill.

This technological uncertainty has caused some experts to think a preemptive injunction on warbot development is misguided. “We are making legal arguments based entirely on speculation,” says Michael Schmitt, chairman of the international law department at the US Naval War College. (Schmitt recently planned a workshop on the legal issues surrounding killer robots, but sequestration has delayed it.) “Do I have my concerns? Of course. But these systems have not been fielded on the battlefield, nor are they in active development in the US.”

Schmitt argues that existing international law would keep the use of robots from spiraling into a sci-fi nightmare. “If such a system cannot discriminate between civilians and enemy combatants in an environment, then it is therefore unlawful,” he explains. “No one is talking about a George Jetson-type scenario. What we are talking about is going to a field commander and saying, ‘Here’s another system, like a drone, or a frigate, or an F-17.’ If I were a commander, I would know what laws there are, and in what situation I can use it.”

Another front in the debate is whether killer robots would reduce or increase civilian casualties. The Department of Defense has been funding the research of Georgia Tech roboticist Ronald Arkin, who seeks to design a software system, or “ethical governor,” that will ensure robots adhere to international rules of war. He’s argued that machines will be more effective fighters than humans. “My friends who served in Vietnam told me that they fired—when they were in a free-fire zone—at anything that moved,” Arkin recently told the New York Times. “I think we can design intelligent, lethal, autonomous systems that can potentially do better than that.”
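In concept, an ethical governor is a software layer that checks a proposed lethal action against encoded constraints before a weapon is allowed to fire. The toy sketch below illustrates that gating idea only; it is not Arkin's actual design, and every name in it (the `EthicalGovernor` class, its fields and threshold) is invented for illustration.

```python
# Purely illustrative sketch of a "governor" that vetoes a proposed
# engagement unless encoded constraints are met. Not a real system.
from dataclasses import dataclass

@dataclass
class Target:
    classification: str        # e.g. "combatant", "civilian", "unknown"
    confidence: float          # perception-system confidence, 0.0 to 1.0
    near_protected_site: bool  # e.g. near a hospital or school

class EthicalGovernor:
    """Permits an engagement only if every constraint passes."""
    MIN_CONFIDENCE = 0.99  # arbitrary illustrative threshold

    def permits_engagement(self, target: Target) -> bool:
        # Distinction: act only on a positive combatant identification.
        if target.classification != "combatant":
            return False
        # Uncertainty: refuse to act on low-confidence identification.
        if target.confidence < self.MIN_CONFIDENCE:
            return False
        # Precaution: never engage near a protected site.
        if target.near_protected_site:
            return False
        return True

governor = EthicalGovernor()
print(governor.permits_engagement(Target("combatant", 0.995, False)))  # True
print(governor.permits_engagement(Target("unknown", 0.995, False)))    # False
```

The catch, as Sharkey's objection above suggests, is the inputs: a governor like this is only as reliable as the classification and confidence values the robot's perception system feeds it, and that is precisely the part critics say current technology cannot deliver.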


Creating an artificial intelligence that could act upon just-war principles or the idea that civilian casualties should be minimized would involve elaborate programming. “That’s kind of what we’re worried about,” says George Lucas, Jr., a professor of ethics and public policy at the Naval Postgraduate School who has worked with Arkin. “Those extraordinarily complex algorithmic systems may operate fine 99 percent of the time, but every once in a while they go nuts.” If armed robots are eventually deployed, Lucas says, they should be limited to simple and very tightly scripted scenarios, like protecting a no-go zone around a vessel at sea. In a counterinsurgency setting, the sheer number of complicated variables—determining who’s an enemy, ally, or noncombatant—might overwhelm a robot’s capabilities.

The Campaign to Stop Killer Robots suspects that any benefits of battlefield robots might come at the expense of civilians. “Reducing military casualties is a desirable goal, but you shouldn’t do that by putting civilians at risk,” says Goose of Human Rights Watch. “Most roboticists we’ve talked to say we’ll never get to a point that machines will adequately make distinctions between targets, or meet requirements of humanitarian law. Sometimes these decisions require emotions and compassion, and having a machine with attributes necessary for this kind of legal reasoning is not at all likely.”

So far, these questions remain largely hypothetical. But the Campaign to Stop Killer Robots wants to answer them before we find ourselves debating the ethics of a lethal technology that can’t be put back in the box. Should warbots become a reality, who will take the fall for an atrocity committed by an autonomous machine during the course of an operation? “If a robot commits a war crime, who’s responsible for it?” Goose asks. “The commander? The manufacturer? If you can’t hold someone responsible for a war crime, then there’s nothing to deter these war crimes.”
