BLOG

Mar 13, 2009

Q&A: The robot wars have arrived

Posted in categories: defense, engineering, futurism, military, robotics/AI

March 12, 2009 10:00 AM PDT

P.W. Singer

Just as the computer and ARPAnet evolved into the PC and Internet, robots are poised to integrate into everyday life in ways we can’t even imagine, thanks in large part to research funded by the U.S. military.

Many people are excited about the military’s newfound interest in and funding of robotics, but few are considering its ramifications for war in general.

P.W. Singer, senior fellow and director of the 21st Century Defense Initiative at the Brookings Institution, went behind the scenes of the robotics world to write “Wired for War: The Robotics Revolution and Conflict in the 21st Century.”

Singer took time from his book tour to talk with CNET about the start of a revolution tech insiders predicted, but so many others missed.

Q: Your book is purposely not the typical think tank book. It’s filled with just as many humorous anecdotes about people’s personal lives and pop culture as it is with statistics, technology, and history. You say you did this because robotic development has been greatly influenced by the human imagination?
Singer: Look, to write on robots in my field is a risky thing. Robots were seen as this thing of science fiction even though they’re not. So I decided to double down, you know? If I was going to risk it in one way, why not in another way? It’s my own insurgency on the boring, staid way people talk about this incredibly important thing, which is war. Most of the books on war and its dynamics–to be blunt–are, oddly enough, boring. And it means the public doesn’t actually have an understanding of the dynamics as they should.

It seems like we’re just at the beginning here. You quote Bill Gates comparing robots now to what computers were in the eighties.
Singer: Yes, the military is a primary buyer right now and it’s using them (robots) for a limited set of applications. And yes, in each area we prove they can be utilized you’ll see a massive expansion. That’s all correct, but then I think it’s even beyond what he was saying. No one sitting back with a computer in 1980 said, “Oh, yes, these things are going to have a ripple effect on our society and politics such that there’s going to be a political debate about privacy in an online world, and mothers in Peoria are going to be concerned about child predators on this thing called Facebook.” It’ll be the same way with the impact on war and in robotics; a ripple effect in areas we’re not even aware of yet.

Right now, rudimentary as they are, we have autonomous and remote-controlled robots while most of the people we’re fighting don’t. What’s that doing to our image?
Singer: The leading newspaper editor in Lebanon described–and he was saying this while a drone was actually above him at the time–that these things show you’re afraid, you’re not man enough to fight us face-to-face; it shows your cowardice; all we have to do to defeat you is just kill a few of your soldiers.

It’s playing like cowardice?
Singer: Yeah, it’s like every revolution. You know, when gunpowder is first used people think that’s cowardly. Then they figure it out and it has all sorts of other ripple effects.

What’s war going to look like once robot warriors become autonomous and ubiquitous for both sides?
Singer: I think if we’re looking at the realm of science fiction, less so “Star Wars: The Clone Wars” and more so the world of “Blade Runner” where it’s this mix between incredible technologies, but also the dirt and grime of poverty in the city. I guess this shows where I come down on these issues. The future of war is more and more machines, but it’s still also insurgencies, terrorism, you name it.

What seems most likely in this scenario–at least in the near term–is this continuation of teams of robots and humans working together, each doing what they’re good at…Maybe the human as the quarterback and the robots as the players with the humans calling out plays, making decisions, and the robots carrying them out. However, just like on a football field, things change. The wide receivers can alter the play, and that seems to be where we’re headed.

How will robot warfare change our international laws of war? If an autonomous robot mistakenly takes out 20 little girls playing soccer in the street and people are outraged, is the programmer going to get the blame? The manufacturer? The commander who sent in the robot fleet?
Singer: That’s the essence of the problem of trying to apply a set of laws that are so old they qualify for Medicare to these kinds of 21st-century dilemmas that come with this 21st-century technology. It’s also the kind of question that you might once have asked only at Comic-Con, and now it’s a very real, live question at the Pentagon.

I went around trying to get the answer to this sort of question meeting with people not only in the military but also in the International Committee of the Red Cross and Human Rights Watch. We’re at a loss as to how to answer that question right now. The robotics companies are only thinking in terms of product liability…and international law is simply overwhelmed or basically ignorant of this technology. There’s a great scene in the book where two senior leaders within Human Rights Watch get in an argument in front of me of which laws might be most useful in such a situation.

Is this where they bring up Star Trek?
Singer: Yeah, one’s bringing up the Geneva Conventions and the other one’s pointing to the Star Trek Prime Directive.

You say in your book that except for a few refuseniks, most scientists are definitely not subscribing to Isaac Asimov’s laws. What, then, generally are the ethics of these roboticists?
Singer: The people who are building these systems are excited by the possibilities of the technology. But the field of robotics, it’s a very young field. It’s not like medicine that has an ethical code. It’s not done what the field of genetics has, where it’s begun to wrestle with the ethics of what they’re working on and the ripple effects it has on the society. That’s not happening in the robotics field, except in isolated instances.

What military robotic tech is likely to migrate over to local law enforcement or the consumer world?
Singer: I think we’re already starting to see some of the early stages of that…I think this is the other part that Gates was saying: we get to the point where we stop calling them computers. You know, I have a computer in my pocket right now. It’s a cell phone. I just don’t call it a computer. The new Lexus parallel-parks itself. Do we call it a robot car? No, but it’s kind of doing something robotic.

You know, I’m the guy coming out of the world of political science, so it opens up these fun debates. Take the question of ethics and robots. How about me? Is it my Second Amendment right to have a gun-armed robot? I mean, I’m not hiring my own gun robots, but Homeland Security is already flying drones, and police departments are already purchasing them.

Explain how robotic warfare is “open source” warfare.
Singer: It’s much like what’s happened in the software industry going open source, the idea that this technology is not something that requires a massive industrial structure to build. Much like open source software, not only can almost anyone access it, but also anyone with an entrepreneurial spirit, and in this case a very wicked entrepreneurial spirit, can improve upon it. All sorts of actors, not just high-end militaries, can access high-end military technologies…Hezbollah is not a state. However, Hezbollah flew four drones at Israel. Take this down to the individual level, and I think one of the darkest quotes comes from the DARPA scientist who said, and I quote, “For $50,000 I could shut down Manhattan.” The potential of an al-Qaeda 2.0 is made far more lethal with these technologies, but also the next generation of a Timothy McVeigh or Unabomber is multiplying their capability with these technologies.

The U.S. military said in a statement this week that it plans to pull 12,000 troops out of Iraq by the fall. Do you think robots will have a hand in helping to get to that number?
Singer: Most definitely.

How?
Singer: The utilization of Predator operations is allowing us to accomplish certain goals there without troops on the ground.

Is this going to lead to more of what you call the cubicle warriors or the armchair warriors? They’re in the U.S. operating on this end, and then going to their kid’s PTA meeting at the end of the day?
Singer: Oh, most definitely. Look, the Air Force this year is putting out more unmanned pilots than manned pilots.

Explain how soldiers now come ready-trained because of our video games.
Singer: The military is very smartly free-riding off of the video game industry, off the designs in terms of the human interface, using the Xbox controllers, PlayStation controllers. The Microsofts and Sonys of the world have spent millions designing the system that fits perfectly in your hand. Why not use it? They’re also free-riding off this entire generation that’s come in already trained in the use of these systems.

There’s another aspect though, which is the mentality people bring to bear when using these systems. It really struck me when one of the people involved in Predator operations described what it was like to take out an enemy from afar, what it was like to kill. He said, “It’s like a video game.” That’s a very odd reference, but also a telling reference for this experience of killing and how it’s changing in our generation.

It’s making them more removed from the morality of it?
Singer: It’s the fundamental difference between the bomber pilots of WWII and even the bomber pilots of today. It’s disconnection from risk on both a physical and psychological plane.

When my grandfather went to war in the Pacific, he went to a place where there was such danger he might not ever come home again. You compare that to the drone pilot experience. Not only what it’s like to kill, but the whole experience of going to war is getting up, getting into their Toyota Corolla, going in to work, killing enemy combatants from afar, getting in their car, and driving home. So 20 minutes after being at war, they’re back at home and talking to their kid about their homework at the dinner table. So this whole meaning of the term “going to war” that’s held true for 5,000 years is changing.

What do you think is the most dangerous military robot out there now?
Singer: It all hinges on the definition of the term dangerous. The system that’s been most incredibly lethal in terms of consequences on the battlefield so far if you ask military commanders is the Predator. They describe it as the most useful system, manned or unmanned, in our operations in Afghanistan and Iraq. Eleven out of the twenty al-Qaeda leaders we’ve gotten, we’ve gotten via a drone strike. Now, dangerous can have other meanings. The work on evolutionary software scares the shit out of me.

You’re saying we’re gonna get to a HAL situation?
Singer: Maybe it’s just cause I’ve grown up on a diet of all that sci-fi, but the evolutionary software stuff does spook me out a little bit. Oh, and robots that can replicate themselves. We’re not there yet, but that’s another like “whoa!”

People have finally got the attention of companies and governments to look ahead to 2020, 2040, 2050 in terms of the environment and green technology. But as you said in your book, that’s not happening with robotics issues. Why do you think that is?
Singer: When it comes to the issue of war, we’re exceptionally uncomfortable looking forward, mainly because so many people have gotten it so wrong. People in policymaker positions, policy adviser positions, and the people making the decisions are woefully ignorant of what’s happening in technology not only five years from now, not only now, but where we were five years ago. You have people describing robotics as “mere science fiction” when we’re talking about having already 12,000 (robots) on the ground, 7,000 in the air. During this book tour, I was in this meeting with a very senior Pentagon adviser, top of the field, very big name. He said, “Yeah, this technology stuff is so amazing. I bet one day we’ll have this technology where like one day the Internet will be able to look like a video game, and it will be three-dimensional, I’ll bet.”

(laughing) And meanwhile, your wife’s at Linden Labs.
Singer: (laughing) Yeah, it’s Second Life. And that’s not anything new.

At least five years old, yeah.
Singer: And you don’t have to be a technology person to be aware of it. I mean, it’s been covered by CNN. It appeared on “The Office” and “CSI.” You just have to be aware of pop culture to know. And so it was this thing that he was describing as it might happen one day, and it happened five years ago. Then the people that do work on the technology and are aware of it, they tend to either be: head-in-the-sand in terms of “I’m just working on my thing, I don’t care about the effects of it”; or “I’m optimistic. Oh these systems are great. They’re only gonna work out for the best.” They forget that this is a real world. They’re kind of like the atomic scientists.

Obviously the hope is that robots will do all the dirty work of warfare. But warfare is inherently messy, unpredictable, and often worse than expectations. How would a roboticized war be any different in that respect?
Singer: In no way. That’s the fundamental argument of the book. While we may have Moore’s Law in place, we still haven’t gotten rid of Murphy’s Law. So we have a technology that is giving us incredible capabilities that we couldn’t even have imagined a few years ago, let alone had in place. But the fog of war is not being lifted as Rumsfeld once claimed absurdly.

You may be getting new technological capabilities, but you are also creating new human dilemmas. And it’s those dilemmas that are really the revolutionary aspect of this. What are the laws that surround this and how do you ensure accountability in this setting? At what point do we have to become concerned about our weapons becoming a threat to ourselves? This future of war is again a mix of more and more machines being used to fight, but the wars themselves are still about our human realities. They’re still driven by our human failings, and the ripple effects are still because of our human politics, our human laws. And it’s the cross between the two that we have to understand.

Candace Lombardi is a journalist who divides her time between the U.S. and the U.K. Whether it’s cars, robots, personal gadgets, or industrial machines, she enjoys examining the moving parts that keep our world rotating. Email her at [email protected]. She is a member of the CNET Blog Network and is not a current employee of CNET.

Comments (now closed)


  1. John Hunt says:

    This is an excellent post dealing with an issue which isn’t receiving enough attention.

    If the US or Israel decided to invade Iran in 5–10 years from now they would have to expect that their greatest difficulty would be in dealing with urban-based warfare and insurgency. One could imagine that it may be nearly impossible to overthrow a country (Iran) with more than twice the population, possibly a very large number of organized suicide bombers, and with an ethnic majority which may be opposed to an imposed new form of government.

    But with the Second Battle of Fallujah, the US took the town while losing only about two dozen of its soldiers. This was largely because Fallujah had been surrounded and civilians were given the opportunity to evacuate, which they did.

    Recently Israel refrained from significantly entering Gaza City, perhaps due in part to the expected losses on both sides. As a result, Hamas, their irreconcilable enemy, was left intact.

    If in these situations civilians were first ordered to be evacuated and then the robots were sent in first, a city could be largely cleansed of insurgents and booby traps before soldiers were risked. It would be pointless suicide to try to defend a city under those circumstances. So it seems to me that there is a tremendous incentive on the part of Western countries to construct many remote-controlled combat robots not much more advanced than the Talon. So I think that we can expect this in the very near future.

    In the more distant future I could also imagine robots being genetically designed, modeled in a virtual environment, and then pitted against other genetic variants. The winner then has its genes mutated to create new models, which then fight each other in a virtual environment. Within a short period of time we could have the designs for frighteningly efficient killing machines. The whole Terminator paradigm doesn’t seem so sci-fi anymore. (A rough sketch of this kind of evolutionary loop appears after the comments below.)

  2. John Hunt says:

    Here’s a good article which very much relates to the topic of this blog:

    COLUMN-Killer robots and a revolution in warfare: Bernd Debusmann
    http://www.alertnet.org/thenews/newsdesk/LM674603.htm
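
The first comment above describes an evolutionary design loop: generate candidate robot designs, score them in a simulated fight, keep the winners, mutate them, and repeat. Below is a minimal, purely illustrative sketch of that loop in Python. The design traits, mutation scheme, and fitness function are assumptions invented for the example, not anything drawn from the article or the comment.

```python
# Hypothetical sketch of an evolutionary design loop: candidate "designs"
# are scored by a stand-in fitness function, the best are kept, and the
# rest of the population is refilled with mutated copies of the survivors.
# All traits, weights, and parameters here are illustrative assumptions.
import random


def random_design():
    # A "design" is just a vector of tunable traits in [0, 1].
    return {"armor": random.random(), "speed": random.random(), "accuracy": random.random()}


def mutate(design, rate=0.1):
    # Perturb each trait slightly, clamped to [0, 1], to create a variant.
    return {k: min(1.0, max(0.0, v + random.uniform(-rate, rate))) for k, v in design.items()}


def simulated_fitness(design):
    # Stand-in for a virtual combat simulation; a real system would pit
    # designs against each other in a physics-based environment.
    return 0.3 * design["armor"] + 0.3 * design["speed"] + 0.4 * design["accuracy"]


def evolve(generations=50, population_size=20, keep=5):
    population = [random_design() for _ in range(population_size)]
    for _ in range(generations):
        # Rank designs by simulated performance and keep the best few.
        population.sort(key=simulated_fitness, reverse=True)
        survivors = population[:keep]
        # Refill the population with mutated copies of random survivors.
        population = survivors + [
            mutate(random.choice(survivors)) for _ in range(population_size - keep)
        ]
    return max(population, key=simulated_fitness)


if __name__ == "__main__":
    best = evolve()
    print("Best design found:", best)
```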