Using Technology to Deepen Democracy, Using Democracy to Ensure Technology Benefits Us All

Friday, June 30, 2006

When Robots Kill

The folks over at Meme Therapy did one of their Brain Parade columns yesterday -- in which they ask a handful of people for their responses on some topic or other. The latest question: "The military is increasingly using robotic technology. What kinds of ethical considerations should we be making before we automate killing?" This was my own, rather off-the-cuff, response:
Well, I think ethical considerations should compel us to reject the automation of killing altogether. Ethics also has something to say about the social costs of the war addiction of our bomb-building elites, and about the long-term personal and social costs imposed by the brutal roboticizing process that transforms citizens into soldiers in the first place. You know, killing a human being should simply never seem easy. It's so obvious it sounds sanctimonious to point it out, but there it is. And since we're having this exchange in a time of war, it should be said often and loudly as well that we definitely know we're in trouble when so many of our elected representatives sound glib at best when they say war is a last resort. Every war is a disaster, every war is a defeat -- even when we "win" one. Wars of choice like the current catastrophic Iraq adventure especially bespeak an almost unfathomably profound breakdown of the ethical imaginary.

The automation of mass violence -- via mass media distraction, via the video-gamization of weaponry, via the neuroceutical modification of soldiery -- is an extraordinary intensification of the techniques of training and drill that have long functioned as a ritual instrumentalization of the individual soldier. This instrumentalization has everything to do with the obliteration of ethics in the encounter of subjects in a war-zone and its replacement with an encounter between objectified no-longer-quite subjects. The outright roboticization of militarism is a step along a tragic trajectory rather than the appearance of something altogether new.

I was intrigued by this comment to my response (it appeared within minutes): "We're a few thousand years too late for that. The evolution of weapons from stick to spear to sword to machine-gun is all about making it more efficient. I can't see a world where we lay down rifles for rocks, which is where your thoughts would lead us. Nice to think about, just isn't going to happen."

I personally don't think that weapons "evolve" at all, and I also worry that an awful lot of important stuff falls out of any story you might want to tell about the history of killing tools in which "efficiency" is what matters most. My own hope is that something like more respect for the Geneva Conventions is where my thought would lead us, rather than switching rifles for rocks (which doesn't seem to me that dire a prospect to contemplate, come to think of it). But anyway, the somewhat smug dismissal of what I had to say as empty naive abstraction ("Nice to think about, just isn't going to happen.") rankled me a little, since it seems that what is being dismissed here are "ethical considerations" as such, which was kinda sorta the premise of the question at hand.

I can watch it happen time after time after time, and still I find as fascinating as ever the joyless spectacle in which a technologist resigns the field of "ought" for some parochial characterization of "is" and convinces himself that this transaction somehow constitutes progress toward solving a problem rather than simply massively missing the point. The key symptomatic moment for me is the statement "I can't see a world" which precedes the "hardboiled" rejection of ethical considerations that follows. Just so. The dismissal of "ought" (which I think is doubly indicated by the substitution of an evolutionary for a deliberative vocabulary as well as by the foregrounding of "efficiency") constitutes an impoverishment rather than an improvement of intelligence, one which soon enough constrains what will be available to us in the way of "is." Why this consequence would be imagined to be useful or pragmatic in any way is altogether beyond me.

4 comments:

Jose said...

You make some good points here. Personally, I'm too cynical to really hope for more than civilian oversight of advanced automated weapon systems, and I'd be surprised if we even got that.

The thing is that there are plenty of citizens in the West who are very gung ho about their countries' possession of advanced weapon systems, and damn the consequences. There doesn't even seem to be much interest in nuclear disarmament.

I suspect this kind of thinking will continue so long as Western countries are facing down rivals that are vastly inferior to them technologically. As soon as that changes, you'll start hearing talk about disarmament and treaties as you did between the US and the USSR in the 70s and 80s.

Cheers

Jose

Dale Carrico said...

Hi, Jose. I don't think you seem cynical at all. The truth is, your expectations on this issue are not far from my own. I guess the larger point I'm trying to remind people of is that there's a difference between what we sensibly expect and what we hope for (ethics) or demand (politics) from the world. It's easy to forget just how often the things we sensibly expect turn out to be wrong just because some people's hopes and demands changed the world in unexpected ways.

n8o said...

you poor, naive, shameless ideologue!
/sarcasm

The commenter is indeed missing the point if what he is reading from you is a call for a steady rollback of weapons technology, rather than for finding more ways to //lay down// weapons entirely.

When it's well-justified, yeah, use a smart bomb. Or robots, or lasers, or whatever comes along. The point is to reduce the number of scenarios in which the use of such weapons can be considered "well-justified" -- making it harder to justify the use of highly targeted, highly "efficient" weapons. It doesn't matter how efficient they are; what matters is that they're still weapons, however effective or cheap they may be.

Charlie said...

I was thinking about this, and it occurred to me that maybe the technologies the military is deploying could also provide the solution to the oversight issue. It seems unlikely that the uncensored data feeds from tele-operated machines will be made available to the media, but what is to stop, say, Amnesty International deploying its own remotely operated observers into the thick of the fighting? Even the fiercest fighting could then be subject to scrutiny by the rest of the world, and short of shooting down the independent observers, it's hard to see what the military could do about this...