While clear deliberation about and regulation of military artifice does need to account for specificities, I simply do not agree that the quandaries posed by contemporary military drones, or by what passes for autonomous weapons systems today, differ sufficiently from those posed by the balloons, carrier pigeons, time-bombs, land-mines, guided munitions, and remotely operated weapons systems of years past to justify dramatic, deranging talk of unprecedented transformations and revolutionary robocalypse. Let me be clear: It is because I take the threat of programmed drones and weapons so seriously that I worry about the inflated science-fictional narratives increasingly framing their stakes. The futurological repudiation of available analogies, all-too-familiar issues, and perennial quandaries of war functions all too readily as a pretext for distractions and deceptions, at the cost of hopes for accountability and sanity in this time of world war without end.
There is, after all, nothing more commonplace nowadays than the application of the terms "smart" and "intelligent" to palpably unintelligent devices and inept software. Hyperbole is the argot of digital culture, and the phony investment of dumb tech commodities with agency and intelligence may encourage users to forgive the dysfunction of their computational "companions," while at the same time this false investment answers to what appears to be a widely shared ideology, or even faith, among many of the designers and peddlers of these devices: that they are taking humanity step by step, handheld by handheld, landfill by landfill, along the road to techno-transcendental salvation via the serially failed, fatally flawed program of AI.
Recently, many of the super-rich salesmen (Bill Gates, Elon Musk, Peter Thiel) and so-called "Thought Leaders" (Ray Kurzweil, Stephen Hawking, Nick Bostrom) of our celebrated VC tech culture have been raising alarms about the urgent existential threat of satanic super-intelligent AI. This talk is the extreme form of the now long-standing and utterly prevalent robo-fixated public imagery and discourse of popular science fiction, commercial advertising, and corporate-military think tanks, all full of pronouncements about the wonders of Big Data and smart cars and the horrors of robot armies and smart drones.
Well-meaning opponents of war atrocities and engines of war would do well to consider how tech companies stand to benefit from military contracts for "smarter" software and bleeding-edge gizmos when terrorized and technoscientifically illiterate majorities and public officials take seriously SillyCon Valley's warnings about our "complacency" in the face of truly autonomous weapons and artificial super-intelligence that do not exist.
It is crucial that necessary regulation, and even banning, of dangerous "autonomous weapons" proceed in a way that does not abet the misattribution of agency, and hence of accountability, to devices. Every "autonomous" weapons system expresses and mediates decisions by responsible humans, usually all too eager to disavow the blood on their hands. Every legitimate fear of "killer robots" is best addressed by holding their coders, designers, manufacturers, officials, and operators accountable for criminal and unethical tools and uses of tools.
Let us take up the point with a different but related issue. Automation has not created and sustained our ongoing unemployment crisis or lowered the earning power of generations of workers -- rather, the decline of collective bargaining power to demand an equitable share in profits from productivity gains, as well as social support amidst dislocations in labor markets, has made automation deployed by plutocrats the occasion for a general crisis of unemployment and wealth concentration. Just so, killer robots don't kill people, people kill people with killer robots -- and they regularly do so in our names, and often as war crimes.
I am always baffled by gun zealots who like to crow that guns don't kill people. I have never understood why the recognition that people kill people with guns should provide any less reason to ban especially dangerous guns, or to restrict their purchase and use, or to demand rigorous licensing standards, or to require safety measures protecting citizens from accidents and criminal misuse, or to impose liabilities on their manufacturers and retailers. Thermonuclear weapons don't kill people, either; people kill people with thermonuclear weapons. That recognition hardly recommends their private sale or use.
There simply is no such thing as a smart bomb. Every bomb is stupid. There is no such thing as an autonomous weapon. Every weapon is deployed.
The only killer robots that actually exist are human beings waging and profiting from war.