Using Technology to Deepen Democracy, Using Democracy to Ensure Technology Benefits Us All
Saturday, June 11, 2005
GIGOO, AI, Potayto, Potahto
"Tomayto Tomahto," is what, in effect, one e-pistolary correspondent had to say about my claim in yesterday's "GIGOO" post that, "I'm personally incomparably more scared of GIGOO (unprecedentedly powerful, complex, distributed, possibly self-recursive, probably globally networked, buggy software) than 'Unfriendly AI.'"
"It doesn't matter how you call it," he went on to say, but "rapidly evolving and speciating/radiating postbiology is a potentially terminal threat at any levels[.]"
Of course, I get his point. But "how you call it" seems to me to matter a great deal, indeed.
The figures through which we frame what he calls here the quandaries of "postbiology" will mobilize radically different kinds of attentions, assumptions, interventions. Consider the choice of the term "postbiology" itself here, a word already freighted with entailments that probably cost one sympathy without compensatory benefits of clarity or force to justify them. Why not simply "nonbiology," for example?
My own point is that it pays to frame developmental threats in a way that mobilizes more efficacious intervention. The figures of "intelligence" and especially "superintelligence" seem to me more trouble than they're worth if what is wanted are relevant safety standards, regulatory and funding guidelines, and institutional safeguards for self-replicating, self-modifying, distributed software applications too complex for humans to understand, but which nonetheless intervene in human environments in potentially devastating ways.
Why intelligences, personages, monsters, angels, succubi? The last thing we need, if you ask me, is figurations that launch people into fight or flight in the face of an imaginary potential predator, or into lust after an especially appealing imaginary sex partner, or that remobilize Oedipal narratives from childhood and hence irrational desires for or aggressivity toward a parent-substitute where replicative software is concerned, of all things.
As an example of the kind of argument I am making here from a slightly different context: I would also argue that the figure of "Gaia" is usually much more trouble than it's worth as a way of framing the regulatory quandaries of threatening complex systemic interactions of global petrochemical industrialization and the terrestrial environment, since some of the more reasonable projects to ameliorate climate change will likely involve technological interventions that would look like "violations" of "Gaia" as well. Better, then, not to frame the threats in terms that mobilize intuitions about violations of bodily integrity in the first place -- whatever the allure of such a mobilization given the otherwise difficult task of overcoming the complacency provoked by the overwhelming complexity of these issues.
But, no, let's not call the whole thing off.
"It doesn't matter how you call it," he went on to say, but "rapidly evolving and speciating/radiating postbiology is a potentially terminal threat at any levels[.]"
Of course, I get his point. But "how you call it" seems to me to matter a great deal, indeed.
The figures through which we frame what he calls here the quandaries of "postbiology" will mobilize radically different kinds of attentions, assumptions, interventions. Consider the choice of the term "postbiology" itself here, a word already freighted with entailments that probably cost one sympathy without compensatory benefits of clarity or force to justify them. Why not simply "nonbiology," for example?
My own point is that it pays to frame developmental threats in a way that mobilizes more efficacious intervention. The figures of "intelligence" and especially "superintelligence" seem to me more trouble than they're worth if what is wanted are relevant safety standards, regulatory and funding guidelines, and institutional safeguards for self-replicating, self-modifying, distributed software applications too complex for humans to understand which nonetheless intervene in human environments in potentially devastating ways.
Why intelligences, personages, monsters, angels, succubi? The last thing we need, if you ask me, are figurations that launch people into fight or flight in the face of an imaginary potential predator, or lust at an especially appealing imaginary sex partner, or remobilize Oedipal narratives from childhood and hence irrational desires for or aggressivity toward a parent-substitute where replicative software is concerned, of all things.
As an example of the kind of argument I am making here from a slightly different context: I would also argue that the figure of "Gaia" is usually much more trouble than it's worth as a way of framing the regulatory quandaries of threatening complex systemic interactions of global petrochemical industrialization and the terrestrial environment, since some of the more reasonable projects to ameliorate climate change will likely involve technological interventions that would look like "violations" of "Gaia" as well. Better, then, not to frame the threats in terms that mobilize intuitions about violations of bodily integrity in the first place -- whatever the allure of such a mobilization given the otherwise difficult task of overcoming the complacency provoked by the overwhelming complexity of these issues.
But, no, let's not call the whole thing off.