Using Technology to Deepen Democracy, Using Democracy to Ensure Technology Benefits Us All

Tuesday, July 10, 2007

"The Singularity Won't Save Your Ass"

Musing on a somber topic (the fraught "intersection of crisis-response thinking and transformational-future thinking"), but in a playful mood, Friend of Blog Jamais Cascio has proposed a bumper sticker that pithily captures an attitude I heartily endorse myself: "Singularity is not a Sustainability Strategy." He intends this as a jokey-serious technocentric analogue to the better-known bumper sticker "The Rapture Is Not an Exit Strategy."

"Singularity," for those who don't know about it, is a term that refers to a constellation of overlapping Superlative Technology theses, almost always taking on an apocalyptic or transcendentalizing coloration, in which we are told that technodevelopment is accelerating (or even that this acceleration is itself accelerating) in ways that demand the circumvention of democratic intuitions about the usefulness of public deliberation, the value of precautionary recommendations, the necessity of conventional regulatory oversight, or a proper developmental responsiveness to the actual diversity of stakeholders to development. (Elsewhere, I have described such technocentric and futurological acceleration fixations as "Accelerationalism.")

Usually, for "Singularitarians," these claims about acceleration are tightly coupled to claims about the imminence of artificial intelligence and likewise, immediately thereafter, the imminence of artificial superintelligence. "Singularity" is conventionally used to describe the imagined Event when "post-biological superintelligence" arrives on the scene, although it sometimes is used to describe the aftermath of a basic historical discontinuity (usually directly connected to some version or other of the "imminent post-biological superintelligence" claim), beyond which it is impossible to make reasonable predictions because technoscientific change is happening too quickly and too radically for "mere human" intelligence to grasp.

Qualms about the anti-democratic entailments of Singularitarian variations of the Accelerationalist Thesis are usually addressed either by investing the imagined "post-biological superintelligence" with Salvational properties or by insisting that the "urgency" of the threat of "post-biological superintelligence" justifies "reluctant" elite technocratic decision-making in the "interests of all" as they see it, for now, which amounts to investing the Singularitarians themselves with Salvational properties.

In the very interesting comments occasioned by Jamais's post, Friend of Blog Michael Anissimov bravely makes the Singularitarian case to a skeptical audience, suggesting that
"smarter beings would think up better ways to run a sustainable civilization: using byproduct-free manufacturing, space-based solar panels, fusion power, etc. Being smarter, they'd also be able to invent and implement such things much faster than the most competent humans would, and also discover technologies we cannot yet even imagine. That's the power of increased intelligence."

Needless to say, we already have the intelligence to do such things, even without Robot Gods to pray to, especially when we realize the power of interpersonal collaboration to solve shared problems (a power that is renewed and reinvigorated by planetary peer-to-peer networked formations). But also, and one would expect this to be just as needless to say, it is not the lack of intelligence but the impediment of the heartless, greedy, short-term, anti-democratic politics of incumbency that stands between humanity and the solution of many of our conspicuous shared problems. We have intelligence already, and "more intelligence" (certainly not the too-reductively instrumental vision of intelligence to which Singularitarians tend to confine themselves, a tendency among True Believers in the Strong Program of AI that I deride as "Artificial Imbecillence") is not going to break the impasse of diverse stakeholder politics in a shared and finite world. Technology is neither "neutral" nor "autonomous," and technoscientific developments, properly so-called, are always articulated by politics and culture.

Without good democratic politics even Robot Gods would not "save us." With better democratic politics, human ingenuity and benevolence could be marshaled further in the service of shared ends, so that we no longer feel the need to "be saved" in the first place.

Another comment, from "Kim," pointed out that since Singularitarians, like most people beguiled by Rapture rhetorics, are responding to deep fears and fantasies, passions that are not entirely rational when all is said and done, it is probably counterproductive to point out to them that they are being unreasonable or to patiently enumerate more reasonable alternatives. This may be true, but I do think it is important to add that the brand of irrationality peddled by Singularitarians has powerful resonances with the intuitions of neoliberals and neoconservatives. Some neoliberals and neoconservatives have already started to drift in a broadly Singularitarian, or at any rate technocentric, direction to save their anti-democratic agenda in the face of its current catastrophic culmination (Thomas Friedman, Glenn Reynolds, and William Safire are pretty good examples of this in my view), and it is hard for me to see how the majority of neoliberals and neoconservatives could long resist the lure of Singularitarian arguments that
[1] provide a rationale for the circumvention of democratic politics
[2] provide a rationale for increased investment in military R&D
[3] make recourse to tried and true strategies of fearmongering
[4] appeal to Old School conservative intuitions about the special Destiny of the West
[5] appeal to Old School conservative intuitions about the indispensability of elite Gatekeepers of the True Knowledge
[6] appeal to more newfangled conservative intuitions about "spontaneous order" and "natural(ized) markets."

Given all this, it seems to me there is every reason to expose the unreasonableness and even ridiculousness of Singularitarian doctrine, even if its more passionate partisans will likely turn a deaf ear. The danger is not so much the True Believers among Singularitarians themselves (who would, of course, be properly jailed for terrorism or scarily hired by the military were they to edge even a nanometer in the direction of actually creating the silly Robot Army that so preoccupies their fancy), but the cynical incumbent interests and corporate-militarist formations that are desperately scouting about these days for a new rhetoric to bamboozle people with as they continue their reckless crime spree.

By the way, the title for this post is also taken from the Comments to Jamais's post, where "Stefan Jones" proposed it as an alternate bumper sticker to Jamais's own suggestion. It cracked me up.

7 comments:

Michael Anissimov said...

Sometimes it seems like this blog is a continuous series of letters addressed directly to me rather than any "blog" in the typical sense.

Dale Carrico said...

It doesn't seem that way to me.

Anonymous said...

Me neither.

ZARZUELAZEN said...

Dale,

I refuse to engage with any-one associated with SIAI. They're cultists.

For readers not in the know, M. Anissimov is a well-known Singularitarian troll who's spent years trolling transhumanist lists with claims not supported by any mathematical proofs or empirical data.

The founder of the Singularity Institute (E. Yudkowsky) is a high school drop-out and an egotistical bully who's been given free rein on the transhumanist lists.

Both M. Anissimov and E. Yudkowsky should have been moderated and banned from the transhumanist lists long ago, and until they are, transhumanism can continue to do without me.

Unknown said...

Marc_Geddes,

You level the charge of being egotistical at Eliezer, then make a great threat of transhumanism having to do without you.

I can't decide if you meant that to be ironic or trolly or neither.

Please enlighten!

jimf said...

Cheesy Chimp wrote:

> You [Marc Geddes] level the charge of being egotistical at Eliezer, then make
> a great threat of transhumanism having to do without you.

Yes, I think we can all justly acknowledge that Mr. Geddes put
his foot in his mouth.

> I can't decide if you meant that to be ironic or trolly or neither.

I fear it was not "meant" to be noticed at all. Mr. Geddes is not
the only habitué of transhumanist circles who is "tone deaf"
in this way.

> Please enlighten!

The "feud" between these two on SL4 and elsewhere could fairly,
I think, be characterized as an embarrassingly immature clash
of two oversized male egos, of the kind someone I knew once
called "weenie waving". Eliezer got the best of it.

ZARZUELAZEN said...

The Singularitarians completely destroyed any enthusiasm I may once have had for organized transhumanism. It's no fun talking to complete arseholes, no matter how clever they are.