Using Technology to Deepen Democracy, Using Democracy to Ensure Technology Benefits Us All

Saturday, April 24, 2010

Singularitarian Faithful Throng the IEET

According to a poll of the readership of the stealth-Robot Cult outfit IEET, fully 68% -- over two-thirds of respondents -- declared that "The Singularity" either "definitely" or "probably" will take place within this century.

"The Singularity" is a notion with many variant formulations, most of which have strong eschatological/ theological overtones: Cory Doctorow derides "The Singularity" as "The Rapture of the Nerds." Ray Kurzweil declares "The Singularity Is Coming" in a pop-futurological book proselytizing the notion. Vernor Vinge, the fine sf writer whose work inspired much of the cottage industry in Singularitarianism, has described "The Singularity" as a kind of techno-constituted "ascension" or "transcension" of selfhood incomprehensible to anyone who has not likewise so transcended, or described it as a kind of escape velocity propelling some lucky inhabitants of technological society out of history, a velocity achieved through an ongoing acceleration of accelerating change or through the arrival of a greater-than-human post-biological intelligence. All of these descriptions tend to get smudged together, as well as many of them to depend on rather nonsensical claims made about Moore's Law misconstrued as some kind of pseudo-Hegelian avatar of the World Spirit.

Needless to say, there are enormous confusions that bedevil all of these formulations, and I have written more than my fair share of debunkings of the variations of the Singularitarian sects of the Robot Cult archipelago (among them, The Singularity Won't Save Your Ass, What's Wrong With the Robot Cultists and Their Scary (or Shiny) Singularity?, Singularitarian Agony, Sanewashing Superlativity, among many others to be found in the Superlative Summary).

To simplify somewhat, self-identified "Singularitarians" in full-on Robot Cult mode tend not only to be among the dwindling dead-enders still clinging to the endlessly failed Good Old-Fashioned Program of Strong AI but to have doubled down, as it were, and infused this faith with the amplified expectation not only of the proximate arrival of artificial intelligence but of a superintelligent post-biological Robot God who either will solve all our problems for us as a kind of sooper-parent or will instead reduce the world to ubergoo computronium feedstock, either way ending human history and society and life As We Know It.

Given this, I personally think it is fair to say that the prominence of Singularitarian faithful among the IEET's readership is another indication (as is the whole White Guys As Far As the Eye Can See problem, as is the chirpy eugenicism problem, discussed here and here, for example) of something of an insurmountable problem for the IEET in its ongoing effort to sanewash itself as a mainstream-legible bioethics and technology policy-wonk think tank, rather than a congenial spear-tip into the transhumanist sub(cult)ure and Robot Cult organizational archipelago.

It is interesting that the definition with which IEET inaugurates the curious reader into the idea of "The Singularity" -- which, recall, over two-thirds of IEET's readers seem to expect to be on its way -- is highly fumigated: "a theorized future point of discontinuity when events will accelerate at such a pace that normal unaugmented humans will be unable to predict or even understand the rapid changes occurring in the world around them."

It is hard not to wonder just what an "unaugmented human" is supposed to be, given that all human animals make recourse to culture/prosthesis, language, clothing, ceremonial, and so on -- or how transhumanists would defend the inevitable parochialism of any criteria that would presumably permit demarcations of the cultures/prostheses that "augment" humans as against the ones that do not. Futurological discourses are of course completely suffused with such definitional quandaries and uncritical assumptions (you should hear them speak of parochially preferred medical outcomes as "sub-optimal" or "enhancements" as if these were merely neutrally "technical" descriptions or universally affirmed values), which is one of the reasons I personally think futurological discourses tend to be better understood as modes of fraudulent advertising than as modes of serious analysis in the first place.

It is also hard not to wonder just how the frustration at predicting or understanding changes marked "singularitarian" ultimately differs from the well-known frustrations of prediction and understanding that already inspire us to distinguish, say, tomorrow from today in a conventional sense.

As a rule, futurologists (especially superlative futurologists of the transhumanist, singularitarian, techno-immortalist, cybernetic-totalist, nano-cornucopiast sects of the Robot Cult organizational archipelago) will tend EITHER to peddle their formulations as near-vacuities for mainstream consumption (issues of network security, healthcare funding, science education, global risk management, and so on) even though not one of these discourses originated with or is improved by or inspires any interest in their futurological deployments OR to peddle their formulations as extraordinarily marginal and hyperbolic faith-based initiatives for their own members (superintelligent post-biological Robot Gods, brains "uploaded" into cyberspace, bodies quasi-immortalized through sooper-medicine, cheap-as-dirt nanotechnological genies-in-a-bottle, and so on).

As I have elaborated here and here, nobody needs to join a Robot Cult to take technodevelopmental issues seriously in the former mode, and, overwhelmingly, nobody ever does. Anybody who takes techno-transcendentalizing wish-fulfillment fantasies seriously in the latter mode is ripe pickings for a Robot Cult and fully marginal for that.

The danger, of course, is that futurological discourses appeal to mass-media outlets hungry for narratives of technodevelopmental change and their politics more dramatic than useful, appeal to ill-educated audiences (either comparatively technoscientifically illiterate or politically-culturally oblivious or both), appeal to those whose inhabitation of disruptive technoscience in the midst of precarizing neoliberal development is primarily emotional (inspiring dreams of omnipotence, nightmares of impotence), and in these appeals function to derange public deliberation on the most equitable, most consensual distribution of costs, risks, and benefits to all the diverse stakeholders to technoscientific change at the worst possible historical moment, the moment of catastrophic climate change, proliferating WMDs, and unprecedented wealth disparities.

The comparatively suave futurologists of the IEET are doing this dance of derangement for the usual quick buck or for a few more asses in the pews of their sub(cult)ural organizations, but to me they are not only targets for edifying ridicule in the way most Robot Cultists are but are also more dangerous and more deserving of exposure in their pretensions to academic seriousness and mainstream political relevance.


Isaac said...

I have noticed that many times you have found the transhumanists at fault for not having clear definitions of the subjects they discuss. Now, not being a transhumanist--more of just a hobbyist or a sci-fi fanatic or whatever--I will not contend that the following definitions are endorsed by the IEET, Kurzweil, or any other faction of the futurist community.

So, here are my definitions, with comments following each one:

"Singularity:" The point in time at which reliable predictions about the future, in any domain of observation, can no longer be made due to beings with motives humans cannot understand controlling all aspects of government, scientific and technological development, the military, and the economy.

Basically, the singularity is just a time when we could no longer make any kind of guesses about the future. For instance, barring any kind of singularity-like event taking place, I could say with extreme confidence that in 100 years people would still be divided by artificial boundaries (i.e. countries), there would still be militarism, corporatism, inequity, greed, racism, homophobia, etc etc, and I'm pretty sure most everybody would agree with that. However, if a "singularity" were to take place, all such predictions would be thrown out the window, since something else would be calling the shots.

"Enhancement:" The use of prosthetic (note that "prosthetic" contains many of the things you listed above, including clothing, the internet, etc) devices to alter the usual function of an individual's body or brain in a way such that the individual finds the end result desirable.

Now, of course, this definition does not agree at all with the typical concept of "enhancement" that is rampant among futurist circles, where "enhancement" is the means by which humans develop abilities far beyond the species' norms (whatever that means!), blah blah blah and surpassing of limits and some more crap.

When it comes right down to it, I find the concept of "enhancement" to be the use of any means to affect for the better an individual's experience of life (note that this includes recreational drugs). But the key word here is 'individual,' since it is pretty much a vacuity these days to say something along the lines of, "Just because one person likes something doesn't mean everyone else will."

"Super-intelligence:" Any being that has the ability to consciously alter its environment in such a way that humans would never even consider, much less have the means to carry out.

Now, this definition is easily the most clunky of the ones I'm going to do right now, since the entire concept of intelligence is so hard to get one's head around. Basically, a "super-intelligence" is any being that can "out-think" us by leaps and bounds (and you know perfectly well what I mean by this, even though I can't articulate it).

If these helped clarify anything, I'd be happy to write others. If not, then I'll just anonymously crawl back to the anonymous hole from whence I came.

Dale Carrico said...

Given that none of these definitions of yours have any authority or prevalence in the organization and related subcultures under discussion, by your own admission, this exercise might be clarifying if you and I were to discuss these topics in an abstract way.

That is something I'll admit I'm disinclined to do beyond a certain point, to be honest, because I am more interested these days in exposing those who take these formulations seriously to ridicule and political critique -- or in treating their pathologies as symptomatic of larger pernicious tendencies in corporate-militarist developmental discourses more prevalent in the world.

Given all this, your definitions don't really clarify the actual issue at hand so much (I don't mean that in an insulting way). That issue, recall, is grasping what it means that so many folks associating with an organization that wants to peddle itself as a mainstream-legible technoscience and bioethics policy think-tank clearly believe in flabbergastingly implausible, utterly marginal ideas few actually serious technoscience or bioethics policy wonks would devote a second's attention to -- except possibly to enjoy them as science fictional conceits without confusing them for the substance of actual science or policy.

Needless to say, I personally think this is just another reminder that futurology in general is better understood as a promotional and advertising discourse than a critical or analytic discourse, and as such is prey to all the hyperbolic pathologies and vulnerabilities to fraud that plague promotional and advertising discourses -- while superlative futurological discourse takes this hyperbole and fraud to literally transcendentalizing lengths (in so doing it is hardly the first organized form of religiosity to misbehave, especially once its priests start hankering after the authority of scientific descriptions or of moralizing politics).

Black guy from the future past said...

Recent singularity "documentary" released. Thought this might interest you. In order to combat your enemy and opposition you must know what they are all about.

This review actually approaches a direct critique of the "singularity nonsense":

Happy hunting!