You're right Dale, few will listen to you about superlativity, now or ever.
Well, it's a philosophical critique, and only a minority of people engage in such critique. By the way, this isn't an elitist condemnation of the majority -- people can be thoughtful without being philosophical, strictly speaking, after all.
Meaning, few will start saying "yes Dale, you're right, transhumanist discourse is wrong or silly or harmful." To the contrary, a lot of serious, bright, and thoughtful people will likely continue to see transhumanist discourse as having value no matter how many times you repeat your critique.
Ah, poor little Robot Cultist. Nearly everybody who comes into contact with the transhumanists decides they are silly and wrong and dismisses them on the spot as kooks, and quite rightly so. No doubt some people will continue to be drawn to superlativity, for reasons like the ones I mention at the end of the post to which brave "Anonymous" is responding: namely, "in order to sell their scam[s] or cope with their fear of death and contingency or indulge their social or bodily alienation or lose themselves in wish-fulfillment fantasies inspired by science fiction or try to gain some sense of purchase, however delusive, on their precarious inhabitation of a dangerously unstable corporate-militarized world[.]"
Only a vanishingly small minority of people are transhumanist-identified or avowed singularitarians and so on. Thoughtfulness is not exactly the quality these people share in my view.
Most people don't take the superlative futurologists and Robot Cultists seriously enough in the first place to understand why I devote the time I do to critiquing them. I don't think many grasp that superlative futurology is a symptom and clarifying extreme expression of corporate-militarist developmental discourse more generally, and that such futurology, in turn, is the quintessential ideological expression of neoliberalism.
I do think it is regrettable that I have not managed to attract more attention from like-minded critics of corporate-militarism, but I must say that not convincing a few dumb boys who fetishize their toys to give up their Robot Cult is hardly any kind of abiding regret of mine where the superlativity critique is concerned.
2 comments:
> > You're right Dale, few will listen to you about superlativity,
> > now or ever. . .
>
> Ah, poor little Robot Cultist. Nearly everybody who comes into contact with
> the transhumanists decides they are silly and wrong and dismisses them on the
> spot as kooks, and quite rightly so.
And some people go on to write about it. John Bruce, by the way, came
to his conclusions entirely on his own, with no input from Dale (and
certainly not from me).
An educated "civilian" bumps into transhumanism.
Hard.
In the Shadow of Mt. Hollywood
John Bruce's Observations on Education, Epistemology,
Writing, Work, and Religion
http://mthollywood.blogspot.com/2006_03_01_mthollywood_archive.html
-------------------------------------
Friday, March 24, 2006
I Really Hate To See The Same Dumb Mistakes
The one thing nobody else is pointing out is that
[Glenn] Reynolds is what [one] might characterize as a
“libertarian evangelist”.
We do great damage to our society when we allow religious
and philosophical tinhorns to dominate middlebrow discussion
without adequate counter-argument. We’re still suffering
from the likes of Alan Watts. Generations of bright kids
have been wasting some or all of their time, energy, and
youth on the idealized, highly selective, and heavily
sanitized versions of mysticism that he and others began
to promulgate in the 1950s.
http://mthollywood.blogspot.com/2006_05_01_mthollywood_archive.html
---------------------------------------------------------
Wednesday, May 10, 2006
. . .
Narcissism As A Motivator For Cryonics
It occurred to me after I put up the post on narcissism
and transhumanism just below that cryonics might in fact
be the perfect mode of interment for narcissists, irrespective
of whether they can be pulled out of the dewar, connected
to some new or transplant body, and revivified at some future
date. The narcissist has a static, idealized self-image that's
exactly the opposite of the memento mori skull. What better
way to preserve it than freezing?
. . .
My own feeling is that science fiction – not actual technological
advance – drives transhumanism. The appeal transhumanism has to
narcissists is the potential for refusing to acknowledge the
passage of time, as well as a static, immortal, grandiose self-image.
The humorlessness comes with the territory.
Transhumanism seems to have a particular appeal to the wealthy –
look at the Silicon Valley millionaires on the board of the
Foresight Nanotech Institute, for instance – and I think this
follows. A narcissistic rich person can control a great many things,
but there’s one threat that won’t go away: you’re going to die,
no matter how rich you are. Get rid of that one fly in the ointment,
and you’ve got it made: a static, timeless self-image of a rich guy.
(Failing that, freezing your head comes in as a valid second choice.)
Dale wrote:
> I don't think many grasp that superlative futurology is a symptom and
> clarifying extreme expression of corporate-militarist developmental
> discourse more generally, and that such futurology, in turn, is the
> quintessential ideological expression of neoliberalism.
>
> I do think it is regrettable that I have not managed to attract more
> attention from like-minded critics of corporate-militarism. . .
Nato Welch on the "Singularity Syndrome"
http://n8o.r30.net/dokuwiki/doku.php/blog:soundadvice
> If it happens and it's interested in us, all our
> plans go out the window. If it doesn't happen, sitting
> around waiting for the AIs to save us from the rising
> sea level/oil shortage/intelligent bioengineered termites
> looks like being a Real Bad Idea.
>
> –Charles Stross, on the Singularity
Singularity Syndrome: the derangement in public policy discussions that
occurs when one rationally considers the possibilities of uncertain future
developments happening all at once, rather than over time.
It bears a resemblance to a phenomenon observed in the policy discussions regarding
the "War On Terror", climate change, and, more recently, the Large Hadron Collider,
wherein partisans make up for the low probability that their pet risk will occur
by simply inflating the stakes.
When it comes to emerging technologies, one considers several possibilities:
a technical capability may emerge sooner or later, by degrees. The further away
in the future some contingency lies, not only is it that much less necessary to
prepare for, but it is also that much less possible to prepare for.
Thus, the only reasonable output any policy discussion or think-tank can produce
is preparation for near-term contingencies, however unlikely they may be. Because
little useful attention can be paid to likelier scenarios, owing to their distance
from the present, no useful recommendations can be made about them. As a result,
all of the attention is focused on unlikely outcomes, giving the work an absurd
and alarmist character, regardless of how well-meaning, mature, or rational the
practitioners or the process used to undertake the assessment.
When confronted with parents anxious about their kids getting involved with playing
Dungeons and Dragons, there's a turn of phrase that I have found succinct and useful:
"Don't worry. Role playing games don't make kids weird; they just attract weird kids."
In this case, discussion of emerging technologies before the fact of their arrival
actually seems to have the opposite effect. The singularity not only attracts weird
people - it actively deranges otherwise perfectly rational participants
by forcing them to take highly uncertain possibilities seriously.