Using Technology to Deepen Democracy, Using Democracy to Ensure Technology Benefits Us All

Saturday, October 20, 2007

Superlative Church

Upgraded and Adapted from Comments:

ONE

Me: "Do I have to remind you that you have responded in the past to some of my characterizations of Superlative outcomes as implausible by arguing not that they were wrong but that they constituted defamation against transhumanists like you?"

Giulio Prisco: Do I have to remind you that what I told you was that the TONE and LANGUAGE you used were unnecessarily conflictive and insulting -- not an argument about your ideas, just a remark about your lack of manners.

I must say that this seems a bit disingenuous to me. Of course you have castigated me for my tone and language and so on in the past many times, but that's hardly the substance of the discussion we've been having here.

Look, I am offering up rhetorical, cultural, and political analyses delineating general tendencies (oversimplification of technodevelopmental complexities, fixations on particular idealized outcomes, vulnerabilities to technological determinism and technocratic elitism, and so on) that seem to me to arise from certain Superlative and Sub(cult)ural ways of framing technodevelopmental problems.

Individual people who read such analyses and then complain that I am insulting them are, for the most part, finding in what I say a shoe that fits on their own and wearing it themselves. And that is simply not the same thing as me insulting them. It is basic incomprehension or cynical distortion to say otherwise (sometimes my critiques are of particular authors or particular texts, and the charge of personal insult could conceivably make sense in such contexts, but not when my analyses are general and when the categories I deploy name general tendencies and general social and cultural formations).

The fact is that you have actually compared your personal "transhumanist" identity, and in earlier exchanges with me your "Technological Immortalist" identity, to identity categories like being gay, or gypsy, and so on. Clearly these comparisons are designed to mobilize proper progressive intuitions about lifeway diversity in multiculture by analogy to persecuted minorities. I think this sort of analogy is wildly inappropriate, and perhaps your response here suggests that, upon further consideration, you have come to agree. Maybe you weren't entirely conscious of your rhetoric here?

As you know I am immensely interested in the politics and policy of ongoing technodevelopmental social struggle, and one of the things that troubles me enormously is that any shift from a deliberative/open into a subcultural/identity mode of technodevelopmental politics is going to be extremely vulnerable to mistaking critique for persecution, disagreement for defamation.

But how can one debate about a changing diversity of best technodevelopmental outcomes when some will feel threatened in their very identities by the prospect of a failure to arrive at their own conception of best outcomes? How can such subcultural identifications with particular outcomes comport with democratic intuitions that we must keep the space of deliberation radically open -- even as we struggle together to find our way to our provisional sense of best, fairest, safest, emancipatory outcomes -- so as always to remain responsive to the inevitable existing diversity of stakeholders to any technodevelopmental state of affairs, in the present now as well as in future presents to come?

This is why I stress that the anti-democratizing effects of Superlative and Sub(cult)ural Technocentrisms are often more structural than intentional: one can affirm democratic ideals and yet contribute to discursive subversions of democracy against the grain of one's affirmed beliefs in these matters. It completely misses the force of my point and the nature of my deepest worries to imagine that I am calling people anti-democratic names when I try to delineate these tendencies. If only it were so simple as a few anti-democratic bad apples! Such personalizations of the problem utterly trivialize the issues and stakes on my own terms, quite apart from the complaints of some of my conversational partners that their feelings have been hurt by what they see as unfair accusations or what have you.

None of this is to deny, by the way, that there are indeed explicit reactionaries and authoritarian personalities -- both gurus and followers -- to be found aplenty in Superlative Technocentric social formations. It is well documented that there are unusual numbers of both to be seen in these curious marginal spaces. And I have exposed and ridiculed these manifestations among the Extropians, Singularitarians, transhumanists, and so on many times before, and I will continue to spotlight and to ridicule them as they well deserve.

But my own sense is that it is the larger structural tendencies that preoccupy my own attention that make these formations strange attractors for some reactionaries, occasional authoritarians, legions of True Believers, and so on, rather than vice versa. And it is also true that these structural tendencies can yield their anti-democratizing effects just as well when Superlative and Sub(cult)ural Technocentrics have no explicit anti-democratizing intentions in the least.

Since you probably read all of those claims about general tendencies as personal insults in any case, it isn't entirely clear to me that you will have quite grasped the force of my critique by my lights, but such are the risks of interpersonal communication.

TWO

Me: "So you really think Superlative frames have no impact on your assessments of the significance and stakes of emerging genetic and prosthetic healthcare, nanoscale toxicity and sensors or current biotechnologies, security issues connected with networked malware today, cybernetic totalist ideology in contemporary coding cultures, and so on?"

Giulio Prisco: Yes. Why should I have written it otherwise? The timescales involved are quite different aren't they? The Robot God, the Eschaton or whatever you like have nothing to do with health care and network security (and civil rights, and world peace, and...), so why should I let the RG have any impact on my assessments here and now?

Well, needless to say, not all Superlative Technocentrics would agree with you that the timescales are that different, inasmuch as they finesse this problem through the expedient recourse to accelerationalism, whereby extreme or distant outcomes are rendered "proximate" by way of accelerating change, and even accelerating acceleration to really confuse matters and make the hype more plausible.

But setting all that aside, you simply can't have thought about this issue very clearly. Of course becoming wedded to Superlative outcomes influences your sense of the stakes and significance of technoscience quandaries in the present.

Much of the force of Jaron Lanier's cybernetic totalism critique, for example, derives from the way he shows that faith in the Superlative outcome of Strong disembodied AI becomes a lens distorting the decisions of coders here and now. Word processing programs will correct authorial "errors" that aren't errors in fact, substituting the program's "judgment" for the author's, in part because too many coders see this crappy feature through the starry-eyed anticipation of an AI that will actually have judgments, properly speaking.
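Lanier's complaint can be caricatured in a few lines of code. This is purely an illustrative sketch of my own (the dictionary, the matching rule, and the function names are hypothetical, not anything Lanier or the word processors in question actually use): an autocorrector that treats any word outside its tiny dictionary as an "error" and silently substitutes its own nearest match, flattening an author's deliberate coinage in the process.

```python
# Hypothetical sketch of an overzealous autocorrector: any word not in
# its dictionary is treated as an authorial "error" and silently replaced.
DICTIONARY = {"subcultural", "superlative", "technology"}

def common_prefix_len(a: str, b: str) -> int:
    """Length of the shared prefix of two strings."""
    n = 0
    for x, y in zip(a, b):
        if x != y:
            break
        n += 1
    return n

def autocorrect(word: str) -> str:
    """Return the word unchanged if 'known'; otherwise substitute the
    program's own nearest match -- its 'judgment' for the author's."""
    key = "".join(ch for ch in word.lower() if ch.isalpha())
    if key in DICTIONARY:
        return word
    return max(DICTIONARY, key=lambda w: common_prefix_len(w, key))

# The author's deliberate coinage is flattened into a dictionary word:
print(autocorrect("technodevelopmental"))  # -> technology
print(autocorrect("technology"))           # left alone
```

The point, as argued above, is not that such code is hard to write more carefully, but that a coder anticipating genuine machine judgment on the horizon is less likely to ask whether this crude substitution should be happening at all.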

The fears and fantasies of medicalized immortality crazily distort contemporary bioethical framings of genetic and prosthetic medicine here and now, all the time, and almost always to the cost of sense. Surely you agree with that, at least when the distortions involve bioconservative intimations of apocalypse and, as they like to handwave, "Playing God," arising from research and development into new genetic and prosthetic medical techniques to relieve people from suffering unnecessarily from now treatable diseases.

There are also incredibly energetic debates about whether the definition of "nanotechnology" will refer to current and proximately upcoming interventions at the nanoscale (and all their problems) or to more Superlative understandings of the term when public funds are disbursed or regulations contemplated.

So, of course your Superlative framing of technodevelopmental outcomes impacts your present perception of technodevelopmental stakes. I suspect that you are now going to walk back your claim yet again and try another tack altogether while claiming I have misunderstood you all along, correct?

THREE

Giulio Prisco: Note to Nick: I agree with "rooting for the Transhumanist Team is different from and secondary to actually trying to make the world better". This is not the issue here.

It seems to me that this IS INDEED no small part of the issue here.

I connect Sub(cult)ural Futurism to Superlative Technocentricity, inasmuch as a shared enthusiasm for particular, usually Superlative technodevelopmental outcomes is the bond that actually serves to maintain these subcultures. But the politics of subcultural maintenance in turn impose restrictions on the openness, experimentalism, flexibility of the technoscientific deliberation you can engage in without risk to the solidarity of the identity formation itself.

This is why so many Superlative and Sub(cult)ural Technocentrics can constantly pretend that the future is going to be wildly different from the present and wildly soon, and yet the Superlative Sub(cult)ural vision of the future itself, from its depiction in Regis's Great Mambo Chicken, to Stiegler's "The Gentle Seduction," to Peterson and Drexler's Unbounding the Future, to Alexander's Rapture, to the latest pop futurological favorites in the Superlative mode, simply reproduces virtually the same static vision, over and over again, calling attention to the same "visionaries," the same promissory technologies (strong AI, ubiquitous automation, virtual reality, cryonics, nanotechnology, genetic medicine, and often "uploading" personality into information after first discursively reducing it to information already), the same appeals to "superhuman" capacities, technological immortality, personal wealth beyond the dreams of avarice and post-political abundance in general (usually framed in a way that appeals to neoliberal/libertarian anti-political intuitions), the same seductive conjuration of the conventional omni-predicates of theology -- but this time personalized and prostheticized -- omnipotence, omniscience, omnibenevolence, the same scientistic championing of reductive totalizing explanations coupled with anti-intellectual pillorying of every other kind of thought -- poetic, cultural, political -- and on and on and on.
For two decades and counting the vision has remained unchanged in its essentials, including the insistence on utter, totalizing, accelerating, transcendentalizing change, usually uttered in the tonalities of ecstasy or of dread, a prophetic declaration of transformation that never seems to transform itself, of change that never seems to change, a static, dwindling, tired repetition of platitudes in the midst of a planetary technodevelopmental disruption (corporate precarization, catastrophic climate change, rampant militarization, emerging peer-to-peer network formations, emerging Green movements demanding decentralized nontoxic sustainable appropriate techs, emerging non-normativizing anti-eugenicist movements to democratize medical research, development, and provision, etc.) to which static Sub(cult)ural Superlativities seem to respond less and less in substance.

Superlative Sub(cult)ural Technocentrisms are too much like straightforward faiths, with a particular heaven in mind and a few Churches on hand with marginal memberships. And, as I keep saying, as esthetic and moral formations faithful lifeways seem to me perfectly unobjectionable even when they are not my own cup of tea. What is troubling about Superlativity is that its faithful seem curiously cocksure that they are champions of science rather than True Believers in the first place, which makes them ill-content to confine themselves to their proper sphere, that is offering up to their memberships the moral satisfactions of intimate legibility and belonging as well as esthetic pathways for personal projects of perfection. They fancy themselves instead, via reductionism, to be making instrumental claims that solicit scientific consensus, or, via moralizing pan-ideology, to be making ethical claims that solicit universal assent. (For a sketch of my sense of the different modes of rationality in play here see my Technoethical Pluralism.)

This sort of delusion is common enough in variously faithful people (especially in the fundamentalist modes of belief that Sub(cult)ural Futurisms seem so often to typify) and would hardly qualify as particularly harmful or worthy of extended attention given the incredible marginality of these formations -- an abiding marginality that has remained unchanged for decades, after all. But Superlative Technology Discourses seem to me to have an enormous and disproportionately influential media megaphone deriving -- on the one hand -- from their symptomatic relation to much broader fears/fantasies of agency occasioned by contemporary technodevelopmental churn and -- on the other hand -- from their rhetorical congeniality to neoliberal assumptions that serve incumbent corporate-military interests. That is why I devote so much of my own attention to their exposure and analysis.

17 comments:

Anonymous said...

Me: "So you really think Superlative frames have no impact on your assessments of the significance and stakes of emerging genetic and prosthetic healthcare, nanoscale toxicity and sensors or current biotechnologies, security issues connected with networked malware today, cybernetic totalist ideology in contemporary coding cultures, and so on?"

GP: Yes. Why should I have written it otherwise? The timescales involved are quite different aren't they? The Robot God, the Eschaton or whatever you like have nothing to do with health care and network security (and civil rights, and world peace, and...), so why should I let the RG have any impact on my assessments here and now?


Well, not all Superlative Technocentrics would agree with you that the timescales are that different, inasmuch as they finesse this problem through the convenient recourse to accelerationalism, whereby extreme or distant outcomes are rendered "proximate" by way of accelerating change, and even accelerating acceleration to really confuse matters and make the hype more plausible.

But setting all that aside, you simply can't have thought about this issue very clearly. Of course becoming wedded to Superlative outcomes influences your sense of the stakes and significance of technoscience quandaries in the present.

The force of Jaron Lanier's cybernetic totalism critique, for example, derives in large part from the way he shows that faith in the superlative outcome of strong AI becomes a lens distorting the decisions of coders here and now. Word corrects "errors" that aren't errors, substituting its judgment for yours because coders see this crappy feature through the starry-eyed anticipation of an AI that will actually have judgments.

The fears and fantasies of medicalized immortality crazily distort contemporary bioethical framings of genetic and prosthetic medicine here and now, all the time, and almost always to the cost of sense. Surely you agree with that, at least when the distortions are bioconservative.

There are incredibly energetic debates about whether the definition of "nanotechnology" will refer to current interventions at the nanoscale or to more Superlative understandings of the term when public funds are disbursed or regulations contemplated.

So of course your Superlative framing impacts your present perception. I suspect that you are now going to walk back your claim yet again and try another tack altogether while claiming I have misunderstood you all along, correct?

-----

The Robot God can't change his opinion of whether peace is good, since it has nothing to do with peace. Depending on what you mean by Robot God, it might not even affect his view of most other things.

Even if it did affect his view of abortion, stem cell research, nanotechnology, etc., his opinion would not be very different from those of many other people. The only things that he might disagree with nearly everybody on are not currently in the public spotlight (AI, life extension, etc.).

Your tone is making it hard to figure out exactly what you are saying. Right now, everybody seems to be arguing past each other, with nobody making relevant arguments. Dale is saying that transhumanism is a cult that threatens democracy and that the things predicted by transhumanists can't happen. I think I remember him explaining why transhumanism is bad for democracy, but he has never explained exactly why transhumanists' predictions are completely wrong. He also has not explained why being like a cult is bad. A group isn't good or bad because of its organization; it is good or bad because of what it is doing (not its intentions or what it is trying to do).

jimf said...

Eric Yu wrote:

> Right now, everybody seems to be arguing past each other. . .

That's true enough.

> . . .with nobody making relevant arguments.

That's not true.

> Dale is saying that transhumanism is a cult that threatens
> democracy. . .

Something like that. "A" cult that threatens democracy is
a bit misleading -- **all** cults are authoritarian (encourage
the followers to relinquish independent thought and judgment)
and hence are anti-democratic.

> . . .and that the things predicted by transhumanists can't happen.

No, Dale never said that.

> [H]e has never explained exactly why transhumanists' predictions
> are completely wrong. . .

He has suggested that the transhumanists' predictions do not warrant
the rather hysterical fervency with which they are believed --
a fervency bound up with expectations of both paradisiacal
and apocalyptic outcomes.

> He also has not explained why being like a cult is bad.

Fortunately, he doesn't have to. There's plenty of literature
about that. I'd recommend Joel Kramer & Diana Alstad,
_The Guru Papers: Masks of Authoritarian Power_
http://www.amazon.com/Guru-Papers-Masks-Authoritarian-Power/dp/1883319005

There's also a book folks were expected to read in high school,
back in my day: Eric Hoffer, _The True Believer: Thoughts on
the Nature of Mass Movements_
http://www.amazon.com/True-Believer-Thoughts-Movements-Perennial/dp/0060505915

> A group isn't good or bad because of its organization; it is good or bad
> because of what it is doing (not its intentions or what it is trying to do).

Like hell it is.

Giulio Prisco said...

Dale: "The fact is that you have actually compared your personal "transhumanist" identity, and in earlier exchanges with me your "Technological Immortalist" identity, to identity categories like being gay, or gypsy, and so on. Clearly these comparisons are designed to mobilize proper progressive intuitions about lifeway diversity in multiculture by analogy to persecuted minorities. I think this sort of analogy is wildly inappropriate, and perhaps your response here suggests that, upon further consideration, you have come to agree. Maybe you weren't entirely conscious of your rhetoric here?"

I believe I am aware of my rhetoric. My point is not comparing transhumanists and gays (these are two non-overlapping identities and one does not exclude the other, I hope), but comparing your RobotGodphobia with the homophobia of some gay haters.

In some political movements, unfortunately not only right-wing ones, gays have been discriminated against and excluded from political initiatives _on the basis of their sexual preferences alone_. Which is, of course, bullshit, because one's preferences in the bedroom do not impact one's value and effectiveness as a political activist. This is the analogy I am making: I think being or not being a transhumanist has no impact on one's value and effectiveness as a political activist, and therefore I am comparing you to those homophobes.

Of course I do not mean this as an analogy between the _degrees_ of personal suffering involved. At this moment in time you do not have the power to discriminate against me, so I can take this as only an interesting discussion over a relatively abstract issue. I am perfectly aware that often persecuted minorities do not have this luxury as, unfortunately, those who discriminate against them do have the power to do harm. In this sense, the analogy is wildly inappropriate.

But the fact remains that, even if with very different degree and impact, we are still talking of the same attitude: excluding people from politics on the basis of their personal preferences on unrelated things.

And by the way, I just hate those programs that automatically "correct" what I write. I always disable this feature, and if I cannot disable it I just do not use the program. This despite my confidence that a real AI will be developed relatively soon.

G.

jimf said...

Giulio Prisco wrote:

> My point is not comparing transhumanists and gays. . ., but comparing your
> RobotGodphobia with the homophobia of some gay haters.

This sounds exactly like Tom Cruise or Kirstie Alley comparing
critics of Scientology to anti-Semites and other religious bigots.

As Eleanor says to King Henry in _The Lion in Winter_, "I am not
moved to tears."

> I just hate those programs that automatically "correct" what I write. . .
> despite my confidence that a real AI will be developed relatively soon.

Ah, "relatively". Is that "relatively" as in "The early Christians
were expecting Christ to come again relatively soon." or as in
"In the history of life on Earth, once eukaryotic cells came on
the scene, multicellular organisms, nervous systems, and intelligence
all appeared relatively soon."?

One wonders at the source of this confidence. What could it
be that you see that, e.g., Bruce Sterling does not?

"I'm a hard-AI skeptic -- not because, you know,
I'm, like, convinced by the Searle arguments or
so forth, but just because I don't see much evidence
on the ground. I think it's been long enough now
that if we were approaching that there would,
there would clearly be some line of progress
that would get us there, and we've sort of tried
it from the top down, and we've tried it from
the bottom up, and it's just not goin' -- I, I think
it's a bad metaphor. . .

I do find AI very interesting from a literary
perspective, and I, I've often written about it
in fiction, but I don't really see much evidence
of it in the, in the real world. . ."

-- Bruce Sterling, "The Singularity:
Your Future as a Black Hole"

(Eighth "Seminar About Long-Term Thinking"
[ http://www.longnow.org/10klibrary/Seminars.htm ]
given by the Long Now Foundation at
Fort Mason Center, San Francisco,
11 June 2004)

jimf said...

Giulio Prisco lamented:

> In some political movements, unfortunately not only right-wing, gays
> have been discriminated and excluded from political initiatives
> _on the basis of their sexual preferences alone_.

BTW, I would not lean too heavily on this analogy if I were
you for an additional reason -- namely, that the overall tone,
the "syntality", of the transhumanist community is **far**
from congenial to homosexuals.

Maybe it's the left-over Objectivism, or the anti-Left
(or pro-Right) leanings, but there's a whiff of hostility (or
perhaps just icy indifference) to the whole topic. Just as there's
an icy indifference to the difficulties of other minorities
among the >Hists (why bother about those, after all, when
the big S is looming on the horizon?)

As an example with respect to gays, I was rather depressed
to come across the following record of an ImmInst chat with
special guest G. Stolyarov II from a few years ago. The
subject was ostensibly "Objectivism & Immortality",
but a good portion of the chat seemed to be hijacked
by the topic of homosexuality, and **nobody** from
ImmInst stepped forward to challenge Stolyarov's
unfortunate views on the subject:

http://www.imminst.org/forum/index.php?s=&act=ST&f=63&t=4362

Dale Carrico said...

Giulio: You say you are not comparing your "Robot God Cultist" identity with the identities and identity-politics of persecuted minorities of gay folks, and such, but you only mean to compare what you describe as my Robot God Phobia to homophobia.

Your response conforms exactly with my critique. Do you not see that?

You are confusing disagreement with your ideas about AI with defamation of your identity as someone who clings to those ideas sub(cult)urally. Exactly as I said you would.

I'm not Robot God Cult phobic (although I guess it is true to say that I find some of the things Robot God Cultists say and do a bit scary), but more to the point, surely, it is simply true that there are no Robot Gods here on Planet Earth after all, and I see no respectable work (either practical or theoretical) on the ground supporting the notion that Robot Gods are in the works any time soon enough to devote much in the way of energy to their contemplation, given the other urgent work to be done in the world.

This is why sub(cult)ural futurisms are a terribly bad idea if what one is looking for is foresight, as I keep hammering on about.

Now if one is looking to indulge in collective wish-fulfillment, or to stave off anxieties occasioned by ongoing technodevelopmental change through True Belief, then, sure, the Robot God Cult thing might be a way for one to go. That's just not my cup of tea, having little temperamental disposition to groupthink or authoritarian hierarchical rituals and such.

But, be all that as it may, you can be damned sure that such cultisms and movement sub(cult)ures are not something I'll help people pretend are really, truly modes of practically useful or politically democratizing deliberation. Because they simply are not, and they are usually, in fact, the opposite. That's my position. Yet again.

You also say you agree with me (and Lanier) that dumb software programs that override user judgment with "their judgment" are annoying, but then you offer up the declaration of faith that all this will get better relatively soon, thereby exhibiting precisely the cybernetic totalist behavior Lanier predicted, and which was the actual reason I offered up that example as one among many showing how Superlative fancies derange contemporary assessments of the stakes and significances of the technodevelopmental terrain.

You said you didn't believe that this was a danger for you. Obviously, it is. But I knew that already.

gp said...

Dale: "You are confusing disagreement with your ideas about AI with defamation of your identity as someone who clings to those ideas sub(cult)urally".

Not at all -- I have always considered them as two different things. I find it interesting to discuss your ideas here, but object to your using language (robot god cult etc.) that I find offensive.

Like you did when I referred to your arguments as something like "abstract hair-splitting and mental masturbations of over-intellectualizing and self-righteous library worms". I have stopped using these terms because I don't believe insulting others is the best approach to constructive debate.

Of course I would prefer if you could try showing the same respect to me, but wtf, use the terms you prefer. Believe it or not, I am doing my best to try understanding your arguments, and I think I do understand some. I just don't agree.

"You also say you agree with me (and Lanier) that dumb software programs that override user judgement with "their judgment" are annoying, but then you offer up the declaration of faith that all this will get better relatively soon"

I did not offer any declaration of faith, I offered my opinions formed after 30 years in the IT world. If what I wrote is a declaration of faith, then my English is not as good as I thought. For the record, I don't _believe_ in anything. This does not mean that I cannot be _reasonably confident_ in specific outcomes.

jimf said...

> I offered my opinions formed after 30 years in the IT world. . .
> For the record, I don't _believe_ in anything.

That's hard to believe. ;->

> This does not mean that I cannot be _reasonably confident_
> in specific outcomes.

I dunno. I've been computering almost as long as you, and
for me the kool-aid has lost its savor. I got all excited by
Moravec's _Mind Children_ back in '88 (when I was working for
the NYU Robotics Lab), and even as late as '99 I was taking
books like _The Age of Spiritual Machines_ and _Robot: Mere
Machine to Transcendent Mind_ seriously.

But I had also read books by Gerald Edelman about the brain,
and when I actually came to rub shoulders (so to speak) with
the >Hists I could no longer take too many of them seriously
as intellectuals. Or even as folks with a modicum of humor
and common sense.

Instead, there was altogether too mucha what sounded like
this kind of thing:

"In this post I am looking back at the sequence of events which
attracted me in scientology. I am retrieving old goals; a few were
attained, many of them were not; but they are still alive, and this is
an occasion to postulate them again in present time.

It all started by reading the novel from A.E. Van Vogt: _The World of
Null-A_. I was enthralled by this book. Here I became aware of several
goals I had, which were expressed in this book:

- A technology able to remove the aberrations of man (general semantics)
- A solution for immortality
- Advanced abilities: teleportation, telepathy
- Contacting extra-terrestrial civilizations
- An organized teaching and training system designed to attain these
goals
- Building a new civilization without war and crimes

I had the idea to study the general semantics, but after that I was
reading a biography of Van Vogt, which indicated his interest in
Dianetics; eventually I found this book, and this was the beginning of
my road in scientology.

I received a scientific education, and had many doubts about religions.
I was hoping that spiritual abilities did exist, but I was not
satisfied by just hopes and never certainty. I was looking for proofs.
For example I had a project to do scientific experiments in a haunted
house. But I canceled this project when I discovered scientology.

. . .

With the discovery of scientology, I formulated a new set of goals:

1 - Freedom from unwanted reactions and emotions, and from past painful
experiences
2 - Retrieve my past lives
3 - Being exterior with full perceptions
4 - Immortality as a conscious being (going into the next life without
amnesia)
5 - Ability to heal other people
6 - OT abilities (teleportation, etc.)

Eventually I did quit the scientology path for different reasons.
Especially the high prices. . .

But if I have to choose a main goal, this is the goal of
immortality as a conscious being. . . Without this preliminary
goal, it is impossible to do long term planning."
http://groups.google.com/groups?selm=8md0lp%24l9d%241%40nnrp1.deja.com


Especially the high prices. Oy.

jimf said...

BTW, back in '97 when I discovered Eliezer's _Staring into the Singularity_,
I shared it with a woman at work. I later found out from a mutual
acquaintance (she didn't say this to me directly) that her reaction had
been "What is this, some kind of Scientology front?"

**My** reaction, at the time, was "how could she possibly
have gotten **that** idea?"

Now I know. Cheers, Dyanne, you were right on the ball.

Anne Corwin said...

Regarding comparing superlative-technophilia (or whatever you want to call it) with, say, homosexuality in terms of the possibility of being marginalized and/or undeservedly maligned, I think it's worth looking at the power dynamics involved.

Gays, ethnic minorities, etc. frequently come (or originally came) from positions of fairly little power. Civil rights movements first and foremost concern people who are in danger of being physically attacked, institutionalized, or perhaps even enslaved on the basis of their orientation, skin color, neurology, etc.

Of course, a person's level of relative power can depend upon the environment they happen to be in -- e.g., a member of an overall minority can still experience situations where s/he is a member of a "local" majority. But there are still the overall majorities, and overall groups that enjoy power by virtue of some other attribute than sheer population (such as money or political clout).

So, as unpalatable as this notion might be to some, I would say that apparent "double standards" are justified when a person (or group) is comparatively lacking in power.

It's not as if transhumanist-identified folks are being physically attacked or intimidated to the point of being in danger of PTSD for the mere fact of being transhumanists (and I'm saying this as someone who still sees value in transhumanism as a kind of affiliative term).

Sexual, ethnic, morphological, and neurological minorities, however, are still sometimes persecuted to the point of having to fear for their lives.

This is why you will sometimes see things that look like "identity" movements coming from such groups at times, and why sometimes this is entirely appropriate -- while the "identity" facet of activism may end up being merely a temporary tool that eventually fades into redundancy for any given group, it can be an important tool that allows such groups to compensate somewhat for that lack of culturally ingrained power.

"Transhumanism" might be a minority viewpoint (though I'd suspect Dale's notion of the "emerging technoprogressive mainstream" probably encompasses a lot of people who might, if they came across a description of transhumanism, shrug and say, "sure, why not?") but transhumanists by and large tend to be members of fairly powerful groups.

Many of us are white, middle-to-upper class, financially secure, educated individuals who get to yammer endlessly on mailing lists and blogs because we don't have to spend our time tilling the fields, defending ourselves physically, or dealing with working three jobs to afford a tiny living space.

I'm not saying we should feel guilty about this -- I don't think that's very useful -- but we do need to acknowledge privilege where it exists. When you don't acknowledge the ways in which you yourself are privileged, you run the risk of looking at a minority or less-powerful group's civil-rights activities and feeling that they are somehow getting a special privilege you aren't.

I've seen this before from, for example, straight people who complain that they should be able to have "straight pride" parades without criticism because gay people can have "gay pride" parades.

But when was the last time someone got lynched or beaten up for being straight?

Knowing the ubiquity of violent tendencies in humanity, I suspect this has probably happened at least once, but it certainly happens far less frequently than the reverse.

Similarly, I know of no-one who has been beaten up for aligning with "cybernetic totalism" (though I do think there's something to be said for defending the rights of nerds as a kind of cultural/neurological minority, seeing as many of us have been beaten up or had our lunch money stolen by popular bullies).

But -- even so, that's different from defending one's "right" to publicly embrace totalism or superlative "true belief" without anyone ever critiquing you in terms you find rude, or suggesting that people who engage in such critiques are engaging in ad hominem attacks.

The Cult-That-Shall-Remain-Nameless-On-The-Grounds-That-I-Don't-Want-To-Get-Sued (the one started by that guy who wrote sci-fi books and which now attracts a lot of celebrities) manages to harm people in part because it presents itself as a "persecuted" group when it is actually one that is extremely powerful and able to mobilize massive legal and financial resources when it feels threatened.

Obviously, "transhumanism" isn't like that. It's more diffuse, more varied, more permissive (and even encouraging) of internal disagreement, and not lorded over by any single "figurehead".

But there is still something to be said for acknowledging that most transhumanists are privileged in particular ways that should make us wary of being overly sensitive to criticism of superlative tendencies and claims.

jimf said...

Dale wrote:

> [T]he politics of subcultural maintenance. . . impose[s] restrictions
> on the openness, experimentalism, flexibility of the technoscientific
> deliberation you can engage in without risk to the solidarity of the
> identity formation itself.
>
> This is why so many Superlative and Sub(cult)ural Technocentrics can
> constantly pretend that the future is going to be wildly different from
> the present and wildly soon, and yet the Superlative Sub(cult)ural vision[s]
> of the future itself. . . reproduce virtually the same static vision,
> over and over again. . .

Yes, this is frustrating. And that static vision **is already
out of date!**

I attempted to express this frustration a few years ago:

There's. . . a great deal of wishful
thinking (leading to a de facto, largely
unexamined, collective orthodoxy)
concerning what **kinds** of technology
are going to lead to the great Rapture.
Digital computers, for one thing. In a
way, that's not surprising, since so many
of the transhumanists and singularitarians
are computer programmers. But the **assumption**
that AI, if it happens, **must** be
based on digital computer technology
seems to me to be a premature one,
growing less and less likely
the more we learn about the brain. . .
This gives the transhumanist
and singularitarian communities a
peculiarly retro feel, as if they're
stuck in the 1950s. The good old days
of L. Ron Hubbard, A. E. Van Vogt, and
Robert A. Heinlein.

. . .

Since most of [my] points. . . [have been] negative
ones, I've decided
to attempt a more positive list as well.

1. Just based on current technology, the
networking together of billions of human beings
over the globe is going to transform society
in ways only dimly imagined at present. This will
probably end up as a qualitative transformation
of human life -- a Singularity, if you will -- all
by itself. Robert Wright, in _Nonzero: The Logic
of Human Destiny_
( http://www.amazon.com/exec/obidos/tg/detail/-/0679758941/ ),
has described how increases in population density,
and new technologies of communication, have
impelled technology (and the cultures and
civilizations that rely on it) into realms of
increased complexity. Plausible enough, whether or not
you believe the teleological view of history put forth
in that book.

2. One should cultivate an attitude of
expecting the unexpected. Not just unexpected
**disasters**, but unexpected new directions
in technology, too. In 1968, when Kubrick and
Clarke made the movie _2001: A Space Odyssey_,
everybody was still thinking of computers
as huge, centralized, and expensive
appliances. Despite the Newspad portable
flat TV aboard Discovery (with its 90-degree-wrong
aspect ratio, but that was probably because
it was being used at the dinner-table in the
scene it appeared in ;-> ), nobody thought to
generalize that sort of miniaturization to
computers themselves. While they were waiting for
HAL, **everybody** was taken by surprise by the
Internet and the Web. Even the folks (like SF authors)
you might have expected to foresee
such a thing (ironically, E. M. Forster came
close to getting it right in his remarkable
1909 story "The Machine Stops").

3. One should resist the temptation to force
premature **closure** about any aspect of the
Singularity. Especially basing all one's
expectations on this or that technology du jour
and then taking them oh-so-terribly seriously.
For one thing, it puts the cart before
the horse for anyone who considers
verself a **scientist** to get that wrapped up
in how things **must** be, or **must** happen,
if the sky is to be prevented from falling, or
if we are to get personal immortality in time for me
or my friends or my family to live forever,
rather than in how things actually **are**.

For example -- I was always struck by the **party-line**
reactions on the Extropians' mailing list
to the question of whether the universe
is simulable, in principle, by a digital
computer. Yes, digital implementations have
advantages over messy "analog" ones (as has been
argued to death in the decades-long CD vs. LP
debate) -- you can correct errors, and stop the
clock and read out the precise state of a device.
Also, a digital implementation is an abstract
machine that frees you from the actual physical
substrate. But folks got so **angry** if you
suggested that the world might not, after all, be --
at bottom -- digital. They thought you might as well
be telling them that the Singularity -- and the
"party at the end of time" -- had been cancelled.

My reaction to that bristling was always an
amused "so what?" Yeah, it'd be inconvenient,
by the standards of what we know now, but maybe
**not** by the standards that will prevail
closer along toward the Singularity. Do you
think the 18th-century French philosophes would
have thrown tantrums to learn that mechanical
clocks and gears would not be used in future
calculators? (Or could have believed a cursory
description of how an integrated circuit works?)
The alternative is to maintain a certain deliberate
**distance** from **everything** that counts
as "state of the art" today. It's a kind of
lateral thinking, and I suspect this is difficult
for the somewhat literal-minded types attracted
to transhumanism in the first place.

4. One should foster an attitude of trust and
inclusiveness toward the rest of the human race.
I suspect this is also hard for the personality
types who are attracted to transhumanism in the first
place (most of whom seem to have a touch of
Friedrich Nietzsche, or at least Ayn Rand,
in their makeup). Get over the goddam elitism,
already. **I** see the Singularity as a force
of nature, no more alterable (if it's going
to happen at all) than the course of the Moon.
I do **not** see a small group of fanboys steering
or controlling it in any significant way (except
by putting the idea itself in bad odor).

5. Avoid selling anything -- nanotechnology,
computers, or the Singularity, as a religion-
substitute. You're bound eventually to disappoint
yourself, and everybody else, by taking that tack.
To say nothing of the fact that it's dangerous --
it can engender the sort
of hysteria and fanaticism that folks
have always been prone to when they think
that religious-scale (cosmic, eternal)
goals are at stake.

oops, here I go again with the negatives.
Guess that means I'm done.
-----------------

"If there's one rule to this game
Everybody's gonna name
It's 'be cool'. . .

Charm 'em
Don't alarm 'em
Keep things light
Keep your worries out of sight
And play it cool
Play it cool
Fifty-fifty
Fire and ice

. . .

Don't get over-zealous
Keep your cool. . .

Be cool fool
Be cool. . ."

--Joni Mitchell, "Be Cool",
in _Wild Things Run Fast_

jimf said...

Anne Corwin wrote:

> The Cult-That-Shall-Remain-Nameless-On-The-Grounds-That-I-Don't-Want-To-Get-Sued. . .
> manages to harm people in part because it. . . is extremely powerful and able to
> mobilize massive legal and financial resources when it feels threatened.
>
> Obviously, "transhumanism" isn't like that.

Not **yet**, anyway.

There aren't currently any >Hist-identified organizations
(WTA, SIAI, ImmInst, etc.) with the resources to behave toward
perceived enemies as the Scientologists have.

If Dale's experiences are any indication, though, the litigious
**impulse** is certainly there.

Maybe if more "silicon-valley wealthoids" get mixed up in >Hism,
we'll see lawsuits as well.

> It's more diffuse, more varied, more permissive (and even encouraging)
> of internal disagreement. . .

That hasn't been my experience, even when I was "politer" than I
am now. Even apart from ostensibly technological issues, the
political straitjacket is pretty rigid. Try being anything but a
libertarian in >Hist circles, and there's no mistaking you're
swimming against the stream.

> . . .and not lorded over by any single "figurehead".

Oh, there are certainly Lord Wannabes. "Figurehead" included. :-/

BTW, to the extent that >Hism shares the spirit of a "self-help"
movement -- and Extropianism certainly did so, quite explicitly, with
its espousal of "dynamic optimism" and its recommendation of
self-esteem guru (and erstwhile Ayn Rand "intellectual heir")
Nathaniel Branden's books -- it's also very much in the tradition
of Scientology as a sort of "pop-psychology"
substitute for professional therapy, which makes both the
Scientologists' and the >Hists' disparagement of psychology as
"pseudo-science" all the more ironic. Although with >Hism, the
"therapy" seems to consist in trying to think and act how you
imagine an AI might think and act, while looking forward to being
able to tinker with your own "code".

A Web commentator notes: "[L. Ron Hubbard] claimed that
"Dianetics," little more than a recycling of Freudian theory with a
bunch of gratuitous neologisms thrown in, was mankind's greatest
breakthrough since the harnessing of fire! [He]. . . appeal[ed] to. . .
the authority of the all-seeing psychologist.

This may sound strange today, but fifty years ago, psychology seemed
almost magical in its ability to pierce the secrets of the soul."
http://www.lermanet.com/cos/lordoftheflies.htm

OK, so today the "magic" inheres in computers.

But the market for self-help has certainly been big enough to
accommodate several newer-generation Hubbard wannabes. Werner Erhard is
one well-known example:

"Werner Erhard initially had a positive response to his education
in Scientology beliefs and practices in the 1960s. He purchased books
from the Church of Scientology and reached the Scientological level
of "Grade II". Corydon and DeWolf quoted Erhard as characterizing
L. Ron Hubbard as an esteemed philosopher.

Erhard later decided to have some of his staff at Mind Dynamics enroll
in Scientology communications coursework as a way to train them in
sales-techniques. When Erhard started his own group, Erhard Seminars Training,
he incorporated portions of Scientology practices into his training,
initially including the usage of the e-meter. Erhard had hired Scientologists
in order to develop these techniques as his own. L. Ron Hubbard and the
Church of Scientology did not take kindly to this usage of their materials
without their permission. In 1992, the Church of Scientology listed Erhard
as a 'Suppressive Person'."
http://en.wikipedia.org/wiki/Scientology_and_Werner_Erhard

"The titular head of Landmark Education today is Harry Rosenberg, but it
was his brother 'Werner Erhard' previously known as Jack Rosenberg,
a high-school graduate and former used car and encyclopedia salesman,
who created the seminar 'technology' touted by the company.

EST, something of a craze in the 1970s, drew endorsements from celebrities
such as sitcom star Valerie Harper ('Rhoda') and singer John Denver.

Forbes Magazine dubbed Werner Erhard a 'millionaire guru.'

But the programs Mr. Erhard devised were soon associated with and/or
linked to 'psychiatric disturbances' and 'psychosis.' Amidst extensive and
unfavorable media coverage he sold EST in 1991 to employees, who then
formed the current company Landmark Education."
http://www.cultnews.com/?cat=68

One of the newer spin-offs in this succession of self-help-for-hire businesses
(known generically these days as LGATs -- "Large-Group Awareness Training")
is Keith Raniere's "NXIVM":

"Keith Raniere says he conceptualized a practice called 'Rational Inquiry'
at the age of 12 while reading _Second Foundation_ by Isaac Asimov.

The premise of the science fiction series is that a mathematician forecasts
the end of civilization and devises a plan to shorten the period of barbarity
before a new civilization is established.

Rational Inquiry, a formula for analyzing and optimizing how the mind
handles data, as Raniere describes it, is the basis for NXIVM
(pronounced NEX-ee-um), a multimillion-dollar international company. . ."

http://www.rickross.com/reference/esp/esp32.html
-------------------

"Keith Raniere's devoted followers say he is one of the
smartest and most ethical people alive. They describe
him as a soft-spoken, humble genius who can diagnose
societal ills with remarkable clarity. . .

His teachings are mysterious, filled with self-serving
and impenetrable jargon about ethics and values, and defined
by a blind-ambition ethos akin to that of the driven
characters in an Ayn Rand novel. His shtick: Make your own
self-interest paramount, don't be motivated by what other
people want and avoid 'parasites' (his label for people
who need help); only by doing this can you be true to
yourself and truly 'ethical.' The flip side, of course,
is that this worldview discredits virtues like charity,
teamwork and compassion--but maybe we just don't get it."

http://www.rickross.com/reference/esp/esp31.html
-------------------

**All** of these outfits have shown a propensity toward
litigiousness. Rick Ross, who maintains a cult-awareness
Web site, has been sued by both Landmark Education and
NXIVM:

"A new lawsuit filed by NXIVM. . . claims it is losing members and
money because of negative publicity characterizing it as a cult. . .
Actions and statements by the defendants have caused the. . .
company, also known as ESP, to lose such things as the support of
prominent members, $10,000 per day in revenue and a speaking
engagement by actress Goldie Hawn. . .

'We've had two billionaires retract their support from us because of
untrue publicity,' said Keith Raniere, the founder of ESP, in his first public
statements on the company's legal maneuverings. He said that more
than 20 billionaires had taken his courses. . .

Raniere, 43, is known as 'Vanguard' to NXIVM students. The group's
more than 3,700 students worldwide wear special scarves denoting
their rank and must bow to Raniere and NXIVM President Nancy Salzman,
dubbed 'Prefect', before and after long seminars, known as 'intensives' . . .

Raniere. . . said his goal was to build a successful center where 'the honor
system is a viable alternate for human beings.'

He said he imagines a world in which people walk into employee-less
stores in the middle of the night and dutifully drop off money for the
products they want.

The word NXIVM is a contraction of the words next millennium
and nexus, Raniere said."

http://www.rickross.com/reference/esp/esp19.html
-------------------

>Hism certainly shares something of the Circuit Chautauqua revival-tent
atmosphere of the LGATs. Dale has compared it with Amway-style get-rich-quick
gimmicks, and I can certainly see the resemblance. Used-car and door-to-door
encyclopedia salesmen indeed. Shades of Elmer Gantry and the Music Man.

Love is the morning and the evening star.

We've got trouble, right here in River City.

Michael Anissimov said...
This comment has been removed by the author.
Michael Anissimov said...

Maybe "superlative" technologies have a media megaphone because many educated people find these arguments persuasive. If a majority decides to allocate research funds towards Yudkowskian AGI and Drexlerian MNT, who would you be to question the democratic outcome? Because that is what is likely going to happen in the next couple decades.

jimf said...

Michael Anissimov wrote:

> If a majority decides to allocate research funds towards
> Yudkowskian AGI and Drexlerian MNT, who would you [That's **you**,
> presumably, Dale; I'm beneath notice. "What the Klingon has
> said is unimportant, and we do not hear his words." ;-> ]
> be to question the democratic outcome?

-------------------------------
Jean Brodie: How **dare** you speak to me in this manner!

Sandy: I suppose I've always known that one day
you were going to ask 'how **dare** I'?

-- _The Prime of Miss Jean Brodie_
-------------------------------
http://www.script-o-rama.com/movie_scripts/p/prime-of-miss-jean-brodie-script.html

> Because that is what is likely going to happen in the next couple decades.

Got a high-definition crystal ball, have we?

"We will control the horizontal; we will control the vertical."
;->

Well, I suppose with high-energy cheerleaders like you
(Duplicate Boy? Triplicate Girl?), they can't miss.

Dale Carrico said...

Michael Anissimov said: Maybe "superlative" technologies have a media megaphone because many educated people find these arguments persuasive.

There is no question at all that many educated people fall for Superlative Technology Discourses. It is very much a discourse of reasonably educated, privileged people (and also, for that matter, mostly white guys in North Atlantic societies). One of the reasons Superlativity comports so well with incumbent interests is that many of its partisans either are or identify with such incumbents themselves.

However, again, as I take pains to explain, even people who actively dis-identify with the politics of incumbency might well support such politics inadvertently through their conventional recourse to Superlative formulations, inasmuch as these lend themselves to anti-pluralistic reductionisms, elite technocratic policies, naturalization of neoliberal corporate-military "competitiveness" and "innovation" as the key terms through which "development" is discussed, vulnerability to hype, groupthink, and True Belief, and so on, all of which tend to conduce to incumbent interests and reactionary politics.

If a majority

Whoa, now, just to be clear: The "many" of your prior sentence represents neither a "majority" of "educated" people (on any construal of educated), nor a "majority" in general.

If a majority decides to allocate research funds towards Yudkowskian AGI and Drexlerian MNT, who would you be to question the democratic outcome?

Who would I be to question a democratic outcome? Why, a democratic citizen with an independent mind and a right to free speech, that's who.

I abide by democratic outcomes even where I disapprove of them from time to time, and then I make my disapproval known in the hopes that the democratic outcome will change -- or, if I fervently disapprove of such an outcome, I engage in civil disobedience and accept the criminal penalty involved, affirming the law while disapproving the concrete outcome. All that is democracy, too, in my construal of it.

In the past, Michael, you have claimed to be personally insulted by my suggestion that Superlative discourses have anti-democratizing tendencies -- you have taken that claim as equivalent to the accusation that Superlative Technocentrics are consciously Anti-Democratic, which is obviously not at all logically implied in the claim (although the evidence suggests that Superlativity is something of a strange attractor for libertopians, technocrats, Randroids, Bell Curve racists and other such anti-democratic dead-enders). When you have assured me that you are an ardent democrat in your politics yourself I have tended to take your word for it.

But when you seem to suggest that "democracy" requires that one "not question" democratic outcomes, I find myself wondering why on earth you would advocate democracy on such terms. It's only reactionaries who falsely characterize democracy as "mob rule" -- because they hate common people. Actual democrats tend not to characterize their own views in such terms. Democracy is just the idea that people should have a say in the public decisions that affect them -- democracy is a peer-to-peer formation.

Because that (AGI/MNT funding) is what is likely going to happen in the next couple decades.

Be honest: if you, being just as you are now, had been asked twenty years ago, would you have said the same thing then? What could happen in twenty years' time to make you say otherwise?

I personally think it is an arrant absurdity to think that majorities will affirm specifically Yudkowskian or Drexlerian Superlative outcomes by name in two decades. Of the two, only Drexler seems to me likely even to be remembered (don't misunderstand me, I certainly don't expect to be "remembered" myself; I don't think that is an indispensable measure of a life well-lived, particularly). On the flip side, it seems to me that once one has dropped the Superlative-tinted glasses, one can say that funding decisions by representatives accountable to majorities are already supporting research and development into nanoscale intervention and sophisticated software. I tend to be well pleased by that. If one is looking for Robot Gods or Utility Fogs, however, I suspect that in twenty years' time one will find them on the same sf bookshelves where one properly looks for them today, or looked for them twenty years ago.

Anonymous said...

I knew Keith when he was a student at RPI - in fact, I was (for a short time) a roomie. I would guarantee that the basis of his cult is ripped straight out of Transcendental Meditation and Dianetics. He was also heavily involved in Amway, among other "pyramid" schemes.

He was a blowhard then and he still is. There is nothing special about him that isn't explained by his years of studying the human psyche and how to manipulate and control people with that knowledge.