Using Technology to Deepen Democracy, Using Democracy to Ensure Technology Benefits Us All

Wednesday, August 01, 2007

Singularitarianism Makes Your Brains Fall Out

The more you know about Singularitarianism, the more levels are available on which that line is funny.

I had an interesting exchange with a Singularitarian elsewhere online this morning, part of which I reproduce below (names omitted to protect, etc. etc.):

Our Modern Singularitarian:
I would love a democratic world gov't, but I strongly doubt it will happen in the next 10-30 years, the period in which we are likely to either create H+ intelligence or exterminate ourselves.


I replied:

The point for me is that in a very real sense we already have global governance [I refer in the post to which this individual is responding, to the World Bank, the IMF, international crime cartels, treaty organizations, proliferating NGOs and so on]. Democratizing these governance structures is a process, not a destination, and it has already begun.

As for your idea that all of this is irrelevant because of the immanence of the arrival of the robot army and so on, we'll just have to agree to disagree on that one.

Claims of this nature seem to me usually confident but rarely right, and it seems to me that technophiliacs who incessantly make them also make it more difficult to talk about technodevelopmental policy in anything like a realistic or non-hyperbolic fashion. But I'm sure that's just because I'm not all awesome and hardcore like the Superlative Technoboosters are, etc. etc.

Singularitarian:
[T]echnology development is embedded in the social context, etc., but, say, if someone were to release some tenacious grey goo, it wouldn't bother to poll people on their social and political leanings before bulldozing through the biomass.


I reply:

Yes, technodevelopment is actually really complex and socioculturally articulated [contrary to those technophiles who like to crow about progress being a politically neutral accumulation of useful devices], but, even so, just think how much less complex it seems once we take into consideration scary utterly made up shit.

Sorry, but this seems like Bush Administration Terror talk transposed onto futurological handwaving to me. But I'm sure it is a very serious contribution to technodevelopmental discourse and I would be able to understand that if I were smart like the Singularitarians are.

Singularitarian:
The upshot of this is that we should encourage global unity under the supervision of democratic consensus, while acknowledging that true unification will not occur in the timeframes of greatest importance. Thus we have to encourage the benefits and mitigate the risks of technological development with this unfortunate reality in mind.

In other words, as far as I can tell:

Our technocratic elite masters must be given priority over democratization, because only the elites understand important stuff like how the robot army is on the way.

26 comments:

Anonymous said...

Dale,

You frequently use the word 'immanent' in place of 'imminent,' as in:

"the immanence of the arrival of the robot army."

Are you punning and alluding to "immanentizing the eschaton?"
http://www.m-w.com/dictionary/Imminent
http://en.wikipedia.org/wiki/Immanence

"Our technocratic elite masters"
Who are these 'elite masters'? Is Nick Bostrom a member of this terrible league of supervillainy? If not, how does he support them?

"must be given priority over democratization"
How are you using 'democratization' here? Putting more issues to the popular vote (or raising them in electoral campaigns)? Or as a synonym for progressive policies? The two are often opposed, e.g. on immigration, civil rights, affirmative action, etc.

Also, 'priority' varies depending on the context of decision-making. For an individual actor, there will often be more effective paths to do good than focusing on democratization (in the sense of responsiveness to popular will), e.g. public interest lawyers bringing court cases like Roe v Wade, Bill Gates investing directly in health in the developing world (spurring government action by example), and so on.

"because only the elites understand important stuff like how the robot army is on the way."
This seems to mean that the 'elites' are Singularitarians. In that case, why worry? If you think with extremely high (99.9+%) certainty that no superlative technologies will be developed over the next several decades, Eliezer Yudkowsky or Ray Kurzweil will never have the chance to become "our elite masters." On the other hand, if there is a substantial chance that these technologies will be very important, then investing a small fraction of what we spend on global warming (or tobacco subsidies, for that matter) SHOULD be a priority.

Imagine, hypothetically, that after extensive analysis you concluded that there is at least a 5% chance that smarter-than-human AI will be developed over the next 30 years, and that a substantial risk existed that, either through mistakes in design or because of idiosyncratic values of the programmers, this would produce a disastrous outcome rather than an excellent one. What would you do to improve the odds? How would your actions relate to 'democratization' (in whatever sense is relevant)?

Nato Welch said...

I'm always intrigued by the use of the term "but". Logically, it's equivalent to "and", and yet it carries a lot of connotations.

"I would love a democratic world gov't, but I strongly doubt it will happen in the next 10-30 years, the period in which we are likely to either create H+ intelligence or exterminate ourselves"

This carries the subtext that democratic world governance is somehow suddenly not worth working toward, because we can't get it in time for... [insert event].

Somehow, I can't help thinking that democratic world governance actually bears something of a strong resemblance to what should emerge from, say, "Friendly" (in SIAI's formulation) recursively-self-improving artificial general intelligence (deep breath).

But how the development of RSIAGI - whether friendly or unfriendly - renders work on developing democratic world governance moot is beyond me.

The same argument that supports democratizing world governance applies against those critics of SIAI who would say that it's a waste of resources and so shouldn't be pursued. The resources SIAI requires to do what it's doing are TINY, relative to what's available. There's plenty of room for people to place their own bets without obstructing or even denigrating the pursuits of others.

The significant but non-total risks posed by RSIAGI are absolutely no excuse whatsoever to give up the tasks of improving democratic governance at any scale.

So good. Go work on the positive singularity. We need somebody doing that if it's going to happen. I'll be over here, working to mitigate this other risk. When it comes to high-stakes risk, one always HOPES that one is wasting one's time, if you get my drift.

Much of the problem faced by those trying to tell us about existential risks is the fact that we've been bitten too hard and too long by wolf-criers for the past six years. As a result, ANYONE who talks about dangers is likely to get the cold shoulder, regardless of whether 1) they are sincere as opposed to jockeying for power, or 2) the risk they're talking about is actually real.

Furthermore, I'm not worried about Yudkowsky becoming "our elite masters", but I'm less confident about Kurzweil, who fits the profile rather well, from the secrecy-based profiteering down to the "it'll be good, don't worry" hand waving.

It's not even that these guys will care to constitute elites; it's that anti-democratic elites will be attracted to them and the tech they're behind as events develop. It doesn't matter if every singularitarian is sincere when politicians start invoking robot terrorists instead of human ones to justify wiretaps, torture, and war.

In that case, why worry?

Because tyrants are not made from real risks. They are made from mere threats. You don't need to be right in this political climate in order to provide tools to prospective despots. Just sincere.

Dale, how do you go from "under the supervision of democratic consensus" to "given priority over democratization"? S's formulation sounded pretty unproblematic to me, so long as "bearing in mind" certain things didn't override consensus.

Dale Carrico said...

The resources SIAI requires to do what it's doing are TINY, relative to what's available.

I'm a rhetorician, and once one transposes this discussion into the terms of mass-mediation and the attention economy it is no longer clear to me that the technodevelopmental discourse space commandeered and/or skewed by SIAI and other Superlative Technology Discourses can properly be described as tiny at all. In my view, technohype, disasterbation, transcendentalizing wish-fulfillment, scientistic reductionism, and highly supple opportunistic justificatory technocratic vocabularies for incumbency are so vast and proliferating as to make sensible technodevelopmental talk incomparably more difficult than it should be, and far less common than it needs to be given the real risks and promises at hand.

I'm not worried about Yudkowsky becoming "our elite masters", but I'm less confident about Kurzweil, who fits the profile rather well...

Contrary to the Anonymous commenter who preceded you, Nato, I'm not casting about for "supervillains" (although not just a few Singularitarians seem well pleased to fancy themselves candidates for eventual prostheticized Superhero status, which seems to me just as idiotic), and by our "elite masters" I mean to indicate the class politics of incumbency -- which, as we all know by now, function perfectly well without cartoon villains at the helm (although the Bush Administration has provided us with some flabbergastingly cartoonish candidates for such villainy), but with just privileged, thoughtless people caught up in parochial, uncritical, self-serving attitudes and practices. The point is not to worry about the iceberg-tip of Superlativity exemplified presumably by a few public figures, but to delineate the discourse more generally, its entailments, its points of contact with institutional incumbency, the opportunities it affords for resistance and so on.

Dale, how do you go from "under the supervision of democratic consensus" to "given priority over democratization"?

It seemed to me that the initial formulation yoked democracy to "unity" in a way that made it unattainable, followed by the "reluctant conclusion" that until such impossible unity is achieved we must content ourselves with technocratic circumventions of democracy. Indeed, it is hard for me to believe that the "we" of that final quoted sentence -- "Thus we have to encourage the benefits and mitigate the risks of technological development with this unfortunate reality in mind." -- is the same one in "We the people." And I do not concede the "unfortunate reality" that compels this circumvention of democratic ideals in the first place. I think that anti-democratic assumptions drive that sense of "unfortunate reality" and that the protestations to democracy that precede it are just so much window dressing (although it is perfectly possible that the author doesn't honestly see it that way). Perhaps I'm jumping the gun a bit, but this ain't exactly my first time at the rodeo. The anti-democratizing intuitions and arguments of Superlative Technology discourses and so on repeat themselves with a depressing, even robotic, inevitability.

As for what I mean by "democratization," I define democracy as the idea that people should have a say in the public decisions that affect them, and I describe as "democratizing" any reform, struggle, regulation, or institution that gives more people more of a say in the public decisions that affect them. As it happens I do not fetishize any one particular implementation of this ideal over the others -- sometimes it will constitute relative democratization to make a decision process representative, sometimes simply accountable, sometimes democratization demands nothing short of direct popular mandates. Anti-democrats like to make this idea seem more complicated than it is. That's because they disapprove of and/or misunderstand democracy.

To the question of whether I identify Singularitarians with "elites" there are two answers: Many Singularitarians seem to imagine themselves elite, which is an edifyingly hilarious spectacle suffused with testosterone. But I also believe that Singularitarian rhetoric is likely to attract the attention of some actual incumbent elites casting about for self-congratulatory language to justify their position -- a point I spell out in greater detail in, I believe, my recent post here, "The Singularity Won't Save Your Ass."

Imagine, hypothetically, that after extensive analysis you concluded that there is at least a 5% chance that smarter-than-human AI will be developed over the next 30 years, and that a substantial risk existed that, either through mistakes in design or because of idiosyncratic values of the programmers, this would produce a disastrous outcome rather than an excellent one. What would you do to improve the odds?

I do not agree that this sort of speculation is serious or useful. One can invest any made up bullshit with urgent significance to justify the diversion of attention from proximate concerns simply by endlessly ratcheting up the hypothesized scenario's terms. Futurist handwaving of the Superlative Variety shares with science fiction that it pretends to cast its gaze to the far future when more often than not it is best understood as a symptomatic response to current events. But quite apart from these dismissals, to the extent that I do take recursively self-"improving" software seriously as an existential threat I believe that the regulatory formulations and institutions that will deal with it will arise out of actual security discourses coping here and now with actually existing software viruses and pandemics and so on, rather than from the "insights" arising here and now out of fanboy extrapolation into distant futures or near-term futures rendered alien by acceleration of acceleration and comparable foolishness.

I can celebrate the sub(cult)ural expressions of Superlative Technology Discourse, in all their dread and exhilaration, precisely in the way I celebrate other manifestations of religiosity, aestheticism, and perversion in the world, even when they hold little personal allure (as is certainly the case for me where Singularitarianism is concerned), but my celebration of diversity does not render me incapable of distinguishing such sub(cult)ural formations from formations to which I would ascribe "seriousness" or "relevance" in matters of prediction, control, or democratic policy making.

Dale Carrico said...

PS: On the "immanent," "imminent" question -- it's not a pun, it's just a mistake I can't seem to shake the habit of.

Michael Anissimov said...

You misjudge me so. The overarching relevance of intelligence enhancement and molecular nanotechnology is not about any particular technocentric masters - but it does emphasize the importance of decisions that have bearing on the way these technologies are developed and unfolded, sidelining cultural critics who can't be bothered to figure out the technical details. To use a strawman of "technocratic masters" is desperate lashing out from a desire to remain relevant in an increasingly technocentric world. (The fact that we are having a conversation on a blog, something impossible 15 years ago, underscores that.)

Dale, all transhumanists want to be "prosthetized superheroes", not just singularitarians... what I find intriguing is that you are so closely involved with and consistently bring up things related to a philosophy (transhumanism) that you so strongly object to many of the most basic tenets of. Like Bush, you are dismissing technologies: synthetic biology, molecular manufacturing, AI and robotics - simply because you are troubled by their sociopolitical implications and feel jealous of the attention commanded by those genuinely concerned about existential risk, which you dismiss as "disasterbation".

Luckily, as you observantly point out, Kurzweilian and Drexlerian discourses are strongly proliferating into a variety of futurist discussions, informing both policymakers and scientists alike. All your friends in or related to the H+ movement: Mike Treder, Chris Phoenix, James Hughes, Jamais Cascio - can easily discuss these topics without missing a beat, and contribute invaluably to this essential dialogue.

Anyway, I'm all for working towards global democratic governance full force. I'm currently reading "Global Covenant: the Social Democratic Alternative to the Washington Consensus", for instance, but only out of personal curiosity - my day job is to analyze risky technologies and come up with direct mitigative strategies.

Dale Carrico said...

Dale, all transhumanists want to be "prosthetized superheroes"

Poor dears.

what I find intriguing is that you are so closely involved with and consistently bring up things related to a philosophy (transhumanism) that you so strongly object to many of the most basic tenets of.

What's so intriguing? I'm critiquing something dumb and dangerous that stands in the way of reasonable technodevelopmental discourse that I consider urgent.

Like Bush, you are dismissing technologies...

I'm calling made up bullshit made up bullshit. But, yeah, I'm like Bush. Good one. No doubt, anybody who doesn't believe in technological immortalism, nanosanta, and the Coming Apocaloid Robot Army must be a bioconservative, then, by your lights. Sorry, Michael. No amount of handwaving or retrofitting will turn your weird robot cult into something I take seriously.

I'm glad you're reading about democratic alternatives to the neoliberal Washington Consensus. Even a stopped clock is right twice a day, and you're a bright kid; I'm not surprised to hear you talk sense occasionally (indeed I have personally known you to do so). Neoliberalism is a topic about which I write here with some regularity, although you only seem to make comments about this sort of thing here to shore up your democratic cred in the face of my general critiques of the anti-democratizing tendencies that seem to me to inhere in transcendentalizing, technophiliac, and technocratic discourses of the kind you find so appealing. Who knows why that might be? Surely not me.

I don't doubt your sincerity, however, and so I will abjure any available Bush comparisons.

Nato Welch said...

Couching this in terms of the attention economy is a good point I hadn't thought of. That's mostly due to the fact that SIAI is more interested in funding than discussion. "Send us money, so that smart people can afford to solve the problem". I've long noticed that there's a strain of qualification to entry into the discussion, dating back to the rather strict moderation controls on the SL4 list (http://sl4.org/intro.html#standards) - which were enough to scare me off when I first joined a few years ago. Add to this more recent conversations I've had with Michael Anissimov on the Accelerating Future blog (http://www.acceleratingfuture.com/michael/blog/?p=384) about how most people "can't contribute to the serious discussion" about AGI.

If you think the rattling they do from their rather insular communities is so loud as to interfere with more proximate issues, can you imagine how much worse it would be if they were actually out engaging in public discussion and education?

"Futurist handwaving of the Superlative Variety shares with science fiction that it pretends to cast its gaze to the far future when more often than not it is best understood as a symptomatic response to current events"

http://www.locusmag.com/Features/2007/07/cory-doctorow-progressive-apocalypse.html

Doctorow is always talking about how science fiction is really about the present (with nods to William Gibson). So what you're saying here is that futurism pointed more than a decade hence is, too? I can see the point.

Aside from that, why the hell does it seem to take so long for you both to get on the same page?

Robin said...

I always follow these exchanges with great interest, but I really want to hear more about nanosanta, as December rapidly approaches.

gp said...

Dale, for one who does not take weird robot cults seriously, you certainly write about them a lot.

Are you sure you don't take them seriously? Or is it that maybe you wish to be a weird robot cultist and don't dare say so?

But don't worry. To rejoin the family of true believers, from which you have never been truly apart, and ensure the salvation of your immortal mindfile, you only have to repent your sins and accept the Eschaton as your personal Savior.

We are all praying to the One that you leave the night and see the Light.

Yours in Omega,
Brother Giulio

jimf said...

> I'm a rhetorician, and once one transposes this discussion
> into the terms of mass-mediation and attention economy it
> is no longer clear to me at all that the technodevelopmental
> discourse space commandeered and/or skewed by SIAI and other
> Superlative Technology Discourses can properly be described as
> tiny at all.

I see 'em (the "Superlative Technology Discourses", that is) as
contiguous (if not precisely continuous) with other scientistic
cults either originated by or enthusiastically adopted by the
sci-fi community in the post-WWII years. E.g., Dianetics, General
Semantics, Objectivism. I think of Singularitarianism as
Scientology 2.0 ;-> .

http://groups.google.com/group/alt.religion.scientology/msg/4aec1277d08493c1
--------------------------------
In about 1972, I had the occasion to meet privately
with [A. E. Van Vogt] in a bar in Washington DC.
One of the things I wanted to talk about was his relationship
with Hubbard and Scientology. . .

Anyway, by the time I met with Van, he no longer believed
in any of the stuff put out by Hubbard. Also, he felt he
had to remain silent because of threats against him and
his family. Accordingly, he asked me not to broadcast
his story since he still lived a bit in fear of Hubbard and
his minions.

Van had nothing flattering to say about either Hubbard,
Dianetics, or Scientology during that meeting. I'd say
he was totally disillusioned by his experiences.

He had involved himself with Hubbard as part of an exuberance
of youth; a group of people out to 'set the world right' and
'make a difference.' What that group evolved into, we now know.
But in the beginning, all of the people involved were filled
with idealistic visions of what the world would be like if
people were free of the bad motivations inside themselves.
--------------------------------

http://groups.google.com/group/alt.clearing.technology/msg/bad583620041a34a
--------------------------------
In this post I am looking back at the sequence of events which
attracted me in scientology. I am retrieving old goals; a few were
attained, many of them were not; but they are still alive, and this is
an occasion to postulate them again in present time.

It all started by reading the novel from A.E. Van Vogt: _The World of
Null-A_. I was enthralled by this book. Here I became aware of several
goals I had, which were expressed in this book:

- A technology able to remove the aberrations of man (general semantics)
- A solution for immortality
- Advanced abilities: teleportation, telepathy
- Contacting extra-terrestrial civilizations
- An organized teaching and training system designed to attain these
goals
- Building a new civilization without war and crimes

I had the idea to study the general semantics, but after that I was
reading a biography of Van Vogt, which indicated his interest in
Dianetics; eventually I found this book, and this was the beginning of
my road in scientology.

I received a scientific education, and had many doubts about religions.
I was hoping that spiritual abilities did exist, but I was not
satisfied by just hopes and never certainty. I was looking for proofs.
For example I had a project to do scientific experiments in a haunted
house. But I canceled this project when I discovered scientology.

. . .

With the discovery of scientology, I formulated a new set of goals:

1 - Freedom from unwanted reactions and emotions, and from past painful
experiences
2 - Retrieve my past lives
3 - Being exterior with full perceptions
4 - Immortality as a conscious being (going into the next life without
amnesia)
5 - Ability to heal other people
6 - OT abilities (teleportation, etc.)

Eventually I did quit the scientology path for different reasons.
Especially the high prices. . .

But if I have to choose a main goal, this is the goal of
immortality as a conscious being. . . Without this preliminary
goal, it is impossible to do long term planning.
--------------------------------

--------------------------------
Hubbard: We promised them the moon and then demonstrated a
way to get there. They would sell their soul for that. We were
telling someone that they could have the power of a god --
that's what we were telling them.

Penthouse: What kind of people were tempted by this promise?

Hubbard: A whole range of people. People who wanted to raise
their IQ, to feel better, to solve their problems. You also
got people who wished to lord it over other people in the
use of power. Remember, it's a power game, a matter of climbing
a pyramidal hierarchy to the top, and it's who you can step
on to get more power that counts. It appeals a great deal
to neurotics. And to people who are greedy. It appeals a
great deal to Americans, I think, because they tend to
believe in instant everything, from instant coffee to instant
nirvana. By just saying a few magic words or by doing a
few assignments, one can become a god. People believe this.
You see, Scientology doesn't really address the soul; it
addresses the ego. What happens in Scientology is that
a person's ego gets pumped up by this science-fiction
fantasy helium into universe-sized proportions. And this
is very appealing. It is especially appealing to the
intelligentsia of this country, who are made to feel
that they are the most highly intelligent people, when in
actual fact, from an emotional standpoint, they are completely
stupid. Fine professors, doctors, scientists, people
involved in the arts and sciences, would fall into
Scientology like you wouldn't believe. It appealed to
their intellectual level and buttressed their emotional
weaknesses. You show me a professor and I revert back
to the fifties: I just kick him in the head, eat
him for breakfast.
------------------
_Penthouse_ interview with L. Ron Hubbard **Jr.**,
June, 1983
http://www.rickross.com/reference/scientology/scien240.html

Dale Carrico said...

for one who does not take weird robot cults seriously, you certainly write about them a lot.

I'm hearing this complaint quite a bit from Superlative Technology partisans and sub(cult)ural futurists lately. It's hard to know if this is really a difficult phenomenon for you all to grasp or if you're just trying rather desperately to change the subject here.

Just in case your perplexity is real: Of course I take Extropians, Singularitarians, transhumanists, and all the rest seriously.

Serious as a heart attack.

I've been critiquing sub(cult)ural, transcendentalizing, Superlative, bioconservative, retro-futurist, reductionist, technocratic, and other anti-democratizing or irrationalist modes of technodevelopmental discourse and practice for a decade and a half.

I take very seriously the damage I think they can do, the skewed priorities they inspire, the fearful and hateful passions they can indulge, the rationales they offer up to incumbents, the anti-democratizing attitudes and practices they can inspire.

I take very seriously the way they derange technodevelopmental discourse at the very moment when technoscience must be democratized else it destroys the open world, or the living world altogether.

Against these attitudes, distortions, indulgences and so on I have also tried to delineate positive, hopeful technoprogressive alternatives. I don't mind at all that some transhumanists (especially the democratic transhumanists) have discerned a measure of sense and derived a measure of pleasure from some of my technoprogressive writing.

I hope that at least some True Believers, self-appointed techno-elites, sub(cult)ural technophiliacs, techno-transcendentalists, Priestly as opposed to pragmatist champions of science, and technocentric opponents ("reluctant" or otherwise) of democracy will be nudged through their engagement with my writing and that of many technoscientifically-literate democratically-minded others into the incomparably more useful peer-to-peer planetary technodevelopmental struggle to democratize technoscientific change, to distribute the costs, risks, and benefits of technoscientific change more fairly, to celebrate planetary p2p-multiculture, and to universalize the scene of informed nonduressed consent to prosthetic self-determination.

Nobody needs to join a robot cult to participate in the great shared but diverse work of democratic, social, and technoscientific progress. And those robot cultists who do participate in that work will, at the very least, need to estheticize and privatize their religiosity like everybody else does in a secular multicultural planetary society. This means they really should stop confusing their marginality for indispensability, their quirky sub(cult)ural attitudes for "unbiased" science, and their idiosyncratic indulgences and rituals (which I do not doubt are perfectly edifying for them personally, and which I am quite happy to celebrate as part of humanity's rich existential tapestry as far as that goes, even when these particular edifications don't particularly appeal to me personally) for Serious pragmatic policy discourse.

Dale Carrico said...

I see 'em (the "Superlative Technology Discourses", that is) as
contiguous (if not precisely continuous) with other scientistic
cults either originated by or enthusiastically adopted by the
sci-fi community in the post-WWII years. E.g., Dianetics, General
Semantics, Objectivism. I think of Singularitarianism as
Scientology 2.0 ;-> .


Oh, yes, I quite agree. One sees anticipations of these derangements from the Enlightenment (check out the enthusiastic and horrified debates provoked by roboticist and social scientific reductionist Jacques de Vaucanson in the 18C) through mid-century (check out Herman Sorgel's environmentally devastating technopolitical megalomaniacal mega-engineering fantasia Atlantropa, replete with novelistic sf popularization and financial backing and pseudo-respectability from weird "maverick" bazillionaires), through to the transhumanist prefiguration in the L5ers and pill-poppers advertising in Omni Magazine three decades ago when I started paying attention to these things...

Definitely I agree with you about the kooky contiguities of sub(cult)ural transhumanisms and especially Singularitarians with Dianetics, Objectivism, and GS.

If only those guys had read William Burroughs instead of Ayn Rand...

jimf said...

> Add to this more recent conversations I've had with Michael Anissimov
> on the Accelerating Future blog (http://www.acceleratingfuture.com/michael/blog/?p=384)
> about how most people "can't contribute to the serious discussion" about AGI.

The sad truth is that the people who use the term "AGI" are not contributing
to the serious discussion of the nature of intelligence (artificial
or otherwise). This is blindingly obvious even to a modestly well-read
non-expert, once he has stepped outside the circle of true believers.

Twigging this fact was the beginning of the end of my own inclination to
take the Singularity stuff seriously.

jimf said...

Nato Welch wrote:

> SIAI is more interested in funding than discussion. "Send us
> money, so that smart people can afford to solve the problem".

http://www.mail-archive.com/singularity@v2.listbox.com/msg00269.html
---------------------------------
$5M . . . is a fair estimate of what I think it would
take to create Singularity based on further developing
the current Novamente technology and design.
---------------------------------

Cheap at the price! Amazing how much action DARPA must be missing out on.

---------------------------------
David Drumlin: In the meantime, my office has made out a preliminary budget.

Ellie Arroway: Wait a minute. This is a privately-funded operation.
We're only leasing dish time from the government--

Michael Kitz: Doctor, if there's a more clear-cut case of eminent domain,
I've never seen it. I'm recommending to the President we militarize this
project immediately.

Ellie: What?! This is my project! Nobody knows more about these
scenarios than I do. David, tell them how many years --

David: Ellie, sorry.
---------------------------------
_Contact_

> I've long noticed that there's a strain of qualification to entry into
> the discussion

The primary qualification is unquestioning obeisance to the Authority
of the Guru. Given that, you don't even have to spell good. Absent
that, you could be a genu-wine expert gone slumming, and you'd still
be kicked out on your butt.

jimf said...

Michael Anissimov wrote:

> The overarching relevance of intelligence enhancement and molecular
> nanotechnology. . . does emphasize the importance of decisions that
> have bearing on the way these technologies are developed and unfolded,
> sidelining cultural critics who can't be bothered to figure out the
> technical details.

Ouch, that's **you** Dale. Just a "sidelined cultural critic". :-0

"The overarching relevance of intelligence enhancement and molecular
nanotechnology" is that they don't exist, and will almost certainly
never exist in the form in which the SIAI crowd imagines them.

"who can't be bothered to figure out the technical details" Hah!
Like any of you guys have? I can just imagine what Eugen Leitl would
have to say about that. ;->

> To use a strawman of "technocratic masters" is desperate lashing
> out for a desire to remain relevant in an increasingly technocentric
> world.

"Technocratic masters" is a strawman in almost the exact same sense that
the Scientologists' overarching plan to "clear" the earth (of psychiatrists ;-> )
and then to go to clear the galaxy is a strawman if taken literally as a
reason to criticize Scientologists (no, no, we mustn't clear the galaxy,
it would be a violation of the Prime Directive! :-0 ). Of course nobody
(no "wog", at any rate) takes their grandiose goals at face value (and neither,
presumably, do the people who run the organization). Doesn't stop 'em from
raising a lot of money, making a lot of noise, intimidating the
ostensibly free press, and even destroying the occasional enemy
(along with the usual run-of-the-mill exploitation of their recruits).

And sounding silly has not stopped every Singularitarian from imagining
that he has his hand on the fulcrum, the very tipping point of
human (or galactic, or even Universal) history. Imagining that he
will be memorialized to the end of time (sort of like Yarlan Zey
in Arthur C. Clarke's _The City and the Stars_ only, you know, bigger --
on a Stapledonian scale).

> (The fact that we are having a conversation on a blog, something
> impossible 15 years ago, underscores that.)

What it underscores is "plus ça change, plus c'est la même chose".
The technology is new; the **use** of it -- the dark underbelly of
freer human communication: the printing press, radio, TV, and
the spread of literacy -- is old, old, old.
http://www.trap2.com/list_j/joni_mitchell_lyrics/tax_free_lyrics.html
Prestige-mongering by means of paradisiacal and apocalyptic
fortune-telling, leveraging a swollen ego by putting the guru-whammy
on a congregation of admirers, fund-raising
"Calling for large donations, Promising estates.
Rolling lawns and angel bands, behind the pearly gates.
He will have his in this life, but -- yours'll have to wait.
He's immaculately tax-free. . ." It's an old story.

ZARZUELAZEN said...

Dale,

The problem is not so much Singularitarianism per se but *Singularitarians*. To be as tactful as possible, most of these self-professed *Singularitarians* are... er... complete arseholes.

I suppose I can take some small consolation from the fact that they're 'my kind of arsehole' (to borrow a quote from Die Hard). But not much.

The few 'hard core' Eliezer cheer-leaders on the SIAI mailing list are the most narrow-minded, hostile and self-righteous people I've ever met.

According to their worldview there are two kinds of people: 'Smart People' (people with IQs greater than 150 - worth talking to) and 'Stupid People' (people with IQs less than 150 whose opinions are completely worthless). Only 'Smart People' count you see. Only Smart People can see the 'true path' to salvation through SIAI. (These aren't my terms. These are the actual words they use). The last straw was a post on M. Anissimov's blog a while back lamenting the fact that 'another smart person died today' and missed out on cryonics (apparently the other 150,000 people that died that day didn't count because they were stupid people).

Of course this sort of behaviour is symptomatic of a cult, and indeed the level of self-righteousness on the SL4 list was worse than the worst of the cults I've ever seen anywhere else on the web (even the Objectivist messageboards didn't have people that fanatical).

---

Incidentally Anissimov my boy, you backed the wrong horse. You were undoubtedly hoping to bask in some of the reflected glory from an SIAI victory, but unfortunately for you old Eli's theories are all screwed up.

---

I spent 5 years engaged in deep, deep thought in an attempt to come up with a top-level ontology for a general intelligence, and I recently completely succeeded. It turns out my basic contentions on SL4 were probably right all along. You may like to view my completed top-level domain model here:

http://marc.geddes.googlepages.com/MCRT_ClassDiagram.html

And my brief note about these ideas on the Everything-List here:

http://groups.google.com/group/everything-list/browse_thread/thread/b613d6b76db8bdaa

See ya on the other side of Singularity. And remember kid, winners write history.

Dale Carrico said...

The problem is not so much Singularitarianism per se but *Singularitarians*. To be as tactful as possible, most of these self-professed *Singularitarians* are... er... complete arseholes.

I can't say that I agree with you on this one.

I have no doubt at all that at least some Singularitarians aren't assholes at all except when they are saying things and doing things that bespeak their Singularitarianism. And I daresay any of us can be assholes occasionally, not just silly Singularitarians.

My problem is with the usually reductionist, usually undercritically technophiliac, usually transcendentalizing, usually anti-democratizing (through many routes) doctrinal confusions that Singularitarianism bundles together so appallingly in my view. The details of my critique are scattered throughout the archive.

As for the personal failings exhibited by individual Singularitarians, and to which all of us are too prone in our own frailty, I think it is probably better not to overgeneralize overmuch -- except of course when a particular exemplar is really asking for it.

jimf said...

> The few 'hard core' Eliezer cheer-leaders on the SIAI
> mailing list are the most narrow-minded, hostile and
> self-righteous people I've ever met.
>
> According to their worldview there are two kinds of people:
> 'Smart People' (people with IQs greater than 150 - worth
> talking to) and 'Stupid People' (people with IQs less
> than 150 whose opinions are completely worthless). Only
> 'Smart People' count you see. Only Smart People can see the
> 'true path' to salvation through SIAI. (These aren't my terms.
> These are the actual words they use). The last straw was
> a post on M. Anissimov's blog a while back lamenting the
> fact that 'another smart person died today' and missed out on
> cryonics (apparently the other 150,000 people that died that
> day didn't count because they were stupid people).
>
> Of course this sort of behaviour is symptomatic of a cult, and
> indeed the level of self-righteousness on the SL4 list was
> worse than the worst of the cults I've ever seen anywhere else
> on the web (even the Objectivist messageboards didn't have people
> that fanatical).

"[Ayn] Rand outlined in her journal the basic character traits
of her most famous hero, Howard Roark: 'His emotions are
entirely controlled by his logic.' Two things dominate
his entire attitude toward life: 'his own superiority and
the utter worthlessness of the world.' He was 'born without
the ability to consider others. . . . Indifference and an
infinite contempt is all he feels for the world and for
other men who are not like him.' Other people are merely
a convenience for his work. He recognizes only the
right of the exceptional (and by that he means only
himself) 'to create, and order; and command.'"

_The Ayn Rand Cult_, Jeff Walker, p. 246


The irony here, of course, is that Ayn Rand and her followers
were not (are not) nearly as smart as they imagined they
were. Nor, I daresay, are those on the SL4 mailing list.
A snotty attitude almost always outstrips any justification
it may claim in the realm of intellectual superiority, alas.


"If the ultraintelligent machines decide that more
than a million human beings constitute an epidemic, they might
order euthanasia for anyone with an IQ of less than 150, but
I hope that such drastic measures will not be necessary."

-- Arthur C. Clarke, "The Mind of the Machine",
_Playboy_, December, 1968


"I do find AI very interesting from a literary
perspective, and I've often written about it
in fiction, but I don't really see much evidence
of it in the real world; but the tinkering
with human neural behavior is something I really
do fret about some. I'm not sure there's
a Singularity situation there, but I think
that really is catnip for the intelligentsia --
the ability to sort of mess with your own
neural processes. And it really plays to
people's pride, in a very painful and tempting
way -- it's, like, 'well, I'm smart, and I love
myself for being smart, and if we smart guys
were just more like we already are then we'd be
godlike.'"

-- Bruce Sterling, "The Singularity:
Your Future as a Black Hole"
(Eighth "Seminar About Long-Term Thinking"
given by the Long Now Foundation at
Fort Mason Center, San Francisco,
11 June 2004)


"Everybody looks so ill at ease
So distrustful so displeased
Running down the table
I see a borderline
Like a barbed wire fence
Strung tight, strung tense
Prickling with pretense
A borderline

Why are you smirking at your friend?
Is this to be the night when
All well-wishing ends?
All credibility revoked?
Thin skin, thick jokes!
Can we blame it on the smoke
This borderline?"

-- Joni Mitchell, "Borderline"
http://www.poemhunter.com/song/borderline-2/

ZARZUELAZEN said...

Yes. In fact, despite their alleged super-high IQs, there was almost no one on the SL4 list with a record of real-world achievements or qualifications. Funny, that.

In the beginning I thought wta-talk, the Extropy list, and SL4 would be the center of the universe, and that there would be hot-lines to all the world's top scientists and media hanging on our every word. But the sad recognition sunk in that almost no one gives a stuff about anything on there. The world doesn't care. The reasons why eventually dawned: a real genius doesn't waste time on mailing lists - they're too busy achieving results in the real world.

As far as I can make out, most of the people on SL4 are not in the slightest bit interested in artificial intelligence at all. They're only interested in furthering their own egos and playing status games which show off their IQs. Either you have to totally surrender to the ego of the founder or hit the highway.

In fact that seems to be the way of internet messageboards in general. They seem to be a magnet for a lot of not very pleasant high-IQ types pushing a lot of extremist pet ideas.

As to AGI, the sad truth is that most of SL4/SIAI ain't got a clue. They've confused high IQ and book knowledge for actual genius. To M. Anissimov and any other Singularitarian readers: sorry kids, but the ability to suck up a lot of book knowledge and manipulate it in a narrow reductionistic logical way does NOT make for genius.

I've seen this kind of 'reductionistic blindness' many times before - and certainly Dale has quite rightly pointed it out.

I mean, I've posted a few big secrets of the universe to various transhumanist lists - stuff that seems blatantly obvious to me, but it was just completely ignored and ridiculed by the SIAI cultists. Regarding the REAL Singularity, we may be close to the end of the game, yet the SL4 crowd honestly truly still believe (as far as I can make out) that I'm the biggest crackpot in history and they're the keepers of all truth. The extent of their delusions is quite extraordinary. Honestly, I'm aghast.

---

But never fear for when the real Singularity arrives, Dale. You see, the winners write the history books. It will be a simple matter for me (armed with said robot army) to re-write the Internet archives and ensure the complete historical irrelevance of the SIAI and the people associated with it. (Newton was able to take down all of Robert Hooke's pictures and condemn him to historical obscurity, after all.)

You'll also be pleased to know Dale that after the Singularity there won't be any more corporations either ;)

jimf said...

> They're only interested in furthering their own egos and
> playing status games. . .
>
> In fact that seems to be the way of internet messageboards in general.

From John Bruce's blog ( http://mthollywood.blogspot.com ):
--------------------------------
"Monday, May 07, 2007. . .

I was at the annual meeting of the Pennsylvania Railroad Technical
and Historical Society. . . One big impression that I came away with
was how many of the 300-plus people in attendance (mostly guys, of course,
and mostly well over 60. . .) were motormouths. . .

A lot of these guys were the worst sort of motormouth, in fact, the kind
that always sounds like a slightly off-kilter DJ. . . Actually, I think a
lot of bloggers are like this. If you meet them in person, they're motormouths,
but you don't quite pick up on that in a blog. That's probably the way they
like it, . . . no singsong tone to turn people off.

It's interesting that so many model railroaders are also this way. . .
I enjoy the hobby -- and I often enjoy blogging -- but model railroaders
and bloggers are basically nuts and jerks."
--------------------------------

Oh well. Caveat lector.
;->

Dale Carrico said...

Marc, it sounds to me rather as though you are bitterly disappointed that the hopes you had invested in the Singularitarian Robot Cult didn't pan out, but that you haven't managed to completely learn yet the wholesome lesson available to you in that disappointment: namely, that self-appointed elite organizations and self-described soopergeniuses are generally either trying fraudulently to sell you something or pathetically to sell themselves something (and regularly both at once). Next time you encounter such a salesman (whether it be a Robot Cultist or whatever else offers you salvation in exchange for obedience), perhaps you should give the whole project a pass and rather attend to whatever it is inside you that foolishly confuses you into finding some allure in such nonsense in the first place.

Intelligence seems to me to be a matter of forming, grasping, and applying abstractions in a way that facilitates our various ends (instrumental, moral, esthetic, ethical, political, and so on). These ends are irreducibly plural, arise in irreducible plural contexts, and are immensely dynamic and importantly unpredictable. This makes the business of intelligence incomparably more complex than the things that pass for "intelligence" in much discourse on the topic. For the most part, I notice that if I substitute for what is claimed to be considerations of "intelligence" what amounts in the main instead to considerations of "class" or "incumbent privilege" and the iconography through which these latter considerations are expressed, well, it is extraordinary how clear and how different such discussions suddenly become.

All this is certainly true where talk turns to "geniuses" in marginal sub(cult)ures charismatically demanding attention, praise, and, usually, cash, or the need discerned by technocrats for the "smart people" to solve the complicated problems that beset everybody (an "everybody" consisting presumably of mostly folks who are "less smart" than necessary), or the various handwaving exercises of technophiliacs for "artificial intelligence" or "enhanced intelligence" (relying on figurative conjurations of futural ideal exemplars which are usually just absurd reductios of the various distorted and impoverished visions of what "intelligence" consists of in the abstract affirmed by their advocates -- usually an intelligence conceived as a dull numbers-cruncher, or libertopian maximizer, or dot-eyed instrumentalist with no love or poetry in him, or a ruggedly individualistic atom in an asocial void, etc etc etc).

As for the overgeneralizations about bloggers I'm seeing in several recent comments, I realize that they are a rather easy target these days (and, as is usual in such matters, plenty of that ridicule is plenty deserved in particular cases), but it seems to me that distributed creative expressivity and collaborative problem-solving is unleashing more intelligence and more democratization than any other popular movement afoot in the world today. This is, no doubt, why it is so widely ridiculed.

jimf said...

Dale wrote:

> I realize that [bloggers] are a rather easy target these days. . .,
> but it seems to me that distributed creative expressivity and
> collaborative problem-solving is unleashing more intelligence
> and more democratization than any other popular movement
> afoot in the world today. This is, no doubt, why it is so widely
> ridiculed.

Yes, yes, of course. I was just (sourly) pointing out that
the blogosphere (and more than just that, the Web as a whole,
and Usenet before it) is simply the latest medium "where talk
turns to 'geniuses' in marginal sub(cult)ures charismatically demanding
attention, praise, and, usually, cash. . ."

You just gotta watch what you're drinking, more than ever these
days. Never before has such a deal of unvetted vanity publication
been available so widely. Woe to the uncritical, at such a
time! It was heady stuff to me, a decade ago, when I got my
first at-home flat-rate ISP account.

See also "Personality characteristics of net kooks"
http://groups.google.com/group/sci.psychology.psychotherapy/msg/8f6074d8a869ca64

> Next time you encounter such a salesman (whether it be a Robot Cultist
> or whatever else offers you salvation in exchange for obedience), perhaps
> you should give the whole project a pass and rather attend to whatever it
> is inside you that foolishly confuses you into finding some allure in such
> nonsense in the first place.

Something to take up with one's therapist. ;->

"[T]he 'Dispensing of Existence'. . .amounts to a claim that only
members of the group meaningfully exist. They alone essentially are
'good' and/or 'saved'. This is in stark contrast to non-members who
are 'bad' and/or 'damned'. Only group members are really 'walking in
the light', know the 'truth', or are in 'the Kingdom of God' -- while
others are somehow negative and excluded. Summing up this belief -- those
outside the group are essentially somehow inferior and those within the
group are seen as superior. Destructive groups often foster and reinforce
this mentality by claiming to be the only ones who have a valid claim to
truth -- or in extreme circumstances even [to] the earth itself. Those
who are inferior, base, and/or simply seen as not yet ready to take on
their proper responsibilities -- may be treated with less concern,
respect or sometimes even contempt by cult members (e.g. critical family
members, government authorities, old friends etc.) Regarding that
treatment -- 'the ends [may] justify the means'. Often this feeling of
superiority or worthiness becomes a motive for people remaining within
cults."
http://www.rickross.com/reference/general/general431.html

Dale Carrico said...

I was just (sourly) pointing out that the blogosphere (and more than just that, the Web as a whole, and Usenet before it) is simply the latest medium "where talk turns to 'geniuses' in marginal sub(cult)ures charismatically demanding attention, praise, and, usually, cash. . ."

You just gotta watch what you're drinking, more than ever these days. Never before has such a deal of unvetted vanity publication been available so widely. Woe to the uncritical, at such a time!


Oh, I know! Believe me, you get no arguments from me on that score. Even a p2p-democrat like me knows to heed the wisdom in Vinge's slogan "The Net of a Million Lies."

jimf said...

> Even a p2p-democrat like me knows to heed the wisdom
> in Vinge's slogan "The Net of a Million Lies."

Y'know, the interstellar Usenet in _A Fire Upon the Deep_
was perhaps the best thing in the book. And now, 15 years
later, only the hard-core old-timers remember Usenet
at all (or think to search Google Groups).
Sic Transit Gloria Gaynor.

(The other best thing about _Fire_ was the edge-of-seat
first chapter in which the Blight awakens. I must admit
I don't remember much about the rest of the book.)

-------------------------------
Crypto: 0
As received by: Transceiver Relay03 at Relay
Language path: Samnorsk->Triskweline, SjK:Relay units
From: Straumli Main
Subject: Archive opened in the Low Transcend!
Summary: Our links to the Known Net will be down temporarily
Key phrases: transcend, good news, business opportunities, new archive,
communications problems
Distribution:
Where Are They Now Interest Group, Homo Sapiens Interest Group,
Motley Hatch Administration Group, Transceiver Relay03 at Relay,
Transceiver Windsong at Debley Down, Transceiver Not-for-Long at Shortstop

Date: 11:45:20 Docks Time, 01/09 of Org year 52089
Text of message:

We are proud to announce that a human exploration company from Straumli
Realm has discovered an accessible archive in the Low Transcend. This is not
an announcement of Transcendence or the creation of a new Power. We have in
fact postponed this announcement until we were sure of our property rights
and the safety of the archive. We have installed interfaces which should
make the archive interoperable with standard syntax queries from the Net. In
a few days this access will be made commercially available. (See discussion
of scheduling problems below.)

Because of its safety, intelligibility, and age, this Archive is
remarkable. We believe there is otherwise lost information here about
arbitration management and interrace coordination. We'll send details to the
appropriate news groups. We're very excited about this. Note that no
interaction with the Powers was necessary; no part of Straumli Realm has
transcended.

Now for the bad news: Arbitration and translation schemes have had
unfortunate clenirations[?] with the ridgeway armiphlage[?]. The details
should be amusing to the people in the Communication Threats news group, and
we will report them there later. But for at least the next hundred hours,
all our links (main and minor) to the Known Net will be down. Incoming
messages may be buffered, but no guarantees. No messages can be forwarded.
We regret this inconvenience, and will make up for it very soon!
Physical commerce is in no way affected by these problems. Straumli
Realm continues to welcome tourists and trade.
-------------------------------

jimf said...

Something interesting from Slashdot, apropos
Vinge's _A Fire Upon the Deep_:

http://books.slashdot.org/books/03/09/18/0411259.shtml
---------------------------------------------------
Recently a new 'special edition' of [Vernor Vinge's
_A Fire Upon the Deep_] was published electronically,
containing the annotations that had previously only been
available on the 1993 Hugo/Nebula CDROM, and I knew I
had to make the purchase--and then, since I couldn't dig
up any other mention of it on Slashdot, review it. . .

It would not be exaggeration to call A Fire Upon the Deep
one of the seminal SF novels of the digital age. While
there had been many other books that depicted computer
networks of the future, Fire was one of the first to
present such a network in terms of its resemblance to
USENET of the then-present-day. . .

With a USENET-style network in place, Vinge is free to
homage the USENET of today (or, rather, of 1990) in subtle ways.

For example, one of the commonly-recurring themes throughout
the book is that of identity and truth. One of the ways this
theme is explored should be only too familiar to most Slashdot
readers: the net is often called the Net of a Million Lies. Just
as "on the Internet, no one can tell if you're a dog," on the
Known Net nobody can tell what race you really are--only the
one you say you are. Interspersed through the story are about
a dozen USENET-style netnews posts, from sources considered
reliable, questionable, or outright mysterious. . .
Many conflicting and unexplained viewpoints are presented,
and sorting out the truth is an important part of the story.
(I have heard rumors that some of these posts were written
based on the posting styles of well-known USENET kooks of
the day, but the annotations failed to provide any proof
of this.)

Vinge also makes a few cute little digs that only a net-user
might get--such as when he refers to "chronic theorizers
[as being] the sort of civilizations that get surcharged
by newsgroup automation," . . . or when he implies. . .
that some of those strange, semi-intelligible posts that show
up on USENET today might simply be the fault of a faulty translator.
(The notes reveal that he considered using some even more
familiar net slang, such as "IMHO," but decided against it.)
---------------------------------------------------

ZARZUELAZEN said...

> Marc, it sounds to me rather as though you are bitterly disappointed
> that the hopes you had invested in the Singularitarian Robot Cult
> didn't pan out, but that you haven't managed to completely learn yet
> the wholesome lesson available to you in that disappointment: namely,
> that self-appointed elite organizations and self-described
> soopergeniuses are generally either trying fraudulently to sell you
> something or pathetically to sell themselves something (and regularly
> both at once). Next time you encounter such a salesman (whether it be
> a Robot Cultist or whatever else offers you salvation in exchange for
> obedience), perhaps you should give the whole project a pass and
> rather attend to whatever it is inside you that foolishly confuses
> you into finding some allure in such nonsense in the first place.

Your point about "self-appointed elite organizations and self-described soopergeniuses" is well taken. I agree. The elitist behaviour of *some* of the people associated with SIAI (for example) is inexcusable.

As to AGI and the Singularity, we just don't know. Could be possible according to available science. Will have to wait and see, won't we?

I've thought about the topic for a few years now and came up with some good ideas. I think the "self-appointed elite organizations and self-described soopergeniuses" need to take their heads out of their own arses and try listening to other people's points of view.

Given the way that I've been treated by some SIAI folks, I won't have anything more to do with them unless they one day accede to the demands I'm making below:

*Apologize for the bullying, elitist and condescending remarks made in public designed to ridicule and intimidate SIAI critics over a concentrated period of time.

*Both privately *and* publicly admit that I (Marc Geddes) was absolutely correct about all of the key contentions I list below (and have previously posted in various messages):

A. General Intelligence without consciousness is impossible.

B. There exist at least some objective values (or ideals).

C. Any recursively self-improving intelligence will automatically be ethical.

D. There are 3 different definitions of causality (could be either 3 different levels of organization in systems - things with moving parts - or might even be literal time dimensions).

E. Consciousness is equivalent to higher-order causality.

F. The top-level MCRT domain model displayed at the link given below is 'substantially correct' in the general concepts expressed:

http://marc.geddes.googlepages.com/MCRT_ClassDiagram.html


Once, at some time in the future, I've received the apology AND a public acknowledgement that I (Marc Geddes) was correct on ALL points of contention (A, B, C, D, E, F), THEN I might consider supporting the SIAI once again.

In short to SIAI: Get a clue kids.