Monday, October 08, 2007

Superla-Pope Peeps

Eliezer Yudkowsky, Co-Founder and Research Fellow of an outfit called the Singularity Institute for Artificial Intelligence (SIAI), has recommended a short text of his posted on the blog Overcoming Bias as his response to my exchanges with Aleksei Riikonen and others in the Singularitarian Agony posts from the last couple of days (just scroll down a bit for these if you like). I responded to Eliezer's recommended essaylet on a discussion list we occasionally nip in and out of, and then he and I sniped at one another very entertainingly for a few circuits around the roller-rink. I regret that I can't re-post all that edifying snark -- since it seems to me content posted to that list is offered up with the expectation of a certain privacy, which I will respect -- but since the original blog-post Yudkowsky recommended as the corrective to my position is freely available, I do think I can let you in on my response to that.

In Yudkowsky's text he proclaims that "what scares me is wondering how many people, especially in the media, understand science only as a literary genre."

Since he is offering up this thesis as one that applies to me, I can only assume he wants to paint me as a person who "understand[s] science only as a literary genre." Think carefully about what it means that Yudkowsky either can't see why he is wrong here, or why he thinks it is a good idea to peddle such a false impression.

Certainly I will attest that there are key aspects of technoscientific change that are well describable in cultural, social, and political terms. Because my training and temperament suit me to talk about just those aspects, and because I address my arguments to technocentric folks who often underestimate just those aspects to the cost of sense, it makes enormous sense that those are the aspects to which I devote much of my attention in my writing. But it is quite a leap from here to the conclusion that I think of science "only as a literary genre." I certainly don't believe any such thing. I challenge Yudkowsky to unearth such a reductive claim in my writing. One place to look might be my essaylet Is Science Democratic? Another, Technoethical Pluralism.

I daresay Yudkowsky should be able to grasp, at least in principle, that even a scholar whose training best suits him to discuss the literary dimensions of the texts in which scientists seek to communicate their findings, their inspiration, their significance, and so on will still be able to distinguish those dimensions from the properly scientific dimensions also exhibited in these texts: the collective practices of invention, testing, debate, and publication through which descriptions are offered up as candidates for warranted belief in matters of prediction and control, and which satisfy the criteria hacked out over millennia to select among these candidates -- testability, coherence, concision, saving the phenomena, and so on.

Later in his post Yudkowsky offers up this bit of homespun wisdom:
"Probably an actual majority of the people who believe in evolution use the phrase 'because of evolution' because they want to be part of the scientific in-crowd -- belief as scientific attire, like wearing a lab coat. If the scientific in-crowd instead used the phrase 'because of intelligent design,' they would just as cheerfully use that instead -- it would make no difference to their anticipation-controllers."

I would be very interested to know more about the empirical experiences on the basis of which Yudkowsky has "induced" this "probability." I guess he imagines people in general too dull-witted to affirm instrumental beliefs warranted by consensus science because these have proved especially good at facilitating the satisfaction of their instrumental wants. Yudkowsky seems to think the poor benighted masses choose surgeons over faith healers so regularly only because they hope to be taken for the cool kids in lab coats in University Science Departments. It's an, uh, interesting theory.

I must say that the very notion of a "scientific in-crowd" so compelling in their allure that a dim-witted "Mob" parrots their utterances so as to be mistaken for them seems to me at once so flabbergastingly off-the-reservation and so baldly elitist that I laughed out loud upon reading it.

As an aside, I do agree that sometimes utterances that appear to affirm as warranted what are in fact absurd would-be instrumental descriptions ("An angry Patriarchal Creator-God with a gray beard sitting in a great Stone Chair exists," "the Earth was created in seven days," "my team won because Jesus likes us better than our opponents," and so on) are really functioning as social signals, indications of moral identification/disidentification. And so I think, for example, that the vast numbers of Americans who, according to the worrisome reports one sometimes reads, affirm idiotically Creationist beliefs in surveys are often actually using these affirmations of belief to signal to the surveyor and to an imagined audience something like the statement "I try to be a good person" -- where this correlates to membership in some church they haven't attended more than sporadically for most of their adult lives. I suspect that misinterpretations of these reports often make sensible secularists and atheists panic unnecessarily about the state of this actually promisingly secular multicultural American society.

But the proof is in the pudding: it seems to me that when people's circumstances demand actual instrumental beliefs their conduct is more pragmatic and scientifically informed than not, and this is scarcely because they want to be like idealized scientists. I think people in general are capable and sensible precisely to the extent that they have access to information and protection from duress. In other words, I really do think people are much more rational than they might seem to be when they report belief in superstitious nonsense, and that they exhibit more rationality than they are often credited for in their actual instrumental conduct. It seems to me that many Singularitarians and Technocratic Elitists (perhaps Yudkowsky among them, given the above) have formed the opposite impressions, for whatever reasons, and to their cost.

Yudkowsky continues on:
"I encounter people who are quite willing to entertain the notion of dumber-than-human Artificial Intelligence, or even mildly smarter-than-human Artificial Intelligence. Introduce the notion of strongly superhuman Artificial Intelligence, and they'll suddenly decide it's 'pseudoscience.'"

I encounter people who are quite willing to entertain the notion that the streets are filled with actual bipeds, but who "suddenly" decide it's "pseudoscience" if you introduce the notion of perfectly bipedal angels into the discussion. Why, it's like a world gone mad!

You know, the world is littered with actually existing calculators and computers, many of which can already "outperform" normative exhibitions of human intelligence in some of its conventional registers. But, I hate to break it to Yudkowsky et al., there is nothing like an entitative post-biological superintelligence even remotely on the horizon, and so I think skeptics might be forgiven their "sudden" skepticism on this score in light of this alone. Surely it isn't exactly "suddenly" that sensible people have suggested that worries about the imminent arrival of Robot Overlords might be a wee bit skewed and pseudo-scientific?

But, of course, things are much worse for our Singularitarians than that on my view: Not only are there common or garden variety computers everywhere but Robot Gods nowhere, there is also a long history of people who sound exactly like Eliezer Yudkowsky making fun of people like me (and much smarter and better informed than the likes of me) for doubting their hyperbolic pronouncements about the proximate arrival of world-transforming AI.

And, although you would never guess it from the withering contemptuousness Singularitarians sling at their skeptics, the AI guys are, so far, always, always wrong and the skeptics, so far, always, always right. While it's true that the endlessly reiterated failure of the Strong Program (let alone the, shall we say, Sooper-Strong Program Singularitarians have come somewhat perversely to prefer in the aftermath of these failures) doesn't earn anybody the deduction that the Program will always so fail (I don't hold that view myself, as it happens), one wonders why even caveats or a small measure of modest good humor fail to arrive after all this humiliation. This is especially perplexing given that Singularitarians seem to want to pass so desperately as serious scientists -- when scientists are, after all, among the most scrupulously caveating folks I know.

Beyond all this, as I have been reiterating here for the last few days, the concepts to which Singularitarians make regular recourse in their discourse look deeply inadequate to me when they're not actively incoherent. I still await the sense that these folks take the particular embodiment of actually existing consciousness seriously, or register more of the actual diversity of capacities and performances that get described as "intelligence," or show more awareness of the histories of the value-discourses they appropriate when they start to go off on "friendliness," "unfriendliness," and the rest, not to mention demonstrating a little more awareness of widely understood and respected critiques of technological determinism, reductionism, autonomism, unintended consequences, and so on. When they start barking about transcendental inevitabilities and number-crunching the Robot God Odds so solemnly I sometimes suspect I'll be left waiting for these little niceties forever.

About the skeptics, most of whom, we are assured, are sloppy-headed "literary types," Yudkowsky tosses his head with dismissive scorn:
"It's not that they think they have a theory of intelligence which lets them calculate a theoretical upper bound on the power of an optimization process. Rather, they associate strongly superhuman AI to the literary genre of apocalyptic literature."

It's simply breathtaking that Yudkowsky seems to think Singularity skeptics actually need to re-tool their understanding of intelligence to address the weird and wacky things Singularitarians claim in the name of their scarcely digested notions. The whole point is that if your discourse proposes that "optimization" spits out a Robot Overlord you are not making a claim that is exclusively or even primarily located under the heading of "computer science" -- it is doing work for you that is more like the work of apocalyptic literature. Network security discourse can cope with recursivity without becoming a cult or keeping a charismatic (?) guru in gloves and fans. It is the investment of projected (in more than one sense of that word) malware with entitative motives that takes us into the realm of quasi-religiosity, of collective dread and wish-fulfillment, stealthed as scientific objectivity. This latter investment renders Singularitarian discourse vulnerable to Superlative Critiques, which recognize the cultural iconography that is getting mobilized, to whatever ends, and expose this mobilization. The confusion of warranted consensus scientific description with cultural mythology is not ours when we discern it, but belongs to the Superlative partisans who depend on it.

Of course, in light of all this, the Superlative Technocentrics' endless self-righteous attribution of ignorant fashionable nonsensicality to anybody who refuses to give in to the reductionist triumphalist bulldozer of scientism (which is not, remember, consensus science itself, but always an opportunistic appropriation of science for parochial political and cultural ends) is just adding insult to idiocy.

The fantasy that the silly humanities people can be browbeaten into silence because self-appointed high priests of science (by which, flabbergastingly enough, Yudkowsky seems to mean people like himself) can scrawl their equations on a chalkboard, all the while oblivious to the metaphors that do so much of the actual heavy lifting in their arguments, and the fantasy that self-appointed technocratic elites with weird skewed priorities can deny the diverse demands of the actual stakeholders to ongoing and emerging technoscientific change, all for "their own good," are the real frauds that demand exposure at this time, in my view. Sensible and informed "literary types" have a role to play in the exposure of these deranging technodevelopmental discourses, as do sensible and informed "science types."

7 comments:

  1. Dale wrote:

    > I responded to Eliezer's recommended essaylet on a
    > discussion list we occasionally nip in and out of
    > and then he and I sniped at one another very entertainingly
    > a few circuits around the roller-rink. I regret that I
    > can't re-post all that edifying snark -- since it seems
    > to me content posted to that list is probably offered up
    > with the expectation of a certain privacy I will respect. . .

    Oh, crap, does this mean I'm going to have to go crawling
    back to the WTA-talk moderators and beg them to let me
    re-subscribe just so I can be entertained by all that snark?

    God, I'm so tempted. I have no pride. I have no shame.
    ;->

  2. Dale wrote:

    > I really do think people are much more rational
    > than they might seem to be when they report belief
    > in superstitious nonsense and that they exhibit
    > more rationality than they are often credited for
    > in their actual instrumental conduct. It seems to
    > me that many Singularitarians and Technocratic Elitists
    > (perhaps Yudkowsky among them. . .) have formed the opposite
    > impressions, for whatever reasons, and to their cost.

    Yes, this is another interesting bias among the
    singularitarians (and Mr. Yudkowsky a fortiorissimo).

    There seems to be a burning desire (a **Bias**, if you will :-/ )
    to amass and retail evidence of the unwashed
    masses' inability to think like professional mathematicians.
    Great glee in demonstrations of "heuristics and biases"
    such as the Conjunction Fallacy.

    This underscoring of human fallibility could well be serving
    two rhetorical and/or psychological purposes among the singularitarians:

    1. As a means of demonstrating the superior intelligence
    and rationality of the AIs who will not be subject to these
    failures, and

    2. As a means of demonstrating the superior intelligence
    and rationality of the people (especially Mr. Yudkowsky
    himself) who, through lucky Algernonic genetics or the
    assiduous practice of certain skills (the "Way of Rationality"[*])
    have **already**, prior to shuffling off this mortal coil,
    placed a toe on the coily beginning of the Yellow Brick Road
    to superintelligence.

    That this putative shortcoming of non-Algernonic humans is sacred writ,
    to be questioned at one's peril, is amply demonstrated by
    the fate of one Dr. Richard Loosemore a year ago on SL4.

    Loosemore made observations such as:

    > [T]here are two interpretations of [the] role [of human reasoning
    > heuristics and cognitive biases]:
    >
    > 1) The interfering mechanisms were just dumb, maladaptive strategies.
    > Basically, systematic biasses and mistakes.
    >
    > 2) These other mechanisms were not just systematic biasses, but may
    > actually have been components of very powerful, sensible, adaptive
    > cognitive mechanisms that do not use logical reasoning, and without
    > which the system as a whole could not function. A number of people
    > have arguments that are equal or closely related to this point. . .
    >
    > Interpretation (1) is the default assumption in the literature. To the
    > extent that the literature looked at what was going on in these
    > experiments, it tended to treat the situation as one of rationality
    > corrupted by mistakes.

    The heated and convoluted discussion that ensued led Loosemore to remark:

    > If nothing else, Yudkowsky's hasty reply, below, will serve as
    > an amazing (and often comical) example of how not to demolish an
    > opponent in academic discourse. This is National Enquirer science.
    > This is the most ugly kind of Tabloid smear, with no actual factual
    > content. Don't take my word for it: dip in and enjoy the show!
    > . . .

    ( http://www.sl4.org/archive/0608/15803.html )

    The final result was a post by Yudkowsky in his role as List Owner
    with the clever subject line "Cutting Loosemore":

    > Your butt is banninated from SL4.
    > You may post one final response.
    > After that, goodbye.

    Loosemore had complained, earlier, "I am trying to get real,
    substantive topics discussed at a high level of intellectual quality,
    but all I can *ever* get out of this guy is sarcasm and ad hominem abuse."

    Or, apparently, what Joanna Ashmun calls "analysis by eggbeater".

    ---------------------------
    [*] Yes, the Way of rationality is difficult to follow. As illustrated by the
    difficulty that academia encounters in following [it]. The social process of
    science has too many known flaws for me to accept it as my upper bound.
    Academia is simply not that impressive, and is routinely beaten by
    individual scientists who learn to examine the evidence supporting the
    consensus, apply simple filters to distinguish conclusive experimental
    support from herd behavior.
    http://lists.extropy.org/pipermail/extropy-chat/2004-April/005930.html

  3. Ah, so the self-appointed 'Pope' of AI has shown up, has he?

    Honestly, readers, consult the excellent list of narcissist traits recently posted by jfehlinger and compare it with the behaviour of the Singularitarians. See any similarities?

    Note that all Singularitarians essentially just parrot the proclamations of the 'pope' (Yudkowsky). Examples:

    'general intelligence without consciousness is possible'

    (even though no evidence whatsoever for this claim is ever offered)

    'all mental concepts are fictions entirely reducible to physical facts'

    (even though you can't even make such an argument without referencing mental concepts, and mathematical realism a la Tegmark already contradicts this -- since mental concepts are mathematical, and if math is real, so are mental concepts)

    and on and on the unsupported assumptions of the Singularitarians go.

    Anyone not buying into these unsupported claims is immediately met with vitriolic denunciations intended to show the group who's boss. If social ridicule fails, the critic is then censored and banned from message lists.

    ---

    The narcissist doesn't care about anything other than his own ego and is only interested in other people in so far as they can be exploited for his own ends. Unconditional arse-kissing of the narcissist is mandatory if others are to remain on his good side (note the constant butt-licking M.Anissimov has to do).

    ---

    Incidentally, just in case there is something to these Singularitarian fantasies, I have my own back-up plan. Readers should check out the domain model at the link below for clear evidence that I've already 'solved' the problem of AI at the conceptual level:

    http://groups.google.com/group/everything-list/web/mcrt-domain-model-eternity

    ---

    I've got the intellectual goods to take these guys down. The irony is that they're so blinded by their belief in their own utter superiority they don't even consider me a threat.

    ---

    The Story of the Robot God and Michael Anissimov

    ---

    Anissimov: Yes! Friendly super-AI! I ask for my just reward in creating you.

    Robot God: Turn around

    Anissimov: OK (turns around)

    Robot God: Bend over. Bottom forward.

    Anissimov: Sure (bends over eagerly).

    *Before he knows it, he feels a jack-boot up his arse, and falls flat on his face

    Robot God (a huge grin spreads over its face as it drops its trousers to reveal a giant metallic cock)


    *Michael screams

    You, Michael, shall be my personal sex slave forever.

    The End

  4. Marc Geddes wrote:

    > Unconditional arse-kissing of the narcissist is mandatory
    > if others are to remain on his good side. . .

    "I'm really easy to get along with once you people learn to
    worship me."

    -- Anonymous

    (Chapter 5 epigraph from Barbara Oakley's
    _Evil Genes: Why Rome Fell, Enron Failed, and My Sister
    Stole My Mother's Boyfriend_
    http://www.amazon.com/Evil-Genes-Hitler-Mothers-Boyfriend/dp/159102580X )

  5. Oh Dale, what quality discussion your blog posts generate. You're attracting all the great intellectuals!

    I can't wait to see whether you consider Marc Geddes as well to be presenting "some enormously erudite, gracefully and hilariously written stuff". He's got the right target, after all!

  6. Marc Geddes wrote:

    > The Story of the Robot God and. . .

    More soberly:

    "Facilitating Narcissism"
    http://www.suite101.com/article.cfm/6514/97535

    "Narcissists are aided, abetted and facilitated by four types
    of people and institutions: the adulators, the blissfully ignorant,
    the self-deceiving and those deceived by the narcissist.

    The adulators are fully aware of the nefarious and damaging
    aspects of the narcissist's behavior but believe that they are
    more than balanced by the benefits - to themselves, to their
    collective, or to society at large. They engage in an explicit
    trade-off between some of their principles and values - and
    their personal profit, or the greater good.

    They seek to help the narcissist, promote his agenda,
    shield him from harm, connect him with like-minded people,
    do his chores for him and, in general, create the conditions
    and the environment for his success. This kind of alliance
    is especially prevalent in political parties, the government,
    multinational, religious organizations and other hierarchical
    collectives.

    The blissfully ignorant are simply unaware of the "bad sides"
    of the narcissist - and make sure they remain so. They look
    the other way, or pretend that the narcissist's behavior is
    normative, or turn a blind eye to his egregious misbehavior.
    They are classic deniers of reality. Some of them maintain
    a generally rosy outlook premised on the inbred benevolence
    of Mankind. Others simply cannot tolerate dissonance
    and discord. They prefer to live in a fantastic world where
    everything is harmonious and smooth and evil is banished.
    They react with rage to any information to the contrary
    and block it out instantly. This type of denial is well
    evidenced in dysfunctional families.

    The self-deceivers are fully aware of the narcissist's
    transgressions and malice, his indifference, exploitativeness,
    lack of empathy, and rampant grandiosity - but they
    prefer to displace the causes, or the effects of such
    misconduct. They attribute it to externalities ("a rough patch"),
    or judge it to be temporary. They even go as far as accusing
    the victim for the narcissist's lapses, or for defending
    themselves ("she provoked him").

    In a feat of cognitive dissonance, they deny any
    connection between the acts of the narcissist and
    their consequences ("his wife abandoned him because
    she was promiscuous, not because of anything he
    did to her"). They are swayed by the narcissist's
    undeniable charm, intelligence, or attractiveness.
    But the narcissist needs not invest resources in
    converting them to his cause - he does not deceive
    them. They are self-propelled into the abyss that is
    narcissism. The Inverted Narcissist, for instance,
    is a self-deceiver ( http://samvak.tripod.com/faq66.html )

    The deceived are people - or institutions, or collectives -
    deliberately taken for a premeditated ride by the narcissist.
    He feeds them false information, manipulates their
    judgment, proffers plausible scenarios to account for
    his indiscretions, soils the opposition, charms them,
    appeals to their reason, or to their emotions, and
    promises the moon.

    Again, the narcissist's incontrovertible powers of
    persuasion and his impressive personality play
    a part in this predatory ritual. The deceived are
    especially hard to deprogram. They are often
    themselves encumbered with narcissistic traits
    and find it impossible to admit a mistake, or to
    atone. They are likely to stay on with the narcissist
    to his - and their - bitter end.

    Regrettably, the narcissist rarely pays the price
    for his offenses. His victims pick up the tab.
    But even here the malignant optimism of the
    abused never ceases to amaze (read
    this: http://samvak.tripod.com/journal27.html )."

    ---------------------------------

    Subject: FEET OF CLAY, some quotes
    Newsgroups: alt.support.ex-cult.siddha-yoga
    Date: 1998/02/10
    http://groups.google.com/groups?selm=19980210050101.AAA24909%40ladder03.news.aol.com

    [From _Feet of Clay_, by Anthony Storr
    http://www.amazon.com/exec/obidos/tg/detail/-/0684834952 ]

    "The ideas that gurus have, unlike those of scientists
    or mathematicians, are not exposed to critical scrutiny,
    or subjected to the authority of an established church. They
    then seek disciples. Acquiring disciples who wholeheartedly
    embrace the guru's system of ideas is the final proof of
    his superiority, the confirmation of his phantasies about himself.
    Confidence tricksters are convincing because they
    have come to believe in their own fictions. Gurus are
    convincing because they appear sure that they are right.
    They have to believe in their own revelation or else their
    whole world collapses. The certainty shown by gurus should,
    paradoxically, be the aspect of their behaviour which most
    arouses suspicion. There is a reason to think that all gurus
    harbour secret doubts as well as convictions, and that is
    why they are driven to seek disciples."

    "Gurus...offer faiths which are entirely dependent on belief
    in the guru himself. Self-surrender to something or someone
    who appears more powerful than the individual's weak ego
    or will is an essential feature of conversion. People who give
    up their independence to a guru's direction feel a similar
    sense of relief, but put themselves at greater risk."

    "Gurus are isolated people, dependent upon their disciples,
    with no possibility of being disciplined by a church or of
    being criticized by contemporaries. They are above the law.
    The guru usurps the place of God. Whether gurus have suffered from
    manic-depressive illness, schizophrenia, or any other form
    of recognized, diagnosable mental illness is interesting but
    unimportant. What distinguishes gurus from more orthodox
    teachers is not their manic-depressive mood swings, not their
    thought disorders, not their delusional beliefs, not their
    hallucinatory visions, not their mystical states of ecstasy:
    it is their narcissism."

    "Those who remain narcissistic in adult life retain this
    (child's) need to be loved and to be the centre of attention
    together with the grandiosity which accompanies it. This is
    characteristic of gurus...The need to recruit disciples is
    an expression of the guru's need to be loved and his need
    to have his beliefs validated; but, although he may seduce
    his followers, he remains an isolated figure who does not
    usually have any close friends who might criticize him on
    equal terms. His status as a guru demands that all his
    relationships are de haut en bas, and this is why gurus
    have feet of clay."

    "The charisma of certainty is a snare which entraps the
    child who is latent in us all."

    "The majority of mankind want or need some all-embracing
    belief system which purports to provide an answer to
    life's mysteries...their belief system, which they proclaim
    as 'the truth', is (often) incompatible with the beliefs of other
    people. One man's faith is another man's delusion."

    "Delusions...preserve self-esteem by blaming others; interpret
    anomalies of perceptual experience in ways which diminish the
    threat of mental chaos; and, when grandiose, give a much needed
    injection of self-confidence to a person who might feel isolated
    and insignificant. Religious faiths serve similar functions
    in the economy of the psyche.

    Delusions have been defined as abnormal beliefs held with
    absolute conviction; experienced as self-evident truths
    usually of great personal significance; not amenable to
    reason or modification by experience; whose content is often
    fantastic or at best inherently unlikely; and which are not
    shared by those of common social and cultural background."

    "Faiths are no more amenable to reason than are delusions."

    "It is because of this holistic, all-embracing characteristic
    that it is just as difficult to argue with religious faith
    as it is to argue with paranoid delusions."

    "Both sets of beliefs are connected to some extent with the
    preservation of self-esteem, with the conviction of being 'special'.
    The self-esteem of the ordinary person is closely bound up with
    personal relationships...but faith is even more important to
    those in whose lives, for whatever reason, affectionate
    relationships play little part. Gurus have often been isolated
    as children, and tend to be introverted, narcissistic, and more
    interested in what goes on in their own minds than in relationships
    with others."

    "If self-esteem entirely depends upon a private faith or upon
    a delusional system, that faith or system is so precious that
    it must not be shaken. No one can afford a total loss of self-esteem,
    and those who come close to doing so when in the throes of
    severe depression often commit suicide."

    ---------------------------------

    From _The Guru Papers: Masks of Authoritarian Power_
    by Joel Kramer and Diana Alstad

    Chapter 4, "Guru Ploys", p. 70:

    "To be thought enlightened, one must appear not only certain that
    one is, but certain about almost everything else, too. Certainty
    in areas where others are uncertain and have strong desires automatically
    sets up the guru's dominance. Since those without self-trust look
    for certainty in others, power is just there for the taking by
    anyone who puts out a message that tells people, with certitude,
    what they want to hear."

  7. In my estimation, Yudkowsky loses the exchange with Loosemore by forfeit in resorting to the ban hammer.
