Sunday, May 08, 2011

More Signs of the Singularity! (Damn You Auto Correct! Edition)

Damn You Auto Correct!

Among the transhumanists and singularitarians and techno-immortalists of the Robot Cult, there are many "serious futurologists" who seriously believe that a real person cartoonishly represented in crappy code that like-minded futurological sociopaths happen to be too inept to distinguish from the real person would through such an operation be techno-transubstantiated and techno-transcendentalized thereby into immortality in cyber-heaven (they have the pie charts). It is of course no easy matter to imaginatively inhabit the mind of a person so flabbergastingly deluded or deranged as that on the best of days, but dipping into a site like DYAC for a few minutes makes entertaining such facile fancies that much more hilariously impossible still for all but the Truest True Believers in the coming Robot God and the Singularity that Will End History and Deliver the Techno-Faithful unto Holodeck Heaven.

11 comments:

  1. > [T]here are many "serious futurologists" who seriously believe
    > that a real person cartoonishly represented in crappy code that
    > like-minded futurological sociopaths happen to be too inept to
    > distinguish from the real person would through such an operation
    > be techno-transubstantiated and techno-transcendentalized thereby
    > into immortality in cyber-heaven. . . It is of course no easy matter
    > to imaginatively inhabit the mind of a person so flabbergastingly
    > deluded or deranged. . .

    Although (if that kind of entertainment is to your taste) it is
    extremely easy, and pleasant, to suspend disbelief during the
    course of an Iain Banks or Greg Egan novel. I certainly don't
    bat an eye at

    "[Cass] unstrapped herself and drifted away from the bed.
    She didn't need to wash, or purge herself of wastes. From the
    moment she'd arrived, as a stream of ultraviolet pulses with a
    header requesting embodiment on almost any terms, the Mimosans had
    been polite and accommodating: Cass had been careful not to
    abuse their hospitality by pleading for frivolous luxuries.
    A self-contained body and a safe place to sleep were the only
    things she really needed. . . Demanding the right to eat and
    excrete, here, would have been as crass as insisting on slavish
    recreations of her favorite childhood meals, while a guest at some
    terrestrial facility."

    -- _Schild's Ladder_

  2. On the other hand, Egan himself indulges in considerable snark
    towards contemporary >Hists in the more recent _Zendegi_:


    "'I'm Nate Caplan. . . My IQ is one hundred and sixty. I'm
    in perfect physical and mental health. And I can pay you half
    a million dollars right now. . . I know I look skinny. . .
    but I have no lipid deficiencies. . . And I'm willing to give
    up the caloric restriction. . .'

    'How did you get my address?' . . .

    'You'll get the money, and it will be untraceable. All I want
    from you in return is a guarantee that when the time comes,
    I'll be the one.'

    Nasim didn't know where to start. . . '**If** the Human
    Connectome Project goes ahead, the first maps will be utterly
    generic. We'll be tracing representative pathways within and
    between a few dozen brain regions, and then extrapolating
    from that. . . If you really want to kill yourself and donate
    your organs to science, go right ahead, but even if I took
    your bribe and somehow managed to get your brain included in
    the project . . . you'd have no more chance of waking up
    in cyberspace than if you'd donated a kidney.' . . .

    'But ten years down the track, when you've got the bugs ironed
    out, **I want to be the first**. When you start recording full
    synaptic details and scanning whole brains in high resolution --'

    '**Ten years**?' Nasim spluttered. 'Do you have any idea how
    unrealistic that is?'

    'Ten, twenty, thirty . . . whatever. You're getting in on the ground
    floor, so this is my chance to be there with you. . .
    Just give me your email address.'

    'Absolutely not.' . . .

    'You can always reach me through my blog!' he panted. 'Overpowering
    Falsehood dot com, the number one site for rational thinking
    about the future --'

    . . .

  3. 'Hey, they're talking to Zachary Churchland!'. . .

    Churchland was an octogenarian oil billionaire who had raised the
    possibility of funding his own brain-mapping project, in competition
    with any government effort. The press had started calling him
    'the Craig Venter of the HCP'. . . The neuroscientists advocating
    the HCP treated him with kid gloves, as they would any potential
    sugar daddy. . .

    'Congressman, the ultimate goal of my project would be universal
    immortality,' Churchland declared. . . 'If there are public health
    benefits along the way, then that's well and good, but all of public
    health becomes a minor sub-problem when viewed in the light of the
    digital migration.' . . .

    'What timescale do you anticipate for that development? For what
    you call "personalisation"?

    'I am not an expert. . . [b]ut the people I have consulted on the
    matter suggest that it might be possible within twenty or thirty
    years.'

    'So this is not a development from which you would hope to benefit
    yourself, sir?'

    'On the contrary, Congressman,. . . I am unlikely to see out the
    year, but upon my death my body will be frozen. If I do set up a trust
    to support this research, the deeds of that trust will expressly state
    that its goals include my own digital resurrection. . .

    And I would not wish to mislead this committee into thinking that I
    have definitely resolved to fund a project of the kind we are discussing.
    In fact, over the last month or so I have received some very persuasive
    representations from a group who believe that it might be at best
    inefficient and at worst highly dangerous to proceed in this fashion.'

    'Can you elaborate, sir?'

    'I have been invited to fund an enterprise known as the Benign Superintelligence
    Bootstrap Project,' Churchland explained. 'Their aim is to build an
    artificial intelligence capable of such exquisite powers of self-analysis
    that it will design and construct its own successor, which will be armed
    with superior versions of all the skills the original possessed.
    The successor will then produce a still more proficient third version,
    and so on, leading to a cascade of exponentially increasing abilities.
    Once this process is set in motion, within weeks -- perhaps within
    hours -- a being of truly God-like powers will emerge.'

  4. Nasim resisted the urge to bury her face in her hands. . . The uploading
    advocates who'd sold Churchland on an imminent digital resurrection
    hadn't lost their critical faculties entirely, but their penchant for
    finessing away any 'mere technical problems' that might stretch out the timetable
    was, nonetheless, intellectually corrosive, to the point where the next
    step probably didn't seem like such a great leap any more: hand-waving **all**
    practicalities out of existence, transforming the cyber-eschatologists'
    rickety scaffolding of untested assumptions into a cast-iron stairway to
    heaven. . .

    'Rather than trust humans to perfect the brain-mapping technology that we've
    been discussing, I am leaning towards putting my fate in the hands of
    an artificial God, for whom such problems will be trivial. The Benign Superintelligence
    will rule the planet with wisdom and compassion, eliminating war, disease,
    unhappiness, and of course, death. I am told that it will probably
    disassemble most of the material in our solar system in order to construct
    a vast computer that will exploit all the energy of the sun. Perhaps it
    will spare the Earth, or perhaps the Earth will be reconstructed, more perfectly,
    within that computerised domain.' . . .

    '"Rule the planet"? Am I to understand that you're contemplating funding a body
    that advocates overthrowing the lawful government of the United States?'

    Churchland required more oxygen before replying, 'Keep your shirt on, Congressman.
    There's no point fighting it, and the alternative would be far worse.
    Imagine if one of our country's enemies did this first. Imagine the kind of
    despotic superintelligence that Al Qaeda would create.'

    'Mr. Churchland,' Fitzwaller said evenly, 'does it not occur to you that most
    people on the planet would prefer not to have their affairs dictated by
    an artificial intelligence of any kind?'

    'That's too bad, Congressman,' Churchland retorted, 'because I am coming to
    the view that we probably have no choice.'

    Judith stormed into the conference room. . . For a moment Nasim assumed that
    she'd been watching the same feed, but then it became clear. . . that she
    was oblivious to the sight of half the HCP's potential funding sprouting wings
    and flying away. She was livid, but it had nothing to do with Churchland's
    deathbed embrace of Bullshit Squared.

    . . .

  5. Zachary Churchland had died three weeks before and descended into the frosty
    limbo of an Alcor cryonics vault. He had left the bulk of his estate to the
    Benign Superintelligence Bootstrap Project, having finally concluded that he
    couldn't trust his immortal soul to human hands.

    'I heard someone's contesting his will,' Nasim recalled. . .

    'Third wife. Actually, I'm helping her fund the case,' Caplan explained
    smoothly. . . 'The ongoing litigation should help keep the bequest out
    of the Superintelligence Project's hands for quite a while.' . . .

    'Why should you care who gets Churchland's money? It's either Bullshit Squared,
    or the wives. It's lost to the HCP.'

    'No doubt it is,' Caplan conceded, 'but I don't want the superintelligence to come
    into existence before I'm uploaded. It's very important to me that I'm the
    first transcendent being in this stellar system. I can't risk having to compete
    with another resource-hungry entity: I have personal plans that require at
    least one Jovian mass of computronium.'

    'Really? I have "personal plans" that require Naveen Andrews and a bottle of
    coconut oil, but I don't expect they're going to happen either.'

    Caplan was bemused. 'Why are you so hostile?'

    'I don't know,' Nasim confessed. 'Maybe it's because I've had enough experience
    of deluded fundamentalists to last a lifetime.'

    'Well, you're wasting your energy,' Caplan replied loftily. 'One way or
    another, everything I speak of will come to pass. You can either join us,
    or be left behind.'

    . . .

  6. 'The boss sent me on a head-hunting trip a few years ago. . . I went to about
    fifty campuses and start-ups looking for researchers we could hire. . .'

    'Was that when you visited the Superintelligence Project?'. . .

    'Yeah. No AI there.' She had spent a day at their Houston complex, curious to
    find out what they'd done with Zachary Churchland's billions once his bequest
    had made it through the Texan version of _Bleak House_. But the sum total
    of their achievement had amounted to a nine-hundred-page wish-list dressed up
    as a taxonomy, a fantasy of convenient but implausible properties for a vast
    imaginary hierarchy of software daemons and deities. The whole angelic
    realm had been described with the kind of detail often lavished on a
    game-world's mythical bestiary, but Nasim had seen no evidence that these
    self-improving cyber-djinn had any more chance of being brought to life
    than the denizens of the Dungeons and Dragons _Monster Manual_.

    . . .

  7. When Nasim went upstairs she had her knowledge-miner show her an updated
    news summary. . .

    In the dregs of the knowledge-miner's sweep was the news that the Benign
    Superintelligence Bootstrap Project had issued a video press release, with
    their public affairs officer, Michelle Bello, interviewing their director,
    Conrad Esch. The topical question addressed was whether the BSBP had
    gleaned anything from the Human Connectome Project that might prove to
    be of more lasting significance than this brief spurt of interest in an
    online game.

    Apparently, the answer was yes. By carefully studying the HCP data over
    the last few months, the Superintelligence Project had acquired vital clues
    that would allow it to construct a Class Three Emergent Godlet within
    five years.

    'And when that happens, what can we expect?' Bello asked.

    'Within two or three hours, the planet will be entirely in the hands of
    the Benign Superintelligence. Human affairs will be reorganized, within
    seconds, into their optimal state: no more war, no more sorrow, no
    more death.'

    'But how can we be sure of that?' Bello probed fearlessly. 'Computers are
    capable of all kinds of errors and mistakes.'

    'Computers built and programmed by humans, yes,' Esch conceded. 'But remember,
    **by definition**, every element in the ascending chain of Godlets will be
    superior to its predecessor, in both intelligence and benignity.
    We've done the theoretical groundwork; now we're assembling the final
    pieces that will start the chain reaction. The endpoint is simply a matter
    of logic: God is coming into being. There is no disputing that, and
    there is no stopping it.'

    . . .

  8. Caplan preferred to meet in augmented reality, but Nasim wasn't set up
    for that at home, so they settled for plain audio. . .

    [H]e interrupted her halfway through. 'Why doesn't your friend just
    freeze himself? All cancers will be treatable in a decade.'

    Cancers, maybe; Nasim doubted that being frozen to death would be cured
    so quickly. 'He has a son,' she said. 'That's the whole point. He
    wants to raise his son, not come back when Javeed's an adult.'

    'Well, he can't freeze the kid,' Caplan mused. 'That would be illegal.
    Unless Iran has much more progressive legislation on these things than
    we have.'

    Nasim struggled to reorganise her tactics. How did you get through to
    someone whose entire world view had been moulded by tenth-rate science
    fiction? Empathy for Javeed was out; Caplan probably believed that the
    only consequence of being orphaned at six was that you tried harder than
    anyone else to reach the top of your class in space academy. . .

    'Motivated volunteers aren't hard to find,' Caplan replied.

    'Maybe not,' Nasim conceded, 'but the people who'd take the bait if
    you sprinkled some buzzwords around the net are your fellow wannabe-immortals,
    who'll expect perfect copies of their minds to wake up in cyberspace.
    Martin has no such illusions; he understands that the Proxy will have
    massive limitations. He doesn't imagine that we can make him live
    forever; he just wants us to use his brain to craft some software that can
    do a certain job.'

    'So when exactly did I become the Make-A-Wish Foundation?' Caplan protested
    irritably.

    'This isn't charity,' Nasim insisted. 'It would yield valuable information
    for both of us. . .'

    She waited, wondering if Caplan was going to make her swear not to turn Martin's
    Proxy into a transcendent being who would rob him of his rightful place as lord
    of the solar system.

    He said, 'If it doesn't work, are you prepared to clean up the mess? To put
    your botched creation out of its misery?' . . .

    Nasim steeled herself. She said, 'I'll spell out all the risks to Martin;
    in the end he's the one who'll have to decide the fate of anything derived
    from his mind. But yes, if it comes down to it, I'm prepared to clean
    up the mess.'

    . . .

  9. 'So what do you think happened in Houston?' she said. . .

    Caplan said, 'I doubt it was the Cis-Humanist League. I'm thinking fundamentalist
    Christians.'

    '**Christians**?'

    'The Superintelligence Project stated their goals in explicitly religious language,'
    Caplan pointed out. '"God is coming into existence. We're building Him right
    here." What did they expect, trespassing on the territory of people with
    strongly held ideas about the meaning of the word?'

    'But they've been talking like that for years,' Nasim protested. 'Why should anyone
    start taking them seriously now?'

    'The HCP,' Caplan replied. . . 'Some of the credibility. . . would have rubbed off
    on them, in a layperson's eyes. You must have seen their me-too press release,
    saying they'd have God up and running within five years. For people who'd thought
    they were full of nothing but blasphemous hot air, it might have started looking
    too close for comfort. The Antichrist was coming to rule over the nations.' . . .

    'I thought the whole idea of religious prophecy was that it was . . . **prophecy**.
    If it starts to look as if the Beast will be born in a computer in Houston,
    isn't it the role of virtuous believers to live through his reign, stay true to
    their faith, and reap their reward in the end? You don't drive a truck full of
    fertiliser into the path of pre-ordained events that need to happen before the
    Second Coming, however unpleasant they might be.'

    Caplan said, 'Maybe they took their theology lessons from Schwarzenegger movies.
    Or maybe I'm wrong, maybe it was someone else who thought the side-loads tipped
    the balance and made the chance that the Superintelligence Project could succeed
    start to look like too great a risk. A government agency? A foreign power?'
    He shrugged.

    'Outside the project itself,' Nasim replied, 'apart from Zachary Churchland, the
    only person I know of who ever took them seriously was you.'

    Caplan laughed. . . 'Yeah, I was pretty naive back then.'

    'So what changed your mind?'

    'Watching them turn five billion dollars into nothing but padded salaries and
    empty verbiage.'


    Ouch!

  10. > Damn You Auto Correct!

    In William Gibson's 1999 _All Tomorrow's Parties_ (the
    last book in the "Bridge trilogy"
    http://en.wikipedia.org/wiki/Bridge_trilogy ),
    the bored Cody Harwood (a vaguely Bill-Gatesian
    "richest man in the world" figure) has some fun blaming
    his real-time translation software for the vulgar insults he
    offers to a Japanese journalist during a telephone interview
    about his "Lucky Dragon Nanofax" chain-store replication service
    (a high-tech Kinko's using nanotech matter-assemblers):

    "'People are fascinated by the pointlessness of it. That's
    what they like about it. Yes, it's crazy, but it's **fun**.
    You want to send your nephew in Houston a toy, and you're in
    Paris, you buy it, take it to a Lucky Dragon, and have it
    re-created, from the molecules up, in a Lucky Dragon in
    Houston . . . What? What happens to the toy you bought in
    Paris? You keep it. Give it away. Eviscerate it with
    your teeth, you tedious, literal-minded bitch. What?
    No, I didn't. No, I'm sorry, Noriko, that must be an
    artifact of your translation program. How could you imagine
    I'd say that?' Harwood stares straight ahead, stunned
    with boredom. 'Of course I want to give the interview.
    This is an exclusive, after all. And you were my first
    choice.' Harwood smiles as he calms the journalist, but
    the smile vanishes the instant she begins to ask her next
    question.

    'People are frightened of nanotechnology, Noriko. We know
    that. Even in Tokyo, seventeen-point-eight percent of your markedly
    technofetishistic populace refuses to this day to set foot
    in a nanotech structure. Here on the coast, I'd point to
    the example of Malibu, where there's been a very serious
    biotech accident, but one which is entirely unrelated to
    nanotech. It's actually being cleaned up with a combination
    of three smart algae, but everyone's convinced that the
    beaches are alive with invisible nanobots waiting to crawl
    up your disagreeable pussy. What? "Unfriendly cat"?
    No. There's something wrong with your software, Noriko.
    And I do hope you're only writing this down, because we
    negotiated the interview on a nonrecorded basis.
    If any of this ever turns up in any recorded form at all,
    you'll not be getting another. What? Good. I'm glad
    you do.' Harwood yawned, silently. 'One last
    question, then.' . . ."

    ;->

  11. I've had DYAC in my Google Reader since about December. Love that site (I sometimes comment as Dude Naw). And wouldn't you know, they mentioned Singularity in a recent post:

    http://damnyouautocorrect.com/8875/xbox-question/

    More signs of the Singularity!
