Using Technology to Deepen Democracy, Using Democracy to Ensure Technology Benefits Us All

Thursday, March 14, 2013

Deep Thoughts on Democracy from Eliezer Yudkowsky

All-Wrong "Less Wrong" Robot Cult Guru Wannabe Eliezer Yudkowsky:
The transhuman technologies -- molecular nanotechnology, advanced biotech, genetech, Artificial Intelligence, et cetera -- pose tough policy questions. What kind of role, if any, should a government take in supervising a parent's choice of genes for their child? Could parents deliberately choose genes for schizophrenia? If enhancing a child's intelligence is expensive, should governments help ensure access, to prevent the emergence of a cognitive elite? You can propose various institutions to answer these policy questions -- for example, that private charities should provide financial aid for intelligence enhancement -- but the obvious next question is, "Will this institution be effective?" If we rely on product liability lawsuits to prevent corporations from building harmful nanotech, will that really work? I know someone whose answer to every one of these questions is "Liberal democracy!" That's it. That's his answer. If you ask the obvious question of "How well have liberal democracies performed, historically, on problems this tricky?" or "What if liberal democracy does something stupid?" then you're an autocrat, or libertopian, or otherwise a very very bad person. No one is allowed to question democracy.
By way of introduction, let us notice first of all that Yudkowsky's formulation assumes that everybody already knows what "the transhuman technologies" are (that "et cetera" at the end of the laundry list is a dead giveaway), but fails to note that what these "technologies" have in common is that none of them exist to BE anything at all, transhumanoid or otherwise. Of course, biochemistry already operates at the nanoscale, as do some actually-existing materials techniques. After all, making ceramics could be construed as nanotechnology if you squint, but literally nothing new, clarifying, or useful is accomplished by saying so. Whether "advanced" biotech exists or not will depend on the criteria one is using to denote "advanced" (and Yudkowsky doesn't specify his, because of course everybody already knows what criteria drive transhumanoids in these matters). Much the same is true of "genetech," some of which does exist, but when it does it is called, not "transhumanism" but, you know, "medicine." And, of course, lots of inept computation -- infuriatingly incorrect autocorrect functions, idiotically inept video game background and companion characters, glorified tape-recorders in cars and hand-held devices -- gets called "Artificial Intelligence," though none of it is the least bit intelligent at all.

Nobody has to join a Robot Cult to understand or deliberate about the developmental stakes associated with actually existing materials or medical techniques, software security or user friendliness issues. And, of course, nobody in a Robot Cult cares in fact about any of them at all except to the extent that these actually-existing artifacts, techniques, and issues are re-imagined as oracles, as portents, as signs, as "burning bushes" pointing the way to a Superlative Future, populated by Superlative Variations of these artifacts and techniques and issues, hyperbolized into a pseudo-scientific quasi-theological techno-transcendental domain in which the faithful can contemplate cyborgic demi-divinity in the form of a super-intelligence beyond error and humiliation, super-powers beyond dis-ease and vulnerability, superabundance beyond struggle and limits.

Given the scale of this error, it may seem mere quibbling to point out as well that, strictly speaking, the non-existing Superlative artifacts and techniques that populate the Superlative imaginary of transhumanoid, singularitarian, and techno-immortalist Robot Cultists do not in fact "pose tough policy questions" to anybody. They do not exist to pose anything, tough or not, anywhere on earth. To the extent that the wish-fulfillment fantasies of Robot Cultists are symptoms of underlying pathologies, irrational fears of aging, mortality, or bodily life, for example, or irrational worries over their lack of total control over the conditions of life, then I suppose one should concede that "the transhuman technologies" pose tough questions for their therapists. And, of course, one might slightly reformulate Yudkowsky's point to ask instead whether there might be worthwhile (if not exactly "tough") questions to ask about the deranging impact of too much public attention and concern devoted to non-existing, irrationally symptomatic "transhuman technologies" on sensible deliberation over budgetary and regulatory priorities, over the quality of public technoscience literacy, and over the reasonable assessment of the stakeholder diversity of real-world developmental costs, benefits, and risks.

Notice that just as Yudkowsky assumes everybody already knows what "the transhuman technologies" ARE (setting aside the issue that non-existing non-things and non-techniques AREn't anything at all), he also assumes that everybody already agrees what would constitute an "enhancement" of intelligence, even though it is quite obvious that enhancement is always enhancement in the service of specific ends, that optimally enabling for some ends inevitably disables for other ends, that there is widespread and passionately contentious disagreement over which ends are indispensable to human flourishing, and so on. He makes similar assumptions about what constitutes "advanced" biotechnology, and about what constitutes "effective" institutions. One of the reasons that even those transhumanoid Robot Cultists who disdain association with eugenics (and it should be noted that many transhumanists insist on the association, proudly declaring themselves "liberal eugenicists," while many others espouse bioreductionist evo-devo formulations whose relations to eugenic worldviews they fail to grasp or disingenuously deny) might still be assigned the eugenic designation is that in advocating for an "enhancement" treated as a neutral technical term they disavow the substance of disagreement over the terms of human flourishing the better to impose their own parochially preferred views on the matter. Transhumanist and other futurological formulations continually de-politicize the field of actually ongoing technodevelopmental social struggle the better to prevail in their personal political positions under the sign of the a-political, the factual, the hygienic, the optimally efficient, the already universal.
That such transhumanists are simply indulging in a reactionary politics of naturalization while at once endlessly declaring themselves the enemy of all things "natural" isn't exactly paradoxical, since I daresay with transhumanists this is really more a matter of stupid people being too stupid to realize they're being stupid. But this customary futurological de-politicization of the politics of technodevelopment does provide a nice connection to the part of Yudkowsky's little number that initially attracted my attention, his also rather typically transhumanoid/singularitarian disdain of democracy.

I leave to the side the question whether we can really trust in the basic truthfulness of Yudkowsky's anecdotal interlocutor who in answer to every policy question presumably responds like a doll whose string has been pulled, "Liberal Democracy!" Given that such a nonresponsive response scarcely seems even grammatical, let alone substantive, I suspect that what Yudkowsky is accidentally confessing in this supercilious little fable is that when his liberal democratic conversational partner attempts to offer up his best most substantive responses to Yudkowsky's questions (however adequate or not these responses may actually be), Eliezer Yudkowsky -- Soopergenius! -- is incapable of hearing anything but "Liberal Democracy!" over and over again.

If Yudkowsky were to declare to me that "liberal democracies" have done "something stupid" or asked me how "liberal democracies" have "performed historically" in the face of some intractable problem or other, I suspect that I would answer that "liberal democracies" don't DO anything at all, don't PERFORM at all, but that through the forms of liberal democracy (which are, after all, themselves always changing as citizens struggle through them to make them better) citizens, the people themselves, educate, agitate, organize, legislate, reform, deliberate, compromise, contend together to solve shared problems, to provide nonviolent alternatives for the adjudication of disputes, and to support and strengthen the scene of consent of everyday people to everyday life.

While nobody denies that citizens can be stupid or ineffective (and much worse), the question is through what collective forms do citizens best recognize and redress such problems? If it is true that majorities of the people can be terribly, even dangerously, wrong, this is because all people can be terribly and even dangerously wrong -- and of course this also includes any minority of people one might care to designate (or who, more likely, would care to designate themselves) a superior elite of aristocrats, plutocrats, technocrats or others who in disdaining liberal democracy express their fancied preference to rule over others.

Needless to say -- because it has been said after all as often as its lesson has been ignored -- there is no argument against government of, by, and for the people that is not a stronger argument against the rule of some few people over the rest of us. What I suspect is that in saying all this, all Eliezer Yudkowsky would hear from me, as he heard in his anecdotal interlocutor, is somebody squawking "Liberal Democracy!" over and over and over again. This insensitivity (I intend the word as a synonym for palpable unintelligence) indicts Yudkowsky even as he imagines himself transfigured into triumph by it.

Perhaps it was Yudkowsky's enjoyment of this false fantastic triumph which distracted his attention from noticing that in whining about not being able to "question democracy" he is ignoring the fact that questioning democracy is the substance of democracy itself, while at once stridently contemplating removing from the majority of people the very right to have a say in the decisions that affect them, a right he self-righteously demands for himself and pretends to be restrained from exercising even as he effortlessly exercises it in fact. It's almost as if he's "an autocrat, or libertopian, or otherwise a very very bad person" or something.


erickingsley said...

In addition to the usual wanking about non-existent sci-fi magitech, Yudkowsky also appears to be a megalomaniac wannabe dictator who has no time for pesky things like "democracy" and "what other people want". He's so super-smart, he knows what is best for everyone else better than anyone else does and he'd very much like to impose his will on everyone else...for their own good of course, if only they could see it!

Sort of like a less cool, less smart, less sexy Borg Queen.

Eudoxia said...

Maybe he's just preparing the stage to come out of the neo-monarchist closet one of these days? :)

On that note, I always suspected Anissimov would turn into a libertarian - like all transhumanists - but not, well, all this:

>One element in the trajectory of thought I wouldn't have predicted in 1999 is that evolutionary psychology is still controversial today.
>Secular humanism? More like secular who-cares.
>now calls himself Oracle

And, ironically,

>Whatever your ideology is, ask yourself: "Does it make sense in a world of VR, self-replicating machines and nanotech-built everything?"

He has become some kind of old-timey imperialist who is still upset about how the Frumentarii are the only ones who can bear a fasces (I made my first gun control joke~). This recent change also correlates with his relationship with Rachel Haywire, who is herself a proud fascist, trying to disguise the gas chambers under the banner of anarchism.
>The contributors are [...] and Rachel Haywire.

Let us take a look at their manifesto:

Oh, this gon be good.

>In addition, the German National-Socialists and Italian Fascists of the twentieth century allied themselves with large banking interests and betrayed the more 'socialistic' aspects of their original programmes.

The Third Reich failed because Hitler wasn't enough of a fascist!

But can I honestly be surprised? I mean, it should've been obvious considering SIAI/MIRI/whatever's 'endless authoritarian happiness' vision of the future.

Esebian said...

See also the Robot Cultists' widespread infatuation with Market Maoism. One wonders what they will have to say once the Chinese bubble bursts?

jimf said...

> I leave to the side the question whether we can really
> trust in the basic truthfulness of Yudkowsky's anecdotal
> interlocutor who in answer to every policy question
> presumably responds like a doll whose string has been pulled,
> "Liberal Democracy!"

To soopergeniuses who think outside the box -- er, this
blinkered alley of the multiverse -- **all** the rest of
us respond "like a doll whose string has been pulled".

It's not easy being a soopergenius. But, ya know, somebody
has to do it.

(Someday, poems will be written. . .)

jimf said...

> I always suspected Anissimov would turn into a libertarian. . .
> but not, well, all this:
> . . .now calls himself Oracle

I guess that means we'd better wachowski for **him**!

Does he bake cookies now, too?