Using Technology to Deepen Democracy, Using Democracy to Ensure Technology Benefits Us All
Sunday, October 07, 2007
More Delicious Agony
Upgraded from Comments:
Reader Aleksei, I fear, is growing annoyed with me:
I'm not the one making confident predictions here, *you* are. You are confident enough that Brain Emulation will be unable to produce a human-equivalent cognitive system that you label as a Robot Cult those of us who think that such a development is a possibility that can't be comfortably ruled out.
Nonsense. My point is that you people have jumped the gun. You have just made a handful of facile leaps that lead you to think what you call Emulation will spit out a Robot God Brain and then, once those leaps are made, you think that all that is left, extraordinarily enough, is to calculate the Robot God Odds as to how many years it will take to get to the Tootsie-Roll Center of the Tootsie-Pop. I'm neither confident nor unconfident about timescales, particularly -- I'm just confident that your confidence is flabbergastingly unwarranted.
And I'm afraid I simply must call bullshit on your oh-so-reasonable characterization of Singularitarians as "those of us who think that such a development is a possibility that can't be comfortably ruled out," because that characterization would make me a Singularitarian of a sort. What actually makes one a Singularitarian is clear upon even a cursory survey of your actual published, readily available (too bad for you, cultists) discussions, which suggest rather forcefully that you take these "possibilities" as near certainties, and certainly as urgencies, while your topical emphases, your policy priorities, and your assessments of the concerns of your contemporary peers immediately, obviously, and damningly reveal the truth of the matter.
I haven't even talked about "consciousness" at all. For all I know, a brain emulation might perform cognitive processing as a non-conscious entity.
Now this is interesting. I'm glad to hear it.
Take out the entitative dimension of AI, however, and all the risks and powers you're talking about become far too conventional to justify the way Singularitarians keep casting about for monster-movie metaphors about a space race between the evil or clueless teams who might create Unfriendly AGI and the heroic Singularitarians who will beat them by creating Friendly AGI first (and I shudder to think what a sociopath will regard as Friendly on this score). Take the entity out, and you've just got recursive malware runaway, something like a big bulldozer on autopilot that you have to stop before it slams into the helpless village or whatnot. None of the Singularitarian handwaving or secret handshakes or "SL4, dude!" self-congratulation of the sub(cult)ure has much point anymore.
The cult vanishes and suddenly you're just talking about software security issues like everybody else. Just like puncturing the Superlativity of the Technological Immortalists leaves you talking about healthcare like everybody else. Just like puncturing the Superlativity of the Nanosantalogists leaves you talking about free software, regulating toxicity at the nanoscale, and widening welfare entitlements just like everybody else.
Drop the transcendentalizing, hyperbolizing discourse and suddenly you're in the world here and now with your peers, facing the demands of democratizing ongoing and proximately upcoming technodevelopmental social struggle.
Just like I've been saying over and over and over again. You can't be technoprogressive and Superlative at the same time. Technoprogressive discourses, properly so-called, won't feed your ego, won't give you a special identity, won't promise you transcendence, won't bolster your elitism or narcissism, and won't readily facilitate a retro-futural rationalization for the eternal articulation of technodevelopment in the interests of incumbents. That's what I'm talking about here. If that sort of thing doesn't interest you, you are quite simply in the wrong place.
You demanded an explanation of why I think you are wrongheaded, but in the technical terms of your own idiosyncratic discourse rather than the perfectly legitimate terms that actually interest me by temperament and training. I replied by pointing out that in my view, "It's far better for you people to explain calmly how exactly you became the sorts of folks who stay up at night worrying about the proximate arrival of Unfriendly Omnipotent Robot Gods given the sorry state of the world (and computer science) at the moment."
You replied:
So your answer is no. You refuse to answer the one question I presented to you.
Big talk, guy, but you mustn't forget that I'm not a member of your Robot Cult. There aren't enough of you for you to think you have earned the right to demand that those who disagree with you accept your terms when we want to express our skepticism of your extraordinary claims and curious aspirations. You should consider this a reality check. You need to stop engaging in self-congratulatory circle-jerks with your True Believer friends and struggle to communicate your program in terms the world will understand, as the world itself presents those terms to you. I cheerfully recommend this because I think the brightest folks among you will likely re-assess their positions once they try to engage in this sort of translation exercise. Those who don't will be that much easier for the likes of me to skewer. Oh, and if I'm wrong about you, then of course the Singularitarians Will Prevail or whatever -- but that isn't actually something I worry about overmuch.
I'm not changing the subject. I *started* this conversation with a direct question to you, a question you refuse to answer.
Dag, whose blog is this, anyway? Look, it honestly isn't clear to me that anything you would count as an adequate answer wouldn't already embed me within the very discourse I'm deriding. What on earth is in it for me? I don't want to join in your Robot Cult Reindeer Games. The prospect holds no allure.
You are the one shunting aside difficulties, preferring to focus on assorted accusations of cultishness.
Have you ever argued with a longstanding Scientologist? I'm just asking.
If you are ever able to move past your apparent need to ridicule as "Robot Cultishness" the questioning of some of your assumptions, let me know.
I enjoy ridiculing the ridiculous; it's exactly what they deserve. It's not an "apparent need" of mine so much as a profound pleasure.
Feel free to continue to read and comment on my writing whenever you like, as you have been. I enjoy these little talks of ours.
As for my sad inability to question my orthodox assumptions in matters of Robot Cultism, it is, no doubt, as you suggest, a sorry and sordid state of affairs for me. It is a hard thing to be so limited as I am.
Persevere, earnest Singularitarian footsoldier, and perhaps one day I might see the Light as you have, someday I might hear as keenly as do you the proximate tonalities of the Robot God.
Comments:
Oh my, what weird mistaken assumptions you continue to make about me. You ascribe to me the weirdest confidences, ones I have never exhibited.
There doesn't seem to be much point in telling you otherwise, since you've obviously decided you know far better than I do what I think and what I don't. I would certainly find it cultish if I believed the sorts of things you mostly talk about, but I have never held such beliefs.
Anyway, please send your reply to me by email if you want this to be a discussion (I doubt very much that you're honestly interested in that). That's how this started: I asked you a question by email, not on your blog, even though previous messages got copy-pasted here (which I don't object to, nor would I object to the copy-pasting of future messages).
I generally find having discussions in blogs to be quite inconvenient compared to email.
Oops, "condifences" = confidences :P
Dale, I think you should really consider _saying_ something instead of ridiculing what _you wish to think is_ ridiculous without giving arguments.
Because, you know, you begin to sound like that Iraqi Information Minister who ridiculed the claim that American tanks were close. Not that I approve of the invasion of Iraq, but wishing won't make it so, and "I don't like it" is not an argument.
I know that at times you are willing to put the histrionics aside and try to give some arguments in support of your views. And when you do so, your criticism is taken very seriously by at least some robot cultists and nanosantalogical believers, yours truly included.
So why don't we start again: besides the fact that you don't like robot cults, what the flickr is wrong with AI, uploading etc. from an engineering point of view?
I think you should really consider saying something instead of ridiculing what you wish to think is ridiculous without giving arguments.
Are you shitting me? Scroll down to the Superlative Summary and you'll find a collection of links to texts offered up over half a decade delineating my Critique of Superlative and Sub(cult)ural Technocentrisms. Hours of pleasure for the whole family -- arguments, definitions, close readings, conceptual maps up to your ears. It's true that there is also quarrelsome banter as people read the texts and pick at parts of it or simply express their dissatisfactions with it and so on. But it is hard to believe you have read much of this at all if you think you can just dismiss it as "name-calling." I'll leave to the side your enormously smart and sophisticated comparison of me to the Iraqi Information Minister. That's good stuff.
So why don't we start again: besides the fact that you don't like robot cults, what the flickr is wrong with AI, uploading etc. from an engineering point of view?
It's like being in the torturer's chair. So, zen, vy don't ve start again? (Irony-impaired readership, insert smiley here.) I'd rather poke my eye out with a knitting needle. I've already explained what's wrong with this gambit of yours half a dozen times in the last few weeks. It's literally shocking to me that anybody would try it again given my explanations that (a) "technical" and "engineering" discussions stealth sociocultural ones in the discourses I'm critiquing, (b) my training and temperament better suit me to propose social, political, cultural, and psychological critiques, (c) such critiques are perfectly valid on their own terms even if they are not the sort people are accustomed to, and (d) since these critiques highlight issues and problems that are often neglected in the discourses under discussion, this emphasis of mine is surely as much a strength as a weakness.
When you talk about AI and uploading, you are speaking less from an "engineering point of view" than you fancy you are, in any case. Engineers aren't properly in the business of engaging in thought experiments about angels or disembodied souls. They really aren't.