Using Technology to Deepen Democracy, Using Democracy to Ensure Technology Benefits Us All

Sunday, July 03, 2011

Futurological Brickbats

Separating people from their money by promising they can get rich investing in the Next Big Thing or reassuring tragic gizmo-fetishists they are truly on the Bleeding Edge doesn't make you any kind of intellectual. It almost certainly does make you a scam artist, whether you are actually in on the scam yourself or not.

More Futurological Brickbats here.

1 comment:

jimf said...

> Separating people from their money. . .
> [while] reassuring tragic gizmo-fetishists
> they are truly on the Bleeding Edge. . .

Apropos of which, I found the following in my inbox a couple of days ago:

-----------
Hi James,

I'm Michael Anissimov, as you probably know I work for SIAI.

I am contacting you because we at SIAI are trying to pull together funds for a summer matching challenge. We currently only have $30,000, we're looking for contributions to bring us closer to $50,000. Would you be able to help with that? If you're interested in donating but have questions about the state or direction of SIAI, feel free to ask away.

Best,
Michael Anissimov
-----------

I'm falling further and further behind the whole Singularity scene -- I don't bother to read Accelerating Future, the Extropians' list archives, or SL4 (if it still exists) anymore, and I certainly don't read Overcoming Bias or Less Wrong.

However, I recently stumbled across some remarks on another forum
("Pull up a chair everyone! I found the good stuff!"
SolarTungsten 12/14/10, 09:04
http://www.poe-news.com/forums/sp.php?pi=1002433160 )
describing a thread on Less Wrong in which our old pal Roko Mijic was taken to task by Eliezer Yudkowsky (the blog owner) for revealing that "one person at SIAI" was "severely worried" about the coming AI God punishing people for all eternity in virtual Hell (I guess somebody must have just read Iain Banks' _Surface Detail_) "to the point of having terrible nightmares, though ve wishes to remain anonymous."

Or, as the POE News poster describes it, "The general idea is that when the God-brain or whatever finally takes over, it might decide to punish singularitians who believed in the coming new age, but didn't devote 100% of their resources to making it happen. Now that you know about this idea, you have to devote your life to the singularity or suffer eternal torture at the hands of the AI (that the AI can extend your life indefinitely is considered to be so obvious and non-controversial that no one even bothers mentioning it)."

Eliezer was angry with Roko for revealing this, apparently because he thinks it might actually give the AI God (or CEV, for "Coherent Extrapolated Volition," as it's now called) reason **for actually doing it**! ;->

Or, as EY says himself:

"You have to be really clever to come up with a genuinely dangerous thought. I am disheartened that people can be clever enough to do that and not clever enough to do the obvious thing and KEEP THEIR IDIOT MOUTHS SHUT about it, because it is much more important to sound intelligent when talking to your friends. . .

(For those who have no idea why I'm using capital letters for something that just sounds like a random crazy idea, and worry that it means I'm as crazy as Roko, the gist of it was that he just did something that potentially gives superintelligences an increased motive to do extremely evil things in an attempt to blackmail us. It is the sort of thing you want to be EXTREMELY CONSERVATIVE about NOT DOING.)"

In any case, according to the POE News poster, the thread was eventually expunged from Less Wrong (though, as we all know, it's difficult these days to completely expunge something from the Web).

So pony up, people, if you don't want to end up in Cyber Hell!