Using Technology to Deepen Democracy, Using Democracy to Ensure Technology Benefits Us All

Wednesday, January 30, 2013

So Not A Cult

Eliezer_Yudkowsky 20 April 2011 09:13PM:
Having some kind of global rationalist community come into existence seems like a quite extremely good idea. The NYLW [New York Less Wrong] group is the forerunner of that, the first group of LW-style rationalists to form a real community, and to confront the challenges involved in staying on track while growing as a community. "Stay on track toward what?" you ask, and my best shot at describing the vision is as follows: "Through rationality we shall become awesome, and invent and test systematic methods for making people awesome, and plot to optimize everything in sight, and the more fun we have the more people will want to join us."
The piece, entitled (I kid you not) "Epistle to the New York Less Wrongians," continues from there. I for one wish the "Less Wrongians" the best of luck "staying on track" toward "making people awesome, and plot[ting] to optimize everything in sight" in their separatist enclave, wherever that ends up being: an abandoned oil platform fiefdom with libertopian anti-tolerance activist Peter Thiel (conspicuous donor to Singularitarian causes that he is), perhaps, or under a bubble-dome in a high-tech sooper city beneath the sea, or in a sooper-genius lab in the asteroid belt. I encourage readers to post their personal favorite bits from the rest of the Epistle in comments. I cannot decide whether the rampaging lack of self-awareness in the "become more self-aware" section or the whole "Singularity Needs Women" section is the more laugh-out-loud funny, so I will leave those delightful determinations to others.

17 comments:

jollyspaniard said...

Identical to the standard Scientology elevator pitch; the only difference is in the vernacular.

He doesn't hold the sway over his followers that L. Ron did, though.

jimf said...

> I encourage readers to post their personal favorite bits from
> the rest of the Epistle in comments.

I think we should all take a moment to bow our heads in sympathy
with Brother Roko Mijic, who was cast out of the fold for
1) coming up with a reductio ad absurdum ("Roko's Basilisk")
of the current framework for Friendly AI, utilitarianism, and acausal
deal-making with future superintelligences and then 2) blabbing
that reductio in such a way as to 2a) threaten to make
a fool out of the Source of All Rationality and 2b) give nightmares
to a donor (of $$$) to the Singularity Institute.

The poor guy was publicly berated (at least he wasn't
punched in the face, spit on, or sentenced to the
Singularity Institute's equivalent of Rehabilitation
Project Force -- or was he? :-0 ). But he was led to
say:

"I wish I had never come across the
initial link on the internet that caused me to think about
transhumanism and thereby about the singularity. . .
I feel quite strongly that this knowledge is not a worthy
thing to have sunk 5 years of my life into getting."

("Sam sat on the ground and put his head in his hands.
'I wish I had never come here, and I don’t want to see
no more magic. . .'"
-- "The Mirror of Galadriel")

I actually feel a bit sorry for the guy. I'm sure it must
have hurt to be publicly berated by his guru
(and he apparently has taken his marbles and gone home);
the compensation is that he's achieved a permanent measure
of Internet fame as the originator of "Roko's Basilisk".

I fear we will never again see him descending on
Amor Mundi in his righteous wrath, alas.

jollyspaniard said...

He sounds like an ex-Scientologist. Give him another year or two and he may be involved in some similar daffiness. A lot of acolytes of these Enlightenment-style cults cycle through gurus every few years. So don't rule out future entertainment value from him just yet.

Mitchell said...

"He doesn't hold the sway over his followers that L Ron did though"

I once said that LW is a fan club, not a cult.

Dale Carrico said...

Fandoms are united in their enthusiasm, not their faith.

jimf said...

> Mitchell said...
>
> I once said that LW is a fan club, not a cult.

Ah, Mitchell [Porter].

I just noticed, via the discussion at
http://rationalwiki.org/wiki/Talk:LessWrong/Archive13
"Sweet Jebus, the basilisk has a blog. . ."
that you transplanted all your own (censored)
comments on L'Affaire Basilisque
(Cliff's Notes version at
http://bo-ne.ws/forum/read.php?7,26558,27592#msg-27592 )
from LessWrong to
https://basiliskblog.wordpress.com/2012/10/18/a-compilation-of-comments/

I wonder if the Unfriendly AI from the Future that's
going to send us all to Hell isn't actually
Mecha-Streisand. If so, all this uncensorable
Basilisk stuff must be due to the Mecha-Streisand Effect.

I woke up this mornin', I was feelin' mighty bad
Baby said good mornin', hell, it made me mad
Cause I'm evil, evil hearted me
So downright evil as a man can be. . .

;->

Black guy from the future past said...

Does anybody at less wrong even read anything by Dale?

jollyspaniard said...

I can't imagine he'd be their favourite blogger, but you never know. A lot of blogging addicts read blogs from opposing viewpoints quite often. I do fairly regularly.

Mitchell said...

Dale doesn't engage with the specifics of LW very much. The best single LW contrarian is Alexander Kruel (kruel.co), aka XiXiDu, and he is certainly well-known there.

The account of the basilisk linked by Jim gets the details a little wrong at stage 4, but this always happens. The whole thing is a misapplication of an arcane idea that is itself dubious, "acausal trade". One of the unfortunate side effects of the basilisk hysteria is that all discussion of the nuances of what ought to be some rather interesting thought-experiments has been shut down; that's really why I started my "basilisk blog".

Dale Carrico said...

"Dale doesn't engage with the specifics of LW very much."

People caught in the gravity well of this or that Robot Cult sect endlessly make this point. Once one determines that a futurological discourse or program is ridiculous, reactionary, dangerous, or otherwise off the rails, its adherents are actually no longer in a position to dictate the terms on which the "specifics" are debated.

To go down the Rabbit Hole with Singularitarian Robot Cultists and debate the Robot God Odds on their preferred terms consolidates their discourse even if one assumes a contrarian position in the debate. It confers on them a legitimacy that they could not otherwise produce from their actual position of marginality relative to actual technoscientific and technodevelopmental consensus.

Within their pocket universe the Singularitarians may endlessly reassure themselves that theirs is a vital and legitimate movement/program, but that is of course a delusion. In fact, they are a serially failed, and always only failing, fundamentally conceptually flawed, palpably symptomatic, minute marginal subculture saying the most patent nonsense on a regular basis. This remains true even if sometimes some of them say things that aren't nonsense, just as even stopped clocks are right twice a day and even complete fools know where the coffee is kept.

What Robot Cultists think are the important "specifics" are often not the "specifics" that happen to be the most important from the vantage of one who is observing and analyzing or criticizing their discourse as an object rather than as a Believer whose needs it is meeting (at whatever cost).

Even where Robot Cultists occasionally talk about real scientific results or developmental issues here and there, who in their right mind would choose a Robot Cultist as a serious interlocutor in such discussions? Superlative futurological faith-based initiatives for techno-transcendence essentially extrapolate from, project, hyperbolize, or emphasize logical compatibility with some real scientific results, none of which get them remotely where they think they want to go; they also depend on the accident that endlessly many of the results that must be discovered between where we are and where they think they want to be simply have not yet disappointed their hopes, as surely many will. Given all that, I personally disagree that the "technical questions" and "central debates" within their orthodoxies are really the specifics they are pretending to be in any case.

What matters for the Robot Cultists are less the actual results than the rhetorical significances that idiosyncratically attach to these results in their discourse. It is not surprising that True Believers or even futurological fans do not share my perspective on the rhetorical operations in play within their arguments and aspirations, or on the salience of their connection to mainstream neoliberal assumptions or theological frames of which they may consciously disapprove. Contrarians who snipe at the Robot Cultists primarily in terms only the Cultists understand are functioning as a loyal opposition. I have no interest in performing such a function.

I engage with Robot Cultism in terms of the specifics that actually matter by my own determination.

Given the citational richness of my critiques, no one can pretend I am not highly knowledgeable about, and materially engaged with, the actual textual specificities of the actual texts of the Robot Cultists I critique. It is one of the characteristics of their cultishness that their adherents and camp followers would disdain as non-engagement an insistent engagement that simply does not attach to the specifics under scrutiny the same significance they do.

jimf said...

> The best single LW contrarian is Alexander Kruel (kruel.co),
> aka XiXiDu, and he is certainly well-known there.

Actually, my favorite LW contrarian is Dmytry Lavrov, currently
posting under the name private_messaging. I gather some of his
earlier user accounts there have been locked out, though, and
many of his comments have been deleted.

Both "XiXiDu" and "Dmytry" also participate in the LessWrong
"Talk" threads at the RationalWiki Web site.
http://rationalwiki.org/wiki/Talk:LessWrong#LW_going_nuts_with_burning_the_evidence

jimf said...

> Does anybody at less wrong even read anything by Dale?

Probably not, but guess what?

Dale got an epigraph there!
In an EY "sequence", no less!

http://lesswrong.com/lw/eqn/the_useful_idea_of_truth/

Dale Carrico said...

Right next to Nietzsche, yet. Think how much more impressive it would have been had the sooper-geniuses spelled "Nietzsche" correctly.

jimf said...

BTW, you know who else mixed it up with LessWrong?

The late Aaron Swartz.

I guess that's not surprising. I imagine everybody who's anybody
in the hacker/nerd community has heard of Eliezer Yudkowsky/
LessWrong/SIAI-SI-MIRI by now.

To his credit, he was a skeptic about the whole enterprise:
http://lesswrong.com/lw/atm/cult_impressions_of_less_wrongsingularity/75l5

But he was attracted to the "self improvement" angle of the
thing:

http://lesswrong.com/user/aaronsw/overview/?count=51&after=t3_dy8
----------------------
In response to Welcome to Less Wrong! (July 2012)
Comment author: aaronsw 04 August 2012 27 points

I'm Aaron Swartz. I used to work in software (including as a
cofounder of Reddit, whose software powers this site) and now
I work in politics. I'm interested in maximizing positive impact,
so I follow GiveWell carefully. I've always enjoyed the rationality
improvement stuff here, but I tend to find the lukeprog-style
self-improvement stuff much more valuable. I've been following Eliezer's
writing since before even the OvercomingBias days, I believe,
but have recently started following LW much more carefully after
a couple friends mentioned it to me in close succession.

I found myself wanting to post but don't have any karma, so I
thought I'd start by introducing myself.

I've been thinking on-and-off about starting a LessWrong spinoff
around the self-improvement stuff (current name proposal: LessWeak).
Is anyone else interested in that sort of thing? It'd be a bit
like the Akrasia Tactics Review, but applied to more topics.
----------------------

They teach Pick-Up Artistry as well as Rationality
(at "Rationality Boot Camps" and "Rationality Minicamps")
http://singularity.org/blog/2011/06/21/rationality-minicamp-a-success/

From the comment thread at
http://lesswrong.com/lw/78s/help_fund_lukeprog_at_siai/4otj
--------------------
I was a mini-camp participant, and I actually became
more awesome as a result. Since mini-camp, I've:

- used Fermi calculations (something we practiced) to
decide to graduate from school early.

- started making more money than I had before.

- started negotiating for things, which saved me over
$1000 this summer.

- begun the incredibly fucking useful practice of rejection therapy,
which multiplied my confidence and caused the above two points.

- rapidly improved my social abilities, including the easily
measurable 'success with women' factor. This was mostly caused by
a session about physical contact by Will Ryan, and from two major
improvements in wardrobe caused by the great and eminent lukeprog
(in whose name I just donated). I wasn't bad at social stuff
before - this was a step from good to great.

- resolved my feelings about a bad relationship, mostly as a result
of boosted confidence from increased social success.

I stuck around in California for the summer, and gained a lot from
long conversations with other SIAI-related people. The vigor and
insight of the community was a major factor in showing me how much
more was possible and helping me stick to plans I initiated.

But, that said - the points listed above appear to be a direct result
of the specific things I learned at mini-camp.
--------------------

I must admit, "the incredibly fucking useful practice of
rejection therapy" sounds kind of interesting. ;->

Dale Carrico said...

If I've said it once I've said it a thousand times: even at the turning of the tide from a prevalent conservative to a liberal framing of the terms of public debate on most issues in this historical moment, there remains a strange and dangerous (often completely inadvertent) susceptibility of progressive discourse to re-introduce reactionary assumptions, conceits, norms, frames, and aspirations through tech-talk.

jimf said...

> even at the turning of the tide from a prevalent conservative to a liberal
> framing of the terms of public debate on most issues in this
> historical moment. . .

Sez **you**! ;->

http://lesswrong.com/user/advancedatheist/overview/
-------------------
In response to The school of science fiction
advancedatheist [Mark "Plus" (Potts)]
06 January 2013 0 points

Vox Day suggests that we can find another Hari Seldon in Oswald Spengler:

http://voxday.blogspot.com/2013/01/spenglerian-decline.html

Which discusses:

http://nationalinterest.org/article/spenglers-ominous-prophecy-7878?page=show

I've suspected for a while now that the democratic, egalitarian and
feminist (DEF) era we've lived in represents a kind of unsustainable
drunkard's walk from long-term and more stable social norms which
has fooled our pattern-recognition heuristics into imposing a vector
on this deviation and calling it "progress." It wouldn't surprise
me if future societies descended from ours, for example, the ones
in which we might reanimate from cryostasis (assuming that
could even happen), will look noticeably more aristocratic, hierarchical
and patriarchal than our departure society. That might suck for the
feminist women who have signed up for cryosuspension and survive
the ambulance ride across time, but I think I could handle it. ; )

Interestingly enough, much American science fiction written during
the mid-20th century presents a similarly skeptical view of the
current DEF ideology. How many science fiction stories postulate
noble houses, monarchies and feudal-looking societies with
advanced sciences and technologies, but set in "the future"?
These writers might have followed the lead of their predecessor
H.G. Wells, who advocated in his works that an aristocracy of
the mind should run things.

Dale Carrico said...

It's reassuring to know that behind that grumpy old pot-bellied white-racist greedhead gun-nut patriarchal prick exterior there stands poised deep within Mr. Plus the concentrated essence of the superintelligent omnicompetent aristocracy of mind waiting in the wings to save all the uppity negro jezebel faggot hippy scum from ourselves in the fullness of time.