Using Technology to Deepen Democracy, Using Democracy to Ensure Technology Benefits Us All

Friday, January 01, 2016

Robot Cultist Eliezer Yudkowsky's Ugly Celebration of Plutocracy

[Let us begin the new year, you and I, with a long, but not longing, backward glance. In the Moot to a post yesterday my friend Jim Fehlinger drew our attention to a 2008 essay by robocultic Singularitarian, would-be sooper-genius, wannabe-guru Eliezer Yudkowsky. I vaguely recall having glanced at this piece once, but I can't say I gave it the attention it deserved before now. Eliezer Yudkowsky is hardly what you would call a celebrated figure, but he is taken seriously by people who are taken seriously, for whatever reasons, to the extent that one admits the "Thought Leaders" of that amorphous aspirational blob called "tech" are taken seriously, folks like "philosophers" Nick Bostrom and Robin Hanson, and reactionary "techno-progressive" skim-and-scam gazillionaires like Peter Thiel and Elon Musk. It is well known that I find Yudkowsky a ridiculous person, but I must say that upon reading the linked piece I found the apology for plutocratic elitism it contains frankly flabbergasting in its infantile foolishness, lack of standards, and truly gross indecency. Even I did not think so little of Yudkowsky as to think he would write something like this. Perhaps he no longer would. I would like to think so. It is true that over the years venture capitalist techbros of the Valley of the Silly Cons have regularly been exposed by journalists and then mocked by majorities for making clueless bigoted asshole exhibitions of themselves, and it is also true that the object of Yudkowsky's piece is, in a nutshell, to declare this very cohort of VCs ethically and organismically and, hell, apparently cosmically superior sorts of beings, and so the awfulness of the following has a certain dismal inevitability about it. For the whole piece you should follow the link; my own comments are bracketed and interspersed and, I do assure you, were quite involuntary reflex actions of a shocked conscience. Yes, shock is the way we should begin...--d]
I was shocked, meeting Steve Jurvetson, because from everything I'd read about venture capitalists before then, VCs were supposed to be fools in business suits, who couldn't understand technology or engineers or the needs of a fragile young startup, but who'd gotten ahold of large amounts of money by dint of seeming reliable to other business suits.
[My own (possibly over-general) impression of VCs is that they are mostly privileged upward-failing opportunists unscrupulously hyping vaporware for short-term cash from credulous marks or exploiting collective labor and intelligence via the shar(ecropp)ing economy with little awareness of or interest in the costs or risks or suffering of others involved in their efforts. "[S]eeming reliable to other [VCs in] business suits" might describe this sociopathic state of affairs, but I do think better descriptions are available. I must add that many of these people seem to me to have "gotten ahold of large amounts of money" by being born with it or with enough of it to schmooze others born with it, which is to say that they are "self-made men" in the usual way.--d]
One of the major surprises I received when I moved out of childhood into the real world, was the degree to which the world is stratified by genuine competence.
[Since Yudkowsky has interposed this curious framing at this point in his narrative himself, I think it only fair to offer it up as a question for the reader rather than a premise we will all simply uncritically accept: Do we agree with Yudkowsky, admittedly a man veering into middle age at this point, that he has indeed "moved out of childhood" at all, let alone "into the real world"? Given the embarrassing narcissism, the simplistic conceits, the facile hero worship, the infantile wish-fulfillment on display, are we all quite ready to admit Yudkowsky into the ambit of adulthood? Or is his superlative futurology yet another, more than usually palpable, symptom of superannuated infancy?--d]
Now, yes, Steve Jurvetson is not just a randomly selected big-name venture capitalist. He is a big-name VC who often shows up at transhumanist conferences. But I am not drawing a line through just one data point.
[Quite a lot of the material I snipped from the beginning of Yudkowsky's piece involved his praise of Steve Jurvetson in particular who may, for all I know, actually be a bright and worthy person (although, contra Yudkowsky, I cannot say his attendance at robocultic transhumanist conferences, if that is true, inspires confidence in his judgment) or may, again for all I know, simply be someone Yudkowsky is buttering up in hopes of some collection plate action for his robocultic causes.--d]
I was invited once to a gathering of the mid-level power elite, where around half the attendees were "CEO of something" -- mostly technology companies, but occasionally "something" was a public company or a sizable hedge fund. I was expecting to be the youngest person there, but it turned out that my age wasn't unusual -- there were several accomplished individuals who were younger. This was the point at which I realized that my child prodigy license had officially completely expired.
[Can there really be people who refer non-derisively and non-satirically to groups of the rich as "the power elite"? Can there really be people who refer to themselves -- setting aside the question of people in their thirties who refer to themselves -- affirmatively as "child prodigies"? With much discomfort and sadness, let us soldier on.--d]
Now, admittedly, this was a closed conference run by people clueful enough to think "Let's invite Eliezer Yudkowsky" even though I'm not a CEO. So this was an incredibly cherry-picked sample. Even so...
[Even if this hyperbole is meant to signal irony, the boasting in it is so transparently a compensation for insecurity that it is actually painful to observe.--d]
Even so, these people of the Power Elite were visibly much smarter than average mortals. In conversation they spoke quickly, sensibly, and by and large intelligently. When talk turned to deep and difficult topics, they understood faster, made fewer mistakes, were readier to adopt others' suggestions. 
[Again, with the "Power Elite" business. The capital letters and, if I may say so, simple common sense make me want to assume the phrasing is parodic -- but nothing anywhere else suggests this. Indeed, one has the horrified suspicion that the letters are also capitalized in Yudkowsky's head. We will set aside as too horrific to contemplate the suggestion that it was simply their likely whiteness and maleness that made the Power! Elite! gathered in that room seem "visibly much smarter than average mortals." Notice that we must take Yudkowsky's word that the topics under discussion were "deep" and "difficult" and that they spoke of them "sensibly" and "intelligently" and "made fewer mistakes" (he would have caught them if they had). Were they "speaking quickly" because they had so much to say and were excited by their topics -- or just because they are used to fast-talking salesmanship and bullshit artistry? Were they "adopting each other's suggestions" because they were open to intelligent criticisms or because they are yes-men flattering and cajoling each other for networking's sake or because groups of people like this are already largely in agreement about what matters and why it matters, especially when it comes to "tech" talk?--d]
No, even worse than that, much worse than that: these CEOs and CTOs and hedge-fund traders, these folk of the mid-level power elite, seemed happier and more alive.
[There you go. Read it again. Hedge fund managers and tech VCs are happier and more alive than other people. MORE ALIVE. The rich are not like you and me. They are tapped into exquisite joys and alivenesses unavailable to majorities; they are more real. This bald endorsement of reactionary plutocratic superiority is so ignorant of the richness of the lives and intelligence of the majorities it dismisses and so flatly pernicious in its eventually genocidal political entailments that I must say it is a rare thing to see in a public statement... Although, again, I have already noted that such public statements are indeed comparatively more commonplace, and notoriously so, among these very same sorts of rich "tech" VCs and banksters. But there it is. Of course, Yudkowsky doesn't really mean "worse" or "much worse" in anything like the conventional sense when he declares these (are we meant to think reluctant?) truths. No, Yudkowsky is relishing the awfulness of what he is saying, he is savoring the ugliness in his mouth, tonguing his anti-democratic morsel from tooth to tooth, smacking his lips in an unseemly dance of contrarian "political incorrectness," drinking in the imagined opprobrium of the unwashed useless-eating masses he cheerfully consigns to computronium feedstock here. One is all too used by now to these online spectacles of man-child id celebrating racist police violence or rape culture or what have you in the faces of the vulnerable, smearing their feces on the walls of the world. What it is useful to recall at this juncture, again, is that mild-mannered "tech philosopher" Nick Bostrom at Oxford and widely worshiped celebrity "tech" CEO Elon Musk are discursively, sub(cult)urally and institutionally connected to this person, are conversant with his "ideas" and "enterprises," are his colleagues.--d]
This, I suspect, is one of those truths so horrible that you can't talk about it in public. This is something that reporters must not write about, when they visit gatherings of the power elite.
[Again, nothing could be clearer than that Yudkowsky does not find this "truth" to be in the least horrible. He is palpably relishing it -- his enjoyment is so rich he does not even care about the perverse contradiction of describing as absolutely prohibited the speaking of the very truths he is in the act of megaphoning about at top volume -- and to the extent that he has already figured himself as adjudicating this gathering of rich happy genius elite superbeings, he is also making a spectacle of confirming his own status as such a being himself. Again, one doesn't have to scratch too deep beneath this ungainly superficial boasting to detect what look to be the rampaging insecurities desperately compensated by this embarrassing self-serving spectacle, but I do not discern in this so much a cry for help as an announcement of a punitive rage for order putting us all on notice. Such dangerous and costly performances of insecure personhood do not elicit my sympathy but ready my defenses. It is amusing to note that in the comments to his post, an early commenter responds to Yudkowsky's observation that "This [elitism], I suspect, is one of those truths so horrible that you can't talk about it in public" by assuring us "Charles Murray talked about [it] in The Bell Curve." Quite so! A later comment adds, "And Ayn Rand wrote about it repeatedly." All too, too true. I will add myself that while it is true that few reporters write that tech billionaires are literal gods the rest of us should be so lucky as to get pooped on by, the endless fluffing these people get in infomercial puff-pieces and think-tank PowerPoints and TED ceremonial ecstasies is premised on only slightly more modest variations of the attribution of superiority Yudkowsky is indulging here. That he takes this praise to such bonkers extremities doesn't actually make his argument original; it just makes it even more than usually stupid.--d]
Because the last news your readers want to hear, is that this person who is wealthier than you, is also smarter, happier, and not a bad person morally. Your reader would much rather read about how these folks are overworked to the bone or suffering from existential ennui. Failing that, your readers want to hear how the upper echelons got there by cheating, or at least smarming their way to the top. If you said anything as hideous as, "They seem more alive," you'd get lynched.
[Yudkowsky was not lynched for saying these very things, and of course he is lying when he pretends to expect anything remotely otherwise. Dumb emotionally stunted smug straight white assholes aren't the people who have historically been lynched in this country, as it happens. Charles Murray didn't write about that in The Bell Curve nor did Ayn Rand devote a chapter to it in one of her execrable bodice-rippers. You know, I would be surprised if many, indeed if anybody, has been meaner to Eliezer Yudkowsky about his horrible screed than I am being right now in the near-decade since he wrote all these awful ugly things, which he has never since recanted nor qualified. Of course, one expects straight white techbros to lose themselves in grandiose fantasies of imagined victimhood for just innocently being themselves in the world of politically correct oversensitive naturally inferior social justice warriors blah blah blah blah blah. It is indeed evocatively Ayn Randian of Yudkowsky to presume that we sour-smelling masses contemplate our rich productive techbro betters with envious projections onto them of misery and ennui -- but of course the truth is that such protestations about the lives of stress and stigma and suffering and risk endured by our indispensable beneficent entrepreneurial Maker elites are usually self-serving rationalizations for bail-outs and tax-cuts and ego-stroking offered up by themselves rather than by those of us, Takers all, whom they so relentlessly exploit and disdain. In any case, nothing could be clearer than that Yudkowsky and his readership do not identify in the main with such envious errant mehum masses, but largely consist instead of useful idiots who fancy themselves Tomorrow's Power Elite awaiting their own elevation via the coding or crafting of the Next! Big! Thing! That! Changes! Everything! and hence they actually identify with the pretensions of the plutocrats Yudkowsky is describing and disdain in advance those who in disdaining them disdain their future selves -- the poor pathetic suckers! I leave to the side the fact that many do not expect merely to Get Rich Quick soon enough, but in the fullness of time expect, given their robocultishness, to live in shiny robot bodies in nanobotic treasure caves filled with sexy sexbots when they are not rollicking in Holodeck Heaven as cyber-angelic info-souls under the ministrations of a history-ending super-parental Friendly Robot God.--d]

[There is much more evil crapola to be found in this vein in Yudkowsky's e-pistle. One particularly crazy utterance several more pages into the screed asserts that "Hedge-fund people sparkle with extra life force. At least the ones I've talked to. Large amounts of money seem to attract smart people. No, really." Oh, how our rich elites sparkle! As I said, it is really just more of the same -- including more of these faux "No, really" protestations against objections to all this objectionable idiocy that never really arrive nor are really, no, really, expected to from his readership.--d]

[By way of conclusion, it is interesting to note that, like many who lack training in structural critique, Yudkowsky finds himself indulging in a rather romantic misconception of the complexities of historical, social, and cultural dynamisms -- investing heroized protagonists with magickal force and trafficking in frankly conspiracist mappings of power.--d]

[For what I mean by magick--d:]
Visiting that gathering of the mid-level power elite, it was suddenly obvious why the people who attended that conference might want to only hang out with other people who attended that conference. So long as they can talk to each other, there's no point in taking a chance on outsiders who are statistically unlikely to sparkle with the same level of life force. When you make it to the power elite, there are all sorts of people who want to talk to you. But until they make it into the power elite, it's not in your interest to take a chance on talking to them. Frustrating as that seems when you're on the outside trying to get in! On the inside, it's just more expected fun to hang around people who've already proven themselves competent. I think that's how it must be, for them. (I'm not part of that world, though I can walk through it and be recognized as something strange but sparkly.)
[For what I mean by conspiracy--d:]
There's another world out there, richer in more than money. Journalists don't report on that part, and instead just talk about the big houses and the yachts. Maybe the journalists can't perceive it, because you can't discriminate more than one level above your own. Or maybe it's such an awful truth that no one wants to hear about it, on either side of the fence. It's easier for me to talk about such things, because, rightly or wrongly, I imagine that I can imagine technologies of an order that could bridge even that gap. I've never been to a gathering of the top-level elite (World Economic Forum level), so I have no idea if people are even more alive up there, or if the curve turns and starts heading downward.
[As I said, one is left questioning more than Yudkowsky's intelligence after reading such stuff, wondering -- to the extent that we take this stuff straight, and not as a bit of pathetic but probably lucrative self-promotional myth-making -- if his many accomplishments (writing Harry Potter fan-fiction, writing advertising copy about code that doesn't exist, extolling rationality while indulging in megalomaniacal crazytalk) will one day include an arrival at either basic competent adulthood or basic moral sanity. Yudkowsky ends his missive with what seems an ambivalent bit of loose-talking guruwannabe-provocation or possibly ass-saving: "I'm pretty sure that, statistically speaking, there's a lot more cream at the top than most people seem willing to admit in writing. Such is the hideously unfair world we live in, which I do hope to fix." We are left to wonder if the reference to "hideous unfair[ness]" is ironic or earnest. It is hard to square his conventional meritocratic rationalization for inequity with the belief that this state of affairs is really so very unfair after all, so far as it goes, though which of us can say just where the balance finally falls once one ascends to the Olympian heights from which we are assured that Yudkowsky, elite above the elites, hopes finally to "fix" things? The ways of self-appointed godlings are mysterious.--d]

30 comments:

jimf said...

> The rich are not like you and me. . .

Well, they're not, of course. ;->

Some tangentially-related YouTube viewing:

Chris Hedges - The Pathology of The Super Rich
Originally uploaded December 5, 2013
https://www.youtube.com/watch?v=EfmiCLYweTQ

Jonathan Haidt: Three Stories About Capitalism -- at ZURICH.MINDS
Published on Dec 3, 2014
https://www.youtube.com/watch?v=iOu_8yoqZoQ

Mormon Stories #606: Mormonism and the Culture of Fraud
with Attorney Mark Pugsley
In this episode, Mark discusses the culture of financial fraud
(e.g., ponzi schemes) within Utah Mormonism.
https://www.youtube.com/watch?v=jayJ3R4pj3k

Dale Carrico said...

Small world, unless you have to clean it.

Unknown said...

Yudkowsky and his band of loyal dogs' love for the corporate strata is not that surprising, is it? It's like when "new" atheists found out the horrible views of Sam Harris or Christopher Hitchens. Some are still denying that they hold these views. Or pride themselves on their "skepticism".

jimf said...

> Small world, unless you have to clean it.

Just like my apartment. :-0

jimf said...

> [H]e is taken seriously by people who are taken seriously,
> for whatever reasons, . . . folks like Robin Hanson, . . .
> Peter Thiel and Elon Musk, and the like.

You know, I've known about the Thiel connection with the
Singularitarians for many years, but I hadn't known there
was a Koch connection (with Robin Hanson)
until I stumbled across this comment thread:

http://www.overcomingbias.com/2012/10/female-overconfidence.html
---------------------
VV

I've just found this:

http://world.std.com/~mhuben/mason.html
~~~
Criticisms of George Mason U. Economics (and Mercatus)

The Economics department of George Mason University has
been strongly shaped by tens of millions of dollars of
donations by the libertarian Koch Foundations of the
billionaire Koch brothers. Most, if not all, of the staff
(23 GMU professors on the Mercatus "Scholars" list 5/16/09)
is affiliated with the Koch-financed Mercatus Center,
a libertarian pro-corporatist think-tank. The result
is a propaganda mill with academic credentials.

Notable libertarian ideologues at both include:

Peter Boettke
Bryan Caplan
Tyler Cowen
Alex Tabarrok
Robin Hanson
Peter Leeson

But here's the fun part. All these macho, independent
libertarians are sucking at the government tit.
George Mason University is a public university, supported
by state funds. At least 4 are also enjoying tenure. And
they get to double dip at the conservative welfare tit
of the Mercatus Center without even having to leave campus.
What a splendid position from which to decry unions,
living wages, job security, restrictions on competition,
and all those other things that keep them from having
cheaper servants.
~~~

Is it actually true?

---

Stephen Diamond

It's been posted to these Comments before, I believe by
this blogger --
http://omniorthogonal.blogspot.com/2011/11/libertardian.html

Never denied.

What would make an interesting story if someone had the
resources to research it (I might write one despite not having
those resources) is the partnership and semi-split between
Overcoming Bias and the "sister site" Less Wrong: Hanson
funded by rightist billionaires Koch and Yudkowsky by rightist
billionaire Peter Thiel. The three billionaires are all
self-proclaimed libertarians, but they have strategic
differences on foreign policy, the Koch brothers being
militarist hawks and Thiel having funded the campaign of
isolationist Ron Paul.

Hanson and Yudkowsky, as far as I can make out, "split" over
whether they saw the future as consisting of a trillion
barely subsisting human emulations (Hanson) or a God-like
singleton artificial intelligence (which, incidentally,
might take its vengeance on any of today's humans who dared
oppose the development). Whether this esoteric difference
served as a cover for the more quotidian differences
between their billionaire patrons remains for future discovery.
====

There's some other entertaining gossip in that same comment
thread (i.e., of
http://www.overcomingbias.com/2012/10/female-overconfidence.html )

jimf said...

> Ugly Celebration of Plutocracy. . .
> . . .infantile foolishness, lack of standards,
> and truly gross indecency. . .
> [M]y own comments. . . were quite involuntary reflex
> actions of a shocked conscience. Yes, shock is the
> way we should begin. . .

Or not. Apparently, YMMV.

> Yudkowsky was not lynched for saying these. . .
> things. . .

Indeed, he was quoted with approval in some precincts:

https://musefree.wordpress.com/2008/09/27/money-power-elites-and-morality/
------
Eliezer Yudkowsky writes:

> One of the major surprises I received when I moved out of
> childhood into the real world, was the degree to which the
> world is stratified by **genuine** competence. . .

Worth quoting, I think, especially in an era where much redistributionist
logic stems from an assumption that money and ability have little relation.

We all have different goals in life, and some, like I, choose to do
something out of love or reverence and perhaps a shot at greatness.
In doing so, we often renounce the opportunity of doing something
else that might have led to more money. However, it is important that
we do not confuse this voluntary decision with some sort of moral
superiority. There is nothing wrong with the fact that people
with more money have better healthcare, better food, better
recreation and better opportunities in life. Money may not be
a perfect denomination, but it is the best that exists. And
there is nothing more important, in these troubled days, than to
reaffirm the morality of a world that deals in it and rewards
some more than others.

Let me end this post with an excerpt from a glorious passage
by Ayn Rand, who expresses this idea more eloquently than
I ever can. . .

> [Eloquence from La Rand]
====

;->

jimf said...

http://amormundi.blogspot.com/2015/12/i-predict.html?showComment=1451634071622
------------
> Has he recanted or qualified ANY of this truly ugly, evil,
> crazy crapola in the years since he said this?

Nope. And, as an earthy Brit might say, "not bloody likely". ;->
====

> Even I did not think so little of Yudkowsky to think he
> would write something like this. Perhaps he no longer would.
> I would like to think so.

You hope in vain. ;->

I gather the post under scrutiny is considered part of "The Sequences" (TM).

It was "rerun" on LessWrong four years later (three years ago):

http://lesswrong.com/lw/efr/seq_rerun_competent_elites/
------------
[SEQ RERUN] Competent Elites
Post author: MinibearRex
09 September 2012 05:42AM

Today's post, Competent Elites was originally published on 27 September 2008.
A summary (taken from the LW wiki)
[ https://wiki.lesswrong.com/wiki/Less_Wrong/2008_Articles/Summaries#Competent_Elites ]:

> People in higher levels of business, science, etc, often really
> are there because they're significantly more competent than
> everyone else.

This post is part of the Rerunning the Sequences series, where
we'll be going through Eliezer Yudkowsky's old posts in order
so that people who are interested can (re-)read and discuss them.
The previous post was The Level Above Mine, and you can use
the sequence_reruns tag or rss feed to follow the rest of the series.

Sequence reruns are a community-driven effort. You can participate
by re-reading the sequence post, discussing it here, posting
the next day's sequence reruns post, or summarizing forthcoming
articles on the wiki. Go here for more details, or to
have meta discussions about the Rerunning the Sequences series.
====

jimf said...

"Halo" is not just a video game. ;->

http://wallowinmaya.com/2012/03/26/493-competent-elites/
---------
. . .

Admittedly, Yudkowsky had experiences with a biased sample
(Silicon Valley, CEOs of tech-companies, etc.) but the general
point is unfortunately true.

When I was young I thought everyone in power (especially folks
in the market economy and business world) was evil,
of mediocre intelligence, stressed out and corrupt.

Now, of course, most business administration students
are in fact stupid and selfish. But the CEOs of middle-to-big
companies have presumably a mean IQ of 120. And the
CEOs of really big companies are probably much smarter
than average.

Obviously, most of them are ambitious and egoistic, yeah,
even slightly sociopathic. But, hey, so is everyone else.
If my former socialist-friends were in charge I would be
dead and our civilisation collapsing. . .

Yeah, the “Halo-effect” isn’t even much of a fallacy.
Intelligence, happiness and beauty **do** correlate with
each other. That shouldn’t be surprising. . .

[G]ood comment by [Michael] Vassar:

> Smart, happy, and alive? That fits my observations.
> Not bad morally? Only in the Bay area. Also, I think that
> more successful people seem smarter etc due to halo effect,
> and the ability to seem smart and alive and generally
> appealing, even moral, is called social skill or charisma
> and contributes a lot to a person’s rise in power.
> You may have noticed that these people were also much
> better looking than average. . .

Actually, forget what I wrote and read this post by Steven Hsu:

http://infoproc.blogspot.com/2009/08/creators-and-rulers.html
====

OTOH:

http://log.scifihifi.com/post/24869568279/while-smartness-is-necessary-for-competent-elites
---------

"While smartness is necessary for competent elites, it is far
from sufficient: wisdom, judgment, empathy and ethical rigor
are all as important, even if those traits are far less valued.
Indeed, extreme intelligence without these qualities can be
extremely destructive. But empathy does not impress the same
way smartness does. Smartness dazzles and mesmerizes. More important,
it intimidates. When a group of powerful people get together
to make a group decision, conflict and argumentation ensue, and
more often than not the decision that emerges is that which
is articulated most forcefully by those parties perceived to
be the “smartest.”"

- Christopher Hayes - Why Elites Fail
The Nation
http://www.thenation.com/article/why-elites-fail/
====

jimf said...

> Do we agree with Yudkowsky, admittedly a man veering into
> middle age at this point, that he has indeed "moved out of
> childhood" at all, let alone "into the real world"? Given
> the embarrassing narcissism, the simplistic conceits, the
> facile hero worship, the infantile wish-fulfillment on
> display, are we all quite ready to admit Yudkowsky into
> the ambit of adulthood?

https://www.reddit.com/r/badphilosophy/comments/3fuy91/oldie_but_goodie_from_lesswrongs_resident_crank/
------------
I'd never read Yudkowsky before this. His lack of self-awareness
is positively Randian.
====

Why Emm Emm Vee?

;->

Dale Carrico said...

not that surprising

I rarely blog about being surprised. I tend to blog about being amused or being disgusted.

Giordano Mirandolla said...

Note: Someone at MIRI was apparently good enough at PR that they left this essay and "Why you should be a scientific racist" (http://lesswrong.com/lw/kk/why_are_individual_iq_differences_ok/) out of the book version of the sequences.

jimf said...

> Someone at MIRI was apparently good enough at PR that they
> left this essay and "Why you should be a scientific racist"
> (http://lesswrong.com/lw/kk/why_are_individual_iq_differences_ok/)
> out of the book version of the sequences.

Speaking of "scientific racism" (along with "technolibertarianism"
and "geek misogyny"), there's an interesting thread on Tumblr
that popped up yesterday:

http://reddragdiva.tumblr.com/post/136699954978/whats-your-history-with-lw-and-rationalist
----------------
David Gerard
Jan 5th, 2016

Anonymous asked:

> What's your history with L[ess]W[rong] and rationalist tumblr? How did
> you come to be involved in those hellholes?

this post [ http://reddragdiva.tumblr.com/post/128546600873/when-was-the-last-time-you-changed-your-mind-about ]
summarises my lw progression if anyone cares. here’s why the roko’s basilisk article
[ http://reddragdiva.tumblr.com/post/127944291153/reasonableapproximation-uncrediblehallq ].

as internet-as-television goes, amateur philosophical streetfighting is great fun!
i looked back a while ago at my comments from when i started posting in 2010 and marvel
at how optimistic i was that the weird, bad and cultish ideas could easily be
alleviated through sweet reason.

the tag for every time i bitch about this cluster of terrible ideas is #the crackpot offer indeed
[ http://reddragdiva.tumblr.com/tagged/the-crackpot-offer-indeed ].

(i need to tag more stuff more reliably because i hate answering questions twice and
tumblr’s search system is as coded by dildos as everything else about this blue hell.)

as for rationalist tumblr, i really did come here to keep up with my spouse’s essays
in pounding elf cock therapy. but i knew a pile of the people from lw, you see.

they are mostly very nice really. some are not. some are sort of nice if you don’t
mind technolibertarianism, geek misogyny and scientific racism crapping up your dash,
though i do mind those things actually. (and not to mention the ones who try to use
“neurotypical” as a snarl word when called on their fucking reprehensible ideas.)
[ http://reddragdiva.tumblr.com/post/135929409138/tenaciousvoidcycle-nostalgebraist ]
i’m trying to blow up less at arrant stupidity from people who mean well but who
should know better by now.

[see, this is what i mean about my shitposting ability being utterly unaffected by
labyrinthitis and my guts doing loop the loops when i look at a screen]
====

jimf said...

> http://reddragdiva.tumblr.com/post/136699954978/whats-your-history-with-lw-and-rationalist
> ----------------
> David Gerard
> Jan 5th, 2016
>
> Anonymous asked:
>
> > What's your history with L[ess]W[rong] and rationalist tumblr? How did
> > you come to be involved in those hellholes?
>
> this post [ http://reddragdiva.tumblr.com/post/128546600873/when-was-the-last-time-you-changed-your-mind-about ]
> summarises my lw progression if anyone cares. . .
> ====

http://reddragdiva.tumblr.com/post/128546600873/when-was-the-last-time-you-changed-your-mind-about
------------
David Gerard
Sep 7th, 2015

> su3su2u1
> Sep 7th, 2015

> > Anonymous asked:
> > When was the last time you changed your mind about
> > something MIRI-related? Not just learned something new, but
> > went from believing X to believing the opposite of X. Was
> > your new opinion more favorable to MIRI, or less favorable,
> > than your old opinion? What evidence did you see that
> > convinced you?
>
> I’ve grown steadily less favorable with increasing exposure. My initial
> impression, years ago, was something like “well, these are wild ideas but
> it takes all kinds and whatever it takes to motivate your research.”
> Basically, how I think of Kurzweil- guy who does great, interesting work
> with a side order of ideas I find a bit wacky.
>
> Then later “well, this is mostly just typical skeptic blogging stuff.”
>
> Then as I became more interested in machine learning stuff (i.e. when
> companies started paying me to do it), I decided to visit their technical
> work now that I had some idea of the field and that is when the wheels
> came off the bus. After that, I revisited the Less Wrong blog posts and
> actually systematically read some of them, which further pushed me away
> (in particular, the focus on ideal Bayesian agents, everything centered
> around “science vs. bayes” really, the references to Drexler style
> nanotech, etc.).
>
> So I’ve had series of shifts, starting from
>
> 1. AI research organization with a side order of wacky ideas (based on
> the general impression given to me by the internet). This isn’t a negative
> judgement - I know lots of brilliant researchers who come with a side
> order of wacky ideas.
>
> 2. A skeptical blogger who works in AI research (based on the general
> impression given to me by friends interested in LW as the sequences were
> happening, and glancing at the occasional LW post).
>
> 3. A horribly unproductive research organization that doesn’t seem to be
> producing any interesting technical work- based on reading their papers
> a few years ago, revisited more recently.
>
> 4. A horribly unproductive research organization hosting a side blog
> with lots of mistaken, anti-science stuff (based on reading the sequences,
> some of the problems I’ve discussed here).
>
> I guess also MIRI related, I had the impression that Bostrom’s _Superintelligence_
> was going to be a convincing portrayal of AI-risk (a lot of people told me it’s
> the best possible version of the MIRI style arguments). When I read it,
> I thought there wasn’t anything really different than the typical MIRI arguments.
> (based on reading Bostroms book).
>
> In a more positive shift, years ago I posted a bit on the LW web board and
> through that got the impression that LW was pretty hostile to criticism. The
> people on tumblr I’ve interacted with have, for the most part, been very pleasant
> people even though I disagree with them on pretty much everything (of course,
> the anons are a different story, but trolls gonna troll). So I’d say the
> community at large seems much healthier than I would have thought.

. . .

====

jimf said...

http://reddragdiva.tumblr.com/post/128546600873/when-was-the-last-time-you-changed-your-mind-about
------------
David Gerard
Sep 7th, 2015

> su3su2u1
> Sep 7th, 2015
>
> . . .
>
> I’ve grown steadily less favorable [to LessWrong/MIRI] with increasing exposure. . .

my progression strongly resembles this.

i started reading because a friend was signing up for cryonics and was
an active participant. (since i am for my dilettantism an actual expert on
scientology [ http://suburbia.net/~fun/scn/ ],
mutual friends literally deputised me to talk to him and see if he’d joined a
weird cult. my verdict was “not really,” which remains my verdict.) my previous
opinion of cryonics was neutral-to-positive, i looked into it and went
“wtf is this shit.” this [ http://rationalwiki.org/wiki/Cryonics ] was the
main result. i joined lesswrong in late 2010 ‘cos it looked fun. went to
a few of the meets. was put off attending one ever again by the
vociferous support for scientific racism. apparently scientific racism
is essential to being considered a true rationalist.

it took me years to realise there was no “there” there, that all the
dangling references to references that refer to other references never
resolve: that yudkowsky has **literally never accomplished anything**. he has
**literally no results to his credit, in his claimed field or out of it**.
he’s a good pop-science writer, and I highly respect that. i’ve read the
sequences through a coupla times and there’s good stuff in there. and he’s
written literally the most popular harry potter fan-fiction, for what that’s
worth. but in his putative field, his actual achievements **literally don’t exist**.

and the papers! holy shit, these are terrible! TDT [Timeless Decision Theory,
https://intelligence.org/files/TDT.pdf ], CEV [Coherent Extrapolated Volition,
https://intelligence.org/files/CEV.pdf ] - i am not a scientist, but i
have enough experience as a skeptic to tell when someone’s spent 150 pages
handwaving away the obvious hard bit and playacting at science. the best
thing to say about them is that none of them start “Hello, my name is Kent Hovind.”
[ http://rationalwiki.org/wiki/Kent_Hovind's_doctoral_dissertations ].

i recently looked up my early comments and i’m amazed how optimistic i was that
the weirdy bits could be dealt with by sweet reason rather than being the point
of the exercise. “taking ideas seriously”, by which they don’t mean
“consider this philosophical notion in detail in the abstract”, but
“believe our utilitarian calculations and act on them because of our logic.”
even scott alexander called this one out.
[ http://squid314.livejournal.com/350090.html ].

i went back and read every post in main from 2007-2011. you can see it
getting stranger and stranger from 2009 on, as people took in and extrapolated
the sequence memes. i would say that peak weird was the basilisk post in
july 2010. [ http://rationalwiki.org/wiki/Roko's_basilisk ]
this i think scared people and the weird seemed to noticeably scale back.
they also started the regular meets, for slight interaction with reality.

i mean, i don’t claim a stack of achievements either, i don’t even have
a degree, but at least i haven’t started a million-dollar-a-year charity
to fund my fanfic and blogging.

i am entirely unsurprised bostrom’s book is more of the same. i and others
have been going through the transhumanism related articles on wikipedia lately.
it’s a classic wikipedia walled garden. [ https://en.wikipedia.org/wiki/Wikipedia:Walled_garden ]
people who only ever refer to each other and cobble together an illusion
of actual science or technology going on, with a massive helping of hype for
the cause.
====

jimf said...

So there's this ex-Scientologist named Chris Shelton who just published
a book about his experiences in the "church":

http://www.patheos.com/blogs/friendlyatheist/2016/01/04/this-is-why-i-joined-the-church-of-scientology-as-a-teenager/
------------------
This is Why I Joined the Church of Scientology as a Teenager
January 4, 2016
by Hemant Mehta

. . .

His new book detailing his experience within the organization
is called _Scientology: A to Xenu: An Insider’s Guide to What
Scientology is All About_ . . .
====

Shelton also has a YouTube channel called
"The Critical Thinker at Large
Offering Reason in an Unreasonable World"
https://www.youtube.com/channel/UCF326xyA0QHI7Z5xAwKQDJg/videos

So I was watching some of Shelton's videos, and in one
entitled "Critical Q&A #36"
( https://www.youtube.com/watch?v=Asq6Je9SS0g )
one of his viewers asked (at 18:58/34:41):

"I'd be interested in hearing more specifics about how you got
into critical thinking. You've said that you started reading
Carl Sagan, but I'd be interested in the moment (if you remember
it) that you began to know that you lacked critical thinking
skills (or weren't using them). . ."

jimf said...

And Shelton answered:

"I'll tell you exactly the moment it happened. It was August, 2013,
and I was sitting at my computer. Having read all this information
about Scientology, and knowing that it was something I could no
longer believe in, because I'd learned that Hubbard was a pathological
liar and a serial plagiarist, and that Scientology was not at all
what it said it was, I knew that I was no longer part of this
destructive cult mentality. And I told myself, "I **do not** want
to go from one cult into another." But I have to decide, what am
I going to believe? What am I going to think? How am I going to
approach life now? . . . And I get on the internet, and exactly
what happened was I looked under "skepticism" -- maybe when I was
reading about Scientology, somebody said something about skepticism,
or had used this phrase "critical thinking", and I didn't really
understand what that was, right? So I Googled "skepticism",
and what came up was skeptic.com and a video from Penn & Teller.
Now I didn't know anything about Penn & Teller or James Randi
or Carl Sagan or the skeptical movement or the atheists or
Richard Dawkins or any of this stuff, I didn't know about any of it.
But I knew who Penn & Teller were because they were magicians and
they were funny and I'd laughed at them in the past because I'd
seen their performances and I'd really liked them. . . But I had
never connected those two magicians with skepticism, or logic or
reason. So I looked at the video and it was Penn Jillette talking
about James Randi. Never heard of James Randi. So I looked up
James Randi. . . this very famous skeptic who had been a magician
who was talking about critical thinking, and he bad-mouthed
Hubbard and he bad-mouthed psychics and ESP, and I read about the
James Randi million-dollar challenge, and I was fascinated by
this. . . And there was a quote from Carl Sagan about belief and
about science. And I started reading what Carl Sagan had to say
about how science is not a system of belief, it's a way of thinking,
a way of looking at information and making determinations and
judgments on that information based on the scientific method and
what I came across right away with Carl Sagan was something
called the "Baloney Detection Kit" . . . in a chapter of the book
_The Demon-Haunted World_. Science as a light or a candle in the
darkness of ignorance. Never heard of it before. I knew about
Carl Sagan from "Cosmos" -- you know, "beelions and beelions of
stars. . ." -- that was all I knew about Carl Sagan. I read the
steps of this Baloney Detection Kit and I thought this is exactly
what I'm looking for, because it's not another cult system, it's
not somebody telling me how the world is or what to think, it's
giving me tools I can use to evaluate for myself what to think.
And I love that. . ."

jimf said...

So I was moved by all this to add the following comment (to
that same "Critical Q&A #36" video
https://www.youtube.com/watch?v=Asq6Je9SS0g )

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
jfehlinger 23 hours ago

Chris -- it's great that you stumbled on the skeptical
and atheist community via the Web -- Penn & Teller, James Randi,
Carl Sagan et al. However, let me offer you a heads-up about another
quasi-cultic belief system that, in the last decade, has
begun to infiltrate the skeptical community at large. For example,
I was surprised not long ago to find out from a 30-year-long friend of mine,
who was until recently a producer of the New York City Skeptics' podcast
"Rationally Speaking" [ http://rationallyspeakingpodcast.org/about/ ]
that the podcast's current *host* is somebody whose
name I recognized as being associated (as president and cofounder)
with something called the "Center for Applied Rationality"
[ http://rationality.org/about/ ], a group with
some pretty unsavory associations -- to wit, the quasi-cultic belief
system I mentioned above. This is the sort of thing I mean:

http://www.vice.com/en_ca/read/theres-something-weird-happening-in-the-world-of-harry-potter-168
--------------
The Harry Potter Fan Fiction Author Who Wants to Make
Everyone a Little More Rational
By David Whelan
March 2, 2015
====

https://web.archive.org/web/20120728023014/http://betabeat.com/2012/07/singularity-institute-less-wrong-peter-thiel-eliezer-yudkowsky-ray-kurzweil-harry-potter-methods-of-rationality
--------------
Faith, Hope, and Singularity: Entering the Matrix with New York’s Futurist Set
It's the end of the world as we know it, and they feel fine.
By Nitasha Tiku
7/25/2012
====

http://harpers.org/archive/2015/01/come-with-us-if-you-want-to-live/
--------------
Come With Us If You Want to Live
Among the apocalyptic libertarians of Silicon Valley
By Sam Frank
January 2015
====

http://rationalwiki.org/wiki/Lesswrong
https://web.archive.org/web/20141124184420/http://kruel.co/2014/07/03/eliezer-yudkowskys-personality/#sthash.MLyiGHKN.nf9cLslm.dpbs

The cultic/religious guru/true-believer dynamic turns up in what you might think of
as the unlikeliest places. Beware! ;->
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

jimf said...

More from a LW/MIRI/HPMOR critic:

http://reddragdiva.tumblr.com/post/130943692318/argumate-thirqual-argumate
-------------
David Gerard
Oct 11th, 2015

> argumate:
>
> thirqual:
>
> argumate:
>
> . . .
>
> I must confess a lot of LW ideas seem a lot more alarming
> when I imagine them being encountered by teenagers.
>
> Now imagine if a large contingent of people came across LW through a very
> popular Harry Potter fanfiction. . .
>
> aaaaaahhhhhh

this is a lot of the hazard.

1. narcissistic autodidact pontificates. a lot of it’s actually
good science popularisation (the redigested kahneman), with a small
percentage of crack (the ai, quantum, anti-science). the crack turns out to
be the actual point
[ https://topherhallquist.wordpress.com/2015/07/30/lesswrong-against-scientific-rationality/ ].

2. the good stuff uses neologisms by the ton. this cuts off the reader
from 2000 years’ discourse on the philosophy in question (much of
which does in fact go back a couple thousand years) and gives the
naïve reader (that’s the teenagers!) the impression that
**any of the good stuff is original**.

3. (also, the ideas are like catnip for a noticeable type of reader - young,
very smart, somewhat aspergic, ocd tendencies - for whom they are actually
memetic hazards.)
[ http://reddragdiva.tumblr.com/post/127944291153/reasonableapproximation-uncrediblehallq ]

4. the narcissist is thought of by the incredibly impressed young readers as an
original genius.

5. miri gets donations. this is after all the most important cause in the world.
[ http://lesswrong.com/lw/8fd/transcription_of_eliezers_january_2010_video_qa/ ]
just ask them!
[ http://lesswrong.com/lw/24c/the_fundamental_question/1wmb?c=1 ]

the neologism thing is particularly annoying. someone suggested on the rw to do list
[ http://rationalwiki.org/wiki/RationalWiki:To_do_list ]
a lw jargon translator (6 upvotes right now). i can’t be bothered myself,
but y’know if anyone can.

i was surprised and pleased to see that “typical mind fallacy”
is actually the proper term for it. **unfortunately** i’ve had to stop
casually linking people to slatestarscratchpad’s otherwise-excellent
popularisation on lw [ http://lesswrong.com/lw/dr/generalizing_from_one_example/ ],
because he just had to use pick-up artists as his obvious go-to example.
yes, social defectors and parasites can get advantage for themselves even
if others find their ideas and behaviour reprehensible, well done.

i mean, noticing your mind and that you can do things to improve it is
**great stuff**. getting people to notice their minds and that they can do things
to improve it is **quite possibly** great stuff. doing so with deeply
flawed philosophy that turns out not to have good foundations, less so.

and i’m being disconcerted to discover how **young** so much of post-lw tumblr is.
====

Scientology for the internet age -- via Harry Potter.

Jesus Fucking Christ.

jimf said...

http://lesswrong.com/lw/8il/how_did_you_come_to_find_lesswrong/
-------------
quanticle
21 November 2011

I was reflecting the other day about how I learned about LessWrong.
As best as I can recall/retrace, I learned about LessWrong from. . .
an essentially chance meeting. I'm wondering how typical my experience is.
How did **you** come to LessWrong?

Do you think that we (the community) are doing enough to bring in
new users to LessWrong? If not, what do you think could be done
to increase awareness of LessWrong amongst potential rationalists?

---

KatieHartman
25 November 2011

I met Jesse Galef last year, and we became fast friends -
at which point he practically begged me to read
Methods of Rationality and LW. Good on you, Jesse!

This year I was the organizer for Skepticon, a conference
that has traditionally concerned itself with the atheist movement.
Eliezer, Julia Galef, Richard Carrier, and Spencer Greenberg
were kind enough to come speak on topics more pertinent to
the rationalist community. . .

Attendance was just over 1,100. . . I overheard one of them
tell Eliezer that she felt like his talk had revealed a
"next step" in her personal growth as a freethinker. . .

Lately, there's been a huge upsurge in atheist/secular
activism at the college level, and as far as I know, these
groups aren't being particularly targeted. . .

---

Mercurial
21 November 2011

Through cryonics, oddly enough. I went to a "Teens & Twenties"
cryonics meetup in January 2009 and met Eliezer there.
He kept bringing up the rationality stuff and kept trying to
encourage everyone to look at Less Wrong. . .

---

RomeoStevens
22 November 2011

Was going crazy studying econ and feeling like I must
be insane to reach the conclusions about the world I was
reaching. Then I found Robin Hanson's blog. From there I
read lots of Eliezer, Mencius Moldbug, and Nick Szabo
and realized everyone else was insane. . .

---

Ezekiel
26 November 2011

HPMOR was recommended to me by a friend, and from there to here. . .

---

fortyeridania
21 November 2011


I think I was reading up on cognitive biases, which led me
to Overcoming Bias, which led me to Less Wrong. . .

---

Vladimir_Nesov
21 November 2011

"What's the state of AI these days?" (on a whim, Feb 2007 IIRC) ->
Goertzel's agi list -> SL4 -> OB -> LW

---

peter_hurford
21 November 2011

Via LukeProg's CommonsenseAtheism blog. . .

---

Zetetic
21 November 2011


First through Scott Aaronson's blog. . .
Then a friend encouraged me to check out the sequences. . .

---

Grognor
21 November 2011

Discovered Graham's Number. Realized how fucking huge it was.
Googled around for a coherent explanation. Found this.
[ http://www.yudkowsky.net/obsolete/singularity.html ]
Thought, "This guy is a motherfucking genius. . ."

I found Eliezer Yudkowsky's homepage and it was like
"WOAH, THIS GUY IS A MOTHERFUCKING GENIUS! FUN THEORY?
WHAT'S THAT?" . . .

I've personally tried to get people interested in Less Wrong
to no avail. It's not easy, so there's that. . .

---

Kaj_Sotala
22 November 2011

Wikipedia article for technological singularity ->
Finnish IRC channel about transhumanism -> Finnish Transhumanist Assocation ->
Staring into the Singularity -> SL4 -> OB -> LW.

---

. . .

====

Observe, little girls.

jimf said...

> http://reddragdiva.tumblr.com/post/128546600873/when-was-the-last-time-you-changed-your-mind-about
> ------------
> David Gerard
> Sep 7th, 2015
>
> . . .
>
> . . .i am for [all] my dilettantism an actual expert on
> scientology [ http://suburbia.net/~fun/scn/ ]. . .

Hm. Apparently, David Gerard, Wikipedian, co-founder
of RationalWiki (a somewhat unflattering, if juicy, bio:
https://encyclopediadramatica.se/David_Gerard ),
and disillusioned LessWrongian is (or was, at any rate,
when he tangled with the Scientologists
in the 90s) a personal acquaintance of none other than
Julian Assange.

From:

http://www.amazon.com/Most-Dangerous-Man-World-Conspiracies/dp/1616084898/
--------------------------
_The Most Dangerous Man in the World:
The Explosive True Story of Julian Assange
and the Lies, Cover-Ups, and Conspiracies
He Exposed_
by Andrew Fowler
====

https://books.google.com/books?id=Z00tAgAAQBAJ&pg=PT32&lpg=PT32&dq=Scientology+%22David+Gerard%22&source=bl&ots=fCL5S7MUiA&sig=4Q2F5qW7Mz3d9llGgQE5fRWpIFY&hl=en&sa=X&ved=0ahUKEwiTg52Tp5jKAhVGPT4KHcmKAIcQ6AEIPzAI#v=onepage&q=Scientology%20%22David%20Gerard%22&f=false
--------------------------
Assange operated one of the first Usenet chat rooms, "Suburbia",
which became famous as an open forum for new ideas, a place
where anyone from anywhere in the world could speak their
mind. The recurring topics focused on attempts by the
government to control free speech, the possibility of
nuclear war, the role of the US military bases overseas,
and how to crack encryption codes.

But of greatest interest to Assange at this time were plans
by the underground computer community to target the controversial
Church of Scientology. . .

For Assange and the other Melbourne hackers, Scientology
embodied everything they stood against: a rigidly conformist
and opaque organisation that exercised great power over
what its members said, thought and did. . . "What you have then
is a Church based on brainwashing yuppies and other people
with more money than sense," Assange wrote. . .

He soon received a call. When the phone rang on Assange's
silent number he might have expected an intelligence agency
to be on the other end. Instead the voice he heard
belonged to a private investigator working on behalf of
the Church of Scientology. The Church was particularly
interested in David Gerard, one of Assange's friends.
Gerard ran his own chat room, encouraging open debate
about alternate religions. When an ex-Scientologist
confirmed during one discussion that it was part of
"secret Scientology scripture" to try to communicate with
plants, the Church took him [not Gerard, the ex-Scientologist:
http://www.suburbia.net/~fun/scn/pers/fun/vut/haddon-15k.html]
to court bizarrely for "copyright violation". His house was
raided and his computer taken away.

The private investigator asked Assange how long Gerard
had had an account with "Suburbia". Assange refused to
hand over any information. Instead he tipped off Gerard --
and posted a call to arms. "The fight against the
Church is far more than the Net vs a bunch of wackos
with too much money. It is about corporate suppression
of the Internet and free speech," Assange posted on
his site. The Internet, he argued, posed the greatest
threat to Scientology in that it was, by its very nature,
"a censorship-free zone". . .

He warned that "precedents the Church sets today"
would become the "weapons of corporate tyranny tomorrow".
If ever there was a cause célèbre for underground hackers,
the Church of Scientology provided the perfect target. . .
====

jimf said...

https://www.reddit.com/r/HPMOR/comments/28yjbx/some_strangely_vehement_criticism_of_hpmor_on_a/cifxmzb
(via
http://rationalwiki.org/wiki/Talk:Eliezer_Yudkowsky#Yudkowsky_hates_hates_hates_RW.27s_guts )
----------------
"RationalWiki hates hates hates LessWrong. . . so they lie about us
and have indeed managed to trash our reputation on large parts
of the Internet. . ."

"I submit to you all that by far the best reason why folks at RationalWiki
would act like this toward some of the clearest-cut moral exemplars of
the modern world, often-young people who are donating large percentages
of their incomes totaling millions of dollars to fight global poverty. . .
when RWers themselves have done nothing remotely comparable, is precisely
that RWers themselves have done nothing remotely comparable, and RW
hates hates hates anyone who, to RW's tiny hate-filled minds, seems
to act like they might think they're better than RW.

What RW has to say about effective altruism stands as an absolute
testimonial to the sickness and, yes, outright evil, of RationalWiki,
and the fact that RW's Skeptrolls will go after you no matter how much
painstaking care you spend on science or how much good you do for other
people, which is clear-cut to a far better extent than any case I could
easily make with respect to their systematic campaign of lies and
slander about LessWrong."
====


Some of the clearest-cut moral exemplars of the modern world?

Well, I'd certainly be willing to grant: **one** of the
clearest-cut exemplars of utter tone deafness.

Me skeptic, you skep**troll**!

It sounds exactly like (from
https://www.youtube.com/watch?v=BtpvcIsaB88
Scientology is Coming Apart at the Seams
Chris Shelton
Published on Dec 19, 2014
(17:09/37:16) -- "Ruthie Heyerdahl video posted
on YouTube, February 4, 2008"
----------------
I have a ton of friends who are Scientologists. And I think
Scientologists are possibly some of the most involved people
I have ever met. Like, they are some of the most ethical
people. They're so friendly, and so happy, and they have
such a purpose in life. It's very rare for me to come across
people who actually try to make a difference -- in their
local communities, in their cities, in their government.
To be honest, I even have a hard time finding people who
actually vote. And Scientologists are some of the most
intelligent and the most involved people that I have ever
met. They're people who **care**. They're people who
make a difference. I know so many Scientologists, and they're
**great** people. And so, the fact that their right to
practice their religion is being protested is very, very
strange. . .
====

jimf said...

> "RationalWiki hates hates hates LessWrong. . ."

https://www.reddit.com/r/HPMOR/comments/2n3yh5/what_is_it_with_dark_lord_potter_and_hpmor/cmad6x3
--------------
Eliezer Yudkowsky
1 year ago

There's a standard Internet phenomenon (I generalize) of a
Sneer Club of people who enjoy getting together and picking
on designated targets. Sneer Clubs (I expect) attract people
with high Dark Triad characteristics, which is (I suspect)
where Asshole Internet Atheists come from - if you get a
club together for the purpose of sneering at religious people,
it doesn't matter that God doesn't actually exist, the club
attracts psychologically f'd-up people. Bullies, in a word,
people who are powerfully reinforced by getting in what feels
like good hits on Designated Targets, in the company of
others doing the same and congratulating each other on it.
E.g. my best guess is that RationalWiki started out as a
Sneer Club targeted on homeopathy, and then they decided
that since they were such funny and incisive skeptics they
ought to branch out into writing about everything else,
like the many-worlds interpretation of quantum mechanics. . .

There is also a common attitude that nerds are designated
bullying-targets; or to write it out at slightly greater
length, people who talk about science are acting like they
think they're snootier than you, which is especially
deserving of a slapdown since the person is probably
just some nerd in their mother's basement. . .

To **hate hate hate** a vegetable stew and everyone who
ever said they liked it, you need to have something else
going on inside your head besides having tasted the
stew and having said "bleah" that one time.
====

https://en.wikipedia.org/wiki/Dark_triad
--------------
The dark triad is a group of three personality traits:
narcissism, Machiavellianism and psychopathy.
Use of the term "dark" implies that people scoring high
on these traits have malevolent qualities:

-- Narcissism is characterized by grandiosity, pride,
egotism, and a lack of empathy.

-- Machiavellianism is characterized by manipulation and
exploitation of others, a cynical disregard for morality,
and a focus on self-interest and deception.

-- Psychopathy is characterized by enduring antisocial behavior,
impulsivity, selfishness, callousness, and remorselessness.
====

jimf said...

(via
https://www.reddit.com/r/SneerClub
[SneerClub: producing hatred of LessWrong and Yudkowsky since 2015]):

An upgrade is coming. . .

http://lesswrong.com/lw/n0l/lesswrong_20/
-------------
LessWrong 2.0
Vaniver
09 December 2015

Alternate titles: What Comes Next?, LessWrong is Dead, Long Live LessWrong!

. . .

What went wrong (or horribly right):

So why did LessWrong fade? One short version is that
LW was a booster rocket, designed to get its payload to a
higher altitude then discarded. This is what I mean by
what went horribly right--MIRI now has a strong funding
base and as much publicity as it wants. Instead of writing
material to build support and get more funding, Eliezer
(and a research team!) can do actual work. . .
====

Hold on to your hat, Maudie, here comes the good stuff!

Loc. cit.
-------------
helldalgo
03 December 2015

Less Wrong has a high barrier of entry if you're at all
intimidated by math, idiosyncratic language, and the
idea that ONE GUY has written most of its core content. . .

If other community members branch out in the way that
CFAR and MIRI have, integrating the education-without-academia
principles should be a priority in their organizations.
It's not a stretch: Eliezer Yudkowsky does not have a degree,
and he has done excellent work from a teaching point of view.
He also seems to be respectable among academics for his
theory work (I'm not knowledgeable enough to vet that
personally).

---
PhilGoetz
27 December 2015

One thing LessWrong should have been, and never was, is a
place to expand on and critique the ideas behind Friendly AI
and Coherent Extrapolated Volition. It's nearly 2016, and
AFAIK we still have no more details on what CEV is or
how it might work than when the site was created. I find
it strange to talk about shutting the site down when
it's never gotten around to what should have been its
primary purpose.

It wasn't such a place partly because Eliezer discouraged
attempts to fill in the gaps, figure out what he meant,
or critique the assumptions of his program. LessWrong
was a fundraiser for a cause that it wanted not to discuss.

---
CellBioGuy
19 December 2015

Near as I can tell LW 'faded' because it was initially
created to sneak obsession with a particular flavor of AI
apocalypse into the minds of as wide an audience as possible
under the guise of a rationality movement so as to make
money for the singularity institute, and now there are
actual institutions which are more lucrative potential
funding sources. That, and the ai safety subtext that
underlaid everything has gone nowhere at the institute
itself to all outward appearances.
====

jimf said...

Apropos the original subject of the thread:

http://www.nytimes.com/2016/01/01/opinion/privilege-pathology-and-power.html
--------------------
Privilege, Pathology and Power
Paul Krugman
JAN. 1, 2016

Wealth can be bad for your soul. That’s not just a
hoary piece of folk wisdom; it’s a conclusion from
serious social science, confirmed by statistical analysis
and experiment.
[ http://healthland.time.com/2013/08/20/wealthy-selfies-how-being-rich-increases-narcissism/ ]
The affluent are, on average, less likely to exhibit empathy,
less likely to respect norms and even laws, more likely
to cheat, than those occupying lower rungs on the economic ladder.

And it’s obvious, even if we don’t have statistical
confirmation, that extreme wealth can do extreme spiritual
damage. Take someone whose personality might have been
merely disagreeable under normal circumstances, and give
him the kind of wealth that lets him surround himself
with sycophants and usually get whatever he wants. It’s
not hard to see how he could become almost pathologically
self-regarding and unconcerned with others.

So what happens to a nation that gives ever-growing
political power to the superrich? . . .

[T]hink of the various billionaires who, a few years ago,
were declaring with straight faces, and no sign of
self-awareness, that President Obama was holding back
the economy by suggesting that some businesspeople
had misbehaved. You see, he was hurting their feelings.

Just to be clear, the biggest reason to oppose the power
of money in politics is the way it lets the wealthy rig
the system and distort policy priorities. And the biggest
reason billionaires hate Mr. Obama is what he did to
their taxes, not their feelings. The fact that some of
those buying influence are also horrible people is secondary.

But it’s not trivial. Oligarchy, rule by the few, also tends
to become rule by the monstrously self-centered.
Narcisstocracy? Jerkigarchy? Anyway, it’s an ugly spectacle,
and it’s probably going to get even uglier over the course
of the year ahead.
====

Ellie K said...

Argh, where to begin?! jimf has posted a ton of stuff, which gives me the usual whiff of conspiracy theory I get from spending too much time on the Internet. RationalWiki is very irrational, although I don't know if that is by original intent. It doesn't surprise me that RationalWiki doesn't like Wikipedia.
David Gerard is a long-time contributor to Wikipedia, under his own name. My impression is that he is patient and well-behaved. I didn't know he had a tumblr blog. I have a tumblr blog and follow nostalgebraist, and am surprised to find anything other than ubiquitous political correctness on tumblr.
Mercatus and GMU are indeed funded by the Kochs and "The State". I don't like Tyler Cowen, and even his own Marginal Revolution blog commenters often dislike him. Yes, Robin Hanson and Overcoming Bias are linked to Less Wrong and GMU, and it is all public knowledge. Hanson has a very impressive academic background in physics, and I suspect he was lured away from it.

First things first: Eliezer Yudkowsky is an annoying little snit who should have gone to college instead of getting ridiculous mileage as a child prodigy. Now he is ungracefully segueing into middle age.

Next, I liked and agreed with much of the author Dale C.'s original post, although I disagree with the brief parenthetical that police are racist. It is important not to go full Corey Pein on everything. Also, there is nothing inherently bad about certain aspects of reactionary thought. For example, modesty, understatement, avoidance of tasteless displays of obscene wealth, refusal to self-aggrandize, and willingness to help the poor and needy without the incentive of tax credits and public praise are all aspects of reactionary thought that would make the world a better place than it is now.

Ellie K said...

This strikes a sympathetic pang of acknowledgement:

...exploiting collective labor and intelligence via the shar(ecropp)ing economy with little awareness or interest in the costs or risks or suffering of others involved in their efforts.

"Knowledge workers", "gig economy", "sharing economy" and freelancing are all misnomers, and thinly veiled, technologically-enabled forms of worker exploitation. I don't know why it should be necessary for me to explain this so often, to so many people. Very few seem to realize it. I'm happy that the author, Dale C. does. Tom Slee, also known as Whimsley on Twitter, is a very bright tech person who realizes it too. Some of the altright and reactionary crowd realize as well.

Wikipedia, StackOverflow and some open source software projects are among the very few successful experiments in collaborative working for free. I think they were flukes, unlikely to be repeated. I predict that techno-utopian do-gooders (effective altruism a la Less Wrong) will have a long time to wait before they see anything similar again. Bitcoin is a stellar example of how effective altruism/rational self-interest does not work without central governance but instead turns into an insider cabal, controlled by fewer than a dozen powerful players who exploit everyone else until the entire Ponzi scheme (cult?) totters on the brink of collapse.

jimf said...

> First things first: Eliezer Yudkowsky is an annoying little snit. . .

It's more than that. Cf.
http://amormundi.blogspot.com/2008/03/fallen-world-and-world-to-come-or.html
and search down the comment thread to "Grandiosity".

There are lots of "annoying little snits" in the world,
and most of them are relatively harmlessly annoying.

But when they have the gift of the "guru whammy",
watch out. (L. Ron Hubbard. Ayn Rand. Andrew Cohen.
etc., etc.)

Did I post a link to that NY Times article from a few
days ago?

http://www.nytimes.com/2016/01/17/magazine/the-happiness-code.html
----------------
A new approach to self-improvement is taking
off in Silicon Valley: cold, hard rationality.
By JENNIFER KAHN
JAN. 14, 2016

. . .

The [Center for Applied Rationality's] three founders —
Julia Galef, Anna Salamon and [Michael] Smith — all have backgrounds
in science or math or both, and their curriculum draws heavily
from behavioral economics. . .

CFAR has been offering workshops since 2012. . .
People tend to hear about the group from co-workers. . .
or through a blog called LessWrong, associated with the artificial-intelligence
researcher Eliezer Yudkowsky, who is also the author of the
popular fan-fiction novel "Harry Potter and the Methods of Rationality."
(Yudkowsky founded the Machine Intelligence Research Institute (MIRI),
which provided the original funding for CFAR. . .)
Yudkowsky is a controversial figure. Mostly self-taught — he
left school after eighth grade — he has. . .
blogged at length about the threat of a civilization-ending A.I.
Despite this, CFAR’s sessions have become popular. . .

Salamon acknowledged that the
center’s aims are ultimately proselytic. CFAR began as a spinoff
of MIRI, which Yudkowsky created in 2000, in part to study the
impending threat posed by artificially intelligent machines, which,
he argued, could eventually destroy humanity. . .
Over the years, Yudkowsky found that people struggled to think
clearly about A.I. risk and were often dismissive of it. In
2011, Salamon, who had been working at MIRI since 2008,
volunteered to figure out how to overcome that problem.

When I spoke with Salamon, she said that "global catastrophic risks"
like sentient A.I. were often difficult to assess. . .

It was a point of view that nearly everyone at the workshop fervently
shared. As one participant told me: "Self-help is just the gateway.
The real goal is: Save the world.". . .

And while some exercises seemed useful, other parts of the workshop —
. . . the groupthink, the subtle insistence
that behaving otherwise was both irrational and an affront to "science"
— felt creepy, even cultish. . .

Another woman. . . said her commitment to rationality had
already led to difficulties with her family and friends.
(When she mentioned this, Smith proposed that she make new
friends — ones from the rationalist community.). . .

[T]he vibe was just a little strange,
what with the underlying interest in polyamory and cryonics,
along with the widespread concern that the apocalypse, in
the form of a civilization-destroying artificial intelligence,
was imminent. When I asked why a group of rationalists would
disproportionately share such views, people tended to cite the
mind-expanding powers of rational thought. . .

But the real reason, many acknowledged, was CFAR’s connection to
Yudkowsky. Compulsive and rather grandiose, Yudkowsky is known
for proclaiming the imminence of the A.I. apocalypse ("I wouldn’t
be surprised if tomorrow was the Final Dawn, the last sunrise
before the earth and sun are reshaped into computing elements")
and his own role as savior ("I think my efforts could spell the
difference between life and death for most of humanity").

When I asked Galef and Smith whether they worried that the group’s
association with Yudkowsky might be off-putting, they seemed
genuinely mystified. . .
====

Dale Carrico said...

Thank you for your generous comments and welcome to Amor Mundi. A few super-quick responses off the cuff. The "neo-reaction" is a facile, ill-digested hodgepodge of technofixation, sexism and libertopianism, beneath contempt and already past its expiration date as an online fandom strange-attractor... but the values of modesty, charity, horror at vulgarity and the refusal to reduce value to a pecuniary matter you mention are all indeed hallmarks of a valid conservative temperament that has been long displaced -- exemplified by writers from Burke to Oakeshott, with vestigial echoes of it among the bioconservatives at, say, The New Atlantis. Even though I am a person of the left, I find hold-outs of this conservative disposition congenial and companionable, rather like a glossy BBC adaptation of a Jane Austen novel or the usually sensible advice Judith Martin provides as Miss Manners, so long as we don't let the talk get around to religion or feminism.

I do want to be clear that I did not declare all policing racist, but simply deplored racist police violence where it undoubtedly occurs, as I assume anyone with sense and standards would do. Communities of color urgently need the security and support of good, responsive, accountable police. Militarization, unaccountability, structural biases, inadequate training in violence de-escalation, alienation from the communities being policed, sentencing disparities, bureaucratized school-to-prison pipelines, and the war on nonviolent drug use, all in the context of the over-saturation of public spaces with military weapons, are unquestionable problems highlighted by the current BlackLivesMatter movement, and none of them is properly summarized by the glib overgeneralization "all police are racist." I daresay you know and agree with most of that, but it is important to be clear on these things.

As you can tell from the piece I do think Yudkowsky is a ridiculous person. I've said so since he actually was a child prodigy, poor mite. I agree that Tom Slee is a fine and funny person with his head on straight; we follow one another on Twitter and trade occasional wisecracks. I know, have often conversed with, and have even enjoyably lunched with Robin Hanson, and though he is an inoffensive person as persons go, I cannot say I am impressed with his scholarship. He is scarcely distinguishable in his actual views from fulminating market libertarian ideologues and robocultic futurologists of the singularitarian bent, but he pulls his punches to maintain a measure of public respectability. I am happy you have found your way here, and I recommend my Futurological Discourses and Posthuman Terrains piece as the best, concise formulation of my perspective if you want a good idea of where I am coming from. Again, welcome to you!

jimf said...

So a couple of years ago we had the Sad Puppies
https://en.wikipedia.org/wiki/Sad_Puppies
and now we have the "Rationalist Puppies"
http://slatestarscratchpad.tumblr.com/post/125635987521/the-darkest-timeline

http://asocratesgonemad.tumblr.com/post/138311551872/just-recieved-an-email-to-hpmor-subscribers-from
--------------
Just received an email (to HPMOR subscribers) from Eliezer Yudkowsky
about a new light novel he’s self-publishing. . .

Also, he encouraged the recipient to go to WorldCon and
nominate H[arry]P[otter and the]M[ethods]O[f]R[ationality]
for best novel. I don’t know why he’s
refusing to settle for best fanwork (isn’t that a category?),
but then again HPMOR *has* become really ubiquitous in nerd
circles. I’m just worried that he’s tossing out a more achievable
Schelling point through needless direct coordination.
Yudkowsky is nothing if not an Icarus. . .
====

And he's not even looking over his shoulder wondering when
J. K. Rowling and her lawyers are finally going to get fed up
with this nonsense?

Ellie K said...

Dale, you write beautifully. Thank you so much for taking the time to read and reply to my comment. I hope you enjoyed the satirical video of Big Yud. I think you did :o) There needs to be a lot more trolling and cutting down to size of Big Yud. He and people like Bryan Caplan (and don't forget Justin Wolfers) have weird fanboy-like followings of impressionable young minds. It is really sad and troubling to me. I don't know why EY and his libertarian sponsors are so alluring to earnest, bright and very young people.

Hello jimf! I remember Sad Puppies. I perused some of the links you kindly provided and was unnerved and saddened by the level of obsession that some of tumblr has with Big Yud; see this in particular:
http://nostalgebraist.tumblr.com/tagged/big-yud ...pages and pages of HPMOR obsession. Of course EY self-published his newest light novel! It is a cult, the Less Wrong and Slate Star Codex crowd. I see them as wannabe lackeys of the plutocracy, but plenty of them consider themselves liberal progressives. They like green energy, open marriages and open borders, and tell me how wonderful Uber, bitcoin and P2P are. I don't know what I am exactly, but I am not a devotee of Big Yud!

I need to go to sleep, as I need to be at work early in the morning. One more quick thought for Dale. There are some other people I follow on Twitter whom I think you might like: Frank Pasquale, David Golumbia, Dan Kervick and Adam H. Johnson. All are kind of lefty, yet have aspects of that conservative temperament that you described so fluently.