Using Technology to Deepen Democracy, Using Democracy to Ensure Technology Benefits Us All

Tuesday, February 05, 2013

Engaging Robot Cultists on the Specifics

Upgraded and adapted from the Moot, Mitchell asserts "Dale doesn't engage with the specifics of L[ess] W[rong-ism] very much."

People caught in the gravity well of this or that sect of the Robot Cult endlessly make this point. Once one determines that a futurological discourse or program is ridiculous, reactionary, dangerous, or otherwise off the rails, its adherents are actually no longer in a position to dictate the terms on which the "specifics" are debated.

To go down the Rabbit Hole with Singularitarian Robot Cultists and debate the Robot God Odds on their preferred terms consolidates their discourse even if one assumes a contrarian position in the debate. It confers on them a legitimacy that they could not otherwise produce from their actual position of marginality in relation to actual technoscientific and technodevelopmental consensus.

Within their pocket universe the Singularitarians may endlessly reassure themselves that theirs is a vital and legitimate movement/program but that is of course a delusion. In fact, they are a serially failed, and always only failing, fundamentally conceptually flawed, palpably symptomatic, minute marginal subculture saying the most patent nonsense on a regular basis. This remains true even if sometimes some of them say things that also aren't nonsense, as even stopped clocks are right twice a day, and even complete fools can tell you where the coffee is kept.

What Robot Cultists think are the important "specifics" are often not the "specifics" that happen to be the most important from the vantage of one who is observing and analyzing or criticizing their discourse as an object rather than as a Believer whose needs it is meeting (at whatever cost).

Even where Robot Cultists occasionally talk about real scientific results or developmental issues here and there, who in their right mind would choose a Robot Cultist as a serious interlocutor in such discussions? Superlative futurological faith-based initiatives for techno-transcendence essentially extrapolate from, project from, hyperbolize, or stress their logical compatibility with some real, qualified, circumscribed scientific results, none of which get them remotely where they think they want to go -- and they lean as well on the accident that endlessly many of the results that must be discovered between where we are and where they think they want to be have simply not yet disappointed their hopes, as surely many will. Setting aside considerations of political priorities and the vicissitudes of stakeholder struggles, funding, regulation, education, distribution, infrastructural affordances, and the play of accidents and passions in actual technodevelopment, I personally disagree that the "technical questions" and "central debates" within their orthodoxies are really, substantially, the specifics they are pretending to be in any case.

What matters for the Robot Cultists are less the actual results than the rhetorical significances that idiosyncratically attach to these results in their discourse. It is not surprising that True Believers and futurological fans do not share my perspective on the rhetorical operations in play within their arguments and aspirations, or on the salience, in my estimation, of their connection to mainstream neoliberal, competitive-eugenic, consumer-fetishistic assumptions or theological frames of which they may or may not individually disapprove. Contrarians who snipe at the Robot Cultists primarily in terms only the Cultists themselves understand seem to me to be functioning more or less as a loyal opposition in service to Robot Cultism. That may be a perfectly respectable way to get your kicks, especially if you don't make their mistake of confusing science fiction with science practice or wish-fulfillment fantasizing with thought-experiments, but I personally have no interest in performing such a function.

I engage with Robot Cultism in terms of the specifics that actually matter by my own determination.

Given the citational richness of my critiques, no one can honestly pretend I am not highly knowledgeable about, and materially engaged with, the actual textual specificities of the Robot Cultist texts I critique. It is one of the characteristics of their cultishness that adherents and camp followers disdain as non-engagement an insistent engagement that simply does not attach to the specifics under scrutiny the same significance they do.

19 comments:

jimf said...

> Once one determines that a futurological discourse or program
> is ridiculous, reactionary, dangerous, or otherwise off the
> rails, its adherents are actually no longer in a position to
> dictate the terms on which the "specifics" are debated.

And, of course, they return the favor, in spades.

Your field of expertise, rhetoric, is dismissed by the
LessWrongians as one of the "Dark Arts" -- a term of art
among them.
http://wiki.lesswrong.com/wiki/Dark_arts
(Apparently there's nothing ironic intended in the sentence
in the above article "Such effects can be caused by something as
benign as the use of a specialist vocabulary which the target is
unfamiliar with. . .")

See also
http://lesswrong.com/lw/9iw/the_dark_arts_a_beginners_guide/

http://lesswrong.com/lw/3k/how_to_not_lose_an_argument/
--------------------
The science of winning arguments is called Rhetoric, and it
is one of the Dark Arts. Its study is forbidden to rationalists,
and its tomes and treatises are kept under lock and key in
a particularly dark corner of the Miskatonic University library.
More than this it is not lawful to speak.

What Dale refers to as "mainstream science" is **itself**
dismissed by the uber-rationalists at LessWrong. For them,
there is only Bayesian inference. It involves writing
down explicit probabilities, and then performing explicit
calculations to update them. People who are adept at this
can apparently dispense with the scientific community,
perform "low N" experiments, and decide, for example,
whether nootropic supplements work for them:
http://www.gwern.net/Nootropics
(looks like a lot of work).
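
The mechanics, for what it's worth, are nothing more exotic than
iterated applications of Bayes' rule. A minimal sketch in Python --
mine, to be clear, not gwern's or LW's, with every number invented
for illustration -- of what "updating" on a low-N self-experiment
amounts to:

---------------
# Explicit Bayesian updating: start from a prior probability for a
# hypothesis H, then apply Bayes' rule once per observation.

def bayes_update(prior, p_obs_if_h, p_obs_if_not_h):
    """Return P(H | observation) from P(H) and the two likelihoods."""
    numerator = p_obs_if_h * prior
    evidence = numerator + p_obs_if_not_h * (1.0 - prior)
    return numerator / evidence

# H: "this nootropic works for me." Each True is a day that felt
# sharper. Hypothetical likelihoods: such days occur with
# probability 0.7 if H holds, 0.5 if it doesn't.
p_h = 0.3  # invented prior
for felt_sharper in [True, True, False, True]:
    if felt_sharper:
        p_h = bayes_update(p_h, 0.7, 0.5)
    else:
        p_h = bayes_update(p_h, 1.0 - 0.7, 1.0 - 0.5)

print("posterior P(H) = %.3f" % p_h)  # drifts upward with good days
---------------

Whether four data points license any conclusion at all is, of
course, exactly the "low N" worry.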

Also, seeing (or worse, **looking for** -- Confirmation Bias!
Confirmation Bias!) similarities between LWianism
and Scientology, or Objectivism (or other SFnal "cults
of rationality" you might think of -- NXIVM, anyone?)
is dismissed as mere "pattern recognition" -- a lazy man's
substitute for reason.

jimf said...

Speaking of pattern recognition, try
reading Lawrence Wright's _Going Clear_, and then see
if this comment sequence doesn't "pattern match" to
something pretty unsavory:

http://lesswrong.com/lw/78s/help_fund_lukeprog_at_siai/4otj
---------------
SilasBarta
24 August 2011 3 points

Is there a post requesting volunteer help with this administrative task?
===

Eliezer_Yudkowsky
24 August 2011 7 points

. . .Why, are you volunteering to administrate them?
===

SilasBarta
24 August 2011 0 points

Sure, I'd love to! (I thought I didn't qualify to
volunteer for SIAI?) . . .
===

AuthorityFigure
25 August 2011 12 points

> Sure, I'd love to! (I thought I didn't qualify to
> volunteer for SIAI?)

LOL. Way to play up the role of the passive-aggressive outsider.
===

OptimalFAI
25 August 2011 6 points

Very true.

In the interest of optimizing our rationality I think that
we need to continue to call out instances "community distancing"
such as the one exhibited by Silas above.

The reason for doing so? It lets the dissenters know that a
community can tolerate and appreciate criticism but not the creation
of a lone wolf character. Lone wolves do not contribute to a
community and instead impede our advances in rationality by drawing
conversations back to their status. As such, their status seeking
should be pointed out and skepticism should be attached to their
future postings.

Passive-aggressive comments in particular are troublesome because
these types eventually find ways to disrupt substantive threads
by reminding others of their loner status and their unacknowledged
genius. Their resentment then leads to them mocking key figures
in a community (note Silas' comments to both Eliezer and Luke).

Perhaps LW needs a mini-sequence on acceptable and non-acceptable
signaling within a rational community.
===

To the Ethics Office! Go!

(They need to come up with a good shtick that corresponds
to an E-meter. Maybe Dave Asprey could help them
cook up some kind of biofeedback device -- with
a fancy software interface, of course.)

jimf said...

> What Dale refers to as "mainstream science" is **itself**
> dismissed by the uber-rationalists at LessWrong. . .

A commenter on another forum characterized the LW orthodoxy
on this topic as follows (I'll leave it to you guys to
guess to whom the pronoun refers):

http://www.poe-news.com/forums/spshort.php?pi=1002433082&ti=1002430709
---------------
His 'outsider's wisdom' is really grating

He keeps talking in dismissive tones about the 'average scientist,'
and how in order to make progress, we need to be many, many times
smarter than this 'average scientist.' And he isn't just talking
about AI researchers either, he's also talking about the entire
discipline of physics. Although he never quite says it, he keeps
hinting that he is just such a person, and is in fact better at
computer science and physics than most actual professionals.

There is an interesting disconnect here, because it's very plain
that he hasn't written a line of actual AI code, and he could
never claim to be able to reproduce or surpass the work of pretty
much any theory group (I will eat my hat if [he] can even
calculate the spectra of H2). The vibe that I get from him is that
he thinks he could beat these guys at their own game if he ever
put forth the effort.

Of course, actually enrolling in a program and doing substantial
work runs the risk of him discovering that he isn't really some
kind of wunderflake snowkind. So, instead of applying himself
and discovering that he can't actually single-handedly change
the course of science, he has constructed a giant tower of
unrealized potential from which he sneers down at people doing
real work.
---------------

This was in a long thread at "PoE News".
Another post in the same thread, by a character with the
handle "Xiphias", is actually linked to on the LessWrong Talk
page at RationalWiki ( http://rationalwiki.org/wiki/LessWrong ).
"Pretend there is a website of trans-accountants who have never
had an accounting job nor had any education in accounting."
It is a personal evaluation both of LessWrong and of
the person whose personality cult LW has often been accused
of being a vehicle for. It is not a restrained commentary. ;->
http://www.poe-news.com/forums/spshort.php?pi=1002432591&ti=1002430709

http://samvak.tripod.com/journal63.html
---------------
These – the lack of empathy, the aloofness, the disdain,
the sense of entitlement, the restricted application of humour,
the unequal treatment and the paranoia – make the narcissist a
social misfit. The narcissist is able to provoke in his milieu,
in his casual acquaintances, even in his psychotherapist,
the strongest, most avid and furious hatred and revulsion.
To his shock, indignation and consternation, he invariably
induces in others unbridled aggression. . .

One feels ill at ease in the presence of a narcissist for
no apparent reason. No matter how charming, intelligent,
thought provoking, outgoing, easy going and social the
narcissist is – he fails to secure the sympathy of his
fellow humans, a sympathy he is never ready, willing,
or able to grant them in the first place.
---------------

It has been said of both Ayn Rand and L. Ron Hubbard that
they created their respective "philosophies" as very public
attempts at self-psychotherapy, in reaction to the marked psychopathologies
from which they suffered.

I think LW is treating the world to the same sort of spectacle.

jollyspaniard said...

They already have their equivalent of an E-meter: they misuse mathematics to make life decisions. That misuse of mathematics is their magic black box.

They're still getting off the ground. Real Scientologists have a lot more jargon and official policy references they toss around. They've got 50 years of accumulated BS canon to work with. The LessWrong guys are positively parched in comparison, but they're working on it.

jollyspaniard said...

One interesting note: Scientology has changed the auditing thing nowadays. A lot of people can buy the machine and self-audit. In that respect Scientology is imitating its decentralized imitators.

jimf said...

> One interesting note: Scientology has changed the auditing thing
> nowadays. A lot of people can buy the machine and self-audit.

Not **legally**, surely! (I know people have put E-meters up
on eBay, but I thought the Scientologists tried to keep after those
offers and have them taken down.)

There certainly are "do-it-yourself" Scientologists in the world --
in fact, "real" Scientologists call them "squirrels".

It's an interesting observation in _Going Clear_ that in the beginning,
when Hubbard published _Dianetics_, it **was** offered as an
essentially do-it-yourself technique that anybody who read
the book could apply at home. This was "corrected" when Dianetics
became Scientology -- after that, the **only** approved "bridge to total
freedom" became Hubbard-approved "tech" that could **only** be
had via the official sequence of paid-for Scientology courses.

I can't imagine that's changed. That's where all the $$$ comes
from!

jollyspaniard said...

There was a court case in France in which an ex-Scientologist sued them for a refund for her home auditing kit, which cost her a bundle. They still sell expensive auditing sessions, though, which I imagine are marketed as being more effective.

jimf said...

Speaking of Scientology, though, it seems that
Luke Muehlhauser, current executive director
of the Singularity Institute[*], who was an evangelical
Christian until he lost his faith at the beginning of
2008, poked his nose in the Scientologists' tent and
then withdrew it.
( http://lesswrong.com/lw/58m/build_small_skills_in_the_right_order/ )

The Appeal of Scientology and an Atheist Church
by Luke Muehlhauser
April 12, 2009
http://commonsenseatheism.com/?p=1357

My Opinion of Scientology
by Luke Muehlhauser
February 5, 2010
http://commonsenseatheism.com/?p=6044

"Around age 22 I wanted nothing more than to be like Jesus to
a lost and hurting world."
-- "Facing the Singularity 1: Personal Motivations"
by Luke Muehlhauser
November 24, 2011
http://commonsenseatheism.com/?p=16256

"About Me. . .
My name is Luke Muehlhauser.
I grew up an evangelical Christian in Minnesota, USA.
At age 21, I began to study the Historical Jesus and
the philosophy of religion. I lost my faith in January 2008.
Now, I’m an outspoken advocate of rational thinking and naturalism.

I try not to have 'heroes,' but here are some people I admire:

- Noam Chomsky
- Mohandas Gandhi
- Nick Bostrom
- George Orwell
- Eliezer Yudkowsky"
http://commonsenseatheism.com/?page_id=3

[*] They've lost their brand, though. SI is now
MIRI, the "Machine Intelligence Research Institute".

Tell 'em Jim, tell 'em Jim, tell 'em Jim, . . .

;->

jimf said...

> -- "Facing the Singularity 1: Personal Motivations"
> by Luke Muehlhauser
> November 24, 2011
> http://commonsenseatheism.com/?p=16256

Cf.
http://rationalwiki.org/wiki/Talk:LessWrong/Archive2
------------------
. . .
[I]t is rather depressing to see how awareness of own reasoning
problem ( http://commonsenseatheism.com/?p=16256 ) doesn't
translate into practical solution (staying the hell away from
people who promise you grand purpose). . .
Dmytry, 16 May 2012
------------------

BTW, one of the commentators about LW at RationalWiki
has remonstrated with the other LW critics there about
the comparison to Scientology (invoking a kind of "Godwin's Law"):

http://rationalwiki.org/wiki/Talk:LessWrong/Archive4
-------------------
LessWrong is nothing like Scientology, and that's a completely silly
comparison. And trivialises Scientology the way casually comparing people
to Hitler trivialises the Nazis and what they did.

I'll note here that I happen to know really lots and lots about Scientology
so I can speak knowledgeably about the comparison. . .

Scientology is basically the Godwin example of cults; comparing any
damn thing to Scientology makes actually abusive cults that are not
as bad as Scientology seem benign. LessWrong is not Scientology.
Not even slightly. . .

Their stupidities are stupid, their fans are fanboys, their good bits
are fine, that's quite sufficient. The stupidities fully warrant
horselaughs. . . [but y]ou're mistaking stupidity
for malice, and this is leading you to make silly comparisons

David Gerard
21 June 2012
===


@David Gerard: I don't know much about Scientology, but I do know a
little about Objectivism, and I don't think the comparison between
LW and Objectivism is misplaced. Is Objectivism a cult? Objectivists
are not going to commit mass-suicide anytime soon, and I'd guess most
people find Objectivism through the Internet these days (in academia,
their views are fringe, just like many of those popular on LW).
I think the word "cult" may give the wrong connotations. Take a look
at Shermer's "The Unlikeliest Cult In History" and see how well it
matches LW. Still, my main motivation to oppose LW is that it promotes
crankery and pseudoscience, and unfortunately is attracting otherwise
smart people. . .

--Baloney Detection
27 June 2012
===

YMMV.

jimf said...

From the comment thread at
http://amormundi.blogspot.com/2009/04/its-more-than-fun-to-ridicule.html

> [T]here has been a giant transfer of time, attention, and resources
> from reality to fantasy. Rather than pursuing the American dream,
> people are simply dreaming. . .

"Four Years Later" [a story of the near future]
Date: Fri Apr 19 2002
http://acceleratingfuture.com/sl4/archive/0204/3384.html

The date is April 19, 2006 and the world is on the verge of something
wonderful. The big news of the last twelve months is the phenomenal success
of Ben Goertzel's Novamente program. It has become a super tool for solving
complex problems. . . "[M]iracle" cures for one major disease after
another are being produced on almost a daily basis. . .
[T]he success of the Novamente system has made
Ben Goertzel rich and famous making frequent appearances on the talk show
circuit as well as visits to the White House. One surprise is the fact that
the System was unable to offer any useful advise to the legal team that
narrowly fended off the recent hostile take over attempt by IBM. The
Novamente phenomen[on] has triggered an explosion of public interest and
research in AI. Consequently, the non-profit organization The Singularity
Institute for Artificial Intelligence has been buried under an avalanche of
donations. In their posh new building in Atlanta we find Eliezer working
with the seedai system of his own design. . .
===

Hm. So what was happening here around then? Ah,
Friday, April 21, 2006
Does Technophilia Incline Right?
http://amormundi.blogspot.com/2006/04/does-technophilia-incline-right.html

;->

Mitchell said...

Dale appears to be responding to a point that I wasn't making. Someone had asked if Dale is known at LW, and I said Alexander Kruel is much better known, because he's more engaged with the details.

jimf said...

> Dale appears to be responding to a point that I wasn't making.
> Someone had asked if Dale is known at LW, and I said
> Alexander Kruel is much better known, because he's more
> engaged with the details.

And Mr. Kruel has (over the course of time, I gather)
become **critical** of the details
( http://kruel.co/2012/07/17/siailesswrong-critiques-index/ ),
which suggests that his contributions on LW (**whatever**
they are) will eventually be downvoted into invisibility,
just because he's now seen as a "troll".

It would hardly be profitable for Dale to attempt to discuss
the technicalities of Solomonoff Induction, Kolmogorov
Complexity, Marcus Hutter's AIXI, the Many-Worlds
Interpretation of quantum mechanics, or any of the other
arcana about which the amateurs on LW pretend expertise,
either on Amor Mundi or on LessWrong. He **doesn't have to**
in order to make a cogent critique of them.

However, even people who **are** (or at least make a good show
of seeming to be) informed enough to **differ** with LW orthodoxy
on their favorite subjects meet a predictable fate:

http://www.reddit.com/r/LessWrong/comments/17y819/lw_uncensored_thread/
-----------------
EliezerYudkowsky 1 point 12 hours ago*

. . .

Also, why are you responding to a known troll? Why are you reading
a known troll? You should be able to predict that they will horribly
misrepresent the position they are allegedly arguing against,
and that unless you know the exact true position you will be
unable to compensate for it cognitively. This (combined with actual
confessions of trolling, remember) is why I go around deleting
private-messaging's [Dmytry Lavrov's] comments on the main LW.
===

The introduction to the above page, BTW, is (no doubt unintentionally)
entertaining in itself:

-----------------
This is meant to be an uncensored thread for LessWrong, someplace
where regular LW inhabitants will not have to run across any
comments or replies by accident. Discussion may include
information hazards,
[ http://wiki.lesswrong.com/wiki/Deletion_policy#Information_hazards. ]
egregious trolling, etcetera, and since I expect this
to actually happen, I would frankly advise all LW regulars not
to read this. That said, local moderators are requested not to
interfere with what goes on in here (for that matter, I wouldn't
suggest looking at it, period).

If any local moderators or anyone else can think of a reason why
this is a terrible idea, please comment here or on the corresponding
LessWrong discussion post.

My understanding is that this should not be showing up in anyone's
comment feed unless they specifically choose to look at this post,
which is why I'm putting it here instead of LW. If I'm wrong about
this, please let me know quickly.
-----------------

I suppose eventually the uber-rationalists at LW will be
able to install net nanny software (like the Scientologists
allegedly use) to filter out irrationality (or "information
hazards") on the Web at large. Including -- **of course** --
this very blog. ;->

jimf said...

> One feels ill at ease in the presence of a narcissist. . .

Cluelessness:
http://lesswrong.com/lw/9gy/the_singularity_institutes_arrogance_problem/

Dale Carrico said...

> Dale appears to be responding to a point that I wasn't making.

I was responding to a point incidental to a point you were making that didn't interest me as much. But I meant and mean no disrespect by that.

jimf said...

What Mitchell Porter **really** thinks (and wonders)
about LessWrong.

http://lesswrong.com/lw/4vb/can_we_stop_using_the_word_rationalism/
----------------
Mitchell_Porter 19 March 2011 54 points

...

I occasionally ponder what LW's objective place in the scheme
of things might be. Will it ever matter as much as, say, the
Vienna Circle? Or even just as much as the Futurians? - who
didn't matter very much, but whose story should interest
the NYC group. The Futurians were communists, but that was
actually a common outlook for "rationalists" at the time,
and the Futurians were definitely future-oriented.

Will LW just become a tiresome and insignificant rationalist
cult? The more that people want to conduct missionary activity,
"raising the sanity waterline" and so forth, the more that
this threatens to occur. Rationalist evangelism from LW might
take two forms, boring and familiar, or eccentric and cultish.
The boring and familiar form of rationalist evangelism could
encompass opposition to religion, psych 101 lectures about
cognitive bias, and tips on how optimism and clear thinking can
lead to success in mating and moneymaking. An eccentric and cultish
form of rationalist evangelism could be achieved by combining
cryonics boosterism, Bayes-worship, insistence that the many-worlds
interpretation is the only rational interpretation of
quantum mechanics, and the supreme importance of finding the
one true AI utility function.

It could be that the dominant intellectual and personality tendencies
here - critical and analytical - will prevent serious evangelism
of either type from ever getting underway. So let's return for a
moment to the example of the Vienna Circle, which was not much of
a missionary outfit. It produced a philosophy, logical positivism,
which was influential for a while, and it was a forum in which minds
like Godel and Wittgenstein (and others who are much lesser known
now, like Otto Neurath) got to trade views with other people who
were smart and on their wavelength, though of course they
had their differences.

Frankly I think it is unlikely that LW will reach that level.
The Vienna Circle was a talking shop, an intellectual salon,
but it was perhaps one in ten thousand in terms of its lucidity
and significance. Recorded and unrecorded history, and the
Internet today, is full of occasions where people met, were
intellectually sympatico, and managed to elaborate their
worldview in a way they found satisfactory; and quite often,
the participants in this process felt they were doing something
more than just personally exciting - they thought they were
finding the truth, getting it right where almost everyone
else got it wrong.

I appreciate that quite a few LW contributors will be thinking,
I'm not in this out of a belief that we're making history;
it's paying dividends for me and my peers, and that's good enough.
But you can't deny that there is a current here, a persistent
thread of opinion, which believes that LW is extremely important
or potentially so, that it is a unique source of insights, a workshop
for genuine discovery, an oasis of truth in a blind or ignorant
world, etc.

jimf said...

(Mitchell_Porter's comment from the link above, continued:)

Some of that perception I believe is definitely illusory, and
comes from autodidacts thinking they are polymaths. That is,
people who have developed a simple working framework for many
fields or many questions of interest, and who then mistake
that for genuine knowledge or expertise. When this illusion
becomes a collective one, that is when you get true intellectual
cultism, e.g. the followers of Lyndon Larouche. Larouche
has an opinion on everything, and so to those who believe him
on everything, he is the greatest genius of the age.

Then, there are some intellectual tendencies here which, if not
entirely unique to LW, seem to be expressed with greater strength,
diversity, and elaboration than elsewhere. I'm especially thinking
of all the strange new views, expressed almost daily, about
identity, morality, reality, arising from extreme multiverse
thinking, computational platonism, the expectation of uploads...
That is an area where I think LW would unquestionably be of
interest to a historian of technological subcultural belief.
And I think it's very possible that some form of these ideas
will give rise to mass belief systems later in this century -
people who don't worry about death because they believe in
quantum immortality, popular ethical movements based on some
of the more extreme or bizarre conclusions being deduced from
radical utilitarianism, Singularity debates becoming an element
of political life. I'm not saying LW would be the source of
all this, just that it might be a bellwether of an emerging
zeitgeist in which the ambient technical and cultural environment
naturally gives rise to such thinking.

But is there anything happening here which will contribute to
intellectual progress? - that's my main question right now.
I see two ways that the answer might be yes. First, the ideas
produced here might actually be intellectual progress; second,
this might be a formative early experience for someone who
went on to make genuine contributions. I think it's likely
that the second option will be true of someone - that at least
one, and maybe several people, who are contributing to this
site or just reading it, will, years from now, be making
discoveries, in psychology or in some field that doesn't yet
exist, and it will be because this site warped their sensibility
(or straightened it). But for now, my question is the first
one: is there any intellectual progress directly occurring here,
of a sort that would show up in a later history of ideas?
Or is this all fundamentally, at best, just a learning experience
for the participants, of purely private and local significance?

jimf said...

And Alexander Kruel, then and later.

http://lesswrong.com/lw/4vb/can_we_stop_using_the_word_rationalism/
----------------
XiXiDu 19 March 2011 18 points

LW is nearly perfect but does lack self-criticism. I love
self-criticism and I perceive too much agreement to be boring.
One of the reasons why there is so much agreement here is
not that there is nothing wrong but that people who strongly
disagree either don't bother or are deterred by the reputation
system. How do I know that? The more I read the more I learn
that a lot of the basic principles here are not as well-grounded
as the commitment of the community would suggest. Recently
I wrote various experts in an effort to approach some kind
of 'peer-review' of LW. I got replies from people as diverse
as Douglas Hofstadter, Greg Egan, Ben Goertzel, David Pearce,
various economists, experts and influencer's. The overall
opinion so far is not so much in favor of this community.
Regarding the reputation system? People told me that it is
one of the reasons why they don't bother to voice their
opinion and lurk, but you could just read the infamous RationalWiki entry
[ http://rationalwiki.org/wiki/LessWrong#The_ugly ]
to get an idea of the general perception (although it improved
since my comment here,
[ http://lesswrong.com/lw/4g/eliezer_yudkowsky_facts/31yg ]
which they pasted into the talk page). I tried a few times to
question the reputation system here myself or ask if there
are some posts or studies showing that such systems do
subdue trolling but not at the price of truth and honesty,
that reputation systems do not cause unjustified conformity.
Sadly the response is often downvotes mixed with angry replies.
Another problem is the obvious arrogance here which is getting
more distinct all the time. There is an LW versus rest of
the world attitude. There is LW and then there are the irrational,
ordinary people. That's just sad and I'm personally appalled by it.

Here is how some people described LW when I asked them about it:

> ...a lot of impressive-sounding jargon and slogans, and
> not everything they say is false and foolish, but in my view
> they've just sprinkled enough mathematics and logic over
> their fantasies to give them a veneer of respectability.

or

> ...they are naïve as far as the nature of human intelligence
> goes. I think they are mostly very bright and starry-eyed
> adults who never quite grew out of their science-fiction
> addiction as adolescents. None of them seems to have a realistic
> picture about the nature of thinking...

Even though I am basically the only person here who is often openly
derogatory about this community, people seem to perceive it as too
much already. I am apparently just talking about the same old problems
over and over. Yet I've only been posting here since August 2010.
The problems have not been fixed. There are problems like the
increasing and unjustified arrogance, lack or criticism (let
alone peer-review) and an general public relations problem
(Scientology also gets donations ;-). But those problems don't matter.
What is wrong and what will probably never change is that mere
ideas are sold as 'laws' which are taken seriously to a dangerous
degree by some individuals here. This place is basically breeding
the first group of rationalists committed to do everything in
the name of expected utility. I think that is not only incredible
scary but also causes distress in people who are susceptible to
such thinking.

> ... this might be a formative early experience for someone
> who went on to make genuine contributions.

LW is certainly of great value and importance and I loved reading a
lot of what has been written so far. I would never suggest that LW is
junk but as long as it has the slightest problem with someone coming
here and proclaiming that you are all wrong then something is
indeed wrong.

jimf said...

As time goes by:

http://lesswrong.com/lw/aw7/muehlhausergoertzel_dialogue_part_1/
------------------
XiXiDu [Alexander Kruel]
17 March 2012 0 points

I notice the same in this dialogue that I notice when Eliezer Yudkowsky
talks to other people like Robin Hanson or Massimo Pigliucci.
Or when people reply to me on Less Wrong. There seems to be a fundamental
lack of understanding of what the other side is talking about.

. . .
===

In response to Muehlhauser-Goertzel Dialogue, Part 1
XiXiDu [Alexander Kruel]
17 March 2012 -15 points

If you want to talk to Less Wrong you have to be a fucking robot...
a psychopathic robot. Colloquial language and an informal style
are a red rag to them.
===

Will_Newsome 19 March 2012 4 points

XiXiDu, hear me: **LessWrong has contemptibly bad epistemic habits.**
They're kind of retarded and they don't realize it. Continued
disagreement with them is perfectly okay. If they can't satisfactorily
back up their claims that probably means they're way overconfident
in them. You shouldn't feel super stressed all the time just because
a group of self-aggrandizing nutjobs on the internet disagrees with
you about semi-metaphysical Far mode bullshit.
===

XiXiDu 17 March 2012 -10 points

. . .

> my own blog posts about AGI were being trolled by SIAI zealots
> (not the principals, I hasten to note) leaving nasty comments
> to the effect of “SIAI has proved that if OpenCog achieves human
> level AGI, it will kill all humans.“

Well, yeah. If you say anything critical the gang comes and calls
you a troll and what not. But never do they actually argue for their case.
===

ArisKatsaris 17 March 2012 10 points

> If you say anything critical the gang comes and calls you a troll and what not.

. . .

I for one, can no longer trust that anything you say is sincere or that
you even care about it one way or another. I start with assumptions of
honesty from other participants, but when such trust is violated,
you can't easily regain it.
===

And so it goes. . .

jimf said...

On the other hand:

http://rationalwiki.org/wiki/Talk:LessWrong
------------
Aris Katsaris
21 January 2013

Interesting irony: A mere two years ago (December 2010)
Kruel was calling rationalwiki "awful" and proclaimed LW
the most intelligent and rational community he knew of.
"I'm curious if you know of a more intelligent and rational
community than Less Wrong? I don't." And back then it was
me who criticized him on his unjust judgment on rationalwiki.
===

XiXiDu
21 January 2013

1.) LW is still the most rational and intelligent community
I know of. 2.) I still agree with almost everything of what
Yudkowsky wrote and which I have read and extrapolate to
be in agreement with him about mostly everything
3.) The RationalWiki entry has improved since I wrote that
comment. 4.) I now disagree with the part of my comment
referring to the Roko basilisk. It should be mentioned explicitly.
5.) Points #1,2 do not contradict with any of my criticism of
SI/LW/Yudkowsky.
===

I'm in love!
I'm a believer, I couldn't leave her if I tried. . .

YMMV. ;->