I think the Sequences got everything right and I agree with them completely... Even the controversial things, like: I think the many-worlds interpretation of quantum mechanics is the closest to correct and you're dreaming if you think the true answer will have no splitting (or I simply do not know enough physics to know why Eliezer is wrong, which I think is pretty unlikely but not totally discountable). I think cryonics is a swell idea and an obvious thing to sign up for if you value staying alive and have enough money and can tolerate the social costs. I think mainstream science is too slow and we mere mortals can do better with Bayes. I am a utilitarian consequentialist and think that if you allow someone to die through inaction, you're just as culpable as a murderer. I completely accept the conclusion that it is worse to put dust specks in 3^^^3 people's eyes than to torture one person for fifty years. I came up with it independently, so maybe it doesn't count; whatever. I tentatively accept Eliezer's metaethics, considering how unlikely it is that there will be a better one (maybe morality is in the gluons?). "People are crazy, the world is mad," is sufficient for explaining most human failure, even to curious people, so long as they know the heuristics and biases literature.

Yes, of course it is ridiculous to pretend that the many worlds interpretation is so non-problematic and non-controversial that one would have to be "dreaming" to entertain the possibility that it may one day be supplanted by a better theory that looks more like alternatives already on offer -- and, yes, it is especially ridiculous to pretend so on the basis of not knowing more about physics than a non-physicist high school drop-out guru-wannabe who thinks he is leading a movement to code a history-shattering Robot God who will solve all our problems for us any time soon.
Yes, of course it is ridiculous to believe that your frozen, glassified, hamburgerized brain will be revived and sooper-enhanced and possibly immortalized by swarms of billions of robust reliably controllable and programmable self-replicating nanobots, and/or your info-soul "migrated" via scanning into a cyberspatial Holodeck Heaven where it will cavort bug-and-crash-and-spam free for all eternity among the sexy sexbots.
Yes, of course it is ridiculous to imagine that non-scientists in an online Bayes-Theorem fandom can help accomplish warranted scientific results faster than common or garden variety real scientists can themselves by running probability simulations in their club chairs or on computer programs in addition to or even instead of anybody engaging in actually documentable, repeatable, testable experiments, publishing the results, and discussing them with people actually qualified to re-run and adapt and comment on them as peers.
Yes, of course it is ridiculous to think of oneself as the literal murderer of every one of the countless distant but conceivably reachable people who share the world with you but are menaced by violence, starvation, or neglected but treatable health conditions even if it is true that not caring at all about such people would make you a terrible asshole -- and, yes, it is ridiculous to fall for the undergraduate fantasy that probabilistic formulae might enable us to transform questions of what we should do into questions of fact in the first place.
Yes, of course it is ridiculous to say so many nonsensical things and then declare the rest of the world mad.
Yes, it is ridiculous that the very same Eliezer Yudkowsky treated as the paragon against whose views all competing theories of physics are measured is endorsed a few sentences later as the meta-ethical paragon compared to whose views all competing moral philosophies are judged wanting. Sure, sure, your online autodidact high priest deserves the Nobel Prize for Physics and the Nobel Peace Prize on top of it, in addition to all that cash that libertopian anti-multiculturalist reactionary and pop-tech CEO-celebrity Peter Thiel keeps giving him for being an even better Singularipope than Kurzweil. Who could doubt it?
Perhaps grasping the kind of spectacle he is making of himself, our True Believer offers up this defensive little bit of pre-emptive PR-management in his post (not that it yields any actual qualification of the views he espouses or anything): "This of course makes me a deranged, non-thinking, Eliezer-worshiping fanatic for whom the singularity is a substitute religion." Hey, pal, if the shoe hurts, you're probably wearing it.
By the way, if anybody is wondering just what The Sequences are, you know, the ones that presumably "get everything right" -- no, nothing culty there -- they are topical anthologies of posts that have appeared on Less Wrong (major contributions written by, you guessed it, Eliezer Yudkowsky, naturellement) and function more or less as site FAQs with delusions of grandeur. While not everything in The Sequences is wrong, little that isn't wrong in them isn't also widely grasped and often endorsed by all sorts of folks who aren't also members of Robot Cults who think they are the only ones who aren't wrong, er, are "less wrong" -- which is the usual futurological soft shoe routine, after all.
Inspired by the aggressive-defensive post I have been dissecting so far, another True Believer offered up -- again, all in good fun, right, right? -- the following intriguing, revealing Robot God Apostle's Creed for the Less Wrong Throng, which I reproduce here for your delight and edification:
I believe in Probability Theory, the Foundation, the wellspring of knowledge,
I believe in Bayes, Its only Interpretation, our Method.
It was discovered by the power of Induction and given form by the Elder Jaynes.
It suffered from the lack of priors, was complicated, obscure, and forgotten.
It descended into AI winter. In the third millennium it rose again.
It ascended into relevance and is seated at the core of our FAI.
It will be implemented to judge the true and the false.
I believe in the Sequences,
Many Worlds, too slow science,
the solution of metaethics,
the cryopreservation of the brain,
and sanity everlasting.

Nothing to see here, folks. For more on how totally not a cult the Robot Cult is, see this and this; and for more on the damage even so silly a cult as the Robot Cult can do, see this and this.