tag:blogger.com,1999:blog-5956838.post5161835410615696032..comments2023-11-22T01:14:54.298-08:00Comments on amor mundi: Fluffing YudDale Carricohttp://www.blogger.com/profile/02811055279887722298noreply@blogger.comBlogger12125tag:blogger.com,1999:blog-5956838.post-72193166375474812092014-10-22T11:43:11.741-07:002014-10-22T11:43:11.741-07:00> "It is easier for a camel to pass through the eye of a needle if it is<br />> lightly greased."<br />><br />> -- Kehlog Albran, "The Profit"<br /><br />Lightly greasy:<br /><br />http://elfs.livejournal.com/1197817.html<br />------------------<br />Elf M. Sternberg<br />January 22nd, 2010 <br /><br />[W]e, human beings, have **purpose** of some kind. We fight like<br />hell to fulfill it, whatever it is, and we're good at the<br />consequential purpose of reproducing to cover the planet<br />like mad. But that purpose is arbitrary, emergent because<br />that's the way evolution works. **All** our purposes are arbitrary<br />and emergent: barring a theological excuse, we're making<br />it up as we go along, picking and choosing the ones that<br />appeal to us. . .<br /><br />In order for there to be a universe in which we limited,<br />organic human beings have a place beside our superhuman progeny,<br />the superhuman progeny must actively want (that's the emotion<br />they need, to decide for our survival) us to be around. We<br />must be, quite literally, a subject of their, for lack of<br />a better term, a posthuman term, religion. . .<br /><br />AIs won't emerge through the standard evolutionary model and<br />will not have the exaptive outcomes of evolutionary ecology.<br />They will emerge due to our desires. We will oversee the process.<br />We have a chance to get it right. <br /><br />There is, however, only exactly one chance. 
Between rampancy and<br />failure, we must pass through the eye of the needle and create AIs<br />that **like** and **want** us, no rationale needed, and if questions<br />are asked, the AIs must be satisfied, as we are satisfied, that<br />in an arbitrary and uncaring universe, they want to keep surviving. . .<br />and they want us to keep surviving right along with them.<br />[Eliezer] Yudkowsky is working harder and smarter on giving humanity<br />that chance than any other thinker on the issue of AI sentience.<br />He should be given his due.<br />====<br /><br />;->jimfhttps://www.blogger.com/profile/04975754342950063440noreply@blogger.comtag:blogger.com,1999:blog-5956838.post-30740100247237763302014-10-16T15:45:18.254-07:002014-10-16T15:45:18.254-07:00
And I say again unto you. . .<br /><br />> https://plus.google.com/+TimCaswell/posts/QJcpJei9Muy#+TimCaswell/posts/QJcpJei9Muy<br />> ------------------------<br />> Isaac Schlueter<br />><br />> . . .<br />><br />> It is easier to climb a mountain, or get a black belt<br />> in a martial art, or learn to fly a plane, or write an<br />> operating system, than it is to change even a single idea<br />> in your head. . .<br /><br />Similarly, "It is easier for a camel to go through the eye<br />of a needle, than for a rich man to enter into the kingdom of God. . ."<br /><br />-- Mark 10:25<br /><br />and also<br /><br />"It is easier for a camel to pass through the eye of a needle if it is<br />lightly greased."<br /><br />-- Kehlog Albran, "The Profit"<br /><br />;-><br />jimfhttps://www.blogger.com/profile/04975754342950063440noreply@blogger.comtag:blogger.com,1999:blog-5956838.post-10300558819269797342014-10-16T15:44:17.927-07:002014-10-16T15:44:17.927-07:00> [He] has spent the past 14 years on the internet bombarding gullible<br />> nerds with techno-babble. . .<br /><br />Like a guy I used to read on an Enneagram (of Personality)<br />forum more than a decade ago. (He was fun to read back then, but he's also<br />an Ayn Rand fan, and there was that unfortunate episode with his<br />roommate's cat.)<br /><br />https://plus.google.com/+TimCaswell/posts/QJcpJei9Muy#+TimCaswell/posts/QJcpJei9Muy<br />------------------------<br />Isaac Schlueter<br /><br />Aug 9, 2012<br /><br />There are many great materials on decision theory, cognitive distortions,<br />irrationality, meditation, motivation, and influence. I think anyone<br />with a brain ought to study the literature on the subject, for the<br />same reason that I think drivers ought to get a license first. 
This<br />series from Eliezer Yudkowsky is about the best introduction/overview<br />to the art and science of self-updating that I've ever seen, and<br />the bibliographies on each article often have very useful links:<br />http://wiki.lesswrong.com/wiki/How_To_Actually_Change_Your_Mind<br /><br />It is no small task. It is easier to climb a mountain, or get a<br />black belt in a martial art, or learn to fly a plane, or write an<br />operating system, than it is to change even a single idea in your head.<br />It takes practice, work, dedication, constant vigilance, and unshakable<br />persistence in the face of repeated failure.<br /><br />Honest rationalism requires a tremendous faith, though certainly not<br />in any sort of "higher power". It requires a faith in yourself and your<br />ability to keep going. This belief in oneself certainly must come in<br />advance of evidence, because the evidence is clear: we are all weak,<br />mostly wrong most of the time, and sudden enlightenment is a fairy tale.<br />But do we stop writing software just because there will be bugs?<br />Of course not. It is only through incremental self-updating that we<br />can improve, and that can only be done if you believe you can do it.<br />It's much easier to just give up, and unlike climbing a mountain,<br />you won't even notice that you have given up.<br /><br />You can never stop exercising, or the muscles turn to flab, and you become<br />just another insane person driving their brain without a license.<br />I don't think it's possible to keep to The Way unless you have a goal<br />other than the practice itself; the "something to protect" that<br />gives the superhero a reason to focus their power. It requires a<br />connection to the world and the people in it; the courage to face<br />overwhelming odds; the humility to admit failure and swallow pride<br />and always rise above our petty grievances and insecurities; the<br />audacity to say, "I am fit to live in this world. 
I will stare reality<br />in the face, and never flinch away from it." The rewards are significant.<br /><br />This is why I laugh when a religious person tells me that I lack faith<br />or spirituality. They usually don't know the meaning of those words.<br /><br />Tsuyoku naritai!<br />====<br /><br />Enchanté to you too, bub.jimfhttps://www.blogger.com/profile/04975754342950063440noreply@blogger.comtag:blogger.com,1999:blog-5956838.post-76147480417215182222014-10-16T11:53:39.724-07:002014-10-16T11:53:39.724-07:00> [H]e claims to be a 'rationalist', but in fact he's a religious fundamentalist. . .<br />><br />> [He] has spent the past 14 years on the internet bombarding gullible<br />> nerds with techno-babble in order to get them to send him money. . .<br />><br />> Black-and-white cartoon moral theories (Utilitarianism),<br />> black-and-white cartoon economic ideas (Libertarianism),<br />> black-and-white cartoon philosophies of science (Bayesianism). . .<br /><br />http://www.motherjones.com/politics/2014/08/inquiring-minds-arie-kruglanski-psychology-extremism-isis<br />----------------------<br />According to University of Maryland psychologist. . . Arie Kruglanski,<br />who has studied scores of militant extremists. . . [the<br />attraction of an extreme ideology is n]ot just its content,<br />but the mindset that it indicates -- one that sees the<br />world in sharp definition, no shades of gray.<br />"These extreme ideologies have a twofold type of appeal," explains<br />Kruglanski on the latest Inquiring Minds podcast. "First of all,<br />they are very coherent, black and white, right or wrong.<br />Secondly, they afford the possibility of becoming very unique,<br />and part of a larger whole." . . .<br /><br />That kind of belief system, explains Kruglanski, is highly attractive<br />to young people who lack a clear sense of self-identity, and are<br />craving a sense of larger significance. . 
.<br /><br />These young people seem to have what psychologists call a very<br />strong "need for cognitive closure," a disposition that leads to<br />an overwhelming desire for certainty, order, and structure in one's<br />life to relieve the sensation of gnawing -- often existential --<br />doubt and uncertainty. According to Kruglanski, this need is<br />something everyone can experience from time to time. We all<br />sometimes get stressed out by uncertainty, and want answers.<br />We all feel that way in moments, in particular situations, but<br />what Kruglanski shows is that some of us feel that way more<br />strongly, or maybe even **all the time**. And if you go through<br />the world needing closure, it predisposes you to seek out the<br />ideologies and belief systems that most provide it.<br /><br />Fundamentalist religions are among the leading candidates. Followers. . .<br />"know exactly what is right and what is wrong, how to behave in<br />every situation," explains Kruglanski. "It's very normative and constraining,<br />and a person who is a bit uncertain, has the need for closure, would be<br />very attracted to an ideology of that kind." And for an outsider. . .<br />drawn to that sense of certainty that it imparts, Kruglanski adds,<br />you then want to prove yourself. To show your total devotion and<br />commitment to the cause. . 
.<br />====jimfhttps://www.blogger.com/profile/04975754342950063440noreply@blogger.comtag:blogger.com,1999:blog-5956838.post-7631326811177456592014-10-14T13:17:25.019-07:002014-10-14T13:17:25.019-07:00The thing about Yudkowsky is that he claims to be a 'rationalist', but in fact he's a religious fundamentalist, no different at all from any other run-of-the-mill religious fanatic.<br /><br />The fundamentalist sees the world in absolute black-and-white terms, based on elaborate intellectual castles-in-the-air (usually having little or no relation to practical reality), and no other perspectives can be tolerated. To the fundamentalist, anyone disagreeing with them is 'mad' or 'stupid' and must be converted or destroyed at all costs.<br /><br />Yudkowsky has spent the past 14 years on the internet bombarding gullible nerds with techno-babble in order to get them to send him money. <br /><br />Consider the really weird, extremist ideologies he's pushed over the years (Libertarianism, Utilitarianism, Bayesianism).<br /><br />Black-and-white cartoon moral theories (Utilitarianism), black-and-white cartoon economic ideas (Libertarianism), black-and-white cartoon philosophies of science (Bayesianism), see the pattern here? - it's clear Yudkowsky is a closet religious fundamentalist.<br /><br />As to the idea that 'super-intelligence' is some sort of magic god, well, this is just silliness. There are lots of things super-intelligence couldn't do - it can't break the laws of physics, for example, and it can't crack unbreakable encryption codes - it can't overturn logic and mathematics. Quite apart from these clear abstract limits, it would also be limited by many real-world practical constraints (resources, time, human nature, politics, etc.) just like you or I. 
<br /><br />And of course Dale, you're right that the very term 'super-intelligence' itself has not even been defined properly, let alone have we seen any actual real-world advances towards it on the near-horizon. <br /><br />What would a real super-intelligence be like? I'm pretty sure that blindfolded monkeys throwing darts at a board with random wild speculations written on it would be just as accurate as the techno-babble that's spewed endlessly from Yudkowsky's mouth over the last 14 years.ZARZUELAZENhttps://www.blogger.com/profile/07742429508206464486noreply@blogger.comtag:blogger.com,1999:blog-5956838.post-38304086628554465822014-10-14T01:08:13.791-07:002014-10-14T01:08:13.791-07:00Thanks, Jim -- I can always count on you!Dale Carricohttps://www.blogger.com/profile/02811055279887722298noreply@blogger.comtag:blogger.com,1999:blog-5956838.post-18151748214324334362014-10-14T00:05:28.277-07:002014-10-14T00:05:28.277-07:00Something like that rationale (not the imminence of the Singularity per se,<br />but that of "the danger of unfriendly AI. . . [being] so near --<br />as early as tomorrow") was attributed to Yudkowsky by Declan McCullagh<br />in that infamous 2001 _Wired_ article.<br /><br />http://archive.wired.com/science/discoveries/news/2001/04/43080<br />----------<br />Making HAL Your Pal<br />Declan McCullagh<br />04.19.01<br /><br />. . .<br /><br />"I've devoted my life to this," says Yudkowsky, a self-proclaimed "genius"<br />who lives in Atlanta and opted out of attending high school and college.<br /><br />It's not for lack of smarts. 
He's a skilled, if verbose, writer and an<br />avid science-fiction reader who reports he scored a perfect 1600 on his SATs.<br /><br />Yudkowsky's reason for shunning formal education is that he believes<br />the danger of unfriendly AI to be so near -- as early as tomorrow --<br />that there was no time for a traditional adolescence. "If you take the<br />Singularity seriously, you tend to live out your life on a shorter<br />time scale," he said.<br />====jimfhttps://www.blogger.com/profile/04975754342950063440noreply@blogger.comtag:blogger.com,1999:blog-5956838.post-32338614713604078892014-10-13T18:53:29.882-07:002014-10-13T18:53:29.882-07:00I said in this post, "I seem to recall that Yudkowsky first claimed he didn't need to get a degree in any of the fields on which he still illiterately pontificates because the singularity was supposedly so near it would be a waste of time." As someone who has been reading and critiquing and sparring with transhumanists and singularitarians for nearly the whole online lifespan of these movements I meant what I said in making that observation quite literally. That is, I personally do recall this reasoning being made and justified by Yudkowsky quite early on, long before most people took notice of him at all. Does anybody have a citation trail that would nudge that observation out of its present anecdotal recollection status? I ask because the observation has been called out as a falsehood elsewhere. If I am misremembering this I am happy to be corrected and will say so. But like it or not, I must say I don't think I am misremembering this at all.Dale Carricohttps://www.blogger.com/profile/02811055279887722298noreply@blogger.comtag:blogger.com,1999:blog-5956838.post-91970819837558884662014-10-13T11:18:30.789-07:002014-10-13T11:18:30.789-07:00
<i>look at the exponential trend</i><br /><br />That's always an invitation to crazytown.Dale Carricohttps://www.blogger.com/profile/02811055279887722298noreply@blogger.comtag:blogger.com,1999:blog-5956838.post-6559675979043590282014-10-13T11:02:42.106-07:002014-10-13T11:02:42.106-07:00> > We've seen this movie before, haven't we?<br />><br />> Several movies, in fact. ;-><br /><br />And stay tuned for the next one(s):<br /><br />http://en.wikipedia.org/wiki/Terminator:_Genisys<br />et seq.<br /><br />Fasten your seat belts!<br /><br />http://www.businessinsider.com/louis-del-monte-interview-on-the-singularity-2014-7<br />----------------<br />"Today there's no legislation regarding how much intelligence a<br />machine can have, how interconnected it can be. If that continues,<br />look at the exponential trend. We will reach the singularity in<br />the timeframe most experts predict. From that point on you're<br />going to see that the top species will no longer be humans,<br />but machines."<br /><br />These are the words of Louis Del Monte, physicist, entrepreneur,<br />and author of "The Artificial Intelligence Revolution." Del Monte<br />spoke to us over the phone about his thoughts surrounding artificial<br />intelligence and the singularity, an indeterminate point in the<br />future when machine intelligence will outmatch not only your own<br />intelligence, but the world's combined human intelligence too.<br /><br />The average estimate for when this will happen is 2040, though<br />Del Monte says it might be as late as 2045. 
Either way, it's a<br />timeframe of within three decades.<br />====<br />jimfhttps://www.blogger.com/profile/04975754342950063440noreply@blogger.comtag:blogger.com,1999:blog-5956838.post-69153192152934169312014-10-13T10:23:23.222-07:002014-10-13T10:23:23.222-07:00> I seem to recall that Yudkowsky first claimed he didn't need<br />> to get a degree in any of the fields on which he still illiterately<br />> pontificates because the singularity was supposedly so near it<br />> would be a waste of time. <br /><br />http://ieet.org/index.php/IEET/more/scaruffi20141012<br />-------------------<br />[A]n age that is rapidly losing faith in the traditional God<br />desperately needs to find and found a new religion, and the<br />Singularity is the best option that some people have in the<br />21st century. The human mind is programmed to believe in the<br />supernatural. That is one of the limitations of the human mind<br />and all this talk about the Singularity is nothing but a<br />new modern proof of that limitation.<br />====<br /><br />http://www.scaruffi.com/singular/sin49.html<br />-------------------<br />A Conclusion: Sociology Again<br /><br />Humans have been expecting a supernatural event of some<br />kind or another since prehistory. Millions of people are<br />still convinced that Jesus will be coming back soon, and millions<br />believe that the Mahdi will too. The Singularity risks becoming<br />the new religion for the largely atheistic crowd of the<br />high-tech world. Just like with Christianity and Islam,<br />the eschatological issue/mission then becomes how to save<br />oneself from damnation when the Singularity comes, balanced<br />by the faith in some kind of resurrection. We've seen this<br />movie before, haven't we?<br />====<br /><br />Several movies, in fact. 
;-><br /><br />http://www.independent.co.uk/news/science/stephen-hawking-transcendence-looks-at-the-implications-of-artificial-intelligence--but-are-we-taking-ai-seriously-enough-9313474.html<br />-------------------<br />Stephen Hawking: 'Transcendence looks at the implications of<br />artificial intelligence - but are we taking AI seriously enough?' <br /><br />With the Hollywood blockbuster Transcendence playing<br />in cinemas, with Johnny Depp and Morgan Freeman showcasing<br />clashing visions for the future of humanity, it's tempting<br />to dismiss the notion of highly intelligent machines<br />as mere science fiction. But this would be a mistake,<br />and potentially our worst mistake in history.<br />====jimfhttps://www.blogger.com/profile/04975754342950063440noreply@blogger.comtag:blogger.com,1999:blog-5956838.post-32430966995266875822014-10-12T16:20:09.102-07:002014-10-12T16:20:09.102-07:00> MIRI in its present incarnation overlaps with Meta-Med in many of its<br />> funders and advisors, as full-on fulminating techno-transcendentalism still<br />> remains confined to a fairly cramped sub(cult)ure. Specific names that<br />> come up in the comment from the Moot about robocultic citations in<br />> Bostrom's bibliography are "Stuart Armstrong, Kaj Sotala, Paul Christiano,<br />> Wei Dai, Peter de Blanc, Nick Hay, Jeff Kaufman, Roko Mijic, Luke Muehlhauser,<br />> Carl Shulman, Michael Vassar, and nine different Eliezer publications."<br /><br />Daily dumb. . . who, me?<br /><br />http://www.scaruffi.com/singular/sin25.html<br />----------------<br />[Excerpts from] Demystifying Machine Intelligence:<br />Why the Singularity is not Coming any Time Soon And Other Meditations<br />on the Post-Human Condition and the Future of Intelligence<br />by Piero Scaruffi<br /><br />The Audience of the Singularity<br /><br />I organize many events in the San Francisco Bay Area. 
I am always<br />frustrated that so few young people show up. I routinely attend technical<br />and scientific talks at prestigious organizations like Stanford University,<br />the Computer History Museum and Xerox PARC. They are free and frequently<br />feature top-notch speakers. At least half of the audience is consistently<br />made of grey-haired people. . . [I]t is mostly older inactive engineers<br />who hear the distinguished researchers talk about the state of high technology. . .<br />The reason why younger people don't come in proportional numbers to educational<br />events is simple: they are busy at work or studying. . . [or they're]<br />fed up after so many years of college and just want to party in the evening.<br />Younger people are therefore more likely to get their technology news from<br />attending yearly conferences and trade shows and (on a daily basis) from<br />reading popular bloggers. What they get is, in other words, press releases.<br />(Don't even try to convince me that your favorite tech blogger is competent<br />and reliable: he is just a flywheel in a highly efficient system to distribute<br />press releases by high-tech companies, and mostly product announcements,<br />with little or no knowledge of the science behind technology and little or<br />no contacts in the labs that produced that science before some startup turned it<br />into a popular gadget). Therefore young technology buffs are more likely to welcome<br />enthusiastically the news that some startup has introduced a new. . .<br />[gadget when] the startup that made that announcement is simply<br />looking for funds from venture capitalists and needs to create buzz<br />around its business plan. . . They are also the ones who tend to believe<br />that Artificial Intelligence has built incredibly smart machines and that<br />the Singularity is coming soon.<br /><br />That is half of the audience that absorbs enthusiastically any news about machine<br />intelligence. 
The other half is the one that I compared to religiously devout<br />people who simply have an optimistic view of all these press releases. . .<br />====<br /><br />Help me, io9, you're my only hope!<br />jimfhttps://www.blogger.com/profile/04975754342950063440noreply@blogger.com