amor mundi

Using Technology to Deepen Democracy, Using Democracy to Ensure Technology Benefits Us All

Tuesday, September 30, 2014

Today's Random Wilde

It is not logic that makes men reasonable, nor the science of ethics that makes men good.

Monday, September 29, 2014

Fear Itself

I'm an American terrified by how readily Americans are terrorized. Polities, like people, can be driven either by love or by fear. Fear starves commonsense and commonwealth to feed the War Machine. Love, in turn, isn't for objects, love is for objections. King's Beloved Community, liberalism's most revolutionary self-image, for example, is not terminally reconciled but interminably reconciling. And in its ongoing reconciliation of human hopes and histories the Beloved Community reveals that its indispensable devotion is to politics. Politics is premised on living selves changing in intercourse with different others, rather than on the death-dealing defense of dead, closed, completed selves. That kind of moralizing, colonizing, controlling project is an anti-politics, not a politics, finally, for which differences are always threats and never promises, but it can drive polities just as it can drive people. For a time. Again, love against fear. And for the love of the open futurity invigorated by politics, FDR's admonition really is the true one: the only thing we have to fear is fear itself.

Sunday, September 28, 2014


There is no question that my microblogging practice has reshaped my blogging here -- my writerly habits and commitments have changed. Of course, teaching is no less demanding than it ever was: for most of the first decade of the blog this often meant dry writing spells while I was crafting lectures or grading papers, while now it means instead days in a row in which a Wilde quote or a slightly enriched tweet is all I have the time (or inclination) to post. Superficial contact yields an ongoingness that keeps the blog livelier than those protracted inactive gulfs did, I guess, but contentedness with these light touches makes it too easy for me to put the blog on the shelf spiritually. I also know, of course, that pithier bloggers like Atrios are perfectly capable of doing critical, provocative, engaging work that manages in the aggregate to express a real voice in posts that cleave close to tweet-craft. Despite all that, I do think Amor Mundi has lost something lately.

Saturday, September 27, 2014

On the Road to the Sexist Singularity

My partner Eric's AI-God embryo spellchecker just insisted "servicemembers" should be "servicemen."

Thursday, September 25, 2014

Marketing Militarism

Rebranding crime as terrorism sure is a great way to sell more militarism.

"We're Not Monsters, We’re Just Ignorant Bigoted Greedy Short-Sighted Assholes Who Don't Think Everybody Deserves Healthcare, Who Steal Personal Credit and Profit For Collective Accomplishments, and Who Can't Win Elections Unless We Cheat," Declares New GOP Ad

The ad is premised on the aptly impoverished conceit that consumerism is enough to constitute political solidarity (actually nobody is surprised to hear that people in both political parties buy commodities from ubiquitous brands).

And needless to say, the proportion of people of color in this ad flabbergastingly fails to represent the proportion of people of color who identify with the Republican party in actual reality.

I'm posting the clip simply because I appreciate the real desperation it reveals, and because it pays to remember that if people of color and single women actually voted in higher than historical numbers this November, Democrats could not only remain safely in charge of the Senate (and hence protect the courts from forced-pregnancy zealots) but actually regain the House of Representatives (and hence set the stage for comprehensive immigration reform, commonsense gun safety legislation, and a jobs bill to address the ongoing unemployment crisis and invest in renewable energy and transportation infrastructure). Hell, if we were actually accomplishing such legacy-making good things at home, maybe our "Reluctant Warrior" would be too distracted to find yet another (six and counting, if you're keeping track) country to bomb at billions a month of the dollars we somehow don't have enough of to fund effective stimulative food security programs for our own suffering citizens.

So, yeah, here it is. For the record, I never doubted Republicans are people. I grew up surrounded by Republicans. To be a bad person, you have to be a person first. I guess there's some hope in that.

Tuesday, September 23, 2014

Teaching Day

Benjamin and Adorno in my undergraduate critical theory survey course this afternoon. Blogging low to no. Wonder whether, if we just started calling healthcare, nutritional support, and education "bombing," we would discover we actually have the money for those too.

Saturday, September 20, 2014

Thesis, Prosthesis

Given the insistent distinction of evolution from prosthesis, it is intriguing to notice that every culture is exactly equally evolved and exactly equally prostheticized, and that to say either is to describe more or less the same thing.

Friday, September 19, 2014

Geek Rule Is Weak Gruel: Why It Matters That Luddites Are Geeks

I received an invitation Monday night to contribute a mini Op-Ed for the New York Times' "Room for Debate" forum, responding to the question, "What does it mean when geek culture becomes mainstream?" Seven contributions appeared last night, including mine. They all contain worthy insights, and my own piece isn't my favorite among them, though mine is the only one I agree with (given the passage of time, that isn't a given). The headline for the forum now reads When Geeks Rule. The introduction of the theme of "Rule" actually changes my sense of the question, but I will set that aside for a moment. I assume that all the resulting essaylets were shaped by the same forces impinging on mine -- a rapidly looming horizon of three hundred words, a no less rapidly looming deadline, and an editorial scalpel wielded by someone with a "readability" agenda, angels and ministers of grace defend us. Who knows the subtleties, qualifications, and questions left on the cutting room floor for my fellow participants?

I must say that I bristled a bit at the premise of the question itself. In every version of my contribution to the debate I included this first sentence: "There has never been a monolithic geek culture and geeks have never exhibited a singular profile, and so one important part of what happens as geek-identification becomes more 'mainstream' is that its diversity becomes more visible." My piece went several rounds with my firm but friendly editor, and over and over that opening sentence was excised -- and over and over I kept replacing it. It did not appear in the final version, by which time I had more or less given up. I decided to take the retention of my Vulcan quotation and the derisive gesture at "techbros" as hard-won personal victories and move on. Clearly, my editor thought this initial observation about the irreducibility of geekdom to any one fetishized symptom or signifier redundant in an essay treating geek enthusiasms as a motor of diversification more generally (which is the emphasis of the essaylet as it stands, I'd say).

I still do wish that initial frame for the essaylet remained. For me, the essence of geekery is enthusiasm: it is an appreciation of appreciation. Wil Wheaton famously captured the spirit of what I am talking about in a blog post and video that was deliriously circulated a few years back -- I assume its popularity signifies that it articulated unusually well something many geeks felt about themselves already -- saying of geek gatherings that in them one is "surrounded by people who love the same things you love, the way you love them. But... also... by people who love things you don't even know about, but you love your respective things in the same way, so you get to love your thing enthusiastically, completely, unironically, without fear of judgement." Geekery, then, is a celebration of the pleasures and knowledges that uniquely derive from close, sustained attention to subjects. And geekery is indifferent to whether or not the subjects being attended to are conventionally appreciated, hence it becomes a site of diversity the "mainstreaming" or ramification of which might facilitate diversity more generally. That is my essaylet's wee conceit, such as it is.

One of the few contributions that seemed to me to be on that very geeky wavelength was Zeynep Tufekci's essay, which insisted that joy is at the heart of geekery: non-judgmental joy, creativity, expressivity, experimentation, and the rough-and-tumble of making stuff. I was less pleased with the way her piece seemed to be assimilating this making to what I regard as the mostly self-congratulatory, self-promotional BS of an entrepreneurial tech-sector start-up culture, and to the rather libertechbrotarian flavor-of-the-month "maker culture" and "Maker Faires," which seem to me less about making than about being on the make, less about DIY than about privileged disavowals of interdependence, self-declared corporate-sponsored doers doing their dreary disrupting. A few years ago perhaps her go-to Maker vocabulary would have buzzed on about "Smart" blah-de-blah, a few years before that "Bright" this-n-that, a few years before that "Extreme," before that "Virtual," and so on (I'm sure I've missed a meme here or there, but to know one of them is really to know them all). I cannot say that I regard the for-profit tech-sector as a site of conspicuous creativity and ingenuity so much as a vast skim-scam operation repackaging stale useless crap to bamboozle rubes impressed by PowerPoint presentations and buzzy ad-copy, or appropriating the ideas and sweat of quiet coders collaborating amongst themselves without much fanfare and, usually, without much reward. Of course, those obscure coders are almost certainly geeks and they are indeed making stuff, but for me a teenage girl using a toothpick to paint a panel on the surface of a model of Kubrick's spacecraft Discovery or a fifty-year-old sewing the bodice of an Arwen Evenstar gown for his outfit on the last night of a science fiction convention are more illustratively enthusiastic makers invigorating geekery with its and our abiding joy -- even though nobody on hand expects to cash out that creativity in some big score.

Zaheer Ali penned what is probably my favorite of all the contributions, precisely because he exposed the diversity and dynamism of geekdom always-already against the grain of what seemed to me otherwise to be a series of rather sadly reductive mis-identifications of geekery with white techbros dreaming their dumb deadly VC dreams in the SillyCon Valley. Not only did Ali remind the Times readership of the thriving afro-futural musical and literary lineages (no Samuel Delany or Janelle Monáe tho!) which are the site of so much of the aliveness of my own life-long geekery -- not least because for me the afro-futural has also been the indispensable site of so much vital subversive queergeekery -- but he also pointed to Melissa Harris-Perry and her #nerdland. It isn't only because I'm such a nerdland fan that I was happy to see Ali insist on the example, but also because it again went against the reductive grain of so many of the contributions otherwise: Harris-Perry's excellent show is such a geek-fest because it is an academic space filled with earnest activism -- folks who have been shaped by what Foucault described as the "grey, meticulous and patiently documentary" work of research and who retain the fierce pleasure of discovery and the relevance of lives devoted to these intense attentions.

When, to the contrary, Judith Donath defines geekdom in her piece through its "affinity to math and science," I wonder whether all the science fiction geek shippers who don't know much science beyond what they learned from their Cosmos blu-rays and all the literary historian geeks smelling of a day in the archives but who flunked math are utterly invisible to her in all their geek glory. Donath writes that "Geeky fascination with what is undiscovered drives scientific curiosity and invention, but unmoored from the desire to learn and create, it's simply consumerism." I actually do not agree that all cultural reception and appropriation is "simply consumerism." But even so, her larger point that "obsessive video game playing" will not help solve "climate change, emerging diseases, vast inequality" is certainly true. But why exactly would one expect otherwise? I am not sure that playing video games is enough to make one a geek, any more than watching movies does, and I would like to hear much more about the work that is getting done in Donath's attribution of "obsessiveness" to such gamers, but neither can I accept Donath's identification of geekery with the rational, scientific thought that is indispensable (together with free and accountable democratic governance in the service of general welfare) to the solution of such problems. I think rationality and science are both much bigger and much older than geekery, and that geek enthusiasms are quite valuable in human lives even when they do not contribute to the also indispensably valuable problem-solving work of science, engineering, public investment and harm-reduction policy-making. So, too, reactionary unsustainable consumerism is obviously a bigger problem than is its capacity to colonize the geek imagination -- it isn't the fault of geeks that under capitalism all that is solid melts into air. As I tried at least to insinuate in my piece, geekery could use a bit more criticism along with its enthusiasm as an intellectual mode, but too strict an identification of geekdom with progressive science and policy or with reactionary consumerism seems to me to lose track of what geekdom actually is.

Just to be clear, it's not that I think geekery lacks the "affinity to math and science" Donath mentions and which so many geeks so conspicuously exhibit, any more than I deny the connection of geekery to "coding" that Kimberly Bryant and William Powers emphasize in their respective pieces. I simply want to refuse any reduction of geekdom to math nerds or software coders or what have you, whether the reduction is made in a spirit of sympathy or hostility to those geeks who happen to be math nerds or software coders. Again, as I said in my excised opening framing, "There has never been a monolithic geek culture and geeks have never exhibited a singular profile," and this really matters to me. To say that geeks are becoming mainstream because so many people use the term "app" in everyday conversation seems to me as obfuscatory as imagining that geeks became mainstream when folks started to talk about going to "the talkies." I happen to think the present ubiquity of handhelds no more indicates a rise in "tech savvyness" than did the ubiquity of office typewriters in the seventies. Am I really supposed to think people stumbling around in the street staring at images of plates of food their friends have eaten immerses me in a more geeky world in some way? Why is that a world more geeky than one in which people are adjusting the dials of their radios or the rabbit ears on their tee vees or dipping their quills into inkwells? To put the point more urgently still, why on earth would the New York Times treat the ratings of the relentlessly predictable, unwatchably execrable "Big Bang Theory" as relevant to the subject at hand in any deep sense?
Network execs assimilating social change into nonthreatening nonrepresentative cheez whiz isn't exactly anything new or interesting -- in fact it would take a whole hell of a lot of culture studies geeks in an academic conference to convince me there was anything more interesting to say about the "Big Bang Theory" than about "Full House" and "Two And A Half Men" (the proper generic assignment for that bleak business).

When I submitted my contribution for its first editorial bloodletting, I received the rather exasperated suggestion that perhaps I might want to jettison the piece and write instead about why I am a Luddite, refusing to live on the terms of this emerging geek mainstream. I had the sinking suspicion at that point that I had been invited to participate in this forum because of my contrarian anti-futurological critiques. While it is true that the techno-transcendental futurist discourses and formations I write about are indeed geek subcultures, they are far from representative of geekdom in my experience of it, and certainly the farthest thing from definitive of it. I critique futurological pseudo-science as a champion of consensus science, I critique futurological corporate-militarism as a champion of accountable evidentiary harm-reduction policy, I critique parochial plutocratic market futures as a champion of free futures, I critique futurist consumer fandoms as a fan myself of literary sf: in other words, my critiques are those of a lifelong queergeek defending a capacious geekdom on which I have depended for my flourishing, and sometimes for my sanity, all my life.

Contra Fredrik deBoer I believe both that geek enthusiasms are still happening at the margins, and that geek enthusiasms are still marginalizing folks. I don't agree that venture capitalists and sexist gamers are representative of geekery (although I am far from denying and I am full of decrying their terrible geek ways). Certainly I would never pretend that geekdom is somehow insulated from the white racist patriarchal extractive industrial corporate-militarist American society which incubates and marinates geekdom -- my piece concludes with observations and warnings very much to the contrary. But, again, I think that if one wants to get to the heart of geekdom, the better to understand the changes it might enable as it ramifies through digital networked media formations, it is important to get at actually representative and symptomatic figures. I don't deny that Bill Gates exhibits geek traits, but I do deny that there is anything characteristically geeky about the traits in Gates that make him rich, powerful, and famous in America.

Titanic "Geeks Rule" archetypes like Gates, Jobs, and Wozniak attended or hung out in geek communities connected to great public universities. Many of their marketable notions were cribbed from the knowledge and efforts of geek researchers and geek enthusiasts who have vanished from history. The reframing of these figures as randroidal sooper-genius fountainheads of entrepreneurial innovation and beneficence disavowing their utter dependence on climates of intellectual discovery (and usually scads of military investment) is a plutocratic commonplace. The hushed-up circumstances attending the birth of our ruling tech enterprises (at least Berners-Lee doesn't disavow the gestation of the Web itself in CERN) in such skim-scam operations are re-enacted endlessly in the life of such enterprises, as the ideas and wage-labor of cohorts of coders under contract or scooped up from the bubbling crap cauldron of the start-up lottery arrive in their full flower in the ritual spectacles of hyper-individualized celebrity CEOs bringing out new (usually just repackaged) gewgaws gobbled up by technoscientifically illiterate pop-tech journalists who confuse gossip and infomercial pieties for substance. All along this terrorizing trajectory from creative geek collaboration to eventual plutocratic profitability there are endless occasions for folks with ever more tenuous connections to actual efforts and ideas either to take credit or otherwise sell out their fellows in the hope of some piece of pie down the road. If I may say so, there is nothing particularly geeky about this all too conventionally American ugliness. I am the first to critique the deceptions of plutocratic entrepreneurial capitalism and the devastations of soulless unsustainable consumerism -- but I simply think it is a mistake to reduce geekdom to either of these phenomena or to treat geekdom as a particularly illuminating window on their deadly play in the world.

I'll conclude with a word on the Luddites. I do not know what is worse from my perspective: that I was expected by the Times to be a Luddite and not a geek, or that the Times would seem to accept the facile characterization of the Luddites as "anti-technology" in a sense opposed to a no less facile characterization of "pro-technology" geeks. As I never tire of saying, there is no such thing as technology-in-general about which it makes any kind of sense to be either loosely "pro" or "con." The constellation of tools and techniques contains too many differences that make a difference for one to assume a sympathetic or antipathetic vantage on the whole, and it will be the assumptions and aspirations in the service of which these tools and techniques are put that yield our sensible judgments of right and wrong in any case. To speak, as too many of the contributors to the forum did, I am afraid, of "technology" becoming more prevalent, guiding, or approved of in society in some general way in connection with the mainstreaming of geekery seems to me an utterly confused and confusing way of thinking about the technoscientific vicissitudes at hand. All tools and all techniques are texts: and the literary imagination has quite as much to say about their play as does the engineering imagination. All culture is prosthetic and all prostheses are culture: and hence all individuals and all cultures are exactly as "prostheticized" as every other. To press the point, the Luddites were champions of certain tools just as they were critics of others; the Luddites were masters of certain techniques just as they were suspicious of others.
Think of those skeptics and contrarians who tend to get called "Luddites" today because they hesitate to be steamrolled into panglossian celebrations of anti-democratic technocratic and algorithmic governance or of hucksters peddling techno-utopian memes and kremes: it is important to grasp that Ray Kurzweil is no more cyborgic in Haraway's sense than is John Zerzan. Given their enthusiasm about their treasured tools and techniques, it seems to me frankly far more in point to say that Luddites ARE Geeks, that luddic resistance and play is another facet of geek multiculture rather than its comic-book antagonist.

I think geeks are less inclined to rule than to drool over their personal perfections. I know that I am. As I said in the conclusion of my piece, the crucial final point of which was another editorial casualty, I'm sorry to say: "Perhaps such a mainstream embrace of marginal enthusiasms can help America overcome the defensive anti-intellectual bearings of its ruling masculine culture. But clashes over sexist representations in science fiction, exposures of sexist assumptions and practices in science education and tech-sector hiring, as well as the sexist antics of the 'techbros' of venture capitalism demand that we treat such promises of change in a much more critical way than with the usual geek enthusiasm." That is to say, geekery remains for me an intellectualism fueled by diverse and dynamic enthusiasms for enthusiasms, but the democratizing promise of such geekery would require a criticality that I would not yet identify with geekery as much as I would like to. Discussions such as the forum itself might go some distance toward introducing this indispensable dimension of criticism -- but I worry that too many of these critics reduced geekery to certain of its superficial symptoms or mis-identified it with larger social struggles, draining geekery of its specificity. As I said before, it's not an easy thing to say much of substance in three hundred words on short notice with editorial shears clipping away, so I doubt any of this reflects badly on my fellow participants, most of whom said more interesting things than I managed to do in the time and space on offer.

We've traced the call... it's coming from inside your brain.

Thursday, September 18, 2014

Oh, It's Up Already

Here's a link. The Vulcan reference survived the final edit. I'll call that a victory.

Give a Squat

Before you romanticize squatting and other forms of informal insecure settlement you should realize that it often amounts to lethally dangerous crowdsourced real estate development for eventual plutocratic profit.

All Culture Is Prosthetic, All Prostheses Are Culture

Every tool and every technique is a text.

Room for Debate

I contributed a short piece to a discussion about the "mainstreaming of geek culture" for the New York Times' "Room for Debate" recurring feature. It will probably appear online later today or tomorrow morning. The piece is very short and was written on very short notice -- and despite both of these facts it was still edited to within an inch of its life, a process involving more steps than I could have expected. I'm still surprised to have been asked to contribute such a piece in the first place, and I will have more to say about both the process as well as about what the piece ended up saying and not saying in another post.

Monday, September 15, 2014

Richard Jones: No Uploads For You!

Anti-nanocornucopian Richard Jones offers up a fine technical debunking of techno-immortalizing "uploads" in his latest Soft Machines post, Your Mind Will Not Be Uploaded:
I start by asking whether or when it will be possible to map out... all the connections between [the brain's] 100 billion or so neurons. We’ll probably be able to achieve this mapping in the coming decades, but only for a dead and sectioned brain; the challenges for mapping out a living brain at sub-micron scales look very hard. Then we’ll ask some fundamental questions about what it means to simulate a brain. Simulating brains at the levels of neurons and synapses requires the input of phenomenological equations, whose parameters vary across the components of the brain and change with time, and are inaccessible to in-vivo experiment. Unlike artificial computers, there is no clean digital abstraction layer in the brain; given the biological history of nervous systems as evolved, rather than designed, systems, there’s no reason to expect one. The fundamental unit of biological information processing is the molecule, rather than any higher level structure like a neuron or a synapse; molecular level information processing evolved very early in the history of life. Living organisms sense their environment, they react to what they are sensing by changing the way they behave, and if they are able to, by changing the environment too. This kind of information processing, unsurprisingly, remains central to all organisms, humans included, and this means that a true simulation of the brain would need to be carried out at the molecular scale, rather than the cellular scale. The scale of the necessary simulation is out of reach of any currently foreseeable advance in computing power.
As I said, Jones is offering up a mostly "technical" debunking of the kind that enthusiasts for techno-transcendental conceits decry the lack of in my own critiques. But Jones is far from denying that these technical discussions are embedded in rhetorical frames, narratives, metaphorizations, conceptual problematics from which they derive much of their apparent intelligibility and force even if such discursive operations are not his focus.
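Jones' scale argument in the quoted passage can be made vivid with a rough back-of-envelope comparison. The neuron count below is the one he cites; the synapse-per-neuron and molecules-per-cell figures are my own ballpark assumptions for illustration only, not numbers from his post:

```python
# Purely illustrative orders of magnitude. Only the ~100 billion neuron
# figure comes from the quoted passage; the other two are assumed ballparks.
NEURONS = 1e11               # ~100 billion neurons (cited by Jones)
SYNAPSES_PER_NEURON = 1e4    # assumed ballpark
MOLECULES_PER_CELL = 1e10    # assumed ballpark (protein molecules per cell)

# State variables a simulation must track at each level of description:
synapse_level_state = NEURONS * SYNAPSES_PER_NEURON    # ~1e15
molecular_level_state = NEURONS * MOLECULES_PER_CELL   # ~1e21

gap = molecular_level_state / synapse_level_state
print(f"molecular-scale simulation tracks ~{gap:.0e}x more state")  # ~1e+06x
```

Even under these generous assumptions, the molecular-scale simulation Jones argues is actually required dwarfs the synapse-level simulation futurologists usually have in mind by some six orders of magnitude, before accounting for the dynamics on each variable.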

You will notice, for example, that even in his brief summary above the notion of "simulating brains" figures prominently. About this he declares earlier on:
I want to consider two questions about mind uploading, from my perspective as a scientist. I’m going to use as an operational definition of “uploading a mind” the requirement that we can carry out a computer simulation of the activity of the brain in question that is indistinguishable in its outputs from the brain itself. For this, we would need to be able to determine the state of an individual’s brain to sufficient accuracy that it would be possible to run a simulation that accurately predicted the future behaviour of that individual and would convince an external observer that it faithfully captured the individual’s identity. I’m entirely aware that this operational definition already glosses over some deep conceptual questions, but it’s a good concrete starting point. My first question is whether it will be possible to upload the mind of anyone reading this now. My answer to this is no, with a high degree of probability, given what we know now about how the brain works, what we can do now technologically, and what technological advances are likely in our lifetimes. My second question is whether it will ever be possible to upload a mind, or whether there is some point of principle that will always make this impossible. I’m obviously much less certain about this, but I remain sceptical.
It's truly important that Jones insists such a discussion "glosses over some deep conceptual questions," but I wonder why this admission does not lead to a qualification of the predicate assertion, "it's a good concrete starting point." To the extent that "uploading" is proffered by futurologists as a techno-immortalization scheme, it isn't at all clear that even a successful "simulation" would satisfy the demands that invest their scheme. I flog the talking point that "you are not a picture of you" endlessly to make this point, but one might just as easily point out that nobody seriously entertains the substitution of one person by a longer-lived imposter as a viable life-extension method. And while I would agree that selfhood is substantiated in an ongoing way by observers, I think it is important to grasp that these observations have objective, but also subjective and inter-subjective dimensions, none of which are adequate on their own and all of which supplement one another -- and also that these observations are not merely of "already existing" characteristics but of sociocultural scripts and norms through which selves are constructed/enacted in time. Uploading discussions tend to deploy radically impoverished understandings not only of selfhoods themselves but of the terms of their substantiation. Again, Jones does not deny any of this, and he tends to be enthusiastically open to such considerations, but I wonder whether technical debunkings that circumvent such considerations at their point of departure don't end up smuggling in more of the reductionist nonsense he critiques (as much as I do) than he would like.

Another case in point, in Jones' truly welcome intervention into the work of metaphors in such discussions:
[T]o get anywhere in this discussion, we’re going to need to immunise ourselves against the way in which almost all popular discussion of neuroscience is carried out in metaphorical language. Metaphors used clearly and well are powerful aids to understanding, but when we take them too literally they can be badly misleading. It’s an interesting historical reflection that when computers were new and unfamiliar, the metaphorical traffic led from biological brains to electronic computers. Since computers were popularly described as “electronic brains”, it’s not surprising that biological metaphors like “memory” were quickly naturalised in the way computers were described. But now the metaphors go the other way, and we think about the brain as if it were a computer (I think the brain is a computer, by the way, but it’s a computer that’s so different to man-made ones, so plastic and mutable, so much immersed in and responsive to its environment, that comparisons with the computers we know about are bound to be misleading). So if what we are discussing is how easy or possible it will be to emulate the brain with a man-made computer, the fact that we are so accustomed to metaphorical descriptions of brains in terms of man-made computers will naturally bias us to positive answers.
This is music to my ears, but I have to wonder if these considerations really go far enough. (I'm a rhetorician for whom figurative language is the end-all be-all, so a working scientist like Jones might fairly question whether I would ever be satisfied on this score.) A scholar like Katherine Hayles has done extensive historical research into the ways in which the metaphors Jones is talking about here actually formed information science and computer science disciplines from their beginnings, thus creating the conceptual terrain on which computers would seem plausibly describable later as "electronic brains" in the first place, an abiding conceptual terrain eventuating later still in the more recent reductions of discursive and cultural dynamics to "memes" and "viralities" -- or, for that matter, in critical interventions into them as efforts at a kind of "immunization." Jones' talk about how we have been trained to treat glib biological and informational identifications as neutrally descriptive reaches deeper even than he reveals: how else do we account for the paradoxical proposal of his parenthesis that the brain is properly identified as a computer, while at once the brain is disanalogous with any actual computer? These associations are, as Jones says, so deeply ingrained as to be "naturalized." For me, it is enormously interesting that minds have so often been metaphorized as prostheses -- before its figuration as computer the mind has been mirror, blank slate, distributed steam pipes -- and that new figures do not displace old ones even when they are at odds. Freud's steampunk mind of repressions, displacements, projections, outlets lives on in the discourse of many who have made the digital turn to the computational mind. Who knows how or why exactly?

I find nicely provocative Jones's speculative proposal that "the origin of van der Waals forces, as a fluctuation force, in the quantum fluctuations of the vacuum electromagnetic field... could be connected to some fundamental unpredictability of the decisions made by a human mind" and I am pleased that he takes care to distinguish such a proposal from theories like that of Roger Penrose that "the brain is a quantum computer, in the sense that it exploits quantum coherence" (since, as he points out, "it... [is] difficult to understand how sufficient coherence could be maintained in the warm and wet environment of the cell"). For me, it is not necessary to save an ontic indeterminism traditionally ascribed to human minds through such expedients, since I was convinced well over twenty years ago by Rorty's argument in "Non-Reductive Physicalism" (from Objectivity, Relativism, and Truth, Cambridge: 1991, pp. 114-115) that one can be quite "prepared to say that every event can be described in micro-structural terms" while at once conceding that "[f]or most interesting examples of X and Y (e.g., minds and bodies, tables and particles) there are lots of true sentences about X's in which 'Y' cannot be substituted for 'X' while preserving truth... This is because any tool which has been used for some time is likely to continue to have a use...  a tool can be discarded... [but i]n such cases X-talk just fades away; not because someone has made a philosophical or scientific discovery that there are no X's... [nor] by 'linguistic analysis,' but, if at all, in everyday practice."
I am cheerful about the prospect that the free will indispensable to my sense of selfhood may be a perspectival or discursive effect, but however poetically or scientifically potent its jettisoning might eventually become, dispensing with it now would unquestionably be stupid and sociopathic rather than a better saying of the way the world is, or a speaking more in the language the universe prefers to be described in, or any nonsense of the sort.

I doubt that saying so would go very far toward convincing Jones -- any more than most transhumanists, for that matter -- that my own preferred philosophical and rhetorical arguments are more clarifying than their preferred technical skirmishing over the state-of-the-art and projected technodevelopmental timelines. But, again, I do worry that accepting enough figurative (rhetorical) and conceptual (philosophical) assumptions to have mutually intelligible "technical" discussions with techno-transcendentalists, especially when there really is no need to do so, simply concedes too much ground to them for resulting debunkery at its best to do much good -- they can always respond, after all, with minute nonthreatening qualifications or terminological shifts that leave you debating angels on pinheads at the level of detail interminably.

I am quite sure Jones is alive to this very worry, as he concludes with a practical consideration that looms large in my critiques of the futurologists as well:
[I]deas like mind uploading are not part of the scientific mainstream, but there is a danger that they can still end up distorting scientific priorities. Popular science books, TED talks and the like flirt around such ideas and give them currency... that influences -- and distorts -- the way resources are allocated between different scientific fields. Scientists doing computational neuroscience don’t themselves have to claim that their work will lead to mind uploading to benefit from an environment in which such claims are entertained by people like Ray Kurzweil, with a wide readership... I think computational neuroscience will lead to some fascinating new science, but you could certainly question the proportionality of the resource it will receive compared to, say, more experimental work to understand the causes of neurodegenerative diseases.
As I point out above, the effort critically to address techno-transcendental formulations on something like their own terms can smuggle prejudicial and reductive assumptions, frames, and metaphorizations into the discourse of even their critics in ways that circumscribe deliberation on these questions and so set the stage for the skewed public policy language and funding priorities and regulatory affordances that Jones points to here.

As a demonstration of how easily this can happen, notice that when Jones offhandedly declares that "[i]t’s unquestionably true, of course, that improvements in public health, typical lifestyles and medical techniques have led to year-on-year increases in life expectancy," the inevitable significance with which techno-transcendentalists freight such claims remains the furthest thing imaginable from an "unquestionable tru[th]" (Jones declares it "hollow" just a few sentences later) and yet the faith-based futurological frame itself remains in force even as he proceeds with his case: Needless to say (or it should be), improvements in prenatal care, childhood nutrition and disease treatment can yield year-on-year increases in life expectancy without year-on-year increases in life expectancy for people over the age of sixty-five, for example, and even if improvements in the treatment of heart disease and a few other chronic health conditions of older age yield some improvement for that cohort as well, this can and does remain compatible with absolute stasis of human longevity at its historical upper bound even if presently intractable neurodegenerative diseases are ameliorated, thus bedeviling altogether the happy talk of techno-immortalists pretending actuarial arrows on charts are rocketing irresistibly toward 150 year lifespans even in the absence of their handwaving about nanobotic repair swarms and angelic mindclone uploads. It is not an easy thing to address a critique to futurologists on terms they will not dismiss as hate speech or relativistic humanities mush and yet continue to speak sense at all. Richard Jones continues to make the effort to do so -- and succeeds far better than I do at that -- and for that I am full of admiration and gratitude, even if I devote my energies in response to his efforts to sounding warnings anyway.

Sunday, September 14, 2014

Kitten Has Clause

I have a social contract with my cat. It is full of claws.

Faith Based

There is no article of faith more fancifully supernatural than belief in a natural free market.

More Dispatches from Libertopia here.

Lost To Be Found

Contra privacy rights discourses, privacy-qua-privation is not only NOT threatened by algorithmic profiling but vastly expanded and imposed. The greatest danger of the surveilled/profiled algorithmically legible subject is that it threatens our public, not our private selves. What Arendt sought to describe and defend as "public happiness" is not available on the terms proffered by the false publicity of the surveilled/profiled subject. But in my classrooms and in union meetings, in public demonstrations and assemblies, both the pining and the pleasure Arendt is talking about are still palpable. The phenomenological inhabitation of democracy and its openings onto experimental political practices -- what Arendt called "The Lost Treasure of the Revolutionary Tradition" -- is always lost, the better to be found, and re-founding, in each new generation.

For more, check out my Twitterized Privacy Treatise and posts available under the "Surveillance" topic tag at The Superlative Summary.

I Me Me Mine

Every Apple product is a funhouse mirror, an iMe killing time.

More Fool Me Tee Vee here.

Think This Notion Has A Future?

Idea for sf movie, novel, game: tho' apparently undistinguished and from humble origins, straight white guy turns out to be The Chosen One!

Saturday, September 13, 2014

Blear of Death

I regularly hear from people the claim that the fear of death is universal. I honestly wonder if I am simply not understanding what people are trying to communicate when they say this. I mean, I do hope I am not in excruciating pain or completely isolated when I die -- but I am not thrilled at such prospects even when they are survivable; it's not death that is fearsome in them, as far as I can tell. Sure, I want to live, it's the only game in town. But, I dunno, I really truly find a fear of death weird. And preoccupation with that fear seems to me especially terrible and deranging, even a kind of death in life. Perfectly nondescript unaccomplished persons manage to die all the time. Honestly, how hard or how odd can it be? I don't get it.

Baby Talk

Futurology in its techno-transcendental moods is infantile wish-fulfillment, in its existential risk moods infantile attention seeking.

More Futurological Brickbats here.

Thursday, September 11, 2014

Today's Random Wilde

A community is infinitely more brutalized by the habitual employment of punishment, than it is by the occasional occurrence of crime.

Wednesday, September 10, 2014

Booting the Boots

Weird how important the figure "boots on the ground" has become, especially given its ambiguous relation to literal boots on the ground.