Using Technology to Deepen Democracy, Using Democracy to Ensure Technology Benefits Us All
Tuesday, December 30, 2014
Top Posts for 2014
14. "Summoning the Demon": Robot Cultist Elon Musk Reads from Robo-Revelations at MIT October 27
13. Gizmuddle: Or, Why the Futuristic Is Always Perverse January 25
12. The Future Is A Hell of a Drug April 7
11. Car Culture Is A Futurological Catastrophe January 14
10. Very Serious Robocalyptics October 5
9. Em Butterfly: Robot Cultists George Dvorsky and Robin Hanson Go Overboard For Robo-Overlords February 24
8. Robot Cultist Martine Rothblatt Is In the News September 9
7. Geek Rule Is Weak Gruel: Why It Matters That Luddites Are Geeks September 19
6. R.U. Sirius on Transhumanism October 19
5. Rachel Haywire: Look At Me! Look At Me! Even If There's Nothing To See! August 18
4. It's Now Or Never: An Adjunct Responds to SFAI's Latest Talking Points May 5
3. Techbro Mythopoetics December 22
2. San Francisco Art Institute Touts Diego Rivera Fresco Celebrating Labor Politics While Engaging in Union Busting May Day
...and number 1. Forum on the Existenz Journal Issue, "The Future of Humanity and the Question of Post-Humanity" March 9
To round the list out to a nice full fifteen, I append not a hit but a miss, a post fewer people got a kick out of the first time around than I expected, given what most people come here to read: Tragic Techbrofashionistas of The Future Put. A. Phone. On. It! from January 6.
Apart from that last addition, these are essentially the most widely read of this year's posts, excluding a few popular but comparatively insubstantial one-liners. I'll share a few observations about these in the annual State of the Blog post to be written hungover from my bunker come the new year. You can compare these to the listicles from the last couple of years if you like: Top Posts for 2012 and Top Posts for 2013.
Friday, December 26, 2014
The Inevitable Cruelty of Algorithmic Mediation
Also posted at the World Future Society.
On Christmas Eve, Eric Meyer posted a devastating personal account reminding us of the extraordinary cruelty of the lived experience of ever more prevailing algorithmic mediation.
Meyer's Facebook feed had confronted him that day with a chirpy headline that trilled, "Your Year in Review. Eric, here's what your year looked like!" Beneath it, there was the image that an algorithm had number-crunched to the retrospective forefront, surrounded by clip-art cartoons of dancing figures with silly flailing arms amidst balloons and swirls of confetti in festive pastels. The image was the face of Eric Meyer's six-year-old daughter. It was the image that had graced the memorial announcement he had posted upon her death earlier in the year. Describing the moment when his eye alighted on that adored, unexpected gaze, now giving voice to that brutally banal headline, Meyer writes: "Yes, my year looked like that. True enough. My year looked like the now-absent face of my little girl. It was still unkind to remind me so forcefully."
Meyer's efforts to come to terms with the impact of this algorithmic unkindness are incomparably more kind than they easily and justifiably might have been. "I know, of course, that this is not a deliberate assault. This inadvertent algorithmic cruelty is the result of code that works in the overwhelming majority of cases." To emphasize the force of this point, "Inadvertent Algorithmic Cruelty" is also the title of Meyer's meditation. "To show me Rebecca’s face and say 'Here’s what your year looked like!' is jarring," writes Meyer. "It feels wrong, and coming from an actual person, it would be wrong. Coming from code, it’s just unfortunate." But just what imaginary scene is being conjured up in this exculpatory rhetoric in which inadvertent cruelty is "coming from code" as opposed to coming from actual persons? Aren't coders actual persons, for example?
Needless to say, Meyer has every right to grieve and to forgive and to make sense of these events in the way that works best for him. And of course I know what he means when he seizes on the idea that none of this was "a deliberate assault." But it occurs to me that it would have required only the least imaginable measure of thought on the part of those actually responsible for this code to recognize that the cruelty of Meyer's confrontation with their algorithm was the inevitable, at least occasional, result for no small number of the human beings who use Facebook and who live lives that attest to suffering, defeat, humiliation, and loss as well as to parties and promotions and vacations. I am not so sure the word "inadvertent" quite captures the culpability of those humans who wanted and coded and implemented and promoted this algorithmic cruelty.
And I must say I question the premise of the further declaration that this code "works in the overwhelming majority of cases." While the result may have been less unpleasant for other people, what does it mean to send someone an image of a grimly-grinning, mildly intoxicated prom-date or a child squinting at a llama in a petting zoo surrounded by cartoon characters insisting on our enjoyment and declaring "here's what your year looked like"? Is that what any year looks like or lives like? Why are these results not also "jarring"? Why are these results not also "unfortunate"? Is any of this really a matter of code "working" for most everybody?
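To make that foreseeability concrete, consider a minimal sketch of the kind of selection logic such a "Year in Review" feature might plausibly use. This is my own hypothetical illustration, not Facebook's actual code: the point is only that a selector which optimizes for engagement, with no notion of what the engagement was about, will by design occasionally surface exactly the images its users least want thrown back at them in confetti.

```python
# Hypothetical sketch only -- not Facebook's implementation. It illustrates
# how a naive "most engaged photo of the year" heuristic guarantees the
# occasional cruelty discussed above: grief draws engagement too.

from dataclasses import dataclass
from typing import List


@dataclass
class Photo:
    url: str
    likes: int
    comments: int
    shares: int


def engagement(photo: Photo) -> int:
    # A weighted count of reactions. Nothing here knows *why* people
    # reacted -- celebration, condolence, and outrage all score alike.
    return photo.likes + 2 * photo.comments + 3 * photo.shares


def year_in_review_cover(photos: List[Photo]) -> Photo:
    # Pick the single most engaged photo and wrap it in cartoon confetti.
    # A memorial announcement, which typically draws enormous engagement,
    # is by this logic the "best" image of the year.
    return max(photos, key=engagement)
```

Nothing in such a sketch is exotic or malicious; the harm is a predictable consequence of what was chosen to be measured and what was chosen to be ignored.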
What if the conspicuousness of Meyer's experience of algorithmic cruelty indicates less an exceptional circumstance than the clarifying exposure of a more general failure, a more ubiquitous cruelty? Meyer ultimately concludes that his experience is the result of design flaws which demand design fixes. Basically, he proposes that users be provided the ability to opt out of algorithmic applications that may harm them. Given the extent to which social software forms ever more of the indispensable architecture of the world we navigate, this proposal places an extraordinary burden on those who are harmed by carelessly implemented environments they come to take for granted while absolving those who build, maintain, own, and profit from these environments of the harms resulting from their carelessness. And in its emphasis on designing for egregious experienced harms, this proposal disregards costs, risks, and harms that are accepted as inevitable when they are merely habitual, or that vanish in their diffusion over the long term as lost opportunities hidden behind given actualities.
But what worries me most of all about this sort of "opt out" design-fix is that with each passing day algorithmic mediation is more extensive, more intensive, more constitutive of the world. We all joke about the ridiculous substitutions performed by autocorrect functions, or the laughable recommendations that follow from the odd purchase of a book from Amazon or an outing from Groupon. We should joke, but don't, when people treat a word cloud as an analysis of a speech or an essay. We don't joke so much when a credit score substitutes for the judgment whether a citizen deserves the chance to become a homeowner or start a small business, or when a Big Data profile substitutes for the judgment whether a citizen should become a heat signature for a drone committing extrajudicial murder in all of our names. Meyer's experience of algorithmic cruelty is extraordinary, but that does not mean it cannot also be a window onto an experience of algorithmic cruelty that is ordinary. The question whether we might still "opt out" from the ordinary cruelty of algorithmic mediation is not a design question at all, but an urgent political one.
Thursday, December 25, 2014
Contextualizing My Anti-Futurological Critique for Theoryheads
- This rather densely allusive sketch contextualizing my anti-futurological critique won't be everybody's cup of tea, but I've upgraded and adapted it from my response to a comment in the Moot for those readers who find this sort of thing useful but who would likely miss it otherwise. I still think probably the best, most concise and yet complete(-ish) formulation of my critique is the contribution I published in the recent Existenz volume on posthumanism.
You accuse me of indulging in futurism while critiquing it. All the big boys make such moves; I don't think less of you for trying it. I have heard what you have to say so far, and I must say it seems to me you are baldly wrong to say this, and that believing it sends you off-track...
What I mean by futurism has its origins in specific institutional histories and discursive practices: namely, the emergence of fraudulent methodologies/rationales of speculation in market futures and the extrapolative genre of the scenario in military think-tanks -- all taking place in the wider context of the suffusion of public deliberation and culture with the hyperbolic and deceptive techno-progressive norms and forms of consumer advertising... To give you a sense of where I am coming from and to give you a sense of what I am hearing when you say "modernity" and how I might try to take us elsewhere with futurity-against-futurology, I provide this handy sketch:
To the extent that post-modernity (late modernity, a-modernity, neoliberalism, whatever) is the post-WW1/2 inflation of the petrochemical bubble in which other postwar financial bubbles are blown, my anti-futurology is of a piece with Lyotard's (whatever my differences with him, of which I have many, he makes some of the same warnings). To the extent that futurism markets elite-incumbency as progress, my anti-futurology is also of a piece with some of Debord's critique of the Spectacle, so-called (the parts about "enhanced survival" in particular), specifically to the extent that Debord's tale of "being degraded into having degraded into appearing" derives from Adorno's culture industri(alization) as formula-filling-mistaken-for-judgment and Benjamin's War Machine as the displacement of a revolutionary equity-in-diversity from the epilogue of Art in the Age.
Your emphasis seems more attuned to aesthetic modernities, so the larger context for me is the proposal that between the bookends of Thirty Years' Wars from Westphalia to Bretton Woods European modernity indulged in a host of querelles des anciens et des modernes, culture wars presiding over and rationalizing the ongoing organization of social militarization/administration of nation-states and their competitive internationalism.
"The Future" of futurisms in my sense arises out of those discourses. Design discourses are especially provocative for my critical position, for example, since they are patently futurological -- at once doing and disavowing politics; peddling plutocracy qua meritocracy via the Merely Adequate Yet Advancement through their exemplary anti-democratzing Most Acceptable Yet Advanced MAYA principle -- but still quite modern in what I think is your sense of the term. This matters because futurological global/digital rationality is for me an importantly different phenomenon than the modern that constitutes itself in the repudiation of the ancient: the futurist for me is in between, at once a vestige of modern internationalism and a harbinger of post-nationalist planetarity.
Planetarity is a term I am taking from Spivak, and my sense of where we are headed -- if anywhere -- is informed by queer/critical race/post-colonial/environmental justice theories like hers. In my various theory courses I usually advocate in my final lecture (the one with the final warnings and visions in it) for a polycultural planetarity -- where the "polyculture" term resonates with Paul Gilroy's post-Fanonian convivial multiculturalism as well as with the repudiation of industrial monoculture for companion planting practices in the service of sustainability (but also synecdochic for sustainable political ecology), and then the "planetarity" term marks the failure/eclipse of nation-state internationalism (say, UN-IMF-World Bank globalization) in digital financialization, fraud, marketing harassment, and surveillance and ecological catastrophe. Polycultural planetarity would build ethics and mobilize democratizations via contingent universalization (that's from my training with Judith Butler no doubt) in the future anterior (a Spivakian understanding of culture as interpretation practices toward practical conviviality). For me, that future anterior is the futurity inhering in the present in the diversity of stakeholders/peers to presence, very much opposed to the closures, reductions, extrapolations, instrumentalizations of "The Future."
Lots of name-dropping there, I know, but almost every phrase here can easily turn into a three-hour lecture, I'm afraid, in one of my contemporary critical theory survey courses. I suspect you might be tempted to assimilate all that feminist/queer/posthuman/critical race theoretical complex to the categories you already know -- forgive me if I have jumped to conclusions in so saying -- but I think that would be an error, more an effort to dismiss and hence not have to read the work than to think what we are doing as Hannah Arendt enjoined, the call I hear every day that keeps me going.
Monday, December 22, 2014
Techbro Mythopoetics
In an enjoyable rant over at io9 today, Charlie Jane Anders declares herself Tired of "The Smartest Man in the Room" science fiction trope. Her delineation of the stereotype is immediately legible:
The "smartest man in the room" is a kind of wish-fulfillment for reasonably smart people, because he's not just clever but incredibly glib. As popularized by people like Doctor Who/Sherlock writer Steven Moffat and the creators of American shows like House and Scorpion, the "smartest guy in the room" thinks quicker than everybody else but also talks rings around them, too. He's kind of an unholy blend of super-genius and con artist. Thanks to the popularity of Sherlock, House and a slew of other "poorly socialized, supergenius nerd" shows, the "smartest man in the room" has become part of the wallpaper. His contempt for less intelligent people, mixed with adorable social awkwardness, and his magic ability to have the right answer at every turn, have become rote.Later, she offers up a preliminary hypothesis that the intelligibility and force of the archetype derives from the widespread experience of consumers who feel themselves to be at the mercy of incomprehensible devices and therefore of the helpful nerds in their lives who better understand these things. I actually don't think the world is particularly more technologically incomprehensible now than it has always somewhat been in network-mediated extractive-industrial societies, but tech-talkers like to say otherwise because it consoles them that progress is happening rather than the immiserating unsustainable stasis that actually prevails, but that is a separate discussion. I do think Anders strikes very much the right note when she declares The Smartest Guy in the Room archetype a "wish-fulfillment fantasy," but I am not sure that I agree with her proposal about how the fantasy is operating here.
What is perplexing about the Smartest Guy in the Room archetype, as well as for the more ubiquitous savvy but awkward nerd archetype, is the combination in it of superior knowledge and social ineptitude. Anders proposes that this fantasy space is doubly reassuring -- securing our faith that helpful people will always be around to navigate the incomprehensible technical demands of the world, but that we need not feel inferior in our dependency because these helpful people gained their superior knowledge at the cost of a lack of basic social skills nobody in their right mind would actually choose to pay. The gawky awkward nerd is as obviously inferior as superior, we get to keep our toys with our egos intact, and everybody wins (even the losers).
All this sounds just idiotically American enough to be plausible, but it seems to assume that few of her readers -- or anybody, for that matter -- actually identify with the nerds. Anders seems to have forgotten that she begins her piece with the assertion that The Smartest Guy in the Room is "wish-fulfillment for reasonably smart people," that is to say, the self-image of her entire readership. And of course the truth is that nearly every one of her readers does identify with the archetype; indeed the archetype is a space of aspirational identification in culture more generally, an identification which fuels much of the lucrative popularity and currency of spectacular science fiction and fantasy and geekdom more generally in this moment. That is the real problem that makes the phenomenon Anders has observed worthy of criticism in the first place.
Anders describes the Smartest Guy in the Room as someone who has "contempt for less intelligent people, mixed with adorable social awkwardness, and [a] magic ability to have the right answer at every turn." It is crucial to grasp that what appears as a kind of laundry list here is in fact a set of structurally inter-dependent co-ordinates of the moral universe of The Smartest Guy in the Room. He doesn't just happen to be right all the time and socially awkward and contemptuous of almost everybody else: his sociopathic contempt is the essence of his social awkwardness, rationalized by his belief that he is superior to them because he is always right about everything, at least as he sees it.
Before I am chastised for amplifying harmless social awkwardness into sociopathy, let me point out that the adorable nerds of Anders' initial formulations are later conjoined to a discussion of Tony Stark, the cyborgically-ruggedized hyper-individualist bazillionaire tech-CEO hero of the Iron Man blockbusters. Although Anders describes this archetype in terms of its popular currency in pop sf narrative and fandom today, I think it is immediately illuminating to grasp the extent to which Randroidal archetypes Howard Roark, Francisco d'Anconia, Henry Rearden, and John Galt provide the archive from which these sooper-sociopath entrepreneurial mad-scientist cyborg-soldiers are drawn (if you want more connective tissue, recall that Randroidal archetypes are the slightest hop, skip, and jump away from Heinleinian archetypes and now we're off to the races).
The truth is that there is no such thing as the guy who knows all the answers, or who solves all the problems. Problem-solving is a collective process. There is more going on that matters than anybody knows, even the people who know the most. Even the best experts and the luminous prodigies stand on the shoulders of giants, depend on the support of lovers and friends and collaborators and reliable norms and laws and infrastructural affordances, benefit from the perspectives of critics and creative appropriations. Nobody deserves to own it all or run it all, least of all the white guys who happen to own and run most of it at the moment, and this is just as true when elite-incumbency hides its rationalizations for privilege behind a smokescreen of technobabble.
The sociopathy of the techno-fixated Smartest Guy in the Room is, in a word, ideological. Anders hits upon an enormously resonant phrasing when she declares him "an unholy blend of super-genius and con artist." In fact, his declared super-genius is an effect of con-artistry -- the fraudulent cost- and risk-externalization of digital networked financialization, the venture-capitalist con of upward-failing risk-fakers uselessly duplicating already available services and stale commodities as novelties, the privatization of the "disruptors" and the precarization of "crowdsource"-sharecropping -- and of the "unholy" faith on the part of libertechbrotarian white dudes that they deserve their elite incumbent privileges.
Perhaps this is a good time to notice that when Anders says the Smartest Guy in the Room provides "wish-fulfillment for reasonably smart people" her examples go on to demonstrate that by people she happens always to mean only guys and even only white guys. She does notice that the Smartest Guy does seem to be, you know, a guy and provides the beginnings of a gendered accounting of the archetype: "the 'smartest guy' thing confirms all our silliest gender stereotypes, in a way that's like a snuggly dryer-fresh blanket to people who feel threatened by shifting gender roles. In the world of these stories, the smartest person is always a man, and if he meets a smart woman she will wind up acknowledging his superiority."
That seems to me a rather genial take on the threatened bearings of patriarchal masculinity compensated by cyborg fantasizing, but at least it's there. The fact that the Smartest Guy keeps on turning out to be white receives no attention at all. This omission matters not only because it is so glaring, but because the sociopathic denial of the collectivity of intelligence, creativity, progress, and flourishing at the heart of the Smartest Guy in the Room techno-archetype is quite at home in the racist narrative of modern technological civilization embodied in inherently superior European whiteness, against which are arrayed not different but primitive and atavistic cultures and societies that must pay in bloody exploitation and expropriation the price of their inherent inferiority. That is to say, the Smartest Guy in the Room is also the Smartest Guy in History, naturally enough, with a filthy treasure pile to stand on and shout his superiority from.
From the White Man's Burden to Yuppie Scum to Techbro Rulz, the Smartest Guy in the Room is one of the oldest stories in the book. And, yeah, plenty of us are getting "kind of tired" of it.
Sunday, December 21, 2014
Why Our Militant Atheists Are Not Secular Thinkers
Secularity -- from the Latin saecularis, worldly, timely, contingent -- properly so called, is very much a pluralist and not an eliminationist impulse. In naming the distinction of worldly affairs from spiritual devotions, it differentiated the good life of the vita contemplativa of philosophy from that of the vita activa of the statesman or aesthete, but later went on to carve out the distinctions of clerical from governmental, legal, and professional authorities. The Separation of Church and State as a pillar of secular thinking and practice is the furthest imaginable thing from sectarian or ethnic strife amplified by the eliminationist imagination into genocidal violence -- and yet the identification of today's militant atheists with a "secular worldview" risks precisely such a collapse.
Secularism has never demanded an anti-religiosity but recognized the legitimacy of non-religiosities. Indeed, in diverse multicultures such as our own secularism becomes indispensable to the continuing life of religious minorities against majority or authoritarian formations of belief, and hence is not only not anti-religious but explicitly facilitative of variously religious lifeways as it is of variously non-religious lifeways.
I have been an atheist since 1983 -- over thirty years by now! -- after a Roman Catholic upbringing. I am quite happy to live a life a-theist -- "without god(s)" -- myself, but the primary value of secularism to me has always been its entailment of and insistence on a pluralist practice of reason, in which we recognize that there are many domains of belief distinguished in their concerns, in their cares, and in the manner of their convictions. Our scientific, moral, aesthetic, ethical, professional, political beliefs, and so on, occupy different conceptual and practical domains, incarnate different registers of our lives, are warranted by different criteria. For the pluralist, reason is not properly construed as the monomaniacal reduction of all belief to a single mode, but a matter of recognizing what manner of concern, care, and conviction belief is rightly occasioned for and then applying the right criteria of warrant appropriate to that mode.
Pluralism is not a relativism or nihilism, as threatened bearings of fundamentalist belief would have it, but a rigorous reasonableness equal to the complexity, dynamism, and multifaceted character of existence and of the personalities beset by its demands and possibilities. For one thing, pluralism allows us to grasp and reconcile the aspirational force of the contingent universalism of ethics without which we could not conceive let alone work toward progress or the Beloved Community of the we in which all are reconciled, while at once doing justice to the fierce demands and rewards in dignity and belonging deriving from our (inevitably plural, usually partial) inhabitation of moral communities that build the "we" from exclusions of various construals of the "they." Pluralism allows us to reconcile as well our pursuit of the private perfections of morality and sublimity (my appreciation of the aesthetical forms of which requires my admission of the validity for others, whatever my atheism, of its faithly forms) with the public works of scientific, political, legal, professional progress.
It is crucial to grasp that the refusal of pluralism is reductionism, and that reductionism is an irrationalism. It is a form of insensitivity, a form of unintelligence -- and usually a testimony to and inept compensation for insecurity. In Nietzsche's critique of the fetish (Marx's commodity fetishism and Freud's sexual fetishes are surface scratches in comparison) this reductionism is the ressentimental attitude of the life of fear over the lives of love, the philosophical imposture of deception and self-deception peddled as truth-telling. To impose the criteria of warrant proper to scientific belief on moral belief, say, or on aesthetic judgment, or on legal adjudication is to be irrational, not rational. Also, crucially, it is to violate and not celebrate science.
To call the celebrated (or at any rate noisy) militant atheistic boy warriors of today "secular thinkers" is a profound error. To misconstrue the moralizing misapplication of faithly norms to political practices as the sins of religious faith as such is to misunderstand the problem at hand -- and usually in a way that multiplies errors: hence, our militant atheists become bigots tarring innocent majorities with the crimes of violent minorities; they lose the capacity to recognize differences that make a difference in cultures, societies, and individuals, all the while crowing about their superior discernment.
Those who commit crimes and administer tyrannies in the name of faith irrationally and catastrophically misapply the substantiation of aesthetic sublimities and parochial mores connected to some among indefinitely many forms of religiosity to domains of ethical aspiration and political progress to which they are utterly unsuited. Fascism and moralizing are already-available terms for these too familiar irrational misapplications. Meanwhile those who attribute these crimes and tyrannies to the aesthetic and the moral as such, as practiced in variously faithful forms, are inevitably indulging in reductionism. This reductionism in its everyday stupidity is usually a form of ethnocentric subcultural parochialism, but the militant atheists prefer their stupidity in the form of scientism, usually assuming the imaginary vantage of a superior scientificity the terms of which presumably adjudicate the unethical in moralizing and the tyrannical in progressivity because it subsumes ethical and political domains within its own scientific terms. In this, scientism first distorts science into a morality which it then, flabbergastingly, distorts into a moralism itself, thus mirroring the very fundamentalism it seeks to critique.
Secularism is a theoretical and practical responsiveness to the plurality of a world in which there is always more going on that matters in the present than any of us can know and in which the diversity of stakeholders to the shared present interminably reopens history to struggle. It is bad enough that today's militant atheists get so much of the substance and value of science, taste, and faith wrong in their disordering rage for order, but in calling their reductionist irrationality "secular thinking" we risk losing the sense and significance of the secular altogether, that accomplishment of reason without which we can never be equal to the demands and promises of reality and history in the plurality of their actual presence.
Saturday, December 20, 2014
Consolidated A Quarter Century Later
I was listening to this with righteous fury back in Atlanta as a Queer National, vegan hard-ass (these days I'm cheerfully vegetarian and prefer my queerness post-nationalist), and budding socialist feminist green, writing an MA thesis connecting queer theory and technocultural theory (a trace of which survives here). I had not the slightest suggestion of a hope back then that I would be in San Francisco working with my hero Judith Butler in just two years' time. That was wonderful, even a little miraculous. But I did have great hope and conviction then that those songs would no longer be so thoroughly relevant to the America of 2014, a generation away, an America of ongoing unemployment and lowered expectations and still-profiteering banksters, of SillyCon fraudsters, of racist police, of rising Greenhouse storms. That has not been so wonderful, not so miraculous.
Thursday, December 04, 2014
Cop-Cam Sham: Political Problems Demand Political Solutions
Once again we are confronted with another miscarriage of justice as another police officer kills another unarmed black citizen the police are supposed to serve and protect. And once again calls are ringing out on all sides to install more cameras, cameras on police cars, cameras on the street, cameras on the bodies of cops on the beat.
Cop-cam techno-fixers really need to pause and take note: Eric Garner's death by a clearly illegal choke hold was on video and was seen by millions.
Solutions from scholars and activists and experts have been reiterated and mostly ignored for over a generation by now: setting up independent special prosecutors to address charges of police misconduct rather than grand juries composed of colleagues in the criminal justice system with inherent conflicts of interest; extensive training for police in violence de-escalation strategies and in sensitivity to racial and other empirically well-established forms of bias, unconscious and conscious; hiring and promotion policies to reflect the composition of the communities police are meant to serve and protect; community policing, oversight, and accountability; ending the harsh sentencing rules installed by the failed racist war on (some) drugs; commonsense gun safety regulations -- all of these and more are indispensable to address the ongoing terrorization of vulnerable communities by police in all our names. If I point out that procedures are techniques and regulations are legal artifacts, can technofixated futurists get behind these or similar proposals, even if they are not polished chrome and shaped like dildoes?
Of course, more body cameras for police on the street can and probably should be part of the story of better policing practices in our communities. I have nothing against that proposal except the pretense that cameras are "the solution."
It is crucial to grasp that the interpretation of camera footage is stratified and shaped by the same racism that shapes and stratifies the racist policing so many are talking about here: the footage is taken up in the context of the very institutional practices and procedures that are otherwise failing so conspicuously before our eyes. The same collegial incentives that now protect police from accountability would pressure those who presumably guard the footage. Think of "lost" e-mails, selective leaks of secret testimony, orchestrated press releases shaping public perceptions: more video surveillance footage is more mountain to mold.
It is a strange thing, to say the least, to propose more surveillance as the ready-made solution to the unjust policing of people of color who have lived for generations under a regime of relentless onerous arbitrary surveillance as the substance of much of that unjust policing. Stop and frisk is a surveillance technique, you know. That policing has not been reformed even in the face of generations of obvious, ongoing failures should sound a warning that justice does not flow automatically from the visibility of injustice alone.
State-sanctioned violence against black people, from slavery, to Jim Crow, to inequitable incarceration and policing, is centuries old: it is not incidental to the justice system but an abiding, historically constituent element of it. Political problems demand political solutions. In this context, technofixated dreams of circumventing the political with handy gizmos amount to affirmations of the politics of the reactionary racist status quo. My point in saying so is not to call the techno-fixated racist, but to appeal to their anti-racism to impel them to dig deeper than their usual techno-fixation and to take on this long ongoing crisis on the educational, agitational, organizational terms it actually demands.
Monday, December 01, 2014
Personal
To say the personal is political is to notice that in becoming the personal some politics is de-politicized, with political consequences.
Saturday, November 29, 2014
Futurology Defined
The futurological in my sense of the term is an ideological formation; it is essentially a marketing discourse amplifying the profits and authority of incumbent elites by mobilizing seductive and reassuring techno-transcendental wish-fulfillment fantasies in the form of unaccountable, apparently predictive, promissory, or even prophetic utterances in which the deceptive, hyperbolic norms and forms of promotion and advertising already suffusing our public life take on the coloration and intensity of outright organized religiosity: for example, in the guiding narratives of mainstream corporate-military think-tanks, in popular consumer fandoms for Apple products or celebrity CEOs, or in marginal futurist subcultures like transhumanism.
Friday, November 14, 2014
Techbros Are Not Geeks
I find it utterly bewildering that the public face of geekdom more and more seems to be becoming venture capitalist skim-scam operations, evopsycho douchebaggery, GamerGate bigots, the anti-intellectual MOOCification of the Academy, and googlediculous soopergenius "thought-leaders" peddling futurological flim-flam. It's bad enough that most "thought leaders" aren't leading anything, but it's pretty plain they aren't even thinking.
I mean, is the unwatchably stale Cheez Whiz of "The Big Bang Theory" really supposed to have its finger on the pulse of some vital cultural phenomenon? Arts and crafts fairs rebranded as Maker Faires, learning annex courses and funding pitches rebranded as TEDtalkfotainments, superfluous duplications of existing goods and services given websites and then branded as a New Economy Tech Explosion, and always everywhere the corporate logos all watching over with loving grace... didn't we already do this dreary disaster in the irrationally exuberant dot.bomb 90s?
Since when has geekdom been so crass, so dumb, so monotonal? Where did all the camp, all the weirdos, all the elven/vulcan-eared librarians, all the dirty fucking hippies go?
I hate to be the one to break it to anybody, but the Federation is a multicultural socialist democracy -- Star Fleet is where they send the stale pale males who can't quite get with the laid back abundance program of their better adjusted fellow citizens.
You know, when I was in High School the only people I could talk about Star Trek and Dune with were theater geeks, and Model UN nerds, and downlow feminists on the pom pom squad in my honors English classes who played dumb for boyfriends they liked to make fun of when they weren't making out with them. Later, my geeks were in Queer Nation, loved John Waters as much as Star Wars, and pored over dusty archives for years dissertating on medieval French poets.
Speaking only for myself, of course, but these days when I am really geeking out to my heart's content it probably means I am reading Nnedi Okorafor, listening to Janelle Monae, or watching a roundtable of social scientists and cultural critics and activists on Melissa Harris-Perry's show. Geekery has never looked or felt or seemed to me to be anything remotely like the white-racist patriarchal corporate-militarist yuppie scumbaggery that is now getting marketed as "geekery."
Of course, every cultural formation is diverse in ways that exceed any parochial vantage on it and is stratified by the legacies and agonies of historical placement. While they often, all too often, exhibited such frailties, for me, geeks have never been about greed or exclusion or reactionary politics or consumer conformity. For me, geeks have been about resisting the forces and forms with which they seem increasingly to be identified in public discourse. The New Geekdom is the old bleak-dumb.
I say, Kol-Ut-Shan! and die techbro scum! (A paradox, you say? Mine is a geekery that thrives on paradoxes.)
Friday, October 17, 2014
"Overcoming Our Biological Limits"
It is a techno-transcendental commonplace to pretend that medicine is confounding long definitive biological limitations in human health, capacity, lifespan, and so on. These claims are usually just completely false, and even when they are not they are always overblown. The Pill provides a fine illustration of the point, since it truly did enable a fundamental and profoundly emancipatory transformation of the human condition, in my view, but without rendering humanity "post-human" in the sense that interests the superlative futurologists in the least.
I see no reason to expect any radical enhancement of definitive human capacities, or any increase of lifespan beyond the upper bound some lucky humans have always enjoyed in recorded history (although I would like to think medical improvements might enable many more humans to share in that bit of luck), of the sort that would render the terms "posthuman" or "transhuman" any more apt now than they have been since World War II as characterizations of human beings. But quite apart from this sort of well-warranted skepticism about imminent techno-transcendental expectations -- whether originating in ill-informed credulity, promotional fraud, pseudo-science, or wish-fulfillment fantasizing -- I must say that I find techno-transcendental interpretations of such projected outcomes profoundly wrongheaded in principle, even if they were not also wildly implausible or premature.
On the one hand, these futurologists seem too eager to treat biological limits as self-evident givens, when the terms of bodily legibility and the significance with which biological traits and capacities are freighted are in fact historically varied, constructed and contingent. On the other hand, futurologists seem to dismiss the extent to which salient continuities in human life -- and especially a shared vulnerability to suffering, correction, injury, abuse, neglect, disease, mortality -- have provided a context within which humans have testified together to our hopes and our histories, a context out of which humans have elaborated our still fledgling morals, ethics, politics, aesthetics. If our technique ever truly were to confound long definitive human limits, such as they are, this would hardly be experienced as an ecstatic overcoming of all limits but as a confrontation with new, and utterly bedeviling, limits to our sense of sharable experience and shared significance.
Saturday, October 04, 2014
"Non-Religious"
Although I am an atheist myself, I am personally less interested in the political significance of explicit atheist identifications than I am in the vast and rising self-description "nonreligious." The weirdly sectarian skirmishes over precisely correct atheist identifications seem to me as ugly and useless as the religious kind, for the most part, especially when it is white guys doing the skirmishing online. But the rising numbers of people who are simply doing without god -- a good word for which might be, what do you know, a-theist, "without god" -- as any kind of important organizing idea in their actual daily lives (whether or not they feel particularly disposed to argue or even dwell on the question) seem to me to be contributing to the very wholesome secularization, diversification, planetization of American society.
Friday, September 19, 2014
Geek Rule Is Weak Gruel: Why It Matters That Luddites Are Geeks
I received an invitation Monday night to contribute a mini Op-Ed to the New York Times' "Room for Debate" forum, responding to the question, "What does it mean when geek culture becomes mainstream?" Seven contributions appeared last night, including mine. They all contain worthy insights, and my own piece isn't my favorite among them, though mine is the only one I agree with (given the passage of time, even that isn't a given). The headline for the forum now reads When Geeks Rule. The introduction of the theme of "Rule" actually changes my sense of the question, but I will set that aside for a moment. I assume that all the resulting essaylets were shaped by the same forces impinging on mine -- three hundred words as a rapidly looming horizon, a no less rapidly looming deadline, and an editorial scalpel wielded by someone with a "readability" agenda, angels and ministers of grace defend us. Who knows what subtleties, qualifications, and questions were left on the cutting room floor for my fellow participants?
I must say that I bristled a bit at the premise of the question itself. In every version of my contribution to the debate I included this first sentence: "There has never been a monolithic geek culture and geeks have never exhibited a singular profile, and so one important part of what happens as geek-identification becomes more 'mainstream' is that its diversity becomes more visible." My piece went several rounds with my firm but friendly editor, and over and over that opening sentence was excised -- and over and over I kept replacing it. It did not appear in the final version, by which time I had more or less given up. I decided to take the retention of my Vulcan quotation and the derisive gesture at "techbros" as hard won personal victories and move on. Clearly, my editor thought this initial observation about the irreducibility of geekdom to any one fetishized symptom or signifier redundant in an essay treating geek enthusiasms as a motor of diversification more generally (which is the emphasis of the essaylet as it stands, I'd say).
I still do wish that initial frame for the essaylet had remained. For me, the essence of geekery is enthusiasm: it is an appreciation of appreciation. Wil Wheaton famously captured the spirit of what I am talking about in a blog post and video that was deliriously circulated a few years back -- I assume its popularity signifies that it articulated unusually well something many geeks felt about themselves already -- saying of geek gatherings that in them one is "surrounded by people who love the same things you love, the way you love them. But... also... by people who love things you don’t even know about, but you love your respective things in the same way, so you get to love your thing enthusiastically, completely, unironically, without fear of judgement." Geekery, then, is a celebration of the pleasures and knowledges that uniquely derive from close, sustained attention to subjects. And geekery is indifferent to whether or not the subjects being attended to are conventionally appreciated, hence it becomes a site of diversity the "mainstreaming" or ramification of which might facilitate diversity more generally. That is my essaylet's wee conceit, such as it is.
One of the few contributions that seemed to me to be on that very geeky wavelength was Zeynep Tufekci's essay, which insisted that joy is at the heart of geekery: non-judgmental joy, creativity, expressivity, experimentation, and the rough-and-tumble of making stuff. I was less pleased with the way her piece seemed to assimilate this making to what I regard as the mostly self-congratulatory, self-promotional BS of an entrepreneurial tech-sector start-up culture, and to the rather libertechbrotarian flavor-of-the-month "maker culture" and "Maker Faires" -- less about making than about being on the make, it seems to me, less about DIY than about privileged disavowals of interdependence, self-declared corporate-sponsored doers doing their dreary disrupting. A few years ago perhaps her go-to Maker vocabulary would have buzzed on about "Smart" blah-de-blah, a few years before that "Bright" this-n-that, a few years before that "Extreme," before that "Virtual," and so on (I'm sure I've missed a meme here or there, but to know one of them is really to know them all). I cannot say that I regard the for-profit tech-sector as a site of conspicuous creativity and ingenuity so much as a vast skim-scam operation repackaging stale useless crap to bamboozle rubes impressed by PowerPoint presentations and buzzy ad-copy, or appropriating the ideas and sweat of quiet coders collaborating amongst themselves without much fanfare and, usually, without much reward. Of course, those obscure coders are almost certainly geeks and they are indeed making stuff, but for me a teenage girl using a toothpick to paint a panel on the surface of a model of Kubrick's spacecraft Discovery, or a fifty-year-old sewing the bodice of an Arwen Evenstar gown for his outfit on the last night of a science fiction convention, are more illustratively enthusiastic makers invigorating geekery with its and our abiding joy -- even though nobody on hand expects to cash out that creativity in some big score.
Zaheer Ali penned what is probably my favorite of all the contributions, precisely because he exposed the diversity and dynamism of geekdom always-already against the grain of what seemed to me otherwise to be a series of rather sadly reductive mis-identifications of geekery with white techbros dreaming their dumb deadly VC dreams in the SillyCon Valley. Not only did Ali remind the Times readership of the thriving afro-futural musical and literary lineages (no Samuel Delany or Janelle Monae tho!) which are the site of so much of the aliveness of my own life-long geekery -- not least because for me the afro-futural has also been the indispensable site of so much vital subversive queergeekery -- but he also pointed to Melissa Harris-Perry and her #nerdland. It isn't only because I'm such a nerdland fan that I was happy to see Ali insist on the example, but also because it again went against the reductive grain of so many of the contributions otherwise: Harris-Perry's excellent show is such a geek-fest because it is an academic space filled with earnest activism -- folks who have been shaped by what Foucault described as the "grey, meticulous and patiently documentary" work of research and who retain the fierce pleasure of discovery and relevance of lives devoted to these intense attentions.
When, to the contrary, Judith Donath defines geekdom in her piece through its "affinity to math and science" I wonder whether all the science fiction geek shippers who don't know much science beyond what they learned from their Cosmos blu-rays and all the literary historian geeks smelling of a day in the archives but who flunked math are utterly invisible to her in all their geek glory. Donath writes that "Geeky fascination with what is undiscovered drives scientific curiosity and invention, but unmoored from the desire to learn and create, it's simply consumerism." I actually do not agree that all cultural reception and appropriation is "simply consumerism." But even so, her larger point that "obsessive video game playing" will not help solve "climate change, emerging diseases, vast inequality" is certainly true. But why exactly would one expect otherwise? I am not sure that playing video games is enough to make one a geek, any more than watching movies is, and I would like to hear much more about the work that is getting done in Donath's attribution of "obsessiveness" to such gamers, but neither can I accept Donath's identification of geekery with the rational, scientific thought that is indispensable (together with free and accountable democratic governance in the service of general welfare) to the solution of such problems. I think rationality and science are both much bigger and much older than geekery, and that geek enthusiasms are quite valuable in human lives even when they do not contribute to the also indispensably valuable problem-solving work of science, engineering, public investment and harm-reduction policy-making. So, too, reactionary unsustainable consumerism is obviously a bigger problem than is its capacity to colonize the geek imagination -- it isn't the fault of geeks that under capitalism all that is solid melts into air. As I tried to insinuate at least in my piece, geekery could use a bit more criticism along with its enthusiasm as an intellectual mode, but too strict an identification of geekdom with progressive science and policy or with reactionary consumerism seems to me to lose track of what geekdom actually is.
Just to be clear, it's not that I think geekery lacks the "affinity to math and science" Donath mentions and which so many geeks so conspicuously exhibit, any more than I deny the connection of geekery to "coding" that Kimberley Bryant and William Powers emphasize in their respective pieces. I simply want to refuse any reduction of geekdom to math nerds or software coders or what have you, whether the reduction is made in a spirit of sympathy or hostility to those geeks who happen to be math nerds or software coders. Again, as I said in my excised opening framing, "There has never been a monolithic geek culture and geeks have never exhibited a singular profile," and this really matters to me. To say that geeks are becoming mainstream because so many people use the term "app" in everyday conversation seems to me as obfuscatory as imagining that geeks became mainstream when folks started to talk about going to "the talkies." I happen to think the present ubiquity of handhelds no more indicates a rise in "tech savvyness" than did the ubiquity of office typewriters in the seventies. Am I really supposed to think that people stumbling around in the street staring at images of plates of food their friends have eaten immerses me in a more geeky world in some way? Why is that a world more geeky than one in which people are adjusting the dials of their radios or adjusting the rabbit ears on their tee vees or dipping their quills into inkwells? To put the point more urgently still, why on earth would the New York Times treat the ratings of the relentlessly predictable, unwatchably execrable "Big Bang Theory" as relevant to the subject at hand in any deep sense? Network execs assimilating social change into nonthreatening nonrepresentative cheez whiz isn't exactly anything new or interesting -- in fact it would take a whole hell of a lot of culture studies geeks in an academic conference to convince me there was anything more interesting to say about the "Big Bang Theory" than about "Full House" and "Two And A Half Men" (the proper generic assignment for that bleak business).
When I submitted my contribution for its first editorial bloodletting, I received the rather exasperated suggestion that perhaps I might want to jettison the piece and write instead about why I am a Luddite, refusing to live on the terms of this emerging geek mainstream. I had the sinking suspicion at that point that I had been invited to participate in this forum because of my contrarian anti-futurological critiques. While it is true that the techno-transcendental futurist discourses and formations I write about are indeed geek subcultures, they are far from representative of geekdom in my experience of it, and certainly the farthest thing from definitive of it. I critique futurological pseudo-science as a champion of consensus science, I critique futurological corporate-militarism as a champion of accountable evidentiary harm-reduction policy, I critique parochial plutocratic market futures as a champion of free futures, I critique futurist consumer fandoms as a fan myself of literary sf: in other words, my critiques are those of a lifelong queergeek defending a capacious geekdom on which I have depended for my flourishing, and sometimes for my sanity, all my life.
Contra Fredrik deBoer I believe both that geek enthusiasms are still happening at the margins, and that geek enthusiasms are still marginalizing folks. I don't agree that venture capitalists and sexist gamers are representative of geekery (although I am far from denying and I am full of decrying their terrible geek ways). Certainly I would never pretend that geekdom is somehow insulated from the white racist patriarchal extractive industrial corporate-militarist American society which incubates and marinates geekdom -- my piece concludes with observations and warnings very much to the contrary. But, again, I think that if one wants to get to the heart of geekdom, the better to understand the changes it might enable as it ramifies through digital networked media formations, it is important to get at actually representative and symptomatic figures. I don't deny that Bill Gates exhibits geek traits, but I do deny that there is anything characteristically geeky about the traits in Gates that make him rich, powerful, and famous in America.
Titanic "Geeks Rule" archetypes like Gates, Jobs, Wozniak attended or hung out in geek communities connected to great public universities. Many of their marketable notions were cribbed from the knowledge and efforts of geek researchers and geek enthusiasts who have vanished from history. The reframing of these figures as randroidal sooper-genius fountainheads of entrepreneurial innovation and beneficience disavowing their utter dependence on climates of intellectual discovery (and usually scads of military investment) is a plutocratic commonplace. These hushed up circumstances attending the birth of our ruling tech enterprises (at least Berners-Lee doesn't disavow the gestation of the Web itself in CERN) in such skim-scam operations is re-enacted endlessly in the life of such enterprises, as the ideas and wage-labor of cohorts of coders under contract or scooped up from the bubbling crap cauldron of the start-up lottery, arrive in their full flower in the ritual spectacles of hyper-individualized celebrity CEOs bringing out new (usually just repackaged) gewgaws gobbled up by technoscientifically illiterate pop-tech journalists who confuse gossip and informercial pieties for substance. All along this terrorizing trajectory from creative geek collaboration to eventual plutocratic profitability there are endless occasions for folks with ever more tenuous connections to actual efforts and ideas either to take credit or otherwise sell out their fellows in the hope of some piece of pie down the road. If I may say so, there is nothing particularly geeky about this all too conventionally American ugliness. I am the first to critique the deceptions of plutocratic entreprenurial capitalism and the devastations of soulless unsustainable consumerism -- but I simply think it is a mistake to reduce geekdom to either of these phenomena or to treat geekdom as a particularly illuminating window on their deadly play in the world.
I'll conclude with a word on the Luddites. I do not know what is worse from my perspective, that I was expected by the Times to be a Luddite and not a geek, or that the Times would seem to accept the facile characterization of the Luddites as "anti-technology" in a sense opposed to a no less facile characterization of "pro-technology" geeks. As I never tire of saying, there is no such thing as technology-in-general about which it makes any kind of sense to be either loosely "pro" or "con." The constellation of tools and techniques contains too many differences that make a difference for one to assume a sympathetic or antipathetic vantage on the whole, and it will be the assumptions and aspirations in whose service these tools and techniques are put that yield our sensible judgments of right and wrong in any case. To speak, as too many of the contributors to the forum did, I am afraid, of "technology" becoming more prevalent, guiding, approved of in society in some general way in connection with the mainstreaming of geekery seems to me an utterly confused and confusing way of thinking about the technoscientific vicissitudes at hand. All tools and all techniques are texts: and the literary imagination has quite as much to say about their play as does the engineering imagination. All culture is prosthetic and all prostheses are culture: and hence all individuals and all cultures are exactly as "prostheticized" as every other. To press the point, the Luddites were champions of certain tools just as they were critics of others, masters of certain techniques just as they were suspicious of others. The same goes for those skeptics and contrarians who tend to get called "Luddites" today because they hesitate to be steamrolled into panglossian celebrations of anti-democratic technocratic and algorithmic governance or of hucksters peddling techno-utopian memes and kremes: it is important to grasp that Ray Kurzweil is no more cyborgic in Haraway's sense than is John Zerzan. Given their enthusiasm for their treasured tools and techniques, it seems to me frankly far more to the point to say that Luddites ARE Geeks, that luddic resistance and play is another facet of geek multiculture rather than its comic-book antagonist.
I think Geeks are less inclined to rule than to drool over their personal perfections. I know that I am. As I said in the conclusion of my piece, the crucial final point of which was another editorial casualty, I'm sorry to say: "Perhaps such a mainstream embrace of marginal enthusiasms can help America overcome the defensive anti-intellectual bearings of its ruling masculine culture. But clashes over sexist representations in science fiction, exposures of sexist assumptions and practices in science education and tech-sector hiring, as well as the sexist antics of the 'techbros' of venture capitalism demand that we treat such promises of change in a much more critical way than with the usual geek enthusiasm." That is to say, geekery remains for me an intellectualism fueled by diverse and dynamic enthusiasms for enthusiasms, but the democratizing promise of such geekery would require a criticality that I cannot yet identify with geekery as much as I would like to. Discussions such as the forum itself might go some distance toward introducing this indispensable dimension of criticism -- but I worry that too many of these critics reduced geekery to certain of its superficial symptoms or mis-identified it with larger social struggles, draining geekery of its specificity. As I said before, it's not an easy thing to say much of substance in three hundred words on short notice with editorial shears clipping away, so I doubt any of this reflects badly on my fellow participants, most of whom said more interesting things than I managed to do in the time and space on offer when it comes to it.
Monday, September 08, 2014
The Yes That Is The No
Behind the eyes of every positive person cold opportunism, every confident boast shrieking panic, every insistent optimist cruelest dark.
Saturday, September 06, 2014
Graduate Seminar on the Anti-Politics of Design
Here are the first few weeks of the syllabus for my graduate seminar at the San Francisco Art Institute, Designs On Us. The course began as an idea for a book that never went anywhere -- for whatever reason, I think elaborating arguments through course trajectories suits me more than doing so in longform books. Anyway, the syllabus goes on from here, but the textual assignments change quite a bit as the weeks go on, so there is no point posting the rest of it here and now. If you are interested, click the link to the course blog and you can read the assignments as they arrive. The assignments usually arrive at their final form no later than the Friday before the following session.
CS-500H-01 Designs On Us: The Politics and Anti-Politics of Design
Course Blog: http://designsonus.blogspot.com/
Dale Carrico: dcarrico@sfai.edu; ndaleca@gmail.com
Attendance/Participation, 10%; Precis, 10%; Designer Presentation, 10%; 10+ Comments, 10%; Symposium Presentation, 10%; Final Paper, 50%
The proposal that is the point of departure for our course is that design discourse is a site where at once politics is done and politics is disavowed. Design as a site of "designation" invokes the gesture of naming as mastery, of reduction as revelation, of problems as provocations to instrumental technique and not stakeholder struggle, a mentalité with its own paradoxical temporality, publicity, linearity, cognition. Design as a site of the "designer label" is an indulgence in fetishism, of the commodity-form, of an auratic posture, of a psychic compensation of lack and its threat. To elaborate and pressure these propositions, we will spend quite a bit of time in the critique of three design discourses in particular: one, Green design which would accomplish sustainability without history; two, social/p2p software design which would accomplish democracy without participation; and three, eugenic design which would accomplish life-enhancement without lifeway diversity. In your individual presentations I hope we will ramify our attentions to other design sites: comparative constitutions, fashion design, food styling, graphic design, industrial design, interior design, landscape design, "life coaching," and who knows what else?
Week One | August 27 -- Introductions
Added, for those who mentioned during the opening lecture that they might like a little more background on the fetishism of the Designer Label:
Marx on The Fetishism of Commodities and the Secret Thereof from Capital
Walter Benjamin, Art in the Age of Mechanical Reproducibility
Naomi Klein, Taking On the Brand Bullies from No Logo
Week Two | September 3 -- Biomimicry, Cradle to Cradle, Natural Capitalism
Martin Heidegger, The Question Concerning Technology
Dale Allen Pfeiffer, Eating Fossil Fuels
Janine Benyus, Echoing Nature
Biomimicry Institute, Velcro
William McDonough & Michael Braungart The NEXT Industrial Revolution
Cradle to Cradle -- Principles
Amory Lovins, Hunter Lovins, Paul Hawken, A Roadmap for Natural Capitalism
OpenPolitics Critiques of Paul Hawken and Natural Capitalism
Week Three | September 10 -- Permaculture and Viridian Design
Bruce Sterling, When Blobjects Rule the Earth
Bruce Sterling, Manifesto of January 3, 2000
Viridian Design Principles
Bruce Sterling, Last Viridian Note
Wes Jackson and Wendell Berry, A 50-Year Farm Bill
The Land Institute: (a) Issues (b) Solutions (c) Science
Navdanya: About Us
GEN: Global Eco-Village Network: Definitions
Permaculture Design Principles, Online Interactive Presentation
Sunday, August 31, 2014
Science Fiction Is Not Agitprop For Your "The Future"
Upgraded and adapted from the Moot, "JimF" snarks about an interview in the transhumanoid magazine humanity-plus -- so if you don't get it, you're obviously "humanity-minus" like me -- portentously (obviously) entitled, Transhumanist Science Fiction: The Most Important Genre the World Has Ever Seen? (An Interview with David Simpson). In this piece, "science fiction author, transhumanist, and award-winning English literature teacher" David Simpson talks about "his Post-Human series (which include the novels Sub-Human, Post-Human, Trans-Human, Human Plus, and Inhuman) [which] is centered on the topics and interests of transhumanists." We are told that "David is also currently working with producers to turn the Post-Human series into a major motion picture."
All this is of world-shattering importance because the hoary sfnal conceits predictably tumbling in these superlative fictions like socks in a dryer (reconceived, you will have noticed, as "topics and interests of transhumanists," that is to say reconceived as legitimate scientific/philosophical objects and political/policy stakes for legible constituencies -- neither of which they remotely are) are imagined here to function as educational, agitational, and organizational agitprop fueling a movement that will sweep the world and materially bring about "The Future" with which that movement identifies. In other words, the usual stuff and nonsense.
"JimF" notices a family resemblance of these earthshatttering "notions" with those already available in, for example, 2001: A Space Odyssey, Blade Runner and Alien (and I will speak of Star Trek in a moment), although he takes an ironic measure of reassurance in the fact that the attention spans of modern audiences would no doubt require even literal remakes of these classics to be embiggened and ennobled by the introduction of kung-fu and car chase sequences.
Anyway, "JimF" connects these dreary ruminations on (counter-)revolutionary futurological propaganda films with the recent hopes of Chris Edgette to Kickstart a film called I's about the usual futurological is that ain't, telling the story of the rather Biblical Workweek from the day a supercomputer that "wakes up" (how original! how provocative! you really gotta hand it to him) and then snowballs in days into the Rapture/Apocalypse of the Singularity. You know, rather like Left Behind for New Age pseudo-scientists.
Or, The Lawnmower Man -- AGAIN!
Rather like Randroids who seem to keep pinning their hopes on the next Atlas Shrugged movie sweeping the world and bringing the masses to muscular greedhead baby jeebus, so too the pale stale males of the Robot Cult really truly seem to keep thinking that the next iteration of The Lawnmower Man won't only not suck but will bring on the Singularity at last.
Anyhoozle, "JimF" is clearly on the same wavelength when he snarkily wonder whether futurological propaganda pedagogues and hopes of the world like David Simpson and Chris Edgette haven't had most of their thunder stolen by now what with the megaflop of Transcendence, the limp sexism of critical darling and popular meh Her, and the forgettable racist amusements of Lucy.
But if those recent sf retreads could steal transhumanoid singularitarian agitprop thunder, personally I can't for the life of me conceive why Star Trek hadn't already stolen their thunder irrevocably before they even started. Needless to say, uploading (in well over a dozen eps), sentient robots/computers, genetic (and ESPer) supermen, better-than-real virtualities, and techno-superabundance were all explored as sfnal conceits in Star Trek.
But also needless to say (sadly, no, this obviously needs saying), none of these sfnal conceits originated in Star Trek either; they were each citations, in a popular and popularizing sf series, of widely and readily available tropes.
Quite beyond the paradoxical figuration of the brain-dumbing, mind-numbing stasis of their endlessly regurgitated futurological catechism as some kind of register of "shock levels" and "accelerating change" (or even the "acceleration of acceleration"!) -- a paradox not unconnected with the skim-and-scam upward-fail con artistry of tech startups describing as "disruptions" their eager amplifications of the deregulatory looting and fraudulent financialization of the already catastrophically prevailing neoliberal status quo -- it really is extraordinary to grasp how superannuated the presumably shattering provocations of the Robot Cultists really turn out to be, the most ham-handed reiterations of the most stock sf characters and conceits imaginable.
Just as Star Trek explored current politics allegorically (notoriously sometimes somewhat clumsily) in many sfnal plots, so too its explorations of the sfnal archive were in my view reflections in the present on the impact of ongoing sociocultural forces (materialism, industrialism, computationalism) on abiding values and notions of identity and so on.
Like all great literature, sf at its best comments on the present, on present problems and possibilities, and the open (promising, threatening) futurity that inheres in the present. Of course, plenty of authors and readers may have said that their good sf was about "The Future," but this confused locution often obscures the ways in which their work actually engaged futurity in ways that exceeded authorial intentions and understanding. I will go so far as to say that no great sf has ever been about "The Future," predictive of "The Future," agitprop for "The Future." Of course, extrapolation is a technique in the sf toolkit (as in the satirist's and the fabulist's), but predictions and hypotheses and the rest never make for great or even good sf: To read sf as prophetic agitprop for parochialisms denominated "The Future" always reveals a crappy writer or a crappy reader.
Star Trek actually did and does still inspire mass movements -- but surely not all, or even more than a few, of its fans thought or think that their enthusiasm for the specificities of the show's characters or plots or furniture, or even for the secular scientific liberal multiculturalism of its values, somehow constitutes the kernel of a literal proto-federation that will bring its inventions into existence through the shared fervency of their fandom at conventions.
I often chide transhumanoids as pseudo-scientific scam artists peddling boner-pill and anti-aging-kreme scams amplified from late-nite infomercials into the faith-based initiatives of outright phony religions. But I also often chide them as consumer fandoms of the particularly crappy sf genres of the corporate press release and the futurological scenario.
As to the latter, it seems relevant to point out that the problem with the transhumanoids isn't that I think they have terrible taste in sf (anybody who gets off on Toffler, Kurzweil, and venture-capitalist spiels has execrable taste, whatever their other character flaws); it's that they are an sf fandom predicated on not even getting what sf is about at the most basic level.
Star Trek was not predicting or building a vision of "The Future," but exposing the futurity inhering in the diversity of beings in the present, reminding us of the wonder and promise and danger of that ever-open futurity, understanding that its audience corralled together a diversity out of whom also-open next-presents would be made. Like all true sf, like all true literature, Star Trek solicits our more capacious identification with the diversity of beings with whom we share the present world, the better to engage that diversity in the shaping of shared present worlds to come.
Champions of science -- who treat science as pseudo-scientific PR and faith-based techno-transcendentalism? Sf fandoms -- who don't even get that sf is literature? Is it any wonder these clueless careless dumbasses think they are the smartest guys in any room?
Thursday, August 28, 2014
Is George Lucas A Barbarian?
George Lucas:
People who alter or destroy works of art and our cultural heritage for profit or as an exercise of power are barbarians, and if the laws of the United States continue to condone this behavior, history will surely classify us as a barbaric society.
Lucas refuses to allow the National Film Registry to preserve the actual 1977 version of Star Wars, pretending that the version that had an impact none of his subsequent solo efforts ever did or ever could was unfinished. His endless larding of the films with crappy videogame CGI and infantile slapstick gags and leaden fanwanking exposition to render his whole bloated execrable saga consistent may indeed finish the film for good. I found the original film enjoyable -- and it actually mattered to me as a kid who watched it in a theater on my twelfth birthday on a screen the size of a football field. Of course, the prequels are literally unwatchably bad, and in consequence Return of the Jedi now seems mostly unwatchable as well, as forgivable missteps in that movie now seem like anticipations of the awfulness of the prequels and so have gotten retroactively implicated in their crimes (the camp resonance of a few moments -- like "It's a trap!" -- and, of course, the Emperor's scenery-chewing evil monologues alone save the movie for me), and at this point the bullying sexism in The Empire Strikes Back makes long stretches of the best film in the bunch nearly unwatchable for me too. Lucas can do what he wants with his movies, of course (the opening quote refers to the profitable colorization of classic films by those who did not have a hand in their making), but the original Star Wars was a cultural phenomenon. That really happened, and archivists and historians shouldn't have to contend with Lucas' bad taste and elephantine ego in their work of doing justice to that reality. Once released into the world, the world has its way with our work; the changing receptions of the work collaborate in the significance of which it is capable. All actually relevant and living works of art are unfinished in this way. Lucas' effort to control the circulation of his best work is of a piece with the amplifying awfulness of the rest of his work -- closing himself off from the world, he contributes less and less worth taking up by the world. The world will win this contest, and when Lucas vanishes it will be the archivists and historians and critics he disdains who will be the likeliest to save the trace of his part in the contest that might live in worlds to come.
Wednesday, August 27, 2014
Arturo Galster, R.I.P.
SF Bay Guardian
To call seminal SF performer and alpha theater aficionado Arturo Galster merely a "drag queen" is to do his range -- from the legendary Vegas in Space movie and pitch-perfect live-sung Patsy Cline interpretations to his recent technicolor turns with the Thrillpeddlers -- a disservice. But his name will always call to mind that moment in the late '80s and early '90s when SF's drag scene unmoored itself from polite old-school diva kabuki into a squall of gloriously punky, ironic camp.
Monday, August 25, 2014
Richard Jones Critiques Transhumanism
Richard Jones has been a sympathetic critic of superlative futurology for years, and his training and research make him the rare scientist who can engage transhumanists in the "technical debates" they cherish. Most who are qualified to indulge in these debates either don't take the transhumanists seriously enough to give them the time of day or are already True Believers whose science was acquired and is selectively filtered in the service of their futurological faith. Richard Jones (like Athena Andreadis and a handful of others) can marshal devastating scientific critiques of techno-transcendental pretensions, but crucially remains intrigued enough by the social and cultural dimensions of futurological discourses and subcultures to stay engaged with them. Jones has recently offered the beginnings (he promises that there is more to come, and that seems to me promising indeed) of such criticism in a piece contextualizing pseudo-scientific futurological extrapolations in an apocalyptic religiosity lending itself to technological determinism over at Soft Machines:
Transhumanists are surely futurists... And yet, their ideas, their motivations, do not come from nowhere. They have deep roots, perhaps surprising roots, and following those intellectual trails... we're led back, not to rationalism, but to a particular strand of religious apocalyptic thinking that's been a persistent feature of Western thought... Transhumanism is an ideology, a movement, or a belief system... The idea of transhumanism is associated with three predicted technological advances. The first is a vision of a radical nanotechnology as sketched by K. Eric Drexler, in which matter is effectively digitised... the route to the end of scarcity, and complete control over the material world. The second is a conviction -- most vocally expounded by Aubrey de Grey -- that it will shortly be possible to radically extend human lifespans, in effect eliminating ageing and death. The third is the belief that the exponential growth in computer power implied by Moore's law, to be continued and accelerated through the arrival of advanced nanotechnology, makes the arrival of super-human level artificial intelligence both inevitable and imminent. I am sceptical about all three claims on technical grounds... But here I want to focus, not on technology, but on cultural history. What is the origin of these ideas... The connection between singularitarian ideas and religious eschatology is brilliantly captured in the phrase... "Rapture of the Nerds" ... A thoughtful transhumanist might well ask, what is the problem if an idea has origins in religious thought? ... The problem is that mixed up with those good ideas were some very bad and pernicious ones, and people who are ignorant of the history of ideas are ill-equipped to distinguish good from bad. One particular vice of some religious patterns of thought that has slipped into transhumanism, for example, is wishful thinking... If you think that a technology for resurrecting dead people is within sight, we need to see the evidence. But we need to judge actually existing technologies rather than dubious extrapolations... This leads me to what I think is the most pernicious consequence of the apocalyptic and millennial origins of transhumanism, which is its association with technological determinism. The idea that history is destiny has proved to be an extremely bad one, and I don't think the idea that technology is destiny will necessarily work out that well either. I do believe in progress... But I don't think... [it] is inevitable. I don't think... progress... is irreversible, either, given the problems, like climate change and resource shortages... I think people who believe that further technological progress is inevitable actually make it less likely.
I do not doubt that many singularitarians and transhumanists will declare Jones' concluding verdict false, insist that they think positive futures are far from inevitable, and explain that the whole point of their membership organizations is to facilitate better outcomes. This is why they devote so much of their energy to existential risk discourse and coding friendly AI and so on.
Quite apart from the curious fact that so much of this "organized activity" amounts to titillating collective rituals in soft-porn techno-terror and techno-paradise navel-gazing, I daresay Jones would point out that the "concrete concerns" of superlative futurology with mind-uploads, desktop drexler boxes, superintelligent code, robot and clone armies, and various runaway goos provide the figurative furniture (in what sense are any of these concerns really "concrete" at all?) rendering more real, more necessary, more intuitive, more natural the deeper assumptions and aspirations and conceits fueling their futurological faith. Ultimately, what futurologists deem and need to preserve as "inevitable" is the gesture of a repudiation of the open futurity inhering in the diversity of stakeholders to the present through the projection of and identification with parochial incumbencies denominated The Future. The specificities of the techno-transcendental catechism, whatever they may be from futurist to futurist, proceed from there.
Monday, August 04, 2014
Spectacle From Marx to Debord to Big Data
First being degraded into having, then having degraded into appearing, and now appearing degraded into targeting....
We have arrived at the "targeting" phase of Spectacle. In the specifically digital-networked Spectacle since the turn of the millennium -- after which mass-mediation is no longer defined by broadcast and press publication -- what Debord called the Opium War of "enhanced survival" (his condensation of the Benjaminian War Machine in the Epilogue of "Art in the Age of Mechanical Reproducibility" with the Adornian "manufactured needs" of the Culture Industry chapter of Dialectic of Enlightenment) has given way to a micro-targeted marketing harassment promising to confer both legibility and individuation for consumer/partisan subjection, an operation absolutely continuous with at once the Big Data profiling framing every subject for eventual legal prosecution and the biometric profiling tagging every subject for ongoing medical experimentation (digital networked bioremediation by Big Pharma) and/or eventual effective targeting by drone (the drone is synecdochic for the range of collateral damaging demanded by disaster capitalism).
Quite relevant to this telling of the tale is Naomi Klein's latter-day elaboration of the Debordian account back in No Logo, in which an advertising practice originating in the false individuation of mass-produced consumer goods via the brands they bear eventuated in the global/digital moment in the false individuation of mass-consumers via the brands they buy. As in Debord, the degrading of already degraded having into "appearing" seduces spectator-subjects through something a bit like Althusserian interpellation, offering up social legibility, usually by means of subcultural signaling of identifications and dis-identifications, through the citation -- via conspicuous consumption -- of already-available scripts and stage-settings (the grownup living room on the glossy cover of a furniture catalog, the rebellion of a concert t-shirt, the romance of over-expensive coffee, the reassuring daydreams of futurological projections and displacements).
There is a threat inhering in the Althusserian hail -- yes, a threat to rather than a resource for hegemonic management -- should just enough hails ring out (hey, you, hey, You, hey, YOU!), the subject turning and turning and turning to meet the would-be authority might be left more dizzy than docile -- might even make the reflective turn of thought-made-act to which Arendt looked for a last miraculous hope of redemption from tyranny. But is this threat recontained (or rendered more efficient, in case the threat was never more than delusive anyway) in the targeted hail of the networked-data profile? Can we resist the authoritarian hail of the profile that authors you for you? Or does the Big Data Hail -- scanning the iris and the gait and the buying history and the message trail and the credit rating at once -- collapse the indetermination of multiple readings, depriving you -- in a privation yielding a last vestige of privacy -- of the singular selfhood that becomes the target, knowing more about you than you do, aggregating you into the heavy hand of the Spectacle, a knowing so authoritative that its transferential brute force alone might re-write you in the image of the profitably congenial profile before you know enough to know it?
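For readers who want the mechanism of that targeted hail spelled out in the machine's own idiom, here is a minimal, purely illustrative Python sketch of the kind of signal aggregation the Big Data Hail names. Every field, weight, and threshold below is hypothetical -- a caricature of the profiling pipelines gestured at above, not any actual vendor's system or API.

# A deliberately crude sketch of profile aggregation and targeting.
# All signals, weights, and thresholds are invented for illustration.
from dataclasses import dataclass

@dataclass
class Profile:
    iris_match: float         # biometric confidence, 0.0-1.0 (hypothetical)
    gait_match: float         # biometric confidence, 0.0-1.0 (hypothetical)
    purchase_affinity: float  # inferred from buying history, 0.0-1.0 (hypothetical)
    message_sentiment: float  # mined from the message trail, -1.0 to 1.0 (hypothetical)
    credit_score: int         # conventional credit rating, 300-850

def targeting_score(p: Profile) -> float:
    """Collapse many partial readings of a person into a single actionable number."""
    normalized_credit = (p.credit_score - 300) / 550      # map 300-850 onto 0-1
    normalized_sentiment = (p.message_sentiment + 1) / 2  # map -1..1 onto 0-1
    # The weights are arbitrary: the point is only that heterogeneous signals
    # get flattened into one score that decides who gets "hailed."
    return (0.30 * p.iris_match
            + 0.20 * p.gait_match
            + 0.25 * p.purchase_affinity
            + 0.15 * normalized_sentiment
            + 0.10 * normalized_credit)

def hail(profiles: dict[str, Profile], threshold: float = 0.7) -> list[str]:
    """Return the subjects singled out -- targeted -- by their aggregated profiles."""
    return [name for name, p in profiles.items() if targeting_score(p) >= threshold]

if __name__ == "__main__":
    subjects = {
        "subject_a": Profile(0.9, 0.8, 0.7, 0.2, 720),
        "subject_b": Profile(0.4, 0.5, 0.3, -0.6, 580),
    }
    print(hail(subjects))  # only the high-scoring profile is targeted

The caricature's only point is that the "knowing" in question is a weighted collapse of heterogeneous readings into a single actionable number; the indetermination mourned above is precisely what the weighting throws away.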
Wednesday, July 16, 2014
We're Another Step Closer to Growing the Post Office Into Low-Cost Banking
David Dayen writes in Salon today about the improving prospects for a great idea I've written about wistfully here from time to time:
Darrell Issa, chairman of the House Oversight Committee which oversees the Postal Service, previously called postal banking “unacceptable” and a “massive expansion” of government power. But now, the Senate Homeland Security and Government Affairs Committee... finally held hearings for four nominees to the Postal Service Board of Governors... For most of the Obama Administration through to today, Republicans held a majority on this board thanks to multiple vacancies. But confirming these nominees would equalize the representation at four Democrats and four Republicans (the President still needs to nominate a replacement for an additional vacant seat, which would give Democrats the majority). This would put the pieces in place that could make postal banking a reality... Democratic nominee Vicki Kennedy -- Ted’s widow -- did say... “I think it also important to look at the possibility of expanding into related business lines,” and that the post office needed the “regulatory flexibility to take advantage of opportunity and innovate when it is in the public interest.” Postal banking serves that capacity... 1 in 4 American households with little or no access to financial services need a convenient, cheap banking option, so they don’t continue to get gouged by... payday lenders and check-cashing stores. Another Democratic nominee, Stephen Crawford, cited the Inspector General report on postal banking directly during questioning. “We see a lot of foreign postal services make some money on that,” Crawford correctly pointed out... “If I were on the board, that’s an area I would give special attention to.” ... Even more momentum comes today from a full-day conference in Washington on postal banking... Speakers include Sen. Elizabeth Warren, Issa and Postal Service Inspector General David Williams... The unbanked favored the post office making available prepaid debit cards by 38%-9%... And a large majority said they would be likely to use lower-cost versions of these services: 81% would go to the post office to cash checks, 79% to pay bills and 71% as an alternative to payday loans... “There’s a lot of interest if they can use postal services at a lower price point,” said Alex Horowitz, a researcher at the Pew Charitable Trusts... [B]y virtue of its universal service mandate, it has a network of 35,000 locations in every corner of the country. And where banks have not made the effort, the post office has significantly more reach, particularly in rural America... 3.5 million Americans live more than 10 miles from the nearest bank branch, and another 3 million live in densely populated areas that are nonetheless still over a mile from the nearest bank... Even in some urban locations, particularly in high-poverty areas, the closest postal branch location offers more convenience than the bank. And with bank closings more pronounced in low-income areas, the value of post offices as a financial services alternative could grow... Check-cashing stores and payday lenders... are ubiquitous in poor communities... But... low cost and convenience could give postal banking a leg up... Public outcry, largely from postal unions and their allies, has led to the Postal Service ending their pilot program of post office counters inside Staples, staffed by non-union workers at lower pay. [Grrrrrr! --d] The announcement came... after the American Federation of Teachers... voted to boycott Staples in solidarity with postal workers... [Yay, Unions! 
--d] Believers in postal banking have some high-profile support and some key facts. Now they need to organize and act... [T]here's no reason the United States cannot respond to technological changes in mail volume by returning to offering financial services, which aligns with its core mission of promoting commerce. It certainly beats closing more distribution centers, firing more workers and squandering a vast network of physical and human capital that can serve some of the nation's critical needs.
Republicans are endlessly attacking the post office and postal workers (as they attack every aspect of government that does conspicuous public good and maintains a good public reputation), and this is an idea which would make the beleaguered postal service more solvent and hence more insulated from these ideological anti-civilizational attacks. And it would do so while at once transforming the landscape of financial services for the working poor, both urban and rural, as well as providing a contrast of fair fees and good service that might begin to pressure the venal con-artists of big banking into better practices in these areas themselves. This is a good idea and its time has come. Of course, even a Democratic majority on the Board is little likely to provide more than a pilot program, and the changes will likely take years, but the exploitation of people who work for a living by the payday lenders and cash card sharks is a problem years in the making, too, and a couple more Democratic terms in the White House can shepherd these processes into implementation while the Republicans howl and do their usual worst.
Tuesday, July 08, 2014
Disrupt, For Real: Richard Eskow Makes A Case for Nationalizing Big Tech
Salon:
[L]aw professor Susan Crawford argues that “high-speed wired Internet access is as basic to innovation, economic growth, social communication, and the country’s competitiveness as electricity was a century ago.” Broadband as a public utility? If not for corporate corruption of our political process, that would seem like an obvious solution. Instead, our nation’s wireless access is the slowest and costliest in the world. But why stop there? Policymakers have traditionally considered three elements when evaluating the need for a public utility: production, transmission, and distribution. Broadband is transmission. What about production and distribution? The Big Tech mega-corporations... were created with publicly-funded technologies, and prospered as the result of indulgent policies and lax oversight. They’ve achieved monopoly or near-monopoly status, are spying on us to an extent that’s unprecedented in human history, and have the potential to alter each and every one of our economic, political, social and cultural transactions... No matter how they spin it, these corporations were not created in garages or by inventive entrepreneurs. The core technology behind them is the Internet, a publicly-funded platform for which they pay no users’ fee. In fact, they do everything they can to avoid paying their taxes. Big Tech... operates in a technological “commons” which they are using solely for its own gain, without regard for the public interest. Meanwhile the United States government devotes considerable taxpayer resource to protecting them... Big Tech’s services have become a necessity in modern society. Businesses would be unable to participate in modern society without access... For individuals, these entities have become the public square... The bluntness with which Big Tech firms abuse their monopoly power is striking. Google has said that it will soon begin blocking YouTube videos... unless independent record labels sign deals with it... Amazon’s war on publishers... is another sign of Big Tech arrogance. But what is equally striking about these moves is the corporations’ disregard for basic customer service... Google is confident that even frustrated music fans have nowhere to go. Amazon is so confident of its dominance that it retaliated against Hachette by removing order buttons... and lied about the availability of Hachette books when a customer attempts to order one... Internet companies are using taxpayer-funded technology to make billions of dollars from the taxpayers -- without paying a licensing fee... Amazon was the beneficiary of tax exemptions which allowed it to reach its current monopolistic size. Google and the other technology companies have also benefited from tax policies and other forms of government indulgence. Contrary to popular misconception, Big Tech corporations aren’t solely the products of ingenuity and grit. Each has received, and continues to receive, a lot of government largesse... Most of Big Tech’s revenues come from the use of our personal information... Social media entries, web-surfing patterns, purchases, even our private and personal communications add value to these corporations. They don’t make money by selling us a product. We are the product, and we are sold to third parties for profit. Public utilities are often created when the resource being consumed isn’t a “commodity” in the traditional sense. “We” aren’t an ordinary resource. Like air and water, the value of our information is something that should be... at a minimum, publicly managed....
Privacy, like water or energy, is a public resource. As the Snowden revelations have taught us, all such resources are at constant risk of government abuse. The Supreme Court just banned warrantless searches of smartphones -- by law enforcement. Will we be granted similar protections from Big Tech corporations? ... Google tracks your activity and customizes search results, a process which can filter or distort your perception of the world around you. What’s more, this "personalized search results" feature leads you back to information sources you’ve used before... Over time this creates an increasingly narrow view of the world... Google has photographically mapped the entire world. It intends to put the world’s books into a privately-owned online library. It's launching balloons around the globe which will bring Internet access to remote areas -- on its terms... [T]hings are likely to get worse -- perhaps a lot worse -- unless something is done. The solution may lie with an old concept. It may be time to declare Big Tech a public utility.
I think these are strong arguments that should have a prominent place in public debates about Big Tech. For the naysayers out there, there are comparatively recent precedents for their application in related fields. Even if you judge the practical prospects for such policy outcomes unlikely, these arguments re-frame a host of "technology" issues in what seem to me incomparably more clarifying ways than the usual terms provide at present. And even our failure to nationalize equitable access to bandwidth, search and socializing tools as public utilities accountably administered for the common good need not be deemed a complete failure even on the terms of these arguments themselves, if instead they manage only to scare the shit out of enough greedy short-sighted self-congratulatory techbro skim-and-scam artists to make them actually behave themselves.