Using Technology to Deepen Democracy, Using Democracy to Ensure Technology Benefits Us All

Tuesday, September 30, 2014

Today's Random Wilde

It is not logic that makes men reasonable, nor the science of ethics that makes men good.

Monday, September 29, 2014

Fear Itself

I'm an American terrified by how readily Americans are terrorized. Polities, like people, can be driven either by love or by fear. Fear starves commonsense and commonwealth to feed the War Machine. Love, in turn, isn't for objects, love is for objections. King's Beloved Community, liberalism's most revolutionary self-image, for example, is not terminally reconciled but interminably reconciling. And in its ongoing reconciliation of human hopes and histories the Beloved Community reveals that its indispensable devotion is to politics. Politics is premised on living selves changing in intercourse with different others, rather than the death-dealing defense of dead, closed, completed selves. That kind of moralizing, colonizing, controlling project is an anti-politics, not a politics, finally, for which differences are always threats and never promises, but it can drive polities just as it can drive people. For a time. Again, love against fear. And for the love of the open futurity invigorated by politics, FDR's admonition really is the true one: the only thing we have to fear is fear itself.

Sunday, September 28, 2014

Meta/Micro

There is no question that my microblogging practice has reshaped my blogging here -- my writerly habits and commitments have changed. Of course, teaching is no less demanding than it ever was: For most of the first decade of the blog this often meant dry writing spells while I was crafting lectures or grading papers, while now it means instead days in a row in which a Wilde quote or a slightly enriched tweet is all I have the time (or inclination) to post. Superficial contact yields an ongoingness that keeps the blog livelier than those protracted inactive gulfs did, I guess, but contentedness with these light touches makes it too easy for me to put a blog on the shelf spiritually. I also do know of course that more pithy bloggers like Atrios are perfectly capable of doing critical, provocative, engaging work that manages in the aggregate to express a real voice in posts that cleave close to tweet-craft. Despite all that, I do think Amor Mundi has lost something lately.

Saturday, September 27, 2014

On the Road to the Sexist Singularity

My partner Eric's AI-God embryo spellchecker just insisted "servicemembers" should be "servicemen."

Friday, September 26, 2014

Thursday, September 25, 2014

Marketing Militarism

Rebranding crime as terrorism sure is a great way to sell more militarism.

"We're Not Monsters, We’re Just Ignorant Bigoted Greedy Short-Sighted Assholes Who Don't Think Everybody Deserves Healthcare, Who Steal Personal Credit and Profit For Collective Accomplishments, and Who Can't Win Elections Unless We Cheat," Declares New GOP Ad

The ad is premised on the aptly impoverished conceit that consumerism is enough to constitute political solidarity (actually nobody is surprised to hear that people in both political parties buy commodities from ubiquitous brands).

And needless to say, the proportion of people of color in this ad flabbergastingly fails to represent the proportion of people of color who identify with the Republican party in actual reality.


I'm posting the clip simply because I appreciate the real desperation it reveals, and because it pays to remember that if people of color and single women actually voted in higher than historical numbers this November Democrats could not only remain safely in charge of the Senate (and hence protect the courts from forced-pregnancy zealots) but actually regain the House of Representatives (and hence set the stage for comprehensive immigration reform, commonsense gun safety legislation, and a jobs bill to address the ongoing unemployment crisis and invest in renewable energy and transportation infrastructure). Hell, if we were actually accomplishing such legacy-making good things at home, maybe our "Reluctant Warrior" would be too distracted to find yet another (six and counting, if you're keeping track) country to bomb at billions a month of the dollars we somehow don't have enough of to fund effective stimulative food security programs for our own suffering citizens.

So, yeah, here it is. For the record, I never doubted Republicans are people. I grew up surrounded by Republicans. To be a bad person, you have to be a person first. I guess there's some hope in that.

Wednesday, September 24, 2014

Tuesday, September 23, 2014

Teaching Day

Benjamin and Adorno in my undergraduate critical theory survey course this afternoon. Blogging low to no. Wonder whether, if we just started calling healthcare, nutritional support, and education "bombing," we would discover we actually have the money for that too.

Saturday, September 20, 2014

Thesis, Prosthesis

Given the insistent distinction of evolution from prosthesis, it is intriguing to notice that every culture is exactly equally evolved and exactly equally prostheticized, and that to say either is to describe more or less the same thing.

Friday, September 19, 2014

Geek Rule Is Weak Gruel: Why It Matters That Luddites Are Geeks

I received an invitation Monday night to contribute a mini Op-Ed for the New York Times' "Room for Debate" forum, responding to the question, "What does it mean when geek culture becomes mainstream?" Seven contributions appeared last night, including mine. They all contain worthy insights, and my own piece isn't my favorite among them, though mine is the only one I agree with (given the passage of time, that isn't a given). The headline for the forum now reads When Geeks Rule. The introduction of the theme of "Rule" actually changes my sense of the question, but I will set that aside for a moment. I assume that all the resulting essaylets were shaped by the same forces impinging on mine -- the rapidly looming horizon of three hundred words, a no less rapidly looming deadline, and an editorial scalpel wielded by someone with a "readability" agenda, angels and ministers of grace defend us. Who knows the subtleties, qualifications, questions left on the cutting room floor for my fellow participants?

I must say that I bristled a bit at the premise of the question itself. In every version of my contribution to the debate I included this first sentence: "There has never been a monolithic geek culture and geeks have never exhibited a singular profile, and so one important part of what happens as geek-identification becomes more 'mainstream' is that its diversity becomes more visible." My piece went several rounds with my firm but friendly editor, and over and over that opening sentence was excised -- and over and over I kept replacing it. It did not appear in the final version, by which time I had more or less given up. I decided to take the retention of my Vulcan quotation and the derisive gesture at "techbros" as hard won personal victories and move on. Clearly, my editor thought this initial observation about the irreducibility of geekdom to any one fetishized symptom or signifier redundant in an essay treating geek enthusiasms as a motor of diversification more generally (which is the emphasis of the essaylet as it stands, I'd say).

I still do wish that initial frame for the essaylet remained. For me, the essence of geekery is about enthusiasm, it is an appreciation of appreciation. Wil Wheaton famously captured the spirit of what I am talking about in a blog post and video that was deliriously circulated a few years back -- I assume its popularity signifies that it articulated unusually well something many geeks felt about themselves already -- saying of geek gatherings that in them one is "surrounded by people who love the same things you love, the way you love them. But... also... by people who love things you don’t even know about, but you love your respective things in the same way, so you get to love your thing enthusiastically, completely, unironically, without fear of judgement." Geekery, then, is a celebration of the pleasures and knowledges that uniquely derive from close, sustained attention to subjects. And geekery is indifferent to whether or not the subjects being attended to are conventionally appreciated, hence it becomes a site of diversity the "mainstreaming" or ramification of which might facilitate diversity more generally. That is my essaylet's wee conceit, such as it is.

One of the few contributions that seemed to me to be on that very geeky wavelength was Zeynep Tufekci's essay, which insisted that joy is at the heart of geekery, non-judgmental joy, creativity, expressivity, experimentation, and the rough-and-tumble of making stuff. I was less pleased with the way her piece seemed to be assimilating this making to what I regard as the mostly self-congratulatory self-promotional BS of an entrepreneurial tech-sector start-up culture, and the rather libertechbrotarian flavor-of-the-month "maker culture" and "Maker Faires" -- less about making than about being on the make, it seems to me, less about DIY than about privileged disavowals of interdependence, self-declared corporate-sponsored doers doing their dreary disrupting. A few years ago perhaps her go-to Maker vocabulary would have buzzed on about "Smart" blah-de-blah, a few years before that "Bright" this-n-that, a few years before that "Extreme," before that "Virtual," and so on (I'm sure I've missed a meme here or there, but to know one of them is really to know them all). I cannot say that I regard the for-profit tech-sector as a site of conspicuous creativity and ingenuity so much as a vast skim-scam operation repackaging stale useless crap to bamboozle rubes impressed by PowerPoint presentations and buzzy ad-copy, or appropriating the ideas and sweat of quiet coders collaborating amongst themselves without much fanfare and, usually, without much reward. Of course, those obscure coders are almost certainly geeks and they are indeed making stuff, but for me a teenage girl using a toothpick to paint a panel on the surface of a model of Kubrick's spacecraft Discovery or a fifty year old sewing the bodice of an Arwen Evenstar gown for his outfit on the last night of a science fiction convention are more illustratively enthusiastic makers invigorating geekery with its and our abiding joy -- even though nobody on hand expects to cash out that creativity in some big score.

Zaheer Ali penned what is probably my favorite of all the contributions, precisely because he exposed the diversity and dynamism of geekdom always-already against the grain of what seemed to me otherwise to be a series of rather sadly reductive mis-identifications of geekery with white techbros dreaming their dumb deadly VC dreams in the SillyCon Valley. Not only did Ali remind the Times readership of the thriving afro-futural musical and literary lineages (no Samuel Delany or Janelle Monae tho!) which are the site of so much of the aliveness of my own life-long geekery -- not least because for me the afro-futural has also been the indispensable site of so much vital subversive queergeekery -- but he also pointed to Melissa Harris-Perry and her #nerdland. It isn't only because I'm such a nerdland fan that I was happy to see Ali insist on the example, but also because it again went against the reductive grain of so many of the contributions otherwise: Harris-Perry's excellent show is such a geek-fest because it is an academic space filled with earnest activism -- folks who have been shaped by what Foucault described as the "grey, meticulous and patiently documentary" work of research and who retain the fierce pleasure of discovery and relevance of lives devoted to these intense attentions.

When, to the contrary, Judith Donath defines geekdom in her piece through its "affinity to math and science," I wonder if all the science fiction geek shippers who don't know much science beyond what they learned from their Cosmos blu-rays and all the literary historian geeks smelling of a day in the archives but who flunked math are utterly invisible to her in all their geek glory? Donath writes that "Geeky fascination with what is undiscovered drives scientific curiosity and invention, but unmoored from the desire to learn and create, it's simply consumerism." I actually do not agree that all cultural reception and appropriation is "simply consumerism." But even so, her larger point that "obsessive video game playing" will not help solve "climate change, emerging diseases, vast inequality" is certainly true. But why exactly would one expect otherwise? I am not sure that playing video games is enough to make one a geek, any more than watching movies is, and I would like to hear much more about the work that is getting done in Donath's attribution of "obsessiveness" to such gamers, but neither can I accept Donath's identification of geekery with the rational, scientific thought that is indispensable (together with free and accountable democratic governance in the service of general welfare) to the solution of such problems. I think rationality and science are both much bigger and much older than geekery, and that geek enthusiasms are quite valuable in human lives even when they do not contribute to the also indispensably valuable problem-solving work of science, engineering, public investment and harm-reduction policy-making. So, too, reactionary unsustainable consumerism is obviously a bigger problem than is its capacity to colonize the geek imagination -- it isn't the fault of geeks that under capitalism all that is solid melts into air. As I tried at least to insinuate in my piece, geekery could use a bit more criticism along with its enthusiasm as an intellectual mode, but too strict an identification of geekdom with progressive science and policy or with reactionary consumerism seems to me to lose track of what geekdom actually is.

Just to be clear, it's not that I think geekery lacks the "affinity to math and science" Donath mentions and which so many geeks so conspicuously exhibit, any more than I deny the connection of geekery to "coding" that Kimberly Bryant and William Powers emphasize in their respective pieces. I simply want to refuse any reduction of geekdom to math nerds or software coders or what have you, whether the reduction is made in a spirit of sympathy or hostility to those geeks who happen to be math nerds or software coders. Again, as I said in my excised opening framing, "There has never been a monolithic geek culture and geeks have never exhibited a singular profile," and this really matters to me. To say that geeks are becoming mainstream because so many people use the term "app" in everyday conversation seems to me as obfuscatory as imagining that geeks became mainstream when folks started to talk about going to "the talkies." I happen to think the present ubiquity of handhelds no more indicates a rise in "tech savviness" than did the ubiquity of office typewriters in the seventies. Am I really supposed to think people stumbling around in the street staring at images of plates of food their friends have eaten immerses me in a more geeky world in some way? Why is that a world more geeky than one in which people are adjusting the dials of their radios or adjusting the rabbit ears on their tee vees or dipping their quills into inkwells? To put the point more urgently still, why on earth would the New York Times treat the ratings of the relentlessly predictable, unwatchably execrable "Big Bang Theory" as relevant to the subject at hand in any deep sense? Network execs assimilating social change into nonthreatening nonrepresentative cheez whiz isn't exactly anything new or interesting -- in fact it would take a whole hell of a lot of culture studies geeks in an academic conference to convince me there was anything more interesting to say about the "Big Bang Theory" than about "Full House" and "Two And A Half Men" (the proper generic assignment for that bleak business).

When I submitted my contribution for its first editorial bloodletting, I received the rather exasperated suggestion that perhaps I might want to jettison the piece and write instead about why I am a Luddite, refusing to live on the terms of this emerging geek mainstream. I had the sinking suspicion at that point that I had been invited to participate in this forum because of my contrarian anti-futurological critiques. While it is true that the techno-transcendental futurist discourses and formations I write about are indeed geek subcultures, they are far from representative of geekdom in my experience of it, and certainly the farthest thing from definitive of it. I critique futurological pseudo-science as a champion of consensus science, I critique futurological corporate-militarism as a champion of accountable evidentiary harm-reduction policy, I critique parochial plutocratic market futures as a champion of free futures, I critique futurist consumer fandoms as a fan myself of literary sf: in other words, my critiques are those of a lifelong queergeek defending a capacious geekdom on which I have depended for my flourishing, and sometimes for my sanity, all my life.

Contra Fredrik deBoer I believe both that geek enthusiasms are still happening at the margins, and that geek enthusiasms are still marginalizing folks. I don't agree that venture capitalists and sexist gamers are representative of geekery (although I am far from denying and I am full of decrying their terrible geek ways). Certainly I would never pretend that geekdom is somehow insulated from the white racist patriarchal extractive industrial corporate-militarist American society which incubates and marinates geekdom -- my piece concludes with observations and warnings very much to the contrary. But, again, I think that if one wants to get to the heart of geekdom, the better to understand the changes it might enable as it ramifies through digital networked media formations, it is important to get at actually representative and symptomatic figures. I don't deny that Bill Gates exhibits geek traits, but I do deny that there is anything characteristically geeky about the traits in Gates that make him rich, powerful, and famous in America.

Titanic "Geeks Rule" archetypes like Gates, Jobs, Wozniak attended or hung out in geek communities connected to great public universities. Many of their marketable notions were cribbed from the knowledge and efforts of geek researchers and geek enthusiasts who have vanished from history. The reframing of these figures as randroidal sooper-genius fountainheads of entrepreneurial innovation and beneficience disavowing their utter dependence on climates of intellectual discovery (and usually scads of military investment) is a plutocratic commonplace. These hushed up circumstances attending the birth of our ruling tech enterprises (at least Berners-Lee doesn't disavow the gestation of the Web itself in CERN) in such skim-scam operations is re-enacted endlessly in the life of such enterprises, as the ideas and wage-labor of cohorts of coders under contract or scooped up from the bubbling crap cauldron of the start-up lottery, arrive in their full flower in the ritual spectacles of hyper-individualized celebrity CEOs bringing out new (usually just repackaged) gewgaws gobbled up by technoscientifically illiterate pop-tech journalists who confuse gossip and informercial pieties for substance. All along this terrorizing trajectory from creative geek collaboration to eventual plutocratic profitability there are endless occasions for folks with ever more tenuous connections to actual efforts and ideas either to take credit or otherwise sell out their fellows in the hope of some piece of pie down the road. If I may say so, there is nothing particularly geeky about this all too conventionally American ugliness. I am the first to critique the deceptions of plutocratic entreprenurial capitalism and the devastations of soulless unsustainable consumerism -- but I simply think it is a mistake to reduce geekdom to either of these phenomena or to treat geekdom as a particularly illuminating window on their deadly play in the world.

I'll conclude with a word on the Luddites. I do not know which is worse from my perspective: that I was expected by the Times to be a Luddite and not a geek, or that the Times would seem to accept the facile characterization of the Luddites as "anti-technology" in a sense opposed to a no less facile characterization of "pro-technology" geeks. As I never tire of saying, there is no such thing as technology-in-general about which it makes any kind of sense to be either loosely "pro" or "con." The constellation of tools and techniques contains too many differences that make a difference for one to assume a sympathetic or antipathetic vantage on the whole, and it will be the assumptions and aspirations in the service of which these tools and techniques will be put that yield our sensible judgments of right and wrong in any case. To speak, as too many of the contributors to the forum did, I am afraid, of "technology" becoming more prevalent, guiding, approved of in society in some general way in connection to the mainstreaming of geekery seems to me to be an utterly confused and confusing way of thinking about the technoscientific vicissitudes at hand. All tools and all techniques are texts: and the literary imagination has quite as much to say about their play as does the engineering imagination. All culture is prosthetic and all prostheses are culture: and hence all individuals and all cultures are exactly as "prostheticized" as every other. To press the point, the Luddites were champions of certain tools just as they were critics of others, the Luddites were masters of certain techniques just as they were suspicious of others. As with those skeptics and contrarians who tend to get called "Luddites" today because they hesitate to be steamrolled into panglossian celebrations of anti-democratic technocratic and algorithmic governance or of hucksters peddling techno-utopian memes and kremes, it is important to grasp that Ray Kurzweil is no more cyborgic in Haraway's sense than is John Zerzan. Given their enthusiasm about their treasured tools and techniques it seems to me frankly far more in point to say that Luddites ARE Geeks, that luddic resistance and play is another facet of geek multiculture rather than its comic-book antagonist.

I think Geeks are less inclined to rule at all than to drool over their personal perfections. I know that I am. As I said in the conclusion of my piece, the crucial final point of which was another editorial casualty, I'm sorry to say: "Perhaps such a mainstream embrace of marginal enthusiasms can help America overcome the defensive anti-intellectual bearings of its ruling masculine culture. But clashes over sexist representations in science fiction, exposures of sexist assumptions and practices in science education and tech-sector hiring, as well as the sexist antics of the 'techbros' of venture capitalism demand that we treat such promises of change in a much more critical way than with the usual geek enthusiasm." That is to say, geekery remains for me an intellectualism fueled by diverse and dynamic enthusiasms for enthusiasms, but the democratizing promise of such geekery would require a criticality that I would not yet identify with geekery as much as I would like to. Discussions such as the forum itself might go some distance to introduce this indispensable dimension of criticism -- but I worry that too many of these critics reduced geekery to certain of its superficial symptoms or mis-identified it with larger social struggles, draining geekery of its specificity. As I said before, it's not an easy thing to say much of substance in three hundred words on short notice with editorial shears clipping away, so I doubt any of this reflects badly on my fellow participants, most of whom said more interesting things than I managed to do in the time and space on offer, when it comes to it.

We've traced the call...

...it's coming from inside your brain.

Thursday, September 18, 2014

Oh, It's Up Already

Here's a link. The Vulcan reference survived the final edit. I'll call that a victory.

Give a Squat

Before you romanticize squatting and other forms of informal insecure settlement you should realize that it often amounts to lethally dangerous crowdsourced real estate development for eventual plutocratic profit.

All Culture Is Prosthetic, All Prostheses Are Culture

Every tool and every technique is a text.

Room for Debate

I contributed a short piece to a discussion about the "mainstreaming of geek culture" for the New York Times' "Room for Debate" recurring feature. It will probably appear online later today or tomorrow morning. The piece is very short and was written on very short notice -- and despite both of these facts it was still edited to within an inch of its life, a process involving more steps than I could have expected. I'm still surprised to have been asked to contribute such a piece in the first place, and I will have more to say about both the process as well as about what the piece ended up saying and not saying in another post.

Monday, September 15, 2014

Richard Jones: No Uploads For You!

Anti-nanocornucopian Richard Jones offers up a fine technical debunking of techno-immortalizing "uploads" in his latest Soft Machines post, Your Mind Will Not Be Uploaded:
I start by asking whether or when it will be possible to map out... all the connections between [the brain's] 100 billion or so neurons. We’ll probably be able to achieve this mapping in the coming decades, but only for a dead and sectioned brain; the challenges for mapping out a living brain at sub-micron scales look very hard. Then we’ll ask some fundamental questions about what it means to simulate a brain. Simulating brains at the levels of neurons and synapses requires the input of phenomenological equations, whose parameters vary across the components of the brain and change with time, and are inaccessible to in-vivo experiment. Unlike artificial computers, there is no clean digital abstraction layer in the brain; given the biological history of nervous systems as evolved, rather than designed, systems, there’s no reason to expect one. The fundamental unit of biological information processing is the molecule, rather than any higher level structure like a neuron or a synapse; molecular level information processing evolved very early in the history of life. Living organisms sense their environment, they react to what they are sensing by changing the way they behave, and if they are able to, by changing the environment too. This kind of information processing, unsurprisingly, remains central to all organisms, humans included, and this means that a true simulation of the brain would need to be carried out at the molecular scale, rather than the cellular scale. The scale of the necessary simulation is out of reach of any currently foreseeable advance in computing power.
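Just to make vivid the scale gap Jones is gesturing at, here is a back-of-envelope sketch of my own -- round numbers pulled out of the air for illustration (the hundred billion neurons is Jones's figure; the synapse, molecule, and state-variable counts are merely assumed), nothing taken from his post:

# Back-of-envelope comparison of the state you would have to track for a
# neuron/synapse-level simulation versus a molecule-level one. All numbers
# below except the neuron count are illustrative assumptions, not measurements.

NEURONS = 1e11              # ~100 billion neurons (Jones's figure)
SYNAPSES_PER_NEURON = 1e4   # assumed round number
STATE_PER_SYNAPSE = 10      # assumed: a handful of phenomenological parameters each
MOLECULES_PER_NEURON = 1e9  # assumed round number for molecules doing signalling work
STATE_PER_MOLECULE = 10     # assumed

cellular_state = NEURONS * SYNAPSES_PER_NEURON * STATE_PER_SYNAPSE
molecular_state = NEURONS * MOLECULES_PER_NEURON * STATE_PER_MOLECULE

print(f"cellular-level state variables:  {cellular_state:.0e}")   # ~1e+16
print(f"molecular-level state variables: {molecular_state:.0e}")  # ~1e+21
print(f"that is roughly {molecular_state / cellular_state:.0e} times more to track")

Even granting the made-up numbers, the only point is that shifting the unit of simulation from the synapse to the molecule multiplies the problem by orders of magnitude before you have even begun to worry about mapping a living brain.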
As I said, Jones is offering up a mostly "technical" debunking of the kind that enthusiasts for techno-transcendental conceits decry the lack of in my own critiques. But Jones is far from denying that these technical discussions are embedded in rhetorical frames, narratives, metaphorizations, conceptual problematics from which they derive much of their apparent intelligibility and force even if such discursive operations are not his focus.

You will notice, for example, that even in his brief summary above the notion of "simulating brains" is figuring prominently. About this he declares earlier on:
I want to consider two questions about mind uploading, from my perspective as a scientist. I’m going to use as an operational definition of “uploading a mind” the requirement that we can carry out a computer simulation of the activity of the brain in question that is indistinguishable in its outputs from the brain itself. For this, we would need to be able to determine the state of an individual’s brain to sufficient accuracy that it would be possible to run a simulation that accurately predicted the future behaviour of that individual and would convince an external observer that it faithfully captured the individual’s identity. I’m entirely aware that this operational definition already glosses over some deep conceptual questions, but it’s a good concrete starting point. My first question is whether it will be possible to upload the mind of anyone reading this now. My answer to this is no, with a high degree of probability, given what we know now about how the brain works, what we can do now technologically, and what technological advances are likely in our lifetimes. My second question is whether it will ever be possible to upload a mind, or whether there is some point of principle that will always make this impossible. I’m obviously much less certain about this, but I remain sceptical.
It's truly important that Jones insists such a discussion "glosses over some deep conceptual questions" but I wonder why this admission does not lead to a qualification of the predicate assertion, "it’s a good concrete starting point." To the extent that "uploading" is proffered by futurologists as a techno-immortalization scheme it isn't at all clear that even a successful "simulation" would satisfy the demands that invest their scheme. I flog the talking point that "you are not a picture of you" endlessly to make this point, but one might just as easily point out that nobody seriously entertains the substitution of one person by a longer-lived imposter as a viable life-extension method. And while I would agree that selfhood is substantiated in an ongoing way by observers, I think it is important to grasp that these observations have objective, but also subjective and inter-subjective dimensions, none of which are adequate on their own and all of which supplement one another -- and also that these observations are not merely of "already existing" characteristics but of sociocultural scripts and norms through which selves are constructed/enacted in time. Uploading discussions tend to deploy radically impoverished understandings not only of selfhoods themselves but of the terms of their substantiation. Again, Jones does not deny any of this, and he tends to be enthusiastically open to such considerations, but I wonder whether technical debunkings that circumvent such considerations at their point of departure don't end up smuggling in more of the reductionist nonsense he critiques (quite as much as I do) than he would like.

Another case in point is Jones' truly welcome intervention into the work of metaphors in such discussions:
[T]o get anywhere in this discussion, we’re going to need to immunise ourselves against the way in which almost all popular discussion of neuroscience is carried out in metaphorical language. Metaphors used clearly and well are powerful aids to understanding, but when we take them too literally they can be badly misleading. It’s an interesting historical reflection that when computers were new and unfamiliar, the metaphorical traffic led from biological brains to electronic computers. Since computers were popularly described as “electronic brains”, it’s not surprising that biological metaphors like “memory” were quickly naturalised in the way computers were described. But now the metaphors go the other way, and we think about the brain as if it were a computer (I think the brain is a computer, by the way, but it’s a computer that’s so different to man-made ones, so plastic and mutable, so much immersed in and responsive to its environment, that comparisons with the computers we know about are bound to be misleading). So if what we are discussing is how easy or possible it will be to emulate the brain with a man-made computer, the fact that we are so accustomed to metaphorical descriptions of brains in terms of man-made computers will naturally bias us to positive answers.
This is music to my ears, but I have to wonder if these considerations really go far enough. (I'm a rhetorician for whom figurative language is the end-all be-all, so a working scientist like Jones might fairly question whether I would ever be satisfied on this score.) A scholar like Katherine Hayles has done extensive historical research into the ways in which the metaphors Jones is talking about here actually formed information science and computer science disciplines from their beginnings, so creating the conceptual terrain on which computers would seem plausibly describable later as "electronic brains" in the first place, an abiding conceptual terrain eventuating later still in the more recent reductions of discursive and cultural dynamics to "memes" and "viralities" -- or critical interventions into them nonetheless as efforts at a kind of "immunization," for example. Jones' talk about how we have been trained to treat glib biological and informational identifications as neutrally descriptive reaches deeper even than he reveals: how else do we account for the paradoxical proposal of his parenthesis that the brain is properly identified as a computer, while at once the brain is disanalogous with any actual computer? These associations are, as Jones says, so deeply ingrained as to be "naturalized." For me, it is enormously interesting that minds have so often been metaphorized as prostheses -- before its figuration as computer the mind has been mirror, blank slate, distributed steam pipes -- and that new figures do not displace old ones even when they are at odds. Freud's steampunk mind of repressions, displacements, projections, outlets lives on in the discourse of many who have made the digital turn to the computational mind. Who knows how or why exactly?

I find nicely provocative Jones' speculative proposal that "the origin of van der Waals forces, as a fluctuation force, in the quantum fluctuations of the vacuum electromagnetic field... could be connected to some fundamental unpredictability of the decisions made by a human mind" and I am pleased that he takes care to distinguish such a proposal from theories like that of Roger Penrose that "the brain is a quantum computer, in the sense that it exploits quantum coherence" (since, as he points out, it "... [is] difficult to understand how sufficient coherence could be maintained in the warm and wet environment of the cell"). For me, it is not necessary to save an ontic indeterminism traditionally ascribed to human minds through such expedients, since I was convinced well over twenty years ago by Rorty's argument in "Non-Reductive Physicalism" (from Objectivity, Relativism, and Truth, Cambridge: 1991, pp. 114-115) that one can be quite "prepared to say that every event can be described in micro-structural terms" while at once conceding that "[f]or most interesting examples of X and Y (e.g., minds and bodies, tables and particles) there are lots of true sentences about X's in which 'Y' cannot be substituted for 'X' while preserving truth... This is because any tool which has been used for some time is likely to continue to have a use... a tool can be discarded... [but i]n such cases X-talk just fades away; not because someone has made a philosophical or scientific discovery that there are no X's... [nor] by 'linguistic analysis,' but, if at all, in everyday practice." I am cheerful about the prospect that the free will indispensable to my sense of selfhood may be a perspectival or discursive effect, but however poetically or scientifically potent its jettisoning might eventually become, dispensing with it would unquestionably be stupid and sociopathic for now rather than saying better the way the world is or speaking more in the language the universe prefers to be described in or any nonsense of the sort.

I doubt that saying so would go very far toward convincing Jones -- any more than most transhumanists, for that matter -- that my own preferred philosophical and rhetorical arguments are more clarifying than their preferred technical skirmishing over the state-of-the-art and projected technodevelopmental timelines. But, again, I do worry that accepting enough figurative (rhetorical) and conceptual (philosophical) assumptions to have mutually intelligible "technical" discussions with techno-transcendentalists, especially when there really is no need to do so, simply concedes too much ground to them for resulting debunkery at its best to do much good -- they can always respond, after all, with minute nonthreatening qualifications or terminological shifts that leave you debating angels on pinheads at the level of detail interminably.

I am quite sure Jones is alive to this very worry, as he concludes with a practical consideration that looms large in my critiques of the futurologists as well:
[I]deas like mind uploading are not part of the scientific mainstream, but there is a danger that they can still end up distorting scientific priorities. Popular science books, TED talks and the like flirt around such ideas and give them currency... that influences -- and distorts -- the way resources are allocated between different scientific fields. Scientists doing computational neuroscience don’t themselves have to claim that their work will lead to mind uploading to benefit from an environment in which such claims are entertained by people like Ray Kurzweil, with a wide readership... I think computational neuroscience will lead to some fascinating new science, but you could certainly question the proportionality of the resource it will receive compared to, say, more experimental work to understand the causes of neurodegenerative diseases.
As I point out above, the effort critically to address techno-transcendental formulations on something like their own terms can smuggle prejudicial and reductive assumptions, frames, and metaphorizations into the discourse of even their critics in ways that circumscribe deliberation on these questions and so set the stage for the skewed public policy language and funding priorities and regulatory affordances that Jones points to here.

As a demonstration of how easily this can happen, notice that when Jones offhandedly declares that "[i]t’s unquestionably true, of course, that improvements in public health, typical lifestyles and medical techniques have led to year-on-year increases in life expectancy," the inevitable significance with which techno-transcendentalists freight such claims remains the furthest thing imaginable from an "unquestionable tru[th]" (Jones declares it "hollow" just a few sentences later) and yet the faith-based futurological frame itself remains in force even as he proceeds with his case: Needless to say (or it should be), improvements in prenatal care, childhood nutrition and disease treatment can yield year-on-year increases in life expectancy without year-on-year increases in life expectancy for people over the age of sixty-five, for example, and even if improvements in the treatment of heart disease and a few other chronic health conditions of older age yield some improvement for that cohort as well, this can and does remain compatible with absolute stasis of human longevity at its historical upper bound even if presently intractable neurodegenerative diseases are ameliorated, thus bedeviling altogether the happy talk of techno-immortalists pretending actuarial arrows on charts are rocketing irresistibly toward 150 year lifespans even in the absence of their handwaving about nanobotic repair swarms and angelic mindclone uploads. It is not an easy thing to address a critique to futurologists on terms they will not dismiss as hate speech or relativistic humanities mush and yet continue to speak sense at all. Richard Jones continues to make the effort to do so -- and succeeds far better than I do at that -- and for that I am full of admiration and gratitude, even if I devote my energies in response to his efforts to sounding warnings anyway.
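A footnote on that life-expectancy point, since the arithmetic is worth making concrete. Here is a toy example with invented numbers of my own (nothing from Jones): cut early deaths and life expectancy at birth rises smartly even though nobody in either cohort lives a day past the same old upper bound.

def life_expectancy(deaths_by_age):
    # Average age at death across a cohort, given {age: number of deaths}.
    total = sum(deaths_by_age.values())
    return sum(age * n for age, n in deaths_by_age.items()) / total

before = {1: 20, 50: 30, 80: 50}   # invented cohort: many early deaths
after  = {1: 4, 50: 20, 80: 76}    # prenatal and childhood care improve...
                                   # ...but nobody lives past 80 in either cohort

print(f"{life_expectancy(before):.1f}")  # 55.2
print(f"{life_expectancy(after):.1f}")   # 70.8

The "actuarial arrow" climbs fifteen years on this chart while the oldest person in the room stays exactly as old as before.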

Sunday, September 14, 2014

Kitten Has Clause

I have a social contract with my cat. It is full of claws.

Faith Based

There is no article of faith more fancifully supernatural than belief in a natural free market.

More Dispatches from Libertopia here.

Lost To Be Found

Contra privacy rights discourses, privacy-qua-privation is not only NOT threatened by algorithmic profiling but vastly expanded and imposed. The greatest danger of the surveilled/profiled algorithmically legible subject is that it threatens our public, not our private selves. What Arendt sought to describe and defend as "public happiness" is not available on the terms proffered by the false publicity of the surveilled/profiled subject. But in my classrooms and in union meetings, in public demonstrations and assemblies, both the pining and the pleasure Arendt is talking about are still palpable. The phenomenological inhabitation of democracy and its openings onto experimental political practices -- what Arendt called "The Lost Treasure of the Revolutionary Tradition" -- is always lost, the better to be found, and re-founding, in each new generation.

For more, check out my Twitterized Privacy Treatise and the posts available under the "Surveillance" topic tag at The Superlative Summary.

I Me Me Mine

Every Apple product is a funhouse mirror, an iMe killing time.

More Fool Me Tee Vee here.

Think This Notion Has A Future?

Idea for sf movie, novel, game: tho' apparently undistinguished and from humble origins, straight white guy turns out to be The Chosen One!

Saturday, September 13, 2014

Blear of Death

I regularly hear from people the claim that the fear of death is universal. I honestly wonder if I am simply not understanding what people are trying to communicate when they say this? I mean, I do hope I am not in excruciating pain or completely isolated when I die -- but I am not thrilled at such prospects even when they are survivable; it's not death that is fearsome in them as far as I can tell. Sure, I want to live, it's the only game in town. But, I dunno, I really truly find a fear of death weird. And preoccupation with that fear seems to me especially terrible and deranging, even a kind of death in life. Perfectly nondescript unaccomplished persons manage to die all the time. Honestly, how hard or how odd can it be? I don't get it.

Baby Talk

Futurology in its techno-transcendental moods is infantile wish-fulfillment, in its existential risk moods infantile attention seeking.

More Futurological Brickbats here.

Thursday, September 11, 2014

Today's Random Wilde

A community is infinitely more brutalized by the habitual employment of punishment, than it is by the occasional occurrence of crime.

Wednesday, September 10, 2014

Booting the Boots

Weird how important the figure "boots on the ground" has become, especially given its ambiguous relation to literal boots on the ground.

Tuesday, September 09, 2014

Comedy Routines Compensate Routine Tragedies

Or so I find.

Robot Cultist Martine Rothblatt Is In the News

If you scroll down to the name "Martine Rothblatt" in the Superlative Summary (an archive of anti-futurological critiques and also rants of mine) you will find The "Imagination" of a Robot Cultist, More Serious Futurology from Martine Rothblatt, and Martine Rothblatt's Artificial Imbecillence, or if you scroll to the topic "Avatar Techno-Immortalism" you will find a critique of one of Rothblatt's pet projects, What's Wrong With Terasem?

Not to put too fine a point on it, I think it actually matters that we don't know enough about brains to understand let alone replicate let alone "exceed" human consciousness, and that confident proposals based on such mastery, especially proposals that tap into irrational passions for superintelligence and immortality (or, more to the point, irrational fears of vulnerabilities to error and misunderstanding or irrational fears of aging and death), are profoundly irresponsible, deceptive even if also self-deceived, and when propounded in the tonalities of promises of salvation for the faithful, at once pseudo-scientific, even anti-scientific, amounting to flim-flam operations.

Of course, the world -- and especially the pop-tech press -- is full of hype and loose talk and promises of sooper-powers and looming apocalypse. Isn't all this rather harmless if it isn't taken too seriously? Rothblatt rather seems like fun for a dinner party.

At the risk of being a fuddy-duddy, the fact is that in a time of planetary environmental catastrophes and digital networked surveillance and marketing formations we happen to need quite desperately to be able to make recourse to warranted scientific consensus in the context of the most accountable possible democracy to help solve our shared problems and implement harm-reducing public policies and maintain renewable and equitable public infrastructural affordances. Pseudo-science whether in the form of climate change denialism, homeopathic evangelism, or futurological techno-transcendentalism erodes standards, suffuses public deliberation with skewed stakes, and distracts attention from real problems and real possibilities.

Quite apart from all that, I personally think that those who fear death in ways that lead them to fixate on nanobots spinning shiny robot bodies out of their frozen hamburgerized brains or to fixate instead on having their "info-souls" "scanned" or "uploaded" so as to live as cyberangels in Holodeck Heaven (more to Rothblatt's way of pining) can be tragically more dead in their dread and denial than they need be in life, and all to be quite as dead in death as everybody ends up being anyway. And while we have had nearly a century by now of cocksure computer enthusiasts declaring that their reductive disembodied often sociopathic conceptions of human intelligence are the springboard for the arrival of artificial intelligence any moment now, year after year after year, usually just Twenty Years in the Future! never diminished in their certainty despite the endless deferral of the dream, worse than the facile techno-cheerleading of it all for me is that despite their serial failures the impoverished understanding of intelligence that fuels their failed vision meets with success after success, as we call more and more dumb devices and programs "smart" that are not, all the while risking losing altogether the sense of the distinctiveness and demands of the actually incarnated intelligence of real live human beings in the world, losing our capacity to appreciate and empathize and support the decent, delicate, delicious collaborating consciousnesses of billions of living needful earthlings who share this world in this moment with us as real stakeholders: Fixated instead on our stupid smart cards, our stupid smart cars, our stupid smart homes, our stupid smart phones, a whole stupid smart crap avalanche tumbles toward the landfill that would take us all down with it.

All that said, I find it quite easy to agree with the main assertion of a rather breathless profile of Rothblatt in Jezebel today: "One of the world's most fascinating humans is easily Martine Rothblatt, America's highest-paid female CEO. A futurist and trans woman who cofounded both Sirius Satellite Radio and a biotech company, she's got a robot replica of her wife and believes that soon, tech will allow our consciousnesses to live forever." Fascinating? Unquestionably. The Jezebel profile refers to another higher-profile profile of Rothblatt published in New York magazine last weekend. Jezebel describes that piece as "exhaustive" but I failed to see mention in that piece (or in the Jezebel piece) of Rothblatt's "solution" for "achieving lasting peace in the Middle East," namely, that Israel and Palestine both be admitted as States in the United States of America. Because if anything is obvious to everybody it is that the United States has not been involved enough in Israeli politics? Rothblatt wrote a whole book about it. That proposal is also "fascinating" to say the least, but perhaps there are too many other words that come to mind to describe it for it to find its way to a mention in a sympathetic fluff piece on the occasion of the flogging of yet another new book, this one saying nothing new to those who have been following the techno-immortalist sects of the transhumanist and singularitarian Robot Cult with any kind of attention.

Anyway, I can't say that I am saying anything new either in responding to Rothblatt's latest star turn. The truth is that I was inspired to write this mostly because of the comments on the Jezebel piece. These seemed to me to be quite encouraging, which in itself really did feel like something new to me. The first thing I was pleased to see is that so many of the commenters were vigilant and fierce in policing any dumb transphobia on exhibition in these pieces or occasioned by them. There has been more and more of that in online precincts especially, and as a queer geek and queer theoryhead of many years it just makes me happy to find so much ferocious intelligence and impatience with cissexist and transphobic discourse in the world. But the second thing I was pleased to see is that even in geeky online fora lots of people are having no truck with transhumanoid and immortalist futurological formulations -- many more people seem to recognize that techno-transcendentalisms are pseudo-scientific varieties of evangelical flim-flam or corporate-militarist apologiae and hype. Back in the days when the libertechbrotarians ruled over so much of the cyberspatial sprawl it seemed any corporate press release hyperventilating a modest qualified app or lab result into a Royal Road to techno-utopian godhood was greeted with blissed out consumer fandoms and saucer-eyed True Believers who would cast out the least skepticism like a band of rabid Ayn Rand protagonists slashing at moochers with their blade-sharp sculpted cheekbones. Those days, it seems, at least for now, are over.

Monday, September 08, 2014

I Haven't Kept Up With the Fool Me Tee Vee Wisecracks

Watching Chris Matthews is like watching Mars Attacks! dubbed entirely into Martian.

More Fool Me Tee Vee here.

Is Reddit A "Failed State"?

I can agree that Reddit is run by something like "warlords," I suppose -- by which I mean merely to note that it is driven by the libertechbrotarian subculture that once suffused the entire web and remains a noisy noisome minority with delusions of prevalence and viability. But Reddit as an online networked formation is so far from statelikeness that to call it a "failed state" is to court a misconstrual nearly as radical as any contention its bullying bigots might hold that they are making a successful go at freedom. Culture isn't government, so there are category errors lurking in such glib formulations -- which is pretty commonplace when so-called "libertarians" are making noises. T.C. Sottek implies in a recent piece in The Verge that Reddit wants to think of itself as a government in some sense, but that would be an impossible thing for them to be and hence a stupid thing for them to want to be, quite apart from the fact that I don't see much evidence that they really want to be anyway. If Reddit facilitates crimes, they will discover soon enough, like generations of cocksure libertechbrotarians before them, that cyberspatial boasts to the weary giants of flesh and steel that "You have no sovereignty where we gather" don't particularly cut the mustard. There are moments, to be sure, when Reddit seems to want to seem a principled and respectable enterprise, although hardly in any really consistent way. And there is no question that Reddit attracts reactionaries and is run by enough reactionaries that they are slow to grasp that any aspirations to sustainable mainstream profitability are threatened by the reactionaries. Reddit has attracted eyeballs longer than I expected it to, so I have no predictions to make about all that. I must say I have always personally found the place pretty gross and dumb for the most part, and Sottek's exposures confirm longstanding impressions.

T.C. Sottek, "Reddit is a failed state: The 'front page of the internet' is run by warlords":
Reddit wants to be a techno-libertarian's wet dream, but in practice it's a weak feudal system that's actually run by a small group of angry warlords who use "free speech" as a weapon. Reddit is mostly a nice place filled with nice people who run nice little communities, but there's virtually nothing keeping them safe from bullies like "John," a 33-year-old man who brazenly dispersed stolen private photos and then cried foul when The Washington Post published information about him. Reddit's government is more interested in protecting John than the women he harassed. [CEO Yishan] Wong wrote that "the role and responsibility of a government differs from that of a private corporation, in that it exercises restraint in the usage of its powers." Forgiving for a moment the fact that this statement is completely wrong, Reddit's justification for this special type of behavior is incoherent since it does exercise its powers to censor content and protect people, unless they are victims... Last time the company found itself here it was dealing with negative press over a couple of seedy communities that were distributing "creepshots:" sexualized photos taken of women, often in awkward or compromising positions, without their knowledge. Reddit allowed this to go on for some time, but only brought out the big guns when Gawker revealed the creepshot ring leader was a 49-year-old man named Michael Brutsch, because Reddit believes in free speech as long as it protects the unsavory men who keep exploitative content flowing. Even charities won't accept money from these men, but Reddit will. According to a report from Recode, Reddit's free speech zone, where men run wild over women's privacy and dignity, may be worth somewhere in the neighborhood of $500 million. If Reddit wants to be thought of as a government, we'll call it what it is: a failed state, unable to control what happens within its borders. At minimum, Reddit is a kleptocracy that speaks to lofty virtues while profiting from vice.

The Yes That Is The No

Behind the eyes of every positive person cold opportunism, every confident boast shrieking panic, every insistent optimist cruelest dark.

Meet the Penis

Re-"tooled," still white.

By the way, the original host of NBC's Meet the Press was Martha Rountree, who developed the show for radio and then moderated the televised version in its first years.


Saturday, September 06, 2014

Natasha Lomas on the Shift from Social to Sociopathic Software

The lesson I personally draw from the Natasha Lomas piece in yesterday's TechCrunch is that when online social networks like facebook and twitter emerge into the rare sort of prevalence that makes them actually look reliably profitable (or that awful gawky artless term of art, monetizable) as few such efforts ever manage to be and even fewer for long, these social networks can then mature in roughly two ways (the second of which few seriously entertain yet, but I insist we must): Either they will be transformed into sociopathic networks driven by for-profit algorithms that betray and usually alienate their initial user-base, or they will be nationalized instead as public goods driven by the people who will continue to use them in unpredictable and unreliable and often unprofitable ways.

Follow the link for the whole piece, There's Something Rotten in the State of Social Media; my excerpts follow:
Facebook forcing users to download a separate messaging app if they want to carry on IMing their friends. Twitter polluting its users’ carefully curated timelines with content they did not choose to read. Facebook manipulating the emotional cadence of content to figure out whether it can actively impact users’ emotions. Twitter flirting with the idea of adding even more algorithmically selected content into the timeline -- content that prioritizes what’s already popular, and thus mainstream, at the expense of non-mainstream interests, nuance and minority views. Meanwhile, studies suggest the majority of Facebook users are unaware that what they see in their newsfeed is not actually the sum total of their friends’ news, but rather an algorithmic skim -- biased for clickbait, stickiness and, of course, advertising... The overarching theme [is]... increasing external pressure to monetize these social services... Bottom line: who I choose not to follow is [a] core part of why Twitter is so useful to me... Also... human time and attention span are finite, so any digital service that steers you away from the things you are actively interested in for its own profit-making ends is acting parasitically.... [A]nother theme here: increasing automation, and thus decreasing human (end-user) control. Automation... lowers costs over the long term because it’s autonomous... automation is linked intrinsically to monetization. Automation can also be granularly controlled via algorithmic levers... A business can clinically figure out how to maximize factors such as user engagement or ad views just by running a couple of new algorithmic recipes and comparing the results -- selecting whichever one maximizes its bottom line... they just optimize core algorithms for particular business outcomes: page views, stickiness, user engagement, and so on. And for the overarching business imperative: profit... But it’s not without cost... Slowly but surely the freedoms that initially drew us into these glittering social spaces are being withdrawn, as barred gates drop into place -- limiting our usage options, and controlling and constraining the social content we see. The walled gardens shrink, getting narrower in outlook as the logic of their underlying content-filtering algorithms becomes evident. The business advantage of limiting what users can do is that user behavior can be better channeled and predicted... Many people won’t even realize exactly how staged and contrived their digital social services have become, as they are encouraged to keep scrolling mindlessly through all the tasty looking, populist clickbait fired at their eyeballs. Yet the original promise of a free social space is reduced to... a pace-optimizing profit-maximizing machine... Automation is also a type of enforcement. It has to be. It implies a lack of choice stemming from the lack of a human hand steering things. Control is taken away from humans and handed over to what are (for now) human written algorithms... Facebook and Twitter are indeed free at the point of use. And the hackneyed tech maxim runs that ‘if it’s free you’re the product’. And yes it follows that if you’re the product then you’re not an agent involved in the service but a unit opted in to being manipulated by the service. The service views you as a controllable commodity... in order to achieve its goal of greater profits. But there is a problem with that logic. Thing is, without us human users of these social services there would be no Facebook and no Twitter.
Without millions and even billions of people freely producing and uploading content there would be no social media palaces at all... It can be difficult to decry social media’s inexorable shift towards algorithmically curated info-feeds on the grounds that these businesses need to drive usage to please their investors. After all that is how publicly traded companies operate... Twitter, as it (mostly) is now with users in the driving seat, is a service with a human soul. While Facebook, which long ago prioritized algorithmic logic over human choices, is just another mechanized process.
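Since the piece leans so hard on the image of platforms "running a couple of new algorithmic recipes and comparing the results," it may help to see how little is actually involved. What follows is a deliberately toy sketch of my own -- not anything from the article, and not any actual platform's code; every name and number in it is hypothetical -- of that kind of engagement-maximizing comparison: two ranking "recipes" are run over the same pile of posts, and whichever one squeezes out more predicted engagement is the one that ships, regardless of what the user actually chose to follow.

```python
import random
from dataclasses import dataclass

@dataclass
class Post:
    author_followed: bool   # did the user actually choose to follow this author?
    clickbaitiness: float   # 0..1, how "sticky" the item is predicted to be
    ad_value: float         # 0..1, how monetizable the impression is

def recipe_chronological(posts):
    """Recipe A: show only what the user chose to follow, in the order given."""
    return [p for p in posts if p.author_followed]

def recipe_engagement(posts):
    """Recipe B: rank everything, followed or not, by predicted engagement plus ad value."""
    return sorted(posts,
                  key=lambda p: 0.7 * p.clickbaitiness + 0.3 * p.ad_value,
                  reverse=True)

def simulated_engagement(feed, attention_span=10):
    """Crude stand-in for an engagement metric: summed stickiness of what fits in view."""
    return sum(p.clickbaitiness for p in feed[:attention_span])

# A fake corpus: a mix of followed and unfollowed (that is, injected) posts.
random.seed(1)
posts = [Post(author_followed=random.random() < 0.5,
              clickbaitiness=random.random(),
              ad_value=random.random())
         for _ in range(100)]

# The "clinical" comparison the article describes: run both recipes, keep the winner.
scores = {fn.__name__: simulated_engagement(fn(posts))
          for fn in (recipe_chronological, recipe_engagement)}
winner = max(scores, key=scores.get)
print(scores, "-> shipping:", winner)
```

The point of the toy is not the arithmetic but the shape of the decision: nothing in the "winning" recipe knows or cares whom the user chose to follow, which is precisely the loss of end-user control the excerpt is worried about.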

Graduate Seminar on the Anti-Politics of Design

Here are the first few weeks of the syllabus for my graduate seminar at the San Francisco Art Institute, Designs On Us. The course began as an idea for a book that never went anywhere -- for whatever reason, I think elaborating arguments through course trajectories suits me more than doing so in longform books. Anyway, the syllabus goes on from here, but the textual assignments change quite a bit as the weeks go on, so there is no point posting the rest of it here and now. If you are interested, click the link to the course blog and you can read the assignments as they arrive. The assignments usually arrive at their final form no later than the Friday before the following session.  

CS-500H-01 Designs On Us: The Politics and Anti-Politics of Design

Course Blog: http://designsonus.blogspot.com/
Dale Carrico: dcarrico@sfai.edu; ndaleca@gmail.com

Attendance/Participation, 10%; Precis, 10%; Designer Presentation, 10%; 10+ Comments, 10%; Symposium Presentation, 10%; Final Paper, 50%

The proposal that is the point of departure for our course is that design discourse is a site where at once politics is done and politics is disavowed. Design as a site of "designation" invokes the gesture of naming as mastery, of reduction as revelation, of problems as provocations to instrumental technique and not stakeholder struggle, a mentalité with its own paradoxical temporality, publicity, linearity, cognition. Design as a site of the "designer label" is an indulgence in fetishism, of the commodity-form, of an auratic posture, of a psychic compensation of lack and its threat. To elaborate and pressure these propositions, we will spend quite a bit of time in the critique of three design discourses in particular: one, Green design which would accomplish sustainability without history; two, social/p2p software design which would accomplish democracy without participation; and three, eugenic design which would accomplish life-enhancement without lifeway diversity. In your individual presentations I hope we will ramify our attentions to other design sites: comparative constitutions, fashion design, food styling, graphic design, industrial design, interior design, landscape design, "life coaching," and who knows what else?

Week One | August 27 -- Introductions

Added, for those who mentioned during the opening lecture that they might like a little more background on the fetishism of the Designer Label:

Marx on The Fetishism of Commodities and the Secret Thereof from Capital
Walter Benjamin, Art in the Age of Mechanical Reproducibility
Naomi Klein, Taking Aim at the Brand Bullies from No Logo

Week Two | September 3 -- Biomimicry, Cradle to Cradle, Natural Capitalism

Martin Heidegger, The Question Concerning Technology
Dale Allen Pfeiffer, Eating Fossil Fuels
Janine Benyus, Echoing Nature
Biomimicry Institute, Velcro
William McDonough & Michael Braungart The NEXT Industrial Revolution
Cradle to Cradle -- Principles
Amory Lovins, Hunter Lovins, Paul Hawken, A Roadmap for Natural Capitalism
OpenPolitics Critiques of Paul Hawken and Natural Capitalism

Week Three | September 10 -- Permaculture and Viridian Design

Bruce Sterling, When Blobjects Rule the Earth
Bruce Sterling, Manifesto of January 3, 2000
Viridian Design Principles
Bruce Sterling, Last Viridian Note
Wes Jackson and Wendell Berry, A 50-Year Farm Bill
The Land Institute: (a) Issues (b) Solutions (c) Science
Navdanya: About Us
GEN: Global Eco-Village Network: Definitions
Permaculture Design Principles, Online Interactive Presentation

Friday, September 05, 2014

Pseudonymous Futurology

Futurology as pseudo-science: Confusing science fiction with science practice, confusing faith-based initiatives with consensus science.

Futurology as pseudo-politics: Confusing consumer fandoms with stakeholder constituencies, confusing technocratic incumbency with democratic contestation.

Futurology as pseudo-philosophy: Confusing PR with analysis, confusing making bets with having thoughts.

Too Bored

...to blog.

Thursday, September 04, 2014

Wednesday, September 03, 2014

Decisions, Decisions

I suppose one way to show you value journalists would be to try to use their occasional murder as a pretext for indiscriminate slaughter. Another option might be, you know, reading and supporting their actual work.

Tuesday, September 02, 2014

Today's Random Wilde

I don't believe in progress: but I do believe in the stagnation of human perversity.

Monday, September 01, 2014

Last of the Grading...

...finally finishing up the last of the grading for the summer's last intensive term. Blogging low to no for a little while. Resuming sanity soon. Comparative sanity.