Michael recommends that I have a look at a short text, Technology: Four Possible Perspectives, most of which I quote here:
1. Technology will lead to extremely good outcomes (technophile)
2. Technology will lead to extremely bad outcomes (technophobe)
3. Technology will lead to outcomes that are on the whole neutral (technonormal?)
4. Technology will lead to extreme outcomes, either good or bad (technovolatile?)
I happen to consider all four stances offered here rather unserious. That is because they seem to me to reduce actual technodevelopmental complexity and dynamism to something of a monolith ("technology [in general]"). Worse, they seem to go on to invest that monolith with a kind of intentionality and agency ("will [automatically?] lead"). This gesture is all the more worrisome since it seems (as happens so often among Superlative Technocentrics) to go hand in hand with "neutral" (so-called), "autonomous," "apolitical" views of technodevelopmental change that would diminish the agency (or at any rate our awareness of and responsiveness to the agency) of the actually-existing actors on that terrain: the people who collectively research, study, invest, invent, test, publish, edit, teach, criticize, regulate, facilitate, promote, distribute, apply and appropriate technoscientific outcomes. As it happens, none of the -- presumably "only" -- "four possible stances" offered up in the recommended formulation describes my own position and, hence, I daresay the title of the recommended post may need revision.
Michael chides me for expressing what he paraphrases as the accusation that "[Transhumanists] are all 'feeding an enormous amount of irrational delusive careless and damaging thinking' according to you, which is too bad."
Needless to say, I say this very thing on a regular basis and have done for years, so I can't quite see how this would surprise anyone. What I will insist on, however, is that when I say these things it isn't an ad hominem insult but a conclusion supported by reasons (which, as you all know by now, I regularly reiterate in my writing).
In a nutshell, lately I have been distinguishing two broad perspectives on technodevelopmental politics (surely there are more, but these are the two that preoccupy me in such discussions for now). First, some people assume a sub(cult)ural perspective on technodevelopmental politics, that is to say, they assume an identity politics frame directing itself to the implementation of particular concrete futural scenarios with which they personally identify. I distinguish this sub(cult)ural perspective from technoprogressive perspectives that assume instead a democratizing frame devoted to what Jamais Cascio would describe as "open futures," and the substance of which demands, in my view, social struggle directing itself to the safest, fairest, most consensual, most democratically responsive distribution of technodevelopmental risks, costs, and benefits possible. There is, it seems to me, an undue linearity, elitism, and utopianism that sub(cult)ural perspectives lend themselves to, and I would suggest that sub(cult)ural technocentricity is the likeliest political (a better word might be depoliticizing) expression of what I call Technocentric Superlativity. (That's a lot of jargon, I know, but many long critiques are being telescoped here in a rather breezy way since this is a conversation that Michael and I have been having a long time by now.)
Michael continues: I just find it oddly fascinating that someone who doesn't believe that technology can cause extreme, sweeping, transformative change ("Superlative" in your rhetoric) obviously identifies with a community that does.
That simple overgeneralization ("radical change is likely") is, of course, not what I mean by the term "Superlative" in my own (oft-delineated) usage. Again, in a nutshell: Superlativity is a focus on an idealized farther-future over what I would regard as a more useful focus on futures emerging from and shaped by proximate problems and ongoing problem-solving. Superlativity also tends to be a discourse invested with hyperbolizing and transcendentalizing significances of a kind once associated primarily with religious worldviews and still vulnerable to appropriation by social formations with the trappings of authoritarian religiosity (like cults). To the extent that Superlativity solicits personal identification with rather than deliberation over particular futural scenarios (and this is very regularly the case in my view) it seems to be attractive to certain socially marginalized (many of them especially vulnerable to True Belief) and explicitly anti-social (among them, market fundamentalists and technocratic elitists) personalities. I find this very interesting, and more interesting still are the ways in which marginality, anti-sociality, and Superlativity tend to support one another.
It is abundantly clear from my writings that I consider technodevelopmental social struggle enormously sweeping and transformative in many of its historical and current and likely formations (an observation in any case so obvious that it hardly even qualifies as an insight in my view), contrary to Michael's impression, gleaned who knows how from who knows what writings of mine. And you can trust me when I say that I don't identify with "transhumanists" in the least (even if I count a few among my friends and colleagues), however "obviously" it might seem to them that I really truly must do so, presumably just because I happen to take them seriously enough to worry about the impact they have on technodevelopmental policy language, efforts at education and organizing, and so on.
The point of saying this sort of thing is not to indulge in facile name-calling, however much folks who feel targeted by these critiques may wish to dismiss them as such, but to undermine tendencies to Superlativity that seem to me to inhere in technocentricity (any social worldview defined by a focus on technodevelopmental questions) at a time when technocentricity is demanded of serious progressives in a changed and changing world. I say these things to transhumanists and other futurists in particular, by the way, precisely because I think many of them are quite open to these critiques, would benefit from them, and once enlightened would make better technoprogressive allies. I should have thought all that would be obvious by now.
Solutions to poverty, neglected disease and militarism will require a *combination* of political and technological solutions, Michael insists.
But the "technological solutions," so-called, are already available to eliminate poverty (and in any case technical problem solving is already ineradicably political); meanwhile, the barriers to the solution of these problems are indeed profoundly political questions of laziness, greed, short-sightedness, parochialism, and ruthless incumbency. New technologies will not alter that basic state of affairs one bit. Technodevelopmental outcomes express politics, they don't circumvent them. Until my Superlative Technocentric interlocutors grasp and come to terms with such basic propositions it is, I fear, rather difficult to take them very seriously for very long.
Michael proposes that: One either accepts the possibility [of Molecular Manufacturing in a strict Drexlerian construal] or not, and in your case, it seems you don't.
But the fact is that this particular logical alternative is not one I invest with much in the way of significance, personally. Is the particular scenario that preoccupies Michael's attention here logically possible, or at any rate not logically disallowed -- as it certainly is, for now, practically unavailable -- given our present knowledge?
Sure. But to be a wee bit provocative here: So what?
There are a bazillion equally logically possible outcomes that seem to me as likely as or more likely than this one at the level of detail where life is actually lived and will continue to be. And I am, in any case (as a rhetorician and technocritical theorist by trade, recall) far more interested personally in the fascinating displays of loose argumentation, the surrogate commentaries on contemporary circumstances, the symptoms of social alienation, collective wish fulfillment, authoritarian religiosity, and so on that tend to freight Superlative Technology discourses, than I could possibly be interested in making promises I can't keep or listening to others make such promises where technodevelopmental outcomes are concerned.
Michael admits, I find it odd that you recommend the materials at CRN that are obviously so "Superlative".
But there is nothing mysterious in all this. I simply see more in the materials at the Center for Responsible Nanotechnology than he seems to. For all I know Mike and Chris (the founders and directors of the Center) wouldn't agree with me at all about the texts of theirs and the discussions they have facilitated that seem to me to be the most valuable ones. Be that as it may, I certainly don't agree that everything of interest discussed at CRN is properly identified with Superlativity in my sense of the term, even if some of it is (as some of my own writing could no doubt justly be criticized for as well). I find the things Mike and Chris write about quite interesting on a regular basis, but it may be that I simply skip right past some discussions out of complete lack of interest which are the very passages that for Michael define the very spirit of the place altogether. Different things interest different people; this is nothing odd in the least but, to the contrary, surely an obvious commonplace.