Using Technology to Deepen Democracy, Using Democracy to Ensure Technology Benefits Us All
Friday, December 29, 2006
To Be Denatured Is Humanity's Natural State
Would augmentless humans have fewer rights than posthumans or AIs, since augmentless humans cannot participate at the same level?
There is simply no such thing as an "augmentless human." Whether you are talking about media immersion, transsexual surgeries, pacemakers, vaccinations, contact lenses, prosthetic limbs, clothing, or written language, every human being is always already ineradicably prostheticized through and through.
Thus, what will count as "augmentation" will always be radically contingent, historically and socially constructed.
The question that has been asked here is a good and important one (I encountered it on a discussion list earlier today, but I hear variations on this theme from many different sources all the time). It's an important question, because the issue of ensuring that all people have a say in the public decisions that affect them is the definitive problem of democratic politics. And I'd go on to say that sensitivity to the ways technoscientific developments might provide opportunities for some people to variously threaten or promise to impact the capacities of other people to have such a say in the public decisions that affect them is the key insight for technoprogressive folks in all their varieties.
But it is crucial always to keep in mind that the technologies and the science do not constitute a circumvention of or proper surrogate for what remains in its essence a very straightforward political problem. The politics articulate the technodevelopmental form conspicuously more than the converse. At the root of most technocratic, technophiliac, technophobic, and otherwise technocentric perspectives is a misunderstanding or outright denial of this very basic priority of the political over the technical.
Distributions of authority, wealth, knowledge, force, and luck are unequal, and this is a problem in democratic societies that value both the diversity that healthy democracies always exhibit and the equity on which those democracies depend to function. Technodevelopmental transformation constitutes the preeminent contemporary expression of this quandary, but it isn't anything new. For me, the best way to negotiate the quandary is to insist first of all, and as always, on the value of consent. We best ensure both the equity on which democracies depend and the diversity we celebrate as the sign of their thriving when we ensure that the scene of consent is as informed and nonduressed as may be, by securing the widest possible access to knowledge and recourse to the law, and by protecting the space of free deliberation by defending freedom of expression and association and securing freedom from want.
The original question also mentions "posthumans" and "AI," and I'll conclude by commenting very briefly on these curiously evocative figures as well:
[one] I would personally describe most of the beings who get described as "posthuman" candidates (because of their projected gee-whiz gizmoization rather than our own dull customary gizmoization) simply as "humans," since all humans have always been essentially prosthetic beings. You might say that Aristotle's definition of man as the "political animal," as the animal whose being uniquely comes to fruition in urban -- that is to say public and artifactual -- contexts, already forcefully suggests this point.
[two] As for nonbiological-substrate intelligence, I really do wish smart technoprogressive folks would wait to cross that bridge when we look like we may actually be coming to it. And, to be clear, I'm sure you'll be flabbergasted to learn that I personally think we are not now close to arriving at this point, or at any rate not close enough to prioritize this issue over other technodevelopmental issues here and now.
I'll go further and suggest that a strong focus on the rights of nonbiological beings who do not yet exist, and may not come to exist for a long time if ever, often symptomizes, in my view, social alienation more than anything else.
Also, and again this is just in my humble little view, such discussions too readily provide occasion for surrogate political discussions about culturally fraught issues like "racial" and sexual and other morphological differences (read the weird homophobic panic registered in many bioconservative discussions of chimeras, for example), or intergenerational anxieties (ditto, bioconservative discussions of "designer sooperbabies" or "clone armies"), or worries about the soundness of processes of public deliberation among members of a badly educated populace (this Burkean trace is discernible in most expressions of technocratic sensibilities), and so on. In almost every case it seems to me these difficult discussions would be incomparably more helpful to everybody involved if they explicitly used the terms for what they are actually about.