[T]here are serious problems with the transhumanist agenda, and I think you ill prepare the world by waving your hands and saying "don't worry, they'll never pull any of it off."
There are indeed serious problems with "liberal" eugenic formulations that undermine consent in non-normativizing medical practices.
There are serious problems with budgetary priorities that foreground "futurological" existential risks (meteor impacts! gamma ray bursts! gray goo! Internets "wake up" as malevolent superintelligent Robot God, oh noes!) and then propose corporate-militarist geoengineering solutions over more proximate risks, appropriate technology, and p2p formations.
There are serious problems with elitists who deploy "accelerating change" rhetoric to justify circumventions of democratic deliberation and implement elites-know-best policy in favor of incumbent interests.
And since I do think that transhumanists and singularitarians function as a kind of extreme rhetorical frontier-space for trying out the frames, figures, and formulations on the basis of which these seriously problematic propositions eventually get disseminated into mainstream discourse, I do agree that one should take them seriously indeed.
And of course I do. And I focus my seriousness precisely where superlativity does its chief mischief -- at the level of rhetoric.
If, as I fear, what my Anonymous Commenter meant to imply instead when he (forgive the gender presumption) spoke of the "serious problems with the transhumanist agenda" is that a few Robot Cultists are really and for true gonna code a sooperintelligent history-ending Robot God or clone a designer sooper-baby army, and we need to "prepare the world" for this sort of thing before it's too late, well, then, you'll forgive me, but I think it is he who needs to get serious and not me, and how.