In my previous posts on happiness, I argued that people value happiness only if it's achieved in a fashion consistent with their ultimate values. To tie this back thematically to transhumanism --- this is yet another reason that some people will choose not to undergo radical alterations, regardless of how happy or powerful the transformed beings become.
Some people value their humanity, however "irrational" that attachment, and any future transhuman society must respect the choice of some people to remain "merely" human, or "merely" anything else.

I'm pretty open to radical alterations --- I once told my friends that I thought it would be cool to have my consciousness uploaded into a star for a couple of billion years (yes, that's right, a star, as in a gargantuan flaming ball of plasma; assuming, of course, that you could build a computational substrate capable of supporting consciousness in such a medium) --- but even I have limits. I do not want to be forced to exceed those limits, either by direct coercion or by competition for the resources I need to survive.

(Incidentally, the latter requirement implies that an ethical transhuman society must, to some extent, provide a welfare state for ordinary humans; as I've noted previously, it seems probable that transhumans will outcompete humans in every field of endeavor, making direct economic competition ruinous for humans.)