All this talk about the singularity misses something vital about human nature, I think.
I don't care if an electronic 'clone' of me exists somewhere. That's fine for him. It does absolutely nothing for me. If I'm going to seek a remedy for death, it's not going to be a wholesale replacement of what I am - it is, to the greatest extent possible, going to be continuous with the 'me' that exists today... or no deal.
That may include technological enhancement of what exists already, in addition to medical treatments. It may include a slow 'growing over' of the existing neurological structure with nano-metal or whatever the hell you wanna call it... but who cares, seriously, if somebody just makes a twin of me?
Furthermore... emotion. Not gonna give that up. Anybody who does choose to give it up is a literal monster. Ethically, it's gonna come down to being judged like that. Just because we become robots does not mean we'll lose the basis for emotion - social bonds, material interdependency, uncertainty, and the desire for safety and familiarity.
You can talk all you want about living in a virtual world, but in any scenario we can imagine, there's still going to be a physical universe supporting all that virtual existence... and that means there are still going to be 'off switches' for everybody, and you better dang well pay attention to the physical world or risk death. The 'base universe' is still going to be the most meaningful field of interaction.
Finally, I'm going to borrow an ethical judgement from 'The Culture', Iain M. Banks' sci-fi post-scarcity society. They simply regard advanced civilizations that withdraw from the universe as anti-social. In other words, the attitude is, well... 'Hey, they've given up on helping? Screw 'em, we can do better than that. We're more involved.'