While I think that Yudkowsky does tidily put to rest the allegedly Repugnant Conclusion, he introduces yet another problem, which he calls the Lifespan Dilemma. I think this dilemma is unique in that it distills absolutely everything that remains problematic about utilitarianism.
While I invite you to read the whole story, because it's kinda neat, I'll summarize the problem here. (I changed this dilemma from the first-person perspective to a global one to make it more compelling for utilitarians who truly care about the entire world more than about themselves.)
...
Suppose that you gain perfect knowledge that the current universe will last for 10^20 years, and that utilitarians will eventually manage to install a perfect universal utopia for the last 10^10 of those years. Also, the universe is deterministic, so nothing you can do can change this fate, except to take the following wager:
Would you exchange this current universe (as described) for a 1-(10^-100) chance of complete (painless) annihilation now and a 10^-100 chance of the entire universe instantly transforming into an ideal utilitarian utopia and continuing as such for 10^10^10^10 years?
...
If you do the utility calculation, the status quo guarantees 10^10 years of perfect utopia, while the wager gives a 1-(10^-100) chance of 0 years and a 10^-100 chance of 10^10^10^10 years. That works out to an expected value of 10^-100 × 10^10^10^10 = 10^((10^10^10)-100) years of perfect universal utopia, which is still astronomically more than the status quo.
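To make that arithmetic concrete, here's a minimal sketch in Python (my own illustration, not Yudkowsky's; the variable names are hypothetical). Since a number like 10^10^10^10 overflows any float or practical big integer, the comparison is done two logarithms down:

```python
import math

# Status quo: a guaranteed 10^10 years of utopia -> log10 = 10.
log10_status_quo = 10.0

# Wager: win probability 10^-100 -> log10 = -100.
log10_p_win = -100.0

# Payoff: 10^10^10^10 years -> log10(payoff) = 10^(10^10),
# so log10(log10(payoff)) = 10^10.
log10_log10_payoff = 10.0 ** 10

# Expected value: E = 10^-100 * 10^10^10^10, so
# log10(E) = 10^(10^10) - 100. Subtracting 100 from 10^(10^10) is
# utterly negligible, so log10(log10(E)) is still ~10^10.
log10_log10_expected = log10_log10_payoff

# Compare in double-log space: the wager's expected years dwarf the status quo.
print(log10_log10_expected, "vs", math.log10(log10_status_quo))
# -> 10000000000.0 vs 1.0
```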
Yet, at the same time, the odds are 10^100 to 1 that the entire universe will be annihilated and there will be no future for anyone.
Do you take the wager? I, for one, am completely confused.