I know I went back and forth on this issue in the past, but I've finally come to a conclusion that I think makes clear sense.
The only point I disagree with (hedonistic) utilitarians on is the idea that the happiness/suffering of separate beings can be objectively aggregated. Happiness can be measured in terms of intensity and duration, but it can't be objectively quantified, because it doesn't exist in objective reality; it's an abstraction. Happiness is a subjective experience, and it's meaningless to talk about aggregating an experience, because an experience is subjectively felt and there is no shared consciousness between two or more beings. Some pro-aggregation HUs will concede, even if they insist it will never be a practical concern, that it might be theoretically justifiable to cause one person 1000 points of pain to prevent 1000 people from experiencing 2 points of pain each. But if the morally right decision in any scenario is the one that maximizes the greatest happiness/minimizes the greatest suffering (I think it is), this makes no sense, because there is no subjectively felt 2000 points of stress being prevented that would outweigh the one person's subjectively felt 1000 points of pain. By causing this one person 1000 points of pain to prevent 1000 other people from experiencing a much smaller amount of pain, you haven't decreased the greatest suffering; you've decreased the number of people who suffer. The goal of ethics shouldn't be to increase/decrease the greatest number of happy or distressed people, but to increase/decrease the greatest subjectively felt happiness/suffering.
The argument comparing the 2 points of stress each of the 1000 separate people would feel to feeling 2 points of pain a day for 1000 days is what prevented me from coming to this conclusion earlier, but I no longer think it holds any weight, because it still depends on the idea that there is some kind of supermind that experiences the collective 2000 points of pain intermittently, if not all at once. A pinprick every day for 1000 days, with the expectation that you'll experience it again later, or put into context with how many times you've experienced it in the past, is different from a pinprick you experience once, alone. If you wiped out my memory so that I no longer remembered having experienced the pinprick the day before and had no expectation of experiencing it again in the future, my argument would still stand: there is no subjectively felt 2000 points of pain that would make it any worse than experiencing 1000 points of pain at once.
Intuition aside, it's difficult to take seriously the idea that a universe with 10 billion moderately happy beings could be better than a universe with 1 billion extremely happy beings.
I'm open to opposing arguments but I think utilitarianism would be a lot easier to swallow for many people once you remove the idea of aggregation.