I was thinking about exchange rates between happiness and suffering and came up with the following:
The neural mechanisms that produce happiness and the ones that produce suffering may be quite different, so equating them will always rest on some arbitrary exchange rate. Even if they turned out to be the same mechanism, the rate would still be arbitrary. Others have made this argument before.
Why do we value "happiness" as such, but not more specific positive feelings like happy-excitement, love, or peacefulness? We would also need an exchange rate between things like physical and mental suffering (unless they use exactly the same neural pathways).
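To make the arbitrariness concrete, here is a minimal sketch (my own notation, purely illustrative, not any established theory): suppose net welfare is scored as

U = Σ_i w_i · h_i − Σ_j k_j · s_j

where the h_i are intensities of distinct positive feelings (joy, love, peacefulness), the s_j are intensities of distinct negative feelings (physical pain, mental anguish), and the weights w_i and k_j are the exchange rates. Neuroscience might eventually tell us what the h's and s's are, but nothing empirical fixes the w's and k's; any particular choice of weights is a normative stipulation.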
People may use "happiness" and "suffering" to encompass several feelings squished together. But where does this leave us in terms of utilitronium shockwaves? Well, they're still almost certainly the best possible outcome we can hope for, but perhaps not the best possible outcome imaginable.
Anyway, while our understanding of what creates happiness and suffering is limited, I'm pretty happy to let sentients choose their own mix of good emotions according to their preferences (supposing all sentients could simply program the emotions they would like for the day). I still wouldn't allow bad emotions, and with a better understanding of neuroscience I might become stricter in some ways.
Is there a word for this? Preference hedonistic utilitarianism?