As someone trying to practice utilitarianism, I often run into some variant of this problem:
1. Let's say I convinced John to become a utilitarian.
2. John produces 100 more units of happiness over his life than he would have without my convincing him.
How many units of happiness did I produce by convincing John? I don't think the answer is 100, because then each of us would count those 100 units as his own, even though only 100 units were produced in total, not 200. So crediting myself with 100 units seems like it would bias my evaluation towards indirect consequences when deciding which action is more important to take.
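To make the worry concrete, here is a rough formalization; it is only a sketch, assuming "counterfactual impact" means the difference in total happiness between the world with and without each person's action (the notation $W$, $W_{-\text{me}}$, $W_{-\text{John}}$ is mine, not standard):

```latex
% Minimal sketch of the double-counting worry under a naive
% counterfactual definition of individual impact (notation is my own).
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Let $W$ be the total happiness actually produced, $W_{-\text{me}}$ the total
had I never convinced John, and $W_{-\text{John}}$ the total had John never
acted on being convinced. Define each person's impact counterfactually:
\begin{align*}
  \text{my impact}     &= W - W_{-\text{me}}   = 100,\\
  \text{John's impact} &= W - W_{-\text{John}} = 100.
\end{align*}
The individual impacts sum to $200$, yet only $100$ extra units of happiness
exist: naive counterfactual credit is not additive across people.
\end{document}
```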
I don't think I'm the first person to think about this. Does this problem have a name?