How important are climate change, aging, enhancement, etc.?

Postby Brian Tomasik on 2012-02-29T06:36:00

In August 2009, I had an email discussion with some friends about the relative importance of a few topics that are usually seen as worth considering for activism. I'm reproducing that email below with some modifications to update references and to update my own views slightly. ;)

Combating global warming

Possibly the main impact of climate change is its effect on existential risk -- if not directly through runaway climate change, then indirectly insofar as ecological burdens exacerbate political conflicts and increase chances of nuclear war.

As for the direct effects of global warming on the earth, I'm not totally sure whether they're net positive or negative. The direct human impact is almost certainly negative, but the wild-animal impact is less obvious: Will global warming increase or decrease the net wild-animal population of earth? I go into some detail in this piece, but my conclusion from a few hours of research is that I still can't tell either way. Tentatively, I'm guessing that climate change is bad because "Global Warming Could Trigger Insect Population Boom," and most of those insects would live short lives before dying painfully at a few days or weeks of age.

My main reaction to global warming, though, is the following: Since so many people are concerned about it, and it's now a major political issue, the marginal impact of your involvement will be really small. It's much better for utilitarians to focus their energies on big-picture questions that the general public misses.
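
As a toy illustration of this crowdedness point (with made-up numbers, not real funding figures): if a cause's total benefit grows roughly logarithmically with the resources invested, then the marginal value of an extra dollar is inversely proportional to what's already being spent.

    # Toy model of diminishing marginal returns -- illustrative numbers only.
    # Assume a cause's total benefit grows logarithmically with funding,
    # so the marginal benefit of one extra dollar is proportional to 1/funding.
    def marginal_benefit(current_funding, scale=1.0):
        return scale / current_funding

    crowded = marginal_benefit(1e10)   # pretend ~$10B/year already goes to climate
    neglected = marginal_benefit(1e5)  # pretend ~$100k/year goes to a neglected cause
    print(neglected / crowded)         # ~100000 -- five orders of magnitude

Under these (invented) assumptions, a dollar to the neglected cause goes five orders of magnitude further, which is the intuition behind focusing on questions the general public misses.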

Ending aging

In general, I'm skeptical of claims that this is a utilitarian cause, because I think a lot of people have obvious ulterior motives for wanting to support it. Since I'm a classical utilitarian, I do see organisms as just buckets in which to hold positive emotions, and it doesn't matter which buckets you use to store them or how often they're replaced. That's an approximation, since in practice, death causes personal anguish, pain to the elderly person, etc., but the point is that I don't view "saving lives" for its own sake as intrinsically valuable.

There are two ways in which I can see at least an attempt at a utilitarian justification for life extension:
  1. If you want to cause environmental destruction for the sake of wild animals, this might not be a bad way to do it (pending considerations about increasing insect populations through climate change).
  2. This is the standard argument: Extending lifespans will make more people care more about the far-distant future and so work toward reducing existential risk. This argument seems mildly plausible, and since I think the ending-aging project itself may be fairly cost-effective (in the sense of having high leverage for marginal donations), working on aging might be an okay way to prevent existential risk. Or maybe not -- if the increased environmental burdens make conflicts worse. And more people surviving implies more people (especially in the developed world) in total, which means more brains that can think up ways to destroy the world per unit time. But it also means more brains to work on space colonization to reduce risk.

Stopping wildlife suffering

Well, this is of course my favorite option. :P Mainly what I would focus on here is promoting concern for animal suffering, such as through veg outreach in the short term. It might make sense also to advance ideas like humane insecticides to push the envelope on people's moral sympathies in a way that still allows for concrete action today.

Among hard-core anti-speciesists, we can be more explicit about the fact that suffering in nature can hurt just as much as suffering due to human cruelty. I think there are a number of people who would latch on to the cause if there were a group out there working on it. I've met probably 15-20 people who now care passionately about wild-animal suffering, and most of the time it was because of the influence of other people they knew. (One friend said that my piece on the topic helped reassure him that he wasn't crazy. :))

Avoiding astronomical waste

Bostrom is right that if your priority is creating vast numbers of minds, rather than preventing massive amounts of suffering, then you should probably focus on existential risk. Plus, existential risk is a much easier sell to most people than a utilitronium shockwave, which is my desired outcome. Unfortunately, most people I talk to -- even at SIAI, etc. -- actively oppose pure utilitronium. :cry:

Human enhancement

As with aging, I have a hard time seeing the obvious utilitarian benefits here. Even more than with negligible senescence, I'm skeptical of the marginal returns, because I think most of this technology will probably be developed anyway for selfish reasons.

What about intelligence enhancement? Well, it's not clear to me whether that's good or bad. Smarter people mean more ability to design super-bugs, etc., per unit time, which means less reaction time for defenses against disasters, to the extent that there may be an inherent asymmetry between offense and defense. Some people (not at SIAI, but elsewhere) claim that intelligence enhancement would even improve morality, but I doubt that very much. I'm even sometimes worried that improving people's comfort level in general could be harmful.

One form of enhancement that I do think is worth exploring is changes that make people more empathetic and more utilitarian. (See the above link for more on that topic, too.) If widely deployed, this could potentially trump even promoting concern for wild-animal suffering, because the latter would follow from increased ability for empathy. But I really can't imagine how someone would do this: How do you go around telling people to change their children to make them more utilitarian, unless the parents are already hard-core utilitarians? If it could be done, though, I would be interested!

Friendly AI

I favor this more than generically reducing extinction risk. However, I'm still ambivalent about it out of concern that friendly AI could lead to more wild-animal (and other) suffering than, say, paperclipping.

For one thing, different people have different ideas of what a "nice future" would look like. For some people, a good future means propagation of life throughout the universe. For deep ecologists, it means preserving the cruelty of untouched natural habitats. (Ned Hettinger: "Respecting nature means respecting the ways in which nature trades values, and such respect includes painful killings for the purpose of life support [...].") For many others, a good future includes creating lab universes (if that's physically possible). And there will almost certainly be suffering for instrumental reasons like terraforming and simulations for scientific purposes.

What's more, that's only talking about a future in which relatively "good" people take control. But reducing extinction risk also means increasing the chance of really bad things arising from planet earth, including war-torture, savage religious-type ideologies, suffering simulated slaves, etc. We may be able to shift the course of the future somewhat, but much of it will be out of our hands and steered by Darwinian forces, so our probabilities for these undesirable outcomes never get even close to zero. Increasing the odds that humans survive necessarily means increasing the odds of really bad things by some non-trivial amount.
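
To spell out the step in that last sentence as an equation (my formalization, not Brian's own notation):

    P(bad future) = P(humanity survives) × P(bad future | humanity survives)

If Darwinian forces keep the second factor above some floor ε > 0, then any increase in the survival probability increases P(bad future) by at least ε times as much, however well we steer.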

(At the time of writing the original email, I said that I met someone just last week who told me his moral objective function consisted in propagating life as much as possible, even though he agreed that wild animals probably endure net suffering.)

Summary: Which considerations are most important for utilitarian organizations to focus on?

My predictable answer is that the most important thing to get right (and probably the most important thing to work on, at least indirectly, depending on cost-effectiveness considerations) is to steer humanity's moral, economic, and psychological values in the direction we want. To the extent that happens, we don't have to worry too much about the rest ourselves (e.g., technical details of implementation) because that will come along for the ride with any superintelligent future civilization.

Of course, "steering values in the right direction" is a broad charter, and in many cases, the best way to promote values may be to focus on concrete projects. (Beliefs often follow actions rather than preceding them.)

There are a few main ways I envision to change society's values: (1) Straightforward social movements (e.g., civil rights, women's liberation, animal rights). (2) Changing biological / psychological constitution (e.g., reducing tendencies toward aggression and sadism, enhancing ability to feel others' pain). (3) Influencing a seed AI. Chances are that (1) would have a big role to play in accomplishing (3).

I think influencing the values of future civilization is more important than many people do, because I'm a metaethical emotivist and am not sure whether people in the future will feel the way I do on ethical questions (notably because many people in the present don't feel the way I do about them!). Examples include whether it's okay to create new wildlife that will suffer (I think it isn't) and whether bugs would be better off not existing (I think they would).

Suggestions on meme spreading?

I don't have as many recommendations for reading as I'd like. We started a discussion on the topic on Felicifia, but it doesn't have a lot of concrete points. I hear that Nick Cooney's Change of Heart: What Psychology Can Teach Us About Spreading Social Change is a nice synthesis of research, focused especially on vegetarianism and concern for animals.

Religions provide some interesting case studies for spreading and preserving strong ideological views that can often differ significantly from evolutionary drives. That said, we don't necessarily want to replicate many of the dark arts that religions employ, because we care about actually reducing suffering in the universe, which requires rationality and sound epistemology, not just "following the party line" for all eternity.

Any recommendations for utilitarian lifestyle?

I made some observations here, and we've had a number of discussions of this type on various Felicifia forums.

A few sound bites:
  • Have utilitarian friends who keep you interested in what matters most.
  • Don't let the best be the enemy of the good -- e.g., with meat consumption, not wasting time on frivolities, how much to donate, etc. (This is a point that LadyMorgana has mentioned.)
  • Watch some videos of animal suffering every once in a while.
  • Make public commitments about your intentions to do good to reduce your risk of future relapse.
  • While you're doing great work for so many sentient organisms, make sure to have fun in the process. Play, laugh, and smile. :D

Postby Pablo Stafforini on 2012-02-29T07:32:00

Thank you for sharing this exchange, Brian. I am in the process of doing some research for 80,000 Hours, comparing the relative impact of various prima facie promising interventions, and have found your analysis very helpful.

What do others think? Do you agree with Brian's assessment? Would you add other candidates to his list? And do you know of relevant literature or websites not referenced above?
"‘Méchanique Sociale’ may one day take her place along with ‘Mécanique Celeste’, throned each upon the double-sided height of one maximum principle, the supreme pinnacle of moral as of physical science." -- Francis Ysidro Edgeworth
User avatar
Pablo Stafforini
 
Posts: 177
Joined: Thu Dec 31, 2009 2:07 am
Location: Oxford

Re: How important are climate change, aging, enhancement, etc.?

Postby Arepo on 2012-02-29T11:03:00

A few thoughts:

Combating global warming
Brian Tomasik wrote: My main reaction to global warming, though, is the following: Since so many people are concerned about it, and it's now a major political issue, the marginal impact of your involvement will be really small. It's much better for utilitarians to focus their energies on big-picture questions that the general public misses.

It seems much too soon to assume this, since ‘climate change’ is such a wide category. You could make just the same generalisation about ‘health’ or ‘overseas aid’ (as Dambisa Moyo seems to), yet surely a lesson of GiveWell/GWWC is that if an area is very popular, it's a) likely to have low-hanging fruit and b) likely to have several distractions from those fruit that are a large part of the reason they remain unpicked. Conceivably these factors could add up to a bigger marginal impact here than in an untouched field, where you start out feeling around in the dark.

Another reason for combating climate change is that it’s likely to reduce biodiversity, and biodiversity speeds up science, which, if you’re positive about the future, has a huge compound interest effect.
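
To put toy numbers on the compound-interest claim (the figures are invented purely for illustration):

    # A small permanent drag on a compounding growth rate has a large
    # long-run effect -- invented figures, not an estimate.
    baseline = 1.020 ** 500    # value after 500 years of 2.0% annual growth
    dragged = 1.019 ** 500     # the same, with growth slowed to 1.9%
    print(baseline / dragged)  # ~1.63: ~63% more value at the endpoint

So even a tiny sustained slowdown in the rate of scientific progress could end up mattering more than the direct harms.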

Ending aging
Brian Tomasik wrote: That's an approximation, since in practice, death causes personal anguish, pain to the elderly person, etc., but the point is that I don't view "saving lives" for its own sake as intrinsically valuable.

Basically agree with everything you said here, though one further benefit of life extension is that it costs a *lot* to turn an infant human into a productive member of society, and possibly costs as much (maybe even more in an ageing population?) to care for them at the end of the lifespan. Extending the bit in the middle makes the cost:productivity ratio better.
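
Rough arithmetic to make that concrete (round numbers I've invented for illustration):

    # Toy lifespan arithmetic -- invented round numbers.
    dependent_young = 20   # costly years of upbringing and education
    dependent_old = 15     # costly years of end-of-life care
    productive = 45        # working years in between (roughly ages 20-65)

    print(productive / (dependent_young + dependent_old))         # ~1.29
    # Add 30 healthy working years, holding the dependent ends fixed:
    print((productive + 30) / (dependent_young + dependent_old))  # ~2.14

The ratio of productive to dependent years goes from about 1.3 to about 2.1.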

Stopping wildlife suffering

Brian Tomasik wrote: Well, this is of course my favorite option. :P Mainly what I would focus on here is promoting concern for animal suffering, such as through veg outreach in the short term. It might make sense also to advance ideas like humane insecticides to push the envelope on people's moral sympathies in a way that still allows for concrete action today.

I’m not as sold on this as you, since it’s basically a short-term fix in a way that most of the human-helping options aren’t (an insect that you make healthier will never cure cancer; it’s happier, then it dies, and we’re back where we were before). That said, this is such an outlier that you might have a big marginal impact.

Human enhancement

Brian Tomasik wrote: One form of enhancement that I do think is worth exploring is changes that make people more empathetic and more utilitarian.

I don’t have much hope for this approach – utilitarianism corresponds with disinterested calculation (aka psychopathy, apparently), so making people more emotional seems a priori likely to make them *less* utilitarian, if anything. It also means I’m more hopeful about the prospects of increased intelligence. But as you say, I think self-interest will cover this one pretty well. Perhaps worth keeping an eye open for long-term technologies in need of funding, or ones that could somehow have a benefit that most people won’t recognise or will react against. Nothing much comes to mind.

Brian Tomasik wrote: My predictable answer is that the most important thing to get right (and probably the most important thing to work on, at least indirectly, depending on cost-effectiveness considerations) is to steer humanity's moral, economic, and psychological values in the direction we want. To the extent that happens, we don't have to worry too much about the rest ourselves (e.g., technical details of implementation) because that will come along for the ride with any superintelligent future civilization.

(Slightly) more specifically than this, I would suggest ‘make more utilitarians’ as a further possible cause. You can do this various ways – broadly either by influencing people to become utils (just getting them to hang out with a group of other utils seems to have this effect, based on some extremely limited anecdotal evidence), or by finding people who’re too underprivileged to put any of the altruistic goals they have in mind into practice and helping them out to the extent that they become more than self-sufficient.
"These were my only good shoes."
"You ought to have put on an old pair, if you wished to go a-diving," said Professor Graham, who had not studied moral philosophy in vain.
User avatar
Arepo
 
Posts: 1065
Joined: Sun Oct 05, 2008 10:49 am

Re: How important are climate change, aging, enhancement, etc.?

Postby Ruairi on 2012-02-29T13:52:00

@Alan: That was really excellent and really readable! Often I read stuff and find myself googling things to get the meaning out of it. To make a massive generalization, utils seem to be very intelligent and sometimes use unusual language; I know sometimes I'm reading stuff my friends wouldn't understand and would just not read.

@Pablo: I don't have time for a proper reply now, and I need to read more on all the above subjects anyway, but yes, I agree with Alan.

Also, I think a very high-impact activity would be to try to assess whether the future will be positive or negative, by doing things like finding out whether the wild is net positive or net negative (and by how much), and perhaps by promoting concern for wild-animal suffering and seeing how the public warms to it.

Postby Pablo Stafforini on 2012-02-29T17:18:00

Thank you for your replies, Arepo and Ruairi.

Arepo wrote: (Slightly) more specifically than this, I would suggest ‘make more utilitarians’ as a further possible cause. You can do this various ways – broadly either by influencing people to become utils (just getting them to hang out with a group of other utils seems to have this effect, based on some extremely limited anecdotal evidence), or by finding people who’re too underprivileged to put any of the altruistic goals they have in mind into practice and helping them out to the extent that they become more than self-sufficient.

I think this approach raises two questions:

1. Are utilitarians more likely to make the right decisions, in the areas that matter most, than are folks with some alternative ideology that we could spread around? Given the vast logical space of ideologies, on the one hand, and the well-known problems of utilitarianism as a decision procedure, on the other, it seems that the answer to this question is at least not obviously positive.

2. Even assuming utilitarianism is the most beneficial ideology, there appears to be a tension between the degree to which one is faithful to this theory in advertising it to others and the likelihood that others will be persuaded as a consequence of our efforts. Utilitarianism might, in certain circumstances, require one to do all sorts of things that very many people find horrendous; and many of those who declare themselves utilitarians do so on the assumption that their preferred theory does not actually issue such horrendous requirements. So perhaps this tension is best resolved by spreading some kind of "utilitarianism without tears" to the wider public; the losses in accuracy might be outweighed by the gains in popularity.


Ruairi wrote: Also, I think a very high-impact activity would be to try to assess whether the future will be positive or negative, by doing things like finding out whether the wild is net positive or net negative

I agree that this would be a very important activity. Let us, however, not lose sight of the following fact: even though we don't know whether the future will be positive or negative, we can know that it will be either incalculably positive or incalculably negative, in light of the sheer number of sentients who would exist under even the most conservative of estimates. This knowledge might suffice to drastically narrow our attention to existential risk, to the exclusion of all other possible interventions. Moreover, as Brian notes, our ignorance might be sidestepped by working not on ways to affect the probability of extinction, but on ways to affect the welfare of future sentients, in case humanity does survive. (My reservations about this proposal can be found here.)
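
To illustrate the magnitudes (toy figures in the spirit of Bostrom's astronomical-waste argument; the specific numbers are mine, not estimates):

    # Back-of-envelope scale of the future -- toy figures only.
    lives_per_century = 1e10   # suppose ~10 billion lives per century
    centuries_left = 1e7       # ~1 billion years of a habitable Earth
    print(lives_per_century * centuries_left)  # 1e+17 lives, Earth alone

Any nonzero average welfare, positive or negative, multiplied by numbers of this order dwarfs everything at stake in the present.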
"‘Méchanique Sociale’ may one day take her place along with ‘Mécanique Celeste’, throned each upon the double-sided height of one maximum principle, the supreme pinnacle of moral as of physical science." -- Francis Ysidro Edgeworth
User avatar
Pablo Stafforini
 
Posts: 177
Joined: Thu Dec 31, 2009 2:07 am
Location: Oxford

Re: How important are climate change, aging, enhancement, etc.?

Postby Brian Tomasik on 2012-03-01T07:15:00

(This post accidentally deleted by your dozy neighbourhood moderator - A)

Postby Pat on 2012-03-02T07:40:00

I am utterly baffled. It looks as though Arepo is responding to a post by Alan Dawrst that I can't see, and Arepo's post has Alan Dawrst's name next to it. Is anybody else seeing the same thing?

Arepo (?) wrote: For eg, it's fairly easy to show evidence that you're significantly smarter than most of humanity, so seems conceivable that you're utilitarian despite your high empathy (we usually assume you need some empathy to be a util, but I'm not even convinced of this - I have never noticed empathic sensations motivating me to utilitarian behaviour.

I think empathy gives you some reason to care about others. If you had zero empathy, it's possible that you would endorse utilitarianism as the "correct" moral theory (perhaps even likely, since sociopaths give more-utilitarian answers to moral dilemmas). But I don't think you'd be motivated to do anything to move the world in the right direction.


Postby Arepo on 2012-03-02T11:11:00

This was my idiocy. I meant to hit 'reply' to Alan's post and must have hit 'edit' instead, so I've overwritten his response with mine. Not sure how best to untangle - Alan doesn't have a copy of his original version. Moved my comment below into this post, which at least clears up who's written it, but further confuses things in that you're now replying to something before it was posted...

Brian Tomasik wrote: Maybe a little, but I'm doubtful that it's enough to factor importantly into the calculation. Quite possibly international health speeds up science more per dollar because people who are healthier can get better education.

Good point.

Brian Tomasik wrote: To be clear, what I meant was that I favor things intended to change society's perspective toward animals (wild and otherwise) in order to improve post-humanity's decisions with respect to creating massive numbers of suffering beings. We may do small projects here and now to move this along, but much of the benefit will come from how it transforms society's values.

In this respect it seems to be semi-dominated by the 'make more utilitarians' approach. Assuming both are equally easy (prob not a good assumption, but it's not obvious which idea would be more cost-effective to promote - there are prob more people who're broadly utilitarian than who care about WAS, but the figure doesn't seem to have changed much), future utilitarians will care about animal suffering to their best estimate (which should be better than ours) of how relevant an issue it is.

Brian Tomasik wrote: I may have a different perspective, because my utilitarian leanings are heavily emotional. Being utilitarian is a combination of (a) caring a lot about the welfare of others and (b) applying rational thinking to (a). It seems like increasing (a) while keeping (b) at least constant would probably be a good thing.

Maybe. But assuming that the psychopathy paper was right, you seem to be quite unusual in your merging of empathy and util, so you ought to be careful about generalising from yourself. Also, it's not clear that utilitarianness = a*b on the terms above. It might be a more complicated relationship, requiring a fairly specific balance between the two - or even a cap on one. For eg, it's fairly easy to show evidence that you're significantly smarter than most of humanity, so seems conceivable that you're utilitarian despite your high empathy (we usually assume you need some empathy to be a util, but I'm not even convinced of this - I have never noticed empathic sensations motivating me to utilitarian behaviour. I feel empathy in some cases, but it usually pushes me in the other direction, eg giving money to beggars, street performers, prioritising friends' welfare etc; essentially it feels like another form of selfishness, since I'm doing these things to feel contented).
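
To sketch the kinds of functional forms I mean (all three are hypothetical illustrations; nobody in this thread has proposed them exactly):

    # Hypothetical ways empathy and rationality might combine into
    # 'utilitarianness' -- illustrative guesses only.
    def multiplicative(empathy, rationality):
        return empathy * rationality          # the implicit a*b model above

    def bottleneck(empathy, rationality):
        return min(empathy, rationality)      # limited by whichever is scarcer

    def capped_empathy(empathy, rationality, cap=0.7):
        # empathy beyond some point stops helping (or gets redirected
        # toward beggars, friends, etc., as I describe above)
        return min(empathy, cap) * rationality

    print(multiplicative(0.9, 0.8))   # ~0.72 -- more empathy always helps
    print(capped_empathy(0.9, 0.8))   # ~0.56 -- past the cap it doesn't

Under the first model, boosting empathy always helps; under the third, it stops helping past some point, which would change which enhancements are worth aiming for.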
"These were my only good shoes."
"You ought to have put on an old pair, if you wished to go a-diving," said Professor Graham, who had not studied moral philosophy in vain.
User avatar
Arepo
 
Posts: 1065
Joined: Sun Oct 05, 2008 10:49 am

Re: How important are climate change, aging, enhancement, etc.?

Postby Brian Tomasik on 2012-03-03T09:45:00

Arepo wrote: In this respect it seems to be semi-dominated by the 'make more utilitarians' approach. Assuming both are equally easy (prob not a good assumption, but it's not obvious which idea would be more cost-effective to promote - there are prob more people who're broadly utilitarian than who care about WAS, but the figure doesn't seem to have changed much), future utilitarians will care about animal suffering to their best estimate (which should be better than ours) of how relevant an issue it is.

Hard to say. I almost think it's easier to change people's outlook on issues than to change their fundamental moral intuitions. I know some non-utilitarian prioritarians who care about wild-animal suffering, and I think people of most persuasions would if there were a cultural shift. (Example: Because of the gay-rights movement, people of almost all ethical stripes now support that cause.)

That said, I agree that the question isn't totally obvious. It may be a case-by-case project: Make utilitarians where you can, but encourage people to care about WAS where utilitarian conversion isn't possible. ("One is silver, and the other is gold.")

Arepo wrote: I have never noticed empathic sensations motivating me to utilitarian behaviour. I feel empathy in some cases, but it usually pushes me in the other direction, eg giving money to beggars, street performers, prioritising friends' welfare etc; essentially it feels like another form of selfishness, since I'm doing these things to feel contented.

Fascinating. I feel bad in those situations too, but it usually reinforces my motivation to do what I'm already doing.

Agree that generalizing from one example can be misleading.

Postby Arepo on 2012-03-03T10:15:00

Much better link: http://xkcd.com/605/ ;)
"These were my only good shoes."
"You ought to have put on an old pair, if you wished to go a-diving," said Professor Graham, who had not studied moral philosophy in vain.
User avatar
Arepo
 
Posts: 1065
Joined: Sun Oct 05, 2008 10:49 am


Return to General discussion