RyanCarey wrote:1. Reducing existential risk
2. Researching kinds and magnitudes of existential risk, and ways to reduce them
3. Evaluating the trajectory of humanity, such as to project its future
4. Deciding on our values.
A recent discussion with Jesper Östman made me realize that the concept of
existential risk might be an impediment to clear thinking on these issues. Consider these two alternative ways of making the world a better place:
A. Reduce the risk of human extinction.
B. Reduce the risk that, if humanity does not become extinct, humans will eventually create Dolorium.
These are very different ways of improving the world, and they likely require very different behaviors from us (e.g. AI research versus meme spreading). However, because both count as instances of "
permanently and drastically curtail[ing] humanity's potential," they both fall under the category of "existential risk reduction". As a consequence, people using the concept of existential risk might fail to appreciate the important ways in which these two approaches differ from one another. Furthermore, the similarity between the words 'existential' and 'extinction' is likely to cause folks to assume, without argument, that the most effective way to reduce existential risk is to reduce the risk of human extinction. Given that (B) is not clearly a suboptimal way to ensure that humanity realizes its full potential for successful development, this assumption is unwarranted.
So, to go back to your question, I'd list both (A) and (B) as top candidates for "tasks [that are] most important."
peterhurford wrote:I pick "5.) Effectively answering this question". It's still an open question even among people who are finished with 4.
This relates to something I think about occasionally, without making much progress. How "meta" should we go? If the second-order task of deciding which first-order task we should focus on is itself as important as any of those first-order tasks, isn't the third-order task of deciding which of the first- and second-order tasks is most important itself plausibly as important as those lower-level tasks? Etc.