Adapted from a Facebook comment:
It all hangs on the empirical estimates of various existential risks. However, it is important to note the difference between a marginal decrease in the current pace of general technological development and a permanent halt to technological development (which is likely impossible to bring about, except by causing some global catastrophe).
Consider Bostrom's metaphor of a rocket trying to navigate through a minefield - the only safe place is on the other side of the field. Although it would be unwise to cut the rocket's thrust permanently, it is also unwise simply to assume that maximum throttle is optimal.
So the question becomes: what pace of general technological development minimizes existential risk? My current best estimate is that it is lower than the current pace.
This reasoning has two main components. 1: Current existential risks seem relatively low. Few natural risks appear to have more than perhaps a 1 in 100,000 chance of killing us during the next 100 years, and from what I have heard from scientists in the relevant fields, the extinction risks from nuclear war and climate change also seem extremely low. (Note that I am considering only existential risks here, not global catastrophic risks.)
Emerging technologies such as biotech, molecular nanotechnology, and AI, on the other hand, seem much riskier (and further technological development may also uncover new risks). Experts surveyed in these areas give much higher subjective extinction-risk estimates for such emerging technologies.
Speeding up development will therefore move us sooner from the current very-low-risk era into a transitional high-risk era.
2: Unlike general technological development, existential risk reduction seems to depend mostly on factors other than general economic progress - areas like FAI research, lobbying, spreading awareness, and so forth.
To conclude: a marginal increase in GDP growth would bring us into the high-risk era earlier and less prepared, whereas slower growth would allow us to become better prepared by the time we enter it. The toy calculation below illustrates this trade-off.
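To make the trade-off concrete, here is a minimal sketch in Python. All of the numbers and the functional form of `transition_risk` are made-up illustrative assumptions, not estimates from the literature; the point is only to show how a tiny background risk from delay can be outweighed by a better-prepared transition.

```python
# Toy model: total extinction risk as a function of when the high-risk
# transition era arrives. All numbers are illustrative placeholders.

NATURAL_RISK_PER_CENTURY = 1e-5  # ~1 in 100,000 per century from natural risks


def transition_risk(prep_years: float) -> float:
    """Assumed: transition risk starts at 20% and halves for every
    50 years of safety preparation before the transition."""
    return 0.20 * 0.5 ** (prep_years / 50)


def total_risk(years_until_transition: float) -> float:
    """Probability of extinction from natural background risk before the
    transition arrives, plus the risk of the transition itself given
    that much preparation time."""
    background = 1 - (1 - NATURAL_RISK_PER_CENTURY) ** (years_until_transition / 100)
    return background + (1 - background) * transition_risk(years_until_transition)


for years in (30, 50, 100, 200):
    print(f"transition in {years:>3} years: total risk = {total_risk(years):.4f}")
```

Under these assumptions, delaying the transition from 30 to 200 years cuts total risk by roughly an order of magnitude, because the added natural background risk (on the order of 10^-5) is dwarfed by the reduction in transition risk. The conclusion is of course only as good as the made-up numbers; the real question is empirical.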