Ruairi wrote: What's the probability of humans spreading vast amounts of life and then going extinct?
The probability we won't create massive numbers of sentients in the future?
Here again are some very intuitive, off-the-cuff numbers. I haven't factored anthropics (the doomsday argument, simulation argument, etc.) into these either.
P(humans survive and develop AGI) = 30%
P(directed panspermia | humans survive and develop AGI) = 12%
P(terraforming | humans survive and develop AGI) = 15%
P(lab universes | humans survive and develop AGI) = 0.001%
P(a few sentient simulations | humans survive and develop AGI) = 55%
P(massive numbers of sentient simulations | humans survive and develop AGI) = 20%
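As a rough illustration of how these estimates combine, here's a minimal Python sketch that multiplies each conditional estimate by P(humans survive and develop AGI) to get an unconditional probability for each scenario. Only the numbers come from the list above; the variable names and layout are just my own illustration.

```python
# Sketch: turn the conditional estimates above into unconditional probabilities
# by multiplying each by P(humans survive and develop AGI).
# Only the numbers are from the list above; everything else is illustrative.

p_agi = 0.30  # P(humans survive and develop AGI)

# P(scenario | humans survive and develop AGI)
conditionals = {
    "directed panspermia": 0.12,
    "terraforming": 0.15,
    "lab universes": 0.00001,
    "a few sentient simulations": 0.55,
    "massive numbers of sentient simulations": 0.20,
}

for scenario, p_given_agi in conditionals.items():
    # P(scenario and survival with AGI) = P(scenario | AGI) * P(AGI)
    print(f"P({scenario}) = {p_given_agi * p_agi:.4%}")
```

For example, this gives roughly 3.6% for directed panspermia overall (0.12 × 0.30).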
If we account for the simulation argument, all of these values decrease quite a bit, in rough proportion to the amount of computing resources the scenarios require. Of course, the same would be true for the probabilities of happy simulations.
Ruairi wrote: Is there any way we can predict the future better? Any useful research that could be done?
SIAI has a page called The Uncertain Future, although it's primarily about probabilities of AGI alone, without considering the many possible social implications thereof. There are lots of futurology publications, and even if their estimates are way off, they might suggest new ideas you hadn't thought of. For that matter, even science fiction can be a great way to stretch the imagination, but keep in mind that most science fiction is highly improbable and prone to good-story bias. I think a paperclipping future is fairly likely, but it doesn't make for a very interesting story.
There's a saying, "The best way to predict the future is to create it." While this is obviously an exaggeration, it hints at what I was suggesting in my previous post: that regardless of whether the future is net good or bad, we can still push it in a slightly better direction by raising concern for animal suffering, etc.