Humanity’s vast future and its implications for cause prioritization

By Eevee🔹 @ 2022-07-26T05:04 (+38)

This is a linkpost to https://sunyshore.substack.com/p/humanitys-vast-future-and-its-implications

Summary

How many people could there be?

There are almost 8 billion people in the world today. The United Nations estimates that the global population will stabilize at around 11 billion by 2100, with roughly 125 million births per year. If we have an idea of how long humanity will survive, then we can estimate how many people will be born in the future.

Modern humans have been around for 200,000 years, and the average mammalian species lasts for a million years. So let’s suppose that humanity will survive for another 800,000 years. If 125 million people are born each year, then 100 trillion people will eventually be born. That’s 12,500 times as many people as are alive today. If we last 10 million years, like some mammalian species, then we could eventually give rise to over a quadrillion people.
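To make the arithmetic explicit, here is a minimal sketch (assuming a constant 125 million births per year, per the UN projection above):

```python
BIRTHS_PER_YEAR = 125e6       # UN-projected annual births from ~2100 onward
CURRENT_POPULATION = 8e9      # roughly today's population

for years in (800_000, 10_000_000):
    future_births = BIRTHS_PER_YEAR * years
    multiple = future_births / CURRENT_POPULATION
    print(f"{years:>10,} years -> {future_births:.2e} future births "
          f"({multiple:,.0f}x today's population)")

# Output:
#    800,000 years -> 1.00e+14 future births (12,500x today's population)
# 10,000,000 years -> 1.25e+15 future births (156,250x today's population)
```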

Of course, human civilization could survive for much longer than that, or for a much shorter time. Even in extreme scenarios, climate change is unlikely to destroy civilization or render humans extinct, but it could indirectly lead to our extinction by driving political instability and global conflict.[1] By some accounts, artificial general intelligence could be developed this century, and if we lack the technology to align AGI systems with our values, one could go rogue and kill us all simply because it is indifferent to our survival.[2]

On the other hand, we could use our technological capabilities to live for billions of years in the Solar System and beyond. This century, we will probably start establishing human settlements on other planets, such as Mars, and on moons such as Saturn's largest, Titan. Some scientists have speculated that we will be able to start traveling through deep space by the end of the 24th century.[3] Earth will remain habitable for at least another 500 million years, but if we have a presence elsewhere in the Solar System, we can survive for much longer than that. When the Sun becomes a white dwarf in about 8 billion years, we could still live in artificial habitats orbiting its remnant, but most of us would likely be living on exoplanets and space habitats orbiting other stars.[4]

Trajectory changes vs. speeding up growth

This vast potential has profound implications for how best to help as many people as possible in the present and future. Many longtermists believe that creating trajectory changes—durable changes to the amount of good in the world at every point in the future—is more valuable for the trillions of people yet to be born than trying to speed up the future.[5] Preventing an existential catastrophe, such as human extinction or the collapse of civilization, is a prototypical trajectory change, since the entire value of the future hangs in the balance.

In Stubborn Attachments, economist Tyler Cowen argues that humanity must focus on three things to make the long-term future go as well as possible: reducing existential risks, protecting basic human rights, and maximizing the rate of sustainable economic growth. On timescales of 50 to 100 years, speeding up economic growth is significant. If the world economy grows at 2% per year for the next 100 years, it will end up about 7 times its current size; at 3% per year, it will grow by a factor of about 19. That one-percentage-point difference would be transformative for people alive over the next century: global extreme poverty would be eradicated sooner, life expectancy would be higher, and people would be more educated and tolerant.
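A quick check of the compounding arithmetic, using the growth rates quoted above:

```python
# Compound growth of the world economy over a century at 2% vs. 3% per year.
for rate in (0.02, 0.03):
    factor = (1 + rate) ** 100
    print(f"{rate:.0%}/year for 100 years -> {factor:.1f}x today's economy")

# Output:
# 2%/year for 100 years -> 7.2x today's economy
# 3%/year for 100 years -> 19.2x today's economy
```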

But economic growth cannot realistically continue at a sustained exponential rate for more than a thousand years or so. If the world economy grows by 2% per year for the next 8200 years, then we would eventually “need to be sustaining multiple economies as big as today’s entire world economy” per atom in the Milky Way galaxy.[6] Whether the economy grows at 2% or 3% over the next 500 years will probably make little difference to the living standards of people alive 10,000 years from now, as the economy will have reached its peak well before then. On the other hand, global extreme poverty and income inequality could have persistent negative effects on the global political order. Since political developments tend to get locked in, reducing global poverty and inequality today could have far-reaching benefits for the long-term future.
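The 8200-year figure can be reproduced in one line (a sketch assuming roughly 10^70 atoms in the Milky Way, the order-of-magnitude estimate behind the quoted claim):

```python
import math

# Years of 2% annual growth until the world economy reaches one
# world-economy-equivalent per atom in the Milky Way.
# Assumes ~1e70 atoms in the galaxy (an order-of-magnitude estimate).
ATOMS_IN_GALAXY = 1e70
GROWTH_RATE = 0.02

years = math.log(ATOMS_IN_GALAXY) / math.log(1 + GROWTH_RATE)
print(f"{years:,.0f} years")   # ~8,139 years

# After 8200 years the economy is 1.02**8200, about 3e70 times today's,
# i.e. "multiple economies as big as today's" per atom.
```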

Reducing S-risks as a key priority

Since realizing this, I’ve decided to focus more on making positive changes to humanity’s long-term trajectory and less on economic growth. Although I still believe that reducing barriers to growth and development is important, I think that positive trajectory changes that affect humanity’s entire future are more important still. I group positive trajectory changes into three basic categories:

Tentatively, I think reducing S-risks (risks of astronomical suffering) is the most promising strategy for me to pursue, because it is far more neglected than reducing existential risks in general. The field of S-risks is new to me, but commonly discussed S-risks include:

Historically, I have been skeptical of claims that AI safety is the most pressing issue from an EA perspective, but I've been starting to come around. Although I still think there could be more important causes, I now think that advanced AI is a major threat to the future, factoring in both existential risks and S-risks, and I recognize that I have a good personal fit for the field. So at EA Global this coming weekend, I'm going to explore whether I'd be a good fit for AI safety roles and which areas of AI safety seem most promising for reducing S-risks. I've also been spinning up a Discord server focused on the intersection of longtermism and animal welfare, an area that I think is especially important and neglected by the EA community. Even though I'm uncertain about these initiatives, I am excited about their potential to reduce future suffering.

  1. ^

    Hilton, Benjamin (2022). “Climate change.” 80,000 Hours.

  2. ^

    Karnofsky, Holden (2021). “The Most Important Century.” Cold Takes.

  3. ^

    Tangermann, Victor (2021). “NASA Scientists Predict Settlements on Moons of Saturn, Jupiter.” Futurism.

  4. ^
  5. ^

    Beckstead, Nick (2013). “A proposed adjustment to the astronomical waste argument.” EA Forum.

  6. ^

    Karnofsky, Holden (2021). “This Can’t Go On.” Cold Takes.

  7. ^

    Gloor, Lukas (2016). “Altruists Should Prioritize Artificial Intelligence.” Center on Long-Term Risk.


James Ozden @ 2022-07-26T16:02 (+6)

Heads up that the link to join the Discord seems to be broken! Otherwise would love to join.

evelynciara @ 2022-07-26T18:02 (+2)

Darn it! Will get that fixed.

S_Adi @ 2022-07-26T17:44 (+1)

+1, it is stuck on the Captcha check stage for me.