Summary of and thoughts on "Dark Skies" by Daniel Deudney
By Cody_Fenwick @ 2022-12-31T20:28 (+38)
This is my summary of the book “Dark Skies: Space Expansionism, Planetary Geopolitics, and the Ends of Humanity,” by Daniel Deudney. It was published in 2020, but I haven’t seen many people interested in longtermism or effective altruism engage with the book. I wrote this primarily to organise my own thoughts about the book, and I thought it was worth sharing. I give my impression of the strengths and weaknesses of the book, though it’s possible some of my interpretations are idiosyncratic. I am condensing a long, detailed book to make a short summary, so I’m inevitably leaving out important parts of the argument and drawing out what seemed most central to me.
TL;DR: The book offers a provocative and thoughtful thesis for longtermists to consider, though it is overly long and has some notable flaws.
Basic thesis of the book
Deudney believes that human settlement of, and technological expansion into, space is itself an existential risk, arguing against those who say it diminishes existential risk. Those who advocate for “space expansionism” erroneously ignore the serious and likely dangers, in his view. The book argues that humanity should therefore actively restrain its ambitions of expanding into space for at least the next few centuries and possibly forever.
My general thoughts on the book’s quality: Worth reading for longtermists or anyone interested in existential risks, space settlement, or space governance
Downsides
It is at times unhelpfully polemical, and Deudney does not apply as much criticism to his own ideas as he does to those of others. He takes some cheap shots at his ideological opponents. He often repeats the same thought in multiple ways in the same section, and some sections repeat previous sections’ ideas unnecessarily. It could probably have been just as informative at half the length.
Upsides
Nevertheless, it is still highly dense with ideas, probably more so than 90 percent of books. And it is very thoughtful and provocative about existential risk and humanity’s long-term future. It takes seriously moral and political questions about humanity’s future and speculative possibilities in a way that few outside the longtermism canon typically do.
How I read it
I listened to the audiobook at 1.5x speed. I usually do 1.8x or 2x speed, but the nature of the writing made it too risky to go this fast — it’s easy to miss key steps in his arguments. This is unfortunate since a lot of it is repetitive. The audiobook is 20h40m long at 1x; the hardcover is 464 pages long.
Key takeaways
Deudney argues that space exploration thus far has been net negative for humanity, despite a lot of hype to the contrary.
- His primary argument for this is that he classifies ICBMs as space technology: these missiles, he says, are only as reliable and useful as they are because they fly through the upper atmosphere.
- Thus space technology exacerbates the threat of nuclear weapons, and this harm swamps any of its benefits.
- This seems plausible to some degree, but what’s the point?
- It’s not entirely clear what the argumentative strategy is, but I think he wants to establish that our baseline expectation should be that space expansionism is dangerous and to undercut the optimistic story that is usually told about it.
Deudney basically sees the threat from nuclear weapons, and perhaps more importantly the threat from manipulating asteroid trajectories, as the central risks of space expansionism.
- I think the argument relies mostly on scenarios of Mars settlement and communities aboard space stations within our solar system; it’s possible he also envisions Venus terraforming and/or colonies on nearby moons.
- He seems to believe settling other solar systems is just infeasible on any timescale worth thinking about.
- He basically argues that the geography would make unified governance of these separate colonies impossible, that they would likely be unfree, and that the populations would diverge in their values and physical features (because surviving in non-Earth environments would require technological and biological adaptations).
- This differentiation would fuel conflict between the competing groups, and the Earth population would be at particular risk.
- Earth's deep gravity well makes it an easy target and harder to defend.
- The external security threats could fuel the rise of a worldwide totalitarian government.
Deudney avoids the obvious cheap shot of denouncing space settlement by analogy to historical colonialism, noting that the latter was a moral atrocity because the colonised regions were already inhabited (unlike, say, Mars).
- But he still draws an interesting lesson from colonialism: even without the direct victims of colonialism, space travel could repeat the conflicts between different groups of settlers angling for access to the new frontier.
In one of the weakest parts of the argument, he treats letting our descendants become a species that doesn’t meaningfully resemble us as equivalent to extinction, and says we’d be uniquely foolish as a species to let this happen.
- He gives almost no serious consideration to the idea that descendants of ours that may be significantly different from modern-day humans could still have lives well worth living and could enjoy a future we might be happy to create.
Because, in his view, the dynamics of near-term space settlement would almost certainly lead to catastrophic conflict, Deudney believes we should commit to humanity remaining Earth-bound for the foreseeable future, i.e., at least a few centuries.
- After that point, he thinks it’s possible that humanity will have developed to a stage where space settlement wouldn’t inevitably lead to conflict, and so it could be safe.
- However, he comes close to endorsing an even more pessimistic conclusion, that it could never be safe for us to leave the planet.
- He doesn’t address the concern that, if we stay Earth-bound forever, we’ll eventually face (potentially premature) extinction. It’s not entirely clear why he doesn’t address this, but he may believe one of the following:
- 1. The lifespan of a habitable Earth is satisfactory for the human species.
- 2. At the end of Earth's habitable lifespan, the decision to spread into space becomes sensible.
- 3. There's not really any use for thinking about possibilities so far in the future; future generations will be better able to figure out what to do then anyway.
Key omissions
Though he mentions the risks from advanced AI, he doesn’t spend much time addressing its potential implications for the argument.
- He doesn't address how AI or AI timelines would affect the trajectory of space settlement.
- For people with short or medium timelines, this may make the book as a whole less relevant.
Likewise, he doesn't do much to address the possibility of digital people, aside from some potentially relevant references in his discussion of how space-faring people could evolve.
- This seems particularly relevant since many people believe settlement of the galaxy is most likely to happen via some form of digital descendants.
He doesn’t take seriously the argument about “astronomical waste” or the immense levels of potential value that could be obtained by space expansionism.
- This might be a result of essentially ignoring the possibility of digital people.
He doesn’t sufficiently address the challenges for his own “Oasis Earth” view, i.e. the coercive apparatus that would be required to keep the species Earth-bound.
- This seems like a big problem for his view, because getting everyone on board for his program of refraining from space expansion seems unlikely.
- He also takes some cheap shots about space enthusiasts’ aims potentially leading to disaster because the ends are so spectacular they might justify any means. But by the same token, his view about the dangers of space expansion could also lead to extreme levels of oppression under the same means-ends reasoning that he decries.
Additional interesting points
He believes space settlement would exacerbate AI x-risk: We wouldn’t be able to prevent other colonies from developing unfriendly AI because they’d be out of our direct sphere of influence, so we’d be more at risk.
He suggests an interesting solution to the Fermi Paradox: Aliens may have either expanded into space and killed themselves off, or recognized the dangers and decided to stay on their home planets. Either way, it would be no surprise that we don’t see them.
- But does this argument actually go through? His argument seems to rely on existential risks that arise through conflict. But it’s not clear why the risks he highlights, though genuinely concerning, would actually likely lead to the extinction of all parties to the conflict. And if not, it’s not obvious this is actually a solution to the Fermi Paradox. The victors would still be around.
Read more:
I did find one thoughtful response to Deudney’s book from Al Globus, who defends space settlement.
Ramiro @ 2023-01-01T10:50 (+5)
Thanks for this review. I'm linking here another post commenting on a previous review, for those interested in the subject. https://forum.effectivealtruism.org/posts/gcPp2bPin3wywjnGH/is-space-colonization-desirable-review-of-dark-skies-space