JoelMcGuire's Quick takes
By JoelMcGuire @ 2022-06-15T04:08 (+4)
JoelMcGuire @ 2023-05-19T21:44 (+28)
What’s the track record of secular eschatology?
A recent SSC blog post depicts a dialogue about eugenics. This raised the question: what is the track record of communities of reasonable people at identifying the risks of past predicted catastrophes?
As noted in the post, at different times:
- Many people were concerned about overpopulation posing an existential threat (cf. the population bomb, discussed at length in The Wizard and the Prophet). It now seems widely accepted that the risk overpopulation posed was overblown. But this depends on how contingent the Green Revolution was. If there hadn't been a Norman Borlaug, would someone else have tried a little harder to find more productive cultivars of wheat?
- Historically, there also appeared to be more worry about the perceived threat posed by a potential decline in population IQ. This flowed from the reasonable-sounding argument: “Smart people seem to have fewer kids than their less intellectually endowed peers. Extrapolate this over many generations, and we have an idiocracy that at best will be marooned on Earth, or at worst will no longer be capable of complex civilization.” I don’t hear these concerns much these days (an exception being a recent Clearer Thinking podcast episode). I assume the dismissal would sound something like: “A. The Flynn effect.[1] B. Even if this exists, it will take a long time to bite into technological progress, and by the time it does pose a threat, we should have more elegant ways of increasing IQ than selective breeding. Or C. Technological progress may depend more on total population size than average IQ, since we need a few von Neumanns rather than hordes of B-grade thinkers.”
- I think many EAs would characterize global warming as tentatively in the same class: “We weren’t worried enough when action would have been high-leverage, but now we’re relatively too worried because we seem to be making good progress (see the decline in solar costs), and we should expect this progress to continue.”
- There have also been concerns about the catastrophic consequences of: A. Depletion of key resources such as water, fertilizer, and oil. B. Ecological collapse. C. Nanotechnology (???). These concerns are also considered overblown in the EA community relative to the preoccupation with AI and engineered pathogens.
- Would communism's prediction of an inevitable collapse of capitalism count? I don't know how harmful this would have been considered in the short run, since most attention was on the utopia it would afford.
Most of the examples I’ve come up with make me lean towards the view that “these past fears were overblown because they consistently discount the likelihood that someone will fix the problem in ways we can't yet imagine.”
But I’d be curious to know if someone has examples or interpretations that lean more towards "We were right to worry! And in hindsight, these issues received about the right amount of resources. Heck, they should have gotten more!"
What would an ideal EA have done if teleported back in time and mindwiped of foresight when these issues were discovered? If reasonable people acted in folly then, and EAs would have acted in folly as well, what does that mean for our priors?
- ^
I can't find an OWID page on this, despite Google Image searches making it apparent one once existed. Perhaps letting people compare IQs across countries didn't feed the right conversations?
Kirsten @ 2023-05-20T21:34 (+6)
Right to worry about nuclear war, based on information later revealed about the Cuban Missile Crisis and other near misses
David Mathers @ 2023-05-21T20:50 (+2)
Good point, but I think people worried about extinction risk from nuclear war before a really plausible mechanism (nuclear winter) was found by which that could occur following a US-Soviet exchange. There's a nuclear doomsday device in Dr. Strangelove, and a novel from the 50s about human extinction after a nuclear war: https://en.wikipedia.org/wiki/On_the_Beach_(novel) Though to be fair, these are fiction, and it's not clear the idea was without foundation pre-nuclear-winter research: https://en.wikipedia.org/wiki/Cobalt_bomb
https://en.wikipedia.org/wiki/Doomsday_device
Kirsten @ 2023-05-20T20:43 (+2)
Unclear if Y2K was fixed or was never really a problem; this article suggests the latter: https://education.nationalgeographic.org/resource/Y2K-bug/
JoelMcGuire @ 2022-06-19T19:06 (+2)
Wild Idea #4: Simultaneously Solve American Education, Politics, and Maybe Housing
2050: It’s been twenty years since ground broke on Franklin University, nestled in the foothills of Medicine Bow National Forest. Cal Newport was right: skilled researchers and professors flocked to the promise of 70%+ research time. The recent national spotlight on several successful alumni caused a surge of interest in the school. Enrollment swelled. The school was beginning to turn a profit and wean itself off those tech millions.
Its students aren’t that different from those attending good state schools, and the school doesn’t improve them all that much. Only 5% of each class goes into direct EA work (much higher for those who get a degree in Global Priorities Studies or Wellbeing Decision Science). The average student would look much the same if they’d gone elsewhere, but they’re slightly more sensible and expansive in their mindset.
The research is where the university really shines. Professors get plenty of time to do research, as long as 40% of it is spent on the department’s high-impact research agenda. They can spend the rest of their time exploring esoteric and, in expectation, unimpactful areas of knowledge, but the university probably won’t fund them to do it.
The accompanying city of Longwell was planned with sane ideas about zoning and transportation. The relatively infertile land and embrace of building made it one of the only cities with genuinely cheap housing in a place worth living. Miles of hiking and biking trails sprawl out of the city and stretch into the surrounding hills and woodlands. The promise of cheap housing tempts many remote workers to the charmingly walkable city. Many remark that the town feels “old”, comparing it favorably to cozy north-eastern or European towns. The same appeal keeps a share of graduates hanging around, starting companies.
The school's commitment to ideological diversity somewhat appeases conservatives, who sometimes point approvingly to the university's distinct lack of wokeness. Since the beginning, many of the university's founders have stressed the non-partisan aspects of their shared research agenda. Despite this firm and careful guidance, the project draws the ire of pundits warning of a liberal plot to take over Wyoming and steal its securely Republican electoral and congressional votes.
The success of Frank-U and Longwell spawns imitators across the country.
2090: Longwell and its suburbs, long the fastest-growing MSA in the United States, are now large enough to be a political force in the state. For the first time, both Senators from Wyoming come from a liberal democratic party. Sitting near the center, they exert disproportionate political force. The city still rides goodwill after several technologies stemming from projects related to it blunted a potentially catastrophic pandemic in 2084.
---
Note that Montana, the second-smallest red state, with a population of ~1 million instead of Wyoming's ~500k, voted 40% for Biden compared to 25% in Wyoming. This led to a smaller absolute vote gap: ~100k instead of ~120k (a rough check of this arithmetic is sketched below the links).
https://en.wikipedia.org/wiki/2020_United_States_presidential_election_in_Montana
https://en.wikipedia.org/wiki/2020_United_States_presidential_election_in_Wyoming
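A minimal back-of-the-envelope sketch of that vote-gap arithmetic; the totals and shares below are rounded from the two Wikipedia pages above, so treat the outputs as approximate:

```python
# Rough check of the Montana vs. Wyoming vote-gap comparison.
# Totals and shares are rounded from the 2020 results pages linked above.
results = {
    # state: (total ballots cast, Trump share, Biden share)
    "Wyoming": (277_000, 0.70, 0.27),
    "Montana": (604_000, 0.57, 0.41),
}

for state, (total, trump_share, biden_share) in results.items():
    gap = (trump_share - biden_share) * total
    print(f"{state}: absolute Trump-Biden gap ~ {gap:,.0f} votes")

# Wyoming: ~119,000; Montana: ~97,000. Despite Montana having roughly twice
# the population, the absolute number of votes to flip is smaller.
```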
JoelMcGuire @ 2022-06-15T04:08 (+2)
What we owe the past
(note: I know this post of the same title exists, but I haven't read it)
Under a (somewhat) plausible theory of wellbeing, you may want to consider the deep terrestrial potential of improving the lives of the dead.
To explain requires some exposition. The two most prominent theories of wellbeing are hedonism and satisfactionism. The first claims that life is best for you when you feel best, and the second claims that life goes best when you satisfy your desires.
Satisfactionist or desire theories of wellbeing cleave into two further camps: one in which what matters is that you feel you satisfy your desires, and another, "objective" one, where what is morally relevant is not that you think you're getting what you want, but that you actually get what you want. Note that the first type of satisfactionist would hop into the experience machine with the hedonists. Also, on the second theory, if your partner cheats on you for years and you never find out, your life counts as worse, which many people find intuitive.
I think Parfit had a thought experiment regarding Actual Satisfactionist theories where a man burns his whole life with a profound desire that there is life on another planet. He lives. Earth is alone. He dies. The next day, scientists discover life on Mars! It was there the whole time. He never felt his desire satisfied, but it was, and his life was better for it.
This strikes some readers as strange. However, for those willing to bite this bullet, I offer you more. Imagine the "Life on Mars" case, except this version has a twist. Life wasn't there the whole time. Mars was dead. However, the year after he dies, Elon rockets to the red planet with a whole terrarium full of critters and founds a permanent colony.
Is the man's life better off for having his desire fulfilled? If you think it's plausible that his life in fact improved, you can probably guess where I'm going with this...
The dead are many. We should look not towards bettering future lives, but past lives, if the following is true: A. We care about people's actual desires getting satisfied (and satisfying dying wishes counts); B. there are fewer people in the future than in the past; and C. the dead carry coherent desires that we can satisfy.
Point B is true if we believe Eliezer about AI and there's nothing we can do about it. And to offer a candidate for fulfilling point C: children. Every one of your ancestors (and you had lots, I mean lots, of them), by revealed preference, wanted you to pass on their genes. Well, now it's down to you.
So if you're into a weird variant of desire theories (can't stand them myself), you may want to reconsider that vasectomy.