Research project idea: How should EAs react to funders pulling out of the nuclear risk space?
By MichaelA🔸 @ 2023-04-15T14:37 (+12)
This post is part of a series of rough posts on nuclear risk research ideas. I strongly recommend that, before you read this post, you read the series’ summary & introduction post for context, caveats, and to see the list of other ideas. One caveat that’s especially worth flagging here is that I drafted this in late 2021 and haven’t updated it much since. I’m grateful to Will Aldred for help with this series.
One reason I'm publishing this now is to serve as one menu of research project ideas for upcoming summer research fellowships.
Some tentative bottom-line views about this project idea
| Importance | Tractability | Neglectedness | Outsourceability |
| --- | --- | --- | --- |
| Medium | Medium/High | Medium/Low | Medium |
What is this idea? How could it be tackled?
Several of the largest philanthropic funders of work related to nuclear risk have stopped or reduced their funding for such work or will do so soon. This apparently includes the Compton Foundation, the Ford Foundation, the John Merck Fund, the Hewlett Foundation, the MacArthur Foundation, the Rockefeller Foundation, the Skoll Foundation, and the W. Alton Jones Foundation (Bender, 2021; MacArthur Foundation, 2021; ORS Impact, 2015; Wilson, 2021).[1]
The plausible or likely effects of this include:
- Various projects (organisations, training programs, campaigns, etc.) in this space having to shut down or scale back
- People finding it hard to start or scale up projects in this space
- Some projects in this space being more open than usual to adopting the focuses, approaches, etc. that funders request
- Some projects in this space becoming more focused on non-nuclear issues (e.g., doing projects framed as being about the intersection of nuclear risk and either climate change or social justice, to appeal to new funders)
This raises questions such as:
- Which specific funders have reduced or eliminated their funding to this area or will do so soon?
- What levels of philanthropic funding for this area should we expect at various future points? Who will be the leading funders? How much will be allocated to various priorities, perspectives, etc.?
- Are the reasons why funders are stopping or reducing their spending in this area also reasons why EA funders should avoid focusing on this area, or should avoid some of the specific funding priorities or perspectives that those non-EA funders had?
- E.g., are the other funders correctly diagnosing that various organisations and perspectives were having little impact?
- What projects, if any, are worth “saving” from shutting down or scaling back?
- This could intersect with the project idea “Impact assessment of various organisations, programmes, movements, etc.”
- They could be worth saving based on:
- their past impact
- other reasons to expect strong future impact
- it being worth acting fast to preserve option value while doing further analysis and/or gathering further evidence about their expected future impact
- Are there many projects worth starting or scaling up that would have been started or scaled up under the previous funding conditions, but now won’t be by default?
- To what extent do these changes in the funding landscape increase the ability of EA funders to fund established organisations, experts, etc. to adopt the specific topic focuses, perspectives, approaches, policy focuses, etc. we’d like them to adopt?[2]
- E.g., to increase their focus on the scenarios and policies that matter most from a longtermist perspective, or to move toward better epistemic standards.
- All things considered, how should these changes in the funding landscape affect (1) how much EAs (especially EA funders) prioritise nuclear risk and (2) what we prioritise within this area?
These questions could be tackled via activities such as:
- Reading what’s being said about these changes by the funders who are pulling out of the space, by funders who will continue to fund in this space, by the affected organisations, and by commentators
- One could look for what they’re saying in their own reports (e.g., MacArthur, 2020), in reports they commissioned (e.g., ORS Impact, 2015), in the news, etc.
- Talking to people from these institutions
- Making forecasts or putting relevant questions on Metaculus
- Talking to EA funders or other funders about pros, cons, and best practices for trying to influence what fundees do
- Attempting an impact assessment of various organisations, programmes, movements, etc.
Why might this research be useful?
Depending on what precise sub-questions one focuses on, I think a well-executed 2-9 month version of this project would have a decent chance (>10%) of either or both of:
- Convincingly indicating that EA funders should be spending $3m-$25m more in this space over the next 5 years than they’d otherwise have concluded
- Convincingly identifying specific funding opportunities totalling $0.5m-$5m that are worth investigating by EA funders and are >25% likely to be worth funding (with these being opportunities that probably wouldn’t have been funded otherwise)
(Those specific numbers shouldn’t be treated as well-calibrated forecasts; they were made up pretty quickly just as an attempt to more clearly communicate my fuzzy, tentative beliefs.)
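For what it’s worth, here’s a minimal back-of-envelope sketch (in Python) of what those figures would imply if taken at face value. The success probability and dollar range are just the rough numbers above, so the output inherits all of their caveats:

```python
# Rough expected-value sketch using the illustrative numbers above.
# All inputs are the post's own made-up figures, not calibrated estimates.

p_success = 0.10          # ">10%" chance the project shifts funder behaviour
shift_low = 3_000_000     # low end of the $3m-$25m funding-shift range
shift_high = 25_000_000   # high end of that range

ev_low = p_success * shift_low    # expected dollars influenced, low end
ev_high = p_success * shift_high  # expected dollars influenced, high end

print(f"Expected funding influenced: ${ev_low/1e6:.1f}m-${ev_high/1e6:.1f}m")
# Output: Expected funding influenced: $0.3m-$2.5m
```

If even the low end of that expected range exceeds the cost of a 2-9 month research project, the project plausibly pays for itself in expectation; that’s the rough shape of the argument, not a precise claim.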
What sort of person might be a good fit for this?
I expect that any good generalist researcher could provide a useful analysis of these questions. I expect someone to be a stronger fit the more of the following criteria they meet:
- Knowledge, experience, or connections relevant to the think tank, arms control, policymaking, and/or foundation worlds, especially in relation to Washington D.C. and nuclear risk
- Grantmaking experience
- Trust from EA funders (so they are more likely to act on the person's analysis)
[1] I haven’t independently verified that this is true of any of these specific funders except Hewlett and MacArthur. Instead, the other funders I name are based entirely on the two cited news sources (which don’t themselves cite sources for the relevant claims).
[2] We might also want to consider the following related options:
- Funding established organisations, experts, etc. to do work aimed at other longtermist/x-risk-related priorities, such as biosecurity or AI risk. (There’s substantial overlap between the expertise and backgrounds relevant to nuclear risk and the expertise and backgrounds relevant to some other longtermist priority areas.)
- Funding established organisations to take on board a person of “our” choice, who then gets a great chance to test, improve, and demonstrate their fit for various types of work it could be impactful for them to do in future. (See also the section “Fellowships and bringing in your own funding” in “Working at a (DC) policy think tank: Why you might want to do it, what it’s like, and how to get a job”.)