Some thoughts on recent Effective Altruism funding announcements

By JamesÖz @ 2022-03-03T15:53 (+116)

[Linkposted from my blog, Understanding Social Change]
 Epistemic status: Quite weak. I wrote this fairly quickly and I'm probably much less informed than those actively involved in the EA funding landscape.

It’s been an interesting couple of weeks for the EA movement, to say the least. The FTX Foundation, thanks primarily to Sam Bankman-Fried, has announced the launch of the Future Fund, whose giving will focus on improving the long-term future. The Future Fund says it wants to deploy $100 million this year alone, with the potential to increase up to $1 billion if the opportunities are good enough. On top of this, Open Philanthropy, which gave $60 million to longtermist community building in 2021, is now hiring for four different roles within its longtermist community building team, and is very likely looking to increase that figure significantly given the expansion of the team from four people to potentially eight.

The announcement by the Future Fund was particularly interesting, as they’re taking a more decentralised approach to grantmaking than most existing foundations. Specifically, they have a long list of projects they're interested in hearing proposals about, a competition to source more ideas, and a regranting challenge. The regranting challenge seems particularly needed, I believe, as it’ll build the grantmaking capacity of people within the EA movement, which will be crucial going forward when deploying even larger sums (I expand on this further down).
 

If that wasn’t enough, Open Philanthropy also announced they were hiring a program officer for community building within their Global Health and Wellbeing portfolio, which focuses on global health and poverty, farmed animal welfare, scientific research, and more. They state that they expect the program officer to allocate $10 million in funding in their first year and that “funding could grow significantly from there depending on the volume of good opportunities they find.”

 

So, what does all this mean for Effective Altruism as a movement, or for individuals who are trying to do the most good? A few things potentially:

  1. The funding landscape for long-term vs near-term opportunities seems to have changed significantly, with approx. 5x more funding available for long-term community building relative to near-term community building.
  2. We need to scale up our grantmaking capacity as funding might increase by 8-10x by 2025, relative to 2019 levels.
  3. The EA movement needs to consider how having the influence and resources to grow certain fields by up to 10x will influence the wider ecosystem of organisations. It’s traditionally been based around marginal allocations of money, but now more complex coordination dynamics might appear.

And what’s been said many times, but is still true:

4. We need entrepreneurs, founders and other people who can kickstart projects

5. We need to massively scale up our ambition

-

1. The funding landscape for long-term vs near-term opportunities seems to have changed significantly

A seemingly impossible and never-ending challenge is how we allocate resources across a variety of different worldviews. This is a particularly hot topic when it comes to near-term vs long-term causes: how do we allocate resources between helping people alive now and people who might live 10,000 years from now? And how do animals fit into the picture?

 

To think about how funding for near-term vs long-term cause areas has changed, we can use community building, or more ‘meta’ interventions, as an example. This assumes that the funding the EA community allocates to community building is broadly representative of the total funding allocated to near-term vs long-term causes. This might break down slightly, as it’s generally acknowledged that there are more tangible near-term giving opportunities (e.g. GiveDirectly could probably still productively use $350 million, although at an effectiveness 5-10x lower than other GiveWell top charities). Longtermist funders, on the other hand, are constrained by the number of good opportunities and proposals they receive. Because of this, it seems reasonably clear that the longtermist community would spend more on broad movement building, to build the pipeline of people capable of starting successful and impactful projects to improve the long-term future, whereas this is comparatively less of a priority for the near-term EA community.

 

If we look at the recent Open Philanthropy updates on their community building spending, we can get a sense of how this looks for Open Philanthropy, which controls a significant portion of committed EA funding. One note: since this post by Benjamin Todd, the major donor situation has changed slightly, with Dustin Moskovitz’s net worth dropping to $13 billion and SBF’s rising to $24.5 billion. This means the FTX Foundation might be the biggest EA funder going forward, so it’s probably more important to see what their priorities are. In both cases, the trend is towards growth in longtermist community building over near-term efforts:

| Funding stream | Lower bound estimate ($ millions per year in 2022) | Median estimate ($ millions per year in 2022) | Upper bound estimate ($ millions per year in 2022) |
|---|---|---|---|
| Open Phil longtermist community building | 100 | 120 | 140 |
| Open Phil near-term community building | 10 | 20 | 30 |
| Ratio of OP community building, longtermist / neartermist | 10 | 6 | 5 |
| FTX Future Fund | 100 | 400 | 1,000 |
| FTX Community Fund | 18 | 50 | 100 |
| Ratio of FTX longtermist / neartermist | 6 | 8 | 10 |

 

In short, it’s tough to place an exact number on the ratio of allocated funding for near-term vs long-term causes, especially without the FTX Foundation announcing their commitments to FTX Community and FTX Climate. At a rough best guess, this ratio seems like 1:7 (near-term : long-term) for Open Philanthropy’s community building, assuming a doubling in size of the longtermist community building team causes a 66% increase in funding distributed. For FTX, it could similarly be around 1:8 or even up to 1:10, provided the longtermist opportunities are good enough. A caveat again is that the projects proposed by the Future Fund and those funded by Open Phil will likely overlap with near-term community building efforts (e.g. improving operations within EA, more competition in the EA ecosystem, increasing diversity within EA, etc.), so this isn’t so clear cut, and could be more like 1:5. Another thing to keep in mind is that the EA Infrastructure Fund (via Peter Wildeford) is also quite keen to fund neartermist community building, so the difference might be reduced further, depending on the size of projects.
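As a rough cross-check, the median estimates in the table above imply ratios in the same ballpark as these best guesses (a quick sketch using only the figures from this post):

```python
# Rough ratio check using the median estimates from the table above
# (all figures in $ millions per year, 2022).
op_longtermist = 120   # Open Phil longtermist community building (median)
op_neartermist = 20    # Open Phil near-term community building (median)
ftx_longtermist = 400  # FTX Future Fund (median)
ftx_neartermist = 50   # FTX Community Fund (median)

print(f"Open Phil ratio: 1:{op_longtermist / op_neartermist:.0f}")
print(f"FTX ratio: 1:{ftx_longtermist / ftx_neartermist:.0f}")
```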

 

Interestingly, Ben Todd finds that resources allocated to broad longtermism are around 9x smaller than what EA leaders think they should be. This additional funding, assuming my ratio of 1:8 is broadly correct, would bring funding for broad longtermism much more closely in line with the ideal portfolio from the EA Coordination Forum in 2020.

In addition, according to the same post by Ben above, neartermist grants were a much larger proportion of EA’s giving in 2019 relative to longtermist giving (71% to 29% if you include animals in the near-term bucket). Overall, the funding increase by the Future Fund brings little change to the ratio of funding for longtermist vs neartermist causes in 2022, as it is counterbalanced by GiveWell allocating an estimated $400 million in 2021, up from $172 million in 2019. This is somewhat surprising, as I had always assumed longtermist grantmaking was already much greater than the neartermist portfolio, but the numbers suggest otherwise. However, it’s quite likely that longtermist grantmaking will outpace the neartermist portfolio over the next 3-10 years, as the total funding committed to longtermism is seemingly much greater (definitely for the FTX Foundation, if not also Open Phil).

| Worldview | $ millions per year in 2019 | % of total in 2019 | $ millions per year in 2022 (estimated) | % of total in 2022 | Growth from 2019 to 2022 (×) |
|---|---|---|---|---|---|
| Near-term | 230 | 57 | 640 | 60 | 2.8 |
| Long-term | 117.8 | 29 | 318 | 30 | 2.7 |
| Animal Inclusive | 55 | 14 | 110 | 10 | 2.0 |
| Total | 402.8 | 100.0 | 1,067.8 | 100 | 2.7 |

A note that my table above is quite simplistic, and I could be off by quite a lot, as I’m working solely with public information, which isn’t always perfect or up-to-date. I calculated the 2022 values using Ben Todd’s previous funding allocation post, adding $400 million (Open Phil + GiveWell) to near-term funding, adding $200 million to long-term funding ($100m from the Future Fund + $100m from Open Phil, which roughly doubles their 2019 figures), and doubling animal funding, which are all somewhat conservative/reasonable assumptions I think. It’s quite likely that there have been changes among other donors within the EA community that might have altered these numbers as well. However, it’s also fair to assume that Open Phil, GiveWell and FTX are the biggest funding sources within the EA community, so these figures might not be too far off, assuming I’ve estimated those amounts correctly.
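For transparency, the growth multipliers in the table can be reproduced with some quick arithmetic (a sketch using only the estimates above; all values in $ millions per year):

```python
# Reproduce the growth multipliers from the table above:
# (2019 figure, 2022 estimate) per worldview, in $ millions per year.
funding = {
    "Near-term":        (230.0, 640.0),
    "Long-term":        (117.8, 318.0),
    "Animal Inclusive": (55.0, 110.0),
}
for worldview, (y2019, y2022) in funding.items():
    print(f"{worldview}: {y2022 / y2019:.1f}x growth")

total_2019 = sum(v[0] for v in funding.values())
total_2022 = sum(v[1] for v in funding.values())
print(f"Total: {total_2022 / total_2019:.1f}x growth")
```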

All in all, I think it's safe to say that if you want to launch a longtermist community building project (e.g. like one of these), now is probably a pretty good time to go for it!

 

2. We need to scale up our grantmaking capacity

The amount of funding committed to Effective Altruism has grown dramatically in the past few years, with an estimated $46 billion currently earmarked for EA. With this significant increase in available funding, there is a greatly increased need for talented and thoughtful grantmakers who can effectively deploy this money. It's plausible that yearly EA grantmaking could increase by a factor of 5-10x over the coming decade, based on the FTX Foundation and Open Philanthropy scaling up their funding, which they’re planning on doing (I believe). Some quick numbers on this:

 

Overall, that means by 2025, the EA community could quite reasonably be deploying around $3 billion per year, which is 8-10x larger than 2019 figures. Whilst the number of grantmakers doesn’t have to scale up by 8-10x in line with this, especially if the Future Fund is taking a more decentralised approach, it’s plausible to assume that we’ll need significantly more grantmaking and project vetting capacity, possibly by a factor of 3-5x. Obviously it’s no longer 2019 and we’ve probably increased our capacity in this field since then, but I’m doubtful it’s grown by more than 2x in the past 3 years.
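As a back-of-the-envelope check on the $3 billion figure (a sketch using the rough ~$400 million 2019 total estimated earlier in this post):

```python
# Back-of-the-envelope: what 8-10x growth on the ~$400M/year 2019
# baseline (my rough estimate from the table above) implies for
# yearly deployment by 2025.
baseline_2019 = 0.4  # $ billions per year
for multiplier in (8, 10):
    print(f"{multiplier}x -> ${baseline_2019 * multiplier:.1f}B per year")
```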

80,000 Hours also agrees that grantmaking is a significant bottleneck to effectively deploying funding. They also touch on the difficulty of entering this field, especially for people earlier in their careers, as most EAs tend to be. This implies that we need some structural solutions to overcome these barriers, whereby people can both a) test whether they would be a good fit for grantmaking, in terms of both satisfaction and aptitude, and b) build trust with philanthropists or foundations if they are a good fit.
 

As someone who's quite interested in exploring grantmaking, I’ve found almost no opportunities to proactively test this on a small scale (with the exception of the regranting program by the Future Fund, which I’m applying for). This seems like something we should be working harder to address, and as per my comment on important projects that FTX could fund, I believe there are some ways we can develop this grantmaking capacity. Ways to build the grantmaker pipeline might include grantmaking fellowships, grantmaker mentoring, more frequent donor lotteries, more EA Funds-style organisations with rotating fund managers, more regranting challenges (if successful), and more. Specifically, I would be excited about some version of this pipeline:

  1. Selective grantmaker fellowships (in the format of a 6-8 week course) organised by an organisation that's either focused on upskilling EAs (e.g. Training for Good) or grantmaking specifically. This would almost definitely need to be run by at least one experienced grantmaker for it to be worthwhile.
  2. The best candidates from the fellowship are invited to take part in a regranting challenge by a foundation, or invited onto EA Funds (or a similar org) as a part-time guest manager.
  3. The best candidates again can be offered more permanent roles within foundations / with philanthropists.
  4. These candidates (or ones that didn’t meet the bar earlier on) have 1-to-1 mentoring with experienced grantmakers within their field, to further hone their judgement and develop best practices.

 

Step 2) onwards could also happen for someone who performs particularly well in a donor lottery. People within the EA community generally believe that donor lotteries are useful, yet we only run 3 per year. Having something closer to 5-10 would allow many more interested people to try out grantmaking, which was seemingly one step in Adam Gleave’s journey to becoming a fund manager on the Long-Term Future Fund.

Generally, I think another EA Funds-style organisation with rotating fund managers would be extremely high value, as you could offer approx. 4 people per cause area the opportunity to test their fit, as well as increasing funder diversity by a good margin.

 

3. Does the EA movement have to shift from marginal thinking to more complex coordination dynamics? 

Many EAs use marginal thinking when considering their donations, asking: what’s the specific additional impact of my donation, given that X amount of money has already been donated to this charity? This worked well when EA was relatively much smaller and had less leverage over the wider ecosystem of a certain cause. However, with the significant increase in committed funds for EA, there are already signs that EA funding is shaping certain fields towards EA priorities. As giving is plausibly expected to grow by up to 8-10x by 2025, relative to 2019 figures, this will likely become even more pronounced. For example, approximately $40 million was being spent on biorisk and AI alignment in 2019, and I’m almost certain that if the opportunities were good enough, this could easily be $400 million per year for those two causes in the next couple of years. However, increasing the funding size of a field by up to 10x might introduce some complex dynamics, such as:

Some tangible examples of these effects already happening might be:

 

An important caveat is that I’m somewhat confident that people at Open Phil, FTX Foundation, EA Funds, etc. are thinking about this already (I hope!). And as mentioned above, these dynamics aren’t necessarily bad, as it potentially means people are working on more effective interventions or pressing problems. However, it may lead to some unwanted and potentially negative consequences, listed above. 

In short, what this means for major funders is that they might now need to take a more holistic view of the causes they’re working on, to ensure each is developing well across the board and that additional EA funding doesn’t lead to any blindspots or adverse consequences. Broadly, this means investing in an ecology of change. This is potentially an over-simplification of a very complex coordination problem, so take it all with a pinch of salt.
 

The final two points have been spoken about much more within the EA community, yet the issues still remain:

4. We need entrepreneurs, founders and other people who can kickstart projects

In essence, it’s now widely acknowledged that funding has grown faster than the number of people involved with EA. With this increased amount of funding, one of the main bottlenecks to greater allocation per year is the number of good proposals, which is very closely related to the number of founders willing to launch them (as we have no shortage of ideas). This means that entrepreneurs, or people willing to start nonprofit or for-profit organisations, are in extremely high demand and could have huge leverage in unlocking more funding (especially by building scalable projects).
 

How could we make more of this happen? A couple of very preliminary ideas:

 

5. We need to massively scale up our ambition

It’s hard to add more to the post linked above, but in short:


Michael_Wiebe @ 2022-03-03T17:50 (+37)

This is a particularly hot topic when it comes down to near-term vs long-term causes; Do we think humans today morally matter more than humans in 10,000 years, and how, if at all, should we discount the value of humans over time?

Is there much debate on this? I'd expect most EAs to answer 'no' and 'discount rate=0'.

I'd expect more debate over the tractability of longtermist interventions.

andiehansen @ 2022-03-03T19:10 (+10)

As an EA group facilitator, I've been a part of many complex discussions talking about the tradeoffs between prioritizing long-term and short-term causes.

Even though I consider myself a longtermist, I now have a better understanding and respect for the concerns that near-term-focused EAs bring up. Allow me to share a few of them.

  1. The world has finite resources, so when you direct resources to long-term causes, those same resources cannot be put towards short-term causes. If the EA community was 100% focused on the very long term, for example, then it's likely that solvable problems in the near-term affecting millions or billions of people would get less attention and resources, even if they were easy to solve. This is especially true as EA gets bigger, having a more outsized impact on where resources are directed. As this post says, marginal reasoning becomes less valid as EA gets larger.
  2. Some long-term EA cause areas may increase the risk of negative outcomes in the near-term. For example, people working on AI safety often collaborate with and even contribute to capabilities research. AI is already a very disruptive technology and will likely be even more so as its capabilities become more powerful.
  3. People who think "x-risk is all that matters" may be discounting other kinds of risks, such as s-risks (suffering risks) due to dystopian futures. If we prioritize x-risk while allowing global catastrophic risks (GCRs) to increase (that is, risks which don't wipe out humanity but greatly set back civilization), that increases s-risks because it's very hard to have well-functioning institutions and governments in a world crippled by war, famine, and other problems.

These and other concerns have updated me towards preferring a "balanced portfolio" of resources spread across EA causes from different worldviews, even if my inside view prefers certain causes over others.

Michael_Wiebe @ 2022-03-03T21:47 (+2)

If the EA community was 100% focused on the very long term, for example, then it's likely that solvable problems in the near-term affecting millions or billions of people would get less attention and resources, even if they were easy to solve.

This is directly captured by the ITC framework: as longtermist interventions are funded and hit diminishing returns, then neartermist ones will have the highest marginal utility per dollar. (Usually, MU/$ is a diminishing function of spending, so the top-ranked intervention will change as funding changes.)

James Ozden @ 2022-03-03T18:25 (+1)

Yes my bad! This is actually what I meant e.g. the epistemic uncertainty around longtermist interventions makes it challenging to determine funding allocation. Will amend this, thank you!

Cillian Crosson @ 2022-03-04T09:14 (+27)
  1. Selective grantmaker fellowships (in the format of a 6-8 week course) organised by an organisation that's either focused on upskilling EAs (e.g. Training for Good) or grantmaking specifically. This would almost definitely need to be run by at least one experienced grantmaker for it to be worthwhile.


We (Training for Good) are actually developing a grantmaker training programme like what you've described here to help build up EA's grantmaking capacity. It will likely be an 8 week, part-time programme, with a small pot of "regranting" money for each participant and we're pretty excited to launch this in the next few months.

In the meantime, we're looking for 5-10 people to beta test a scaled-down version of this programme (starting at the end of March). The time commitment for this beta test would be ~5 hours per week (~2 hrs reading, ~2 hrs projects, ~1 hr group discussion). If anyone reading this is interested, feel free to shoot me an email cillian@trainingforgood.com 

Kat Woods @ 2022-03-06T12:15 (+26)

Nonlinear is launching a longtermist incubator! Given my background co-founding Charity Entrepreneurship and nobody else moving forward on the idea, I thought it was a good fit for us.

Details to be announced soon.

DavidNash @ 2022-03-03T20:12 (+11)

Your point on biosecurity and marginal thinking was discussed in this forum post.

Kaleem @ 2022-03-03T20:15 (+5)

The point about bringing non-EA entrepreneurs into EA is a good one (I think - but I think that I think that because I have also been thinking about it recently!). One idea I've had is whether it'd be worth hosting some type of conference where we bring together HEA university students with enterprising and ambitious (not required) business school students to exchange ideas and brainstorm, with the aim of allowing co-founder pairs to enter some type of competition with seed-funding as the prize.

Jan-WillemvanPutten @ 2022-03-04T09:06 (+1)

Great idea, at TFG we have similar thoughts and are currently researching the best way to run a program like this. Feel free to PM to provide input.