Why the Open Philanthropy Project isn't currently funding organizations focused on promoting effective altruism

By Holden Karnofsky @ 2015-10-28T19:40 (+32)

This post attempts to clear up some confusions and discuss why the Open Philanthropy Project isn't currently funding organizations focused on promoting effective altruism. "We" refers to the Open Philanthropy Project.

We're excited about effective altruism, and we think of GiveWell as an effective altruist organization (while knowing that this term is subject to multiple interpretations, not all of which apply to us).

Over the last few years, multiple organizations have sprung up that focus on promoting and supporting the general ideas associated with effective altruism, and there's a case to be made that the Open Philanthropy Project should be funding such organizations. We may do so in the future, but we haven't to date and don't plan to do so imminently.

We are particularly interested in clarifying our thinking, and pointing out some of the constraints and limitations we face, in order to make clear that our lack of funding in this area does not mean that we have negatively evaluated the giving opportunities. We encourage individual donors to support organizations in this area if and when they feel they have ample context and a strong case for doing so.

For brevity, we abbreviate "grants to organizations focused on promoting and supporting the general ideas associated with effective altruism" as "EA organization grants" for the remainder of this post. Despite the abbreviation, this term isn't meant to include all organizations that consider themselves "EA organizations," and in particular doesn't include organizations that focus on global catastrophic risk reduction.

Summary:

There are several reasons why we might make exceptions (one-off grants) or change our minds about EA organization grants as a focus area:

The bottom line is that, at this point, we should be seen as generally agnostic on EA organization grants. We would put more time into this area if we had seen giving opportunities that seemed overwhelmingly compelling and unlikely to be funded by other donors, but for the most part we have not put enough time into assessing potential EA organization grants to have strong views. We don't think working on such grants is the right decision for our organization today, but we think there are others for whom EA organization grants may be an excellent choice. While we sometimes share informal thoughts on effective altruism organizations and alert interested donors to giving opportunities, we feel that many other donors are well-positioned to make their own judgments about whether to fund effective altruism organizations, particularly donors who are highly impact-focused and have a good deal of context on the effective altruism community.

Avoiding "one-off" grants

We have limited capacity, and we generally feel that one way to make the most of it is to concentrate on focus areas. "Focus areas" refers to causes that we've made a deliberate, strategic decision to prioritize.

For any given focus area, we can:

When a giving opportunity sits within a focus area, we're generally able to evaluate it efficiently. It's clear whose role it is to provide most of the context; there are multiple people on staff who are familiar enough with the area to weigh in; potential risks are relatively easy to identify. We are well-positioned to explain our thinking to staff, to others in our audience, and to people who work within the cause. By contrast, one-off grants incur many of the costs of analyzing and discussing a new cause, without commensurate benefits.

In the specific case of EA organization grants, some of the costs of one-off grants are muted, while others are heightened. On one hand, we have strong connections in the effective altruist community, and we're broadly familiar with the goals of EA organizations, though we don't have the level of context we'd want for a focus area. So some of the costs associated with investigating focus areas are lower. On the other hand, we're particularly sensitive about problems we could cause if we funded some groups but not others, without being systematic and thoughtful about the reasons for doing so. Doing this could:

People in the effective altruism community are among those best positioned to both promote our work and to critique it at a big-picture level. Our relationships in this community are important to us, and that means we'd want to be able to situate any grants within this community in a well-thought-out overall strategy.

We aren't categorically opposed to one-off grants. We can make them when they appear sufficiently (a) outstanding and/or (b) unlikely to distract significantly from our focus-area-oriented work. EA organization grants generally seem particularly unlikely to pass the second criterion.

We have made one small grant in this category as a one-off, and may do so again in the future, but we don't plan to make many such grants.

EA organization grants would be a relatively intensive focus area

There are some cases in which a focus area takes only a small fraction of a staff member's time - for example, land use reform and immigration policy. These causes are characterized by: (a) a thin field and ecosystem around our goals, with only a small number of giving opportunities and little in the way of established players; (b) sitting within a broader category (US policy) that one of our staff members focuses on.

By default, we don't think it would work well to approach EA organization grants this way, for several reasons.

First, there are enough organizations and people seeking funding that it would take serious time investment to keep up with and consider giving opportunities.

Second, there are some risks associated with this area:


It's important to note that the above factors aren't arguments against working on EA organization grants; they're simply reasons that doing so would have to be fairly intensive for us. Many focus areas bring substantial challenges and would require significant effort to work in, and we are happy to take these challenges on for the small number of focus areas we prioritize most highly.

A final note on this topic is that there is a fairly small set of people whom we could picture leading our work on EA organization grants, and all of them could work on many other areas for us as well. We have done several cause-specific job searches, looking for people we wouldn't find through our generalist hiring process, but we don't think this would work well for EA organization grants.

The bottom line of this section is that the opportunity cost of working on EA organization grants would be fairly high for us. It would likely require substantial time from generalist senior staff, which - at this stage in our development - would substantially slow our work on another broad category, such as scientific research funding.

We are actively thinking about ways we could approach EA organization grants in a less intensive way. We haven't yet settled on an approach we're comfortable with.

Our comparative advantage

There are several reasons we prefer our current research agenda to one with more focus on EA organization grants. One is our informal assessment of the importance, neglectedness and tractability of the "EA organization grants" area. While we think greater interest in effective altruism could lead to a large amount of positive impact, the giving opportunities we've seen (after accounting for existing sources of funding) don't seem outstanding enough to make working in this area preferable to exploring other areas. However, this assessment is highly speculative and informal - especially given how far we are from having a good understanding of some of the areas we're working on, such as scientific research.

A different line of reasoning (that overlaps to some degree with questions around importance, neglectedness and tractability) has to do with our comparative advantage and long-term strategy:

Other donors are well-positioned to fund EA organizations

It currently seems to us that:

This isn't to say that we've seen no funding gaps. But the field as a whole seems both relatively young (hence limited room for more funding) and capable of raising money in line with its current stage of development. And we're unsure how much value we'd add over existing donors. The most distinctive aspects of the Open Philanthropy Project's approach to funding pertain to cause selection; our model for working within a cause is not (so far) very different from the approach many others take, which is to assign most of the decision-making to a person who knows the space (and its organizations, people and literature) well and can make informed judgment calls.

As a more minor point, if we made a large commitment to the space of EA organization grants, we'd be somewhat worried about causing others to give less to EA organizations. The donors we're discussing tend to be highly attentive to questions like "If I didn't fund this project, would someone else?" - but we're not confident that we would always agree with the details of how they handle these questions. If we made EA organization grants a focus area, and others gave less to EA organizations in the hope that we would fill the gap, effective altruism organizations could end up with less robust donor bases - i.e., relying more heavily on fewer donors - and therefore in a weaker position.

It does seem worth noting that today's donors, by supporting EA organizations at their current level, may be helping build capacity that will lead to much larger giving opportunities down the line, and thus making our future entry into the space more likely.

Our comparative advantage and long-term strategy

We feel that there are multiple individual donors who have similar values to ours and are well-positioned to evaluate EA organizations. By contrast, we feel that it is very challenging for individuals to apply effective altruist values to causes that aren't "about" effective altruism, and this is an area where we feel uniquely well-positioned to be helpful.

We've long believed that one of the best things we can do for effective altruism is to give it more substance and definition. There are other groups who focus on getting more people to become interested in the broad ideas and values behind effective altruism; by contrast, we feel particularly well-positioned to help people identify specific donations they can make, issues they can engage with, etc. once they have bought in. Doing this can, itself, help get more people interested in effective altruism. For example, we believe that GiveWell's work on top charities has improved engagement from people who wouldn't have been drawn in purely by the abstract idea of effective giving. Much as one might develop and refine a scientific theory by using it to make predictions, we're trying to develop and refine an effective altruist framework by using it to arrive at concrete recommendations. We believe that doing so can help improve and make the case for the framework, and that this is a distinct goal from supporting promotion of the framework.

There are a few other reasons that working on causes other than EA organization grants fits well with our long-term strategy:

Bottom line

At the moment, we're not imminently planning to make EA organization grants, either as one-offs or via a focus area. We're continually reassessing this stance. Our staff capacity is in flux (generally growing), as is the state of the effective altruism community and the associated giving opportunities. As room for more funding in the EA organization grants space grows, and as our capacity grows, the case for working on EA organization grants gets stronger. We do want to hear about EA grant giving opportunities; the more pressing and unfilled gaps we hear about, the more likely we are to make the space a focus area, and we haven't ruled out one-off grants should the right opportunities (both in terms of promise and in terms of the time required from us and the potential risks) arise.

For the time being, however, we wish to make clear that we see no conflict between (a) choosing not to make EA organization grants ourselves and (b) being glad that there are other donors interested in doing so (both of which are the case).


undefined @ 2015-11-02T20:57 (+17)

Thanks for the comments, all.

Telofy and kgallas: I'm not planning to write up an exhaustive list of the messages associated with EA that we're not comfortable with. We don't have full internal agreement on which messages are good vs. problematic, and writing up a list would be a bit of a project in itself. But I will give a couple of examples, speaking only for myself:

  1. I'm generally uncomfortable with (and disagree with) the "obligation" frame of EA. I'm particularly uncomfortable with messages along the lines of "The arts are a waste when there are people suffering," "You should feel bad about (or feel the need to defend) every dollar you spend on yourself beyond necessities," etc. I think messages along these lines make EA sound overly demanding/costly to affiliate with as well as intellectually misguided.

  2. I think there are a variety of messages associated with EA that communicate unwarranted confidence on a variety of dimensions, implying that we know more than we do about what the best causes are and to what extent EAs are "outperforming" the rest of the world in terms of accomplishing good. "Effective altruism could be the last social movement we ever need" and "Global poverty is a rounding error compared to other causes" are both examples of this; both messages have been prominently enough expressed to get into this article, and both messages are problematic in my view.

Telofy: my general answer on a given grant idea is going to be to ask whether it fits into any of our focus areas, and if not, to have a very high bar for it as a "one-off" grant. In this case, supporting ACE fits into the Farm Animal Welfare focus area, where we've recently made a new hire; it's too early to say where this sort of thing will rank in our priorities after Lewis has put some work into considering all the options.

undefined @ 2015-11-06T18:36 (+4)

I’m looking forward to news from Lewis then!

Agreed on point 2.

About point 1: “I think messages along these lines make EA sound overly demanding/costly to affiliate with”: This strategic issue is one that I have no informed opinion on. Intuitively I would also think that people work that way, but the practice of hazing, e.g., initiation rites of fraternities, suggests that such costliness might counter recidivism, and that’s an important factor. Moral frameworks that have this obligation aspect also seem relatively more simple and consistent to me, which might make it easier to defend them convincingly in outreach.

“As well as intellectually misguided”: From a moral antirealist’s perspective, this depends on the person’s moral framework. Taking Brian’s critique of the demandingness critique into account, it does apply to mine, so whether to demand the same from others again boils down only to the strategic question above. Do you have an ethical or epistemic reason why it would be misguided even from a broadly utilitarian viewpoint?

undefined @ 2015-10-28T23:51 (+13)

I really appreciate when you or other GiveWell employees take the time to write up your positions like this.

undefined @ 2015-10-29T06:59 (+8)

Thank you! That makes a lot of sense and increases my estimate of the marginal value of ETG for me.

One on-topic question: You say that “there are several messages associated with it that we're not comfortable with.” I have a bit of a history of updating in response to explanations from GiveWell, so I’m worried that I’m also running the risk of perpetuating (myself or through my donations) EA messaging that I will, in retrospect, regret. At the same time I’m puzzled as to which messages you might be referring to. Can you clarify this?

One explicitly off-topic question: Since ACE’s top charities are not charities that have grown out of the EA movement and ACE’s raison d'être is close to GiveWell’s, is there a chance you could provide a fixed yearly grant to ACE for regranting so as to incentivize charities to cooperate with it? (Small and fixed enough so not to make individual donations fungible to any worrisome degree.)

undefined @ 2015-10-29T14:35 (+7)

“At the same time, effective altruism is still in a very nascent phase, there are several messages associated with it that we're not comfortable with, and amplification of problematic messages at this stage could affect general perceptions around the label, which could be a problem for both effective altruism and us.”

Could you please clarify?