How important is marginal earning to give?
By Ben Kuhn @ 2015-05-19T20:41 (+16)
Some observations:
- Most of GiveWell’s senior staff are moving over to the Open Philanthropy Project.
- This year, GiveWell had to set explicit funding targets for all of their charities and update their recommendations in April to make sure nobody ran out of room for more funding.
- My understanding is that Good Ventures (a) probably has more money than the current discounted cash flows from the rest of the EA movement combined and (b) still isn’t deploying nearly as much money as they eventually will be able to.
- Open Phil has recently posted about an org they wish existed but doesn’t and funder-initiated startups.
- I can’t remember any EA orgs failing to reach a fundraising target.
- Effective altruism is growing quickly; many EAers plan to earn to give but are currently students and will increase their giving substantially in the next few years.
These observations make me generally uneasy about earning to give: Good Ventures and other large foundations can fund a ton of stuff, and there are many individual EA donors (at least, many relative to the available opportunities) who can fund the good ideas that aren't worth large funders' attention for whatever reason. So it might be more important to have people trying to spot opportunities and start effective charities with support from large funders or current EtGers. For instance, the Gates Foundation has 1,200 employees helping them deploy their money (presumably not counting the people who help them start new organizations); applying a similar ratio to Good Ventures would suggest they should have on the order of 100 people helping them, whereas today they have ~10.
Given that doing a normal job and making large donations is psychologically more attractive than trying to start nonprofits for a lot of people (including myself), this suggests that marginal EtGers (also potentially including myself?) might want to give more weight to trying to find opportunities to start new effective organizations, and leave the funding to people like Dustin Moskovitz.
One counterpoint might be that “large funders” are not actually that large; for instance, 72% of total giving is from individuals, but I don’t know if that ratio holds for global poverty or other causes EAs are interested in. And even if it does, it seems like you have to be a certain size of organization to raise grassroots funds effectively, and right now we don’t have enough orgs of that size.
I’d love to get other people’s thoughts on this.
undefined @ 2015-05-19T21:05 (+15)
To play devil's advocate (these don't actually represent my beliefs):
I can’t remember any EA orgs failing to reach a fundraising target.
This doesn't necessarily mean much, because fundraising targets have a lot to do with how much money EA orgs believe they can raise.
Open Phil has recently posted about an org they wish existed but doesn’t and funder-initiated startups.
It's pretty hard to get funding for a new organization, e.g. Spencer and I put a lot of effort into it without much success. The general problem I see is a lack of "angel investing" or its equivalent–the idea of putting money into small, experimental organizations and funding them further as they grow. (As a counter-counterpoint, EA Ventures seems well poised to function as an angel investor in the nonprofit world.)
Also, to address the general point that EA is talent-constrained, the problem might be that there are very few people with the skills needed, and more funding can be used to train people, like MIRI is doing with the summer fellows program. In that case earning to give is still a good solution to the talent constraint.
undefined @ 2015-05-20T01:04 (+5)
It's pretty hard to get funding for a new organization, e.g. Spencer and I put a lot of effort into it without much success. The general problem I see is a lack of "angel investing" or its equivalent–the idea of putting money into small, experimental organizations and funding them further as they grow.
I agree with this. Moreover, I think there's a serious lack of funding in the 'fringe' areas of EA like biosecurity, systemic change in global poverty, rationality training, animal rights, or personal development. These areas arguably have the greatest impact, but it's difficult to attract the major funders.
For example, I think the Swiss EA groups are quite funding-constrained, but they aren't well-known to the major funders and movement-building lacks robust evidence.
undefined @ 2015-05-20T04:32 (+6)
Have the Swiss EA groups tried to raise funding from the broader community? I had no idea they were funding-constrained until you mentioned it.
undefined @ 2015-05-20T13:19 (+4)
It's correct that the Swiss EA organizations are currently funding-constrained. We haven't pitched any projects to the international community yet, but we're considering it if an opportunity arises where this makes sense.
I also think that funding is going to be less of an issue once more people in the movement transition from being students to earning to give.
undefined @ 2015-05-20T04:43 (+3)
This doesn't necessarily mean much, because fundraising targets have a lot to do with how much money EA orgs believe they can raise.
I agree that this could confound the result, but it's still some evidence!
The general problem I see is a lack of "angel investing" or its equivalent–the idea of putting money into small, experimental organizations and funding them further as they grow. (As a counter-counterpoint, EA Ventures seems well poised to function as an angel investor in the nonprofit world.)
It's hard to say for sure without knowing the fraction of solicited EA startups that get funding, but GiveWell has made some angel-esque investments in the past (e.g. New Incentives), and I think some large individual donors have as well.
the problem might be that there are very few people with the skills needed, and more funding can be used to train people, like MIRI is doing with the summer fellows program.
This is pretty plausible for AI risk, but not so obvious for generic organization-starting, IMO. Are there specific skills you can think of that might be a factor here?
undefined @ 2015-05-20T07:00 (+3)
It's hard to say for sure without knowing the fraction of solicited EA startups that get funding, but GiveWell has made some angel-esque investments in the past (e.g. New Incentives), and I think some large individual donors have as well.
I get the impression that these are going mostly to programs that already have a lot of evidence and aren't really exploring the space of possible interventions. I tend to believe that the effectiveness of projects probably follows a power law, and that therefore the most effective interventions are probably ones people haven't tried yet, so funding variants on existing programs doesn't help us find those interventions.
This is pretty plausible for AI risk, but not so obvious for generic organization-starting, IMO. Are there specific skills you can think of that might be a factor here?
GiveWell style research seems very trainable, and it is plausible that GiveWell could hire less experienced people & provide more training if they had significantly more money (I have no information on this though.)
The right way to learn organization-starting skills might be to start an organization; Paul Graham suggests that this is the right way to learn startup-building skills. In that case we'd want to fund more people running experimental EA projects.
undefined @ 2015-05-20T12:56 (+2)
I wouldn't say that New Incentives has "a lot of evidence and aren't really exploring the space of possible interventions." But again, this is just dueling anecdata for now.
GiveWell style research seems very trainable, and it is plausible that GiveWell could hire less experienced people & provide more training if they had significantly more money
GiveWell already hires and trains a number of people with 0 experience (perhaps most of their hires).
The right way to learn organization-starting skills might be to start an organization; Paul Graham suggests that this is the right way to learn startup-building skills. In that case we'd want to fund more people running experimental EA projects.
Ah, good point. This seems like a pretty plausible mechanism.
undefined @ 2015-05-20T21:19 (+1)
GiveWell already hires and trains a number of people with 0 experience (perhaps most of their hires).
Oh, cool! I definitely didn't realize this.
undefined @ 2015-05-20T01:33 (+12)
Note that GiveWell / Good Ventures (unsurprisingly) like to research a charity or cause area themselves before they direct funding to it, and this is tightly constrained by the pace of GiveWell research staff growth, so in practice many high-leverage opportunities are still (in my opinion) available to marginal EtGers — at least, if those EtGers are willing to be at least 1/5th as proactive about finding good opportunities as, say, Matt Wage is. Maybe that won't be true after 10 years of additional research conducted by GiveWell (incl. OpenPhil), but I think it'll be true for the foreseeable future.
There are probably additional reasons GiveWell / Good Ventures won't fund particular things, besides the fact that they haven't been researched in sufficient depth by GiveWell. E.g. GiveWell might think it's a good thing for there to be multiple meta-charities in the EA space that maintain independence, and so even if funding CEA projects is a clear win, they still might think it's a bad idea for GW/GV to direct any support to CEA projects.
And finally, it's also possible that individual EtGers might have different values or world-models than the public faces of GW/GV have, and for that reason those marginal EtGers could have good opportunities available to them that are not likely to be met by GW-directed funding anytime soon, if ever.
(I say all this as a random EA who thinks about these things, not as a soon-to-be GW staffer.)
That said, I also think people with the right collection of talents should seriously consider applying to do cause prioritization research at GW or elsewhere, and people with a different right collection of talents should consider starting new projects/organizations, especially when doing so in coordination with an already-interested funder like GV.
undefined @ 2015-05-20T14:49 (+2)
Yes, I think it's right that people can find opportunities beyond those that are researched by GW if they have different values, different epistemology, pro-actively investigate opportunities to fund, or even outsource this evaluation to Wage, EA Ventures, Beckstead or elsewhere.
undefined @ 2015-05-21T20:50 (+4)
I love the idea of outsourcing my donation decisions to someone who is much more knowledgeable than I am about how to be most effective. An individual might be preferable to an organization for reasons of flexibility. Is anyone actually doing this -- e.g., accepting others' EtG money?
In fact, I'd outsource all kinds of decisions to the smartest, most well-informed, most value-aligned person I could find. Why on earth would I trust myself to make major life decisions if I'm primarily motivated by altruistic considerations?
undefined @ 2015-05-22T11:47 (+1)
Well, even if you're primarily motivated by altruistic considerations, there are likely to be significant personal factors that you can introspect on more easily than anyone else can. A related, and clearly beneficial, practice is getting advice from mentors whom you talk to when you face a bigger-than-usual decision.
My other thought is: what kinds of decisions do you want to outsource? Clever altruistic people have occasionally described why they made various kinds of decisions in their personal lives, and these can be copied e.g.:
- http://www.gwern.net/DNB%20FAQ
- https://meteuphoric.wordpress.com/2014/11/21/when-should-an-effective-altruist-be-vegetarian/
- http://robertwiblin.com/2012/04/19/should-you-floss-a-cost-benefit-analysis/
undefined @ 2015-05-22T14:14 (+3)
Absolutely re personal factors. "Outsource" is an overstatement.
And no, I don't mean decisions like whether to be a vegetarian (which, as I've noted elsewhere, presents a false dichotomy) or whether to floss, which can be generically answered.
I mean a personalized version of what 80,000 hours does for people mid-career. Imagine several people in their mid-30s to -40s--a USAID political appointee; a law firm partner; a data scientist working in the healthcare field--who have decided they are willing to make significant lifestyle changes to better the world. What should they do? This seems to be a very different inquiry than it is for an undergrad. And for some people, a lot turns on it--millions of dollars. Given the amount at stake, it seems like a decision that should be taken just as seriously by the EA community as how an EA organization should spend millions of dollars.
undefined @ 2015-05-22T17:04 (+1)
Ah, mid-career work-related decisions. Yes, it seems important. As mid-career decisions are more tailored, they're harder for 80,000 Hours, who are nonetheless better equipped than most for this task.
Although career direction is important, you can see why it might be done less than directing donations - everyone's money works the same, and so one set of charity-evaluations generalises reasonably well to everyone, assuming they have fairly similar values. Career decisions are harder.
Mentors who sympathise with the idea of effective altruism are helpful here, because they know you. Also special interest groups could be useful. So for people in policy, it makes sense for them to be acquainted with other effective altruists in a similar space, even if they're living in a different country. If someone who had an unusually high-stakes career (say Jaan Tallinn, a cofounder of Skype) wanted to make an altruistic decision about his career, I'm sure he could pull together some of 80,000 Hours and others to do some relevant research for him.
Beyond that, how we can get these questions better answered is an open question :)
undefined @ 2015-05-22T18:18 (+3)
I'm thinking more along the lines of mentors for the mentors, and I think one solution would be a platform on which to crowdsource ideas for individuals' ten-year strategic plans. In a perfect world, one would be able to donate one's talents (in addition to one's money) to the EA cause, which could then be strategically deployed by an all-seeing EA director. Maybe MIRI could work on that.
undefined @ 2015-05-20T04:59 (+2)
in practice many high-leverage opportunities are still (in my opinion) available to marginal EtGers — at-least, if those EtGers are willing to be at least 1/5th as proactive about finding good opportunities as, say, Matt Wage is.
Interesting! Are you able to be more concrete about those opportunities? (Or how proactive Matt is?)
And finally, it's also possible that individual EtGers might have different values or world-models than the public faces of GW/GV have, and for that reason those marginal EtGers could have good opportunities available to them that are not likely to be met by GW-directed funding anytime soon, if ever.
Yeah, definitely agree that this is the case--on the other hand, it seems like there are a lot of EtGers with a fairly diverse set of values/world-models in place already. I'm worried specifically about marginal EtGers; I think the average EtGer is doing super useful stuff.
undefined @ 2015-05-20T07:02 (+4)
From talking to Matt Wage a few times I got the impression that he spends the equivalent of a few full-time work weeks per year figuring out where to donate. Requiring potential donors to spend that much time seems like a flaw in the system, and EA Ventures seems to be addressing it.
undefined @ 2015-05-20T06:43 (+2)
I don't know the whole story, but Matt Wage kept close tabs on FLI, and gave a substantial amount of money at a well-chosen time, which helped make the AI conference planning go more smoothly.
undefined @ 2015-05-20T05:33 (+10)
This is what I was trying to get at with http://acesounderglass.com/2015/05/11/map-of-open-spaces-in-effective-altruism/ . I don't think the number of unsolved problems is at all well publicized.
undefined @ 2015-05-20T10:15 (+5)
I liked this article! There may be enough forum-goers that haven't seen it (as I hadn't) that it would be worth cross-posting.
undefined @ 2015-05-20T15:55 (+11)
Thanks :) I plan to do so as soon as I have the karma
undefined @ 2015-05-20T10:38 (+9)
A note on what we mean when we talk about "marginal EtGers".
In some sense all EtGers look marginal, in that they could shift the margins by moving onto direct work. But there's a coordination issue. Really the people who have the highest comparative advantage at EtG should be pursuing that, and whether we have the right balance determines where the cut-off should be between people choosing EtG or direct work. "Marginal EtGers" are people who only just decided on EtG. They could be people already in EtG careers, but they will more often be people who haven't started, because experience and specialisation shifts your comparative advantage.
I expect this is all exactly as you were thinking, but I've been confused about this before so the clarification seemed like it might be useful for somebody!
undefined @ 2015-05-24T04:06 (+1)
They could be people already in EtG careers, but they will more often be people who haven't started, because experience and specialisation shifts your comparative advantage.
I'm only twenty-two years old, and I haven't completed a university or college certification yet. When I first encountered 80,000 Hours and effective altruism, I opted for earning to give because I didn't think of myself as having many skills. I don't know what soft or general skills I'll learn in various careers by the time I'm thirty or forty, but I know the names of jobs which earn lots of money. Earning to give is what seems available to me. I'm aiming for it because it's the only thing I can concretely imagine myself doing right now. I think this might be the case for lots of young(er) effective altruists, so I think Owen's correct.
undefined @ 2015-05-20T14:42 (+1)
Didn't we all already mean that when we said 'marginal EtGers'? Like - the people whose decision to be in an earning-to-give career rather than charity is marginal? And I agree that it applies more frequently to early-career-stage. But yes, I agree that anyone could theoretically do a little less earning and a little more volunteering for example.
undefined @ 2015-05-20T15:14 (+1)
We probably did, but the meaning of "marginal EtGers" should be context-dependent[1], so it seemed worth clarification.
[1] For example if we're talking about the value of persuading people to re-cast their professional career as earning-to-give, we could want to refer to people who were only just persuaded, or people in different areas who might be reached by expanding the efforts -- either of which is a different margin.
undefined @ 2015-05-20T13:00 (+1)
Thanks for clarifying! Yup, the coordination problem is pretty hard. (Personally, I actually basically have no idea whether I should actually consider myself a marginal EtGer, and don't really know how to answer the parts of this question that require information about the rest of the EA community.)
undefined @ 2015-05-19T23:37 (+7)
Has there been much thought or discussion put into the idea of making existing charities more effective? Sure, there are lots of organizations out there that focus on making marketing more effective or getting more donors; but there seems to be a big hole in the market for people or organizations that work to turn current charities into ones we would consider effective. I've thought about this myself quite frequently and would be stoked to see something like this. Has this already been discussed elsewhere?
undefined @ 2015-05-24T04:10 (+2)
I haven't seen this discussed online. When I met Holden Karnofsky, co-executive director of GiveWell, I asked him if making existing charities more effective is work GiveWell would consider getting into. He told me GiveWell is not considering that, and they intend to stick to their work of charity evaluation. He believes making existing charities more effective would be a more difficult job than evaluating them.
undefined @ 2015-05-26T07:57 (+1)
On the face of it, the Carter Center, Fred Hollows Foundation, and several other charities look like they are already running fantastically cost-effective projects, even if on the whole they don't fare as well. It also appears that large donors can request that certain things be done with their money (from having worked at WaterAid), so concerns about intra-organisation arbitrage might be overblown. I think you're on the right lines there, Syd! Perhaps this could be an article of its own?
undefined @ 2015-05-20T14:53 (+3)
Thanks for posting this Ben, I've noticed some of the same things you mention.
I personally would be more inclined to move away from earning to give if I heard an argument about why the "more money equals more stuff done" equation, which seems to hold in the rest of our economy, fails for charities.
undefined @ 2015-05-20T17:45 (+5)
why the "more money equals more stuff done" equation which seems to hold in the rest of our economy fails for charities
Why isn't there an equal presumption that "more people willing to try things equals more stuff done"? EtGers and org-starters are complementary goods. The point isn't that EtG is not going to do anything, just that there might be other things that did even more.
undefined @ 2015-05-20T05:20 (+3)
I think EAs irrationally avoid giving to "second-best" charities (like GiveWell's standout charities), but that's a relatively weak impression. It might be helpful to talk more about top giving opportunities in a given moment/year, rather than talking about top charities, which can become less top as donations are made, until donating doesn't feel so shiny anymore (also saying this as a random EA, not soon-to-be GW staffer).
Of course, it might be better to ask people to give later in general, but there's no reason as far as I know to believe the best order would be 'donate to room-for-funding-remaining top charity' > 'donate later' > 'donate to second-best charity.'
Also, as Eliezer and Jacy pointed out on Facebook, this sufficient funding argument is far less true of existential risk and animal-focused charities than global poverty ones (in fact, many of those are somewhat strapped for cash).
undefined @ 2015-05-19T21:49 (+3)
I don't have any new considerations to add but I agree with a) there's probably a relative oversupply of 'EA money' on the margin and b) there are various psychological reasons that would pull towards E2G over direct work.
undefined @ 2015-05-26T08:12 (+2)
Psychological reasons? Social status of high earning jobs? The dopaminergic reward of numbers on a page and the oxytocic reward of the expression of gratitude from those you fund? Certainty against uncertainty? Any others?
undefined @ 2015-05-19T21:41 (+3)
I think there are plausibly contrary explanations for some of these observations. For senior staff moving to Open Phil, it could be because Open Phil is younger, and its tasks are less structured. For top charities running out of room for more funding, this is only the top couple of GiveWell charities, and it needn't apply to intergenerationally-altruistic charities. GiveWell has mentioned a couple of organisations that they would like to see, but it's not as though finding such opportunities has yet become their main activity.
I think the general point is right though: Good Ventures has most of the cash that we need, and EA Ventures has some also, as do Jaan Tallinn, Sam Harris, edit: Matt Wage and others. Most of the people who are clever enough to want to make epic charities are also clever enough to know that they can have a more secure and conventional life elsewhere. This can be solved by just starting epic charities anyway, and by accumulating more funding to push more marginal individuals to do the same.
What charity would you start?
undefined @ 2015-05-20T04:52 (+3)
I think there are plausibly contrary explanations for some of these observations.
Yeah, I agree; I think they're suggestive rather than definitive.
and by accumulating more funding to push more marginal individuals to do the same.
Sorry--didn't you previously say that you agreed marginal people should be focusing less on accumulating more funding? I think I'm missing a link somewhere here.
What charity would you start?
Good question! I suspect the fact that this is much less well-defined than "which org would you donate to" is one of the psychological factors in favor of EtG :P
undefined @ 2015-05-20T14:56 (+2)
and by accumulating more funding to push more marginal individuals to do the same.
Sorry--didn't you previously say that you agreed marginal people should be focusing less on accumulating more funding? I think I'm missing a link somewhere here.
EA charities seem sufficiently talent constrained at the moment that I think some organisations will want to take a combination of two different measures: increasing salaries and encouraging people to move across from ETG (or not enter ETG in the first place).
undefined @ 2015-05-20T00:45 (+3)
To avoid confusing people: my own annual contributions to charity are modest.
undefined @ 2015-05-19T21:13 (+3)
Sounds reasonable - and if successful will come back to a funding constraint.
You want to put your most expensive resource at the bottleneck as an efficiency heuristic.
Do people think the bottleneck is ideas, execution, or funding - or the infrastructure needed to facilitate one or more of those?
It seems a shame that such intelligent and amazing people in the EA movement are, on the whole, putting their most productive and creative years (25-35?) into EtG rather than building and delivering practical solutions to improve the world in the best way possible outside of AI/x-risk and movement building - I think there is a lot of learning value in that kind of work!
EA groups specialised around key promising interest areas such as healthcare, AI, governance, animal welfare, poverty and development etc. that can learn together, network efficiently, keep track of projects, prioritise collective resources etc. might be a way forward. AI/x-risk and perhaps animal welfare appear from the outside to be more developed along these lines than the others? What are people's thoughts?
undefined @ 2015-05-28T02:28 (+2)
The answer to this question depends somewhat on your focus area, but my experience so far has been that almost all the organizations I work with could use more money (including ACE, Animal Ethics, most Swiss EA projects, MIRI, FHI, and most object-level charities except those that get saturated by Good Ventures). Many of these groups need money more than talent right now.
I also think the people who can be hired for an opportunity cost of ~$50K aren't 4 times less talented than those who can be hired for an opportunity cost of $200K. This belief is partly based on what I've seen at current charities and partly based on the premise that EAs aren't as special as they can sometimes seem. When you do nonprofit startups, most of the time goes toward ordinary tasks that lots of people could do.
I enjoy working on nonprofit stuff, so I don't find EtG psychologically more attractive, but after a lot of thinking about this question, I've tentatively concluded that I can make a bigger impact EtGing. I think this would be true even if I could only make ~$150K/year, which is too low as an estimate of long-term future earnings.