Ideas EAIF is excited to receive applications for
By Jamie_Harris @ 2024-12-11T11:51 (+91)
The EA Infrastructure Fund isn’t currently funding-constrained. Hooray! This means that if you submit a strong application that fits within our “principles-first” effective altruism scope soon, we’d be excited to fund it, and won’t be constrained by a lack of money. We’re open to considering a range of grant sizes, including grants over $500,000 and below $10,000.[1]
In part, we’re writing this post because we spoke to a few people with projects we’d be interested in funding who didn’t know that they could apply to EAIF. If you’re unsure, feel free to ask me questions or just apply!
The rest of this post gives you some tips and ideas for how you could apply, including ideas we’re excited to receive applications for. I (Jamie) wrote this post relatively quickly; EAIF staff might make more such posts if people find them helpful.
🔍 What’s in scope?
- Research that aids prioritisation across different cause areas.
- Projects that build communities focused on impartial, scope-sensitive and ambitious altruism.
- Infrastructure, especially epistemic infrastructure, to support these aims.
- (More in this post and on our website, though the site needs a bit of a revamp. And please err on the side of applying: you don’t need to be fully ‘principles-first’; that’s our strategy, not a requirement for applicants.)
💪 What makes an application strong?
- A great idea: a promising theory of change and expected cost-effectiveness.[2] (For a toy illustration of the latter, see the sketch after this list.)
- Evidence suggesting you are likely to execute well on the idea.
- (I’m simplifying a bit of course. See also Michael Aird’s tips here.)
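On the first part, here’s the flavour of back-of-envelope reasoning we have in mind: a minimal Monte Carlo Fermi sketch in Python for a hypothetical training programme, where every input range is made up purely for illustration.

```python
import random

def fermi_cost_effectiveness(n_samples: int = 100_000) -> None:
    """Toy Monte Carlo Fermi estimate for a hypothetical training
    programme. Every input range below is made up for illustration."""
    ratios = []
    for _ in range(n_samples):
        participants = random.uniform(50, 150)            # people trained per year
        completion = random.uniform(0.6, 0.9)             # fraction who finish
        shifted = random.uniform(0.02, 0.15)              # fraction whose decisions durably improve
        value_per_shift = random.uniform(5_000, 50_000)   # $-equivalent value per improvement
        cost = random.uniform(40_000, 120_000)            # annual programme cost
        benefit = participants * completion * shifted * value_per_shift
        ratios.append(benefit / cost)
    ratios.sort()
    print(f"median benefit/cost ratio: {ratios[n_samples // 2]:.2f}")
    print(f"10th-90th percentile: {ratios[n_samples // 10]:.2f} "
          f"to {ratios[9 * n_samples // 10]:.2f}")

fermi_cost_effectiveness()
```

The point isn’t precision; it’s demonstrating that you’ve identified the key parameters driving your project’s impact and thought about how uncertain each one is.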
The second part is straightforward enough; if your project has been ongoing for a while, we’d like to understand the results you’ve achieved so far. If it’s brand new, or you’re pivoting a lot, we’re interested in evidence about your broader achievements and skills that would set you up well to do a good job.
You might already have a great idea. If so, nice one! Please ignore the rest of this post and crack on with an application. If not, I’ll now highlight a few specific topics that we’re especially interested in receiving applications for at the moment.[3]
💡 Consider applying for projects in these areas
Epistemics and integrity
What’s the problem?
- EA is vulnerable to groupthink, echo chambers, and excessive deference to authority.
- A bunch of big EA mistakes and failures were perhaps (partly) due to these things.
- A lot of external criticism of EA stems from these issues.
What could be done?
- Training programmes and fellowships that help individual participants develop good epistemic habits or integrity directly (e.g. Scout Mindset, Fermi estimates, developing virtues), indirectly (e.g. helping them form their own views on cause prioritisation), or as part of a broader package.
- Training, tools, or platforms for forecasting and prediction markets (see the toy scoring sketch after this list).
- Researching and creating tools that aid structured and informed decision-making.
- Developing filtering and vetting mechanisms to weed out applicants with low integrity or poor epistemics.
- New structures or incentives at the community level: integrating external feedback, incentivising red-teaming, or creating better discussion platforms.
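To make the forecasting and decision-making bullets above a bit more concrete, calibration scoring is one primitive such tools are often built on. Here’s a minimal sketch in Python using hypothetical forecasts (the Brier score is just the mean squared error of probability forecasts against binary outcomes; lower is better):

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between probability forecasts and binary outcomes.
    0 is perfect; always guessing 50% scores 0.25."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical forecaster: probabilities assigned to events that did (1) or didn't (0) occur.
forecasts = [0.9, 0.7, 0.2, 0.6, 0.1]
outcomes = [1, 1, 0, 0, 0]
print(f"Brier score: {brier_score(forecasts, outcomes):.3f}")  # 0.102 on this toy data
```

A real platform would need much more (question management, aggregation, incentives), but tracking a proper scoring rule like this is the core feedback loop that trains calibration.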
What have we funded recently?
- Elizabeth Van Nostrand and Timothy Telleen-Lawton recorded a discussion about why Elizabeth left EA and why Timothy is seeking a ‘renaissance’ of EA instead. They’re turning this into a broader podcast.
- EA Netherlands is working with Shoshannah Tekofsky to develop 5-10 unique rationality workshops to be presented to 100-240 Dutch EAs over a 12-month period, aiming to improve their epistemic skills and decision-making processes.
- André Ferretti launched the “Retrocaster” tool on Clearer Thinking, to enhance users’ forecasting skills. By obscuring data from sources like Our World in Data, Retrocaster invites users to forecast hidden trends.
Harri Besceli, another EAIF Fund Manager, wrote more thoughts on EA epistemics projects here. For-profit ideas are beyond EAIF’s scope, but if you have one in this space, feel free to contact me.[4]
EA brand and reputation
What’s the problem?
- Since the FTX collapse, public perception of EA has become significantly worse.
- This makes it harder to grow and do community outreach.
- Organisations and individuals are less willing to associate with EA; this reduces the benefits it provides and further worsens its reputation.
What could be done?
- Good PR. There’s a whole massive industry out there focused on exactly this, and presumably a bunch of it works. Not all PR work is dishonest.
- Empirical testing of different messages and frames to see what resonates best with different target audiences.
- More/better comms and marketing generally for promising organisations.
- Inwards-focusing interventions that help create a healthier self-identity, culture, and vision, or that systematically boost morale (beyond one-off celebratory posts).
- Support for high-quality journalism on relevant topics.
What have we funded recently?
- Yi-Yang Chua is exploring eight community health projects. Some relate to navigating EA identity; others might have knock-on effects for EA’s reputation by mitigating harms and avoiding scandals.
- Honestly not much. Please send us requests!
I’ve focused on addressing the challenges of a poor brand and reputation, but the ideal would be to fix the underlying issues that cause harm and, in turn, poor reputation. Proposals relating to those are of course welcome (e.g. on epistemics and integrity).
Funding diversification
What’s the problem?
- Many promising projects are bottlenecked by funding, from AI safety to animal welfare.
- Projects are often dependent on funding from Open Philanthropy, which makes their situation unstable and incentivises deference to OP’s views.
- There’s less funding in EA than there used to be (especially due to the FTX crash) or could be (especially given historical reliance on OP and FTX).
What could be done?
- Projects focused on broadly raising funding from outside the EA community.
- More targeted fundraising, like projects focusing specifically on high-net-worth donors, local donors in priority areas (e.g. India), or specific professions and interest groups (e.g. software engineers, alt protein startup founders, AI lab staff).
- Regranting projects.
- Projects focused on democratising decision-making within the EA community.
- Philanthropic advising, grantmaking, or talent pipelines to help address bottlenecks here.
What have we funded recently?
- Giv Effektivt hired its first full-time staff member to reach high-net-worth individuals and improve operations, media outreach, and SEO.
- EA Poland grew and promoted a platform for cost-effective donations to address global poverty, factory farming, and climate change.
- But we’ve mostly only received applications for broad, national effective giving initiatives, and there are so many more opportunities in this space!
Areas deprioritised by Good Ventures
Good Ventures announced that it would stop supporting certain sub-causes via Open Philanthropy. We expect that programmes focused on rationality or on supporting under-18s (aka ‘high school outreach’) are the affected areas most obviously in scope for EAIF; you can check this post for other possibilities.
We expect that Good Ventures’ withdrawal here leaves at least some promising projects underfunded, and we’d be excited to help fill (some of) the gap.
✨ This is by no means an exhaustive list!
There are lots of problems in effective altruism, and lots of bottlenecks faced by projects making use of EA principles; if you have noticed an issue, let us know about how you can help fix it by submitting an application.
For instance, if you’ve been kicking around for a few years — you’ve built up some solid career capital in top orgs, and have a rich understanding of the EA community, warts and all — then there’s a good chance we’d be excited to fund you to make progress on tackling an issue you’ve identified.[5]
And of course, other people have already done some thinking and suggested some ideas. Here are a few longlists of potential projects, if you want to scour for options[6]:
- Here’s a quick list we made earlier with different tabs from different EAIF staff (2024)
- Rethink Priorities’ General Longtermism Team’s longlist (2023)
- Fin Moorhouse’s list of “EA Projects I'd Like to See” (2022)
- FTX Future Fund’s list (2022) and crowdsourced suggestions (731 comments!)
- Charity Entrepreneurship’s “survey of 40 EAs” on the most promising areas to start new EA meta charities (2020)
- Probably a bunch of other posts in the EA Forum tags for “Opportunities to take action”, “Research agendas, questions, and project lists”, or “Community projects”
❓ Ask me almost anything
I’m happy to do an informal ‘ask me anything’ here — I encourage you to ask away in the comments section if there’s anything you’re unsure about or that is holding you back, and I expect to be able to respond to most/all of them. You can also email me (jamie@effectivealtruismfunds.org) or use my anonymous advice form, but posting your comment here is a public good if you’re up for it, since others might have the same question.
But if you already know everything you need to know…
🚀 Apply
See also: “Don’t think, just apply! (usually)”. By the way, EAIF’s turnaround times are much better than they used to be; typically 6 weeks or less.
The application form is here. Thanks!
- ^
We don’t have a hard upper bound at the moment. Historically, most of our grants have been between about $10,000 and $200,000. We’d be a bit hesitant evaluating something much higher than $500,000, but we’re open to it. If it were over $1m, we’d likely encourage you to apply elsewhere, e.g. Open Philanthropy.
- ^
Scalability and high upside value can make an application more promising but are not requirements.
- ^
The first three of these are inspired by a few calls Harri Besceli (another EAIF Fund Manager) carried out with people who work in EA community building or who have been engaged in the EA community for a long time. But the bullet points here are my own take; this isn’t a writeup of the findings, so to speak. I’m not trying to ‘make the case’ for any of these areas’ importance here; it’s fine if you disagree. I’m just flagging that we’d be excited for applications in these areas.
- ^
I also work at Polaris Ventures, which makes investments and might be interested.
- ^
Of course, this isn’t guaranteed; it still needs to be an in-scope, strong application. And we sometimes receive strong applications from people who are newer to effective altruism, too.
- ^
Caveats:
- With the exception of mine and CE’s, these lists all contain ideas that wouldn’t be in-scope for EAIF.
- Many of these lists were put together quickly or had a low bar for inclusion. Some are mostly outdated, some may focus on worldviews you disagree with, etc. You shouldn’t treat an idea being mentioned on one of these lists as a strong vote of confidence from anyone that it’s actually a good use of time. These are usually just ideas.
- Even if it is a great idea, you still need to have relevant skills and track record to be able to put in a 💪 strong application.
toonalfrink @ 2024-12-11T17:33 (+32)
Re "epistemics and integrity" - I'm glad to see this problem being described. It's also why I left (angrily!) a few years ago, but I don't think you're really getting to the core of the issue. Let me try to point at a few things
- centralized control and disbursement of funds, with a lot of discretionary power and a very high and unpredictable bar, gives me no incentive to pursue what I think is best, and all the incentive to just stick to the popular narrative. Indeed groupthink. Except training people not to groupthink isn't going to change their (existential!) incentive to groupthink. People's careers are on the line, there are only a few opportunities for funding, no guarantee of continued funding after the first round, and no clear way to pivot into a safer option except to start a new career somewhere your heart does not want to be, having thrown years away.
- lack of respect for "normies". Many EAs seemingly can't stand interacting with non-EAs. I've seen EA meditation, EA bouldering, EA clubbing, EA whatever. Orgs seem to want everyone and the janitor to be "aligned". Everyone's dating each other. It seems that we're even afraid of them. I will never forget that just a week before I arrived at an org I was to be the manager of, they turned away an Economist reporter at their door...
- perhaps in part due to the above, massive hubris. I don't think we realise how much we don't know. We started off with a few slam dunks (yeah wow, 100x more impact than average) and now we seem to think we are better at everything. Clearly the ability to discern good charities does not transfer to the ability to do good management. The truth is: we are attempting something that we don't even know is possible at all. Of course we're all terrified! But where is the humility that should go along with that?
Neel Nanda @ 2024-12-12T09:02 (+12)
It seems that we're even afraid of them. I will never forget that just a week before I arrived at an org I was to be the manager of, they turned away an Economist reporter at their door...
Fwiw, I think being afraid of journalists is extremely healthy and correct, unless you really know what you're doing or have very good reason to believe they're friendly. The Economist is probably better than most, but I think being wary is still very reasonable.
Jamie_Harris @ 2024-12-11T18:18 (+10)
Thanks! Sorry to hear the epistemics stuff was so frustrating for you and caused you to leave EA.
Yes, it's very plausible that the example interventions don't really get to the core of the issue -- I didn't spend long creating those, and they're meant more as examples to spark ideas than as confident recommendations on the best interventions or some such. Perhaps I should have flagged this in the post.
Re "centralized control and disbursion of funds": I agree that my example ideas in the epistemics section wouldn't help with this much. Would the "funding diversification" suggestions below help here?
And if you're up for elaborating, I'm intrigued why you don't think the sorts of "What could be done?" suggestions would help with the other two problems you highlight. (They're not optimised for addressing those two specific concerns, of course, but insofar as they all relate back to bad/weird epistemic practices, things like epistemics training programmes might help?) No worries if you don't want to or don't have time though.
Thanks again!
toonalfrink @ 2024-12-11T19:31 (+9)
Yes, I imagine funding diversification would help, though I'm not sure if it would go far enough to make EA a good career bet.
My own solution is to work myself up to the point where I'm financially independent from EA, so my agency is not compromised by someone else's model of what works.
And you're right that better epistemics might help address the other two problems, but only insofar as the interventions are targeted at "s1 epistemics", i.e. the stuff that doesn't necessarily follow from conscious deliberation. Most of the techniques in this category would fall under the banner of spirituality (the pragmatic type without metaphysics). This is something that the rationalist project has not addressed sufficiently. I think there's a lot of unexplored potential there.
Jacob Watts🔸 @ 2024-12-15T13:44 (+5)
I've seen EA meditation, EA bouldering, EA clubbing, EA whatever. Orgs seem to want everyone and the janitor to be "aligned". Everyone's dating each other. It seems that we're even afraid of them.
I am not in the Bay Area or London, so I guess I'm maybe not personally familiar with the full extent of what you're describing, but there are elements of this that sound mostly positive to me.
Like, of course, it is possible to overemphasize the importance of culture fit and mission alignment when making hiring decisions. It seems like a balance that depends on the circumstances, and I don't have much to say there.
As far as the extensive EA fraternizing goes, that actually seems mostly good. Like, to the extent that EA is a "community", it doesn't seem surprising or bad that people are drawn to hang out. Church groups do that sort of thing all the time for example. People often like hanging out with others with shared values, interests, experiences, outlook, and cultural touchstones. Granted, there are healthy and unhealthy forms of this.
I'm sure there's potential for things to get inappropriate and for inappropriate power dynamics to occur when it comes to ambiguous overlap between professional contexts, personal relationships, and shared social circles. At their best though, social communities can provide people a lot of value and support.
Why is "EA clubbing" a bad thing?
Davidmanheim @ 2024-12-12T22:37 (+2)
- I strongly agree.
- It seems that living in the Bay Area as an EA has a huge impact, and the dynamics are healthier elsewhere. (The fact that a higher concentration of EAs is worse, of course, is at least indicative of a big problem.)
Rockwell @ 2024-12-11T23:43 (+19)
Thanks for the post! Quick flag for EAIF and EA Funds in general (@calebp?) that I would find it helpful to have the team page of the website up to date, and possibly for those who are comfortable sharing contact information, as Jamie did here, to have it listed in one place.
I actively follow EA Funds content and have been confused many times over the years about who is involved in what capacity and how those who are comfortable with it can be contacted.
Jamie_Harris @ 2024-12-12T10:46 (+4)
Seems fair. I do work there, I promise this post isn't an elaborate scheme to falsely bulk out my CV.
calebp @ 2024-12-12T11:07 (+2)
Thanks for the flag, we have had some turnover recently - will ask our dev to update the site!
OllieBase @ 2024-12-12T09:12 (+14)
- EA is vulnerable to groupthink, echo chambers, and excessive deference to authority.
- A bunch of big EA mistakes and failures were perhaps (partly) due to these things.
- A lot of external criticism of EA stems from these issues.
I'm a bit skeptical that funding small projects that try to tackle this is really stronger than other community-building work on the margin. Is there an example of a small project focused on epistemics that had a really meaningful impact? Perhaps by steering an important decision or helping someone (re)consider pursuing high-impact work?
I'm worried there's not a strong track record here. Maybe you want to do some exploratory funding here, but I'm still interested in what you think the outcomes might be.
Jamie_Harris @ 2024-12-12T10:45 (+6)
Mm, they don't necessarily need to be small! (Of course, big projects often start small, and our funding is more likely to look like early/seed funding in these instances.) E.g. I'm thinking of LessWrong or something like that. A concrete example of a smaller project would be ESPR/SPARC, which have a substantial (albeit not sole) focus on epistemics and rationality, and which have some good evidence of positive effects, e.g. in Open Phil's longtermism survey.
But I do think the impacts might be more diffuse than other grants. E.g. we won't necessarily be able to count participants, look at quality, and compare to other programmes we've funded.
Some of the sorts of outcomes I have in mind are just things like altered cause prioritisation, different projects getting funded, generally better decision-making.
I expect we would in practice judge whether these seemed on track to be useful by a combination of (1) case studies/stories of specific users and the changes they made (2) statistics about usage.
(I do like your questions/pushback though; it's making me realise that this is all a bit vague and maybe when push comes to shove with certain applications that fit into this category, I could end up confused about the theory of change and not wanting to fund.)
OllieBase @ 2024-12-12T14:39 (+2)
Thanks!
I don't know much about LW/ESPR/SPARC but I suspect a lot of their impact flows through convincing people of important ideas and/or the social aspect rather than their impact on community epistemics/integrity?
Some of the sorts of outcomes I have in mind are just things like altered cause prioritisation, different projects getting funded, generally better decision-making.
Similarly, if the goal is to help people think about cause prioritisation, I think fairly standard EA retreats / fellowships are quite good at this? I'm not sure we need some intermediary step like "improve community epistemics".
Appreciate you responding and tracking this concern though!
Jamie_Harris @ 2024-12-12T15:46 (+4)
I think fairly standard EA retreats / fellowships are quite good at this
Maybe. To take cause prio as an example, my impression is that the framing is often a bit more like: 'here are lots of cause areas EAs think are high impact! Also, cause prioritisation might be v important.' (That's basically how I interpret the vibe and emphasis of the EA Handbook / EAVP.) Not so much 'cause prio is really important. Let's actually try and do that and think carefully about how to do this well, without just deferring to existing people's views.'
So there's a direct version of the above that I'd be excited about.
Although perhaps contradictorily I'm also envisaging something even more indirect than the retreats/fellowships you mention as a possibility, where the impact comes through generally developing skills that enable people to be top contributors to EA thinking, top cause areas, etc.
I don't know much about LW/ESPR/SPARC but I suspect a lot of their impact flows through convincing people of important ideas and/or the social aspect rather than their impact on community epistemics/integrity?
Yeah I think this is part of it. But I also think that they help by getting people to think carefully and arrive at sensible and better processes/opinions.
Kyle Smith @ 2024-12-11T19:15 (+9)
I think it's great that EAIF is not funding constrained.
Here's a random idea I had recently if anyone is interested and has the time:
An org that organizes a common application for nonprofits applying to foundations. There is enormous economic inefficiency and inequality in matching private foundation (PF) grants to grantees. PF application processes are extremely opaque and burdensome. Attempts to make common applications have largely been unsuccessful, I believe mostly because they tend to be for a specific geographic region. Instead, I think it would be interesting to create different common applications by cause area. A key part of the common application could be incorporating outcome reporting specific to each cause area, which I believe would push PFs to make more impact-focused grants, making EAs happy.
Brad West🔸 @ 2024-12-12T18:45 (+4)
I think this is an excellent idea.
Orgs or "proto-orgs" in their early stages are often in a catch-22. They don't have the time or expertise (because they don't have full time staff) to develop a strong grantwriting or other fundraising operations, which could be enabled by startup funds. An org that was familiar with the funding landscape, could familiarize itself with new orgs, and help it secure startup funds could help resolve the catch-22 that orgs find themselves at step 0.
toonalfrink @ 2024-12-11T19:34 (+4)
Excellent idea. This would also incentivize writing an application that is generally convincing instead of trying to hack the preferences of the specific fund
gergo @ 2024-12-18T08:48 (+4)
Thanks for the post, Jamie!
Given the call for the Ask me Anything, I was wondering if you would be able to share additional context on the following from your application form:
Important Note: The EA Infrastructure Fund and the Long-Term Future Fund are currently unable to make grants with an end date after August 31st 2025, and any applications to these funds must have a grant period which ends on or before this date. We are working on a solution to this, and hope to be able to remove this restriction as soon as possible. If you were intending to apply for a grant past this date, you are welcome to apply for funding up until August 31st 2025, and once we’re able to make grants past this date again you will be able to apply again for the remaining period.
I'm specifically interested in asking about:
once we’re able to make grants past this date again you will be able to apply again for the remaining period
Is there a risk that EAIF will have to pause grantmaking after August if the issue is not solved, or do you expect this not to be a problem by then?
Jamie_Harris @ 2024-12-18T19:20 (+4)
I don't know all the details since it's a governance/operational thing but I don't think we expect this to be an issue, thankfully!
Daniel Abiliba @ 2024-12-15T10:38 (+3)
Thanks for this information, Jamie. Could you elaborate more on the regranting opportunities? A few advocates and I are becoming more convinced that a regranting program for farm animal welfare research in Africa is crucial for the growth of effective animal advocacy in Africa (currently sparse; fewer than ~5 reports/papers are published annually). We have a few ideas about how to go about this but would be open to discussing it with a grantmaker.
Jamie_Harris @ 2024-12-16T15:24 (+2)
Hi Daniel! I don't have a lot to elaborate on here; I haven't really thought much about the practicalities, I was just flagging that proposals and ideas relating to regranting seem like a plausible way to help with funding diversification.
Also, just FYI on the specific intervention idea (which could be promising): it would fall within the remit of EA Funds' Animal Welfare Fund (which I do not work at), not the Infrastructure Fund (which I work at). I didn't check with the fund managers there whether they endorse what I've written here.
Jacob Watts🔸 @ 2024-12-16T04:35 (+1)
I have a few questions about the space of EA communities.
You mention
Projects that build communities focused on impartial, scope-sensitive and ambitious altruism.
as in scope. I am curious what existing examples you have of communities that place emphasis on these values aside from the core "EA" brand?
I know that GWWC kind of exists as its own community independent of "EA" to ~some extent, but honestly I am unclear to what extent. Also, I guess LessWrong and the broader rationality-cinematic-universe might kind of fit here too, but realistically whenever scope-sensitive altruism is the topic of discussion on LessWrong, an EA Forum cross-post is likely. Are there any big "impartial, scope-sensitive and ambitious altruism" communities I am missing? I know there are several non-profits independently working on charity evaluation and that sort of thing, but I am not very aware of distinct "communities" per se.
Some of my motivation for asking is that I actually think there is a lot of potential when it comes to EA-esque communities that aren't officially "EA" or "EA Groups". In particular, I am personally interested in the idea of local EA-esque community groups with a more proactive focus on fellowship, loving community, social kindness/fraternity, and providing people a context for profound/meaningful experiences. Still championing many EA values (scope-sensitivity, broad moral circles, proactive ethics) and EA tools (effective giving, research-oriented and ethics-driven careers), but in the context of a group which is a shade or two more like churches, humanist associations, and the Sunday Assembly, and a shade or two less like Rotary Clubs or professional groups.
That's just one idea, but I'm really trying to ask about the broader status of EA-diaspora communities / non-canonically "EA" community groups under EAIF? I would like to more clearly understand what the canonical "stewards of the EA brand" in CEA and the EAIF have in mind for the future of EA groups and the movement as a whole? What does success look like here; what are these groups trying to be / blossom into? And to the extent that my personal vision for "the future of EA" is different, is a clear-break / diaspora the way to go?
Thanks!
Jamie_Harris @ 2024-12-16T15:29 (+2)
I didn't write that wording originally (I just copied it over from this post), so I can't speak exactly to their original thinking.
But I think the phrasing includes the EA community; it just uses the plural to avoid excluding others.
Some examples that jump to mind:
- EA
- Rationality, x-risk, s-risk, AI Safety, wild animal welfare, etc to varying degrees
- Org-specific communities, e.g. the fellows and follow-up opportunities on various fellowship programmes.
I would like to more clearly understand what the canonical "stewards of the EA brand" in CEA and the EAIF have in mind for the future of EA groups and the movement as a whole?
I think this suggests more of a sense of unity/agreement than I expect is true in practice. These are complex things and individuals have different views and ideas!
Thanks for thinking this stuff through and coming up with ideas!
VictorW @ 2024-12-14T00:11 (+1)
Would you recommend applying now or later for a project that would start in 6 months and has a 50% chance of being cancelled before starting (without any resources spent)?
Jamie_Harris @ 2024-12-15T09:50 (+2)
Based on this information alone, EAIF would likely prefer a later application (e.g. once whatever event is driving the uncertainty has passed), to avoid us wasting our time.
But I don't think this would particularly affect your chances of application success. And maybe there are good reasons to want to apply sooner?
And I wouldn't leave it too long anyway, since applications sometimes take e.g. 2 months to be approved. Usually less, and very occasionally more.