Why is Apart Research suddenly in dire need of funding?

By Eevee🔹 @ 2025-05-28T07:43 (+97)

Apart Research recently started an ambitious fundraising campaign—they say that the funding is urgently needed to keep the lights on. According to their Manifund post, they're trying to raise $954,800 (almost $1 million!) in the next 21 days, which is equivalent to about 12 months' runway:

Caveat: the post also says that the budget should be "scale[d] down accordingly", presumably based on the amount of money they raise. For instance, if they raise $500,000, that's enough to last 6 months.

Apart has said that they might be forced to shut down (or at least drastically downsize their team) if they can't raise the funds:

Insufficient funding: Without adequate resources, we would be forced to disband a high-functioning team built over 2.5 years, losing a proven talent pipeline at a critical time for AI safety and canceling valuable talent capital and research projects.

Mitigation: We have already diversified our funding drastically, including partnerships and sponsorships.

I donated about $300 to them last December because I think their hackathons serve as an important entry point for people interested in getting into AI safety, and the AI safety field benefits from a robust talent pipeline. So it'd be a shame if they had to shut down. (I participated in a hackathon myself and was inspired to donate by this Forum post—donating to a community project that has increased your own impact sends a strong signal that the program is impactful.)

But I can't help but wonder why Apart Research is in this situation in the first place. Did they abruptly lose grants from major funders? If so, why? And why do they not seem to have a cash reserve? They've hired 8 FTEs—did they scale up too fast without building up enough savings to sustain it? I'm not criticizing, I just think we'd all benefit from more context so that donors can determine the best ways to keep Apart going.

Also, they've only raised $346 on Manifund so far, and they need to reach $10k before any of the money gets paid out to them, so why is the threshold so high? They could be raising a lot more money through other channels, like Every.org. The fundraiser seems to be moving pretty slowly, even though campaigns on Manifund often see large donations from regrantors, to the tune of $1–2k.


Esben Kran @ 2025-05-31T04:07 (+47)

I was extremely grateful for your donation, and the impact Apart has had on individuals' personal stories is what makes all this work worth it! So we really, really appreciate this.

This is an in-depth answer to your questions (reasons behind this campaign, why the timeline, what we do, how this situation relates to the general AIS funding ecosystem, what mistakes we made, and a short overview of the impact report and newer numbers).

Read the campaign page and the Apart Research Impact Report V1.0 before you continue.

This campaign

We're extremely grateful for the response we've received to this campaign: the many personal comments and donations further down on the /donate page and on Manifund. This is really what makes it exciting to be at Apart!

We have one of the more diverse funding pools of organizations in our position,[1] but org-wide community-building funding mostly depends on OpenPhil, LTFF, and SFF. This situation comes after a rejection from LTFF that we had treated as low-risk, since we had outperformed our last grant with them; unfortunately, we misjudged the extent to which LTFF itself was underfunded. Additionally, OpenPhil has been a smaller part of our funding than we would have hoped.

The last-minute nature of this campaign is largely a consequence of delayed response timelines (something that is pretty normal in the field; see further down for elaboration), along with somewhat limited engagement from OpenPhil's GCR team on our grants throughout our lifetime.

I'll also mention that non-profits generally spend an immense amount of time on fundraising campaigns. What we feel is important to share transparently as part of this campaign are all the parts of our work that otherwise get overlooked in a "max 200 words" grant application focused on field-building.

We've been surprised at how important anecdotes actually are and have prioritized them too little in our applications; everyone has shared their personal stories now, and they are included across the campaign as a result. Despite this, Apart was still the second highest-rated grant for our PI at LTFF, and they simply had to reject it due to its size, since they were themselves underfunded.

With OpenPhil, I think we've been somewhat unlucky with the depth of grant reviews and feedback from their side, and with missing the opportunity to respond to their uncertainties. Despite receiving some top-tier grants from SFF and LTFF in 2024, an organization like ours is dependent on larger OP grants unless we have successful long-term public campaigning similar to the Wikimedia Foundation's, or direct interfacing with high-net-worth individuals, something every specialized non-profit outside AI safety needs as it scales.

Hope that clarifies things a bit! We've consistently pivoted towards more and more impactful areas and I think Apart is now harvesting the impact of growing as an independent research lab. Our latest work is very exciting, the research is getting featured across media, backend software is in use now, and governments are calling us for help, so it's unfortunate to find the organization in this situation.

For others raising funds, what Apart could have done to improve the situation is:

  1. Stay more informed about funding shortfalls for specific foundations we rely on. This was especially important for the current situation.
  2. Rely less on expectations that larger funding pools would follow AI capabilities advancement and the urgency of AI safety.
  3. Related to the above point, avoid scaling based on the year-over-year trend line of funding Apart received from 2022 through 2024 (conditional projected growth capacity), since this trend didn't continue in 2025, and the mid-2024 scale may have been better to stay at for longer (though the speed of AI development leads to somewhat different conclusions, and we didn't grow more than 50% from last year's budget).
  4. Be more present in SF and London to interface face-to-face with funders and address their uncertainties (these usually come back after the grant process, and we often have relatively good answers to them but don't get the chance to provide them before a decision is made).
  5. Communicate more of our impact and our work throughout the EA and AIS community, beyond large academic participation and our communication to direct beneficiaries of our work, participants, newsletter subscribers, partners, etc. This is already under way but I would guess there's a six month lead time or so on this.
  6. Engage other visionary donors outside AIS to join in on the funding rounds, potentially under other attractive narratives (something that usually takes two years to activate and that I'm certain will be possible by 2026).
  7. Rely less on previous results as evidence for forecasts of grant-making decisions.

With that said, I think we've acted as well as we could, and this campaign is part of our contingency plans, so here we are! We could've launched it earlier but that is a minor point. I'm confident the team will pull through, but I'll be the first to say that the situation could be better. 

The team and I believe a lot in the work that happens at Apart, and I'm happy that it seems our researchers and participants agree with us - we could of course solve it all by pivoting to something less impactful, but that would be silly.

So overall, this is a relatively normal situation for non-profits outside AI safety and we're just in a place where the potential funders for AI safety community-building are few and far between. This is not a good situation for Apart, but it is what it is!

Some notes on what Apart does

Since this is a longer answer, it may also be worth clarifying a few misunderstandings that sometimes come up around our work, due to what seems like an early grounding of the 'Apart narrative' in the community that we haven't done enough to update:

Funding ecosystem

The situation for Apart speaks to broader points about the AI safety funding ecosystem that I'll leave here for others who may be curious about how an established organization like Apart may run a public campaign with such a short runway:

Appendix: Apart Research Impact Report V1.0

Since you've made it this far...

Our impact report makes Apart's impact even clearer and it's definitely worth a read!

https://apartresearch.com/impact/report 

If you'd like to hear about the personal impact we've had on the people who've been part of our journey, I highly recommend checking out the following:

Since V1.0, we've also fine-tuned and re-run parts of our impact evaluation pipeline, and here are a few more numbers:

Citations of our research from (excluding universities): Centre for the Governance of AI, IBM Research, Salesforce, Institute for AI Policy and Strategy, Anthropic, Centre for the Study of Existential Risk, UK Health Security Agency, EleutherAI, Stability AI, Meta AI Research, Google Research, Alibaba, Tencent, Amazon, Allen Institute for AI, Institute for AI in Medicine, Chinese Academy of Sciences, Baidu Inc., Indian Institute of Technology, State Key Laboratory of General Artificial Intelligence, Thomson-Reuters Foundational Research, Cisco Research, Oncodesign Precision Medicine, Institute for Infocomm Research, Vector Institute, Canadian Institute for Advanced Research, Meta AI, Google DeepMind, Microsoft Research NYC, MIT CSAIL, ALTA Institute, SERI, AI Quality & Testing Hub, Hessian Center for Artificial Intelligence, National Research Center for Applied Cybersecurity ATHENE, Far AI, Max Planck Institute for Intelligent Systems, Institute for Artificial Intelligence and Fundamental Interactions, National Biomarker Centre, Idiap Research Institute, Microsoft Research India, Ant Group, Alibaba Group, OpenAI, Adobe Research, Microsoft Research Asia, Space Telescope Science Institute, Meta GenAI, Cynch.ai, AE Studio, Language Technologies Institute, Ubisoft, Flowers TEAM, Robot Cognition Laboratory, Lossfunk, Munich Center for Machine Learning, Center for Information and Language Processing, SĂŁo Paulo Research Foundation, National Council for Scientific and Technological Development

Job placements: Cofounder of stealth AI safety startup @ $20M valuation, METR, GDM, Anthropic, Martian (four placements as research leads of new mech-int team plus staff), Cooperative AI Foundation, Gray Swan AI, HiddenLayer, Successif, AIforAnimals, Sentient Foundation, Leap Labs, EleutherAI, Suav Tech, Aintelope, AIS Cape Town, Human Intelligence, among others

Program placements: MATS, ERA Cambridge, ARENA, LASR, AISC, Pivotal Research, Constellation, among others

  1. ^

    In terms of our funding pool diversity, it spans from our Lambda Labs sponsorship of $5k compute / team to tens of sponsorships from partners for research events, many large-scale (restricted) research grants, paid research collaborations, and quite a few $90k-$400k general org support grants from every funder you know and love.

MathiasKB🔸 @ 2025-05-28T10:38 (+37)

(conflict of interest note, I'm pretty good friends with Apart's founder)

One thing I really like about Apart is how meritocratic it is. Anyone can sign up for a hackathon, and if your project is great, win a prize. They then help prize winners with turning their project into publishable research. This year two prize winners even ended up presenting their work orally at ICLR (!!).

Nobody cares what school you went to. Nobody is looking at your gender, age, or resume. What matters is the quality of your work, and nothing but.

And it turns out that when you look just at quality of the work, you'll find that it comes from all over the world - often countries that are otherwise underrepresented in the EA and AI safety community. I think that is really really cool.

I think Apart could do a much better job of communicating just how different their approach is from the vast majority of AI upskilling programmes, which rely heavily on evaluating your credentials to decide if you're worthy of doing serious research.

I don't know anything about the cost per participant or whether that justifies funding Apart over other AI safety projects, but there is something very beautiful and special about Apart's approach to me.

Chris Leong @ 2025-05-28T11:28 (+35)

Sad to see this. I agree that Apart adds something distinctive to the ecosystem (an extremely easy entry point), so it would be a shame to see it disappear.

I wonder whether this is because there's so much competition for competitive fellowships like MATS (and perhaps even for some of the unpaid AI safety opportunities) that funders feel less need to fund projects earlier in the pipeline?

mjkerrison🔸️ @ 2025-05-29T10:01 (+12)

I also have this question. As someone looking to apply for funding to continue an org in the space, all of this uncertainty is tough to grapple with.

Ozzie Gooen @ 2025-05-30T03:27 (+9)

The situation these days seems pretty brutal. I'm really hoping that some other large funders enter this space soon, the situation now feels very funding-constrained. 

Jeffrey Kursonis @ 2025-05-30T01:27 (+5)

I'm in a cause area most of the big funds are not yet on board with (6x only, not yet at GiveWell's 10x bar)...so we have to go out to the traditional philanthropy world to find funding. That can have many good benefits for both sides. 

Holly Elmore ⏸️ 🔸 @ 2025-05-28T19:03 (+5)

Is it for the same reason CAIP appears to have gone bankrupt? That a “major funder” (read: Open Phil) pulled support and that triggered a cascade of funders pulling out?

EDIT: This is my unconfirmed understanding of the situation.

Neel Nanda @ 2025-05-30T22:29 (+20)

That's not my understanding of what happened with CAIP. There are various funders who are very happy to disagree with OpenPhil who I know considered giving to CAIP and decided against it. My understanding is that it's based on actual reasons, not just an information cascade from OpenPhil.

No idea about Apart though

akash 🔸 @ 2025-05-28T19:26 (+7)

Assuming this is true, why would OP pull funding? I feel Apart's work strongly aligns with OP's goals. The only reason I can imagine is that they want to move money away from the early career talent building pipeline to more mid/late-stage opportunities. 

Esben Kran @ 2025-05-31T04:12 (+22)

OP has not pulled any funding. They've provided a few smaller grants over the last few years that have been pivotal to Apart's journey, and I'm extremely grateful for this. OP has been a minority of Apart's funding, and the lack of support for our growth has been somewhat hard for us to decipher. I'm generally happy to chat more with OP staff about this, if anyone wishes to reach out, of course.

akash 🔸 @ 2025-05-31T04:24 (+8)

EAG London would be the perfect place to talk about this with OP folks. Either way, all the best fundraising!