A brainstorm of meta-EA projects (MCF 2023)
By michel, MaxDalton, Sophie Thomson @ 2023-11-03T22:38 (+27)
This post is part of a sequence on Meta Coordination Forum 2023. It summarizes pre-event survey respondents’ brainstorming on projects they’d like to see.
You can read more about the pre-event survey results, the survey respondents, and the event here.
About this survey section
We solicited project proposals from Meta Coordination Forum (MCF) 2023 attendees by asking a few closely related, optional questions. These included:
- What new projects would you like to see in EA?
- What obviously important things aren’t getting done?
- What projects should have existed a year ago?
- What’s a “public good for EA” that nobody has a direct incentive to do but that would benefit lots of people?
The resulting list is not a definitive ranking of the best meta-EA projects; it's more like a brainstorm than a systematic evaluation of options.
- Respondents filled in their answers quickly and may not endorse them on reflection.
- Respondents probably disagree with each other. We never asked respondents to evaluate each other's suggestions, but we're pretty sure that if we had, it would have revealed big disagreements. (There was significant disagreement on most other survey questions!)
- The value of these projects depends on how well they are executed and who owns them.
If someone is interested in taking on one of these projects and would like to connect with the person who proposed it, please reach out. We may be able to put you in touch.
Project Proposals
Coordination and Communication
- Projects focused on improving connections to groups outside EA (i.e. government, companies, foundations, media, etc.).
- A common knowledge spreadsheet of directly responsible individuals for important projects.
- More “public good”-type resources on the state of different talent pipelines and important metrics (e.g., interest in EA).
- More coherent and transparent communication about the funding situation/bar and priorities.
- More effort to identify low-integrity actors and make them known through some transparent mechanism.
- More effort to improve boards and reduce conflicts of interest across organizations.
- More risk management capacity for EA broadly as a field and not just individual orgs.
Career Advice and Talent Allocation
- Advanced 80K: Career advice targeted at highly committed and talented individuals.
- A separate organization that is an 80K analogue for mid-career people.
Community Engagement
- More creative and fun ways for young people to learn about EA principles that don’t place as much emphasis on doing "the single most important thing".
- More support and appreciation for people doing effective giving work (GWWC, Longview), and encouragement for others to do more of this.
- A survey to identify why high-value potential members "bounce off" EA.
- A better way to platform great community members who can promote virtues and key principles.
AI Safety
- AI Safety Next Steps: A guide to facilitate entry into AI safety research and activism.
- Something to help people understand and evaluate the actions of AI labs, and possibly critique them.
- An org that can hire and lightly manage independent researchers.
- A better understanding of the relevance of UK and EU AI policy to x-risk, and a comparison with US policy.
- A really good book on AI risk.
- AGISF in workshop form.
- More AIS grantmaking.
- A public policy institution that straightforwardly makes the case for existential risk from AI.
Evaluation and Accountability
- More charity evaluators.
- More measurement and evaluation/accountability of meta projects.
- A public EA impact investing evaluator.
Fundraising and Donor Engagement
- More work on donor cultivation and fundraising.
- A new grantmaker with various beneficial attributes like speed, judgment ability, and infrastructure.
- More community building for effective giving.
Education and Training
- Systematic educational/training materials and community building in areas outside AIS.
- Leadership fast-track program.
Media and Outreach
- A podcast to keep people updated on EA-related developments.
- A bunch of media platforms for sharing EA ideas (YouTube, podcast, Twitter, etc.).
- An analogue of Non-trivial but for university students.
- Better on-ramps to the most impactful career paths.
Diversity and Inclusion
- An organization that specializes in improving ethnic, racial, and socioeconomic diversity within EA.
Other Initiatives
- A high-quality longtermist incubator.
- EAG-like cause-specific conferences.
- Fast Grants and other quick funding mechanisms.
- A post-FTX investigative unit.
- An awards program to create more appreciation within the community.
- More badass, obvious GHD wins like Wave.
- An initiative that helps people prepare for crunch time and crises.
- More applied cause-prioritization work outside of Open Philanthropy.
- More critiques of views closely associated with Open Philanthropy funding.
- Cause-specific community-building organizations, analogous to what CEA does for EA.
(Reversed) What is a project or norm that you don’t want to see?
- Incubators: One respondent stated that incubators are "super hard and over-done," mentioning that they are too meta and often started by people without entrepreneurial experience.
- Making Donor Participation Onerous: One respondent is concerned that setting high standards for donors could make it difficult for new donors to contribute to EA, possibly leading to the shrinkage of the community.
- Community Building and Early Funnel Bottlenecks: One respondent expressed the opinion that non-targeted community building may be overrated and that there may not be much of a bottleneck in the early stages of community funneling except for exceptional cases.
- Community Building Projects Split: One respondent is, on the margin, against community building projects that are specifically focused on either neartermism or longtermism instead of broader EA.
Minh Nguyen @ 2023-11-04T04:25 (+3)
Just gonna weigh in on some of these from my time researching this stuff at Nonlinear.
A common knowledge spreadsheet of directly responsible individuals for important projects.
Strongly agree. It's logistically easy to do; one person could cover 80% of EA projects within a week. I've been using the AI Existential Safety Map (aisafety.world) a lot in my follow-ups for 1-on-1s.
In the long run, a well-maintained wiki similar to (or synced with) the EA Opportunities Board (which I also heavily recommend) could make this really comprehensive.
More “public good”-type resources on the state of different talent pipelines and important metrics (e.g., interest in EA).
I read every EA survey I see. They're often quite interesting and useful. I wouldn't say they're neglected, since EAs do seem to love surveys, but they're usually a net positive.
More coherent and transparent communication about the funding situation/bar and priorities.
I think every EA funder should be as transparent and detailed about their funding bar and criteria as possible. Unlike for-profits/VCs, I don't see a strong reason for secrecy other than infohazards. Transparency helps applicants understand what funders look for, which benefits both funders and applicants. Applicant misconceptions about "what funders want" can hinder EA a lot in the long run through mismatched incentives: I see a lot of compelling project directions censored or discarded in the early stages simply because applicants think they should be more generic (since being more generic works well in conventional success pathways).
More risk management capacity for EA broadly as a field and not just individual orgs.
I really liked the EA Forum post "Cash and FX management for EA organizations" by @JueYan.
Advanced 80K: Career advice targeted at highly committed and talented individuals.
Agree, but I never figured out how to execute this scalably. Usually, if someone has the skill set and motivation to do really well in EA, my priority is to (1) point them to helpful resources they can work through themselves, and (2) try to link them with someone already doing what they're trying to do.
The problem is that it seems hard to predict in advance who they'd consider a valuable connection. I think none of my most valuable connections in EA so far would've been referred to me by someone else.
Tractable idea: A list of helpful links sent to EAGx and EAG attendees post-conference.
A survey to identify why high-value potential members "bounce off" EA.
I actually bounced off EA for three years (2019–2022). For me, the big reason was that I couldn't find any follow-up steps to pursue (especially coming from Singapore). My experience within EA has been very inspiring and exciting interactions followed by not much follow-up (guidance, next steps, opportunities to pursue, encouragement to start projects, etc.).
[Just gonna agree with all the AI Safety points; they've all come up before in my discussions.]
Evaluation and Accountability
Shoutout to @Mo Putera, who is working on this.
Media and Outreach
A casual observation: I can't recall a single EA social media account that I browse simply because it's fascinating, rather than because I want to support EA on social media.
And I'm into weird stuff, too. I just binged hour-long videos on Soviet semiconductors and the history of Chang'an.
Incubators: One respondent stated that incubators are "super hard and over-done," mentioning that they are too meta and often started by people without entrepreneurial experience.
Agree; this point has been discussed in detail before, e.g. in the EA Forum post "What we learned from a year incubating longtermist entrepreneurship".
I think it's just hard to do well because there are so many points of failure, it takes a long time for any results to show, and it requires both social skills and technical expertise. That said, I do think a longtermist version of Charity Entrepreneurship seems promising to pilot (actually, I'm gonna bring this up to Kat Woods right now).
Fast Grants and other quick funding mechanisms.
I really like Manifund as a platform!
michel @ 2023-11-09T20:00 (+2)
Thanks for adding these thoughts!
michel @ 2023-11-03T23:30 (+2)
A quick note to say that I’m taking some time off after publishing these posts. I’ll aim to reply to any comments from 13 Nov.