CEA's Plans for 2021
By MaxDalton @ 2020-12-11T00:27 (+67)
This post lays out the Centre for Effective Altruism's plans for 2021. We've also published a review of our progress in 2020.
Summary
Long-term approach
Our mission is to build a community of students and professionals acting on EA principles, by nurturing high-quality discussion spaces.
We think we should focus on facilitation: curating spaces (online, at events, within groups) for quality discussion, and helping more experienced community members to onboard newer members, rather than writing content ourselves or doing personalized outreach.
To break this down, we plan to work across three broad areas: recruitment, retention, and risk reduction.
We aim to focus our recruitment efforts in 2021 around student groups.
Young people seem to be the most likely to get deeply involved, probably because they have more free time, flexibility in their plans, and openness to new ideas. We also have a strong track record with this group, and they are easy to reach. The main downsides are the long time it takes for students to make an impact and the possibility of value drift,[1] but we still think this should be a part of EA’s community building portfolio.
Groups are one of the top ways that young people get more deeply involved (as demonstrated by the EA survey and interviews with community members). We are especially focused on supporting groups at highly ranked universities, which have a high concentration of people who will be influential in the future. We think that targeted events and resources can bolster student groups, and we aim to integrate events and the EA Forum more closely with groups work (as we did with the introductory events and Student Summit this year).
We aim for these people to take significant actions (a career plan/change or a 10% giving pledge) based on a good understanding of EA principles. We plan to carry out interviews to learn what actions they’ve taken (and why), but we don’t feel that we’re well-placed to rank individuals or career options, so we will be relatively agnostic as long as they seem to have a good understanding of the ideas. We hope that our focus on extremely promising populations and quality/in-depth presentations of key ideas will allow us to recruit excellent people. Throughout this work, we aim to be welcoming to people of different backgrounds.
We hope to retain these young people as they progress into exceptionally high-impact roles: this allows them to change their priorities in the future, develop knowledge and networks that will help them, and help to onboard new people. Related projects include EA Global, the Forum, and city groups.
We plan to maintain risk reduction work that preserves EA’s ability to grow and produce value in the future. This includes work on PR, diversity, risky actors, and epistemics.
We do not plan to work on reaching mid-career professionals or high net worth individuals. We also do not plan to work on fundraising, grantmaking (other than to groups), cause-specific community building, career advice, or research.
2021 plans
We are not confident that these are the best focuses, and they don’t cover all the ways we’ve created value in the past. However, we think that it’s useful to be more focused, and my impression is that these goals are facilitating collaboration between teams, generating new ideas, and stimulating open discussion about how the goals should be improved and changed. I think that our events and groups teams got more done in the past five months in significant part because we had clearer goals.
Recruitment
We aim to help onboard new students and young professionals, particularly from top universities and underrepresented groups. We plan to do this by following a cohort of students through the academic year,[2] and helping them to engage with quality content, make connections, and take action.
- Groups: The groups team will take leads from events and the Forum, and work with group leaders to deliver mentorship, workshops, 1:1 support, and content to increase the engagement of group members.
- Forum: The Forum team will aim to make more referrals from the Forum to high-quality in-person interactions.[3]
- Events: We may repeat the student introduction/Student Summit events in 2021; we’ll assess feedback from this year’s events before deciding.
- People from underrepresented groups: As part of this outreach plan, we plan to put additional effort into providing mentorship and opportunities for 1:1 connections for group members from demographic groups underrepresented in EA.
Retention
Existing community members need to continue to develop their understanding and networks, and to sustain their motivation. We aim to increase the total value that highly engaged EAs get from EA discussion spaces by 30%, via two areas: engaging with high-quality content (which will encourage continued learning) and making social/professional connections.
Continued learning
- Vision: Highly-engaged EAs continue to learn about EA-related ideas.
- Key measure: Engagement time from existing EAs with content (e.g. Forum, Newsletter, events, YouTube, group activities).[4]
- Team goals:
- Groups: Increase engagement time via online content and added capacity for group organizers and people in key hub cities.
- Forum/Online: Double the hours of EA Forum content viewed by highly-engaged EAs per day; raise by 30% the hours of total content viewed by highly-engaged EAs per day.
- Events: Goal to be determined; likely to be centred on an event like EA Global but more tailored to existing engaged EAs.
Quality connections
- Vision: Highly-engaged EAs make valuable, meaningful connections through EA discussion spaces
- Key measure: New positively-rated meaningful connections between highly-engaged EAs[5]
- Team goals:
- Groups: Group organizers and group members in hub cities make new connections.
- Forum: Improve referrals from the Forum to high-quality in-person interactions.[6]
- Events: Goal to be determined; likely to be centred on an event like EA Global but more tailored to existing engaged EAs.
Risk reduction
We will focus on the following areas:
- Brand: Draft and test ways of briefly and clearly communicating EA to students in our target audience.
- Culture/epistemics: Run four experiments to improve EA culture/epistemics, or scale one of those experiments.
In addition, we will maintain other aspects of community health work. For more details, see the community health section below.
Executive team and operations
We plan to:
- Complete the EA Funds spinoff (or explore alternatives, like a low-input version of Funds, if a spinoff doesn't work).
- Develop our ability to measure impact, to support the goals above.
- Make incremental improvements to internal systems (especially internal financial reporting, cybersecurity, grantmaking, hiring, and personal development).
We also plan to hire to support the organizational goals above, most likely by adding further groups support capacity.
Plan for 2022
We expect to continue to improve on our work in 2021. Whilst I expect our recruitment work to be increasingly student-focused and goal-oriented, I anticipate that our work to support existing community members and reduce risks will remain broader.
Programs
All program plans are our current drafts, which we may update. They are not intended to be commitments.
Community Building Grants
Baseline Budget: $1.29M
Expansion Budget: $1.68M
FTE: 1.0
Goals:
- Increase community building capacity of priority groups by the equivalent of 5 FTEs.
- The majority of grants made are highly valuable based on well-defined criteria.
- CBGs are considered to be one of the most promising career options by top group organizers.
Location-specific application rounds
Our top priority is to make better grants. Roughly, the best grants come when we fund a strong organizer in an important location. Our experience in NYC and Cambridge suggests that location-specific application rounds are a good way of finding unusually strong organizers in important locations. Grants made through these rounds are also more likely to be counterfactually impactful than grants we make reactively. We think these benefits outweigh the increased evaluation costs, and plan to run more such rounds this year.
Enhanced support
In order to attract top organizers, the program needs to be considered one of the best available career opportunities for them. Therefore, we would also like to provide individual mentorship, training, networking opportunities, and a smooth experience for grantees. We might hire a contractor or staff member to provide this extra support.
Defining success
The goals refer to priority groups and grants that are highly valuable. We would like to develop those definitions for different types of groups. We generally feel like we understand these qualities for university groups better than we understand city- or national-level strategy. We are planning to spend some time thinking about strategy for these groups.
Group Support
Baseline Budget: $523,000
Expansion Budget: $734,000
FTE: 3.0, plus 6 contractors
Goals:
- Build a system for onboarding new EAs
- Improve the understanding and networks of group organizers and EAs in key cities by sharing content and supporting knowledge sharing
- Maintain (or improve) our existing support for groups
- Maintain our work to mitigate key risks in groups by referring cases to community health.
Onboarding new EAs
The groups team will take primary responsibility for the organization-wide recruitment goal. We’ll focus on following a cohort of students[7] through the academic year, and helping them to engage with quality content, make connections, and take action. Following a cohort of students will help us to understand their bottlenecks and develop the most useful resources.
One-to-one follow-up
We are testing a system where individuals who attended the EA Student Summit receive a follow-up from their local group leaders or another experienced member of the community. We also plan to continue notifying them about relevant content and future events.
Fellowships and resource development
We think that one of the best ways to give students a high-fidelity introduction to EA principles is through fellowships. A multi-week fellowship increases commitment and provides a space to read, discuss, and process core materials. As a result, we’re prioritizing:
- Continuing to refine and improve the introductory fellowship so that we can roll out a high-quality version across groups.
- Improving the In Depth Fellowship so that early-stage group leaders and active group members can further develop their understanding of key EA ideas.
We’re still collecting data on which other resources are in demand among EA introductory fellowship and Student Summit attendees, but we’ve seen initial demand for career fellowships, cause area fellowships, social events, and podcast/book discussion groups.
University group support
We are exploring making a hire to add capacity to support university groups. Assuming we can make this hire, we plan to:
- Encourage qualified individuals to apply for part-time CBGs.
- Pilot a strategy workshop for focus university groups, to help onboard younger committee members.
- Connect focus university group leaders with mentors.
- (Post-COVID) Hold a retreat for group leaders.
- Have more one-to-one calls with group leaders to chat through their group plans and bottlenecks.
Underrepresented groups
We are exploring opportunities to provide an additional personal touchpoint for students from underrepresented demographic groups who aren’t yet highly engaged after the Student Summit or introductory fellowship. Following this cohort of students will allow us to better understand any specific barriers these groups face, so that we can better tailor our support.
Improve the understanding and networks of group organizers and EAs in hub cities
Our largest focus within the groups team is on recruitment, but within our retention goal above, we plan to focus on two user groups: existing EAs in EA hub cities, and group organizers.
- Support EA hub cities and online platforms to increase learning and connections for existing EAs:
  - Improve the handover of graduating group members from university groups to city groups.
  - Add capacity (via CBGs) to cities with a high number of highly-engaged EAs relative to organizer capacity.
  - Pilot online groups content (e.g. fellowships, book discussion groups) to provide engagement opportunities for highly-engaged EAs outside of hub cities.
- Improve the understanding and networks of group organizers:
  - Continue to grow our mentorship program for group organizers.
  - Provide additional networking opportunities for group organizers.
  - Encourage group organizers to continue to engage with EA content, especially recent developments and key updates.
Broad support and organizer training
In addition to the above, we aim to maintain or improve the resources and training available to all organizers.
This will likely include:
- Improving the onboarding process for new group organizers (e.g. a structured training program that includes participation in intro and advanced fellowships).
- Developing additional resources on the EA Hub for organizers (focused on the goals of groups, measuring success, and best practices), and improving our communication channels to increase consumption of these resources.
- Continuing to host regular online meetups, organize in-person retreats if possible, and invest in training and workshops for organizers.
In addition, we will track and aim to improve organizer satisfaction with our support.
Mitigate key risks posed by groups or group members
The Groups team interacts with hundreds of group organizers, so we’re well placed to scan for risks and escalate cases to the Community Health team. These include PR risks, risks of low-fidelity translation, conflicts within the group, and cultural/epistemic issues. We aim to maintain our current work here.
Hiring
We are currently hiring for someone to mentor university groups and create quality groups resources. We currently have approximately two FTE contractors, who are piloting and sharing resources for groups, and we have set aside funding to add four more contractors (for a total of six) across this area, to scale it quickly while minimizing risk.
Effective Altruism Forum and online content
Baseline Budget: $361,000 (baseline is the same as expansion)
FTE: 2.5
Goals:
- Increase content viewed by highly-engaged EAs: a 100% year-on-year increase in the 30-day trailing average of hours viewed per day.
- Increase referrals: a 30-day trailing average of 0.5 people/day referred to groups, events, or one-on-one conversations by EOY 2021.
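To make the referral metric concrete, here is a minimal sketch of how a 30-day trailing average like this could be computed. The data structure, names, and numbers are hypothetical illustrations, not CEA’s actual analytics:

```python
from datetime import date, timedelta

# Hypothetical daily referral counts: {date: number of people referred from
# the Forum to a group, event, or one-on-one conversation on that day}.
daily_referrals = {
    date(2021, 12, 30): 1,
    date(2021, 12, 31): 0,
    # ...one entry per day...
}

def trailing_average(counts, end_day, window_days=30):
    """Average daily referrals over the window_days ending on end_day."""
    days = [end_day - timedelta(days=i) for i in range(window_days)]
    total = sum(counts.get(d, 0) for d in days)
    return total / window_days

# The referral goal would be met if this value is >= 0.5 at the end of 2021;
# the content goal uses the same windowing, but with hours viewed per day.
print(trailing_average(daily_referrals, date(2021, 12, 31)))
```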
Potential activities
We think that the following activities might help us to achieve these goals:
- Integrating the Forum with other CEA and community websites. This would allow someone to apply to EA Global or log in to the EA Hub using information they have already added to their Forum account.
- Working with Pablo Stafforini to integrate his EA encyclopedia project with the Forum’s system of content tags, creating a wiki-style collection of content that cross-links to thousands of relevant articles.
- Continuing to encourage new posts via author outreach and editing services, and to encourage more views of those posts by sharing Forum posts on social media and in newsletters.
Events
Baseline Budget: $1.328M (baseline is the same as expansion)
FTE: 4.5 (new hire under expansion)
We plan to run a similar program of events to last year. However, our 2021 plans are still quite uncertain, because:
- We have organized four events in the second half of the year, so we haven’t had time to reflect on 2021 plans in depth yet.
- Plans will be dependent on COVID.
- We want to analyze the data from the EA Student Summit and EAGxAsia-Pacific before making firm plans.
The table below shows our best guesses under different scenarios. These plans will depend on advice from our COVID Advisory Board; our current best estimates are a 30-35% chance that the London event can go ahead and a 60-70% chance that the San Francisco event can.
We are planning to run six events in 2021; three of these may be held either virtually or in person, depending on the advice of our advisory board.
In addition, the team will provide funding and support to selected EAGx organizers. These events may be virtual or in-person, depending on COVID.
Community health
Baseline Budget: $471,000
Expansion Budget: $566,000
FTE: 3.5 (new hire under expansion)
Goals:
- Focus on developing a positive "EA brand" and improving culture and epistemics in EA.
- Better support new EAs from underrepresented demographic groups.
- Maintain our responsive work.
We conducted an internal analysis of which areas to focus on in Q3, from among the different risks we monitor.[8] We made the following estimates:
As a result, we’re starting to put more effort into the first two risk areas (brand and culture/epistemics).
Goals
In general, a good chunk of our work is responsive, and we want to stay flexible as new opportunities arise. Below is a proposal of what we might do given our current capacity and understanding of key risks.
- Brand: Have well-validated messaging for EA suited to our target audience:
  - Run a baseline open-ended survey of people’s inclination/understanding of EA within our target audience.
  - We’re considering compiling a variety of EA messages and testing them with students at universities. We then plan to release these results so community builders can tailor their outreach and intro materials.
- Culture/epistemics: Run experiments to improve EA culture/epistemics. We may end up not doing any of these, but some project ideas include:
  - Mentoring a subset of university group leaders to help them improve their reasoning.
  - Helping to support and train facilitators for EA fellowships.
  - Finding ways to improve epistemic norms in widely used group resources.
  - Commissioning pieces that translate rationality concepts into more accessible language.
  - Providing support for EA groups on handling tension between social justice and other frameworks.
See the groups support section for more on our proposed work on DEI.
In addition, we aim to keep the following areas stable compared to 2020:
- Risky actors:
- We aim for no increase in risky actors’ ability to cause harm, whether directly to individuals or to EA’s reputation. This is a large portion of Julia’s responsive work.
- PR/brand:
- We plan to continue prioritizing cases related to major brand risks to EA or major early field-building risks.
- We will opportunistically train up individuals for media spokesperson roles.
- Early field building:
- We aim to reduce the risk of “poisoning the well” in sensitive early field building situations.
- We plan to continue monitoring and consulting on risks in countries and fields where EA is developing and where risk levels are higher than usual.
Executive
Baseline Budget: $1.08M
Expansion Budget: $1.13M
FTE: 4.3 (new hire under expansion)
Goals:
- Complete the EA Funds spinoff or put it into a low-input mode.
- Develop our ability to track individual community members and measure impact, in order to support the goals above.
- Create a plan for CEA’s future.
- Hire to support that plan.
Max's priorities:
- Finalize plans for Funds to spin out of CEA. If all goes well, this may happen in December 2020.
- Paternity leave starting ~late January
- Research into whether and how CEA should grow, especially how much we should focus on reaching promising emerging markets, and how much we should invest in recruitment vs. retention. This will inform hiring plans.
- Hiring: I hope to help team leads make quality hires, and I’m also considering hiring and headhunting for additional senior roles.
Joan will be focused on driving forward our work on recruiting students, especially from top universities, and on gathering data to track and inform progress towards our goals.
Other plans:
- Restart in-person team retreats, if possible.
- Test our new plan for supporting personal development, based on an organization-wide list of desired skills, from which staff and managers will select a skill and create a plan for developing it on the job.
- Run more staff training sessions, particularly to improve people’s understanding of the latest EA ideas.
We’ll also continue ongoing staff support work: tracking and addressing staff morale issues; biannual feedback rounds; staff satisfaction interviews; digital content to support remote staff engagement; updating our compensation policy; encouraging upward feedback; and tracking and enacting potential role changes that better match people’s responsibilities to their areas of strength.
Operations
Baseline Budget: $626,000
Expansion Budget: $640,000
FTE: 5.25 (new hire under expansion)
Goals:
Our key team metric is the internal satisfaction score from the annual survey of all operations customers (employees and grantmakers, weighted towards those we serve most directly).
- Finance: CEA’s organization-wide finance systems provide useful information to support decision-making[8] and are more efficient.[9]
- Fundraising: Fundraising at CEA is more efficient[10] and successful.[11]
- CRM: Implement an organization-wide CRM.
- Grantmaking:
- Average grant processing time is less than 20 days; each additional grant requires less than 10 minutes of Operations time on average.
- Grantees rate their experience >4.5/5, and grantmakers rate it >8/10.
- HR administration: Key HR risks are reduced and onboarding is improved.[12]
Budget for 2021
Summary
For 2021, we have set a baseline budget of $5.60M and an expansion budget of $6.28M, compared to last year’s budget of $6.02M.
The chart below shows CEA’s plans for 2021 under both baseline and expansion, against the 2020 budget:
- The baseline budget assumes no further hires or other increases, with a mid-range estimate for community building grantmaking.
- The expansion budget includes planned hires to support the activities set out above, and a higher estimate for community building grants.
As discussed in the executive summary above, CEA is focusing on three key areas: recruitment of EAs, retention of highly engaged community members, and risk reduction. By grouping our programs into their respective focus areas, we can see that our biggest investment is in recruitment to EA.
Net budget
We derive our net budget by summing the budgeted expenditure on programs and deducting non-fundraising income, such as EA Global tickets.
Funding gap
As discussed above, we believe we’ve made good progress overall in 2020, and we believe we’re set up to improve further. Over the past two years, we have increased our focus on the most important aspects of our work whilst improving execution and keeping costs relatively stable.
In 2020, our board agreed that, in order to increase resilience and in line with other EA organizations and standard advised practice, CEA should aim to have 12 months of runway at all times. This will also help us to make multi-year commitments for Community Building Grants (increasing stability for grantees), make multi-year commitments to event venues (reducing costs), and attract staff. As such, we are seeking funding through to December 2022.
Our funding gap estimate is equal to our two-year budget, less our balance on hand and fundraising target from some of our major long-standing donors. We estimate our funding gap to be $2.2m on our expansion budget.
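In symbols, these two quantities are as follows (this is just a restatement of the definitions above, with the estimated gap carried over, not a new calculation):

$$\text{Net budget} = \sum_{\text{programs}} \text{budgeted expenditure} \;-\; \text{non-fundraising income (e.g. EA Global tickets)}$$

$$\text{Funding gap} = \text{two-year budget} - \big(\text{balance on hand} + \text{fundraising target from long-standing major donors}\big) \approx \$2.2\text{M (expansion budget)}$$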
If you are considering making a major donation, please get in touch and we will be able to give you more details of our current funding situation.
Value drift occurs when someone’s values change over time, leading them to e.g. be less interested in effective altruism at age 25 than they were at age 20. ↩︎
All students who attended an introductory fellowship, our introductory event, or the EA Student Summit in 2020. ↩︎
Measured by a 30-day trailing average of 0.5 people/day referred from the Forum to groups, events, or one-to-one introductions by EOY 2021. This might involve developing Forum profiles (adding pictures, extra fields for information, tagging, etc.), although we’re not sure if we will actually do this. ↩︎
Our best guess is that we want this to increase by 30% year-on-year. ↩︎
Question: “Roughly, how many new valuable connections did you make over the past year? For example, how many new people do you feel comfortable asking for a favour?” We're aiming for a 30% increase in total connections. ↩︎
Same measure as above: a 30-day trailing average of 0.5 people/day referred from the Forum to groups, events, or one-to-one introductions by EOY 2021. This might involve developing Forum profiles (adding pictures, extra fields for information, tagging, etc.). ↩︎
All students who attended an introductory fellowship, our introductory event, or the EA Student Summit in 2020. ↩︎
Overall risk: “What’s the likelihood something happens (an event, or a series of events) that causes the expected number of engaged EAs to be net 200 less in the next 5 years [than it otherwise would have been]?”
Maximum potential tractability: “If we had the entire CEA community health team + 80k and Open Phil taking effort to mitigate this risk, how much of the risk would we reduce?” ↩︎ ↩︎
As rated by key users. ↩︎
By the end of the year, Louis is able to reduce his working hours on core finance tasks by 20%. ↩︎
Requires ~35% less executive time. ↩︎
Improved retention, and donors report a high level of satisfaction with the service they receive. ↩︎
EdoArad @ 2020-12-11T07:59 (+13)
Thanks for this informative write-up!
The mission as stated is
to build a community of students and professionals acting on EA principles, by nurturing high-quality discussion spaces.
The focus on improving discussion spaces is relatively narrow compared to possible alternatives. Off the top of my head, some other (not necessarily good) alternatives might be:
- Directly manage content creation
- improving coordination among donors
- lead a centralized information platform
- lead a common research agenda
- develop a single "brand" for EA and for main cause areas
- Support promising individuals and organizations
- obtain and develop tools and infrastructure to support EA researchers
- Leading to common answers and community-wide decisions to some key questions about EA (should we expand or keep it small, should we have a different distribution of cause areas, should we invest more in cause prioritization or meta causes, ..)
- ...
I think that it makes a lot of sense to keep the focus on discussion platforms, and it seems that you are also naturally working on stuff outside of this focus as needed. I'd be interested to hear a bit more about what you'd want other initiatives from the community to take on, which you could have taken on yourselves but decided to withdraw. I'm also not sure from the post if you consider this mission as a long-term focus of CEA, or if this is only for the 1-2 coming years.
Maxdalton @ 2020-12-11T16:30 (+7)
Hi Edo, This is something that we’re keen to clarify and might publish more on soon. So thanks for giving me the opportunity to share some thoughts on this!
I think you’re right that this is a narrower mission: this is deliberate.
As we say on our mistakes page:
Since 2016, CEA has had a large number of projects for its size... Running this wide array of projects has sometimes resulted in a lack of organizational focus, poor execution, and a lack of follow-through. It also meant that we were staking a claim on projects that might otherwise have been taken on by other individuals or groups that could have done a better job than we were doing (for example, by funding good projects that we were slow to fund).
Since we wrote that, we have closed EA Grants and spun Giving What We Can out (while continuing to provide operational support), and we’re exploring something similar with EA Funds. I think that this will allow us to be more focused and do an excellent job of the things we are doing.
As you note, there are still many things in the area of building the EA community that we are not doing. Of course these things could be very impactful if well-executed (even though we don’t have the resources to take them on), so we want to let people know what we’re not doing, so they can consider taking them on.
I’ll go through some of the alternatives you mention and talk about how much I think we’ll work in this space. I’ll also share some rough thoughts about what might be needed, but I’m really not an expert in that question - I’d tend to defer to grantmakers about what they’re interested in funding.
A theme in what I write below is that I view CEA as one organization helping to grow and support the EA community, not the organization determining the community’s future. I think it’s mostly good for there not to be one organization determining the community’s future. I think that this isn’t a real change: the EA community’s development was always influenced by a coalition of organizations. But I do think that CEA sometimes aimed to determine the community’s future, or represented itself as doing so. I think this was often a mistake.
Directly manage content creation
We don’t have plans to create more content. We do curate content when that supports productive discussion spaces (e.g. inviting speakers to events, developing curricula for fellowships at groups). We also try to incentivize the creation of quality content via giving speakers a platform and giving out Forum prizes.
80,000 Hours is maybe the most obvious organization creating new content, but many research organizations are also creating useful content, and I think there’s room for more work here (while having high quality standards).
improving coordination among donors
We are currently running EA Funds, which I see as doing some work in this space (e.g. I think Funds and the donor lottery play some of this role). There might be room for extra work in this space (e.g. coordination between major donors), but I think some of this happens informally anyway, and I don’t have a sense of whether there’s a need for more at the moment.
lead a centralized information platform
I’m not sure quite what you have in mind here. I think the Forum is playing this role to some extent: e.g. it has a lot of posts/crossposts of important content, sequences, user profiles, and a tag/wiki system. We also work on the EA Hub resources. We don’t have plans beyond further developing these.
lead a common research agenda
We are not doing this, and we haven’t been doing research since ~2017. I think there are lots of great research organizations (e.g. Global Priorities Institute, Open Philanthropy, Rethink Priorities) that are working on this (though maybe not a single leader - I think this is fine).
develop a single "brand" for EA and for main cause areas
We do not plan to do this for specific cause areas. We do plan to do some work on testing/developing EA’s brand (as mentioned above in the community health section). However, I think that other organisations (e.g. 80,000 Hours) also play an important role, and I think it’s OK (maybe good) if there are a few different representations of EA ideas (which might work well for different audiences).
Support promising individuals and organizations
Supporting organizations: as mentioned in our annual review, we do some work to support organizations as they work through internal conflicts/HR issues. We also currently make grants to other organizations via EA Funds. We also provide operational support to 80,000 Hours, Forethought Foundation, GWWC, and a long-termist project incubator. Other than this, we don’t plan to work in this space.
Supporting individuals: Again, we currently do this to some extent via EA Funds. Historically we focused a bit more on identifying and providing high-touch support to individuals. I think that our comparative advantage is to focus more on facilitating groups and discussion, rather than identifying promising individuals. So this isn’t a current focus, although we do some aspects of this via e.g. support for group leaders. I think that some of this sort of work is done via career development programs like FHI’s Research Scholars Program or Charity Entrepreneurship’s internship program. I also think that lots of organizations do some of this work via their hiring processes. But I think there might be room for extra work identifying and supporting promising individuals.
In terms of non-financial support, the groups team provides support and advice to group organizers, and the community health team provides support to community members experiencing a problem or conflict within the community.
obtain and develop tools and infrastructure to support EA researchers
I think that the Forum provides some infrastructure for public discussion of research ideas. Apart from that, I don’t think this is our comparative advantage and we don’t plan to do this.
Leading to common answers and community-wide decisions to some key questions about EA (should we expand or keep it small, should we have a different distribution of cause areas, should we invest more in cause prioritization or meta causes, ..)
We do some work to convene discussion on this between key organizations/individuals (e.g. I think this sometimes happens on the Forum, and our coordination forum event allows people to discuss such questions, and where they can build relationships that allow them to coordinate more effectively). But we don’t do things that lead to “common answers or community-wide decisions”.
I actually don’t think we need to have a common answer to a lot of these questions: I think it’s important for people to be sharing their reasoning and giving feedback to each other, but often it’s fine or good if there are some different visions for the community’s future, with people working on the aspect of that which feels most compelling to them. For instance, I think that CEA has quite a different focus now from GWWC or Charity Entrepreneurship or OP or GPI, but I think that our work is deeply complementary and the community is better off having a distribution of work like this. I also think that it works pretty well for individuals (e.g. donors, job-seekers) to decide which of those visions they most want to support, thus allowing the most compelling visions to grow.
For similar reasons, I think it would be bad to have a single organization “leading” the community. I think that CEA aspired to play this role in the past but didn’t execute it well. I think that the current slightly-more-chaotic system is likely more robust and innovative than a centralized system (even if it were well-executed). (Obviously there’s some centralization in the current system too - e.g. OP is by far the biggest grantmaker. I don’t have a strong view about whether more or less centralization would be better on the current margin, but I am pretty confident that we don’t want to be a lot more centralized than we currently are.)
Some other things we’re not planning to focus on:
- Reaching new mid- or late-career professionals (though we are keen to retain mid- or late-career people and to make them feel welcome, we’re focused on recruiting students and young professionals)
- Reaching or advising high-net-worth donors
- Fundraising in general
- Cause-specific work (such as community building specifically for effective animal advocacy, AI safety, biosecurity etc)
- Career advising
- Research, except about the EA community
Some of our work will occasionally touch on or facilitate some of the above (e.g. if groups run career fellowships, or city groups do outreach to mid-career professionals), but we won’t be focusing on these areas.
As I mentioned, we might say more on this in a separate post soon.
I'm also not sure from the post if you consider this mission as a long-term focus of CEA, or if this is only for the 1-2 coming years.
I expect this mission to be our long-term focus.
EdoArad @ 2020-12-12T08:22 (+2)
Thanks! I think that this is clear enough for me to be able to mostly predict how you'd think about related questions :)
I am personally very confused about the benefits of centralization vs. decentralization and how to compare them in particular cases, and I can find myself drawn to either in different cases. For what it's worth, I like the general heuristic of centralized platforms for decentralized decision-making.
To investigate it a bit further, I opened this question about possible coordination failures.
AnonymousEAForumAccount @ 2020-12-11T14:39 (+3)
CEA’s Values document (thank you for sharing this) emphasizes the importance of “specific, focused goals.” It’s helpful to see the specific goals that specific teams have, but what do you see as the most important specific goals for CEA as an organization in 2021? I feel like this writeup gives me a sense of your plans for the year, but not the well-defined criteria you currently expect to use at the end of 2021 to judge whether the year was a success.
Maxdalton @ 2020-12-11T17:35 (+6)
Hi, thanks for your question!
The section on 2021 plans is intended to be a summary of these criteria, sorry that wasn’t clear.
- One target is focused on recruitment: building a system for onboarding people to EA (to the level where they are taking significant action based on a good understanding of EA principles). Specifically, we aim to help onboard 125 people to this level.
- The second target is focused on retention: for people who are already highly engaged EAs, growing the amount of time they spend engaging with high-quality content via our work (e.g. Forum view time or watching a talk from one of our events on YouTube) by 30%, and also growing the number of new connections (e.g. at events) they make by 30%.
- The third target is focused on risk-reduction: this is covered in the community health targets above (which are slightly more tightly specified and fleshed out internally).
Internally we obviously have more details about how we plan to measure/assess these things, but we wanted to just give a summary here. We expect that most of these org-wide goals will be achieved as a collaboration between teams, but we have a single person responsible for each of the org-wide goals. (Operations and executive goals are a bit more complex, and are covered above.)
AnonymousEAForumAccount @ 2020-12-11T20:12 (+9)
This is super helpful, thank you! I feel like I’ve got a much better understanding of your goals now. It really cleared things up to learn which of your multiple goals you're prioritizing most, as well as the precise targets you have for them (since you have a specific recruitment goal, it might be worth editing the OP to add that).
I have two followup questions about the recruitment goal.
- How did you set your target of recruiting 125 people? That’s much lower than I would have guessed based on other recruitment efforts (GWWC has run a two-month pledge drive that produced three times as many pledges, plus a bunch of people signing up for Try Giving). And with $2.5 million budgeted for recruitment, the implied $20,000 per recruit seems quite high. I feel like I might be misunderstanding what you mean about "following a cohort of students who attended an introductory fellowship, our introductory event, or the EA Student Summit in 2020" (discussed in the second bullet point).
- The recruitment section discusses a “plan to put additional effort into providing mentorship and opportunities for 1:1 connections for group members from demographic groups underrepresented in EA.” Do you have any specific goals for these efforts? For example, I could imagine having a goal that the cohort you recruit be more diverse than the current EA population along certain dimensions. If you don’t have specific goals, what do you plan to look at to know whether your efforts are having the desired effect?
Maxdalton @ 2020-12-14T20:12 (+9)
Thanks for your questions.
Re: target of 125 people. This is a relatively high bar: it’s focused on people who have taken significant action based on a good understanding of EA principles. So the bar is somewhat higher than the GWWC pledge, because we interview people and get a sense of why they chose the path they’re on and what would change their mind. We think that for most people this means >100 hours of engagement with quality content, plus carefully thinking through their career plans and taking action toward those plans (which might include significant donations).
I actually think that $20,000 per person in this position would still be a good deal: the expected lifetime value of a GWWC pledge might be around $73,000, and some people might be doing things significantly more promising than the average GWWC pledge. I don’t think $20,000 will be the full cost, since these people will probably also benefit somewhat from other resources (e.g. the Forum or 80k). However, I also think that these 125 people only represent some of the value that groups work creates (e.g. groups also help persuade people to take less intensive action, and to retain and empower people who are already engaged). And I think there’s a fair chance that we beat this target.
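To spell out that back-of-envelope comparison (using the ~$2.5 million recruitment figure from the comment above; these are rough, illustrative numbers rather than a full cost-effectiveness estimate):

$$\frac{\$2{,}500{,}000}{125 \text{ people}} = \$20{,}000 \text{ per person} \;<\; \$73{,}000 \text{ (rough expected lifetime value of a GWWC pledge)}$$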
We arrived at 125 by estimating the number of individuals we think met this definition in 2019, applying a ~30% growth rate in the community, and then increasing this number further within key populations. One of our internal benchmarks is that the cohort of engaged EAs recruited in 2021 is more demographically diverse than the cohort of engaged EAs recruited in 2020.
AnonymousEAForumAccount @ 2020-12-15T15:54 (+7)
Thanks for the explanations Max!
MarisaJurczyk @ 2020-12-14T02:13 (+1)
Thanks for the thorough post! I appreciate the transparency CEA has been keeping in its strategy and plans.
Small question: does EEAs = engaged EAs? Is that defined by a specific metric?
Maxdalton @ 2020-12-14T08:55 (+2)
Hey Marisa, thanks, I'm glad you appreciated this!
Yes, EEAs=highly-engaged EAs (I've now edited this throughout, so that it's a bit less jargon-y). This is a term that we're using internally to refer to people who are taking significant action (e.g. a career plan or a significant giving pledge or similar) based on a detailed understanding of EA ideas.