CEA seeks co-founder for AI safety group support spin-off

By Agustín Covarrubias 🔸, jessica_mccurdy, Uni Groups Team @ 2024-04-08T15:42 (+62)

Background

Currently, CEA provides support to AI safety university groups through programs like the Organizer Support Program (OSP). For the last two semesters, OSP has piloted connecting AI safety organizers with experienced mentors to guide them. CEA has also supported these organizers through events for community builders, like the recent University Group Organiser Summit, where they can meet one another, discuss strategic considerations, skill up, and boost their motivation.

Even though these projects have largely accomplished CEA’s goals, AI safety groups could benefit from more ambitious, specialized, and consistent support. We are leaving a lot of impact on the table.

Furthermore, until now, AI safety groups’ approach to community building has been primarily modelled after EA groups. While EA groups serve as a valuable model, we’ve seen early evidence that not all of their approaches and insights transfer perfectly. This means there’s an opportunity to experiment with alternative community-building models and test new approaches to supporting groups.

For these reasons, CEA hired Agustín Covarrubias to incubate a new project. The project will encompass the support CEA is already giving AI safety groups, plus provide the opportunity to explore new ways to help these groups grow and prosper. The result will be a CEA spin-off that operates as a standalone organization or a fiscally sponsored project. Since AI safety groups are not inherently linked to EA, we think spinning out also allows this project to broaden its target audience (of organizers, for example).

We’re now looking to find a co-founder for this new entity and invite expressions of interest and recommendations. We think this is a compelling opportunity for people passionate about AI safety and community building to address a critical need in this space.

Our vision

We think growing and strengthening the ecosystem of AI safety groups is among the most promising field-building efforts. These groups have the potential to evolve into thriving talent and resource hubs, creating local momentum for AI safety, helping people move into high-impact careers, and helping researchers, technologists, and even advocates collaborate in pursuit of a shared mission. We also think some of these groups have a competitive advantage in leveraging local ecosystems; for example, we’ve seen promising results from groups interacting with faculty, research labs, and policy groups.

But this won’t happen by default. It will take careful, proactive nurturing of these groups’ potential. We’re ready to fill this important gap. Our vision for the new organization is to:

It will take an agile but highly competent team to make this vision a reality. We see ourselves rapidly setting up basic support infrastructure; iterating on new projects, events, and programming for groups and their organizers; and creating ad hoc resources for new and existing organizers.

Some of these activities are mentioned in our early strategic vision (and are already in progress), including the creation of a new AI Safety Groups Resource Center. We also plan to develop a more specialized version of the Organizer Support Program, help identify bottlenecks in the AI safety talent pipeline, and work with organizers to develop new programming for their groups.

That said, our vision is a work in progress. The co-founder we seek will play a big role in helping us explore other options and refine our thinking.

Key facts about the role

We’re looking to secure the conditional commitment of a second co-founder for the project in the next two to three months in preparation for the spin-off.

Other key facts about the role include:

Key competencies and skills

Based on the broad definition of this role and the competencies we think would complement those of Agustín, these are the main competencies and skills we're seeking in candidates:

Must-haves

Nice-to-haves

We’re excited to hear from you! If you have questions about the role, please contact agustin.covarrubias@centreforeffectivealtruism.org.