Interested in EA/longtermist research careers? Here are my top recommended resources
By MichaelA🔸 @ 2022-06-26T17:13 (+112)
Since 2020, I estimate I've given career advice to >200 people in the EA community. This post is my up-to-date, prioritized list of recommended resources for people who are interested in research careers, longtermism-aligned careers, and/or working at EA orgs (rather than e.g. high-impact roles at non-EA orgs that work on relevant topics). The more you're interested in each of those three things, the more likely it is that this post will be useful to you - but I expect it would also be useful to many people interested in just one or two of those things.
This post is also likely to be more useful to people who are earlier in their relevant career journeys, since many of the resources will be well-known or unnecessary for people with more relevant experience.
Will these resources be useful to you? Hard to say, as I don't know you or what you already know! But this post itself should only take a few minutes to read, and then you can make your own decision about which links to open and how closely to engage with them.
These resources should hopefully help you have an impact, build your career capital (e.g., knowledge, skills, connections, and credentials), test your personal fit for various roles, and find other good advice or valuable opportunities for doing those things. This post won't itself give you many specific tips.
I've bolded the resources that I expect will typically be most useful.
Behold! The heralded resources:
Things you could apply to
- Note: I suggest you don’t spend too long thinking about the pros and cons of applying to an opportunity (e.g., a job, grant, degree program, or internship). Assuming the initial application wouldn’t take you long, if it seems worth thinking hard about, you should probably just apply instead.
- Akash and I elaborate on this in Don’t think, just apply! (usually).
- You could use the 80,000 Hours Job Board to find listings of jobs relevant to effective altruism, including jobs that could help in building skills and testing fit.
- You could apply to EA-aligned research training programs or internships. You can find lists of such programs here and here.
- I previously put together a list of EA funding opportunities, and noted: “I strongly encourage people to consider applying for one or more of these things. Given how quick applying often is and how impactful funded projects often are, applying is often worthwhile in expectation even if your odds of getting funding aren’t very high. (I think the same basic logic applies to job applications.)”
- These funding opportunities could be used to support a very wide range of activities, such as research, career exploration and planning, community building, and entrepreneurship.
- See also Why YOU should consider applying for funding
- You could apply for coaching from Effective Thesis or check out their resources if you're working on a thesis or other piece of academic research, or considering doing so.
- You could subscribe to the Effective Altruism Newsletter to stay up to date with the effective altruism community and get updates about new job opportunities.
Other sources of career advice
- 80,000 Hours have a lot of valuable guidance on having more social impact with your career. In particular, I often suggest people check out their in-depth process and template for career planning and apply for their free career advising.
- If you're reading this post, you've almost certainly heard of 80,000 Hours and know they have a career planning process and a career advising service. But I still find that many people who are keen to talk to me for career advice haven't actually worked through 80k's process or applied for their advising service. In most such cases, I strongly suggest actually doing so!
- Michael Aird on how to do impact-driven research - Hear This Idea, 2022
- A nearly 4-hour interview where I tried to dump in one place all the advice I think the sort of people reading this post would most often want.
- I'm really happy with how it turned out.
- If you want, you could look at the episode's chapters or transcript to figure out whether this interview would be useful for you and to engage with just the relevant parts.
- In 2021, I made a more thorough, less prioritized, less up-to-date collection of advice and resources along these lines: Notes on EA-related research, writing, testing fit, learning, and the Forum. I suggest explicitly considering engaging with all of the above links before engaging with that longer post (if you do so at all).
Advice on doing good research/writing
- Reasoning Transparency - Muehlhauser, 2017
- Using the “executive summary” style: writing that respects your reader’s time - 2022
- Learning By Writing - Karnofsky, 2022
- Building a Theory of Change for Your Research - Aird (me), 2022 [slides]
- Useful Vices for Wicked Problems - Karnofsky, 2022
- Tips for conducting worldview investigations - Muehlhauser, 2022
- Readings and notes on how to write/communicate well - Aird (me), 2021-2022
- Michael Aird on how to do impact-driven research - Hear This Idea, 2022
Other ways to build career capital and/or test your fit for various roles
- The best thing to do is probably to apply to lots of things!
- Application processes themselves can provide you with some info on your fit and with some career capital. And if you actually get an opportunity, that'll probably be much better for building career capital and testing fit than what you would've done independently.
- But you may still have time left over after making a bunch of applications. This section is primarily about how to use that time.
- It's often worth reaching out to people for advice, feedback, etc. When doing that, you might find my doc of Tips & readings on getting useful input from busy people helpful.
- A list of EA-related podcasts and/or A ranked list of all EA-relevant (audio)books I've read may be useful if you like learning via audio content.
- You can search the EA Forum Topics page for EA Wiki entries relevant to your interests / career plans, read those Wiki entries, use them as directories to useful readings (in the Bibliography, Further reading, and tagged posts sections), and/or consider reaching out to the authors of some of those readings (for discussion, feedback, pointers to other recent/ongoing work, etc.).
- It’s sometimes worth trying independent research and sharing it for feedback (e.g., by posting it on the EA Forum and sharing it directly with researchers or other people working on relevant topics).
- But note that entirely independent research is often bad for people's happiness, productivity, and ability to learn rapidly. I expect most people will probably be happier, be more productive, and learn faster if they arrange to receive mentorship or at least get an “accountability buddy” and regular feedback.
- Relatedly, it may often be best to apply to jobs or research training programs and see independent research only as a backup plan.
- To think of a good research topic, you could browse through this directory for open research questions and/or ask a potential mentor what they might be interested in mentoring you working on.
- I suggest at least considering making and using Anki cards. Here's the article that triggered me to do so and gave me some useful tips.
- You could practice forecasting on Metaculus or Good Judgment Open.
- Where to find EA-related videos might be useful for some people.
Parts of this post are adapted from a collection of resources put together for people who applied to Rethink Priorities but didn't end up with an offer. That older collection was put together by me with help from some of my colleagues at RP, e.g. Peter Wildeford. But I wrote this post in a personal capacity, and it doesn't necessarily represent the views of my colleagues or anyone else.
MichaelA @ 2022-06-26T17:13 (+22)
Resources that are only relevant to people interested in AI governance and (to some extent) technical AI safety
- EAGx Oxford 2022 AI Governance Resources
- You could participate in the AGI Safety Fundamentals course’s Governance track, or - when the course isn’t running - work through all or part of the curriculum independently. This seems like an unusually good way for most people to learn about AI risk and AI governance (from a longtermist or existential-risk-focused perspective).
- Description of some organizations relevant to long-term AI governance (non-exhaustive) (2021) collects and overviews some organizations you might be interested in applying to. (This link is from week 7 of the AGI Safety Fundamentals course's Governance track.)
- I think Some AI Governance Research Ideas would be my top recommendation for a public list of AI governance research ideas.
- But I'd suggest being discerning with this list, as I also think that some of those ideas are relatively low-priority and that the arguments presented for prioritizing those particular ideas are relatively weak, at least from a longtermist/existential-risk-focused perspective.
Mess_Bomb @ 2022-06-26T22:34 (+23)
Strong +1 to the "work through all or part of the curriculum independently". Having participated in AGISF (governance track), I'd say that >95% of the value for me came from doing the reading, as opposed to participating in the discussion sessions.
(I don't want this comment to be seen as a negative review of the discussions - I'm mainly making this point because I think almost anyone can get significant value out of following the curriculum themselves, and so I'd like to nudge people toward feeling agency and viewing self study as A Thing You Can Do.)
MichaelA @ 2022-06-26T18:00 (+4)
Notes on things I should consider integrating into this post later
- Consider attending EAG(x)s
(I expect I'll add to this comment in future)