What are some convincing arguments & pieces of evidence that people of a wide range of backgrounds can have fulfilling, enjoyable careers while also working in a top EA cause area?
By Anjay F @ 2021-12-28T11:50 (+17)
[EDIT: I realize that this is not always true and am definitely interested in arguments/evidence for that too]
For context, I lead a university group and constantly find myself talking to members about why I don't think there is a real sacrifice to wellbeing in choosing to work on the most pressing problems [as opposed to the ones that students gravitated to when they were young]. Any resources that address concerns about sacrificing happiness when using EA to inform career plans would be much appreciated!
Misha_Yagudin @ 2021-12-28T15:00 (+30)
Honestly, I don't think this is true for the top EA cause areas. These have been selected for impact and not for utilizing people with a wide range of backgrounds and preferences.
OTOH, it's pretty self-evident that people can do "the most good they can do."
Direct work in the top cause areas is a relatively narrow interpretation of EA principles. And, personally, I find the broader interpretation more encouraging and even somewhat relaxing.
Anjay F @ 2021-12-29T14:36 (+2)
Hi Misha. Thanks for your answer. I was wondering why you believe top EA cause areas are not capable of utilizing people with a wide range of backgrounds and preferences. It seems to me like many of the top causes require various backgrounds. For example, reducing existential risk seems to require people in academia doing research, people in policy acting on those insights, people in the media raising concerns, people in tech building solutions, etc.
Misha_Yagudin @ 2021-12-29T23:05 (+12)
Let's be more specific: current existential risk reduction focuses primarily on AI risk and biosecurity. Contributing to these fields requires quite a bit of specialization and a high level of interest in AI or biotechnology — this is the first filter. Consider the hypothetical positions DeepMind can hire for: it can absorb a lot of research scientists, some policy/strategy specialists, and a few general writers/communication specialists. DeepMind probably doesn't hire many, if any, people majoring in business and management, nursing, education, criminal justice, anthropology, history, kinesiology, or the arts — and these are all very popular undergraduate majors. There is a limited number of organizations, and these organizations have their peculiarities and cultural issues — this is another filter.
Seconding Khorton's reply: as a community builder, you deal with individuals, whom you can help select the path of most impact. It might be in an EA cause area, or it might not. The aforementioned filters might be prohibitive for some and pose no problem for others. Everyday longtermism is likely the option available to most. But in any case, you deal with individuals, and individuals are peculiar :)
Anjay F @ 2021-12-30T15:47 (+2)
This makes a lot of sense and thanks for sharing that post! It's certainly true that my role is to help individuals and as such it's important to recognize their individuality and other priorities.
I suppose I also believe that one can contribute to these fields in the long run by building aptitudes, as Ines' response discusses. But maybe these problems are urgent and require direct work soon, in which case I can see what you are saying about the high levels of specialization.
Misha_Yagudin @ 2021-12-31T05:54 (+4)
Agree; moving into "EA-approved" direct work later in your career while initially doing skill- or network-building is also a good option for some. I would actually think that if someone can achieve a lot in a conventional career, e.g., achieving some local prominence (either as a goal in itself or as preparation to move into a more directly EA role), that's great. My thinking here was especially influenced by an article about the neoliberalism community.
(The urgency of some problems, most prominently AI risk, might indeed be a decisive factor under some worldviews held in the community. I guess most people should plan their careers as makes the most sense to them under their own worldviews, but I can imagine changing my mind here. I need to acknowledge that I think short timelines and existential risk concerns are "psychoactive," and people should be exposed to them carefully to avoid various failure modes.)
Khorton @ 2021-12-28T14:26 (+18)
Why not encourage students to experiment for themselves? They could try a summer internship, volunteer, or take a class on a topic where they could help solve one of the world's top problems, while also exploring areas they've been drawn to since childhood, and keep an open mind as they explore.
I think a lot of people will find it really satisfying to see how they can help people, but some people might genuinely be happier working on something they've been interested in since childhood, and we shouldn't try to deceive those people!
Anjay F @ 2021-12-29T14:27 (+3)
This is a good option. I hadn't really considered this. And I agree that we definitely shouldn't try to deceive anyone.
Anjay F @ 2021-12-28T12:04 (+10)
I believe this primarily because of arguments in So Good They Can't Ignore You by Cal Newport, which suggest that applying skills we excel at is what leads to enjoyable work, rather than a passion for a specific job or cause. But I also think community and purpose are super important for happiness, and most top EA causes seem to provide both.
Ines @ 2021-12-29T16:37 (+7)
I think this is often a real tradeoff, but there are other ways of framing it that might help:
A) You should work on something you at least somewhat enjoy and have a good personal fit for in order to avoid burnout (I think this is 80k's position as well). Within the range of things that meet these criteria, some will be more impactful than others, and you should choose the most impactful one. EA frameworks are very useful for discerning which one this might be.
B) The aptitude-building approach (from Holden Karnofsky's 80k podcast episode): You should become great at something you like and are very good at, and then wield it in the most impactful way you can, which knowledge of EA is again useful for. (Even if it is not initially obvious how, most skills can be applied to EA in some way—for example, creative writing like HPMOR has served as a great tool for community building.)
If someone is unwilling to move away from a low-impact cause, there are still ways EA can be useful for helping them be more impactful within their cause. Similarly, if someone is set on a certain skill, EA can help them use it to do good effectively.
Anjay F @ 2021-12-30T15:49 (+2)
Thanks Ines for this thoughtful answer! It makes me want to emphasize the aptitude-building approach more at my group.
Ben Williamson @ 2021-12-28T13:38 (+6)
This article by 80,000 Hours on job satisfaction is probably a useful resource on how working on the most pressing problems doesn't necessarily have to involve sacrificing happiness.
Anjay F @ 2021-12-29T14:37 (+1)
Thanks Ben for sharing this!
Lenny McCline @ 2021-12-30T15:27 (+2)
Cal Newport argues in favor of this in his book So Good They Can't Ignore You. As a uni group leader myself, I've found his points useful when talking with new members.
I believe Cal Newport's career advice has been quite influential on 80,000 Hours' own, so you might not find anything terribly new there, but I do think it's worth checking out.