Best resources for introducing longtermism and AI risk?

By aogara @ 2020-07-16T17:27 (+17)

If someone is interested in learning about longtermism and AI risk, what do you send them? Imagine they're already curious or have heard a surface-level pitch. What follow-up links do you send them? How do you personalize depending on the person's background or interests?

Introducing new people to EA ideas seems very important, and quite difficult. I'd be interested in any other ideas about how to do so well.


Thanks to Ben_West for asking this question in another thread, and Pablo_Stafforini for suggesting this post.


Aidan O'Gara @ 2020-07-16T17:38 (+6)

For example, I emailed the following to a friend who'd enjoyed reading Doing Good Better and wanted to learn more about EA, but hadn't further engaged with EA or longtermism. He has a technical background and (IMO) is potentially a good fit for AI Policy work, which influenced my link selection.

...

The single best article I'd recommend on doing good with your career is by 80,000 Hours, a non-profit founded by the Oxford professor who wrote Doing Good Better, incubated in Y Combinator, and dedicated to giving career advice on how to solve pressing global problems. If you'd prefer, their founder explains the ideas in this podcast episode.

If you're open to some new, more speculative ideas about what "doing good" might mean, here are a few ideas about improving the long-run future of humanity:

[Then I gave some info about two near-termism causes he might like: grantmaking, by linking to GiveWell and the Open Philanthropy Project, and global poverty, by linking to GiveDirectly and other GiveWell top charities.]

Aidan O'Gara @ 2020-07-16T17:40 (+1)

If anyone's interested, here was my intro to grantmaking and global poverty:

...

If you'd prefer more mainstream ways of improving the world, here are some top organizations and job opportunities:

  • Grantmakers within effective altruism are researching the most impactful donation opportunities and giving billions to important causes. 
    • GiveWell researches top donation opportunities in global health and poverty. Founded by ex-hedge fund analysts, they focus on transparency, detailed public write-ups, and justifying their decisions to outsiders. You might like their cost-effectiveness model of different charities. They're hiring researchers and a Head of People. 
    • The Open Philanthropy Project funds a wider range of causes - land use reform, pandemic preparedness, basic science research, and many more - in their moonshot approach of "hits-based giving". OpenPhil has billions to donate to its causes, because it's funded by Dustin Moskovitz, co-founder of Facebook and Asana.  
  • World-class organizations are working directly on all kinds of highly impactful problems (and they're hiring! :P)
    • GiveDirectly takes money and gives it to poor people, no strings attached. They typically hire from top private sector firms and have an incredibly well-credentialed team. They're recommended by GiveWell as an outstanding giving opportunity. 
    • Effective global poverty organizations include many for-profits (Sendwave (jobs), TapTap Send (jobs)) and non-profits (Evidence Action (job), ID Insight (jobs)). 
    • 80,000 Hours has a big ol' job board
    • (You're probably not looking for a new job, but who knows, don't mind my nudge)

Ben_West @ 2020-07-22T21:13 (+5)

Note: this question comes from a comment thread with some further discussion, for those interested.

MichaelA @ 2020-07-27T00:40 (+4)

Thanks for asking this question.

I think a good starting place would likely be the EA Hub's "reading lists" for particular cause areas, including longtermism and AI safety (as well as biorisk, nuclear security, and climate change). And if someone has ideas for great resources that aren't shown in those reading lists, they can comment to add them there, so that we can all benefit from those centralised directories of links.

But those reading lists just provide lists of options, which one would then have to narrow down to a handful of links that suit the particular context and purpose. That second step can be tricky, and I don't have anything especially useful to say there, unfortunately.

Jérémy Perret @ 2020-07-20T07:11 (+3)

I'd like to answer this. I'd need some extra clarification first, because the introductions I use depend heavily on the context:

(if the answer is "all of the above", I can work with that too, but it will be edited for brevity)

Aidan O'Gara @ 2020-07-21T03:19 (+4)

Cool! Thanks for asking for clarification, I didn't quite realize how much ambiguity I left in the question.

I'm mainly interested in persuading people I know personally who are already curious about EA ideas. Most of my successful intros in these situations consist of (a) an open-ended, free-flowing conversation, followed by (b) sending links to important reading material. Conversations are probably too personal and varied for universally applicable advice, so I'm most interested in the links and reading materials you send to people.

So, my question, better specified: What links do you send to introduce AI and longtermism?

Jérémy Perret @ 2020-07-21T13:07 (+1)

That's much more specific, thanks. I'll answer with my usual pointers!