What (standalone) LessWrong posts would you recommend to most EA community members?

By Vaidehi Agarwalla 🔸 @ 2022-02-09T00:31 (+67)

Edit 11 Feb 2022: Jeremy made a post about starting a low-commitment LW article club, where he'll be linkposting articles from this list on a weekly basis for people to engage with!

Context / Motivation: 

My ask:


Pablo @ 2022-02-10T15:11 (+16)

I'm a big fan of some of the early LessWrong content, e.g.

More generally, I'd recommend much of the content by Scott Alexander ("Yvain"), Paul Christiano, Wei Dai, Gwern, Greg Lewis ("Thrasymachus"), Anna Salamon and Carl Shulman (I'm probably forgetting other names).

AllAmericanBreakfast @ 2022-02-09T05:56 (+15)

Maybe the thing to do would be to start a low-commitment LW book club? There are so many old posts that it doesn't feel fresh to comment on them, but having a way to put some group attention on a couple of posts at a time might help.

Jeremy @ 2022-02-10T15:29 (+4)

I made a separate post to get the ball rolling and make sure this happens. 

vaidehi_agarwalla @ 2022-02-09T19:12 (+4)

Would love to do this!

Miranda_Zhang @ 2022-02-09T20:55 (+3)

Agreed - would love to participate in something like this, and would encourage other group members (esp. organizers) to as well!

Jeremy @ 2022-02-09T17:09 (+3)

I'd be interested in something like this. 

Edward Tranter @ 2022-02-09T19:01 (+2)

I'd also be interested in pursuing this idea! LW can definitely be overwhelming, and it'd be a fun (and useful) project to take a deep dive and perhaps produce a recommended reading list for others (broadly defined).

Jeremy @ 2022-02-28T18:44 (+1)

It took me a while to get rolling, but I have done a first LessWrong repost here and will continue weekly as long as there is enough interest.

Habryka @ 2022-02-09T08:09 (+14)

I think a lot of old Scott Alexander posts are quite accessible. Some top ones are: 

https://www.lesswrong.com/posts/gFMH3Cqw4XxwL69iy/eight-short-studies-on-excuses

https://www.lesswrong.com/posts/895quRDaK6gR2rM82/diseased-thinking-dissolving-questions-about-disease

Some other good classical posts: 

https://www.lesswrong.com/posts/aHaqgTNnFzD7NGLMx/reason-as-memetic-immune-disorder

https://www.lesswrong.com/posts/PBRWb2Em5SNeWYwwB/humans-are-not-automatically-strategic

https://www.lesswrong.com/posts/yA4gF5KrboK2m2Xu7/how-an-algorithm-feels-from-inside

Aaron Gertler @ 2022-02-09T09:28 (+10)

Privileging the Question changed my life in college. I don't know how useful it would be for the average person already involved in EA, but it played a huge role in my not getting distracted by random issues and controversies, and instead focusing on big-picture problems that weren't as inherently interesting. I'd at least recommend it to new members of university EA groups, if not "most community members".

Yonatan Cale @ 2022-02-09T23:40 (+9)

This got me to leave my girlfriend and has remained a permanent way that I think:

https://www.lesswrong.com/posts/627DZcvme7nLDrbZu/update-yourself-incrementally


I read it as part of all the Sequences, so I have no idea how helpful it will be to others, or as a standalone post.

AllAmericanBreakfast @ 2022-02-09T05:35 (+9)

My take is that LessWrong is best understood as a mix of individual voices, each with their own style and concerns. The approach I'd recommend is to select one writer whose voice you find compelling, and spend some time digesting their ideas. A common refrain is "read the sequences," but that's not where I started. I like John Wentworth's writing.

Alternatively, you might find yourself interested in a particular topic. LessWrong's tags can help you both find an interesting topic and locate relevant posts, though it's not super fine grained or comprehensive.

One of the key sources of value on LessWrong is that it provides a common language for some complex ideas, presented in a relatively fun and accessible format. The combination of all those ideas can elevate thinking, although it's no panacea. My intuition is that it's best to slowly follow your curiosity over a period of a few years, rather than trying to digest the whole thing all at once or picking a couple of highlights.

Jeremy @ 2022-02-11T16:40 (+1)

Any particular Wentworth posts that stand out to you? I'd like to include some in the LCLWBC (full credit to you for the name!), but I am not too familiar.

AllAmericanBreakfast @ 2022-02-11T17:45 (+15)

John had several posts highly ranked in the 2020 LessWrong review, and one in the 2019 LessWrong review, so there's a community consensus that they're good. There was also a 2018 LessWrong review, though John didn't place there.

In general, the review is a great resource for navigating more recent LW content. Although old posts are a community touchstone, the review includes posts that reflect the live interests of community members and that have been extensively vetted, not only for being exciting but for maintaining their value a year later.

Jeremy @ 2022-02-13T13:14 (+1)

Thank you!

RyanCarey @ 2022-02-10T03:37 (+8)

Here are a few from Eliezer:

And a few from others:

I might add more later.

reallyeli @ 2022-02-09T06:35 (+6)

I really like Ends Don't Justify Means (Among Humans) and think it's a bit underrated. (In that I don't hear people reference it much.)

I think I find the lesson generally useful: that in some cases it can be bad for me to "follow consequentialism" (because in some cases I'm an idiot), without consequentialism itself being bad.

antimonyanthony @ 2022-02-10T08:19 (+3)

The noncentral fallacy nicely categorizes a very common source of ethical disagreement in my experience.

[Edit:] Somewhat more niche, but considering how important AI risk is to many EAs, I'd also recommend Against GDP as a metric for timelines and takeoff speeds, for rebutting what is in my estimation a bizarrely common error in forecasting AI takeoff.

rileyharris @ 2022-02-10T20:43 (+1)

Saved these all to Pocket, thanks for the recommendations!