Aaron Gertler's Quick takes

By Aaron Gertler 🔸 @ 2019-11-15T12:40 (+7)

Aaron Gertler @ 2022-04-20T21:57 (+27)

My ratings and reviews of rationalist fiction

I've dedicated far too much time to reading rationalist fiction. This is a list of stories I think are good enough to recommend.

Here's my entire rationalist fiction bookshelf — a mix of works written explicitly within the genre and other works that still seem to belong. (I've written reviews for some, but not all.)

Here are subcategories, with stories ranked in rough order from "incredible" to "good". The stories vary widely in scale, tone, etc., and you should probably just read whatever seems most interesting to you.

If you know of a good rational or rational-adjacent story I'm missing, let me know!

Long stories (rational fiction)

Long stories (not rational fiction, but close)

Short stories and novellas (rational or close)

alexrjl @ 2022-04-21T05:26 (+4)

I'd be keen to hear how you're defining the genre, especially when the author isn't obviously a member of the community. I loved Worm and read it a couple of years ago, at least a year before I was aware rational fiction was a thing, and I don't recall thinking "wow, this seems really rationalist" so much as just "this is fun, words go brrrrrrrr".

Aaron Gertler @ 2022-04-21T17:41 (+7)

I think that "intense, fanatical dedication to worldbuilding" + "tons of good problem-solving from our characters, which we can see from the inside" adds up to ratfic for me, or at least "close to ratfic". Worm delivers both.

alexrjl @ 2022-04-21T20:04 (+2)

Sounds right to me! I'm reading Worth the Candle at the moment :)

JasperGeh @ 2022-05-05T07:09 (+1)

Ah, that makes sense. I absolutely adore Fine Structure and Ra but never considered them ratfic (though I don't know whether Sam Hughes hangs out in rat circles).

EdoArad @ 2022-04-21T10:22 (+2)

I also love Alexander Wales' ongoing This Used To Be About Dungeons

Charles He @ 2022-04-21T04:26 (+2)

Can you give your view why The Dark Forest is an example of near rationalist work?

I guess it shows societal dysfunction, the (extreme) alienness or hostility of reality, and some intense applications of game theory.

I think I want to understand β€œrationality” as much as the book.

Aaron Gertler @ 2022-04-21T17:43 (+9)

The book poses an interesting and difficult problem that characters try to solve in a variety of ways. The solution that actually works involves a bunch of plausible game theory and feels like it establishes a realistic theory of how a populous universe might work. The solutions that don't work are clever, but fail for realistic reasons. 

Aside from the puzzle element of the book, it's not all that close to ratfic, but the puzzle is what compelled me. Certainly arguable whether it belongs in this category.

Rubi @ 2022-05-02T07:46 (+1)

I like many books on the list, but I think you're doing a disservice by trying to recommend too many books at once. If you can cut it down to 2-3 in each category, that gives people a better starting point.

Aaron Gertler @ 2022-05-02T10:36 (+5)

If you want recommendations, just take the first couple of items in each category. They are ranked in order of how good I think they are. (That's if you trust my taste — I think most people are better off just skimming the story summaries and picking up whatever sounds interesting to them.)

Rubi @ 2022-05-03T20:02 (+1)

Cool, thanks!

Aaron Gertler @ 2021-03-09T10:32 (+24)

On worrying too much about the impact of a certain EA Forum post:

I am the Forum's lead moderator, but I don't mean to write this with my "mod" hat on — these are personal concerns, from someone who cares a lot about this space.

Michael Aird recently published a list of EA-related books he'd read

Some people thought this could have bad side effects. The phrase "echo chamber" came up multiple times:

These titles should not become even more canonical in the EA community than they already are (I fear this might lead to an echo chamber)

And:

I agree with Hauke that this risks increasing the extent to which EA is an echo chamber.

I could appreciate some commenters' object-level concerns about the list (e.g. the authors weren't very diverse). But the "echo chamber" concern felt... way off.

Reasons I don't think this concern made much sense:

If this post has any impact on EA as an entire movement, I'd guess that impact will be... minimal, so minimal as to be nigh-untraceable.

Meanwhile, it seems like it could introduce a few people to books they'll find useful — a positive development, and a good reason to share a book list!

*****

More broadly: 

There is a sense in which every Forum post plays a role in shaping the culture of EA. But I think that almost every post plays a very small role. 

I often hear from people who are anxious about sharing their views because they're afraid that they'll somehow harm EA culture in a way they can't anticipate. 

I hear this more frequently from authors in groups that are already underrepresented in EA, which makes me especially nervous about the message spreading further.

While concerns about cultural shifts are sometimes warranted, I think they are sometimes brought up in cases where they don't apply. They seem especially misplaced when an author isn't making a claim about EA culture and is instead sharing a personal experience.

I'd like EA's culture to be open and resilient — capable of considering and incorporating new ideas without suffering permanent damage. The EA Forum should, with rare exceptions, be a place where people can share their thoughts, and discussion can help us make progress.

Downvoting and critical comments have a useful role in this discussion. But this specific type of criticism — "you shouldn't have shared this at all, because it might have some tiny negative impact on EA culture" — often feels like it cuts against openness and resilience.

MichaelA @ 2021-03-10T09:17 (+5)

I think these are good points, and the points made in the second half of your shortform are things I hadn't considered.

The rest of this comment says relatively unimportant things which relate only to the first half of your shortform.

---

If people read only this shortform, without reading my post or the comments there, there are two things I think they should know to help explain the perspective of the critical commenter(s):

  • It wasn't just "a list of EA-related books [I'd] read" but a numbered, ranked list.
    • That seems more able to produce echo-chamber-like effects than merely a list, or a list with my reviews/commentary but without a numbered ranking
      • See also 80,000 Hours' discussion of their observation that people have sometimes overly focused on the handful of priority paths 80,000 Hours explicitly highlights, relative to figuring out additional paths using the principles and methodologies 80,000 Hours uses
        • I don't immediately recall the best link for this, but can find one if someone is interested
  • "the authors weren't very [demographically] diverse" seems like an understatement; in fact, all 50+ of them (some books had coauthors) were male, and I think all were white and from WEIRD societies. 
    • And I hadn't explicitly noticed that before the commenters pointed that out
    • I add "demographically" because I think there's a substantial amount of diversity among the authors in terms of things like worldviews, but that's not the focus for this specific conversation

I think it's also worth highlighting that a numbered list has a certain attention-grabbing, clickbait-y quality, which I think slightly increases the "risk" of it having undue influence.

All that said, I do agree with you that the post (a) seems less likely to be remembered and have as large an influence as a couple of critical commenters seemed to expect, and (b) seems less likely to cause a net increase in ideological homogeneity (or things like that) than those commenters seemed to expect.

MichaelA @ 2021-03-10T09:24 (+4)

[Some additional, even less important remarks:]

I don't know how much karma it had when the "echo chamber" comments were made, but it finished with 70 (as I write this), outside the top 10 posts for February.

Interestingly, the strong downvote was one of the first handful of votes on the post (so it was at relatively low karma then), and the comment came around then. Though I guess what was more relevant is how much karma/attention it'd ultimately get. But even then, I think the best guess at that point would've been something like 30-90 karma (based in part on only 1 of my previous posts exceeding 90 karma).

If Will MacAskill published a list of book ratings, I could understand this kind of concern (though I still think we should generally trust people not to immediately adopt Will's opinions). Michael Aird is a fantastic Forum contributor, but he doesn't have the same kind of influence.

Fingers crossed I'll someday reach Will's heights of community-destruction powers! (Using them only for good-as-defined-unilaterally-by-me, of course.)

Aaron Gertler @ 2021-06-21T04:37 (+22)

New EA music

José Gonzalez (GWWC member, EA Global performer, winner of a Swedish Grammy award) just released a new song inspired by EA and (maybe?) The Precipice.

Lyrics include:

Speak up
Stand down
Pick your battles
Look around
Reflect
Update
Pause your intuitions and deal with it

It's not as direct as the songs in the Rationalist Solstice, but it's more explicitly EA-vibey than anything I can remember from his (apparently) Peter Singer-inspired 2007 album, In Our Nature.

KarolinaSarek @ 2021-06-21T15:32 (+8)

"Visions" - another song he released in 2021 gives me very strong EA vibes. Lyrics include:

Visions
Imagining the worlds that could be
Shaping a mosaic of fates
For all sentient beings
[...]

Visions
Avoidable suffering and pain
We are patiently inching our way
Toward unreachable utopias

Visions
Enslaved by the forces of nature
Elevated by mindless replicators
Challenged to steer our collective destiny

Visions
Look at the magic of reality
While accepting with all honesty
That we can't know for sure what's next

No, we can't know for sure what's next
But that we're in this together
We are here together

Aaron Gertler @ 2022-05-20T09:04 (+14)

Memories from starting a college group in 2014

In August 2014, I co-founded Yale EA (alongside Tammy Pham). Things have changed a lot in community-building since then, and I figured it would be good to record my memories of that time before they drift away completely.

If you read this and have questions, please ask!

 

Timeline

I was a senior in 2014, and I'd been talking to friends about EA for years by then. Enough of them were interested (or just nice) that I got a good group together for an initial meeting, and a few agreed to stick around and help me recruit at our activities fair. One or two of them read LessWrong, and aside from those, no one had heard of effective altruism.

The group wound up composed largely of a few seniors and a bigger group of freshmen (who then had to take over the next year — not easy!). We had 8-10 people at an average meeting.

Events we ran that first year included:

We also ran some projects, most of which failed entirely:

 

What it was like to run a group in 2014: Random notes

 

But mostly, it was really hard

The current intro fellowships aren't perfect, and the funding debate is real/important, but oh god, things are so much better for group organizers than they were in 2014.

I had no idea what I was doing. 

There were no reading lists, no fellowship curricula, no facilitator guides, no nothing. I had a Google doc full of links to favorite articles and sometimes I asked people to read them.

I remember being deeply anxious before every meeting, event, and email send, because I was improvising everything and barely knew what we were supposed to be doing (direct impact? Securing pledges? Talking about cool blogs?).

Lots of people came to one or two meetings, saw how chaotic things were, and never came back. (I smile a bit when I see people complaining that modern groups come off as too polished and professional — that's not great, but it beats the alternative.)

I looked at my journal to see if the anxious memories were exaggerated. They were not. Just reading them makes me anxious all over again.

But that only makes it sweeter that Yale's group is now thriving, and that EA has outgrown the "students flailing around at random" model of community growth.

Aaron Gertler @ 2020-06-25T22:53 (+13)

Excerpt from a Twitter thread about the Scott Alexander doxxing situation, but also about the power of online intellectual communities in general:

I found SlateStarCodex in 2015. Immediately afterwards, I got involved in some of the little splinter communities online that had developed after LessWrong started to disperse. I don't think it's exaggerating to say it saved my life.

I may have found my way on my own eventually, but the path was eased immensely by LW/SSC. In 2015 I was coming out of my only serious suicidal episode; I was in an unhappy marriage, in a town where I knew hardly anyone; I had failed out of my engineering program six months prior.

I had been peripherally aware of LW through a few fanfic pieces, and was directed to SSC via the LessWrong comments section.

It was the most intimidating community of people I had ever encountered -- I didn't think I could keep up. 

But eventually, I realized that not only was this the first group of people who made me feel like I had come *home,* but that it was also one of the most welcoming places I'd ever been (IRL or virtual).

I joined a Slack, joined "rationalist" tumblr, and made a few comments on LW and SSC. Within a few months, I had *friends*, some of whom I would eventually count among those I love the most.

This is a community that takes ideas seriously (even when it would be better for their sanity to disengage).

This is a community that thinks everyone who can engage with them in sincere good faith might have something useful to say.

This is a community that saw someone writing long, in-depth critiques on the material produced on or adjacent to LW/SSC...and decided that meant he was a friend. 

I have no prestigious credentials to speak of. I had no connections, no high-paying job; I was a college dropout with no particular expertise, a lower-class background compared to many of the people I met, and a Red-Tribe-Evangelical upbringing. And all I had to do to make these new friends was show up and join the conversation.

[...]

The "weakness" of the LessWrong/SSC community is also its strength: putting up with people they disagree with far longer than they have to. Of course terrible people slip through. They do in every group -- ours are just significantly more verbose.

But this is a community full of people who mostly just want to get things *right,* become *better people,* and turn over every single rock they see in the process of finding ways to be more correct -- not every person and not all the time, but more than I've seen everywhere else.

The transhumanist background that runs through the history of LW/SSC also means that trans people are more accepted here than anywhere else I've seen, because part of that ideological influence is the belief that everyone should be able to have the body they want.

It is not by accident that this loosely-associated cluster of bloggers, weird nerds, and twitter shitposters were ahead of the game on coronavirus. It's because they were watching, and thinking, and paying attention and listening to things that sound crazy... just in case.

There is a 2-part lesson this community held to, even while the rest of the world was forgetting it:

  • You can't prohibit dissent
  • It's sometimes worth it to engage someone when they have icky-sounding ideas

It was unpopular six months ago to think COVID might be a big deal; the SSC/LW diaspora paid attention anyways.

You can refuse to hang out with someone at a party. You can tell your friends they suck. But you can't prohibit them from speaking *merely because their ideas make you uncomfortable* and there is value in engaging with dissent, with ideas that are taboo in Current Year.

(I'm not leaving a link or username, as this person's Tweets are protected.)

aarongertler @ 2020-02-06T02:58 (+13)

Another brief note on usernames:

Epistemic status: Moderately confident that this is mildly valuable

It's totally fine to use a pseudonym on the Forum. 

However, if you chose a pseudonym for a reason other than "I actively want to not be identifiable" (e.g. "I copied over my Reddit username without giving it too much thought"), I recommend using your real name on the Forum.

If you want to change your name, just PM or email me (aaron.gertler@centreforeffectivealtruism.org) with your current username and the one you'd like to use.

Reasons to do this:

Some of these reasons won't apply if you have a well-known pseudonym you've used for a while, but I still think using a real name is worth considering.

Aaron Gertler @ 2022-04-24T07:31 (+11)

Memories from running a corporate EA group

From August 2015 - October 2016, I ran an effective altruism group at Epic, a large medical software corporation in Wisconsin. Things have changed a lot in community-building since then, but I figured it would be good to record my memories of that time, and what I learned.

If you read this and have questions, please ask!

Launching the group

Running the group

Reflections // lessons learned

akrolsmir @ 2022-04-24T08:19 (+3)

Thank you for writing this up! This is a kind of experience I haven't seen expressed much on the forum, so found extra valuable to read about.

Curious, any reason it's not a top level post?

Aaron Gertler @ 2022-04-25T01:09 (+3)

The group was small and didn't accomplish much, and this was a long time ago. I don't think the post would be interesting to many people, but I'm glad you enjoyed reading it!

Aaron Gertler @ 2020-06-23T07:33 (+10)

I recommend placing questions for readers in comments after your posts.

If you want people to discuss/provide feedback on something you've written, it helps to let them know what types of discussion/feedback you are looking for.

If you do this through a bunch of scattered questions/notes in your post, any would-be respondent has to either remember what you wanted or read through the post again after they've finished.

If you do this with a list of questions at the end of the post, respondents will remember them better, but will still have to quote the right question in each response. They might also respond to a bunch of questions with a single comment, creating an awkward multi-threaded conversation.

If you do this with a set of comments on your own post -- one for each question/request -- you let respondents easily see what you want and discuss each point separately. No awkward multi-threading for you! This seems like the best method to me in most cases.

Aaron Gertler @ 2022-04-08T02:21 (+9)

Advice on looking for a writing coach

I shared this with someone who asked for my advice on finding someone to help them improve their writing. It's brief, but I may add to it later.

I think you'll want someone who:

* Writes in a way that you want to imitate. Some very skilled editors/teachers can work in a variety of styles, but I expect that most will make your writing sound somewhat more like theirs when they provide feedback.

* Catches the kinds of things you wish you could catch in your writing. For example, if you want to try someone out, you could ask them to read over something you wrote a while back, then read over it yourself at the same time and see how their fixes compare to your own regrets. 

* Has happy clients — ask for references. It's easy for someone to be a good writer but a lousy editor in any number of ways (too harsh, too easygoing, bad at explaining their feedback, disorganized and bad at follow-up...).

aarongertler @ 2019-11-15T12:40 (+8)

Quick PSA: If you have an ad-blocking extension turned on while you browse the Forum, it very likely means that your views aren't showing up in our Google Analytics data. 

That's not something we care too much about, but it does make our ideas about how many users the Forum has, and what they like to read, slightly less accurate. Consider turning off your adblocker for our domain if you'd like to do us a tiny favor.

Larks @ 2019-11-15T14:50 (+5)

Done.

Aaron Gertler @ 2021-07-18T21:27 (+4)

I enjoyed learning about the Henry Spira award. It is given by the Johns Hopkins School of Public Health to "honor animal activists in the animal welfare, protection, or rights movements who work to achieve progress through dialogue and collaboration."

The criteria for the award are based on Peter Singer's summary of the methods Spira used in his own advocacy. Many of them seem like strong guiding principles for EA work in general:

  1. Understands public opinion, and what people outside of the animal rights/welfare movement are thinking.
  2. Selects a course of action based upon public opinion, intensity of animal suffering, and the opportunities for change.
  3. Sets goals that are achievable and that go beyond raising public awareness. Is willing to bring about meaningful change one step at a time.
  4. Is absolutely credible. Doesn't rely on exaggeration or hype to persuade.
  5. Is willing to work with anyone to make progress. Doesn't "divide the world into saints and sinners."
  6. Seeks dialogue and offers realistic solutions to problems.
  7. Is courageous enough to be confrontational if attempts at dialogue and collaboration fail.
  8. Avoids bureaucracy and empire-building.
  9. Tries to solve problems without going through the legal system, except as a last resort.
  10. Asks "Will it work?" whenever planning a course of action. Stays focused on the practical realities involved in change, as well as ethical and moral imperatives.

See the bottom of this page for a list of winners (most recently in 2011; I don't know whether the award is still "active").

Aaron Gertler @ 2021-07-05T09:07 (+4)

Tiny new feature notice: We now list all the comments marked as "moderator comments" on this page

If you see a comment that you think we should include, but it isn't highlighted as a moderator comment yet, please let me know.

aarongertler @ 2020-02-06T02:54 (+3)

Brief note on usernames:

Epistemic status: Kidding around, but also serious

If you want to create an account without using your name, I recommend choosing a distinctive username that people can easily refer to, rather than some variant on "anonymous_user".

Among usernames with 50+ karma on the Forum, we have:

I'm pretty sure I've seen at least one comment back-and-forth between two accounts with this kind of name. It's a bit much :-P
 

Linch @ 2020-02-06T02:58 (+2)

One possibility is to do what Google Docs does and pick an animal at near-random (ideally a memorable one), and be AnonymousMouse, AnonymousDog, AnonymousNakedMoleRat, etc.

aarongertler @ 2020-02-06T02:59 (+2)

This feels like the next-worst option to me. I think I'd find it easier to remember whether "fluttershy_forever" or "UtilityMonster" said something than to remember whether "AnonymousDog" or "AnonymousMouse" said it.

Linch @ 2020-02-06T03:13 (+8)

I think this makes it clear that people are deliberately being anonymous rather than carrying over old internet habits.

Also I think there's a possibility of information leakage if someone tries to be too cutesy with their pseudonyms. Eg, fluttershy_forever might lead someone to look for similar names in the My Little Pony forums, say, where the user might be more willing to "out" themselves than if they were writing a critical piece on the EA forum.

This is even more true for narrower interests. My Dominion username is a Kafka+Murakami reference, for example.

There's also a possibility of doxing in the other direction, where eg, someone may not want their EA Forum opinions to be associated with bad fanfiction they wrote when they were 16.

aarongertler @ 2020-02-06T12:43 (+2)

The deliberate anonymity point is a good one. The ideal would be a distinct anonymous username the person doesn't use elsewhere, but this particular issue isn't very important in any case.

jpaddison @ 2020-02-06T18:29 (+1)

You could write that the username is deliberately anonymous in your Forum bio.

Aaron Gertler @ 2021-05-20T02:27 (+2)

Subnormality, my favorite webcomic, on collective effort:

Spoken by the immortal Sphinx, a voice of wisdom in the series.

"I'd say you have only to do what's within your power as one person. I suppose it's probably harder to conceptualize when you haven't literally watched pyramids being constructed, for instance, but nevertheless β€” there's a collective imagination that is gradually served by individual efforts. Great works that are quite impossible by yourself, but also impossible without you.

"As people go, so do societies. Like anything meaningful, it may not always feel like much in the moment, but then suddenly you've got the future on your hands! And you'll never have as much as in the future, but by extension you'll always have more than in the past, in some regard at least. More to build on, more examples, more... pyramids, certainly."

aarongertler @ 2019-12-27T02:24 (+2)

I do a lot of cross-posting because of my role at CEA. I've noticed that this racks up a lot of karma that feels "undeserved" because of the automatic strong upvotes that get applied to posts. From now on (if I remember; feel free to remind me!), I'll be downgrading the automatic strong upvotes to weak upvotes. I'm not canceling the votes entirely because I'm guessing that at least a few people skip over posts that have zero karma.

This could be a bad idea for reasons I haven't thought of yet, and I'd welcome any feedback.

Stefan_Schubert @ 2019-12-27T10:32 (+12)

Could the option to strongly upvote one's own comments (and posts, in case you remove the automatic strong upvotes on posts) be disabled, as discussed here? Thanks.

jpaddison @ 2019-12-27T05:01 (+8)

Just to clarify, you know your self-votes don't get you karma?

aarongertler @ 2019-12-27T20:58 (+2)

...nope. That's good to know, thanks! Given that, I don't think I'll bother to un-strong-upvote myself.