Where would I find the hardcore totalizing segment of EA?

By Peter Berggren @ 2023-12-28T09:16 (+16)

Basically what it says on the tin. I have this psychological need to find a really intense structured organization to help me accomplish what I want in life (most importantly, saving the world), and EA organizations are natural candidates for this. However, most of the large ones I've found display too much "performative normalcy" and aren't really willing to be as hardcore as I want and need.

Any recommendations on where to find a hardcore totalizing community that can inject more structure into my life so I'm better equipped to save the world? I'm living in Boston for the next two years or so, so anything that requires moving somewhere else won't work, but other than that, all kinds of ideas are welcome.


RomanHauksson @ 2023-12-28T09:55 (+17)

Peter Berggren @ 2023-12-28T16:09 (+6)

Thanks for the advice. I was more wondering if there was some specific organization known to provide that sort of environment and fairly universally recognized as, e.g., “the Navy SEALs of EA” in terms of intensity, but this broader advice sounds good too.

Conor Barnes @ 2023-12-28T14:41 (+9)

I think this is a joke, but for those who have less-explicit feelings in this direction:

I strongly encourage you not to join a totalizing community. Totalizing communities are often quite harmful to their members, and being in one makes it hard to reason well. Insofar as an EA org is a hardcore totalizing community, it is doing something wrong.

Peter Berggren @ 2023-12-28T15:56 (+6)

This was semi-serious, and maybe “totalizing” was the wrong word for what I was trying to say. Maybe the word I more meant was “intense” or “serious.”

CLARIFICATION: My broader sentiment was serious, but my phrasing was somewhat exaggerated to get my point across.

Karthik Tadepalli @ 2023-12-28T11:55 (+6)

What you're asking for sounds risky; see here for a reflection from a former "hardcore" EA. I also imagine there aren't many really hardcore segments after the fall of Leverage Research, but I have no particular insight into that.

Peter Berggren @ 2023-12-28T16:15 (+1)

Thanks for the reflection.

I’ve read about Leverage, and it seems like people are unfairly hard on it. They’re the ones who basically started EA Global, and people don’t give them enough credit for that. And honestly, even after what I’ve read about them, their work environment still sounds better to me than a supposedly “normal” one.

RyanCarey @ 2023-12-30T19:16 (+3)

Yes, they were involved in the first, small, iteration of EAG, but their contributions were small compared to the human capital that they consumed. More importantly, they were a high-demand group that caused a lot of people serious psychological damage. For many, it has taken years to recover a sense of normality. They staged a partial takeover of some major EA institutions. They also gaslit the EA community about what they were doing, which confused and distracted decent-sized subsections of the EA community for years.

I watched The Master a couple of months ago and found it to be a simultaneously compelling and moving description of the experience of cult membership, and I would recommend it.

Habryka @ 2023-12-30T19:46 (+4)

Yes, they were involved in the first, small, iteration of EAG

I agree with the broad gist of this comment, but I think this specific sentence heavily undersells Leverage's involvement. They ran the first two EA Summits, and were also heavily involved with the first two full EA Globals (which I was officially in charge of, so I would know).

Chris Leong @ 2023-12-29T22:50 (+3)

Sorry, I know you said you're stuck in Boston, but tbh you're most likely to find like-minded people in the Bay Area[1]. Even if you're stuck in Boston for now, perhaps it'd be possible for you to visit at some point?

Just to echo other commenters: this is something to be very careful with. Even if you're certain that you want an intense environment, many people who say they want the same turn out not to be the kind of person who thrives in one.

  1. ^

I've heard that EAs in the Bay Area are more intense than EAs elsewhere. I suspect this is a result of people who are serious about AI Safety moving to the Bay Area, which probably affects the culture in general.

Peter Berggren @ 2023-12-30T22:32 (+1)

Additionally, I wonder why there hasn't been an effort to start a more "intense" EA hub somewhere outside the Bay to save on rent and office costs. Seems like we've been writing about coordination problems for quite some time; let's go and solve one.

RyanCarey @ 2024-01-03T11:15 (+13)

There is an "EA Hotel", which is decently-sized, very intensely EA, and very cheap.

Occasionally it makes sense for people to accept very low cost-of-living situations. But a person's impact is usually a lot higher than their salary. Suppose that a person's salary is x, their impact 10x, and their impact is 1.1 times higher when they live in SF, due to proximity to funders and AI companies. Then you would have to cut costs by 90% to make it worthwhile to live elsewhere. Otherwise, you would essentially be stepping over dollars to pick up dimes.
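To spell out the arithmetic behind that 90% figure (on the additional assumption, not stated above, that annual cost of living is roughly comparable to the salary x):

\[
\text{impact forgone by leaving SF} = 10x - \frac{10x}{1.1} \approx 0.91x
\]
\[
\text{break-even cost cut} \approx \frac{0.91x}{x} \approx 90\%
\]

That is, the impact sacrificed by moving away is nearly a full salary's worth, so the move only pays off if it eliminates almost all of a salary-sized cost of living.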

Chris Leong @ 2024-01-03T11:40 (+3)

One advantage of the EA Hotel over, say, a grant is that its selection effects are surprisingly strong. This can help resolve some of the challenges of evaluation.

Chris Leong @ 2023-12-31T04:35 (+2)

There have been attempts:

Coordination of Rationality/EA/SSC Housing Projects
New EA Hub Search and Planning

Unfortunately, they haven't gotten anywhere. If you think you can solve the problem, then go for it! But keep in mind that people have tried this in the past and failed.

Peter Berggren @ 2023-12-30T18:28 (+1)

Thanks for the advice. To be clear, I'm not certain that a hardcore environment would be the best environment for me either, but it seems worth a shot. And judging by how people's involvement in EA tends to change as they get older, I'll probably only be as hardcore as this for like ten years.