My tentative best guess on how EAs and Rationalists sometimes turn crazy

By Habryka @ 2023-06-21T04:11 (+160)

This is a crosspost, probably from LessWrong. Try viewing it there.


titotal @ 2023-06-21T09:27 (+94)

My theory is that while EA/rationalism is not a cult, it contains enough ingredients of a cult that it’s relatively easy for someone to go off and make their own. 

Not everyone follows every ingredient, and many of the ingredients are actually correct/good, but here are some examples:

These ingredients do not make EA/rationalism in general a cult, because it lacks enforced conformity and control by a leader. Plenty of people, including myself, have posted on LessWrong critiquing the sequences and Yudkowsky and been massively upvoted for it. It's decentralised across the internet; if someone wants to leave, there's nothing stopping them.

However, what seems to have happened is that multiple people have taken these base ingredients and just added in the conformity and charismatic leader parts. You put these ingredients in a small company or a group house, put an unethical or mentally unwell leader in charge, and you have everything you need for an abusive cult environment. Now it's far more difficult to leave, because your housing/income is on the line, and the leader can use the already established breaking of social norms as an excuse to push the boundaries of consent in the name of the greater good. This seems to have happened multiple times already.

I don't know what to do if this theory is correct, besides applying extra scrutiny to leaders of sub-groups within EA, and maybe easing up on unnecessary rituals and jargon.

Habryka @ 2023-06-21T16:41 (+37)

For what it's worth, I view the central aim of this post as arguing against the "cult" model, which I find quite unhelpful.

In my experience the cult model tends to have relatively few mechanistic parts, and mostly seems to put people into some kind of ingroup/outgroup mode of thinking where people make a list of traits, and then somehow magically as something has more of those traits, it gets "worse and scarier" on some kind of generic dimension.

Like, FTX just wasn't that much of a cult. Early CEA just wasn't that much of a cult by this definition. I still think they caused enormous harm.

I think breaking things down into "conformity + insecurity + novelty" is much more helpful in understanding what is actually going on. 

For example, I really don't think "rituals and jargon" have much to do with what drives people to do more crazy things. Basically all religions have rituals and tons of jargon! So does academia. Those things are not remotely reliable indicators, and as far as I can tell are not even correlated at all.

John G. Halstead @ 2023-06-22T13:34 (+41)

Could you elaborate on how early CEA caused enormous harm? I'm interested to hear your thoughts. 

Habryka @ 2023-06-24T19:41 (+6)

I would like to give this a proper treatment, though that would really take a long time. I think I've written about this some in the past in my comments, and I will try to dig them up in the next few days (though if someone else remembers which comments those were, I would appreciate a link).

(Context for other readers: I worked at CEA in 2015 and 2016, running both EAG 2015 and EAG 2016.)

Linch @ 2023-08-29T02:52 (+11)

Did you have a chance to look at your old comments, btw?

Habryka @ 2023-08-29T05:16 (+2)

I searched for like 5-10 minutes but didn't find them. Haven't gotten around to searching again. 

titotal @ 2023-06-22T11:18 (+17)

Unfortunately, I think there is a correlation between traits and rituals that are seen as "weird" and outbreaks of this bad behaviour.

For example, it seems like far more of these incidents are occurring in the rationalist community than in the EA community, despite the overlap between the two. It also seems like way more is happening in the Bay Area than in other places.

I don't think your model sufficiently explains why this would be the case, whereas my theory posits the explanation that the Bay Area rats have a much greater density of "cult ingredients" than, say, the EA meetup at some university in Belgium or wherever, where people have lives and friends outside of EA, don't fully buy into the worldview, etc.

I'm not trying to throw the word "cult" out as a pejorative conversation-ender. I don't think Leverage was a full-on religious cult, but it had enough in common with one that it ended up with the same harmful effects on people. I think the rationalist worldview leads to an increased ease of forming these sorts of harmful groups, which does not mean the worldview is inherently wrong or bad, just that you need to be careful.

David_Moss @ 2023-06-23T11:01 (+4)

For example, it seems like far more of these incidents are occurring in the rationalist community than in the EA community, despite the overlap between the two. It also seems like way more is happening in the Bay Area than in other places.

I don't think your model sufficiently explains why this would be the case

 

One explanation could simply be that social pressure in non-rationalist EA mostly pushes people towards "boring" practices rather than "extreme" practices. 

Ozzie Gooen @ 2023-06-22T02:06 (+9)

I think cults are a useful lens to understand other social institutions, including EA. 

I don't like most "is a cult"/"is not a cult" binaries. There's often a wide gradient.

To me, lots of academia scores highly on several cult-like markers. Political parties can score very highly. (Some of the MAGA/Proud Boys stuff comes to mind.)

The inner circle of FTX seemed to have some cult-like behaviors, though to me, that's also true for a lot of startups and intense situations.

I did like this test, where you can grade groups on a scale to see how "cult-like" they are.
http://www.neopagan.net/ABCDEF.html

All that said, I do of course get pretty annoyed when people criticize EA as being a cult, where I often get the sense that they treat it as a binary, and are also implying something that I don't believe.

Michael_PJ @ 2023-06-21T16:38 (+28)

I think another contributing factor is that as a community we a) prize (and enjoy!) unconventional thinking, and b) have norms that well-presented ideas should be discussed on their merits, even if they seem strange. That means that even the non-insane members of the community often say things that sound insane from a common-sense perspective, and that many insane ideas are presented at length and discussed seriously by the community.

I think experiencing enough of this can disable your "insane idea alarm". Normally, if someone says something that sounds insane, you go "this person is insane" and run away. But in the EA/rationalist community you instead go "hm, maybe this is actually a great but out-of-the-box idea?".

And then the bad state is that someone says something like "you should all come and live in a house where I am the ruler and I impose quasi-military discipline on you" and you go "hm, maybe this is an innovative productivity technique for generating world-saving teams" instead of immediately running away screaming.

ChanaMessinger @ 2023-06-22T13:48 (+5)

I strongly resonate with this; I think this dynamic also selects for people who are open-minded in a particular way (which I broadly think is great!), so you're going to get more of it than usual.

John G. Halstead @ 2023-06-22T13:33 (+25)

Thanks for this, I agree with almost all of it. Just one observation, which I think is important. You say "Now one might think that because we have a lot of smart people, we might be able to avoid the worst outcomes here, by just not enforcing extreme standards that seem pretty crazy."

I think the main factor that could contain craziness is social judgment rather than intelligence. So, I think the prevalence of autism in the community means that EAs will be less able than other social groups to contain craziness.

Vasco Grilo @ 2023-06-22T18:00 (+10)

Hi John,

Is there concrete data on the prevalence of autism in the EA community?

David_Moss @ 2023-06-23T13:03 (+22)

The 2020 SlateStarCodex survey has some data on this. Obviously it is quite limited by the fact that the sample is pre-selected for the (rationalist-leaning) SSC audience already, which you might also expect to be associated with autism.

This survey asked about identification as EA (No/Sorta/Yes) and autism (I don't have this condition and neither does anyone in my family/I have family members (within two generations) with this condition/I think I might have this condition, although I have never been formally diagnosed/I have a formal diagnosis of this condition). Unfortunately, these leave a lot of researcher degrees of freedom (e.g. what to do with the "Sorta" and the "never been formally diagnosed" respondents).

Just looking at the two raw measures straightforwardly, we see no significant differences, though you can see that there are more people with a formal diagnosis in the "Yes" category. 

I'd be wary of p-hacking from here, though fwiw, with different, more focused analyses, the results are borderline significant (both sides of the borderline), e.g. looking at whether someone said "Yes" they are an EA crossed with whether they are formally diagnosed with autism, there is a significant association (p=0.022).
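For anyone who wants to run this kind of cross-tabulation check on their own data, here is a minimal sketch of a chi-squared test of independence via scipy; the counts below are invented for illustration and are not the actual SSC survey figures.

```python
# Minimal sketch of a chi-squared test of independence on a 2x2
# cross-tabulation (EA identification x formal autism diagnosis).
# The counts below are hypothetical, NOT the real SSC survey data.
from scipy.stats import chi2_contingency

table = [
    [40, 460],    # hypothetical: "Yes, EA" respondents (diagnosed, not diagnosed)
    [120, 2380],  # hypothetical: everyone else (diagnosed, not diagnosed)
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```

With the real survey data, this kind of call is what would produce a p-value like the 0.022 quoted above; how you bin the "Sorta" and "never formally diagnosed" respondents is exactly the researcher-degrees-of-freedom issue mentioned earlier.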

I have previously considered whether it would be worth including a question about this in the EAS, but it seems quite sensitive and relatively low value given our high space constraints.

John G. Halstead @ 2023-06-22T19:33 (+8)

Not to my knowledge, though I think it's pretty clear

Dr. David Mathers @ 2023-06-23T11:31 (+5)

This sounds kind of plausible to me, but couldn't you equally say that you'd expect autism to be a mitigating factor against cultiness, because cults are about conformity to group shibboleths for social reasons, which autistic people are better at avoiding? (At least, I'd have thought we are. Maybe only I have that perception?) That kind of makes me think that it is actually just easy to generate plausible-sounding hypotheses about the effects of a fairly broad and nebulous thing like "autistic traits", and maybe none of them should be taken that seriously without statistical evidence to back them up.

Dr. David Mathers @ 2023-06-21T08:58 (+21)

I think this sounds sensible in general, but FTX might have just been greed. Probably only a handful of people knew about the fraud, after all.

NickLaing @ 2023-06-21T11:18 (+2)

Yep I think this is most likely

ChanaMessinger @ 2023-06-22T13:53 (+20)

I'd be interested in more thoughts, if you have them, on evidence or predictions one could have made ahead of time that would distinguish this model from others (e.g. maybe a lot of what's going on is youth and will evaporate over time; youth would still have to be mediated by things like what you describe, but that's one example).

Also, my understanding is that SBF wasn't very insecure? Does that affect your model or is the point that the leader / norm setter doesn't have to be?

Aaron Bergman @ 2023-06-24T01:24 (+14)

Seems like the forces that turn people crazy are the same ones that lead people to do anything good and interesting at all. At least for EA, a core function of orgs/elites/high-status community members is to make the kind of signaling you describe highly correlated with actually doing good. Of course it seems impossible to make them correlate perfectly, and that's why settings with super high social optimization pressure (like FTX) are gonna be bad regardless.

But (again for EA specifically) I suspect the forces you describe would actually be good to increase on the margin for people not living in Berkeley and/or in a group house, which is probably a majority of self-identified EAs but a strong minority of the people-hours OP interacts with irl.

Jobst Heitzig (vodle.it) @ 2023-09-13T05:45 (+2)

The "impossible to correlate perfectly" piece is like in AI alignment, where one could also argue that perfect alignment of a reward function to the "true" utility function is impossible.

Indeed, one might even argue that the joint cognition implemented by the EA/rationality/x-risk community as a whole is a form of "artificial" intelligence, let's call it "EI" and thus we face an "EI alignment" problem. As EA becomes more powerful in the world, we get "ESI" (effective altruism superhuman intelligence) and related risks from misaligned ESI.

The obvious solution in my opinion is the same for AI and EI: don't maximize, since the metric you might aim to maximize is most likely imperfectly aligned with true utility. Rather satisfice: be ambitious, but not infinitely so. After reaching an ambitious goal, check if your reward function still makes sense before setting the next, more ambitious goal. And have some human users constantly verify your reward function :-)
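As a toy illustration of the maximize-vs-satisfice distinction (a hypothetical sketch, not anyone's actual proposal): a maximizer always chases the argmax of its estimated reward, while a satisficer accepts any option that clears an aspiration threshold and then pauses, leaving room for the "check your reward function" step before the threshold is raised.

```python
# Toy sketch contrasting maximizing with satisficing over an estimated
# reward function. All names and numbers here are hypothetical.
from typing import Callable, Optional, Sequence


def maximize(options: Sequence[str], reward: Callable[[str], float]) -> str:
    # Always chase the single highest-scoring option, however extreme.
    return max(options, key=reward)


def satisfice(options: Sequence[str], reward: Callable[[str], float],
              aspiration: float) -> Optional[str]:
    # Accept the first option that is "good enough"; if none qualifies,
    # return None instead of escalating the search.
    for option in options:
        if reward(option) >= aspiration:
            return option
    return None


plans = ["modest plan", "ambitious plan", "extreme plan"]
estimated_reward = {"modest plan": 0.4, "ambitious plan": 0.7, "extreme plan": 0.95}

print(maximize(plans, lambda p: estimated_reward[p]))        # "extreme plan"
print(satisfice(plans, lambda p: estimated_reward[p], 0.6))  # "ambitious plan"
# Between satisficed goals, a human would re-examine whether estimated_reward
# still tracks what they actually value before raising the aspiration level.
```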

RyanCarey @ 2023-06-21T12:36 (+14)

This could explain the psychology of followers, but I'm not sure it speaks to the psychology of people who lead the cults, and how we can and should deal with these people.

Habryka @ 2023-06-21T16:41 (+18)

My current model is that the "leaders" are mostly the result of the same kind of dynamic, though I really don't feel confident in this. 

At least in my experience of being in leadership positions, conformity pressures are still enormously huge (and indeed often greater than in less of a leadership position, since people apply a lot more pressure to what you believe and care a lot more about shaping your behavior).

Indeed, I would say that one of the primary predictions of my model is that solo-extremists are extremely rare, and small groups of extremists are much more common. 

Karl von Wendt @ 2023-06-22T12:41 (+10)

I think a major contributing factor in almost all extremely unhealthy group dynamics is narcissism. Pathological narcissists have a high incentive to use the dynamics you describe to their advantage and are (as I have heard and have some anecdotal evidence for, but can't currently back up with facts) often found in leading positions in charities, religious cults, and all kinds of organizations. Since they hunger for being admired and are very good at manipulating people, they are attracted to charities and altruistic movements. Of course, a pathological narcissist is selfish by definition and completely unable to feel anything like compassion, so they are the antithesis of an altruist.

The problem is made worse because there are two types of narcissists: "grandiose" types who are easily recognizable, like Trump, and "covert" ones who are almost impossible to detect because they are often extremely skilled at gaslighting. My mother-in-law is of the latter type; it took me more than 40 years to figure that out and even realize that my wife had a traumatic childhood (she didn't realize this herself all that time, she just thought something was wrong with her). About 1-6% of adults are assumed to be narcissists, with the really dangerous types probably making up only a small fraction of that. Still, it means that in any group of people there's on average one narcissist for every 100 members, and I have personally met at least one of them in the EA community. They often end up in leading positions, which of course doesn't mean that all leaders are narcissists. But if something goes badly wrong with an altruistic movement, narcissism is one of the possible reasons I'd look into first.

It's difficult to tell from a distance, but for me, it looks like SBF may be a pathological narcissist as well. It would at least explain why he was so "generous" and supported EA so much with the money he embezzled.

Muireall @ 2023-06-21T15:06 (+10)

Social dynamics seem important, but I also think Scott Alexander in "Epistemic Learned Helplessness" put his finger on something important in objecting to the rationalist mission of creating people who would believe something once it had been proven to them. Together with "taking ideas seriously"/decompartmentalization, attempting to follow the rules of rationality itself can be very destabilizing.

Sarah Levin @ 2023-06-23T19:53 (+8)

I think trying to figure out the common thread "explaining datapoints like FTX, Leverage Research, [and] the LaSota crew" won't yield much of worth because those three things aren't especially similar to each other, either in their internal workings or in their external effects. "World-scale financial crime," "cause a nervous breakdown in your employee," and "stab your landlord with a sword" aren't similar to each other and I don't get why you'd expect to find a common cause. "All happy families are alike; each unhappy family is unhappy in its own way."

There's a separate question of why EAs and rationalists tolerate weirdos, which is more fruitful. But an answer there is also gonna have to explain why they welcome controversial figures like Peter Singer or Eliezer Yudkowsky, and why extremely ideological group houses like early Toby Ord's [EDIT: Nope, false] or more recently the Karnofsky/Amodei household exercise such strong intellectual influence in ways that mainstream society wouldn't accept. And frankly if you took away the tolerance for weirdos there wouldn't be much left of either movement.

Toby_Ord @ 2023-06-24T10:32 (+52)

extremely ideological group houses like early Toby Ord's

You haven't got your facts straight. I have never lived in a group house, let alone an extremely ideological one.

Sarah Levin @ 2023-06-24T16:46 (+13)

Huh! Retracted. I'm sorry.

DC @ 2023-06-23T21:59 (+7)

I think trying to figure out the common thread "explaining datapoints like FTX, Leverage Research, [and] the LaSota crew" won't yield much of worth because those three things aren't especially similar to each other, either in their internal workings or in their external effects.

 

There are many disparate factors between different cases. The particulars of each incident really matter, lest people draw the wrong conclusions. However, I think figuring out the common threads, insofar as there are any, is what we need; otherwise we will overindex on particular cases and learn things that don't generalize. I have meditated on what the commonalities could be and think they at least share what one may call "Perverse Maximization", which I intend to apply to deontology (i.e. the sword stabbing) as well as utilitarianism. Maybe there's a better word than "maximize".

I think I discovered a shared commonality between the sort of overconfidence these extremist positions hold and the underconfidence of EAs who are overwhelmed by analysis paralysis: a sort of "inability to terminate a closure-seeking process". In the case of the overconfident, it's "I have found the right thing and I just need to zealously keep applying it over and over". In the case of the underconfident, it's "this choice isn't right or perfect, I need to find something better, no I need something better, I must find the right thing". Both share an intolerance of ambiguity, uncertainty, and shades of gray. The latter more often ends in people quietly suffering under their perfectionism, or being the ones manipulated, than in any kind of outward explosion.

Nathan Young @ 2023-06-21T08:17 (+5)

And the short answer to that is “well, the group standard is what everyone else believes the group standard is”. And this is the exact context in which social miasma dynamics come into play. To any individual in a group, it can easily be the case that they think the group standard seems dumb, but in a situation of risk aversion, the important part is that you do things that look to everyone like the kind of thing that others would think is part of the standard.

 

  1. Combat general social miasma dynamics (e.g. by running surveys or otherwise collapsing a bunch of the weird social uncertainty that makes things insane). Public conversations seem like they should help a bunch, though my sense is that if the conversation ends up being less about the object-level and more about persecuting people (or trying to police what people think) this can make things worse. 

My general sense is that we can do a lot to surface what we as a community actually believe and that will help a lot here. 

E.g. polls like this, where we learn what the median respondent actually thinks: https://viewpoints.xyz/polls/ea-strategy-1/analytics

ChanaMessinger @ 2023-06-22T13:50 (+4)

Yeah, I'm confused about this. Seems like some amount of "collapsing social uncertainty" is very good for healthy community dynamics, and too much (like having a live ranking of where you stand) would be wildly bad. I don't think I currently have a precise way of cutting these things. My current best guess is that the more you push to make the work descriptive, the better, and the more it becomes normative and "shape up!"-oriented, the worse, but it's hard to know exactly what ratio of descriptive:normative you're accomplishing via any given attempt at transparency or common knowledge creation.

Nathan Young @ 2023-06-22T15:38 (+1)

exactly what ratio you're accomplishing.

Sorry what do you mean here? With my poll specifically? The community in general?

ChanaMessinger @ 2023-06-22T15:39 (+2)

Ratio of descriptive: "this is how things are" to normative: "shape up"

Joseph Lemien @ 2023-06-21T14:42 (+2)

I'm nervous about using non-representative samples (such as only polling people who are active on Twitter) to strongly inform important decisions, but in general I am supportive of doing some polling to learn what people tend to think. I'd rather have some rough data on what people want than simply go by "vibes" and by what the people I read/talk to tend to say.

Evan_Gaensbauer @ 2023-06-21T04:42 (+4)

Do you intend to build on this by writing about the roles played by those other dynamics you mentioned but didn't write about in this post?

Jeffrey Kursonis @ 2023-06-22T20:58 (+1)

I have many years of experience in multiple kinds of religious communities, and I can say I've seen this kind of dynamic... this is great thinking.

I remember reading about a group of people early on at Alameda, before FTX started, that had a good number of EAs among them, who stood up against SBF and demanded a better contract; he dismissed them and they all left. One can't help but think that helped create an evaporative concentration of people willing to go along with his crazy.

As an aside, for pondering: the evaporative cooling dynamic is something I experience in cooking almost daily, where you reduce the liquid in order to make a sauce with more concentrated flavors... they always refer to this as reduction, or a reduction sauce. In this case it's a good thing... but knowing this dynamic in cooking helped me understand your argument.