80,000 Hours: Anonymous contributors on EA movement growth
By Aaron Gertler @ 2020-02-18T00:09 (+33)
This is a linkpost to https://80000hours.org/2020/02/anonymous-answers-effective-altruism-community-and-growth/
The following are excerpts from interviews with people whose work we respect and whose answers we offered to publish without attribution. This means that these quotes don't represent the views of 80,000 Hours, and indeed in some cases, individual pieces of advice explicitly contradict our own. Nonetheless, we think it's valuable to showcase the range of views on difficult topics where reasonable people might disagree.
This entry is most likely to be of interest to people who are already aware of or involved with the effective altruism (EA) community.
What's your current view on effective altruism (EA) movement growth? Should it grow faster, or slower? Should it be broader, or narrower?
Effective altruism should be broad
We should be aiming for 100% of everybody. Everyone who is a philanthropist should know what effective altruism is, but almost no philanthropists I talk to outside of EA currently know what it is. They've never heard of 80,000 Hours.
I worry about EAs becoming uninterested in broadening the movement. I think it's a mistake that EA Global was bigger four years ago than it is today. I think the effective altruism community needs to be more than a few thousand talented, passionate people in order to achieve its goals.
It should stay narrow
I think effective altruism should stay pretty narrow. It's very hard to manage a very large movement, and EA is still partly figuring out what it is. So I think we should be aiming for something like the 90th percentile of altruistic people and the 99th percentile of high-performing contributors to their fields or areas of study - that's quite a narrow segment of society.
I think it would be good if we were growing a bit faster than we are. I feel like the optimum is steady, exponential growth - where steady means not so slow that people start feeling bored or unexcited. And I worry that we're currently at the low end of the good range.
I definitely don't think we should be going for an all-out broad, environmentalist-esque mass movement.
It should focus on having a great culture first
Rather than growth, the thing I'd most want to alter is effective altruist culture. I want anyone in EA to think "wow, this is such a great community to be part of - I think this is great!", rather than feeling really ambivalent, or even stressed, or finding themselves often annoyed at other people.
I think there's stuff that could change that would make EA feel more like that first description. One way would be cultivating an attitude of celebration and welcomingness. I've heard people talking about how earning to give was "denigrated" in EA, that no one should be getting career capital, things like that. And I think that's evidence both of how ideas about what people ought to do spread in EA, and of how sensitive people are to them. So it gets very exaggerated.
Whereas imagine if we could create a community atmosphere which is like: "oh, you're a school teacher who donates 10% of your income to AMF? We love you! Thank you for being in this world and doing so much good - I'm honoured to be in a community with you".
I feel like that would be a healthier, less stressful community, where people made better decisions. Compare that to the current situation, where people can feel that unless they're working for one of a handful of organisations focused on short AI timelines, they're basically worth nothing.
We do select for people who are very scrupulous and anxious, but maybe that just means we have to work much harder to counter those impressions. I mean, EA is just like "what's the thing you can do that's the optimal, most important thing?!" - it's kind of a stressful idea. Perhaps we could do more to counteract that.
I think the ideal EA community would be a place where everyone really promising who finds it, likes it. I don't think EA is that right now. That's also a really high bar to hit; it's hard. But that's what I care about more than how fast EA is growing.
I think the bad aspects of EA might be putting people off these ideas by making them seem kind of cold, nerdy, and dismissive. So broad vs. narrow isn't how I'd think about this. I'd want to focus more on changing the tone of EA, putting out ideas in a way that seemed more helpful - and then I'd be much more likely to think it was good if it grew faster.
I want us to have somewhat better infrastructure for making people who are joining feel slightly held, rather than lost.
I think the issue of "it's hard to get jobs in EA" is a sore point, and I worry it could be exacerbated further if growth continues without improving that situation. But if we have better things there, then growth seems great.
It should take a higher variance approach to recruitment
I'd be doing more to recruit people with more diverse mindsets.
I think I might take a higher variance approach. There are all these EAs who say "as soon as I heard about EA, I was instantly an EA". So that's great - you can get those people quickly and easily. You just need to make sure they've heard the definition of effective altruism, and had the chance to get a copy of Famine, Affluence, and Morality.
And then there's another group of people who are sympathetic to EA principles, but aren't as naturally drawn in. And they might have talents that would be really useful to the community.
So I might have a two-pronged strategy where I (i) focus on the easy cases, and (ii) make sure we pick up the people who aren't as naturally drawn, but who have skills we really need.
I think you could have an approach that was less focused on medium-level engagement.
If you have one EA with a skill that's really in demand in the community, get that person to work on recruiting more people like them. For people who aren't immediately drawn to EA, it's really useful to get the message from someone who is similar to them.
I don't think natural EA enthusiasts - young, nerdy, philosopher types - are very good at convincing people who aren't like that.
You could eventually have a system where one person convinces someone who's a little bit farther away, and then that person convinces someone else who's a little farther away still - and emphasise to everyone along the way how valuable a role they can play in movement building and recruitment.
We should be wary of committing to one direction
I think we've got a problem here. The right thing to do given short AI timelines is not the right thing to do given longer AI timelines.
If AI timelines are short, then we probably shouldn't be focused on becoming a mass movement. There are only so many things you can do at once, and some aspects of being a mass movement are in tension with acting on short timelines. In particular, if almost every EA leader is working extremely hard on a short AI timeline scenario, then most public outreach is going to end up being quite deceptive. It's probably not going to say "hey, this is a movement of people who think the world as we know it won't exist in 10 years".
If AI timelines are not short, EA should be focusing on becoming a mass movement. I think movements that don't put a lot of energy into successfully handing down ideas to younger people and transferring expertise end up ceasing to exist.
But because we're unsure, we end up being ambivalent on this question. And I think that's bad in some ways, although it might be better than committing to one direction or the other.
I'm really happy about the switch that happened from expanding as quickly as possible to expanding carefully in high-fidelity ways. I now think it maybe needs to tweak slightly back in the other direction, but I'm not sure.
We should consider people who arenât ready for a big commitment
I know some people have said, "all movements have a spectrum of involvement, and that's going to happen with EA". But it is possible to say "well, there are certain criteria that qualify you as an EA". And then if you haven't met those - if you haven't quite donated 10% of your income or free time effectively, or moved to an effective career, etc. - you could say you're an aspiring EA.
I know some people think "effective altruism" as a term is too presumptuous, and that everyone should call themselves "aspiring EAs". I personally think that's too modest, because there are a lot of people who are doing really good work.
But the term might work for someone who says "I'm on board with the ideas, but I'm only donating 1%. I haven't quite made the shift". What I say to them is: there's this whole field called diffusion of innovations, and what it has found is that when people change their minds, they may not change their behaviour for a year or more. That's normal. And we shouldn't be frustrated with people who haven't made the bigger shift yet.
This distinction would allow you to pull people in who might not be ready for a big commitment, without diluting the active EA community.
We should reach out to influential people
I think EAs should focus more on reaching out to people who already have influential positions.
We should want more exposure for both good and bad ideas
A big part of me intuitively thinks that, insofar as these ideas are correct, I want more people to encounter them. And even if they're wrong, I want more people to encounter them, because that's a good way of getting rid of bad ideas. I think it's generally good to shine light on ideas. So that's an argument for getting more exposure for the ideas, and then letting that affect growth one way or the other.