What high-level change would you make to EA strategy?

By Nathan Young @ 2021-11-04T16:31 (+39)

Imagine you are a powerful person within EA. Maybe a large donor or someone with great connections. What high-level change would you make to EA strategy?


Nathan Young @ 2021-11-04T16:35 (+40)

I would do research on whether EA should be a narrow movement of people making significant impact, or a much broader one of shallower impact. This feels like it has been talked about for years, but I've not seen actual research, so it seems we are going to sleepwalk into making no decision here.

Greg_Colbourn @ 2021-11-07T13:51 (+22)

I've sometimes wondered whether it would be good for there to be a distinct brand and movement for less hardcore EA: one that is less concerned with prestige, less elitist, more relaxed, and with more mainstream appeal. Perhaps it could be thought of as the Championship to EA's Premier League. I think there are already examples, e.g. Probably Good (an alternative to 80,000 Hours), TLYCS and OFTW (alternatives to GWWC), and the different tiers of EA investing groups (rough-and-ready vs. careful-and-considered). Places where you feel comfortable spending only 5 minutes editing a post, rather than agonising over it for hours; where you feel less pressure to compete with the best in the world; where you are less prone to analysis paralysis, or to the perfect being the enemy of the good; where there is less stress, burnout and alienation; and where, ultimately, the area under the impact curve could be comparable, or even bigger. Perhaps one of the names mentioned here could be used.

[Note: I expect pushback on this, and considered posting anonymously, but I'm posting in the spirit of the potential broader movement. Apologies if I've offended anyone by insinuating they are "only" Championship material. That was not my intention: the Championship is still a very high standard in absolute terms!]

Greg_Colbourn @ 2021-11-07T14:51 (+2)

Some related discussion here.

James Ozden @ 2021-11-05T09:41 (+14)

I'm planning on doing research not far off from this!

Specifically: researching previous and current social movements (climate, anti-nuclear, civil rights, etc.) and trying to understand how effective mass social movement organisations are or could be. My research will focus more on the use of protest within movements, but I can imagine there will be some overlap. Some things I'm hoping to find out:

  • Does increasing public attention lead to increased public support?
  • Does increasing public support lead to positive policy change?
  • When should a movement go broad vs. stay more targeted?
  • What roles do different actors (activists, policymakers, NGOs) play at different times?
  • What internal and external factors might dictate movement success?
  • What is the base rate for a successful mass movement?

I'm going to post some research I've been doing on Extinction Rebellion and the climate movement in the next couple of weeks, so stay tuned!

MichaelA @ 2021-11-07T13:10 (+6)

I agree that questions in this vicinity seem worth someone doing more work on. 

Some relevant prior discussion can be found in the Bibliography, tagged posts, and "Discussion" page here: https://forum.effectivealtruism.org/tag/value-of-movement-growth 

casebash @ 2021-11-05T13:58 (+3)

This might actually vary by cause area. If it does, I wonder whether different causes could adopt different stances, or whether that's just not feasible.

Stefan_Schubert @ 2021-11-04T17:29 (+35)

I would consider buying more companies, think tanks, media outlets, etc., depending on opportunities.

casebash @ 2021-11-05T13:58 (+5)

You can buy think tanks?

Simon_Grimm @ 2021-11-06T14:32 (+4)

Is there a historical precedent for social movements buying media? If so, it'd be interesting to know how that influenced the outlet's public perception/readership.

As of now, it seems like movements "merely" influence media, such as the NYTimes turning more leftward in the last few years or Vox employing more EA-oriented journalists.

Nathan Young @ 2021-11-05T11:11 (+4)

Stefan, can you use the link in the top right-hand corner to turn this into an answer (not a comment), so that people are more likely to see it?

Stefan_Schubert @ 2021-11-05T11:48 (+2)

Alright, done.

Nathan Young @ 2021-11-04T18:02 (+3)

Funding existing think tanks to write more EA articles certainly seems cheaper than I expected. I reckon you could get significant shifts at small think tanks for something like $50k.

Nathan Young @ 2021-11-04T16:34 (+22)

I would have a set of polls on EA strategy which refresh every month on this forum.

"What are the pain points in your EA experience?"
"What is the biggest underrated cause area?"
etc.

Nathan Young @ 2021-11-04T16:38 (+16)

I would test new governance structures in upcoming EA orgs. 

I am concerned that EA could become slow-moving over time, in the same way most large movements do, if there isn't experimentation in governance.

In some ways, it seems inappropriate to call institutional decision-making a cause area when I don't see innovation within EA institutions themselves.

lukasberglund @ 2021-11-09T23:20 (+1)

Is there evidence or a theoretical reason to believe that not experimenting with governance leads a movement to become slow over time?

toonalfrink @ 2021-11-05T22:10 (+14)

I would train more grantmakers. Not because they're necessarily overburdened but because, if they had more resources per applicant, they could double as mentors.  

I suspect there is a significant set of funding applicants that don't meet the bar but would if they received regular high-quality feedback from a grantmaker.

(like myself in 2019)

MarcSerna @ 2021-11-04T23:03 (+12)
  1. Collectivize operations across EA organizations, e.g. have all hiring or compliance handled by a shared operations team that supports many orgs.

  2. Found two replicas of Open Philanthropy, with teams holding slightly different opinions on key issues.

  3. Found an EA humanitarian agency applying reason to respond to the world's worst humanitarian crises. We could call it Emergency Aid.

Nathan Young @ 2021-11-05T11:09 (+7)
  1. I don't know that I'd want complete centralisation, but I'd like to see an org that lets EAs apply to many organisations through a single application process.

MarcSerna @ 2021-11-05T13:36 (+1)

Yes, that is the idea!

Great post idea; maybe you can float it again every year.

casebash @ 2021-11-05T14:00 (+6)

Great ideas. I definitely think that EA could gain a lot of credibility by successfully responding to major crises.

Nathan Young @ 2021-11-05T11:10 (+4)

2. This seems like a really good idea that would avoid bad outcomes.

Yonatan Cale @ 2021-11-08T19:01 (+11)

People are too afraid of stepping on the toes of the central organizations, afraid of moving fast and breaking something, and, as far as I understand, even these central organizations don't want this situation.

This (and similar things) makes new projects go really slowly, with lots of reviews for every step.

I think we're not calibrated well on [moving carefully] vs [moving fast].

("meta": I am going to post this drafty unconcise comment as is!)

sberens @ 2021-11-04T21:26 (+7)

This is not a massive-scale change, but in my opinion it is something that can nonetheless have a large effect.

I am currently a co-president of an EA club, and from my experience it seems every EA club has to start from scratch and figure most things out for itself.

It would be very useful if there were some centralized repository of best practices for EA clubs that you could follow for optimal engagement/effect. Some specific areas I'm thinking about are meeting topics, organizational structure, EA speakers, etc.

One great initiative is the virtual EA programs that we can direct students to for a vetted curriculum.

This centralization would also be useful for other clubs that are trying to popularize some topic.

Aaron Gertler @ 2021-11-04T22:51 (+14)

Have you had any support calls with CEA's groups team and/or referenced the resources on the EA Hub? They have a large repository of best practices, a big speaker database, etc.

Clubs mostly had to start from scratch ~5 years ago, but these days, there's so much more collective experience that you should be able to find good advice on just about anything.

markus_over @ 2021-11-08T15:28 (+3)

One thing I could imagine being very helpful is some kind of ongoing local group "mentoring". So instead of one or two single calls on strategy or bottlenecks, have some experienced person more deeply invested in a particular local group in need: somebody who might (occasionally) participate in our virtual meetups and our planning/strategy calls, gets to know our core members, our situation, needs and problems, and can provide actionable insights on all of them.

The problem with the calls I've had in the past is that it's quite difficult to get across everything relevant, so we might just focus on one or two issues and come away with some pointers to other people or resources who might be helpful, plus some relatively generic advice. Not to say that isn't useful, but it also doesn't seem like a complete solution. I've also read most of the EA Hub resources on running a group, but I tend to come out of these articles thinking "yup, this makes sense" and not actually turning it into anything concrete. Which, again, is probably entirely my responsibility. But I could imagine I'm not the only time- and energy-constrained local group coordinator struggling to properly utilize the existing resources.

On the other hand, such more involved support over a longer time also comes with significantly higher costs, and I can't tell whether it would be worth it.

MaxRa @ 2021-11-06T10:29 (+4)

I don't have much of an opinion yet, but I heard some remarks at EAG that might be interesting to consider here:

Simon_Grimm @ 2021-11-06T14:53 (+4)

Caveat: I work in biosecurity.

I agree with the last point. Based on Ben Todd's presentation at EAG,

  • 18% of engaged EAs work on AI alignment, while
  • 4% work on biosecurity.

Based on Toby Ord's estimates in The Precipice, the risk of extinction in the next 100 years from

  • unaligned artificial intelligence is ∼1 in 10, while
  • the risk from engineered pandemics is ∼1 in 30.

So the stock of people in AI is 4.5x that in biosecurity, while AI is only 3x as important.
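To make the arithmetic behind that comparison explicit (this is just a restatement of the figures above, written out as ratios):

```latex
\[ \frac{18\%}{4\%} = 4.5 \quad \text{(people working on AI alignment vs. biosecurity)} \]
\[ \frac{1/10}{1/30} = \frac{30}{10} = 3 \quad \text{(extinction risk from unaligned AI vs. engineered pandemics)} \]
```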

There is a lot of nuance missing here, but I'm moderately confident that this imbalance warrants more people moving into biosecurity, especially now that we're in a moment of high tractability concerning pandemic preparedness.

Yonatan Cale @ 2021-11-08T21:41 (+1)

We have a large and growing amount of documentation about how to measure impact, but we have too few potentially impactful organizations (outside the main EA geographic hubs; specifically, I'm thinking about Israel, where I live, which has hardly any EA opportunities).

I'd move some people towards searching for more of these organizations instead of writing more documentation.

Most people who are looking for a job would prefer concrete suggestions (a job board) rather than documentation on how to conduct a complicated search. It's also more efficient for the community as a whole.

Yonatan Cale @ 2021-11-08T18:32 (+1)

I'd invest more in rationality (LessWrong / CFAR / ...).