Has EA given up on politics?

By jacquesthibs @ 2026-01-18T18:16 (+26)

Despite politics as a whole not being neglected, I’m surprised that there seems to be absolutely no effort from EA to improve the world’s geopolitical situation or to address the culture war.

To my surprise, it was a prominent LessWrong mod who made a post about the topic and considered it a strong priority for 2026.

Is it because EAs feel helpless in addressing this problem? Do they think it’s simply not neglected enough to be worth working on? Are they avoiding it in order to survive politically with respect to AI? Do they consider it a problem at all?


Benton 🔸 @ 2026-01-19T00:06 (+13)

When I first joined the forum early last year, I was also surprised that politics seemed neglected in EA circles. Though I still think the current geopolitical situation is an incredibly important issue (perhaps the most important, given how many other issues it affects), I unfortunately don’t think it’s very tractable. Maybe I’m missing something, but I really don’t see anything a niche community can do to improve the complex situation that is the current political climate. I imagine most EAs think the same.

Ebenezer Dukakis @ 2026-01-19T06:32 (+2)

I expect that very novel approaches, like the one described in my old post Using game theory to elect a centrist in the 2024 US Presidential Election, could be more tractable.

titotal @ 2026-01-19T11:12 (+17)

I think the problem here is that novel approaches are substantially more likely to fail, precisely because they’re untested and unproven. That isn’t a big deal in areas where you can try lots of things and sift through the results, but in something like an election you only get feedback roughly once a year. Worse, the feedback is extremely murky, so you don’t know whether it was your intervention or something else that produced the outcome you care about.

David T @ 2026-01-19T23:14 (+6)

Also, failures at really outlandish things, like bribing Congresspeople to endorse Jim Mattis as a centrist candidate in the 2024 US Presidential Election, are likely to backfire far more spectacularly than (say) providing malaria nets to a region with falling malaria, or losing a court case against a factory farming conglomerate. That said, this criticism does apply to some other things EAs are interested in, particularly actions purportedly addressing x-risks.

Ebenezer Dukakis @ 2026-01-20T14:48 (+2)

If each election is a rare and special opportunity to collect a bit of data, that makes it even more important to use that data-collection opportunity effectively.

Since we are looking for approaches which are unusually tractable, if effectiveness looks extremely murky, that's probably not what we wanted.

Joseph_Chu @ 2026-01-18T18:48 (+12)

We tried earlier. Carrick Flynn received substantial support from EA, and the result was mediocre; criticisms of EA actually hurt his campaign, as people pointed out his connection to the “billionaires and techbros” who apparently fund EA.

Also, the head of RAND, Jason Matheny, is an EA, and there are some connections between EA and the American NatSec establishment. CSET, for instance, was funded partly by OpenPhil. There is a tendency among a lot of EAs to try not to be partisan and to mostly support effective governance and policy work instead.

That being said, Dustin Moskovitz, the billionaire who is the main donor behind what was previously called Open Philanthropy and is now Coefficient Giving, has donated significantly and repeatedly to Democrats. OpenPhil has historically been by far the largest funder of EA stuff, particularly since SBF fell from grace, so Dustin's contributions can be seen tacitly as EA support for the Dems.

So, I don't think it's accurate to say EAs have made absolutely no effort on this front. We have, and it has badly backfired before, leaving us in a very awkward political position: the whole TESCREAL controversy has tarnished the EA brand on the Left, even though past surveys have shown that most rank-and-file EAs are centre-left to left. It's a frustrating situation.

jacquesthibs @ 2026-01-18T18:55 (+4)

So, I don't think it's accurate to say EAs have made absolutely no effort on this front.

Thanks for the comment. I’m aware of the situations you mentioned and did not say that EA had not previously put effort into things. In fact, my question is essentially “Has EA given up on politics (perhaps because things went poorly before)?”

Also, note that I am not exactly suggesting pushing for left-wing causes. Generally remedying the situation may require going beyond trying to get one person into elected office. In fact, I think such a bet would be unambitious and would fail to meet the moment.

Jackson Wagner @ 2026-01-19T08:58 (+9)

There is a very substantial "abundance" movement that (per folks like Matt Yglesias and Ezra Klein) is seeking to create a reformed, more pro-growth, technocratic, high-state-capacity Democratic Party that's also more moderate and more capable of winning US elections.  Coefficient Giving has a big $120 million fund devoted to various abundance-related causes, including zoning reform for accelerating housing construction, a variety of things related to building more clean energy infrastructure, targeted deregulations aimed at accelerating scientific / biomedical progress, etc. https://coefficientgiving.org/research/announcing-our-new-120m-abundance-and-growth-fund/

You can get more of a sense of what the abundance movement is going for by reading The Argument, an online magazine recently funded by Coefficient Giving and featuring Kelsey Piper, a widely respected EA-aligned journalist: https://www.theargumentmag.com/

I think EA the social movement (i.e., people on the Forum, etc.) tries to keep EA somewhat non-political, to avoid being dragged into the morass of everything becoming heated political discourse all the time.  But EA the funding ecosystem is significantly more political, and also does a lot of specific lobbying in connection with AI governance, animal welfare, international aid, etc.

OllieBase @ 2026-01-20T12:20 (+4)

I think most answers here are missing what seems the most likely explanation to me: the people who are motivated by EA principles to engage with politics are not public about their motivations or affiliations with EA. Not just because the EA brand is disliked by some political groups, but because it seems generally wise to avoid having strong ideological identities in politics beyond motivations like "do better for my constituents".

Ozzie Gooen @ 2026-01-20T21:34 (+2)

Quick things:
1. There are some neat actions happening, but often they are behind-the-scenes. Politics tends to be secretive. 
2. The work I know about mostly focuses on AI safety and biosafety. There's also some related work trying to limit authoritarianism in the US. 
3. The funding landscape seems more challenging/complex than with other things. 

I think I'd like to see more work on a wider scope of interventions to do good via politics. But I also appreciate that there are important limitations/challenges here now. 

huw @ 2026-01-19T04:01 (+2)

I think maybe a little bit of nuance is lost when just saying ‘electoral politics isn’t neglected and might be quite hard’—that’s not the EA response to large global health issues, or existential risks. It’s just that once you get down to brass tacks, most Western political systems are pretty easy to buy your way into, and it’s substantially cheaper to effect meaningful piecemeal change by paying for lobbyists.

You only need electoral politics when trying to undertake massive political/ideological shifts (see: the Kochs/Mercers shifting the U.S. toward a sort of anarcho-capitalism), and fundamentally, most EAs are on the centre-left and don’t see these kinds of changes as desirable.

(You can see this in the LessWrong post you linked, most of the post and replies are proposing exactly what Kamala Harris did in 2024 and lost doing)

((Vastly oversimplifying but I hope it provides some nuance that the other answers are missing))

Dylan Richardson @ 2026-01-20T02:31 (+1)

I accept that political donations and activism are among the best ways to do good as an individual. 

But it is less obvious that EA, as an academic discipline and social movement, has analytical frameworks suited to politics - we have progress studies and the abundance movement for that.

It is of course necessary for political donations to be analysed as trade-offs against donations to other cause areas. And there's a lot of research that needs doing on the effectiveness of campaign donations and protest movements in achieving expected outcomes. And certain cause areas definitely have issue-specific reasons to do political work.

But I wouldn't want to see an "EA Funds for Democrats" or an "EAs Against Trump" campaign.

Vaipan @ 2026-01-19T14:00 (+1)

It is interesting how many EAs think of EA as an 'apolitical' movement, i.e. that EA is beyond left and right because it's data-driven rather than ideology-driven. 

That does not make sense to me. Personally, I'm an opportunist. When the Tories create the AISI, it's politics. When the left endorses campaigns promoting animal welfare and plant-based options, it's politics. When Coefficient Giving works on land reforms, it's politics. 

I like to think in terms of cause areas and which party is best placed to push for progress on each, which means I'm ready to collaborate with anyone who advocates for sensible things.