What Makes Outreach to Progressives Hard

By Cullen 🔸 @ 2021-03-14T00:32 (+301)

This post summarizes some of my conclusions on things that can make EA outreach to progressives hard, as well as some tentative recommendations on techniques for making such outreach easier.

To be clear, this post does not argue or assume that outreach to progressives is harder than outreach to adherents of other political ideologies.[1] Rather, the point of this post is to highlight identifiable, recurring memes/thought patterns that cause progressives to reject or remain skeptical of EA.

My Background (Or, Why I am Qualified to Talk About This)

Nothing in here is based on systematic empirical analysis. It should therefore be treated as highly uncertain. My analysis here draws on two sources:

  1. Reflecting on my personal journey as someone who transitioned from a very social-justice-y worldview to a more EA-aligned one (and therefore understands the former well), who is still solidly left-of-center, and who still retains contacts in the social justice (SJ) world; and
  2. My largely failed attempts as former head of Harvard Law School Effective Altruism to get progressive law students to make very modest giving commitments to GiveWell charities.

Given that the above all took place in America, this post is most relevant to American political dynamics (especially at elite universities), and may very well be inapplicable elsewhere.[2]

Readers may worry that I am being a bit uncharitable here. However, I am not trying to present the best progressive objections to EA (so as to discover the truth), but rather the most common ones (so as to persuade people better). In other words, this post is about marketing and communications, not intellectual criticisms. Since I think many of the common progressive objections to EA are bad, I will attempt to explain them in (what I take to be) their modal or undifferentiated form, not steelman them.

Relatedly, when I say "progressives" throughout the rest of this post, I am mainly referring to the type of progressive who is skeptical of EA, not all progressives. There are many amazing progressive EAs who do not see the two ideologies as being in conflict at all. And many non-EA progressives will believe few of these things. Nevertheless, I do think I am pointing to a real set of memes that are common—but definitely not universal—among the American progressive left as of 2021. This is sufficient for understanding the messaging challenges facing EAs within progressive institutions.

Reasons Progressives May Not Like EA

Legacy of Paternalistic International Aid

Many progressives have a strong prior against international aid, especially private international aid. Progressives are steeped in—and react to—stories of paternalistic international aid,[3] much in the way that EAs are steeped in stories of ineffective aid (e.g., Playpumps).

Interestingly, EAs and progressives will often (in fact, almost always) agree on what types of aid are objectionable. However, we tend to take very different lessons away from this.

EAs will generally take away the lesson that we have to be super careful about which interventions to fund, because funding the wrong intervention can be ineffective or actively harmful. We put the interests of our intended beneficiaries first by demanding that charities demonstrably advance their beneficiaries' interests as cost-effectively as possible.

Progressives tend to take a very different lesson from this. They tend to see this legacy as objectionable due to the very nature of the relationship between aid donors and recipients. Roughly, they may believe that the power differential between wealthy donors from the Global North and aid recipients in developing countries makes unobjectionable foreign aid either impossible or, at the very least, extremely difficult. They may therefore prefer aid frameworks in which parties approach each other more as equals[4] or in which there is high-context transfer of feedback from recipients to donors. Of course, these heuristics will tend to privilege interventions within existing communities, and be harder to deploy internationally—hence progressives' skepticism of foreign aid. The fact that this in effect cuts the world's poorest people off from aid entirely counts for very little in the progressive worldview, probably as a result of the act-omission distinction: the bad to be avoided is paternalistic international aid, and simply abstaining from international aid is an easy way to do that.

The Oppression Worldview

Modern progressivism focuses a lot on oppression, which may be defined (from their perspective) as social systems under which some groups receive preferential treatment or disparate rewards relative to other, equally worthy groups.

For reasons that elude my comprehension, many progressives do not seem to conceptualize the current assortment of economic and legal policies that cause some countries to be ~100x richer than others as a relevant form of oppression. If they do, they are unlikely to give it as high a priority as, e.g., within-country racial disparities or within-country economic inequality.

A full analysis of why, exactly, global poverty is often not treated as a leading form of injustice by many progressives (as evidenced by the comparatively few progressive resources that go towards it) seems very valuable, and I cannot yet provide it. But I do feel confident in saying that to many progressives, global poverty is apparently a non-central example of oppression, or a lower-priority one.[5]

Econ Aversion

Many progressives are skeptical of the tools of modern economics, believing them (inaccurately, in my view) to play a central role in legitimating domestic income inequality and other maladies. This is probably due to domestic political tendencies for the right to emphasize the value of markets and economic growth more than liberals (who tend to focus more on economic equality). Thus, they may tend to have a negative reaction to EAs relying on economic concepts and tools, including things like cost-benefit analyses, marginal thinking, and QALYs. They may also distrust interventions that leverage market forces or promote economic growth as such. They may tend to believe, despite evidence from economic history, that extreme poverty is solely the result of past injustices, which may have implications for how we ought to understand our moral obligations to the global poor. They are also very hesitant to accept that global poverty is much worse than domestic poverty in extent and severity, which leads to a larger focus on the latter.

Diversity, Equity, and Inclusion (DEI) Issues

Progressives see shared identity as very important to understanding and advocating for the interests of a group. If an organization claims to advocate for some group X but lacks any members of X in its leadership, this will make progressives very suspicious. Specifically, when EAs purport to advocate for members of the global poor, but our leadership lacks people from the world's poorest countries, progressives are immediately skeptical that we actually have their best interests in mind or can effectively advocate for them. This and the Legacy of Paternalistic International Aid (see above) reinforce each other.

Incompatibility Between Intersectionality and Prioritization

Intersectionality is one of the dominant frameworks on the progressive left for understanding and advocating for social change. The academic and popular uses of intersectionality differ, but the slogan "[t]here is no such thing as a single-issue struggle, because we do not live single-issue lives"[6] captures much of how this is currently understood and used in progressive spaces.

Intersectionality thus implies a strong anti-prioritization framework—or at least a hesitancy to engage in prioritization. Intersectionality implies that narrow prioritization (e.g., an AIDS charity prioritizing education and condom distribution over ART) is pro tanto objectionable insofar as it fails to consider, and allocate equal resources to, the differing needs of all members of a population.

Systemic Change and a Preference for State Action

Seasoned EAs will no doubt be aware that our critics on the left are some of the biggest proponents of the systemic change objection to EA. Progressives seem more likely to believe that major problems can or should only be solved through dramatic restructuring of society, in ways that EAs may be skeptical of by default for a variety of reasons. And when both agree on the need for some form of systemic change, they may often disagree on what that should look like.

Ordinal Speciesism

Some people on the progressive left (especially in the US, it seems) are averse to animal advocacy due to what I will call "ordinal speciesism": the belief that prioritizing animal welfare over human welfare is objectionable. Consider the following quotes from this article (which I just selected arbitrarily because it seemed pretty representative of views I see):

White vegans’ priority is the top layer of veganism–animal exploitation, but they ignore the socio-economic impact that comes from the movement becoming more popularized. Some white vegans even go as far to compare historical genocides that have affected BIPOC to the workings of the meat and dairy industries. . . . Veganism can only be about the liberation of animals when it also stops the oppression of people.

The idea that the two can be meaningfully analyzed separately, and that it may be appropriate to prioritize animal welfare over human welfare, is anathema to this worldview, apparently.

Population Ethics

EAs tend to reject person-affecting views of population ethics. This, however, has uncomfortable implications for some hot-button issues on the left, like reproductive rights and environmental ethics.

Guesses at How To Improve Messaging to Progressives

I am not a messaging expert, and have not had any overwhelming success at getting progressives more interested in EA. With that said, here are some of my guesses at what a more progressive-friendly approach to EA messaging could look like. Of course, this does not consider important tradeoffs, such as the potential for alienating other audiences. This will therefore be most useful to people whose primary audience is progressives.

Intervene Early

I would consider progressive-friendly messaging from the outset of any public-facing communications, not as a band-aid to be deployed in response to criticisms from the left. First impressions are really important, and so starting messaging with things that are, at the very least, not off-putting to progressives should help advance conversation without as much negative reaction.

Develop and Highlight Community Feedback Mechanisms

Because EAs are a fairly welfarist and quantitative bunch, internal EA discussion of charity evaluation focuses a lot on cost-benefit analyses and much less on the qualitative factors that either inform or complement such analyses when making final recommendations. I don't think this is substantively wrong, but I do think it can give the impression that EAs care a lot about quantified spreadsheet inputs and not about human factors like recipients' assessments of charities. The latter should not be treated simply as a "nice to have": if CEAs and users' assessments of a program differ dramatically, we can suspect that something has gone wrong, and end-users/recipients can be extremely valuable sources of feedback and suggestions for improvement.

I am not an expert on GiveWell's evaluation process, and am aware that they do do some of this already, but I still think EA as a community could benefit from maybe roughly doubling(?) our cultural attention to the existence and performance of community feedback mechanisms for human-facing charities. This has been a philosophical commitment since the early days of EA, yet information on how we (or the charities we prioritize) actually confirm with recipients that our programs are having the predicted positive impact on them receives, AFAICT, little attention in EA.[7] It may also be that many top charities simply don't have good user feedback mechanisms because donors don't demand them, in which case we should probably encourage more charities to develop them anyway. Mechanisms like accessible feedback hotlines and recipient ombuds may be worth exploring further.

A Digression on GiveDirectly

GiveDirectly is often highlighted as a standout charity on this point, for good reason: features like GDLive and their customer support centers (and, of course, their general model) make clear that they care deeply about trusting and receiving honest feedback from end-users. But to the extent that EAs point to GD when this objection is raised without caring about whether other GiveWell charities (which generally receive more funding) have similar mechanisms in place, it feels like a bit of a motte-and-bailey.

Use the Right Words/Framings

Many EA actions can be accurately framed in ways that are more palatable to a progressive worldview. I often remember this quote from a Yale EA as an example:

For me, taking the Giving What We Can pledge was an expression of my commitment to using my class privilege to contribute to a movement towards a more equitable world for current and future generations.

Note how this isn't framed in terms of maximizing QALYs/dollar or generalized impact, but rather as "using class privilege" to achieve "a more equitable world." Not only is this still quite faithful to EA principles, but it's also much more palatable to a progressive audience. Similarly, EAs could consider framing global health/development work in terms of "global health justice," "global income inequality," or "global healthcare access," while also highlighting the tools we use to prioritize between interventions in those cause areas.

Improve DEI

While I think it's very easy to focus too much on DEI efforts at the expense of impact, I also think that improving DEI in leadership at global health charities—and especially inclusion of people from the recipient countries—can send a good signal about the relationship between the charity and the populations it intends to serve. Such leaders can probably also provide valuable perspective about the communities in which the charity is operating. At the very least, I think the current lack of such representation poses a huge communications liability for a lot of these charities among Western progressive audiences.[8]

Bring Policy In Earlier

A common way to communicate about EA is to first talk about "finding the most cost-effective charities" or something similar, and only later explain the true scope of our ambitions (including policy goals). This mirrors EA's internal evolution from global health prioritization to the inclusion of animals and, ultimately, future generations. Policy interventions came pretty late in this evolution.

But as policy becomes an ever-larger part of the EA portfolio, this message makes less and less sense, and reinforces the perception of EA as averse to enacting systemic change. EA should figure out catchy messages about the types of policy work we support, as we have done for our charitable work.

Build Alliances

There are a lot of topics on which EA has shared interests with typical progressive causes, like environmentalism, climate change, tax justice, welfare spending, immigrants' rights, incarceration reform, and pacifism. Where possible, EA groups should consider showing up for, promoting, and helping to organize events around these shared interests. This should enhance our credibility in those spaces.

Things We Shouldn't Do: Reduce Intellectual Rigor

I think there are serious problems with a lack of intellectual rigor and openness in many progressive spaces today. Although I am quite liberal myself, this is one reason I prefer EA spaces to typical progressive ones. I think intellectual rigor remains vitally important to the project of EA, and nothing in this post should be taken to suggest that we should reduce our emphasis on it.


  1. Indeed, EAs tend to be more progressive/left-of-center than the general population. See this post. ↩︎

  2. Inshallah. ↩︎

  3. Christian missionary work is often an archetypal example of this. The ABC approach to AIDS prevention may be another. ↩︎

  4. The rise of "mutual aid" as a framework for aid in leftist circles is an example of this. ↩︎

  5. As measured by revealed preferences in the form of comparative resource allocation. ↩︎

  6. Audre Lorde, Learning from the 60s, in Sister Outsider: Essays & Speeches by Audre Lorde 138 (2007). ↩︎

  7. As an example, after ten minutes of searching I could not find information on GiveWell's overall view on this subject on their website. ↩︎

  8. E.g., as far as I can tell, there's not a single person from sub-Saharan Africa on AMF's current staff, trustees, or Malaria Advisory Group. I think this is a pretty big optics liability for them among progressive audiences, independent of its substantive importance. ↩︎


MaxRa @ 2021-03-14T09:02 (+51)

Thanks for writing this, I think this topic is worthy of more discussion.

Of course, this does not consider important tradeoffs, such as the potential for alienating other audiences. This will therefore be most useful to people whose primary audience is progressives.

I wonder how much we should even recommend leaning into the progressive/social justice framing when the audience primarily comes from this ideological bent.

If I’d read this testimonial on the local EA website, there’d be a solid chance I’d have been significantly less interested because it doesn’t connect to my altruistic motivations and (in my head) strongly signals a political ideology.

For me, taking the Giving What We Can pledge was an expression of my commitment to using my class privilege to contribute to a movement towards a more equitable world for current and future generations.

I think some points you mention, like highlighting more that aid recipients’ feedback is strongly taken into account, don’t risk turning off non-social-justice people while still connecting to their motivations and worries, so maybe I’d wish to see more of that kind of messaging.

Julia_Wise @ 2021-03-15T17:30 (+38)

I think "The Privilege of Earning to Give" by Jeff Kaufman (who I'm married to) helped bridge a gap between us and our non-EA friends, who tend to have much more standard leftist views than we do.

Max_Daniel @ 2021-03-17T11:20 (+43)

[I was sympathetic to common progressive/left-wing/social justice views before encountering EA. I'm from Germany, so my experience might not apply as much to the US.]

I'm wondering if another reason why some progressives may not like EA is a much more cynical prior about the intentions of powerful people and institutions, plus an unwillingness to update away from it or inability to identify evidence that would allow for such updates. 

E.g. it strikes me that before I encountered EA, the only contexts in which I had ever heard about the Gates Foundation were ones where it was at least implied that obviously we should expect its activities to serve Gates's private interests rather than the common good. It requires some knowledge about the particulars of the Foundation's activities, and context to understand how it differs from the activities of other foundations, to come to a more sympathetic view. 

Max_Daniel @ 2021-03-17T11:25 (+35)

One explanation based on Haidt's Moral foundations theory would be:

I don't know much about how solid moral foundations theory is, and haven't thought much about how plausible I find this explanation or how much of the effect I'd guess it explains.

Jordan_Warner @ 2021-03-18T09:04 (+30)

I honestly think that the progressive movement increasingly values Loyalty (i.e. you're not a real minority if you're politically conservative) and Sanctity (saying the N-word or wearing blackface makes white people "unclean" in a way that cannot fully be explained by the Care/Harm framework), so if anything I think Haidt's Moral Foundations theory is more right than even Haidt suspected; the taboos and tribes of the Left are simply still being defined.

Max_Daniel @ 2021-03-18T13:21 (+5)

Interesting, yeah. That sounds at least partly right to me, though I don't know enough about moral foundations theory or current progressive discourse to have a strong take on how much I believe in this vs. other explanations for the observations you've described.

Benjamin_Todd @ 2021-03-16T12:08 (+34)

Thank you for this summary!

One thought that struck me is that most of the objections seem most likely to come up in response to 'GiveWell style EA'.

I expect the objections that would be raised to a longtermist-first EA would be pretty different, though with some overlap. I'd be interested in any thoughts on what they would be.

I also (speculatively) wonder if a longtermist-first EA might ultimately do better with this audience. You can do a presentation that starts with climate change, and then point out that the lack of political representation for future generations is a much more general problem.

In addition, longtermist EAs favour hits based giving, and that makes it clear that policy change is among the best interventions, while acknowledging it's very hard to measure effects, which seems more palatable than an approach highly focused on measurement of narrow metrics.

Stefan_Schubert @ 2021-03-16T13:18 (+55)

There might be a risk that some view the (very) long-run future as a "luxury problem", and that focusing on that, rather than short-term problems in your own country, reveals your privilege. (That attitude may be particularly common concerning causes like AI risk.) My guess is that people are less likely to have such an attitude towards someone who is focusing on global poverty. 

JoshYou @ 2021-03-17T01:39 (+38)

Longtermism isn't just AI risk, but concern with AI risk is associated with an Elon Musk-technofuturist-technolibertarian-Silicon Valley idea cluster. Many progressives dislike some or all of those things and will judge AI alignment negatively as a result.

Jordan_Warner @ 2021-03-18T09:07 (+11)

I wonder if it's a good or bad thing that AI alignment (of existing algorithms) is increasingly being framed as a social justice issue. Once you've talked about algorithmic bias, it seems less privileged to then say "I'm very concerned about a future in which AI is given even more power".

EmmaAbele @ 2021-06-06T19:55 (+31)

In talking to many Brown University students about EA (most of whom are very progressive), I have noticed that longtermist-first and careers-first EA outreach does better, and this seems to be because of the objections that come up in response to 'GiveWell style EA'. 

HowieL @ 2021-03-14T14:37 (+30)

Indeed, IIRC, EAs tend to be more progressive/left-of-center than the general population. I can't find the source for this claim right now.

 

The 2019 EA Survey says:


"The majority of respondents (72%) reported identifying with the Left or Center Left politically and just over 3% were on the Right or Center Right, very similar to 2018."

https://forum.effectivealtruism.org/posts/wtQ3XCL35uxjXpwjE/ea-survey-2019-series-community-demographics-and#Politics

timunderwood @ 2021-03-23T09:58 (+12)

I think the survey is fairly strong evidence that EA has a comparative advantage in terms of recruiting left and center left people, and should lean into that.

The other side though is that the numbers show that there are a lot of libertarians (around 8 percent) and more 'center left' people who responded to the survey than there are 'left' people. There are substantial parts of SJ politics that are extremely disliked amongst most libertarians, and lots of 'center left' people. So while it might be okay from a recruiting and community stability pov to not really pay attention to right wing ideas, it is likely essential for avoiding community breakdown to maintain the current situation where this isn't a politicized space vis a vis left v center left arguments.

Probably the ideal approach is some sort of marketing segmentation where the people in Yale or Harvard EA communities use a different recruiting pitch and message that emphasizes the way that EA is a way to fulfill the broader aim of attacking global oppression, inequity and systemic issues, while people who are talking to Silicon Valley-inspired earn-to-give tech bros should keep with the current messages that seem to strongly resonate with them.

More succinctly:  Scott Alexander shouldn't change what he's saying, but a guy trying to convince Yale Law students to join up shouldn't sound exactly like Scott.

Epistemologically this suggests we should spend more time engaging with the ideas of people who identify as being on the right, since clearly this is very likely to be a bigger blindspot than ideas popular with people who are 'left wing'.

Cullen_OKeefe @ 2021-03-14T23:00 (+2)

Thanks!

Cullen_OKeefe @ 2021-03-24T05:43 (+29)

Another thought I meant to include with my original post:

These reflections/experiences have also led me to believe that, all else equal, EA groups at colleges are more valuable than ones at grad schools. Anecdotally, One For The World college chapters were much more successful on average than HLS's, despite HLS grads' higher earning potential. My model is that many people adopt the sort of EA-skeptical progressive worldview described here in college, which makes outreach in grad schools harder.

I think making EA a viable alternative or complement to which college students are exposed during their formative years would be very valuable for this.

jlewars @ 2021-04-04T04:06 (+8)

Thanks for the mention :-)

Not sure how helpful this is, but grad schools typically move more money (certainly per pledger/per student/per class etc. and often in naive terms). We have no idea yet of the long term changes in attitudes/actions and how those relate to school-type.

Also FWIW someone just started raising OFTW pledges at HLS and is absolutely crushing it - about $20k/annum of pledges in about a fortnight!

Cullen_OKeefe @ 2021-04-04T23:43 (+2)

Ah great, very happy to hear about the broader success. Seems like the causes may have been more local to my approach while leading HLSEA.

AGB @ 2021-03-17T11:18 (+24)

This has been a philosophical commitment since the early days of EA, yet information on how we (or the charities we prioritize) actually confirm with recipients that our programs are having the predicted positive impact on them receives, AFAICT, little attention in EA.

[Within footnote] As an example, after ten minutes of searching I could not find information on GiveWell's overall view on this subject on their website.

 

FWIW, the most closely related GiveWell article I'm aware of is How not to be a "white in shining armor". Relevant excerpts (emphasis in original):

We fundamentally believe that progress on most problems must be locally driven. So we seek to improve people’s abilities to make progress on their own, rather than taking personal responsibility for each of their challenges. How can we best accomplish this?...

A common and intuitively appealing answer is letting locals drive philanthropic projects...At the same time, we have noted some major challenges of doing things this way. Which locals should be put in charge?...

Another approach to “putting locals in the driver’s seat” is quite different. It comes down to acknowledging that as funders, we will always be outsiders, so we should focus on helping with what we’re good at helping with and leave the rest up to locals...

It’s not that we think global health and nutrition are the only important, or even the most important, problems in the developing world. It’s that we’re trying to focus on what we can do well, and thus maximally empower people to make locally-driven progress on other fronts.

Meadowlark @ 2021-03-17T02:48 (+19)

Great post! I think this is an issue worth a lot of exploration. My sense though—both from reacting to your post and from my own reflection—is that there is probably a pretty low ceiling in terms of how much is possible here. I'll speak from my own experience as both a fan of EA and as a leftist.

1. It seems to me that EA, right now, has two areas of congregation (very broadly speaking): university/city groups and professional networking circles. So if you're involved in EA you're probably one of the following: a student, someone with a pretty niche expertise, or someone in between. You might have a graduate degree from a top university, and you might be a serious contender for some pretty "big" jobs at important institutions. Pretty much, you might be (obviously this is a generalization) a member of the "professional-managerial class" (PMC). This class status—which is distinct from both the working class and capital owners—is, I think, what EA will always be made up of, and EA will therefore always appear (understandably) "elitist" to leftists who are sensitive to working-class politics. To many leftists, EA will always appear like a niche intellectual exercise that is being done by members of the PMC, and will never be truly available to members of the working-class, who leftists view as the true source of political power. 

2. Smaller point, but I would distinguish between progressives (say, like Elizabeth Warren or Ezra Klein) and leftists (like Bernie Sanders or Elizabeth Bruenig). These two groups have similarities but are different in one of the most important ways: their views on capitalism. The former is more likely, it seems, to be interested in EA (especially people like Klein who of course already likes EA), but the latter will never be fully  into EA because EA, generally speaking, does not make fundamental critiques of global capitalism. You can work on things like diversity, equity, and inclusion, but a failure to criticize capital will lead to a dead-end in how many people on the left are interested because that's what the left is.  

So, I guess my point is that although I think this is important, unless you can make EA look more like a working-class movement (or at least less like a movement created and run by the PMC) then there will never be much overlap between leftists and EAs. 

As both an EA and a leftist myself, this is of course very troubling to me if true! 

Linch @ 2021-03-18T17:17 (+9)

Speaking descriptively, are most active leftists members of the working class rather than the PMC? My impression is that while many working-class people have implicitly leftist views on economics, the demographics that leftists predominantly draw from for activism is the highly educated PMC class, similar to EA. 

This impression can of course be wrong due to selection bias of who I end up talking with, so I'd personally find it valuable to correct for this bias! 

Meadowlark @ 2021-03-18T19:04 (+2)

Good point! My intuition is that it's probably true that self-identified leftists are often indeed members of the PMC. But this could be in part because of a similar selection bias on my part.

I think the difference is, though, that left politics often draws power from the working-class even if the working-class of course contains people of very diverse political viewpoints. Like, not everyone striking in a labor union is necessarily an identified socialist, but the political act they're engaging in arguably is one. 

Whereas with EA, both the members of the community and the places where power is located in the community are mostly the PMC (with exceptions). Like, descriptively most EAs are well-educated and so on, and most EA solutions are ones that would derive from well-educated people. 

Jordan_Warner @ 2021-03-18T09:11 (+6)

I feel EA would be very interested in a socialist running a cost-benefit analysis of the global proletariat revolution; the 20th century has presumably given us enough data to make it less speculative than a lot of things EAs are concerned about.

Garrison @ 2021-03-18T14:33 (+6)

The thing that dem socs in the us want, a socialist economy and government, hasn't really happened in a rich country. The closest example would be Sweden in the 70s. I don't think there's much value in comparing the results of left wing revolutions in extremely poor and war ravaged countries with what might happen if dem socs like bernie sanders were to be able to enact their agendas in rich countries. The most economically left wing governments and societies in the rich world, i.e. Scandinavia, are some of the best places to live based on a whole host of metrics.

Jordan_Warner @ 2021-03-19T07:28 (+8)

I think it's important to be clear that Scandinavian Social Democracy is not a socialist economy or a socialist government - I'm a big fan of the Nordic countries and think they'd be great to emulate, but (like all  countries) Sweden is somewhere in between "capitalism" and "socialism",  using taxation and a strong welfare state to ensure that the benefits of capital are widely distributed without total redistribution.  Based on the 20th century, I'm pretty confident that the optimal system of government has both free markets and government control.

I see the Capitalist/Socialist false dichotomy as a relic of the Cold War, with neither side able to admit that the other had a point. Total laissez-faire capitalism is pretty unpleasant for the people on the bottom, but it's the height of hubris to think the government can centrally plan the entire economy - and as soon as the Chinese stopped trying, it turned out pretty well for them!

timunderwood @ 2021-03-23T10:10 (+2)

Possibly the solution should be to not try to integrate everything you are interested in.

By analogy, both sex and cheesecake are good, but it is not troubling that for most people there isn't much overlap between sex and cheesecake. EA isn't trying to be a political movement; it is trying to be something else, and I don't think this is a problem.

Meadowlark @ 2021-03-23T13:45 (+8)

I think this is more or less correct. EA is not destined to be compatible with everything that we care about, and I think we should be thinking hard about what EA is capable of being and that the project of bringing in leftists is way more difficult than a few messaging tweaks. Those tweaks might bring in a few left-liberals, but once many leftists really see EA—i.e. as more than just a "you should donate more effectively" project—they will not be super interested, I think. 

Cullen_OKeefe @ 2021-03-24T16:03 (+16)

Discussion of progressive ordinal speciesism on the latest 80,000 Hours podcast:

Robert Wiblin: What’s something important that your political fellow travelers get really wrong, in your view?

Ezra Klein: Animal rights. Maybe since I’ve already said that, you want me to do a different one. But I do first want to say just animal rights.

Ezra Klein: I think this is just a tremendous quantity of suffering that a political movement that thinks of itself as concerned with suffering ignores. Not only ignores, but mocks and dismisses. A lot of people who think of themselves as good on all these issues, you say, “Well, how about we don’t torture so many chickens?” They’re like, “Oh, you crazy vegan.”

Robert Wiblin: Yeah.

Ezra Klein: I really don’t like it. I think it’s a way we teach ourselves to be less compassionate.

Harry_Taussig @ 2021-03-15T16:49 (+9)

Thanks for writing! This definitely helped clarify some of the push-back I often get when trying to explain these ideas to friends.

For reasons that elude my comprehension, many progressives do not seem to conceptualize the current assortment of economic and legal policies that cause some countries to be ~100x richer than others as a relevant form of oppression. If they do, they are unlikely to give it as high a priority as, e.g., within-country racial disparities or within-country economic inequality. 

This will definitely stick with me. It seems the only way to get around this contradiction is to just not think about it, but maybe I'm missing something?

Cullen_OKeefe @ 2021-03-15T23:48 (+8)

I think it's a matter of prioritization and non-quantification: they either don't really appreciate how much bigger/worse extreme poverty is, or else agree that it's very bad but just don't want to get involved in stopping it because they're worried about being Neo-Colonialist or something similar and it's easier to just focus on the domestic context.

sky @ 2021-03-19T16:45 (+36)

I haven't read this whole thread, so forgive me if I'm re-stating someone else's point. 
I think there's another explanation: they have a hypothesis about you/EAs/us that we are not disproving. 

My experience has been that people in any numerical or social minority group (e.g. Black Americans, people with disabilities, someone who is the "only" person from a given group at their workplace, etc), are used to being met with disappointing responses if they try to share their experiences with people who don't have them  (e.g. members of the numerical or social majority group that they are different from).  Most of us have had this experience at least some of the time, maybe as EAs! People get blank stares, unwanted pity or admiration,  or outright dismissal and invalidation (e.g. "it can't be all that bad" or "you're just playing the [race/poverty/privilege/ whatever] card"). This is definitely the kind of conversation people see over and over again on the internet. So, until proven otherwise, that's what people expect. Majority group members are expected to be ignorant of what life is really like for people who experience it differently. I think this is a rational expectation at least some of the time. The hypothesis then goes: EAs look like majority group members and often are, ergo anything EAs say about which problems are "most important" is assumed to be somewhat ignorant. Maybe people see it as well-meaning or callous ignorance. Regardless, ignorance is assumed as most probable, because it's true of most people. (I think EAs and progressives also have different models of when ignorance matters the most and when differences matter the most, but that's a different thread).  

I've usually taken the view that I don't get to assume people will see me as an informed, compassionate person on the progressive left until I disprove the hypothesis above. If the first thing I say is something like why local US poverty issues are "less important" than other issues, I've just reinforced the hypothesis rather than disproven it. It sounds like denying the reality that they know is true -- they've seen the real-life people impacted and/or read their stories or studied the human impact of these issues.  At least in my case, it's not true that they struggle to think of people in other countries as real people too. (My progressive friends have often lived abroad, have family in other countries, or work in immigrant communities). It's a trust issue. If they see me denying that local issues are "real/important," I must be ignorant, and worse, I must be unwilling to be bothered with the real-life experiences of people different from me. Why should they trust anything I say after that about helping people? "But Africa though!" sounds like a deflection, not a genuine consideration or a sincere, compassionate challenge of their own thinking about poverty. 

When I speak first about things we both care about and share sincere examples of the ways that I do see and care about the depth of personal stress that US poverty and racial disparities have on people I actually know, I haven't had a progressive friend respond by saying that poverty in other countries didn't matter.  I brought it up second though, and that seems to make a difference. If someone trusts that I am a caring, informed person, not a callous ignorant one, we can expand the scope of the conversation from there.

Fwiw, I can't think of a time this has led to changed actions on their part. 

Bluefalcon @ 2021-03-16T12:06 (+10)

I think it's because they know women/poc/trans ppl/ppl on whatever fashionable domestic axis of inequality you want to look at, but don't know anyone who lives in Burundi, and because the experience of oppressed people in America is still close enough to their own to actually empathize with. Lot easier to empathize with your friend who got called a slur than with someone dying of malaria in Africa. Both because they are your friend, and because you've probably been called mean names, maybe even by the same type of asshole tossing slurs at them, whereas deadly diseases that affect young healthy people are hard to even imagine. 

Meadowlark @ 2021-03-17T16:29 (+31)

This is a useful point but I would add a little bit to it. People on the left often think about racism, transphobia, and homophobia as quite a bit more than a POC friend of theirs being called a slur. Leftists often think of these as fundamentally systemic issues with very real, often physical, consequences. Like, racism in the US can manifest as, say, an entire generation of poor Black families being poisoned by a local CAFO, or an inability to develop intergenerational wealth due to explicitly racist economic policy.

I think sometimes EAs can offer a rather uncharitable take on the left, like that the left's concern with racism is just "SJW Safe Space" stuff or whatever. Not saying that's what's happening in this thread, but I would just say that if EA wants to be more open to progressives and leftists, it has to take very seriously what they actually believe. 

As an example, I was pleased to see that the broad EA take during the BLM summer protests didn't seem to be just "well people should donate to AMF instead of buying markers and signs," which may have been the take of 2015 EA. Whether EAs agree with them or not, ideas like socialism, progressivism, social justice, and so on, are serious ideas and shouldn't be dismissed in the way that I sometimes have seen them dismissed. 

ljusten @ 2021-03-17T16:22 (+9)

Yes, there is a kind of "narcissism of small differences" in which societal progress is measured in the context of wealthy Western countries instead of the broader world. The social justice initiatives in the U.S. do not benefit or extend to people of color in poorer countries who often suffer under even more pronounced economic or state injustices (e.g. deadly malaria mosquitoes, malnutrition, lack of access to healthcare, jobs, education, and internet, government oppression, etc.). I believe this is in part because people in the U.S. don't know how much worse quality of life can be in poorer or more authoritarian countries. 

Larks @ 2021-03-14T17:25 (+9)

EAs tend to reject person-affecting views of population ethics. This, however, has uncomfortable implications for some hot-button issues on the left, like reproductive rights and environmental ethics.

 

I can see why left wing views on abortion would bias people against totalist views, because they do not want to accept the implication that someone's desire to abort their child could be 'outweighed' by the interests of a possible-person. And I guess totalism would also imply we should have more children, in contradiction to the idea that we should have fewer to protect the environment. But it would naively seem that being concerned about the environment would make you more amenable to longtermist views (as distinct from totalism), because if you don't care about future people then most of the damage from climate change can be ignored.

Cullen_OKeefe @ 2021-03-15T23:47 (+4)

And I guess totalism would also imply we should have more children, in contradiction to the idea that we should have fewer to protect the environment.

This is mostly what I was referring to. Matt Yglesias has often said that he gets a lot of pushback against his One Billion Americans book from leftists who implicitly have some sort of prior against both population and economic growth.

Also, as Michael says below, I think they (like most people who aren't moral philosophers) just don't really have coherent population ethics.

jackmalde @ 2021-03-14T19:57 (+3)

Also, person-affecting views can lead to the bizarre conclusion that we don't need to worry much about contributing to climate change because the people in the future wouldn't have existed if we hadn't done so - so we won't actually have harmed them (provided their lives are net good).

AKA the non-identity problem.

MichaelStJules @ 2021-03-15T20:28 (+5)

I would assume that progressives concerned with the welfare of future generations (maybe most?) don't have these specific kinds of person-affecting views, although most probably have not thought that much about population ethics or metaphysical identity issues at all. I think the closest steelman might look like:

  1. the wide and soft asymmetry view here (Thomas) or here (Frick), which does fine on the non-identity problem,
  2. dying is bad, so extinction would at least be bad for the people who die and don't want to,
  3. and maybe they separately value the preservation of humanity, like this (Frick), or in something like an animal conservationist way, but more humans isn't (always) better. Or, they aren't actually person-affecting, but recognize decreasing marginal value in additional lives as a population increases.

david_reinstein @ 2021-11-08T00:51 (+5)

Perhaps overlooked take? (Somewhat echoing other commenters, though)

US politics is supremely polarised. Those self-identifying as progressive are mainly motivated by opposing the right, and vice versa.

EA doesn’t particularly focus on labelling the biggest issues as problems that are the “fault” of the right (or left).

Thus our reasoned and goal-driven approach will leave people on each side cold.

But conversely, I think some of the steps suggested could, at the very least, make outreach to people right of center (or 'anti-woke') more difficult.

E.g., “commitment to using my class privilege…” narratives.

deluks917 @ 2021-03-14T06:31 (+5)

I think of the intersectionality/social justice/anti-oppression cluster as being a bit more specific than just 'progressive' so I will only discuss the specific cluster. Through activism, I met many people in this cluster. I myself am quite sympathetic to the ideology. 

But I have to ask: How do you hold this ideology while attending Harvard Law? From this perspective, Harvard Law is a seat of the existing oppressive power structure and you are choosing to become part of this power structure by attending. The privileges that come from attending Harvard Law are enormous. Harvard Law graduates earn extremely high salaries (even the starting salaries are high) and often end up with very high net worths. Harvard Law is also obviously strongly connected to many parts of the neoliberal capitalist system. 

From a certain perspective, being a leftist at Harvard Law can be viewed as trying to become some sort of 'class traitor' to the neoliberal elite. This does not seem like the obvious thing to do from a leftist perspective. Much leftist analysis would suggest that it's much more likely you just end up part of the neoliberal power structure instead of subverting it. 

In your experience how do these people resolve the contradiction?

Julia_Wise @ 2021-03-15T17:15 (+38)

Ironically, the situation in which I have most frequently been asked about whether EA is elitist is while giving intro talks about EA at MIT, Yale, etc.

Cullen_OKeefe @ 2021-03-15T23:49 (+7)

This is my experience too.

xuan @ 2021-03-21T02:14 (+33)

Based on my experiences as a Yale undergraduate, I've come away with the perhaps overly pessimistic conclusion that a lot of class-privileged leftists at Ivy+ schools don't actually resolve that contradiction, and are unfortunately not that interested in interrogating and addressing their class privilege, or thinking about redistributing what familial or future wealth/resources they may have access to. I say this both as a former organizer of Yale EA and as someone who started a Resource Generation chapter there and found it difficult to get people to engage. By way of comparison, it was considerably easier to find people interested in the local DSA chapter.

(For context, Resource Generation is a movement that organizes young (USAmerican) people with wealth or class privilege to redistribute their wealth, land, and power, and I see it as perhaps the most viable movement for class-privileged US leftists who are really interested in addressing the contradiction of being both leftist and wealthy. See for example their giving pledge guidelines, which are considerably more ambitious than GWWC, and have as their goal for the " top 10% to develop plans to redistribute all or almost all (see below) inherited wealth and/or excess income". )

It's hard to have a charitable take in response to that data, but I think it's partly that people find it quite uncomfortable to talk about class, let alone interrogate their own class privilege in a deep way. The other part is that the social incentives in these schools and activist circles tend to reward more external-facing leftist actions like fossil fuel divestment protests, and not internal-facing actions like confronting one's wealthy family to redistribute their wealth - in part because to do that publicly, you have to reveal your family is wealthy, which isn't exactly celebrated in leftist spaces.

Max_Daniel @ 2021-03-22T08:35 (+5)

That's really interesting, thanks for sharing your experience with these efforts.

Only partly on-topic, but I'm wondering if Jerry Cohen's If You're an Egalitarian, How Come You're so Rich? may be a good book for such audiences.

As far as I remember it, it doesn't actually make that strong a case that rich egalitarians ought to redistribute most of their wealth. (I actually think that most of what I got from that book was reflecting on some weird parallels between Marxism and AI risk thought, and the role of philosophers in both.) But it at least raises and somewhat discusses the question, and it's by one of the main 'analytical Marxists' and so might have more initial credibility to leftists.

xuan @ 2021-03-26T18:16 (+6)

I have read the paper, not the book! And have tried to get friends to read it, though unfortunately I don't think it was necessarily very effective either. I did end up writing an op-ed (Reparation, not just Charity) once trying to motivate wealthy students to redistribute more of their wealth, and it received a lot of likes on social media, but I'm not sure that it led to meaningful behavioral change :/ I think behavioral changes and commitments just take a lot more work, and a supportive community to encourage it. 

John_Maxwell @ 2021-05-23T05:15 (+4)

Just for reference, there's a group kinda like Resource Generation called Generation Pledge that got a grant from the EA Meta Fund. I think they've got a bit more of an EA emphasis.

deluks917 @ 2021-03-21T20:02 (+4)

Really cool to learn about Resource Generation. These fellows are hardcore. I promote the following to EA-type people:
-- Donate at least 10% of pre-tax income (I am above this)
-- Be as frugal as you can. Certainly don't spend more than could be supported by the median income in your city. 
-- Once you have at least ~500K net worth, give away all additional income. In my opinion, 500K is enough to fund a lean retirement if you are willing to accept a little risk. 

-- If you get a big windfall, I suggest either putting it in a trust or just earmarking it for charity instead of immediately donating the whole thing; your cause prioritization may change (I regret how I donated a big windfall during the first crypto bull market.)

I don't think people should have to work if they don't want to, so I think it's reasonable to 'save yourself'. But don't strive for too much security and keep your spending lean. I was objectively raised in a far-from-top-10% household and have not received much money from my parents. For example, they contributed zero dollars to my college. But anyone who is able to 'speedrun to 500K while donating' (or even seriously consider it) must be very privileged somehow.

If you actually take my advice seriously it is quite strict. But RG seems a lot more hardcore than that. 

timunderwood @ 2021-03-23T10:20 (+2)

I feel like trying to be charitable here is missing the point.

It mostly is Moloch operating inside of the brains of people who are unaware that Moloch is a thing, so in a Hansonian sense they end up adopting lots of positions that pretend to be about helping the world, but are actually about jockeying for status position in their peer groups.

EA people also obviously are doing this, but the community is somewhat consciously trying to create an incentive dynamic where we get good status and belonging feelings from conspicuously burning resources in ways that are designed to do the most good for people distant in either time or space.

tamgent @ 2021-03-24T13:23 (+12)

I don't think xuan's main point was about being charitable, although they had a few thoughts in that direction. More generally, trying to be charitable is usually good. Of course it's going to miss a point (what finite comment isn't), but maybe it's making another?

I appreciate you trying to bring the discussion towards what you see as the real reason for lefty positions being held by privileged students (subconscious social status jockeying), but I wonder if there's a more constructive way to speculate about this?

Maybe one prompt is: how would you approach a conversation with such a lefty friend to discover if that is their reason, or not?

You could be direct, put your cards on the table, and say you think they are just interested in the social status stuff, and let them defend themselves (that's usually what happens when you attack someone's subconscious motivation, regardless what's true). Or you could start by asking yourself, what if I was wrong here? Is there is another reason they might hold this position on this topic? That might lead you to ask questions about their reasons. You could test how load-bearing their explanations are, by asking hypotheticals, or for them to be concrete and specific. Maybe you, or they, end up changing/modifying your position or beliefs, or at least have a good discussion, with at least one person having more understanding going out than you had coming in. In any case, I think a conversation that assumes good faith is more likely to lead to a productive discussion.

Circling back to the initial thing: I'm assuming that you do see the value in being charitable and assuming good faith in general, and just feel it is hard to practice this in conversations when people are very attached to their positions. But let me know if not, i.e. if you do genuinely think there is no point in being charitable (as that would be our true disagreement, this seems unlikely).

Please correct me if I've misunderstood you here. 

+ nitpick: you use terms people might not have heard of. If I look up 'Moloch' I don't immediately see the article by Scott Alexander that I think you have in mind, just a Wikipedia article about the god. 

david_reinstein @ 2021-11-08T00:56 (+4)

Also, I wanted to ask for more evidence for “EAs tend to reject person-affecting views of population ethics”.

I am personally fairly sympathetic to this view, at least in a common-sense moderated version as suggested by Michael St. Jules. I know some prominent voices (Wiblin) seem to favor the totalist view, but my impression is that they typically moderate this with "but even if you have other views of population ethics we think the recommendations are similar".

willbradshaw @ 2021-11-08T09:51 (+11)

I don't know where you would go to get more quantitative evidence of this (the EA survey?) but the headline claim matches my experience. Many longtermist EAs seem to be highly motivated by "creating huge numbers of future happy people" sorts of arguments that fall flat from a (strongly) person-affecting view.

tamgent @ 2021-03-14T11:47 (+4)

Thanks for the article, interesting and well-written. I'm sure will be useful as a reference for me in some future conversations.

With reference to your section titled Incompatibility Between Intersectionality and Prioritization - how do you see worldview diversification fitting in?

To me, this perspective incorporates the value of diversification of causes (which intersectionality protects) whilst still being realistic about actually getting things done (which prioritization protects). Under a worldview diversification lens, prioritization is less about focusing on one thing to the exclusion of all others, whilst still not going as far as to say all causes are equal and should have an equal place at the table.

Jsevillamol @ 2021-03-14T23:40 (+11)

I feel like invoking worldview diversification here is discussing things at the wrong level. 

It's like saying "oh, it's ok that you believe in intersectionality, because from a worldview diversification perspective we want to work on many causes anyway", and failing to address the fundamental disagreement: within their worldview, an intersectionalist does not find cause prioritization useful.

Like, I feel the crux of intersectionality is that different problems are interwoven in complex, hard-to-understand ways. So, as the OP pointed out, if you believe this you'll need to address all problems at once by radically restructuring society.

Meanwhile, the crux for worldview diversificationists is that we are not certain of our own values or how they will change, so it is better to hedge our bets by compromising between many views.

tamgent @ 2021-03-15T10:20 (+20)

That wasn't really what I was saying, and I don't think you're steelmanning the intersectionalist perspective, although I agree with your description of the crux. I think many (maybe most?) people who like intersectionality would agree that prioritization is sometimes necessary and useful.

An attempt to steelman intersectionality for a moment:
- problems are usually interwoven and complex
- separating problems from their contexts can cause more problems
- saying one problem is more important than another has negative side effects, because we are trying to fix a broken hammer with a broken hammer (comparison culture is itself a cause of many problems, a belief held by many progressives, I think)

I am unsure this is incompatible with prioritization, which in my view is simply a practical consequence of not having infinite resources. I think they'd agree, and would not take issue with, for example, someone dedicating their life solely to climate change, as long as that person did not go around saying climate change is more important than all the other important issues, and also saw how climate change is related to, for example, improving international governance or reducing corruption, and worked with those efforts rather than in competition with or undermining them.

I think viewing most intersectionality proponents as people who cannot ever work on one thing because they literally need to address all problems at once is an overly literal interpretation, although it's possible to get this impression if there are a few loud ones like this (I don't know enough to know).

The disagreement seems to be more about whether it is helpful to compare the importance of issues in a public way. Comparing things, whilst necessary and important, can have side effects such as making some people feel bad about the good thing that they are doing because it isn't the best thing a person could in theory be doing. We are familiar with this from 80K's mistakes.

I was focusing more on the marketing side like Cullen, and wondering whether worldview diversification might be a way to better connect with intersectionality proponents via a message like this:

problems are complicated and sometimes entangled, and we can work on many at once, on a group level, but also our resources are finite, so when allocating them, trade-offs will need to be made

Jsevillamol @ 2021-03-15T14:11 (+5)

I find your steelman convincing (would love more intersectionalists to confirm though!).

Re: downsides of intercause prioritization. Beyond making people feel bad about their work, systematic prioritization can systematically misallocate resources, while a more informal, holistic and intersectional approach is less likely to make this kind of mistake.

Arguably, while EAs are very well aware of the importance of hit-based giving, they are overly focused on a few cause areas. Meanwhile, my (naive) impression is that intersectionalists are successfully tackling a much wider array of problem areas and interventions, from community help to international aid and political lobbying.

I do not think it is a stretch to say that prioritization frameworks are partly to blame for cause convergence in the EA community.

Cullen_OKeefe @ 2022-05-16T05:36 (+3)

Great discussion of Econ Aversion from Julia Wise here: https://juliawise.net/economics-not-as-bad-as-i-thought/

rootpi @ 2021-03-18T10:50 (+3)

Thanks for this great post. I'm closer to left-libertarian or classical liberal myself, but I have many friends and family (mostly in the US) who are more traditional progressives and much more sympathetic to typical social justice concerns than to EA. I agree with many of the issues identified here (including in the comments); my own experience has been largely that they want to be able to "walk and chew gum at the same time". As an economist, I'm imbued with notions like opportunity cost and only being able to optimize one goal at a time (potentially itself an aggregation, of course), but this is very foreign and off-putting to them. Either they don't understand the size of the actual disparities between issues, or... well, actually I'm not sure; it's hard for me to wrap my head around.

However I particularly wanted to mention an illuminating recent post by Matt Yglesias (who came up elsewhere in the comments) on his substack:  https://www.slowboring.com/p/slate-star-codex 

The main topic is distinct, but from "The radicalism of effective altruism" onward it is very relevant and informative. On the one hand, Yglesias is criticizing the journalist's progressive critique of EA, SSC, Silicon Valley, etc. On the other hand, Yglesias (who is definitely on the left, and who likes evidence and reason a lot) doesn't end up very sympathetic to EA himself. He thinks of it as purely consequentialist, extreme, etc. Even if it's hard to attract some full-on progressives, someone like him should be exactly the type of person who supports EA. Something has gone wrong with the messaging if that isn't the case, and we are missing out.

MaxRa @ 2021-03-18T12:25 (+6)

I agree with the last point, and I think EA is doing fairly well on the "being sympathetic to Matt Yglesias" front.

Jordan_Warner @ 2021-03-18T08:54 (+3)

I found this helpful. I'm in a similar situation, having moved from "social justice" (mainly concerned with homelessness in my own city) to Effective Altruism, so I'm trying to think of good ways to engage people, and I'm slightly concerned that if we don't phrase things in the right way the left may try to destroy us.

I wonder if talking about the causes of international economic inequality makes it seem more like an issue of injustice to be addressed from a progressive/social justice framework? That's one way I'd frame the issue when talking about EA principles to a left-of-centre audience. I don't subscribe to a zero-sum view of development in which all wealth is taken from someone else, but it's undeniable that most currently wealthy nations benefitted from colonialism at the expense of the rest of the world, and we all continue to participate in an economic system that is pretty clearly constructed to benefit multinational corporations rather than individual producers. I'd also argue that donating to effective charity should at least be part of living an ethical lifestyle, and that many of the other issues people may find more emotionally compelling, like human trafficking and exploitative employment, are primarily rooted in poverty.

I also point out how basically everyone in the audience is in the top 10% globally, although I feel like this is probably less effective when talking to students, since their wealth is mostly in the future. I've also found that the very progressive idea that everyone should be treated equally is one argument in favour of international aid: that x100 multiplier goes a long way! However, it is difficult to convince people that life for the poorest 10% of people in the world really is a lot worse than life for the poorest 10% of people in a wealthy country, although access to food, medicine and housing is probably the area that makes this clearest.

Also, use emotional appeals; that's just good advice when trying to persuade humans generally. Ideally, though, use them to support rather than replace facts and evidence, because we probably can't win solely on emotional appeals. This is obviously easiest in the context of global health: AMF has loads of pictures of smiling children holding mosquito nets, and GiveDirectly has loads of personal stories of how people actually spent the money.

Arepo @ 2021-03-18T09:09 (+2)

Helpful post!

What makes you say rejecting person-affecting views has uncomfortable (for progressives) implications for environmental ethics, out of curiosity? I would have thought the opposite: person-affecting views struggle not to treat environmental collapse as morally neutral if it leads to a different set of people existing than would have otherwise.