Is the 10% Giving What We Can Pledge Core to EA's Reputation?

By DirectedEvolution @ 2023-06-06T05:42 (+16)

Introduction

I don't think the 10% norm forms a major part of EA's public perception, so I don't believe tweaking it would make any difference. - RobertJones

10% effective donations has brand recognition and is a nice round number, as you point out. -- mhendric

These two comments both received agreement upvotes on my recent post Further defense of the 2% fuzzies/8% EA causes pledge proposal, and although they're differently worded and not irreconcilable, they seem basically to stand in contradiction with each other. Is the 10% Giving What We Can pledge, in which participants commit to donating 10% of their annual income to an effective charity, part of EA's brand or reputation?

Tl;dr

This post does not resolve the question of how core the specific idea of pledging 10% of one's income to effective charities is to the EA movement or to its reputation. But the pledge is provocative, and the fact that so many EAs commit to it seems clearly to play a role in the media's interest in the movement, in both its appreciation for EA and its criticism of it. Based on the couple of hours I put into looking through this material, here is my model of the role of the 10% GWWC pledge in EA and in its outward-facing reputation:

Clearly, a 2%/8% or 2%/10% fuzzies/utilons standard for an earning-to-give pledge would be a concrete way to show we've taken on board some of these critiques. It would be an example of making allowance for people's multiple loyalties, showing that we are not absolutist, and taking action in response to valid criticism.

I think that some non-EAs who hear about the 10% pledge will find its clarity compelling, but others will be alienated by its perceived absolutism. It may not be clear to them that you can also donate more, or that it's probably fine (and at the very least a huge marginal improvement) if you donate a little less to EA causes and the rest to the opera.

Reaffirming My Position

Overall, I conclude that the notion of earning to give 10% to effective charities is a core part of EA's brand, and that its simplicity is often a source of criticism. These are good reasons to continue arguing for a shift to a 2%/8% or 2%/10% pledge.

What is the Giving What We Can pledge and how popular is it within EA?

On its about page, GWWC says it was founded by Toby Ord and Will MacAskill. Its mission statement says:

Giving What We Can's mission is to make giving effectively and significantly a cultural norm. We mean this quite literally: our goal isn't just to marginally increase the amount of money going to effective charities — we're aiming to make meaningful cultural change.

They define their community primarily in terms of its donating members:

At its heart, Giving What We Can is a community. We are a group of like-minded people who are committed to make a meaningful positive impact on others’ lives by donating to highly effective charities. 

The wording of the GWWC pledge is:

"I recognise that I can use part of my income to do a significant amount of good. Since I can live well enough on a smaller income, I pledge that from __ until __ I shall give __ to whichever organisations can most effectively use it to improve the lives of others, now and in the years to come. I make this pledge freely, openly, and sincerely."

GWWC says that 9,217 of its members have taken the pledge, and it's this number that they refer to in defining their size:

We’ve got over 9,217 members across the world

As of 2017, it was experiencing 80% year-over-year growth.

In addition to recruiting and supporting donors in maintaining a substantial level of charitable giving, GWWC also does research to identify charities it considers effective and runs a donation platform.

Among EA Forum posts, a search for "giving what we can pledge" returns 417 results, with 230 for "GWWC pledge," 819 for "giving what we can," and 2638 for "GWWC." Note that GWWC puts out a monthly newsletter that has "GWWC" in the title and maintains a dedicated EA Forum account; this accounts for a few dozen of these entries. I briefly skimmed a few of the top posts that pop up when searching for "GWWC," and they are, unsurprisingly, typically focused at least in part on motivating charitable giving.

"Earn to give" gets 1515 results.

For comparison with various cause areas, "vegetarian" gets 399 results, while "vegan" gets 1338 results. "Pandemic" gets 1471 results, "x-risk" gets 1016 results, and "ai safety" gets 1985 results.

Among EA-related organizations, "GiveWell" gets 1984 results, "GiveDirectly" 573 results, "EA Forum" gets 4544 results, "MIRI" gets 5291 results, and "Future of Humanity Institute" gets 495 results. "OpenPhil," "Open Phil," and "Open Philanthropy" respectively get 1532, 2661, and 1846 results.

Giving What We Can's 10% donation pledge was prominently featured in one of Scott Alexander's most popular Slate Star Codex posts, Nobody is Perfect, Everything is Commensurable. It gets plenty of mentions on LessWrong. Overall, I think it is fair to say that GWWC is identified pretty strongly with its 10% pledge, that it's among the most prominent EA organizations, and that it is enacting one of the most visible EA ideas, earning to give.

How does the media think about the idea of earning to give and the GWWC pledge?

The most I can offer here is an hour or two of searching the internet to find references to GWWC and see how the media portrays it in relation to earning to give.

The first thing I searched for was Will MacAskill's 12-minute interview on The Daily Show, which is the biggest pop culture moment for EA that I know of. 

Trevor Noah first digs into the topic of charitable donations just after the 3:30 mark, discussing the magnitude of MacAskill's giving and how it ties into issues of privilege. He specifically mentions the idea of giving away 10-20% of one's income, and MacAskill reinforces it (along with the idea of donating to effective nonprofits) at about the 10:55 mark. About 75% of the segment revolves around the idea of earning to give. Giving What We Can and the GWWC pledge are not mentioned in the interview, but Will's book is. Giving What We Can and its pledge are mentioned several times in the book.

The EA Forum has an Effective Altruism in the Media tag. The highest-relevancy post covers Sam Harris's podcast episodes with Will MacAskill. These resulted in Harris joining GWWC after two episodes, driving a spike of about 600 GWWC memberships, and according to Aaron Gertler, the post's author:

An extremely engaged community builder told me in February 2021: "I feel like most new EAs I've met in the last year came in through Sam Harris."

So just a couple of years ago, a well-received Sam Harris podcast was an important influence driving EA membership, and it specifically promoted the GWWC pledge.

The Joe Rogan podcast also featured MacAskill and specifically mentioned the GWWC 10% pledge. You can listen to it here.

The New York Times's critical coverage of Sam Bankman-Fried and his link to EA says that EA:

holds that taking a high-paying job is worthwhile if the end goal is to give much of the income away.

But it doesn't cover the GWWC pledge or the idea of specifically donating 10% of one's income. It does mention that Jane Street employs many people who practice earning to give and are into effective altruism or similar ideas. The article links to Nicholas Kristof's glowing coverage of Matt Wage, another practitioner of earning to give and effective altruism who donates half his income, and also mentions that Peter Singer donates a third of his income. Kristof describes EA as:

... a new movement called “effective altruism,” aimed at taking a rigorous, nonsentimental approach to making the maximum difference in the world.

Despite the very positive coverage of EA, one of Kristof's main concerns is specifically the level of an earning-to-give commitment (the percentage of income) and the idea that 100% of that donation should go to effective charities:

First, where do we draw the line? If we’re prepared to donate one-third of our incomes to maximize happiness, then why not two-thirds? Why not live in a tent in a park so as to be able to donate 99 percent and prevent even more cases of blindness?

I want to take my wife to dinner without guilt; I want to be able to watch a movie without worrying that I should instead be buying a bed net. There is more to life than self-mortification, and obsessive cost-benefit calculus, it seems to me, subtracts from the zest of life.

Second, humanitarianism is noble, but so is loyalty. So are the arts, and I’m uncomfortable choosing one cause and abandoning all others completely.

For my part, I donate mostly to humanitarian causes but also to my universities, in part out of loyalty to institutions that once gave me scholarships.

Another New York Times article on the FTX collapse describes EA as:

...a philosophy that advocates applying data and evidence to doing the most good for the many...

 

In a New Yorker article covering the collapse, the author articulates the tension at the heart of EA:


On the one hand, what makes the movement distinct is its demand for absolute moral rigor, a willingness, as they like to put it, to “bite the philosophical bullet” and accept that their logic might precipitate extremes of thought and even behavior—to the idea, to take one example, that any dollar one spends on oneself beyond basic survival is a dollar taken away from a child who does not have enough to eat. On the other hand, effective altruists, or E.A.s, have recognized from the beginning that there are often both pragmatic and ethical reasons to defer to moral common sense. This enduring conflict—between trying to be the best possible person and trying to act like a normal good person—has put them in a strange position. If they lean too hard in the direction of doing the optimal good, their movement would be excessively demanding, and thus not only very small but potentially ruthless; if they lean too hard in the direction of just trying to be good people, their movement would not be anything special...

The broader culture is marked by neither a widespread sensitivity to misery nor a pervasive sense of obligation to do something practical about it, and for all of its faults the culture of E.A. was. One didn’t have to agree with everything they did to believe that they created a worthwhile role for themselves and acquitted themselves honorably.

Although this doesn't mention the 10% GWWC pledge, it is digging into the dilemma that the pledge is meant to address - setting a substantial but manageable standard for what it means to do an adequate job of earning to give.

For the author, one of the problems with EA's culture that enabled Sam Bankman-Fried to pull the wool over our eyes is an attitude among the leadership that he attributes to Rob Wiblin:

In other words, it seems as though the only thing that truly counts for Wiblin is the inviolate sphere of ideas—not individual traits, not social relationships, not “she said” disagreements about whether it was wise to throw in one’s lot with billionaire donors of murky motive, and certainly not “traditional liberal concerns.” (Wiblin told me, “I wasn’t talking about articles that focus on personal virtue, integrity, or character. I was talking about, for example, a focus on physical appearance, individual quirks, and charisma.”) Effective altruism did not create Sam Bankman-Fried, but it is precisely this sort of attitude among E.A.’s leadership, a group of people that take great pride in their discriminatory acumen, that allowed them to downweight the available evidence of his ethical irregularities. This was a betrayal of the E.A. rank and file, which is, for the most part, made up of extremely decent human beings.

Again, we find that a recurring theme in criticism of EA is its resistance to compromising with extant moral and cultural ways of parsing issues.

Luke Freeman, the Executive Director of Giving What We Can, said in 2022:

We are aware that we are one of the "shop fronts" [of the Effective Altruism movement] at Giving What We Can. 

A Times article on MacAskill and EA specifically mentions the Giving What We Can 10% pledge.


mhendric @ 2023-06-06T20:43 (+17)

I enjoyed reading your thoughts on whether the 10% pledge is central to EA's public perception. 

I do not agree with how you relate your positive proposal to the critiques of EA. Two points stood out to me: the "earning to give" point and the "is 10% the correct amount" point. In both cases, I see no reason to believe that "a 2%/8% or 2%/10% fuzzies/utilons standard for an earning-to-give pledge would be a concrete way to show we've taken on board some of these critiques."

Earning to give is weird. You improve the world by becoming a (checks notes) banker or lawyer? People who criticize earning to give do not criticize the notion of donating one's money, but typically criticize banking or lawyering as professions where one can do good (e.g. because they believe these jobs are net-negative), or see the pledge as greenwashing one's otherwise rich life. I do not see how a banker donating 2% to his favorite opera would change any of these critiques. The critic does not want you to donate to the opera; they want you to stop saying that being a banker may have more positive ethical payoffs than being a social worker.

EA argues for a duty of beneficence and asks members to donate 10%. 10% is an arbitrary Schelling point. Why not 11%? Why not 12% (you are here)? But consider: why not 13%? (...) Why not 99%? These worries are a classic critique of duties of beneficence, at least since Singer published Famine, Affluence, and Morality. I am confident that such critiques will not be resolved by setting the donation percentage 2% higher. The critic does not want you to donate 12%; they want you to explain why X% is morally required, but X+1% is not.

I agree with the point that a newcomer to EA may wrongly get the impression of not being allowed to donate to non-effective charities. This would be bad. But I think there are significantly easier ways to signal to them that they can do so than to reform the Giving What We Can Pledge (talking to them/leading by example/putting it in a FAQ). 

I also still think your positive proposal would likely be harmful, partly for the same reasons I laid out in a previous post. First, why make fuzzy donations mandatory? Someone with very utilitarian convictions may be put off by this, or someone who would otherwise donate 10% effectively and donate fuzzies separately may reduce their effective donations while keeping fuzzies constant (this applies to the original 8%/2% proposal and not to the 10%/2% proposal). When I encountered EA, a pitch of "Donate X% to the most effective ways of improving lives, then spend an additional 2% on whatever you feel like" would have created more rather than less confusion in me. Most people, I reckon, do not need approval to spend the other 90% on things they want to spend it on, including charity that is not effective.

Much more importantly, I think this has big potential to be a PR disaster rather than a PR boon. I don't know how I would explain why my organization has a norm of donating to charities we don't consider to be effective. I think the reasons you provide are by and large "to improve our reputation". I am quite confident that EA explicitly forgoing its efficiency principles to mandate a 2% fuzzies tax to improve its reputation would not land well in the press, or with critics. Much of this sounds to me like an attempt at 4d-chessing the public perception of EA. Frankly, even if I were an EA-sympathetic journalist, I would find the idea quite insulting; it's pretty transparent.

I also agree with Isaac: the initial downvotes and overall vote tally strike me as disagreement with your proposal, rather than a rejection of your discussion.

DirectedEvolution @ 2023-06-07T00:08 (+2)

Hi mhendric. First, thank you for your continued engagement and criticism - it sharpens my own thinking and encourages me to continue. I will respond in greater depth to some of the critiques you've made here in my next post.

Briefly:

  • My wording has evidently been muddy. My proposal is not a mandatory 2%-to-fuzzies-causes pledge, but a 10% pledge of which 80% is allocated to effective causes and 20% goes explicitly to whatever cause the donor is passionate about (a toy arithmetic sketch follows this list). This discretionary 20%-of-the-10% (i.e. 2% of annual income) could go to effective causes, but it could just as well go to the arts, alma maters, or anything else. In this way, the modification encompasses the original GWWC pledge while adding a flexible portion for those who are uncomfortable with the original structure or who perceive absolutism in it.
  • I agree with you that we can guide EAs to a more sophisticated interpretation of the pledge internally. My concern about the current format of the pledge is that it misdirects conversations with non-EAs, prevents deeper engagement with these ideas and giving habits, and contributes to a perception of EA as absolutist among that portion of the public that is aware of the movement at all. This is why having a concrete way to address these concerns seems beneficial for structuring conversations about these ideas, and also for increasing the amount of donations we are able to motivate for effective causes. I believe it would make EA a bigger-tent community than it is at present.
  • While I agree strongly that much criticism of earning to give relates to concerns about net-negative professions and greenwashing, I also found in this research that a substantial portion of the critique is specifically about the 10% level and the idea that 100% of donations should go to causes deemed effective. As examples: Trevor Noah mimics the critique an ordinary person might make in a country without a social safety net, saying 'maybe you in the UK can afford a 10% donation to charity, but I'm in the USA, where our healthcare is very expensive.' The Kristof column I link questions the rule that 100% of donations would go to effective charities. These are also impressions that non-EAs I've spoken with have picked up in conversation about EA, and that I have struggled to address.
  • I agree that 10% is a Schelling point. I believe that a thorough understanding of the logic of Schelling points overcomes the slippery-slope objection of "why not X+1%?" Where I believe you and I disagree is on whether a Schelling point can be modified without being destroyed. In my view, a Schelling point, once established, is like an elastic tether: the further away from the anchor point you go, the more resistance you meet. But if there are big benefits to marginal moves away from the exact tether point, then you should be able to make them. Metaphorically speaking, if Grand Central Station is the place to converge to find your friend when you're both lost in New York City, you can sit on a park bench outside, but you can also get a (vegan) hot dog from the stand nearby. I believe that a 2%/8% or 2%/10% modification is close enough to the tether point not to break the Schelling point, while providing the benefits I have described.
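To make the arithmetic of the split concrete, here is a purely illustrative sketch (my own toy example with a hypothetical income figure, not anything published by GWWC):

```python
# Toy illustration of the proposed split: the pledge stays at 10% of income,
# with 80% of it earmarked for effective charities and 20% left to the
# donor's discretion (numbers are hypothetical, not GWWC policy).
def pledge_split(annual_income: float,
                 pledge_rate: float = 0.10,
                 discretionary_share: float = 0.20) -> dict:
    pledge = annual_income * pledge_rate
    discretionary = pledge * discretionary_share   # 2% of income by default
    effective = pledge - discretionary             # 8% of income by default
    return {"effective": effective, "discretionary": discretionary}

# On a hypothetical $50,000 income: $4,000 to effective charities, $1,000 discretionary.
print(pledge_split(50_000))  # {'effective': 4000.0, 'discretionary': 1000.0}
```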

Each critique you have made deserves a full post in reply, and I anticipate that some or all of them will get one as I continue this series. These paragraphs are just meant as compressed versions of my beliefs at this time, not comprehensive arguments.

tobycrisford @ 2023-06-06T07:42 (+13)

I think this is an interesting question, and I don't know the answer.

I think two quite distinct ideas are being conflated in your post though: (i) 'earning to give' and (ii) the GWWC 10% pledge.

These concepts are very different in my head.

'Earning to give': When choosing a career with the aim of doing good, some people should pick a career to maximize their income (perhaps subject to some ethical constraints), and then give a lot of it away to effective causes (probably a lot more than 10%). This idea tells you which jobs you should decide to work in.

GWWC pledge: Pretty much whoever you are, if you've got a decent income in a rich country, you should give 10% of it away to effective causes. This idea says nothing about which jobs you should be working in.

I think these two ideas are very different.

'Earning to give' gets a lot of criticism from people outside EA, but I don't see much criticism of the idea of donating 10% of your income. Sure, you can call the amount arbitrary and dispute the extent to which it is an obligation, but I think even major critics of EA often concede that the 10% pledge is still an admirable thing to do.

DirectedEvolution @ 2023-06-06T08:29 (+3)

Thank you for your response.

I completely agree that earning to give and the GWWC pledge are conceptually distinct. Ideally, anyone dealing with these ideas would treat them as such.

Where I disagree with you is on the claim that my post conceptually 'conflates' these two ideas. Instead, my post identifies a bundle of associated ideas, including the GWWC pledge and earning to give, as prominent parts of EA's reputation.

Here is an analogy to the point I am making:

  • When people think of engineering, they think of math, chemicals and robots.
  • When people think of Effective Altruism, they think of earning to give and donating 10% of your income to effective charities.

The abstract relationship I am drawing with this analogy is that people who are not part of a specific community often have a shallow, almost symbolic view of major topics in the community. They do not necessarily come to a clear understanding of how all the parts fit together into a cohesive whole. My post is not at all arguing the virtues of earning to give or a 10% pledge. It is arguing that these two topics are part of a bundle of ideas that people associate with EA's brand or reputation, in response to the debate suggested by the two seemingly contradictory claims I quoted at the top of the post.

I don't think my post represents the critics it cites as saying donating 10% of one's income to charity is a bad thing to do. What they critique is a perception of absolutism and the tension inherent in setting any specific standard for such a pledge, given various forms of inequality.

On the one hand, this doesn't exactly reflect the true beliefs of EA thought leaders: MacAskill calls for the ultra-wealthy to donate as much as 99% of their income, and Giving What We Can has a Trial Pledge option, which is a way to make a smaller and more time-limited commitment. Nobody is stopping you from donating 10% to an effective charity and an extra 2% to the opera.

But psychologically, when people are processing the complex bundle of ideas that EA has to offer, in the context of a media appearance or magazine article, these conceptual distinctions can be lost. People really will come away with reactions like:

  • So you're saying I have to donate at least 10% or I'm a bad person?
  • So you're saying that everything I donate has to go to EA charities and I can't donate to anything else?
  • So you're saying that anything I donate to other causes is basically worthless compared to donating to EA causes?
  • So you're saying that my knowledge and intuition about the charities I'm interested in and the good they do in the world is valueless compared to your big fancy spreadsheets?
  • So you're saying that [my favorite charity] isn't effective? What the hell do you know about it???
  • Isn't everybody who's donating to charity earning to give?

And EAs will argue with them in a way that exacerbates these conflicts.

My articles are, more broadly, about recognizing the ways that a call for a 10% donation to effective charities can have a negative psychological impact on potential donors, relative to a minor modification to a 2%/8% split. This specific post is just meant to look holistically at how the 10% pledge, and its bundle of associated ideas, people, and organizations, is represented in media coverage of EA.

David_Moss @ 2023-06-06T16:52 (+6)

Is the 10% Giving What We Can pledge, in which participants commit to donating 10% of their annual income to an effective charity, part of EA's brand or reputation?

 

These questions seem empirically tractable through surveys and related experiments. It's relatively straightforward to assess how many people familiar with EA associate it with the 10% pledge (the main challenge is that so few people have any familiarity with EA at all).

It would also be possible to assess how the pledge, or the association with effective giving more broadly, influences the reputation of EA, e.g. by conducting experiments in which people are randomly presented with depictions of EA that include reference to the 10% pledge or to effective donations in general. This would also allow assessment of how these effects differ across groups. RP could conduct this kind of experiment, though it would need funding to do so.

DirectedEvolution @ 2023-06-07T00:31 (+2)

As one additional note: thank you for linking to the survey about people's familiarity with EA. Although I think it is probably useful evidence, and I am extremely supportive of attempts to gather such evidence in general, one of my immediate concerns is that the data was gathered in April 2022.

This means the results predate both Will MacAskill's high-profile publicity tour for What We Owe The Future and the downfall of FTX. My guess is that the number of people who have heard of Effective Altruism has increased substantially since then. The New York Times has 8.6 million digital subscribers and has covered EA a decent amount over the last year (often negatively), although I am confident that only a fraction of its subscribers read these articles.

What we can learn from it is how EA was perceived prior to these two important signal-boosting and reputation-altering events.

One specific relevant point is the figure for how many people have heard of GWWC relative to other EA orgs: it is the second-most-recognized of the institutions they asked about, at 4.1% of respondents (vs. 7.8% for GiveWell, the most recognized organization).

I am not a professional pollster, so my ability to parse the results in a sophisticated way is limited. But I give some deference to the Lizardman Constant: the observation that a small fraction of respondents (on the order of 2-5%) will endorse just about anything in a poll, including the claim that lizardmen rule the earth. As most of the results are roughly in this range, I have to treat them with moderate skepticism.

David_Moss @ 2023-06-07T09:39 (+2)

I think the numbers of people initially claiming to have heard of EA (19.1%) are strongly inflated by false positives (including lizardmen), but the numbers after the 'stringent' checks (including giving a qualitative explanation of what EA is) were applied (1.9-3.1%) are much less so (though, as we argue, still somewhat inflated). Note that the org results didn't have the same checks applied, so those definitely shouldn't be taken at face value and should be expected to be inflated by lizardmen etc.

This means the results predate both Will MacAskill's high-profile publicity tour for What We Owe The Future and the downfall of FTX. My guess is that the number of people who have heard of Effective Altruism has increased substantially since then.

We'll be publishing results about this soon, but as we noted here, we don't think there's been such a substantial increase in awareness of EA due to FTX, including among elite groups.

DirectedEvolution @ 2023-06-07T00:13 (+2)

Yes, I have tentative plans to conduct some interviews and MTurk surveys as a cheap and easy way to gather more empirical information. I don't think these will resolve the question, but hopefully they will continue to elevate the discussion beyond evidence that rests on convenience sampling and ad hoc interpretation by a potentially motivated debater (which is how I would characterize the quality of the evidence I present here).

Larks @ 2023-06-06T12:59 (+5)

"I don't think the 10% norm forms a major part of EA's public perception, so I don't believe tweaking it would make any difference." - RobertJones

"10% effective donations has brand recognition and is a nice round number, as you point out." -- mhendric

These two comments both received agreement upvotes on my recent post Further defense of the 2% fuzzies/8% EA causes pledge proposal, and although they're differently worded and not irreconcilable, they seem basically to stand in contradiction with each other. [quotation marks added for nested quotes]

Robert wrote a much longer comment than that! Most of it was quite critical of your proposal; I agreevoted because I thought it had good criticism, not because of that first line. I think agreeing with ~90% of a comment is enough to warrant an agreevote. The apparent contradiction is resolved when you realize that both comments were critical of the idea and were upvoted as a result.

DirectedEvolution @ 2023-06-07T00:11 (+2)

That makes sense, and thank you for providing that context for your vote. Part of the challenge here is that our differences seem to be the result of more than one belief, which makes it hard to parse the meaning of upvotes and agreevotes.

DirectedEvolution @ 2023-06-06T06:50 (+5)

To readers of this post, I would like to note that a small number of people on the forum appear to be strong-downvoting my posts on this subject shortly after they are published. I don't know specifically why, but it is frustrating.

For those of you who agree or disagree with my post, I hope you will choose to engage and comment on it to help foster a productive discussion. If you are a person who has chosen to strong-downvote any of the posts in this series, I especially invite you to articulate why - I precommit that my response will be somewhere between "thank you for your feedback" and something more positive and engaged than that.

Isaac Dunn @ 2023-06-06T09:20 (+10)

My guess is that it's an unfortunate consequence of disagree voting not being an option on top-level posts, so people are expressing their disagreement with your views by simply downvoting. (I do disagree with your views, but I think it's a reasonable discussion to have!)

DirectedEvolution @ 2023-06-07T08:30 (+2)

Update: based on analytics and timing, I now believe that there are one or two specific individuals (whose identities I don’t know) who are just strong-downvoting my posts without reading them.

While they may be doing this because they disagree with what they can glean of my conclusions from the intro, I do not consider this to be different from suppression of thought. I am not certain this is happening but it is the best explanation for the data I have at this time.

DirectedEvolution @ 2023-06-07T00:09 (+2)

Thank you Isaac. Based on this post's more positive reception, I'm more inclined to update in favor of your view.