EA has a lying problem [Link Post]

By Nathan Young @ 2022-08-16T21:57 (–3)

This is a linkpost to https://srconstantin.wordpress.com/2017/01/11/ea-has-a-lying-problem/

If you don't think other people should read this post, downvote it. That's what, in my opinion, downvotes are for. 

I don't love the title of this post, and if you look at the comments, it consumed the EA discourse five years ago.

That said, it's still among the best 3 (?) pieces of EA criticism I've read, in that it gives specific examples and highlights some important issues.

The pro-Brexit campaign was successful. It also strategically used figures and methods that were debatably dishonest: if you genuinely wanted to understand the situation, they are not the figures you would personally choose to reason with (I'm specifically referring to the gross £350M per week that the UK pays the EU, which should be netted against funds received from the EU). EA organizations are in essentially the same situation as the Vote Leave campaign. They want to convince the great mass of humanity to undertake some action, but John Q. Public is not interested and perhaps not even capable of understanding the nuances involved. In some cases things are so complicated that no one fully understands them. Ultimately, given our current state of knowledge and individual intellectual capacity, there is no way to get people fully informed on any topic, especially one as abstract and remote from daily life as the EA movement.

Convincing masses who don't understand a thing that it is nonetheless a good idea has a long, long history. That it is easiest to do this through careful deception is unfortunate, but ultimately you have to work with the reality you have. (That's the story people tell themselves to feel better.)

The comments are pretty instructive too:

From Holly Elmore:

There are many problems with this post – the fact that all the evidence is from Facebook and forum posts and is heavily extrapolated from, for one – but the most central problem is Sarah's refusal to question her own personal ethics. What if, for the sake of argument, it *was* better to persuade easy marks to take the pledge and give life-saving donations than to persuade fewer people more gently and (as she perceives it) respectfully? How many lives is extra respect worth? She's acting like this isn't even an argument. When Ben talks about weighing the costs of harm to the movement, he's taking an ethical position on this question. Movements like EA most often die by being picked apart from the inside – it *is* a serious risk, and keeping it alive might be worth some compromises. Sarah is also taking an ethical position: that this is never okay, and that our lack of "respect" for the public is all the condemnation she needs to offer.

Raemon has also written what I think is a great piece on community knowledge in response to this. https://forum.effectivealtruism.org/posts/hM4atR2MawJK7jmwe/building-cooperative-epistemology-response-to-ea-has-a-lying 

Kaleem @ 2022-08-16T23:06 (+14)

I think your reflections on the piece are valuable, as are the important issues you point out. However, I'm downvoting because I wouldn't want people to waste their time reading the linked post, for two reasons:

  1. The author's self-admitted use of dishonesty in writing the article in the first place, which happens between this comment from Will and this comment from Will.
  2. Even if the post had some merit regarding how the EA movement and EA orgs treated criticism at the time of posting, it seems extremely out of date and inaccurately reflects how EA relates to criticism in 2022 (e.g. we offer people significant rewards for writing good critiques of EA and make clear to people inside and outside the community that we value criticism highly).

Linch @ 2022-08-17T06:00 (+15)

I don't think #1 is a major problem for criticism in general. We shouldn't expect our critics to be saints, and I think hypocrisy is an over-applied heuristic for intellectual or moral disagreements.

Somewhat agree with #2, however.