What do you view as core EA principles?

By leillustrations🔸 @ 2024-08-22T01:56 (+21)

Zach Robinson writes: 'In my role at CEA, I embrace an approach to EA that I (and others) refer to as “principles-first”.'

Patrick Gruban responds: 'an approach focussed on principles...could be more powerful when there is broader stakeholder consensus on what they are.'

I've definitely noticed that EA manifests slightly differently in different places. I think it would be helpful to discuss what people here view as the core EA principles.

(Here is CEA's list of core principles that Zach references)


Arepo @ 2024-08-22T09:02 (+14)

One that I think is super important (and I think used to be on CEA's list?) is transparency:
* The movement was founded on GiveWell/GWWC doing reviews of and ultimately promoting charities for which transparency is a prerequisite.
* GiveWell themselves have been a model of transparency in their reasoning, value assumptions, etc.
* It seems importantly hypocritical as a movement to demand it of evaluees but not practice it at a meta level
* Much of the sea of criticism (including my own) that followed FTXgate involved concerns about lack of transparency
* If, as Zachary says, the community is 'CEA’s team, not its customers', it's hard for us to make useful decisions about how to participate without knowing the rationale or context for their key decisions
 

titotal @ 2024-08-22T13:48 (+7)

Out of the four "core" ideas, the one I take most issue with is the "scout mindset":

Scout mindset: We believe that we can better help others if we’re working together to think clearly and orient towards finding the truth, rather than trying to defend our own ideas. Humans naturally aren’t great at this (aside from wanting to defend our own ideas, we have a host of other biases), but since we want to really understand the world, we aim to seek the truth and try to become clearer thinkers.

I don't think this "scout vs soldier" distinction is the most important thing when it comes to establishing truth. For example, a criminal trial is as "soldier" as you can get, but I would argue that trials are still truth-seeking endeavours that often work quite well.

Also, merely having a scout mindset is not enough: you could intend to find the truth, but be using really shit methods to do so. 

Instead, I would talk about a more general case of honesty, evidence-based reasoning, and testing/interrogation of ideas, akin to scientific work.

tobycrisford 🔸 @ 2024-08-22T16:31 (+3)

I think your "criminal trial" counter-example to the "scout mindset" narrative is really interesting.

I'm not convinced it quite holds up though, for a couple of reasons.

Firstly, I think there are two separate questions which you're conflating:

  1. How can someone, as an individual, best form accurate opinions about something?
  2. How can we design a process which will reliably lead to accurate decisions being made about contentious issues? And how can we design it so that those decisions will be broadly trusted by the public?

These questions are similar, but not the same. In (1), there is not a trust problem. You know your own mind, and you know that you are sincerely committed to finding out what the truth is, whatever that might be. But in (2), we are designing some process that will be followed by people in positions of power. We have to be worried about the possibility that those people might be corrupt. Trust is a much bigger issue.

I'd have thought that the reason criminal trials are designed the way they are is related to this issue of trust, rather than because the criminal trial setup is an inherently good way of reaching the truth. In an ideal world filled with perfect people, maybe we'd let an impartial judge adopt a scout mindset, assess all the evidence in a case, and reach a decision. But in the real world, this would create unacceptable opportunities for corruption and abuse of power. So we tackle the risk that someone involved might approach the problem with a soldier mindset not by trying to eliminate it, but by guaranteeing it, while making sure that there are people doing this on both sides.

Secondly, I'm not sure the "soldier mindset" is really the right way to describe what a lawyer does anyway. A lawyer has to be able to defend someone well even when they might privately believe that they are guilty. The ability to do this well seems like it would require a "scout mindset" way of thinking, rather than a "soldier mindset" one.

titotal @ 2024-08-22T23:42 (+5)

Secondly, I'm not sure the "soldier mindset" is really the right way to describe what a lawyer does anyway. A lawyer has to be able to defend someone well even when they might privately believe that they are guilty. The ability to do this well seems like it would require a "scout mindset" way of thinking, rather than a "soldier mindset" one.

I see "soldier mindset" being described as akin to "motivated thinking" (eg here), and I think it's a stretch to say that a prosecution lawyer is not doing motivated thinking (in that trying to prove one thing true is their literal job). 

And yeah, for the reasons that you stated, if you can't trust people to be impartial (and people are not good at judging their own impartiality), setting up a system where multiple sides are represented by "soldier mindset" can legitimately be better at truth-seeking. Most episodes in scientific history have involved people who were really really motivated to prove that their particular theory was correct. 

My real point, though, is that this "soldier vs scout" dichotomy is not the best way to describe what makes scientific-style thinking work. You can have a combination of both work just fine: what matters is whether your overall process is good at picking out truth and rejecting BS. And I do not think merely trying to be impartial and truth-seeking is sufficient for this. A "scout mindset" is not a bad thing to try, but it's not enough.

katriel @ 2024-08-26T01:06 (+3)

To me, the core EA principles that I refer to when talking about the community and its ideas (and the terms I use for them) are:

  1. Cosmopolitanism: The same thing that CEA means by "impartiality." Beings that I have no connection to are no less ethically important than my friends, family, or countrymen. 
  2. Evidence orientation: I think this is basically what CEA calls "Scout mindset."
  3. Attention to costs and cost-effectiveness:  The same thing that CEA calls "Recognition of tradeoffs"
  4. Commensurability of different outcomes: GiveWell, Open Philanthropy, and others make explicit judgments of how many income doublings for a family (for example) are equivalent to one under-5 life saved, or similar. This enables you to do "cause prioritization" - without it, you get into an "apples to oranges" problem in a lot of resource allocation questions. 

I like CEA's explicit highlighting of "Scope sensitivity" - I will embrace that in future conversations. But I'm writing this post to highlight outcome commensurability too. I think it is the one principle that most differentiates EA-aligned international development practitioners from other international development practitioners who have a firm grounding in economics. 
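A minimal sketch of what outcome commensurability buys you in practice; the moral weights and program figures below are purely hypothetical placeholders, not GiveWell's or Open Philanthropy's actual numbers:

```python
# Hypothetical illustration of outcome commensurability.
# All weights and program figures are made up for the example.

# Moral weights convert different outcomes into a common "value unit",
# e.g. suppose we judge 1 under-5 life saved ~ 50 income doublings (hypothetical).
MORAL_WEIGHTS = {
    "under_5_life_saved": 50.0,  # value units per life saved
    "income_doubling": 1.0,      # value units per income doubling (the numeraire)
}

# Hypothetical programs: cost and outcomes produced per $100k spent.
programs = {
    "bednets": {"cost": 100_000, "outcomes": {"under_5_life_saved": 20}},
    "cash_transfers": {"cost": 100_000, "outcomes": {"income_doubling": 700}},
}

def value_per_dollar(program):
    """Convert a program's mixed outcomes into common value units per dollar."""
    total_value = sum(
        MORAL_WEIGHTS[outcome] * amount
        for outcome, amount in program["outcomes"].items()
    )
    return total_value / program["cost"]

# With a common unit, an apples-to-oranges question becomes a direct comparison.
for name, program in programs.items():
    print(f"{name}: {value_per_dollar(program):.4f} value units per dollar")
```

Without the explicit equivalence judgment in `MORAL_WEIGHTS`, the two programs' outputs simply can't be ranked against each other; with it, cause prioritization reduces to comparing numbers.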

Emrik @ 2024-08-23T21:27 (+3)

Principles are great!  I call them "stone-tips".  My latest one is:

Look out for wumps and woozles!

It's one of my favorites. ^^  It basically very-sorta translates to bikeshedding (idionym: "margin-fuzzing"), procrastination paradox (idionym: "marginal-choice trap" + attention selection history + LDT), and information cascades / short-circuits / double-counting of evidence…  but a lot gets lost in translation.  Especially the cuteness.

The stone-tip closest to my heart, however, is:

I wanna help others, but like a lot for real!

I think EA is basically sorta that… but a lot gets confusing in implementation.

Arturo Macias @ 2024-08-25T07:13 (+1)

The core of EA is to provide tools for optimizing the marginal impact of altruistic individual efforts given a large array of preferences and beliefs.

The most natural application is selecting optimal recipients of donations for any possible worldview.