Impossible EA emotions
By ClaireZabel @ 2015-12-21T20:06 (+35)
[This is a personal and emotional post about my feelings about EA. Probably not for everyone! Could cause sadness, scrupulosity concerns, and guilt.]
I think it's true that 2x the suffering is 2x as bad, and it would be emotionally accurate for it to make me 2x as sad, i.e. if it did then my emotions would better reflect reality. But I worry that a lot of people get tangled up with the distinction between emotional accuracy and instrumental value of emotions. They're often correlated; it's useful to be more scared of dying in a car crash than dying by lion attack. And emotions can be motivating, so having emotions that reflect reality can cause greater effectiveness.
But this gets tricky with EA.
I believe the moral importance of suffering increases linearly as suffering increases, but there are non-linear marginal returns to having emotions that reflect that. Just as there are instrumentally rational techniques that require irrationality, there are instrumentally useful emotions that require emotional inaccuracy. I don't know what emotions are most instrumentally useful for improving the world, but they're probably not going to be the ones that correspond linearly to the reality of the amounts of suffering in the world.
I only know from the inside my own seemingly morally relevant experiences, my subjective feelings of joy and serenity and curiosity and sorrow and anger and apathy. In practice, I can only at my most emotionally expansive moments hold in my mind all the morally relevant experiences I think I have in my median hour. So I can maybe comprehend less than 1/140,000 of the morally important things I've personally felt*. I don't know if I'm an outlier in that regard, but I'm pretty certain that I am completely incapable of emotionally understanding a fraction of the value of a life (even when I have the huge advantage of having felt the life from the inside). And that's not changing any time soon.
Yet, it somehow seems to be true that billions or trillions of beings are having morally relevant experiences right now, and had them in the past, and (many times) more could have morally relevant experiences in the future. My emotions are not well-equipped to deal with this; they can't really understand numbers bigger than about three or experiences longer than an hour (true story) (I may be unusually incompetent in this regard, but probably not by many orders of magnitude).
The cost to save a human life might be a few thousand dollars. The value of each sentient life is incomprehensibly vast**. EA is a "bargain" because so many lives are so drastically undervalued by others. And resources are scarce; even if some lives weren't undervalued relative to others, we still couldn't give everyone what their value alone would compel us to give, even if we had far more.
Having to triage is desperately sad. The fact that we can't help everyone is terrible and tragic; we should never stop fighting to be able to help everyone more. I worry about losing sight of this, and denying the emotional correctness of feeling an ocean of sorrow for the suffering around us. To feel it is impossible, and would be debilitating.
I can't emotionally comprehend all of what I'm doing and not doing, and wouldn't choose to if I could. That's why, for me, effective altruism is a leap of faith. I'm learning to live a life I can't emotionally fully understand, and I think that's okay. But I think it's good to remind myself, from time to time, what I'm missing by necessity.
*Assuming I have no morally relevant experiences while sleeping, which seems untrue.
**With the exception of borderline-sentient or very short-lived beings that have lives with little (but nonzero!) moral value.
undefined @ 2015-12-22T18:24 (+2)
Great post!
Out of interest, can you give an example of an "instrumentally rational technique that requires irrationality"?
undefined @ 2015-12-21T22:42 (+2)
Great post! It reminds me of this one: http://mindingourway.com/the-value-of-a-life/
undefined @ 2015-12-21T22:30 (+2)
This is really powerful, Claire, thank you for sharing it!
Let me be clear that I am not prescribing how you should feel, but just brainstorming about an instrumental approach toward the most effective emotional tonality.
For an ideal emotional tonality, I wonder if it might be helpful to orient not toward feeling sorrow for the lives not saved or morally relevant experiences not had, but to feeling neutral about them, and only get positive experience from additional lives saved and morally relevant experiences had. This can tap into the power of positive reinforcement and rewards, which research shows tends to function better than negative reinforcement in motivating effective behavior. Since sadness is, science suggests, not motivating, Effective Altruists might be better off orienting to avoiding sadness, and focusing on experiencing joy over successes.
This presumes an ability to self-modify one's emotions, which is certainly doable, but quite effortful.
Again, this is not meant to be prescriptive, but just an exploration of what an ideal emotional tonality could be.
undefined @ 2015-12-22T02:51 (+1)
Thanks Gleb.
It was my understanding that thinking of both potential good and bad outcomes (mental contrasting) was more powerfully motivating than thinking of either alone. In my experience, psychology research on this subject also isn't super reliable. Personally, I definitely find thinking about bad outcomes motivating, as I'm a naturally happy person and good outcomes don't make me much happier than the baseline for long.
I expect this varies a lot from person to person.
undefined @ 2015-12-22T03:41 (+1)
The motivational aspects do vary a lot from person to person :-) The nature of the specific emotions and their impact on motivation is more consistent, however — it holds for the majority, though far from everyone.
Negative feelings of sadness/sorrow tend to be demotivating, and may lead to depression. Anxiety can be motivating or demotivating, depending on the extent of the anxiety. Anger/frustration tends to be motivating.
Positive feelings of satisfaction/contentment are usually demotivating. Joy/pleasure/excitement can be motivating, especially if coupled with a clear means of gaining these experiences.
MichelJusten @ 2022-07-16T06:22 (+1)
Just discovering this now, but it really resonates. “A leap of faith” — I like that.
undefined @ 2015-12-21T23:12 (+1)
I think this is an incredibly powerful post, and definitely worth sharing. I wonder if there's a way to edit some of it to make it more front-facing, without losing out on any of the emotional power.