Emphasizing emotional altruism in effective altruism

By michel @ 2022-07-05T19:47 (+148)

Thank you to Emma Abele, Lennart Justen, Lara Thurnherr, Nikola Jurković, Thomas Woodside, Aris Richardson, Jemima Jones, Harry Taussig, Lucas Moore, and James Aung for inspiration and feedback. All mistakes are mine. 

Summary

Introduction

Effective altruism grapples with profound themes. Themes like suffering at unimaginable scales, futures that transcend our notions of what “good” could look like, and the end of everything we’ve ever known do not occupy the realm of ordinary thoughts. And for many, myself included, these themes do not inspire ordinary emotions. Rather, they are entangled with deep, visceral emotions – emotions that can cast us into deep reflection, that can bring us to tears, and that may underlie the very reason we are in EA.

The feelings I’m gesturing at are delicate, and so is the art of conveying their immensity. Unfortunately, I’m not here to offer them the poetic justice they deserve. Rather, I’m here to claim that these emotions don’t get enough attention, and discuss why I think giving them more attention could benefit the EA community.

What emotions am I talking about?

I’ll give you a sense of what emotions I’m talking about when I use terms like “visceral motivation” and “emotional altruism” by sharing what thoughts can, at times, invoke these feelings for me. But a caveat first: I’m not referring to any one emotion. I’m thinking about a constellation of different emotions, all sharing an ability to inspire good. Feelings that come to mind for me include:

Some people may relate to a few of these emotions, some to none, and some might wish to add their own. (Please share what feelings resonate for you in the comments!) I don’t think the specifics matter for the argument I’m making. Crucially, there exist raw, deeply personal emotions that underpin one’s desire to do good, and I think many EAs feel such emotions in the face of suffering or the vision of a brighter future. This visceral response to a cruel world or a vast potential is honed by rational thought until it lands at effective altruism, but for many the motivation is emotional at its core.

This isn’t the case for everyone: some people may arrive at EA following a series of rational arguments devoid of strong emotional appeals. And that’s fine. I do not wish to prescribe a “correct” motivation to do good in the world; I just want to speak to the emotional one. Even if it is not present at all times or accessible in all settings, I think powerful emotions form the bedrock of many people’s connections to effective altruism.

Yet the emotional weightiness of EA can feel muffled, glossed over at times. This can be at a personal level: in moments where life is asking a lot, it can be difficult to find those pure wells of motivation. That seems normal. But I also sense this at a community level: aren’t we here for the same reason? Don’t so many of us feel the same yearnings? 

There are fair reasons to quell strong emotional motivations. At times we need to dilute our emotions with a pragmatism that allows us to make progress on the very thing our emotions are telling us is important. And talking about these emotions can be scary. They’re personal, and disclosing them can leave one feeling vulnerable. 

But I think the resonance we can feel with others who share our hopes for a better future is worth stepping out of our comfort zone for. What's more, if we gloss over the emotional weightiness of what's at stake and how it makes us feel, I think we undersell the EA project. We risk giving outside observers a distorted picture of why we do what we do, and we risk not inviting in people who also feel that the world is fucked up.

We should be mindful of muffling the emotional grandeur embedded in the problems we tackle. I’m not claiming EA needs a seismic shift in the extent to which it relies on emotional vs. rational appeal, or that all members should gush about their most personal motivations for doing what they do. But, for those among us who draw a lot from emotional reservoirs, a little more emotionality could go a long way.


Possible benefits of a greater emphasis on emotional motivation

Ways to affirm emotional motivation in effective altruism

Below are ways I think the effective altruism community could better tap into people’s emotional motivations to do good in this world. I’m excited about these, but I think they need to be approached delicately. Conversations, content, or experiences in the wrong settings can come across as weird or culty. Some of the recommendations below need more careful consideration than I’m giving them here.

More within-community discussions

I wish it were more common to talk about our EA motivations with fellow community members. Why do you dedicate yourself to this? Such conversations should be approached delicately, but, if you ever really resonate with someone, I’d encourage you to go out of your comfort zone and steer the conversation to the feel-y realm.

Example: weekly work check-ins to share motivations and try to cultivate a shared sense of purpose. Some people may also draw a lot from ‘metta,’ or loving-kindness, meditations.

Talk about beneficiaries in outreach 

When we talk about EA with those unfamiliar with it, we should remind people of the end goal: helping sentient beings. Give others something to grasp onto. “Why are you going to this EA Global conference?” “I want to help people in our generation and in future generations, and connecting with this community seems like one of the best ways I can do so.” EA is instrumental in helping others, but I’m worried our appeals often ascribe it terminal value[3] (not that EA isn’t something to celebrate!).

Example: When talking about biosecurity, paint a picture of the millions of people who could die awful deaths if a virus were released, not just the different levels at which one could intervene. 

Craft experiences that allow for powerful emotions

It’s hard to overestimate the importance of setting and vibes when conveying emotional motivations. The same words spoken in a fluorescent-lit classroom and in a moonlit field evoke very different feelings. With this in mind, I think smaller, more intimate retreats can often offer the most transformative experiences. They allow for a change in setting and a sense of shared purpose that even EAG events can’t match. (EAG after-parties or late-night walks, however, are a different story.)

Example: Organize events (e.g. small retreats) that take aspiring EAs out of their default settings. Be deliberate in designing programming (e.g. late-night walks) that allows for these conversations, but don’t force it.

More powerful writing

Powerful writing has already left a mark on many EAs. I and many others cite works like On Caring as formative in our EA engagement. I’d love to see more people try to capture their own motivations (or challenges) in personal pieces, or even fiction.

Examples: more writing like On Caring, 500 Million, But Not a Single One More, The value of a life, and Killing the Ants. I’d love to see more thoughtful pieces detailing why you are involved in EA. I’d also like to see more people experiment with poetry, fiction, and a variety of different framings of EA concepts (these need not explicitly reference EA). For example, I’d love to see more explorations of:

More videos and virtual reality content

For some people, I think the best video could be better than the best piece of writing for conveying the ‘why’ of EA. And I think the best virtual reality content could be even better.[5]

Examples: Videos like The Egg, The Last Human, and There’s No Rule That Says We’ll Make It, as well as novel virtual reality content, have enormous power to convey EA concepts and their corresponding emotional weight.

New creative artwork

Cold take: art in EA seems underexplored. Another cold take: art evokes strong feelings in some people. This post does a nice job sourcing and categorizing some existing art connected to EA. Creating good art that overlaps with EA seems difficult, and my naive recommendation would be to focus more on conveying certain EA-related ideas or mindsets (e.g. coordination problems).

Example: Try to create a comic or graphic novel that conveys EA ideas. 

Deliberately having standout conversations

Having standout conversations about people’s personal emotions and motivations towards EA feels like a skill. I perceive the attributes of the best conversations to be: consensual (people want to engage), sheltered (people understand that nothing they say will harm them), curious (people try to genuinely understand where the other is coming from), and loving (radiating goodwill).

Example: I’d be excited about people thinking deliberately about how they could improve on such conversations and bringing that energy to EA spaces (e.g., fellowships, board meetings). 

Meta-ideas

Conclusion

Here’s a metaphor I find fitting for EA’s project: 

When it comes to doing good, let emotions be your gas pedal, and careful reasoning your steering wheel. [6]

We’re good at not forgetting how unreliable our feelings are as guides for how to help other beings. But sometimes I worry we're also good at forgetting the feelings themselves. Don’t, I say. Let’s repurpose those feelings to do exactly what we want them to do. 

We don’t need to impose emotional weightiness on concepts like existential risk, animal suffering, or whatever else we’re dedicating ourselves to. Emotional weightiness is already embedded in the parts of reality these concepts point to. We can cultivate emotional motivation that stays true to the pursuit of doing the most good. 

This post is written from a place of and about a visceral desire to improve the world. In pure EA fashion, I’ve done gone and intellectualized that feeling. But I hope the sentiment still comes across. Promoting EA should stay closely coupled with promoting – or tapping into – the desire to improve the world. When we talk about effective altruism, we should allow it to be profound.

  1. ^

     This could be its own post. I think there are growing rifts in our community, and I wish we focused more on what we have in common. The AI safety/rationalist communities also care a lot, even if they could convey this better. (AI safety/rationalist communities, please convey this better.)

  2. ^

     See psychology research: Does maximizing good make us look bad? and review of prosocial behavior and reputation. Also, when’s the last time you saw a movie where the protagonist was a consequentialist?

  3. ^

     Thomas Woodside’s speech to Yale EA captures a similar sentiment: “I don’t care about Yale EA”

  4. ^

     See this recent publication on The Psychology of (In)Effective Altruism by Lucius Caviola, Stefan Schubert, and Joshua Greene.

  5. ^

     I’d love to see people think more about what virtual reality could do here – and then do it. I remember being awe-struck at the grandeur VR could convey the first time I put on a headset and floated through a free-trial tour of the International Space Station.

  6. ^

     I forget where I found this and who to credit. Can anyone help me out? 


Marcus Daniell @ 2022-07-06T19:49 (+15)

YES. 

I love this post. I think EA has a weakness when it comes to storytelling and grabbing hearts. We're great at appealing to the cerebral folk with careful reasoning and logic, but they're a small minority. If we want EA ideas to percolate deeply, we need to outcompete the other heart-grabbers, which means appealing to emotion and layering the logic on top. IMHO.

Darius_M @ 2022-07-06T11:55 (+12)

At the risk of self-promotion, I wrote a motivational essay on EA a few years ago: Framing Effective Altruism as Overcoming Indifference.

MichelJusten @ 2022-07-06T13:16 (+3)

Thanks for sharing! I hadn’t come across this but I like the framing.

Geoffrey Miller @ 2022-07-07T01:16 (+8)

Wonderful post Michel; thanks for your thoughts.

I think there's an understandable wariness of emotionality in EA, since it so often leads into Rousseau-ian romanticism, sentimentality, and virtue-signaling that's the exact opposite of the Bentham-ite utilitarianism, scope-sensitivity, and rational thinking valued in EA. See, e.g., Paul Bloom's excellent 2016 book 'Against Empathy'.

However, I think it's important for EAs to take emotions more seriously for at least a couple of reasons. (Full disclosure: I've taught a seminar on 'Human Emotions' about ten times for upper-level undergrads, and we often end up discussing EA topics.)

First, emotional experiences are the basic building blocks of sentient utility. If we're trying to maximize the quality, quantity, and variety of sentient well-being, it might be important to understand the origins, nature, functions, triggers, and details of animal, human, and artificial emotions. Both from the outside (e.g. as students of emotion research) and from the inside (as people who have deep personal experience with the full range of human emotions -- including holding them at arm's length during mindfulness meditation or psychedelic experiences).

Second, emotional experiences, as you argue, can be great for recruiting new people, helping them understand the stakes and ethos of EA, and in keeping current EAs happy, motivated, and inspired. I agree that more development of EA-oriented stories, art, music, videos, films, etc could help. The challenge is to recruit the kinds of artsy creatives who will actually resonate to the hyper-rationality of the EA mind-set. The Venn diagram of people who can write great screenplays, and people who actually understand long-termism and scope-sensitivity, might be quite small, for example. But those are the kinds of people who could really help with EA movement-building -- as long as they don't dilute, distort, or trivialize the core EA principles.

MichelJusten @ 2022-07-07T16:21 (+1)

Thank you for the thoughtful comment Geoffrey. I agree it's a fine balance between being wary – but not dismissive – of emotions.

I spent the latter half of my psychology undergraduate harping against emotions – talking about how our evolution has left us with unreliable guides for who to care about and how much to care. (Basically regurgitating Against Empathy). 

Yet here I am writing a whole post on emphasizing emotions. With enough nuance on which emotions we're talking about and in what settings, however, I think both views are compatible. (I appreciate your comment to that effect).

I also think your Venn diagram comment is apt. I agree it's a narrow overlap, but it's one I'd like to see a few more people with the right aptitudes lean into. 

Caro.Daniell @ 2022-07-06T18:32 (+8)

I really appreciated this post and it touches on themes I have wanted to write about as well. I have two related questions/comments:

  1. I wonder about also emphasizing the fact that these potentially helpful/motivating emotions (like loving-kindness and awe) are things that can be cultivated on an individual level: we tend to think emotions are somehow set/intrinsic parts of our personality, but our ability to cultivate some emotions (and even altruism) through practices like mindfulness has been well-documented.
  2. Also I would challenge the perceived dichotomy between reason and emotion. Modern neuroscience has shown that emotion and reason are deeply intertwined and that this separation is part of the overall fallacy of mind-body dualism. Books like Antonio Damasio's Descartes' Error are helpful in demonstrating this point further, but my sense has been that the EA community holds up reason as some sort of ultimate capacity when in fact it is much more individualistic and dependent on our specific individual experience/emotion than is often perceived.

Thanks for your thoughtful post!

MichelJusten @ 2022-07-06T22:40 (+1)

Thanks!

  1. I agree that emotions and their determinants are generally viewed as too static. Within reason, you can actively import certain emotions and change the way you relate to the ones that bubble up. ~Meditation~ :) 
  2. Agree, again. I hope I didn't make too strong of a dichotomy. Paul Slovic, a psychology researcher I'm fond of, has published on the importance of the 'affect heuristic.' We use our instinctive emotions to make lots of our decisions, and we're fooling ourselves if we claim all our decisions come from pure reason. I think the influence of emotions, either in the moment or at a more trait level, is generally greater than people think. 
kshen @ 2022-07-06T22:54 (+7)

Awesome post! I often think that EA needs to better incorporate this huge part of human experience into its discourse, and I think it can go beyond simply motivating people.

This essay also touched on a lot of themes of the Replacing Guilt essay series, which likewise came out of the EA community.

Stefan_Schubert @ 2022-07-30T15:07 (+6)

I agree that emotional approaches can have upsides. But they can also have downsides. For instance, Paul Bloom has a book-length discussion (which covers EA) on the downsides of empathy. Likewise, it's well-known that moral outrage can have severely negative consequences. I think this post would have benefitted from more discussion on the potential costs of a more emotional strategy, since it seems a bit one-sided to only discuss the potential benefits. (And the comments to this post seem pretty one-sided as well.)

Emily Dardaman @ 2022-07-07T16:25 (+5)

Organizational purpose consultant here. You would not believe the human potential left on the table by orgs that don't tap into our deeper, non-rational / personal motivations.

Aaron Bergman @ 2022-07-06T21:48 (+5)

Nice piece, and I mostly agree as written. 

One concern I have about raising the salience of emotions in EA is that emotions might more frequently turn into prospective incentives (generally bad) instead of retrospective motivation (generally good).

What I mean is that people might (naturally and likely unconsciously) gravitate towards projects and cause areas that inspire positive emotions (say, awe about the future) and away from those that induce negative ones (say, distress about suffering). 

MichelJusten @ 2022-07-06T22:32 (+4)

This seems right and is something EAs who resonate a lot with emotional pitches should be cautious of. A recommendation would be to first do cause prioritization at a rational, pragmatic level. Then, when digging into the problem you deem most important, try to find parts you can emotionally latch onto to keep you motivated.

Aaron Bergman @ 2022-07-06T23:15 (+1)

Yes, totally agree!

Doina @ 2022-07-06T17:10 (+4)

Fantastic topic and writing. Very well argued, and its ethos comes through loud and clear.

wANIEL @ 2022-07-11T01:56 (+3)
  1. I agree with >80% of what's written here and jump for joy knowing that someone else is thinking these thoughts―pushing these thoughts to the fore.
  2. Those who experience the emotions described by Michel might cherish this Longtermism playlist that I've curated through a few hours of experimentation. 
    (I'm not looking for self-advancement here. Rather, it is my hope that this musical inspiration be a "gas pedal" for others as it has been for me over the years.)
tswizzle96 @ 2022-11-11T02:27 (+1)

Figuring out the intersection of EA and music is such a brilliant thought that I had yet to have, and getting to connect two worlds I am so deeply involved with seems quite interesting to me. I love that you gave this a go. Want to open it up to be collaborative so I can try to place songs that evoke a similar feeling for me? If not, no worries, I can begin crafting my own.

But also, I noticed that there are multiple connections to visual media throughout (Day One - Interstellar, The Night We Met - 13 Reasons Why, Ashitaka and San - Princess Mononoke, etc.) and given I think I resonate with the connected thoughts to these other works, was wondering if maybe it would be good to make a sort of "Compassionate Longtermist Emotion Building" list of movies and shows? I can easily see this list including Interstellar and Mononoke (I have qualms with 13 Reasons, but fully feel the beauty of the song) and feel like I have at least a couple more strong additions to go from there (beyond the obvious Don't Look Up). Would love to hear your thoughts on this!

Sarah Weiler @ 2022-07-09T12:22 (+3)

Great post, thanks for writing this up! I'm especially impressed by the compilation and description of different types of motivating emotions, seems quite comprehensive and very relatable to me.

I have one question about a minor-ish point you make:

"This isn’t the case for everyone: some people may arrive at EA following a series of rational arguments void of strong emotional appeals."

I've been wondering about that sort of reasoning quite a bit in the past (often in response to something an EA-minded person said). How can you arrive at EA-ish conclusions and goals solely through a series of rational arguments? Do you not need emotions to feature at some point in order to define and justify how and why you seek to "make the world a better place"? (In other words: how can you arrive at the "ought" solely through rational argument?)

Brendon @ 2022-07-07T04:37 (+2)

Great article; this is how to reach people. Use the tools of creativity to shine a light on the reasoning. For example, Max Tegmark is concerned about slaughterbots. The Black Mirror slaughterbot episode gave everyone who saw it an emotional shiver down their spine. The deep creative skillset that large brands tap into should be used on the world's most pressing problems. I've been working on this for the last few years and have started a creative agency to do exactly that.

Rafael Ruiz @ 2022-07-06T11:18 (+2)

This post is great! I had the idea for a similar post, but you put it better than I could.

I hope more diverse messaging attracts a variety of people to EA and makes them more engaged overall.

Hauke Hillebrandt @ 2022-07-06T10:07 (+2)

I always recommend this excellent philosophy paper: "On the aptness of anger".

kshen @ 2022-07-06T22:52 (+9)

That was a very interesting essay! I love the distinction of aptness vs instrumentality.

However, the closing paragraphs take an odd turn -- essentially, the author tries to make a move to say, "I reserve the right to be angry because it is one of the few/last instruments by which I can be any kind of productive." While I agree with that assessment, it seems to do a disservice to the author's argument to draw attention back to the instrumentality of anger. The whole strength of her argument was that there is some place for anger in discourse, just as we grant to aesthetics, beauty, and senses of justice -- a place that stands before considerations of instrumentality.

Lastly, it is also interesting that this essay expresses some disdain for consequentialism as oppressive. That is another intricate dynamic that may be pertinent to EA.

MichelJusten @ 2022-07-06T22:29 (+8)

I like it! One of my hopes in writing this post was sourcing posts people have related to this theme so I appreciate you sharing this. 

Marc Wong @ 2022-10-19T17:25 (+1)

Carl Sagan was inspiring and a great educator.
We must inspire people to think better (or be less wrong), and bring out the best in others (or do good better). We can't reason or lecture people into changing their behavior.
Here's an example:
If your next car were 10x more powerful, would you want more safety features, traffic rules, and driver training? Would you trust car companies alone to address all the risks created by these 10x more powerful cars? What safety features, regulations, and public education will be needed when social media, AI, nanotechnology, robotics, or genetic engineering becomes 10x more powerful? Do you trust companies alone to address all the risks created by new technology?
Perhaps more importantly, what would you do to help humanity become 10x better at being objective, understanding and respecting others, and helping others?

Lastly, (self-promotion coming...) my post about inspiring humanity to be its best:
https://forum.effectivealtruism.org/posts/7srarHqktkHTBDYLq/bringing-out-the-best-in-humanity