Good practices for changing minds

By Nikola @ 2022-04-07T15:20 (+20)

[heavily inspired by Akash and Kuhan’s talk on ambitious thinking in longtermist community building, and informed by my experience with one-on-ones]

Most of us are trying to make the world better. Most of us disagree on how to do this, which means most of us are wrong. Most of us can change our opinions to become closer to the truth and get better at making the world better.

Here are the rules of thumb I'm most confident are helpful. They sit on top of a foundation of listening carefully to your interlocutor, gently trying to identify cruxes, rephrasing their claims to check that you've understood them, and so on.

TL;DR - read the titles

Adopt the alliance mindset

It's important to keep in mind that we're all on the same team; some of us are just confused about specific aspects of how the world works. Getting closer to the truth is a collaborative effort in which both of you are trying to help each other get what you both want.

Avoid typical mind fallacy

Bayesian thinkers tend to be better at changing their minds when presented with new information, but most people are not very Bayesian in their thinking. Changing one's mind about fundamental things is not only an intellectual activity but also an emotional and social one. Insisting that "people should change their minds based on facts" is not going to help much.
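(A quick gloss for anyone new to the jargon - this is just the standard textbook formula, not anything specific to this post: a Bayesian revises their confidence in a hypothesis H after seeing evidence E according to Bayes' rule,

P(H | E) = P(E | H) × P(H) / P(E)

so beliefs shift in proportion to how strongly the hypothesis predicted the evidence.)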

Shed your identity

Tying into the above point, it is good to avoid vocabulary that mixes identity with beliefs and competencies - for example, saying "I currently think X" rather than "I am an X-ist".

This helps on an individual scale, but it also encourages others to use similar vocabulary and get better at changing their minds.

Don’t dunk on people

Most people don't like being humiliated, and for some people, changing their mind during an argument is a form of humiliation. If you sense that someone is very resistant to conceding a specific point, back off.

There are two failure modes here:

  1. Your interlocutor grudgingly concedes that they are wrong, faces humiliation, and never wants to talk to you again
  2. Your interlocutor chooses to die on the wrong hill, bites a stupid bullet, faces humiliation, and never wants to talk to you again

If these things are likely to happen, back off and let people change their minds in private, where there is much less humiliation. Maybe send them a blog post or article about the thing you talked about. The seed is planted; you just need to let it grow. If they want to keep talking, they'll probably contact you, or say yes when you ask if they want to get coffee or a meal sometime.

Related: leave people a line of retreat - that is, point out to them that it's not the end of the world if they're wrong (well, ignoring that it might actually be the end of the world pretty soon if we're right about x-risk).

Assume that people are better and smarter

When forming beliefs about others' beliefs and motivations, err on the side of behaving as if they are smarter and more benevolent than your best guess. Overshooting leads to flattery at worst; undershooting leads to insulting people, which you do not want to do under any circumstances. Do not assume that people are unintelligent or malevolent; assume that there is a misunderstanding.

Keep in mind that I'm only making a claim about how to act in specific conversations. You should not be eager to trust people with your bank account information.


ClaireZabel @ 2022-04-08T06:57 (+16)

Riffing off of the alliance mindset point, one shift I've personally found really helpful (though I could imagine it backfiring for other people) in decision-making settings is switching from thinking "my job is to come up with the right proposal or decision" to "my job is to integrate the evidence I've observed (firsthand, secondhand, etc.) and reason about it as clearly and well as I'm able". 

The first framing made me feel like I was failing if other people contributed; I was "supposed" to get to the best decision, but instead I came to the wrong one that needed to be, humiliatingly, "fixed". That frame is more individualistic, and carries a sense of final responsibility that increases emotional heat and isn't explained just by Bayesian reasoning.

The latter frame evokes thoughts like "of course, what I'm able to observe and think of is only a small piece of the puzzle; of course others have lots of value to add," and shifts my experience of changing decisions from embarrassing or a sign of failure to natural and inevitable, and my orientation towards others from defensiveness to curiosity and eagerness to elicit their knowledge. And it shifts my orientation towards myself from a stakesy attempt to squeeze out an excellent product via the sheer force of emotional energy, to something more reflective, internally quiet, and focused on the outer world, not on what my proposals will say about me.

I could imagine this causing people to go easy on themselves or try less hard, but for me it's been really helpful. 

Trevor Levin @ 2022-04-07T15:38 (+4)

Great post, possibly essential reading for community-builders; adding a link to this in several of my drafts + my retreat post. I think another important thing for CBers is to create a culture where changing your mind is high-status and having strongly held opinions without good reasons is not, which is basically the opposite of the broader culture (though I think EA does a good job of this overall). Ways I've tried to do this in settings with EA newcomers:

1) excitedly changing your mind - thinking of a Robi Rahmanism: "The last time I changed my mind about something was right now." This doesn't just model openness; it also makes changing your mind a two-way street, rather than a situation where you have all the answers and they just need to learn from you, which I think makes it less identity-threatening or embarrassing to change your mind.

2) saying, in conversations with already-bought-in EAs that are in front of newcomers, things like "Hmm, I think you're under-updating." This shows that we expect longtime EAs to keep evaluating new evidence (and that we are comfortable disagreeing with each other) rather than just to memorize a catechism.

Nikola @ 2022-04-07T15:49 (+2)

Strongly agree - fostering a culture of open-mindedness (love the example from Robi) and an expectation of updating from more experienced EAs seems good. In the updating case, I think making sure that everyone knows what "updating" means is a priority (it sounds pretty weird otherwise). Maybe we should cover introductory Bayesian probability in fellowships and retreats.

Trevor Levin @ 2022-04-07T20:01 (+2)

Yes, true, avoiding jargon is important!

Jack Ryan @ 2022-04-07T19:27 (+3)

I might add something along the lines of "have them lead the conversation by letting their questions and vague feelings do the steering".