Exporting EA discussion norms
By Eevee🔹 @ 2021-06-01T13:35 (+24)
I moderate a Discord server unrelated to EA, and recently we had a discussion where participants were being dismissive of each other's POVs and making personal attacks. I think the EA community has better discussion norms than other online spaces I've been in, and I wish I could spread them to the other communities that I'm part of to improve the quality of discourse in those spaces. Has anyone tried to bring "EA-like" discourse norms to non-EA spaces? Is it easy to improve a community's discussion norms?
For reference, my server is less than a year old with about 100 members, ~15 of whom are active, and many members come from the same two subreddits.
Stefan_Schubert @ 2021-06-01T14:41 (+11)
I guess you could see, e.g. Julia Galef's The Scout Mindset as doing that, in part.
Matt_Lerner @ 2021-06-01T13:58 (+9)
I think about this all the time. It seems like a really high-value thing to do not just for the sake of other communities but even from a strictly EA perspective— discourse norms seem to have a real impact on the outcome of decision-relevant conversations, and I have an (as-yet unjustified) sense that EA-style norms lead to better normative outcomes. I haven't tried it, but I do have a few isolated, perhaps obvious observations.
- For me at least, it is easier to hew to EA discussion norms when they are, in fact, accepted norms. That is, assuming the best intentions of an interlocutor, explaining instead of persuading, steelmanning, etc.— I find it easier to do these things when I know they're expected of me. This suggests to me that it might be hard to institute such norms unilaterally.
- EA norms don't obviously all go together. You can imagine a culture where civility is a dominant norm but where views are still expressed and argued for in a tendentious way. This would suck in a community where the shared goal is some truth-seeking enterprise, but I imagine that the more substantive EA norms around debate and discussion would actually impose a significant cost on communities where truth-seeking isn't the main goal!
- Per the work of Robert Frank, it seems like there are probably institutional design decisions that can increase the likelihood of observing these norms. I'm not sure how much the EA Forum's designers intended this, but it seems to me like hiding low-scoring answers, allowing real names, and the existence of strong upvotes/downvotes all play a role in culture on the forum in particular.
Venkatesh @ 2021-06-02T10:43 (+1)
Specifics matter. There is no single discussion norm that will get people to be nice to each other.
I think things like discussion norms are highly contextual. The platform on which the discussion is happening, the point being discussed, and the people involved are some of the many factors that could end up mattering. Given these factors, transporting discussion norms from one virtual place to another might not be the right way to think about it.
I think the "EA-like" discussion norms are a function of several things. In addition to the factors mentioned above, the concept of EA itself seems to ask people to be uncertain and humble.
Consider the following thought experiment: say you took all the same people from the EA Forum and put them in a Facebook group. Do you think the "EA-like" discussion norms we currently have here would be maintained? Or imagine putting them all in a forum that isn't about EA, philosophy, or science-y stuff. What would happen?