Brainstorming ways to make EA safer and more inclusive
By richard_ngo @ 2022-11-15T11:14 (+149)
After some recent discussion on the forum and on Twitter about negative experiences that women have had in EA community spaces, I wanted to start a discussion about concrete actions that could be taken to make EA spaces safer, more comfortable, and more inclusive for women. The community health team describes some of their work related to interpersonal harm here, but I expect there's a lot more that the wider community can do to prevent sexual harassment and abusive behavior, particularly when it comes to setting up norms that proactively prevent problems rather than just dealing with them afterwards. Some prompts for discussion:
- What negative experiences have you had, and what do you wish the EA community had done differently in response to them?
- What specific behaviors have you seen which you wish were less common/wish there were stronger norms against? What would have helped you push back against them?
- As the movement becomes larger and more professionalized, how can we enable people to set clear boundaries and deal with conflicts of interest in workplaces and grantmaking?
- How can we set clearer norms related to informal power structures (e.g. people who are respected or well-connected within EA, community organizers, etc)?
- What codes of conduct should we have around events like EA Global? Here's the current code; are there things which should be included in there that aren't currently (e.g. explicitly talking about not asking people out in work-related 1:1s)?
- What are the best ways to get feedback to the right people on an ongoing basis? E.g. what sort of reporting mechanisms would make sure that concerning patterns in specific EA groups get noticed early? And which ones are currently in place?
- How can we enable people who are best at creating safe, welcoming environments to share that knowledge? Are there specific posts which should be written about best practices and lessons learned (e.g. additions to the community health resources here)?
I'd welcome people's thoughts and experiences, whether detailed discussions or just off-the-cuff comments. I'm particularly excited about suggestions for ways to translate these ideas to concrete actions going forward.
EDIT: here's a Google Form for people who want to comment anonymously; the answers should be visible here. And feel free to reach out to me in messages or in person if you have suggestions for how to do this better.
CristinaSchmidtIbáñez @ 2022-11-15T12:29 (+78)
Hey Richard, thanks for starting the discussion! I'd suggest making it easier to submit answers to these questions anonymously e.g. via an anonymous Google Form. I think that will help with opening up the discussion and making the brainstorming more fruitful.
richard_ngo @ 2022-11-15T17:00 (+24)
Great idea, here's an anonymous Google Form, and the answers should be visible here.
Julia_Wise @ 2022-11-17T03:45 (+23)
Thanks to people who are leaving ideas here!
A note on the comment that asked why the community health team isn’t visibly “pulling these guys aside and privately warning them that they are making people uncomfortable.” We have definitely done that when someone lets us know about a problem and is ok with us doing something. In other cases, the person reporting the problem doesn’t want us to take action (often because they don’t want the other person to guess that they spoke up).
If you’ve experienced a problem and want us to talk to someone, we’re very willing to do that. And we certainly do consider these types of complaints for admissions to future events. More here.
Isabel @ 2022-11-15T16:24 (+67)
I can't shake the feeling this is a misidentification of the problem. I feel like people have been writing for years about how to make communities more diverse, inclusive, and safer for the people in them. The problem is not a lack of ideas that needs to be rectified by brainstorming - we have the information already. The problem seems to be that no one wants to act on this information.
I don't know what kind of norm has yet to be stated that would convince people it's bad to have an expert panel that's just 5 white guys. I don't know what kind of "norm" I need to explicitly articulate to make it clear that it's not okay to come to a discussion of gender relations in EA and, as a man, open the discussion by asking how to hit on girls without it being creepy.
I get the feeling everyone wants to pay lip service to these values without actually addressing the problem (thus the great karma on this post with no comments actually replying to the content of it [although I also agree, in line with Vaidehi and a great Cards Against Humanity card, that Hell will freeze over before we have a sane and respectful discussion of gender and race on the internet]). But the problem is behavior. The behavior simply needs to change. We are past brainstorming. Stop doing the things that make EA spaces hard to be in.
richard_ngo @ 2022-11-15T18:42 (+15)
Hi Isabel, thanks for commenting. I think your frustration is understandable, but I'm more optimistic than you about the usefulness of collecting experiences and examples in a place where people can easily refer back to them, and giving people a space to contribute their perspectives. It's hard for any one person to have a full picture of the situation across many local communities in many countries; even the couple of examples you gave are useful for me in thinking about which interventions to prioritize. (Of course, to the extent that this has already been done, I'm happy to just fall back on that work rather than reinventing the wheel - please let me know if there are specific resources you think more people should see, and I'll link them in the original post).
I also wanted to note that people who read the EA forum are a relatively small proportion of the people who go to EA meetups, attend EA events, and so on. So one of the intentions of my post was to better understand the best ways for the people reading this to spread the message from your comment more broadly.
Vaidehi Agarwalla @ 2022-11-15T15:06 (+39)
I think this topic is very important and your question prompts are great, and I agree that the wider community could maybe do more. But I'm skeptical that much progress can be made via online community discussion.
These topics are sensitive and complicated, and often there won't be an obviously right answer (especially around norms).
My guess is that a dedicated person or small team would be needed to collect experiences (whether via surveys, interviews, etc.), consult with experts on best practices, then get buy-in from relevant stakeholders (e.g. EA orgs or groups) and work with them (possibly in a pretty hands-on way) to test and implement norm changes.
Markus Amalthea Magnuson @ 2022-11-15T15:51 (+17)
It seems very important to involve the community at all levels, including the main arena of discussions.
Additionally, delegating important community-affecting processes (and eventually, decisions) to small "expert groups" might actually be one of the mechanisms the EA community could be criticised for over-relying on, and one that causes some of these problems in the first place.
I also wanted to point out that norm changes might not be entirely what a lot of people have in mind, but rule changes too. An important distinction.
richard_ngo @ 2022-11-15T17:41 (+8)
I'm not sure if I agree, but if we suppose you're right, then I'm interested in thinking through what the bottlenecks are for that person or team existing (if they don't already), and who might be a good fit for that work.
Vaidehi Agarwalla @ 2022-11-16T16:36 (+6)
My educated guess is that the people who (I believe) would be best positioned and able to do this kind of work well (and I do think it's important to have a high bar here) have a lot of other compelling opportunities for impact.
Ula @ 2022-11-15T15:10 (+6)
I feel like this is a great post, and we should start a conversation here, e.g. by brainstorming problems and solutions. Stopping a conversation just because someone should actually act on it might not be the best way to go, because what guarantees that anyone does? At least with this post, we can see whether people have actually thought systematically about it and already come up with some solutions that could be useful. I have a few ideas for improving EAGs that I would love to share and get feedback on.
Vaidehi Agarwalla @ 2022-11-15T15:41 (+10)
To clarify, I don't want to stop the conversation; I just wanted to flag that I don't think it would be very productive relative to the effort people might put into engaging, and there may be a small chance the online conversation becomes unproductive, because it's much harder to have conversations on sensitive topics online.
I could be wrong, of course - perhaps it's important for people to see that community members want to start these conversations and care about these topics.
richard_ngo @ 2022-11-15T17:11 (+4)
Thanks Ula! Curious what your ideas about improving EAGs are?
KMF @ 2022-11-15T18:04 (+28)
We can't take on everyone who applies, so we are not the whole answer, but as an individual I am always happy to chat with people from underrepresented groups. Magnify Mentoring, the organization, continues to take on hundreds of people (currently women, non-binary people, and trans people of all genders) for mentoring every 6 months, which we will continue doing as long as the average results continue to support such an initiative (which thus far they do). We welcome support, and as an individual (and possibly as an organization) I welcome the opportunity to support efforts and further productive attention in this space. Thanks :)
richard_ngo @ 2022-11-15T18:28 (+3)
Thanks for your work KMF! I'm curious if there are any specific patterns or common stories you've noticed from talking to people from underrepresented groups who are involved in EA?
(Also, FYI, your Magnify Mentoring link is broken.)
KMF @ 2022-11-15T20:54 (+29)
Thanks so much, Richard! :) Correct link. All of this is what I would have said 10 days ago. These are *MY* thoughts, not Magnify’s; any controversies are mine alone.
(1) Not specific to the EA community, but I wish people were kinder to one another and took more time to learn about each other's experiences: why they may have the feelings and thoughts that they do, and where those are coming from. I think spaces that prioritize empathy and understanding are more conducive to working on fixing these problems together:
- There are small things like "be careful with the jargon you use", "be careful to greet everyone and engage with them at your meet-up", and "supercharge your empathy when dealing with topics that might disproportionately affect people from underrepresented groups" (such as racial and ethnic discrimination, gender-based harassment and violence, etc.). We want to get to the right answer quickly and efficiently, but the most productive conversations I've had in this space adopt a team mindset and a heightened level of compassion.
- The EA community’s online spaces in particular need to work on strengthening empathy and compassion in all dialogues. For example, I was name-checked online a while back about how Magnify’s description of our program participants (“women and non-binary”) was discriminatory against trans people. My initial reaction was to be hurt and mad, but I forced myself to hear them as a person: she was hurt, and I was hurt. Her underlying motivation was to make sure trans people felt comfortable and welcome, and I wanted this too. We reached a resolution together without publicly taking chunks out of each other and causing further harm.
- I believe the EA community should have a ZERO asshole policy and stick to it bravely. The ideas of the person may be great, but if the cost is driving excellent people out, I don't believe it's worth it. For me personally, Robin Hanson is an example of this.
(2) A mentee recently asked me if she could be an EA if she liked justice-shaped dialogues. I want to see EA as a toolkit where we all work together and bring in lots of different ideas, tools, and traditions to create excellent outcomes that positively shape the world and reduce suffering. A monolith or single identity is not conducive to this.
(3) I’m biased, but I'd also love to see people spend more time mentoring. I think the focus on time optimization is great, but it sometimes backfires in that people are not as willing to invest in newer EAs, particularly those who are undercooked. I believe I have had a large impact simply by having chats with people, pointing them at resources, and touching base with them after 6 months.
(4) I'd love to identify some better questions to accurately identify the “seeds” of what we now call value alignment. I think we are still using low-accuracy proxies such as who people know and books they have read. I'd love to deconstruct what we mean (I think it's prioritization clarity) by “value alignment” and how we can measure it correctly.
(5) These topics (power dynamics) are vitally, vitally important. I've been involved in EA for the better part of a decade, so I have many people I genuinely love who are working in this space along with me. This network of affection is great, but it is also a very good reason to supercharge adopting best practices on declaring conflicts of interest and raising awareness of power dynamics in grants, hiring, etc. Julia Wise and the community health team have done some excellent work here, but this emphasis needs to spread. (To be clear, I think EA organizations have significantly professionalized over time. Every time I have applied for funding I have needed to declare a conflict of interest, because a couple of our board members work or have worked at CEA. They also asked me to publicly display this on our website. This is, to me, a sign of a maturing movement, and has helped in my opinion.) Thank you very much.
SeeYouAnon @ 2022-11-15T16:43 (+28)
Here's a concrete suggestion: less emphasis on a person's impact in "interpersonal harm" cases.
Two of the seven rows in the table in the community health team's post on this topic are about making sure that we can still get the value of the perpetrator's work.
This seems to me to be the sort of naive consequentialist reasoning that people find so off-putting about EA. It seems to me that when a perpetrator commits a wrongdoing, the question to ask isn't "do they do work that we value and want more of?" The questions are "did they commit a wrong? What sort of wrong? What is the appropriate response to that wrongdoing?"
(And note that my link is to an official statement from the team explicitly dedicated to helping protect the community. Imagine how much more prevalent the attitude of "focus on impact" is in some less cautious corners of the community.)
Neel Nanda @ 2022-11-15T18:37 (+19)
Looking at the context of that table, that is a list of difficult tradeoffs where both sides are valuable and they're not sure they have the right balance. This seems pretty correct to me? It is a difficult tradeoff, both sides ARE valuable, and they may not have the right balance.
SeeYouAnon @ 2022-11-15T18:47 (+52)
My view is that this is false. Whether or not someone's work is useful should play no role in determining the appropriate reaction to predatory behaviour, so there's just no tradeoff we should be reflecting on here. I don't think this is a difficult question. I don't think that (for example) the talent bottleneck is relevant to how EA should respond to predatory behaviour: if people act in a predatory way, that should be acted on even if it makes it harder to find talent. The tradeoff is simple because one side of it should be entirely ignored.
I'm sure that many of the readers of this forum will disagree with me about this. But my view is that the community will never robustly act on predatory behaviour while it continues to treat impact as one of the relevant factors in determining how to respond to such behaviour.
(I also think this is an example of how, despite some people's protestations, EA does in fact engage in a sort of means-end reasoning that is in violation of common sense morality and that does involve a failure of what many people would see as integrity).
I think it's important to be clear about this viewpoint, but I do worry that in doing so it will sound like I'm attacking Neel. So I want to be clear that this is not the case; I don't know Neel but I imagine he's an excellent and lovely human being. I just happen to think he's wrong about this specific issue, and I happen to think that the fact that many EAs hold this view has had serious and negative effects.
ETA: Even with all of that said, I do agree that the full post from the community health team contains much more detail than I summarised in my brief reference, and I think people should not judge the full contents of the post based on my comment (instead, I would encourage people to read the post itself).
SeeYouAnon @ 2022-11-15T18:59 (+23)
It's perhaps worth noting that I think there's a pretty strong consequentialist case against considering impact in these cases. I think doing so has reputational costs, I think it encourages future wrongdoing, and I think it discourages valuable contributions to the community from those who are driven away. (This is just the point that consequentialist EAs are making when they argue against being "naive" consequentialists).
But I will leave someone else to make this case in detail if they wish to, because I think that this is not the point. I personally find it disturbing that I would have to make a case in impact terms in order to encourage robust action against perpetrators, and I don't feel comfortable doing so in detail.
Jenny K E @ 2022-11-15T19:45 (+11)
I think maybe the balance I'd strike here is as follows: we always respect nonintervention requests by victims. That is, if the victim says "I was harmed by X, but I think the consequences of me reporting this should not include consequence Y," then we avoid intervening in ways that will cause Y. This is a good practice generally, because you never want to disincentivize people from reporting by making it so that reporting has consequences they don't want. Usually the sorts of unwanted consequences in question are things like "I'm afraid of backlash if someone tells X that I'm the one who reported them" or "I'm just saying this to help you establish a pattern of bad behavior by X, but I don't want to be involved in this, so don't do anything about it just based on my report." But this sort of nonintervention request might also be made by victims whose point of view is "I think X is doing really impactful work, and I want my report to at most limit their engagement with EA in certain contexts (e.g., situations where they have significant influence over young EAs), not to limit their involvement in EA generally." In other words, leave impact considerations to the victim's own choice.
I'm not sure this is the right balance. I wrote it with one specific real example from my own life in mind, and I don't know how well it generalizes. But it does seem to me like any less victim-friendly position than that would probably indeed be worse even from a completely consequentialist perspective, because of the likelihood of driving victims away from EA.
Guy Raveh @ 2022-11-15T22:44 (+3)
because of the likelihood of driving victims away from EA.
And, after a while, also people who aren't yet victims but know how the community will act (or fail to act) if they become victims, so they just opt out preemptively.
Davit Jintcharadze @ 2022-11-21T14:01 (+2)
This is a valid consideration; however, one could argue that if we give victims the option to opt out of the specific consequence that might have been crucial in preventing future wrongdoing by the same person or other people, then perpetrators will think they can carry on with their behavior, especially if the victim decides to opt the perpetrator out of all serious consequences. It could also be the case that victims who are psychologically affected by what happened to them are not able to make an informed judgment about consequences at that very moment; as we know, everyone has their own time frame for processing the wrongdoing that was done to them.
Neel Nanda @ 2022-11-15T20:41 (+10)
Hmm, I can see where you're coming from, but this seems hard to argue in absolutes. There are situations where it's unclear and the evidence is murky re whether the predatory behaviour actually happened, or where the behaviour could maybe be seen as predatory in a certain light and cultural context but not in others. I'm reluctant to say that a factor just does not matter, though it seems reasonable to argue that EAs overweight it.
SeeYouAnon @ 2022-11-15T21:09 (+37)
This will be my last message in this thread, because I find this conversation upsetting every time it happens (and every time it becomes clear that nothing will change). I find it really distressing that a bunch of lovely and caring people can come together and create a community that can be so unfriendly to the victims of assault and harassment.
And I find it upsetting that these lovely and caring people can fall into serious moral failure, in the way that this is a serious moral failure from my perspective on morality (I say this while also accepting that this reflects not evilness but rather a disagreement about morality, such that the lovely, caring people really do continue to be lovely and caring and they simply disagree with me about a substantive question).
To reply to your specific comments, I certainly agree that there is room for nuance: situations can be unclear and there can be clashes of cultural norms. Navigating the moral world is difficult and we certainly need to pay attention to nuances to navigate it well.
Yet as far as I'm concerned, it remains the case that someone's contributions via their work are irrelevant to assessing how we should respond to their serious wrongdoing. It's possible to accept the existence of nuance without thinking that all nuances matter. I do not think that this nuance matters.
(I'm happy to stick to discussing serious cases of wrongdoing and simply set aside the more marginal cases. I think it would represent such a huge step forwards if EA could come to robustly act on serious wrongdoing, so I don't want to get distracted by trying to figure out the appropriate reaction to the less crucial cases.)
I cannot provide an argument for this of the form that Oliver would like, not least because his comment suggests he might prefer an argument that is ultimately consequentialist in nature, even if several layers removed, but I think this is the fundamentally wrong approach.
Everyone accepts some moral claims as fundamental. I take it as a fundamental moral claim that when a perpetrator commits a serious wrong against someone it is the nature of the wrong (and perhaps the views of the person wronged, per Jenny's comment) that determine the appropriate response. I don't expect that everyone reading this comment will agree with this, and I don't believe it's always possible to argue someone into a moral view (I think at some fundamental level, we end up having to accept irreconcilable disagreements, as much as that frustrates the EA urge to be able to use reason to settle all matters).
(At this point, we could push into hypothetical scenarios like, "what if you were literally certain that if we reacted appropriately to the wrongdoing then everyone would be tortured forever?". Would the consequences still be irrelevant? Perhaps not, but the fact of the matter is that we do not live in a hypothetical world. I will say this much: I think that the nature of the wrongdoing is the vastly dominating factor in determining how to respond to that wrongdoing. In realistic cases, it is powerful enough that we don't need to reflect on the other considerations that carry less weight in this context.)
I've said I don't expect to convince the consequentialists reading this to accept my view. What's the point then? Perhaps I simply hope to make clear just how crucial an issue of moral conscience this is for some people. And perhaps I hope that this might at least push EA to consider a compromise that is more responsive to this matter of conscience.
Neel Nanda @ 2022-11-16T12:17 (+12)
I'm sorry you've found this conversation upsetting, and think it's entirely reasonable to not want to continue it, so I'll leave things here. I appreciate the openness, and you still being willing to express this opinion despite expecting to find the conversation upsetting!
Habryka @ 2022-11-15T19:38 (+4)
I think you could try to argue (but you do have to argue) that the harm from this kind of behavior is much more important than the contributions from the same people, especially when the behavior is minor. Or you could try to argue that there is a moral Schelling fence here that suggests some kind of deontological rule that we shouldn't cross, not because we know what happens when we cross it, but because it sure is a pretty universal rule (which, to be clear, in this case I don't think applies, though I think there is an interesting argument to be made here). Or you could argue that there is some group of experts on this topic with a good track record whom we should defer to on this topic, even if we don't understand their reasoning.
But I do think at the end this is a position that has to be argued against (and I think there are interesting arguments to be made), and I don't think this comment succeeds at that. I think it contains snippets of considerations, but I don't like the degree to which it tries to frame its position as obvious, while mostly only hinting at underlying arguments.
bruce @ 2022-11-16T02:38 (+7)
Just to be more concrete, what would you say is an example of a behaviour that you think does not warrant action, because "the harm from this kind of behaviour is not much more important than the contributions from the same people"?
And where would you personally draw the line? i.e., what does the most harmful example look like that still does not warrant action, because the harm is not much more important than the contributions?
bruce @ 2022-11-16T03:29 (+23)
While I agree that both sides are valuable, I agree with the anon here - I don't think these tradeoffs are particularly relevant to a community health team investigating interpersonal harm cases with the goal of "reduc[ing] risk of harm to members of the community while being fair to people who are accused of wrongdoing".
One downside of having the bad-ness of, say, sexual violence[1] be mitigated by perceived impact (how is the community health team actually measuring this? how good someone's forum posts are? or whether they work at an EA org? or whether they are "EA leadership"?) when considering what the appropriate action should be (if this is happening) is that it plausibly leads to different standards for bad behaviour. By the community health team's own standards, taking someone's potential impact into account as a mitigating factor seems like it could increase the risk of harm to members of the community (by not taking sufficient action, with the justification of perceived impact), while being more unfair to people who are accused of wrongdoing. To be clear, I'm basing this off the forum post, not any non-public information.
Additionally, a common theme about basically every sexual violence scandal that I've read about is that there were (often multiple) warnings beforehand that were not taken seriously.
If there is a major sexual violence scandal in EA in the future, it will be pretty damning if the warnings and concerns were clearly raised, but the community health team chose not to act because they decided it wasn't worth the tradeoff against the person/people's impact.
Another point is that people who are considered impactful are likely to be somewhat correlated with people who have gained respect and power in the EA space, have seniority or leadership roles etc. Given the role that abuse of power plays in sexual violence, we should be especially cautious of considerations that might indirectly favour those who have power.
More weakly, even if you hold the view that it is in fact the community health team's role to "take the talent bottleneck seriously; don’t hamper hiring / projects too much" when responding to, say, a sexual violence allegation, it seems like it would be easy to overvalue the bad-ness of immediate action against the person's impact, and undervalue the bad-ness of many more people opting not to get involved, or distancing themselves from the EA movement, because they perceive it to be an unsafe place for women with unreliable ways of holding perpetrators accountable.
That being said, I think the community health team has an incredibly difficult job, and while they play an important role in mediating community norms and dynamics (and thus have a corresponding amount of responsibility), it's always easier to make comments of a critical nature than to make the difficult decisions they have to make. I'm grateful they exist, and don't want my comment to come across like an attack on the community health team or its individuals!
(commenting in personal capacity etc)
Julia_Wise @ 2022-11-21T21:46 (+12)
Thanks for raising this, I think I wasn’t clear enough in the post cited.
To clarify - that line in the table is referring specifically to sharing research, not all kinds of participation in the community. I meant it about things like “should people still be able to post their research on the EA Forum, or receive a grant to do research, if they’ve treated other people badly?” I find that a genuinely hard question. I don’t want to ignore the past or enable more harm. But I also don’t want to suppress content that would be useful to other EAs (and to the world) because of the person who produced it.
I see that as a pretty different question from “Should they attend conferences?” and other things more relevant to their participation in the community side of EA.
SeeYouAnon @ 2022-11-23T14:53 (+5)
A few brief comments.
1.) Clearly this is better than the alternative where the same considerations are applied to other ways of participating in the community.
2.) My issue isn't particularly with the community health team, but with a general attitude that I've often encountered among EAs in more informal discussions. Sadly, informal discussions are hard to provide concrete evidence of, so I pointed to an example that I take to be less egregious, though I still think on the wrong side of things here. I am more concerned by the general attitude that is held by some EAs I've spoken to than two specific lines of a specific post.
3.) People are banned from the forum for being rude in relatively minor ways. And yet let's imagine a hypothetical case where someone is accused of serious wrongdoing, and further is specifically accused of carrying out some elements of that wrongdoing via online social networks. It would seem weird to ban the first person for minor rudeness, but give the second person access to a platform that can allow them to build status and communicate with people via just the sort of medium that they allegedly used to carry out previous wrongdoing. Yet I think this is a plausible outcome of the current policies on when to ban people and how to react to interpersonal harm.
4.) I agree that it's a different question; I still don't think it's a difficult one. For a start, I think it's a little odd to conceive of this as "suppressing" content. People can still post content in lots of other places, and indeed other people can share it on the EA forum if they want to. Further, I don't think you can separate out enabling harm from posting to the forum, given that forum posts can confer status to people and status can help people to commit harm. So I think that the current policy just does enable harm. I think enabling this harm is the wrong call.
5.) I also think we could run the consequentialist case here, pointing to the fact that other people might not contribute to EA because they find the EA attitude to these cases concerning and don't feel safe or comfortable in the community.
All of that said, I think it's important to say again, per point 1, that I do agree that the issue is much less concerning when it doesn't involve real world contact between people, and that I appreciate you taking the time to reply.
Guy Raveh @ 2022-11-15T22:26 (+6)
I strongly agree with this.
On a tangent, I also want to flag that this exemplifies the importance of transparent policies and rationales in orgs relating to the community. Without Julia Wise's post on her approach, which was effectively secret for a long time before being published, it would be impossible to have this discussion. I believe publishing that post was a result of community pressure for transparency, and that we should continue pressing for that kind of transparency in other areas of EA.
quinn @ 2022-11-20T03:07 (+4)
making sure that we can still get the value of the perpetrator's work.
The standard recommendation I've always heard is basically in the family of tradeoffs, but says that you never really land on the side of preserving the perpetrator's contributions when you factor in the victim's contributions and higher order effects from networks/feedback loops.
Habryka @ 2022-11-20T19:49 (+17)
I don't understand what's going on here. Sometimes someone is a bit rude and causes a tiny bit of interpersonal harm. Sometimes someone smells bad. Sometimes someone has a slightly bad temper. Of course I care about being able to benefit from the contributions of those people, many great scientists and thinkers in history had problems of this type.
How is it possible to "never land on the side of preserving the perpetrator's contributions" without specifying the severity of the things going on? Of course there will be many levels of severity where you have to make difficult tradeoffs here, this seems so obvious that I don't understand what is going on in this thread.
quinn @ 2022-11-20T20:22 (+10)
I think the heuristic I mentioned is designed for sexual assault, and I wouldn't expect it to be the right move for less severe values of interpersonal harm.
Realizing now that I did the very thing that annoys me about these discussions: make statements tuned for severe and obvious cases that have implications about less severe or obvious cases, but not being clear about it, leaving the reader to wonder if they ought to round up the less obvious cases into a more obvious case. Sorry about that.
Linch @ 2022-11-20T20:16 (+5)
In context, I definitely read this as about median/modal allegations of harm that are reported to the CEA CH team. I expect them to be substantially more severe than the examples you listed.
Julia_Wise @ 2022-11-21T21:47 (+21)
The modal thing that gets reported to community health is something like “This person did a thing that made me / my friend kind of uncomfortable, and I’d like you to notice if other people report more problems from them.”
Linch @ 2022-11-22T04:36 (+2)
Thanks, this is helpful!
Habryka @ 2022-11-20T20:44 (+6)
Huh, I actually think a lot of relatively minor pieces of harm get reported to the CEA CH team, where probably nobody involved would want the other party to just be completely excluded from the community, or give no care to their ability to continue contributing.
A lot of the things I talk to the CH team about are things like "this person seemed kind of salesy when I interfaced with them, and I would want someone to keep track of whether other people feel the same, and maybe watch out for some bigger pattern".
Making this account feels almost as bad as pulling a "Holden", @ 2022-11-20T21:28 (+2)
I’m trying to get a model of what you’re saying here:
A lot of the things I talk to the CH team about are things like "this person seemed kind of salesy when I interfaced with them, and I would want someone to keep track of whether other people feel the same, and maybe watch out for some bigger pattern".
Is the CH team (terrible initials BTW) initiating contact with you about this low-urgency, low-danger work? Or are you initiating contact with them?
In either case, it’s not clear what this is saying or how it’s negative or positive.
For example, in a Bayesian model sort of sense, I don’t see how this gives information on the CH team being ineffective or effective, or the EA community being bad or good.
(To be honest, IMO keeping track of these small things seems very favorable. It seems consistent with the CH team being involved in the community. This seems like it gives depth/competence/context when there is a much more major issue. It also seems like a class of nuanced, quiet, conscientious work that has long-term benefits for everyone but is less visible, compared to other ways of doing this work, like big splashy announcements (as negative examples, think of the dysfunction of institutions in The Wire)).
What I’m trying to get at is that you are one of the most respected people and have good insights, so if you have a model of how things should improve, or EA institutions are low wattage or high wattage, on the CH team or otherwise, it would be good to hear.
Habryka @ 2022-11-20T21:43 (+13)
Is the CH team (terrible initials BTW) initiating contact with you about this low-urgency, low-danger work? Or are you initiating contact with them?
I have some recurring meetings with Nicole (though we sure have been skipping a lot of them in recent months) where I tend to bring these things up.
In either case, it’s not clear what this is saying or how it’s negative or positive.
Sorry, I am just responding to Linch's statement that the median/modal piece of harm that gets reported to the CH team is probably quite severe (whereas I think the majority are pretty minor, and one of the primary jobs of the CH team is to figure out how to aggregate lots of weak points of evidence that might point to some kind of large distributed harm).
(To be honest, IMO keeping track of these small things seems very favorable. It seems consistent with the CH team being involved in the community. This seems like it gives depth/competence/context when there is a much more major issue. It is also seems like a class of nuanced, quiet, conscientious work that has long term benefits for everyone, but is less visible, compared to other ways of doing this work, like big splashy announcements (as negative examples think dysfunction of institutions in The Wire)).
Yep, this seems right to me. I am glad the CH team is filling this function. I think there are better ways of going about it than they historically have, and I have some criticisms, but I am overall happy that an institution like this exists (and indeed think that something nearby that could have aggregated more evidence on Sam's dishonesty could have maybe done something about the FTX situation).
Julia_Wise @ 2022-11-17T03:44 (+25)
Thanks for starting this discussion!
Some previous efforts here:
The community health team has done some more in-depth work, for example interviews about women's experiences in a couple of workspaces. Unfortunately, the in-depth work didn't yield that many useful next steps. (I’m sure this varies, and in some cases in-depth study of what’s going on with the culture in a space would yield useful action points.)
And more general thoughts:
- EA is multifaceted, made of thousands of people in different online spaces, workplaces, cities, and countries. Even understanding the culture(s), let alone shaping the whole thing, is a huge task.
- All of us bring pre-existing expectations from our universities, friend groups, workplaces, etc, so there’s no static EA culture - there’s constant inflow from other cultures.
- The work of shaping culture is usually best done by people who understand their space well (their workplace, local group, etc) rather than an outside entity. The staff at CEA can provide advice and support, but we certainly can’t single-handedly change the culture of all these EA spaces.
Dawn Drescher @ 2022-11-22T00:30 (+3)
Since Isabel argues that this is a deployment rather than a research problem, and since CEA doesn’t have a lot of fine-grained control over all the EA spaces, and since I have the perhaps naive and not-empirically-substantiated view that at least the majority of organizers of EA spaces are well-intentioned, maybe we need high-status groups to stress the urgency of this problem more.
For example, maybe you can still get permission to work with all these interviews some more, e.g., mix and amalgamate them into a large body of anonymized case studies, to convince anyone who thinks that this isn't happening in their particular spaces that they're likely mistaken and need to address the problem?
Sarah H @ 2022-11-16T05:38 (+19)
I’ve been involved with EA since 2015. I think there’s a lot of room for EA to do better when it comes to inclusivity, especially regarding gender (but also race/class/other identity aspects).
The gender skew in EA exacerbates a lot of the issues related to gender. The gender ratio varies a ton across different geographies and cause areas, but in my experience it ranges from roughly 50/50 to overwhelmingly male (70/30 male/female per 2020 EA survey). When I walk into a meetup and I’m the only woman there, that affects my experience. This was particularly the case when I was first getting involved with EA as a teenager: part of deciding whether you stay involved with a community is your answer to “am I welcome here? Is this community for people like me?,” and repeatedly having experiences where I was one of the only women present gave me the sense that this community wasn’t for people like me. That led me to engage less with EA, though I eventually returned; I suspect it’s more common for women and people from underrepresented groups to “bounce off” of EA like this. It is genuinely surprising in many ways that EA doesn’t have more women, as women tend to be way more involved in the non-profit sector more broadly. It doesn’t have to be this way.
But the gender skew also affects things when there are issues. If someone makes some comment that makes you uncomfortable and the rest of your male conversational partners laugh it off, that's not super helpful. I think that, as a community, we should work to reduce the gender skew—through making EA spaces more welcoming to women, investing in mentorship programs, etc.—and actively take efforts to mitigate issues created by the gender skew. On a macro level, fewer women in the room when decision-making is occurring means that issues that affect women are less likely to receive their appropriate attention. That necessitates that institutions make a more active effort to pay attention to issues that affect women, collect women's opinions on issues that affect things, and yes, have more women in the room when decisions are happening. On a micro level, note when you're at a get-together and it's overwhelmingly monolithic (in terms of race, gender, etc.). Pay attention to how that affects how you treat people in the non-dominant group.
(splitting into a second comment b/c of length)
Sarah H @ 2022-11-16T06:37 (+11)
I think individuals and institutions in EA need to do a better job of mitigating risks created by unequal power dynamics. In a previous job, I conducted research related to institutional accountability and sexual assault. One common theme is that the way that institutions and communities respond to bad behavior by key figures is shaped by their norms and systems, with certain attributes making accountability more difficult to achieve. In my opinion, there are several aspects of the EA movement as it currently exists—including the blurrier work/life boundaries for many folks, the outsize power of certain community leaders, the frequent reliance on ad-hoc rather than formal systems, and the movement’s small size—that make accountability particularly difficult, and I don’t feel that we have done enough to create systems that respond to these risks.
Let’s think through an example. (To be clear, this is entirely hypothetical.) Imagine that a woman is harassed by a prominent community leader. She works for a small EA org. Her boss is close friends with her harasser, and the org receives significant funding from his organization. She wants to say something, but she doesn’t want to threaten her job or their funding. Not only that, but most of her friends are in EA circles, and she knows speaking up would be divisive.
Some of the things that make this sort of situation more difficult in EA are based on parts of the community that would be difficult or undesirable to change. But some of them are worth changing, and the existence of all of them makes the creation of robust systems even more important.
I think it’s useful for institutions to think through these sorts of exercises. What if a major donor was engaging in bad behavior? An organization’s leader? To what extent would victims feel able to come forward? How likely would it be that the victim would face negative consequences from speaking up, vs. that the perpetrator would face real consequences?
There are always going to be bad actors. It’s up to communities and institutions to set up systems so that when improper behavior occurs (whether harassment, assault, etc.), it is more likely that bad actors will face accountability for their actions. With rare exceptions, the deck is stacked against the victim and towards the perpetrator; good systems can help reduce how strongly the deck is stacked.
What do these good systems look like?
- Well-publicized, accessible systems (within orgs, community spaces, and events) that allow people to report incidents of improper behavior
- Clear policies for how institutions will respond to reports, including how they will maintain confidentiality
- Thoughtful procedures for reducing the likelihood of retaliation
- Explicit conflict of interest policies for orgs and grantmakers
- Robust governance systems
This is just a start, but hopefully a helpful one! These are conversations worth having.
throwaway5 @ 2022-11-15T18:34 (+11)
Just to flag: in some ways it would be good to be less inclusive. There is a lot of discussion right now about how/if/when we could have spotted the FTX/crypto fraud sooner, and that is all about being quicker to exclude them.
To engage more productively with the prompt, I think the de-normalisation of polygamy seems plausible. I've long been uncomfortable with (some) EAs' embrace of this, given the harmful effects of the institution on societies, like encouraging male violence and suppressing women's rights. Even though these issues didn't seem like huge issues for EAs, I don't think we should adopt norms that would be bad for society if everyone followed them. But there also seem to be two more concrete and significant inclusion-related reasons to oppose it.
Firstly, it enables predatory men and abuses of power. In a traditional environment, all the senior men will be married, and thus any proposition they make to vulnerable young women is clearly illicit. It can still happen - though probably with lower frequency - but the woman will clearly understand from the beginning that a norms violation is occurring, and there is more support for shutting it down sooner. Additionally, to the extent the leaders' wives are involved in the community, there is a native constituency naturally opposed to this behavior.
Additionally, as people have pointed out, sexual relations in the workplace create clear conflicts of interest. There is a reason they are tightly regulated in many professional environments. This is not the first scandal we have had where key decision makers seem to have covered up for their romantic partners.
ozymandias @ 2022-11-15T22:32 (+41)
As a queer person, it definitely makes me feel unwelcome to hear people suggest that the social movement I'm part of gets to have an opinion on my consensual relationship choices.
KMF @ 2022-11-15T22:57 (+12)
They don't and I'm sorry. If you want to chat feel free to hit me up (kathryn@magnifymentoring.org).
ozymandias @ 2022-11-15T23:07 (+14)
Don't worry, I'm robust to bad comments on the EA Forum. :) Fortunately, this doesn't seem to be a norm anywhere close to being adopted.
Amber Dawn @ 2022-11-15T20:01 (+26)
So I strongly disagree with this. First, what's the evidence that polyamory has 'harmful effects... on societies, like encouraging male violence and suppressing women's rights'? Isn't it more of a suppression of women's rights to say that they can only have one romantic partner at a time, if they want more?
Second, I don't think polyamory meaningfully enables predatory men. It seems a bit patronizing to say that vulnerable young women can only understand that something is going wrong if the guy who hits on them is married. A bigger issue is that power dynamics make it hard for victims to speak up.
The point about sexual relations in the workplace is a non-sequitur - people can be poly without dating their co-workers.
Finally, I'm poly and a woman, so if the community became hostile to or suspicious of polyamory per se, it would become less inclusive for me. One of the things I like about this community is the fact that people are open to people who make unusual lifestyle choices, and they're not (usually!) tempted to mock or scorn something just because it's weird or unusual. I'd be sad to lose that.
richard_ngo @ 2022-11-15T18:55 (+17)
Thanks for commenting :) I think the dynamics around polyamory are important to think about in these types of discussions.
My own take: I agree that lots of people being poly makes it harder to identify norm violations, compared with traditional environments, and that this is a significant cost. So when thinking about how to set norms about professional boundaries, we should be aware that the "standard" norms are calibrated for primarily-monogamous environments, and therefore err on the side of being more careful than we otherwise would.
De-normalizing is pretty broad, though, so I'm keen to think more about what this might involve. Things like not assuming people are poly by default definitely seem valuable. On the other hand, I wouldn't endorse "opposing" poly more generally - I think we should be very cautious about passing judgement on people's sexual identities (especially when poly people often face hostility from the rest of society).
Monica @ 2022-11-15T20:47 (+5)
Hi Richard, could you explain how lots of people being poly makes it harder to identify norm violations? What kind of norms do you perceive to be different? I certainly agree it is bad to assume anyone is poly/not poly/interested in any kind of romantic interaction/gay/straight/or anything else, but I am curious about what kind of norm violations you are referring to.
KMF @ 2022-11-15T20:58 (+13)
I don't agree. I am monogamous but some of the most loving and healthy families I have ever seen are poly. Most of my relationship goals are from poly families.
Vaidehi Agarwalla @ 2022-11-16T00:31 (+3)
Terminology clarification: did you mean to say polyamory and not polygamy in this comment?
Isabel @ 2022-11-15T19:00 (+2)
Imo this post seemed pretty explicitly based on the prioritization of inclusivity towards women, nonbinary people, and people of the global majority and while I can see that you could conceivably frame this as a women's safety/workplace harassment thing, there's probably just as much to be said about e.g. monogamy being an antifeminist prison, so it seems strange to me that you'd want to bring this up here.
The rhetoric around "senior men" and "the leaders' wives" rings very Handmaid's Tale-y, which is probably an exaggeration. Also not sure why the solution in the third paragraph isn't "don't hit on women who are your professional junior."
throwaway5 @ 2022-11-15T19:37 (+13)
Yes, reducing workplace and social harassment of women is an important issue for inclusivity. I brought this up because there is a lot of research showing that monogamy is good for women, because it reduces violence and increases wellbeing.
The rhetoric around "senior men" and "the leaders' wives" rings very Handmaid's Tale-y, which is probably an exaggeration.
Do you deny that most organizations are led by senior men, who sometimes inappropriately approach more junior women? Or that traditionally most senior men had wives? I don't understand the Handmaid's Tale reference. In that book important men get multiple wives, which I am opposed to?
Also not sure why the solution in the third paragraph isn't "don't hit on women who are your professional junior."
For the same reason the solution to theft isn't "don't steal". We need a response which is robust to some bad actors, rather than one that just assumes everyone will be good. This helps increase the social costs of bad behaviour.
Amber Dawn @ 2022-11-15T20:12 (+28)
Both of the studies you linked are about polygamous cultures where men have multiple (formal) wives, rather than men and women both having multiple partners of varying degrees of commitment, so I don't see why they would be relevant to polyamory as practiced in the EA community.
Also, this whole discussion takes away women's agency and is framed as if women are just passive victims. You know what's 'good for women'? Letting them choose who they date, marry or sleep with.
ozymandias @ 2022-11-15T22:52 (+23)
I don't understand why bad actors who are already willing to harass women wouldn't be willing to cheat on their wives. I also don't understand why we can't just stigmatize people hitting on their employees, if that is the thing we actually care about. Your proposed system has no advantages if the senior men are single or serially monogamous-- both very common.
Your language also strikes me as oddly and unnecessarily gendered. It isn't exactly better if a senior woman is hitting on a younger, vulnerable man! Effective altruists are much more LGBT+ than the general population, and poly effective altruists even more so; it seems to me to be a very incomplete analysis to assume that everyone is heterosexual.
Jenny K E @ 2022-11-15T22:09 (+10)
[ETA: Whoops, realized this is answering a different question than the one the poster actually asked -- they wanted to know what individual community members can do, which I don't address here.]
Some concrete suggestions:
-Mandatory trainings for community organizers. This idea is lifted directly from academia, which often mandates trainings of this sort. The professional versions are often quite dumb and involve really annoying unskippable videos; I think a non-dumb EA version would encourage the community organizer to read the content of the community health guide linked in the above post and then require them to pass a quiz about its contents (but if they can pass the quiz without reading the guide that's fine, the point is to check they understand the contents of the guide, not to make them read it). I imagine that better but higher-effort/more costly versions of this test would involve short answer questions ("How would you respond to X situation?"); less useful but lower-effort versions would involve multiple choice questions. To elaborate, the short answers version forces people to think more about their answer but also probably requires a team of people to read all these answers and check if they're appropriate or not, which is costly.
-Some institution (the community health team? I dunno) should come up with and institute codes of conduct for EA events and make sure organizers know about them. There'd presumably need to be multiple codes of conduct for different types of events. This ties in to the previous bullet since it's the sort of thing you'd want to make sure organizers understand. This is a bit of a vague and uninformed proposal -- maybe something like this already exists, although if so I don't know about it, which at minimum implies that if it exists it ought to be more widely advertised.
-Maybe a centralized page of resources for victims and allies, with advice, separate from the code of conduct? Don't know how useful this is.
-Every medium/large EA event/group should have a designated community health point person, preferably female though not necessarily, who makes a public announcement that if someone makes you uncomfortable you can talk to the point person, and with your permission they'll do what's necessary to help, and then follows through if people do report issues to them. They should also remind/inform everyone of the role of Julia Wise, and, if someone comes to them with an issue and gives permission to pass it on to her and her team, do that. (You might ask, if this point person is probably just gonna pass things on to Julia Wise, why even have a point person? The answer is that reporting is scary, and it can be easier to report to someone you know who has some context on the situation/group.)
Furthermore, making these things happen has to explicitly be someone's job, or the job of a group of someones. It's much likelier to actually happen in practice if it is someone's specific responsibility than if it's just an idea some people talk about on the Forum.
Something I don't think helps much: telling all EAs that they should improve their behavior and stop being terrible. This won't work because, unfortunately, self-identifying EAs aren't all cooperative, nice individuals who care about not personally harming others. They have no incentive to change just because someone tells them to, and the worse offenders on these sorts of issues are also very likely not the sorts of people who want to read posts like this one about how to do better. That said, the more helpful posts on this subject are those that include lots of specific examples or advice, especially advice for bystanders.
Ula @ 2022-11-16T01:05 (+1)
- Training seems to me like a good idea. If it can be online (delivered to a large group, e.g. all organizers), very specific - so, as you mentioned, "if this situation occurs -> do this" (e.g. if a person reports mistreatment of this sort, we do XYZ) - free, and mandatory, it could be very helpful.
- The centralized page also seems useful, or e.g. an add-on/button in Swapcard.
- It should be announced in the intro speech who these designated people are (there should be one male and one female member), and I saw a great idea at EAGxPrague, where they put the photos and contact details of their community health people in the bathrooms (among other places), for situations where someone runs there because they are anxious, overwhelmed, etc.
- I agree telling people to just improve their behavior (especially with a sizable portion of the community being people with poorer social skills) will definitely not work.
Nathan Young @ 2022-11-16T14:45 (+9)
Some suggested heuristics:
- If you wouldn't ask someone for an EAG meeting if you didn't fancy them, don't ask for the meeting if you do.
- If you are talking to someone, leave a break. If they don't ask a question, consider whether they actually look like they're enjoying the conversation or are just being polite.
- If you want to leave a conversation, say "I am going to do X", "I am going to head off now", or "[name] it's been a pleasure"
- Imagine you are going to be fined $100 if you make someone uncomfortable. Would you act differently?
S.E. Montgomery @ 2022-11-15T19:50 (+7)
Great post! I agree with a commenter above who says that "The problem is not a lack of ideas that needs to be rectified by brainstorming - we have the information already. The problem seems to be that no one wants to act on this information." That being said, I have a few thoughts:
Regarding codes of conduct at events, I'm hesitant to make hard-and-fast rules here. I think the reality around situations such as asking people out/hitting on people, etc., is that some people are better at reading situations than others. For example, I know couples who have started dating after meeting each other at my local EA group's events, and I don't think anyone would see an issue with that. The issue comes in when someone asks someone out/hits on someone and makes the other person uncomfortable in the process. That being said, not asking people out during 1:1s seems like a good norm (I'm surprised I even need to say this, to be frank), as does not touching someone unless you have explicitly asked for their consent (this can apply even to something like hugs), and not making comments on someone's appearance/facial features/body.
In terms of power structures/conflicts of interest, I would love to see us borrow more from other organisations that have good guidelines around this. I can't think of any specific ones right now, but I know from my time working in government that there are specific processes to be followed around conflicts of interest, including consensual workplace relationships. I'm sure others can chime in with organisations that do this well.
In terms of hiring, I like what Rethink Priorities is doing. They attempt to anonymise parts of applications where possible, and ask people not to submit photos alongside their CVs. I think more could be done to encourage partially blind hiring/funding processes. For example, an employer/funder could write down their first impression of someone's application without seeing any identifying information (e.g. name, age, gender, ethnicity), then record a second impression after seeing it; a rough sketch of that two-pass flow is below. I'm conscious that names carry a lot of weight in EA and that this could add more work to already busy grant-making organisations, but maybe there is a way to do this that would minimise additional work while also helping reduce unconscious bias.
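To make the two-pass idea concrete, here's a minimal sketch in Python. Everything in it - the field names and the redact/two_pass_review helpers - is a hypothetical illustration of the workflow, not anything Rethink Priorities or any actual funder uses:

```python
# Hypothetical sketch of a two-pass "blind first impression" review.
# Field names and helpers are illustrative assumptions, not any
# funder's actual process.

IDENTIFYING_FIELDS = {"name", "age", "gender", "ethnicity", "photo"}

def redact(application: dict) -> dict:
    """Return a copy of the application without identifying fields."""
    return {k: v for k, v in application.items() if k not in IDENTIFYING_FIELDS}

def two_pass_review(application: dict, reviewer) -> dict:
    """Record an impression of the redacted application first,
    then a second impression of the full application."""
    return {
        "blind_impression": reviewer(redact(application)),
        "full_impression": reviewer(application),
    }

if __name__ == "__main__":
    app = {
        "name": "Jane Doe",      # hidden during the first pass
        "age": 29,               # hidden during the first pass
        "cover_letter": "Why I want this role...",
        "work_sample": "Link to a writing sample",
    }
    # Stand-in reviewer: a real reviewer would be a human writing notes.
    notes = two_pass_review(app, lambda a: f"fields seen: {sorted(a)}")
    print(notes["blind_impression"])  # identifying fields absent
    print(notes["full_impression"])   # everything visible
```

The design point is just that the blind impression gets written down before the reviewer ever sees identifying details, so the two impressions can be compared afterwards.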
I would also love to see more writing/information/opinions come from the top-down. For example, people who have a big voice in effective altruism could write about this more often and make suggestions for what organisations and local groups can do. We already see this a bit from CEA, but it would be great to see it from other EA orgs and thought leaders. Sometimes I get a sense that people who are higher-up in the movement don't care about this that much, and I would love to be proven wrong.
Lastly, when it comes to engaging with posts on the forum about this topic, I was disappointed recently to see a post by someone writing about their experiences in the EA NYC community met with a lot of comments disagreeing with how the post was written/how polyamorous men were generally characterised in it. I think we should establish a norm of validating people when they have bad experiences, pointing them to the community health team, and taking steps to do better. There is no "perfect victim" - we need to acknowledge that sometimes people will have bad experiences with the community and will also hold opinions we disagree with. When they bring up their bad experience, it's not the time to say "not all men are like this" or "I disagree with how you went about bringing this up."
Amber Dawn @ 2022-11-15T20:17 (+15)
So mine was one of the top comments disagreeing with that post, and I'm a poly woman, and my interest wasn't to defend predatory poly men but to argue against the idea that my relationship structure, which is consensually, positively practiced by many people the world over, is inherently toxic or embedded in predatoriness. Trauma and upset should be met with sympathy, but it doesn't justify shitting on others' morally-neutral choices, and a community that's hostile to polyamory is hostile to many women and NBs, not just men.
S.E. Montgomery @ 2022-11-15T20:33 (+6)
I'm conflicted here. I completely agree with you that shitting on others' morally-neutral choices is not ideal, but I don't think anyone came away from reading that post thinking that polyamory = bad. I would hope that the people on this forum can engage thoughtfully with the post and decide for themselves what they agree/disagree with.
If someone had a bad experience with a man, and in the process of talking about it said something like "all men suck and are immoral," I just don't think that is the right time or place to get into an argument with them about how they are wrong. It may not even have been coming from a place of "I actually 100% believe this"; it may have just been something thought/written in the heat of the moment while recounting their negative experiences. Again, there's no "perfect victim" who is going to say things in a way you 100% agree with all the time, but IMO the forum to disagree with them does not need to be while they are recounting their negative experience.
Amber Dawn @ 2022-11-15T21:28 (+17)
I guess I don't see why someone wouldn't come away from the post thinking that polyamory = bad.
I think the analogy here is not "all men suck and are immoral" (though I'm not even sure how much I endorse that), but more like: if someone had had bad experiences with men of a certain race, and in talking about it continually mentioned their race. I think people would rightly call that out as racist and not ok - we want to be sympathetic to victims, but if they are saying things that are harmful to others in the course of telling their experience, it's ok to point that out. Now obviously polyamory and race aren't exactly analogous, but I think the relevant similarity is that poly people are a minority that does face some stigma. And from my point of view, in trying to make the community less toxic for women, the poster made it more toxic for me (and other women like me).
MHR @ 2022-11-15T23:19 (+9)
I worry that race is a poor analogy here and may lead to more heat than light being generated in this discussion. Race has little bearing on dynamics around sex and relationships, while the mono/poly distinction does substantially impact them. I totally agree that it's unfair to stigmatize all poly men as abusers, but I think it's fair to consider whether the high rate of poly folks in the EA community creates unique considerations when thinking about community norms.
S.E. Montgomery @ 2022-11-15T22:07 (+8)
In terms of people coming away from the post thinking that polyamory = bad, I guess I have faith in people's ability on this forum to separate a bad experience with a community from an entire community as a whole. (Maybe not everyone holds this same faith.)
The post was written by one person, and it was their experience, but I expect by now most EAs have run into polyamorous people in their lives (especially considering that EAs on average tend to be young, male, non-religious, privileged, and more likely to attend elite universities where polyamory/discussions about polyamory might be more common), and those experiences speak for themselves. For example, I personally have met lots of polyamorous people in my life, and I've seen everything from perfectly healthy, well-functioning relationships to completely toxic relationships (just like monogamous relationships). So when I engaged with the post, I was thinking, "this person had a bad experience with the poly community, and it sounds terrible. I know from my own experiences that polyamorous relationships can be healthy, but unfortunately that's not what this person experienced."
I'm persuaded by your analogy to race, and overall I don't want the EA community to perpetuate harmful stereotypes about any group, including polyamorous people. I think my main conflict here is I also want a world where women feel okay talking about their experiences without holding the added worry that they might not word things in exactly the right way, or that some people might push back against them when they open up (and I think you would probably agree with this).
Amber Dawn @ 2022-11-15T22:32 (+7)
Yeah that's fair, I definitely don't want people to have to watch their wording too closely when sharing their experiences, and I felt complicated about that post and my own replies/reaction to it.
Keerthana Gopalakrishnan @ 2022-11-15T20:42 (+2)
I wrote that post. I just want to clarify that I did not say "all poly men", but "many poly men". The difference is important. As someone who has no theoretical issue with poly practiced consensually, I don't get why Amber Dawn and others feel attacked.
Me: "I was harassed by many poly men. "
Amber: "Stop attacking poly men. Not all poly men."
Read this https://en.wikipedia.org/wiki/NotAllMen
ozymandias @ 2022-11-15T22:43 (+30)
I have been harassed by many monogamous men, but if I posted on the LW forum saying "I was harassed by many monogamous men" I would expect a lot of pushback from people who - very sensibly - would think I was trying to stigmatize monogamy.
There are places for unendorsed venting; the Less Wrong forum is not one of them.
ETA: I'm guessing from comments of yours I read elsewhere that you didn't mean to come off as anti-poly as you did to me and Amber, and I'm sorry if my comment came off hostile. I know I've definitely written things that came off in ways I didn't intend. :)
Amber Dawn @ 2022-11-15T22:42 (+5)
I feel attacked because the post implies that their being poly is relevant to their toxicity, and it's not.
And you're not getting that while you wanted to make the community less toxic for women, I'm a woman, and poly, and the community will become more toxic for me if polyamory is stigmatized. I'm not interested in defending poly men (though of course it is true that many poly men are perfectly fine); on a more basic level, I'm defending myself.
jtm @ 2022-11-15T15:39 (+7)
Thank you for writing this, I think it's very important.
Nathan Young @ 2022-11-16T14:42 (+5)
I am unsure where the line is, but past some point, making someone uncomfortable at a work EA event (I think parties have a different bar) should result in significant action. I have heard 4? stories of attendees, some powerful, making people uncomfortable in ways that seem very easy to avoid, and which ought to have been heavily punished so that those doing it would think twice before doing it again.
Hamish Doodles @ 2022-11-15T12:56 (+5)
"interpersonal harm" link broken
richard_ngo @ 2022-11-15T17:08 (+4)
Fixed, thanks.
Nathan Young @ 2022-11-16T14:39 (+4)
I think we could find ways of listening to the community without requiring it to generate specific proposals - even a regular poll, or regular posts from key decision-makers about their strategies, would provide a focal point for the community to respond to.
Darren_Tindall @ 2022-11-17T14:58 (+2)
I think EA's obsession with maximization is an issue and impacts the inclusivity of our community (similar thinking, similar life experiences, similar socio-economic status, similar gender/sex/etc.).
Holden Karnofsky summarises this really well in a blog post (EA is about maximization, and maximization is perilous):
If you’re maximizing X, you’re asking for trouble by default. You risk breaking/downplaying/shortchanging lots of things that aren’t X, which may be important in ways you’re not seeing. Maximizing X conceptually means putting everything else aside for X — a terrible idea unless you’re really sure you have the right X.
EA is about maximizing how much good we do. What does that mean? None of us really knows. EA is about maximizing a property of the world that we’re conceptually confused about, can’t reliably define or measure, and have massive disagreements about even within EA. By default, that seems like a recipe for trouble.
I think the FTX scandal raises some serious questions about just how well positioned the community is to forecast the future and hence to direct large amounts of money (which can actually have a net negative impact), and about how the EA community can essentially be a big echo chamber.
While the EA community claims to favour wild ideas, I think it is still relatively narrowly focused on a few select cause areas, even though these are highly uncertain and we may be completely unaware of other issues. I know known unknowns have been covered by Toby Ord and others, but I think we need to make sure we continue to seek diverse experiences, skills, thinking, and disciplines.
Mindaugas @ 2022-11-16T10:46 (+1)
6. I've just written about a possible solution in a new post: whistleblowing tiny wrongdoings before they escalate.