EA Forum: content and moderator positions
By Lizka @ 2023-05-18T23:15 (+72)
TL;DR: We’re hiring for Forum moderators — apply by 1 June (it’s the first round of the application, and should take 15-20 minutes). We’re also pre-announcing a full-time Content Specialist[1] position on the Online Team at CEA — you can indicate interest in that [the Content Specialist application round is in progress but the initial application deadline has passed, so we're not accepting new applications].
- ➡️ Apply to be a part-time Forum moderator by 1 June.
- Round 1 of the application should take around 15-20 minutes, and applying earlier is better.
- You can see moderator responsibilities below. This is a remote, part-time, paid position.
- ➡️ Indicate interest in a full-time Content Specialist[1] position on the Online Team at CEA.
- We’ll probably soon be hiring for someone to work with me (Lizka) on content-related tasks on the Forum. If you fill out this form, we will send you an email when the application opens and consider streamlining your application if you seem like a particularly good fit.
- You can see more about the role’s responsibilities below. This is a full-time position, and can be remote or in-person from Oxford/Boston/London/Berkeley.[2]
- ➡️ You can also indicate interest in working as a copy-editor for CEA or in being a Forum Facilitator (both are part-time remote roles).
If you know someone who might be interested, please consider sending this post to them!
Please feel free to get in touch with any questions you might have. You can contact forum@centreforeffectivealtruism.org or forum-moderation@effectivealtruism.org, comment here, or reach out to moderators and members of the Online Team.
An overview of the roles
I’ve shared a lot more information on the moderator role and the full-time content role in this post — here's a summary in table form. (You can submit the first round of the moderator application or indicate interest in the content role without reading the whole post.)
| Title | About the role | Key responsibilities | Stage the application is at |
| --- | --- | --- | --- |
| Moderator | Part-time, remote (average ~3 hours a week but variable), $40/hour | Make the Forum safe, welcoming, and collaborative (e.g. by stopping or preventing aggressive behavior, being clear about moderation decisions), nurture important qualities on the Forum (e.g. by improving the written discussion norms or proactively nudging conversations into better directions), and help the rest of the moderation team. | Round 1 is open (and should take 15-20 minutes): apply by 1 June |
| Content Specialist[1] | Full-time, remote/in-person (Oxford/London/Boston/Berkeley[2]) | Encourage engagement with important and interesting online content (via outreach, newsletters, curation, Forum events, writing, etc.), improve the epistemics, safety, and trust levels on the Forum (e.g. via moderation), and more. | Indication of interest (we'll probably open a full application soon) |
| We’re also excited for indications of interest for the following part-time contractor roles, although we might not end up hiring for these in the very near future: | | | |
| Copy-editor (indication of interest) | Part-time, remote (~4 hours a week average), $30/hour by default | Copy-editing for style, clarity, grammar — and generally sanity-checking content for CEA. Sometimes also things like reformatting, summarizing other content, finding images, and possibly posting on the website or social media. | |
| Forum Facilitator (indication of interest) | Part-time, remote (~3 hours a week average), $30/hour | Approving new users and helping them get oriented, classifying and tagging new content on the Forum, noticing other issues, sometimes helping with the Topics Wiki, and more. This is a crucial part[3] of making the Forum run well. | |
Moderator (part-time, remote, paid)
We’re looking for new moderators!
As a moderator, you’d play a crucial part in shaping the EA Forum by preventing norm-violating behavior, developing and communicating the Forum’s discussion norms, nudging discussions in better directions if they seem at risk of getting unproductive, and more.
➡️ Apply by 1 June (earlier is better). The first part of the application should take 15-20 minutes.
We plan on evaluating round 1 (of 3) of the application on a rolling basis; if you pass, we’ll send you round 2 as soon as we can. We will do our best to get back to everyone within two weeks of when they submit the form.
Basic facts about the moderation role
- We pay[4] $40/hour for active moderation work.
- We’re also testing an on-call “rotation,” and might start paying people $100/week when they’re on call (which should be on a regular cycle).
- It’s a part-time & remote role. Hours vary a fair bit week by week; active moderators currently estimate that they spend an average of about 3 hours a week on moderation.
- If you’re really busy some weeks and expect to have no time those weeks, that’s ok. But if you have less than an hour a week on average, and don’t think you can free up around 4 hours during one week every month, this role is probably not for you. (We might make exceptions for unusual cases.)
- If the moderation team grows, the hours individual moderators spend on moderation each week might diminish (although if people want to do more moderation, I’d be excited for that!).
- If you’re hired, you’d start as soon as you can. We’re especially interested in adding people who can join in the next couple of months.
The current active moderators are Lizka, JP, and Lorenzo, and Felix has recently started. Some others are on the team as “advisors”; they provide second opinions on some decisions, and help when there’s more to respond to. We’re not hiring advisors.
Why moderation?
In brief:
- Moderation is important
- The EA Forum is a key part of the EA network.
- Moderators make the Forum safer, epistemically healthier, and more productive.
- Extra moderators can add a lot of value. We’re quite capacity-constrained, which means that we can’t do a lot of what we think would be useful. Even if we were less constrained, I’d be really excited to add moderators for a broader diversity of perspectives and more capacity in times of crisis, or when there’s a sudden cascade of moderation incidents.
- In terms of what it’s like for the moderators, I shared some thoughts and asked for a testimonial from a different moderator about what he likes and doesn’t like about being a moderator — you can see it below.
In greater detail — on the importance of moderation:
The Forum is the heart (or at least one of the hearts) of the online EA network,[5] and a key piece of community infrastructure. If the Forum can be an excellent space for serious and collaborative discussions that help improve the world, I think we’ll make significantly more progress on the problems we’re facing and do more good.
Moderation is about making sure that the Forum is useful and healthy. Moderators help discussions on the Forum be collaborative and truth-seeking, generally uphold epistemic norms, make the space feel safe and welcoming, and guard against other dangers faced by online discussion spaces (I think unmoderated forums tend to get overrun by trolls or start to feel more like battlegrounds for discussions that have winners and losers). I want Forum users to be able to trust that there’ll be a high baseline of civility and generosity in the interpretation of their words (a sense of cooperative spirit), and I don’t think we can do this without moderators.
We have ~4 active moderators right now, but new moderators could add a lot. We’re still quite strained on capacity, which means we don’t have time to do a lot of what we think would be valuable to do. And moderation incidents tend to come in surges (when it rains, it pours). Extra moderators would be extremely helpful in those surges. New moderators can also add new perspectives; we’re building out our policies, norms, and processes, and more voices there would be helpful.
On what it’s like to be a moderator:
Moderation can be pretty stressful and thankless (although not always!), but we have some big things going for us: people who use the EA Forum are often incredibly well-meaning and helpful. I think our baseline of civility and kindness is really high. And when incidents happen (commenters get into a rough disagreement, etc.), the people involved are often willing to change their minds, accept our requests, or work with us to find a solution to whatever problem is showing up.[6] I don’t know what “the average moderator” for an online space has to deal with, but I’d guess that it’s worse in many ways. A related thing I’m grateful for is that the moderators tend to be wonderful, and it’s just lovely to work with caring and smart people.[7]
I also asked @Lorenzo Buonanno to share his thoughts on being a moderator — what he likes and doesn’t like. (He’s probably biased, etc., but I figured that adding a perspective from someone else, especially someone not employed by CEA, would be better than only including mine.) Lorenzo wrote:
Things I like:
After joining the mod team last November (5 days before the FTX collapse), I was really surprised by how thoughtful the moderation team is, and how thorough they were in their decisions. In most other online forums, I often get the feeling that moderators just ban people and topics they don’t like without much thought. Here I saw really serious effort to be impartial, collaborative, and consider moderation decisions from many perspectives. I see many people on the moderation team as an example of what EA should aspire to be.
I think the results really show. Even on the most controversial and inflammatory topics, the discussion has been surprisingly civil.[8] In a typical week, the discussion standards on this forum are extremely high compared to any other forum I know of.[9]
I feel really proud that I’m a small part in helping a little to maintain this Forum, where I regularly see incredibly valuable discussions[10] on how to help others the most, and that I can take some load off the Forum team so they can focus more on other projects.
Things I don’t like:
We once got a report for a comment from a user whose writing I really like, and although I really agreed with the content of that comment and initially upvoted it, it was clear on a closer read that it was violating forum norms and needlessly inflammatory. Having to write a public warning in that situation (knowing it would be downvoted) was tough.

It’s also sometimes hard to find the right balance between having a higher bar for moderating criticism or things I don’t agree with, and limiting the most norm-violating or fight-seeking content.[11]
Moderator responsibilities (and some things that aren’t a moderator’s responsibility)
As a moderator, your goal would be to make the EA Forum a great space for collaborative discussions that will help us do good. You would:
- Make the Forum feel safe, welcoming, and collaborative
- You might do this by handling moderation incidents (preventing or stopping aggressive or otherwise norm-breaking behavior), by being transparent and consistent with moderation decisions (so people are not stressed about the possibility of over-moderation, or unpredictable moderation), or by setting a great example and directly helping other Forum users.
- Nurture important qualities or norms on the Forum (like honesty, civility, generosity in interpretation, etc.)
- You might do this by communicating Forum discussion norms and proactively nudging conversations into more collaborative directions if they seem likely to get iffy.
- Help the rest of the moderation team improve the Forum
- You might suggest improvements to our policies, internal systems, and the Guide to Norms. And you might help other moderators respond to moderation incidents they’re dealing with by flagging new considerations, disagreeing with their proposed approach, etc.
In practice, active moderators work on:
- Responding to norm-breaking behavior (warning users, sometimes banning them for some time, trying to sort out disagreements, etc.)
- A lot of our work is currently reactive. Something bad or iffy happens — someone starts insulting people, some personal information is revealed, someone seems to be getting aggressive, etc. — and we need to respond. This usually gets flagged to us, at which point a moderator either responds directly, brings this up in our Slack for discussion (often they might propose a course of action, and the discussion will just be a sanity check), or asks someone else to take it on.
- Providing input on moderation decisions others are handling
- When situations are higher stakes (e.g. if we’re considering banning someone) or unusual/new (in which case we might be developing a new policy), we’ll often want multiple people to help with a decision.
- Improving our moderation policies, internal systems, and the Forum’s discussion norms (or how they are communicated)
- Some of the incidents that we encounter are somewhat formulaic; we’ve done something similar before and know what to do. But we also often encounter new situations, for which we prefer to develop rough policies — or approaches that can generalize to parallel situations — if we can (as opposed to treating them as one-offs). (E.g.) We could also do more to streamline our internal processes, and I want us to improve the Guide to Norms (both in terms of content and how we communicate and encourage people to follow the norms).
- Coordinating with the rest of the moderation team to make sure that we don’t drop some tasks accidentally — the “moderation coordination rotation”
- We’re trialing a system in which active moderators take turns coordinating the team for a week. This doesn’t mean that they have to take care of all the incidents that pop up during that week, but they do have to make sure that everything gets resolved one way or another. (So they might bring something up in Slack and make sure that it’s being owned by a different moderator. Or we’ll decide to not intervene; the goal is not to do something in response to every flag, but rather to avoid dropping incidents we should have responded to.)
- We’re not totally sure if we’re keeping this system — someone might take on coordination all the time — but for now, this is probably a key responsibility. If the system changes, some form of this responsibility will probably stay.
- Proactively nudging discussions into better directions if they seem likely to get iffy, ideally without resorting to actions that only moderators can take (like banning users)
- I would like us to do more of this.
Moderators generally don’t work on approving new users, tagging posts, or guarding against spam. Forum Facilitators are in charge of that.
Skills & fit (moderators)
You don’t need special background or any credentials to be a moderator. You also don’t need to have an extensive presence on the Forum, or a lot of experience in EA.
You do need some of the following key skills and qualities, but if you’re interested and are doubting whether to apply, please just apply. People often underestimate themselves, and we’d much rather have too many applications than miss out on excellent moderators because of this. I’m listing these qualities to give a sense of what we’re looking for, though, and to help people decide whether applying is worth their time.
- Written communication
- You don’t need to be a native English speaker or to have excellent writing skills. But you do need to be able to communicate clearly in private conversations and in public.
- Good judgement, an open mind, and empathy
- There are a lot of cases where we can’t just follow policies or what we did in previous cases, and good judgement is important for these. Moderators also need to be willing and able to take on different points of view and understand how a situation probably feels from the perspective of different people, whether they agree with them or not.
- Ability to sort out the truth when things are pretty confusing, and to deal with uncertainty & disagreement
- We sometimes deal with pretty confusing texts, where it’s important and difficult to understand if someone is telling the truth, trying to manipulate readers, etc.
- Independence & the ability to execute without forgetting important details
- Moderators are pretty independent. It’s not easy to neatly delegate moderation-related tasks and it’s important to be able to rely on people doing what they say they’ll do, so this is a key skill.
- Basic understanding of important EA topics
- It’s helpful for moderators to have context on the topics discussed on the Forum, but we don’t need a lot of “EA expertise.”
- Willingness to be wrong in public (and to admit it)
- We make mistakes. We don’t want to hide them, avoid noticing them, or let the fear of making mistakes prevent us from doing useful work. (It’s also important to be able to endorse a choice we’ve made even if there’s public disagreement with it.)
- Willingness and ability to engage with potentially triggering topics and content
The moderator application process
There will be three parts to the application:
- Round 1: A form, which consists of basic information about you and two written questions. This should take 15-20 minutes.
- We’ll try to get back to you as soon as we can, with a target of no more than two weeks’ delay (this might change if we get more applications than we expect).
- Round 2: A follow-up questionnaire, which will focus on some fake moderation incidents — it’ll ask how you might respond to them. This should take at most an hour (I don’t know the exact time, as I’m still finalizing this stage).
- We should get back to you within two weeks.
- Round 3 (paid): This should consist of a couple of timed mock tasks and a brief call with me or another moderator.
We’re hoping to bring on a few new moderators. We'll try to respond quickly after each round of the application process.
“Content Specialist” (pre-announced full-time online content role)
The role title might change a bit.
We’ll probably soon be looking for a full-time “Content Specialist” for the Online Team (remote or in Oxford/Boston/London/Berkeley[2]). This person will work with me (Lizka) and take on many of my current responsibilities (I head the moderation team, seek out and curate content on the Forum, run/write the monthly EA Newsletter and the weekly Forum Digest, and work on other projects). The position is very flexible.
(If you indicate interest, we will email you when the position opens up, and we might streamline the process for some people who seem like they might be a very good fit.)
Why work on content (on the Online Team)?
I’m really excited that we’re getting ready to hire for this role, as I think it could have a very big impact.
In brief:
- The Centre for Effective Altruism plays a critical role in organizing and supporting the EA network.
- The EA Forum is a key part of that (as described in greater detail above), as are other projects this role would probably be involved in (like the EA Newsletter, which goes out to around 59,000 subscribers every month and has a ~40% open rate, and effectivealtruism.org).
- More “content” capacity (thought, skill, and time devoted to improving online content and engagement) would allow the Forum team to expand and improve our projects. There’s a lot that we could do but can’t get to right now, and a lot that we’re just not thinking of trying out. (And this is a real loss; new ideas are being shared on the Forum, people are coordinating, learning, connecting, etc. — but a lot of things that could happen don’t.)
- This is where you come in.
Responsibilities (Content Specialist)
You would have some core responsibilities as a Content Specialist, but as mentioned, the position is very flexible; we expect to shape the role around the person we hire.
Your high-level goals would be to:
- Encourage engagement with the most valuable content and resources
- Make sure that the Forum is as useful as possible for helping people to do more good
Here are some more specific responsibilities you would take on, along with example projects (in no particular order):
- Encourage engagement with important and interesting content
- You could feature and organize content or conversations on the Forum via curation, the weekly Forum Digest (or possible topic-specific newsletters we could try), helping the rest of the Online Team shape the Forum and build new features, etc. (We’ve occasionally considered setting up vetting systems for some content or spaces — there are a lot of possibilities.) Alternatively, you might actively seek out excellent content by doing direct outreach to authors, hosting writing workshops, cross-posting pieces you find outside of the Forum, setting up dialogues or debates on important topics, running prizes or themed Forum events, etc. Or you might write your own content as necessary, build out and improve the Topics Wiki, and create better introductory materials on different topics (like intro sequences).
- Improve the epistemics, safety, and trust levels on the Forum
- To do this, you might work on moderation (potentially as head moderator) for the EA Forum, where you would develop and communicate moderation policies, help people have productive conversations, improve systems, hire moderators as necessary, and more.
- Help the Online Team (or the EA network, or CEA) prepare for future challenges and opportunities
- For instance, you might actively develop models of different groups who use the Forum (via user interviews, active participation on the Forum, etc.) to better help them coordinate (and to help the rest of the Online Team understand what needs these groups might have, or how we can organize the space for them more helpfully). You might also make internal systems more efficient (or automated), keep informed on upcoming developments or maintain situational awareness in relevant areas, communicate with Forum users, etc.
In the end, what you work on would significantly depend on your interests, skills, and what you/we identify as core needs and bottlenecks.
Skills & fit (Content Specialist)
You don’t need to have years of experience with online forums, or with EA to be a good fit for this role. But to be a good fit for the role, it’s probably the case that:
- You’re dependable and can get things done pretty independently
- You can write clearly and well
- You have good judgement and can deal with uncertainty and disagreement
- You have a baseline of knowledge about core EA topics, and the ability to learn more when needed (this will likely involve being curious and having a growth mindset)
- You’re good at figuring out the truth when things are pretty confusing and have a strong sense for what good content looks like
If you’re not sure, please err on the side of indicating interest!
Other information
CEA is an equal opportunity employer and values diversity at our organisation. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, or disability status. We are happy to make any reasonable accommodations necessary to welcome all to our workplace. Please contact us to discuss adjustments to the application process.
- ^
The exact title of the role might change a bit.
- ^
There’s an office in Oxford that you could work from, and a coworking space in Boston with a couple of people from the Online Team. People from the Online Team and CEA are also working from other coworking spaces in London and Berkeley.
- ^
We “banned and purged” roughly 2550 spam users from the Forum in 2022. (It was around 806 users in 2021.) These are accounts that join with names that range from “Bob” to “TAXI-Vienna-CALL-7777777777” and try to post spam content on the Forum.
Spam users are becoming sneakier these days. I’m incredibly grateful to Facilitators for working on this (and on new content classification, etc.).
- ^
Some moderators are volunteering, but we default to paying people.
- ^
About 4500 users posted something or commented in the past year, and by some analytic measures, we had almost a million users last year. Most people in the EA network have heard of the Forum, have read at least one post on it, etc. A lot of people get actively involved in EA-motivated projects via the Forum.
- ^
E.g. we’ve asked users things like: “Hey, this thread seems unproductive. Could you take a break and return?” And they agreed with us and did it!
- ^
I think we also have other things! Like the fact that there are developers and product managers who are willing to build us tools that make our jobs easier, and the fact that some people have been working on this for years and are happy to give us advice. But I’m skipping them here for now.
- ^
with very few exceptions out of hundreds of posts
- ^
at one point two weeks ago I thought that the reporting functionality must have been broken, given how few reports I was seeing, but we checked and it’s actually just all users being great
- ^
like this great back and forth on the evidence on StrongMinds https://forum.effectivealtruism.org/posts/HqEmL7XAuuD5Pc4eg/evaluating-strongminds-how-strong-is-the-evidence?commentId=byEcGGBy6zjsQ6Zky#byEcGGBy6zjsQ6Zky
- ^
This was a particularly visible mistake I made on this https://forum.effectivealtruism.org/posts/AAZqD2pvydH7Jmaek/lorenzo-buonanno-s-shortform?commentId=CDPS2JQKziWsWg73D
Will Aldred @ 2023-05-19T22:09 (+17)
We “banned and purged” roughly 2550 spam users from the Forum in 2022. [...] These are accounts that [...] try to post spam content on the Forum.
Yikes! I'm all the more appreciative of the behind-the-scenes efforts being made by the mod team.
NunoSempere @ 2023-05-19T03:22 (+12)
Personally, I would give more weight to epistemics over making people feel welcome and safe.
Zach Stein-Perlman @ 2023-05-19T04:34 (+15)
I don't think they trade off much, at least in moderation decisions.
If there's something you think moderators should be doing to promote epistemics, I'd be interested to hear.
Jason @ 2023-05-19T14:55 (+10)
Could you say a little more about how you think the moderators can / should be improving epistemics in their official capacities, either in general or by trading off differently with making people feel welcome and safe? In particular, the mods' hard powers -- curating, deleting, banning, etc. -- are pretty blunt tools, and must be employed carefully to guard against the risk (or perception) of decisions being made too much on the moderators' own viewpoints.
britomart @ 2023-05-19T08:27 (+10)
What is the salary range for the Content Specialist position?
EA Opportunity Board @ 2023-05-19T10:29 (+9)
These seem like great opportunities. They are now live on the EA Opportunity Board!
BrownHairedEevee @ 2023-05-20T11:16 (+8)
Great post, thanks for sharing these positions! I'm excited to apply.
What information should go on your resume for these roles, particularly the moderator role? Since my day job is software engineering, most of my experience related to content moderation is from stuff I've done on the side or in school.
Ruby @ 2023-05-19T01:15 (+7)
Quick thought after skimming, so forgive me if this was already addressed. Why is the moderator position for ~3 hours? Why not get full-time people (or at least half-time), or go for a 3-hour minimum? Mostly I expect fewer people spending more time doing the task will be better than more people doing it less.
Jason @ 2023-05-19T01:55 (+8)
Although they didn't state exact numbers, it sounds like there may be ~ .5 FTE of moderator capacity right now (~ 4 mods averaging 3 hours a week, plus advisors) and they are looking to hire another fraction of an FTE worth of capacity. Expending all the available budget on 1 or 2 mods with more hours would likely make it more difficult to achieve a "broader diversity of perspectives and more capacity in times of crisis, or when there’s a sudden cascade of moderation incidents."
Ruby @ 2023-05-23T02:04 (+2)
I can follow that reasoning.
I think what you get with fewer, more dedicated people is the opportunity to build up a deep moderation philosophy and experience handling tricky cases. (Even after moderating for a really long time, I still find myself building those and benefitting from stronger investment.)
BrownHairedEevee @ 2023-05-20T11:04 (+4)
I noticed that the moderation on-call rotation would pay about $100/week. Since moderators are expected to work about 3 hrs/week and get paid $40/hr, the on-call pay is equivalent to about 2.5 hours per week, which is less than the normal pay. Is the on-call pay on top of the normal hourly pay?
Lizka @ 2023-06-16T13:16 (+2)
Quick note: we've opened the Content Specialist application (deadline 19 July). I'll be posting more (and more visibly) about it later.