Sam's Hot(?) Takes on EA Community Building
By Sam Smith 🔸 @ 2025-09-09T09:33 (+85)
Executive Summary
I think EA community building is too focused on frameworks and optimising for impact, when what actually works is much more basic: genuine relationships and starting where people already are. We should treat the community not just as instrumental for impact, but as something valuable in and of itself.
Learn from groups that actually work. Every other society at freshers' fair is trying to build student communities too, and many of them do it better than us. We're not special just because we care about impact, and our focus on this probably makes us worse than the groups that solely focus on community.
Start with altruism, not EA concepts. Instead of "EA is a way of thinking about how to do the most good," try "Want to figure out how you can do more good for the world? Let's explore that together." People who genuinely care about helping others are more likely to stick around and contribute than those who just want frameworks. This approach welcomes activists and philosophers who get put off by the "EA Handbook" but would add valuable diversity to conversations.
Build actual social community and guide like a D&D dungeon master. Do fun stuff together - board games, films, activities beyond discussing readings. Think of community building like being a dungeon master: you set the stage for exploration and respond to input without forcing predetermined outcomes. You can steer toward relevant questions without "railroading" people toward becoming "star-spangled EAs" who buy into everything. Real exploration means people might conclude EA isn't for them, and that should be okay.
Consider alternatives to traditional fellowships. Intro fellowships often create power dynamics that make genuine friendship difficult and contribute to a problematic deference culture where young EAs work on things they don't believe in. It's worth exploring whether weekly socials, casual conversations, and approaches that treat people as individuals with personal fit work better for your context.
Introduction
During UKGOR last weekend, I voiced many of the takes I discuss below and was pretty surprised by the reaction. Lots of the organisers seemed like they'd never considered the perspectives I lay out, and really resonated with the more person-centred approach I suggest. I felt compelled to write this to present some perspectives I don't see talked about very much that I suspect community builders may benefit from at least considering, even if they disagree.
These takes are mainly based on intuition I’ve picked up from communities outside of EA and personal reactions to what I have seen in EA groups.
Thanks to Kashvi Mulchandani, Jian-Xin, Chris Clay, Ada Ge, Anirdesh Shankar, Lauren Brickell, and Mina Angus for their feedback and conversations that helped develop this post!
On Learning from Others
Take notes primarily from "normal" societies, not just EA ones. At Freshers' Fair, every society is trying to form a social group around a central issue. A few might have casual reading groups, others have activity-focused meetups, but they're all building communities of students in social groups. EA has only been around for a decade; we don't have more experience running successful societies than literally every other society.
I've seen this work well in queer/trans groups I've been part of (like T-Climbing) and many church communities. People want to feel welcomed into a community and be comfortable existing in a space without a strict rulebook on how to be. Making engagement with a set of principles mandatory for belonging enforces a rulebook rather than welcoming people in and having them explore things of their own volition.
Context doesn't guarantee good community building. Having deep EA knowledge is less important than instinct and soft skills. The best community builders combine genuine care for people with practical social intelligence, not just familiarity with EA concepts. To get people motivated to think about this stuff for themselves, and to support them in that, it's super important that they actually want to be there and talk to you. For me, it feels super obvious when someone is having a conversation because it's career- or decision-relevant rather than just because they want to talk to me. Those conversations are fine and have their place, but making a society a constant EAG means people can never actually relax and really bond with the people who are meant to form their community.
On Starting with Altruism, Not Frameworks
Focus on the A first, not just the E. Compare these freshers' fair pitches:
EA-first approach: "EA is a way of thinking about how to do the most good. Through our fellowships, you'll learn how you can 10x your impact by taking the best steps."
Altruism-first approach: "Want to figure out how to do more good for the world? We're a group of students exploring that question together - no prior experience or specific beliefs required, just genuine interest in making a positive difference." You can then follow up with specific things the society offers, like events, community, access to the wider EA network, etc.
I'm much more excited about people who are altruistically inclined first, since their involvement is likely more sustainable. They're motivated by something more stable than rationalism/problem-solving. Many grassroots activists and philosophers interested in social reform would bring valuable diversity and counterfactual impact to EA discussions, but the EA-first approach shuts them out. This also helps the community and movement as a whole feel more focused and cohesive.
The latter is a welcome into a community, not just a synopsis of a lecture course they can sign up to. In my experience, no one really makes friends in lectures, and there’s a reason - that isn’t what a lecture is! People are there for a specific purpose, which makes sense. But if we want to build real community, people should be there because they want to be in community, not solely because they want to learn something.
On Building Genuine Community
Building social groups requires personal relationships and interaction networks, not just shared cause prioritisation. People stick around because they like the people and feel connected, not because they've studied a reading list.
Successful communities I've observed have a shared sense of identity – something often missing from EA groups, since EA feels more instrumental than a core part of who you are. They also often have a focus: everyone engages in an activity they enjoy together, and in doing so they help each other learn and grow, form more positive associations with the group, and want to stick around. The social element makes the event feel categorically different from doing it alone or with some random group. This is what I think EA groups need to strive for, and I don't think fellowships are particularly well-suited to it. Having accountability just because people are there isn't the same as having a community you really care about. A study group is different from a group of friends; the two aren't mutually exclusive, but on its own, a study group is not the sort of community I think we should aim for.
Social stickiness is real and underrated. Connect people to you, and more importantly, get them to connect to each other. Plan social activities with vaguely impact-focused framing – playing board games (the trolley-problem card game is a great example), watching Black Mirror episodes and discussing them, doing lightning talks and "hot seat" sessions where you get to know people as people, not just as cause-aligned agents of impact. The real impact conversations happen individually, but the social activities build the relationships that make those conversations possible.
On Facilitation and Relationships
1-1s often feel contractual rather than social, which is off-putting. People shouldn't need to prep or have specific outcomes to talk to you. Make conversations no-pressure and note any dynamics explicitly to avoid implicit discomfort.
Have your own thoughts, don't just hand out readings and links. Resources are helpful, but talking to someone who just forwards links isn't the same as having a friend who can engage with ideas.
On Strategy and Learning
Community builders should reflect on what is and isn't working, but shouldn't blindly adopt other groups' strategies. Too much focus on prescribed default advice (do freshers' fair, give intro talk, run fellowship, figure it out) treats people as "potential impact creators" rather than actual humans. Focus on your specific context and real relationships.
This could look like regularly talking to the members of your group and asking them what they’d like to see and what they’d enjoy. Giving them input into the space you are creating with them helps guarantee people enjoy the stuff you’re doing and also helps reduce the problematic power dynamics I discuss later.
When getting advice from other community builders, remember they are talking about successful experiments that worked for their group; these are not guaranteed to work for yours, as it’s a completely different set of people with different preferences and thoughts!
CB shouldn't be a stepping stone to "better" careers – it's vital in itself. Treating community building as preparation for other work undervalues both the role and the people doing it well. Organising a uni group can be a really important datapoint for how well-suited you could be to community building beyond that group. If you really enjoy it and have good intuitions, you could have a lot of impact doing a role with similar skills permanently. Ignoring this, group organisers may walk out of uni completely neglecting the very important (and often time-intensive) personal fit test they've done, feeling a step behind group members who focused on something like research through research fellowships and projects.
On Deference, Personal Fit, and Facilitation
Talking to young EAs, I often get the sense people are doing things because they've deferred to the forum, 80K or their group organiser. I know a lot of people who don't "buy" AI Safety as a cause area but feel obliged to work on it because they've absorbed the message that it's objectively the most important thing. This leads to people spending their life's work on something they don't actually believe in and can't explain without referencing other people's work. We need WAY more emphasis on personal fit and treating people like real individuals rather than career profiles!
For a concrete hypothetical, imagine an organiser talking to someone new to AI Safety and trying to address their doubts. As the person raises doubts and uncertainties, the organiser gives links to posts and podcasts that address them, until the person raises a novel point the organiser doesn't know how to tackle. At this point, the organiser doesn't really know where to go: they feel awkward, half-accept the point, and move on, frustrated. The person stops engaging with the society, and impact work becomes an afterthought.
For a different approach, think of community building like being a D&D dungeon master: you're the worldbuilder who sets the stage for exploration and responds to input without forcing the story too much. You can steer toward relevant questions without "railroading" people. In EA terms, railroading looks like giving people an illusion of exploration while ultimately pushing them to become "star-spangled EAs" who buy into everything. Real exploration means people might conclude common EA principles aren't for them, or that different cause areas matter most - and that should be okay and encouraged.
That person new to AI Safety might decide they're not convinced by the field as they work through their uncertainties with their "dungeon master" organiser. Because their questioning has been supported, they decide to continue engaging in their uni EA community and discover a different cause area that they really buy into and do some really cool work in it because it's a better personal fit.
Good community builders shouldn't be certain about everything (or even much) but should actively question. They should be comfortable with uncertainty rather than presenting an "EA canon," even with criticism prefaced. I get the sense from talking to some community builders that they appear more like walking cross-link docs than people with actual opinions and thoughts. That's not what connection looks like - to expect people to be vulnerable enough to question and reflect on these topics, you need to model that by doing exactly the same.
On Power Dynamics and Hierarchy
I think we focus too much on not looking like a cult, rather than focusing on doing less culty things. The way to not be cult-like is to do fewer cult-like things and more things normal communities do; not to constantly reassure people we're not a cult. It’s good to be aware of optics, but focusing on the optics of a thing rather than the thing itself loses sight of the actual community we’re supposed to be focusing on.
Churches often have concerning power dynamics where leaders are seen as closer to divine truth. EA risks similar dynamics through seniority and "EA nepotism" – creating a sense that you need to appease leaders or demonstrate you're "good enough" to learn "our way" or land a job. This is potentially worse than churches because of the selectivity involved.
Combating this by pointing to criticisms of EA can feel like virtue-signalling epistemic humility: we assume we're ultimately right but are happy to humour disagreements. In the same way a church might occasionally indulge a conversation with a doubting member of its congregation, there is an explicit focus on pulling them "back into the flock" – an underlying prior that they've just lost their way – rather than actually encouraging them to follow that doubt and see where it leads. To combat this, see the point in the last section about organisers actively questioning things themselves and being comfortable with uncertainty.
On Programming Alternatives
IF (Intro Fellowship) is overrated and problematic. Fellowships often have high attrition rates and create awkward social dynamics that enforce power imbalances, making genuine friendships difficult. They're also a poor proxy for engagement – passive reading and discussion don't equal value alignment or sustainable motivation to do impact.
Whilst I think your group’s context and the organiser/committee’s personal fit should ultimately decide what you do instead, some options could include:
- The society-first model that focuses on fun socials, which build connection that facilitates deeper community and conversation
- Less formal discussions about the content
- Something more project-based that’s about learning by doing (Non-Trivial does this very well!)
- Focusing on higher-context people having more in-depth conversations/presentations about their interests
It's very hard to serve multiple contexts at once. Either focus on getting people to start thinking about impact, or run higher-context programming for people already engaged. Trying to serve everyone at the same time usually serves no one well.
Instead of prescribed fellowship approaches, focus on regular (weekly) socials, individual conversations about people's interests and thoughts (no-pressure, free-flowing, in nice public spaces), and helping people figure out how to use their existing skills and passions for positive change.
Key Takeaways for Community Builders
I'd encourage community builders to audit their own group dynamics - are people engaging because they want to be there, or because they feel they should? Are your most involved members becoming friends, or just study buddies? Small shifts toward treating EA as a community worth belonging to, rather than just ideas worth learning, might make a bigger difference than you think.
If you're organising a group and want to try some of these approaches:
- Start with altruism in your outreach. Replace framework-heavy pitches with "Want to figure out how to do more good? Let's explore that together." Focus on people's existing motivation to help others rather than analytical approaches.
- Replace formal "1-1s" as the default conversation with casual conversations. Meet in nice public spaces, make it no-pressure, and explicitly acknowledge any power dynamics to avoid implicit awkwardness. Sometimes mentorship and specific feedback, like a traditional 1-1 or conference conversation, are super useful, and there is a place for them. However, this should be an explicit, specifically communicated dynamic, not the norm.
- Learn from other successful societies at your university. Study what hobby groups, religious communities, and activity-focused clubs do well - they've been building communities far longer than EA has existed.
- Question whether you need a traditional fellowship structure. High attrition rates and power imbalances may outweigh the benefits. Consider whether weekly varied socials plus individual conversations serve your goals better: board games, movie discussions, lightning talks, "hot seat" sessions. Let the deep impact conversations happen in individual chats once people trust you.
If you go for this and don't yet have a stable group you're catering to, make sure any event you run could be explained to someone outside the society without getting an incredulous stare. Otherwise, it's probably a bit too weird to attract new members or engage people who aren't already really bought in.
- On Practical Elements: Do buy food and snacks, do fun things, make jokes, laugh! Basic social hospitality matters enormously for vibe setting and also, importantly, your enjoyment!
- Not everything needs to be EA-focused. Some of the best community building happens during non-EA activities that help people get to know each other as humans.
- Be aware of the world, not just the EA Forum. Ground conversations in a broader context, not just internal EA discussions.
- Be mindful of selectivity and exclusion. EA's focus on elite institutions and credentials can drive away talented, value-aligned people from diverse backgrounds who could bring valuable perspectives and solutions.
- Build genuine friendships first. Progress on impact happens when people feel connected to you and each other as humans, not when they've completed readings.
Charlotte Darnell @ 2025-09-09T10:52 (+15)
"Real exploration means people might conclude common EA principles aren't for them, or that different cause areas matter most - and that should be okay and encouraged."
- I think this is important to say, thanks for saying it!
Brad West🔸 @ 2025-09-09T16:28 (+8)
Thanks for sharing your piece, Sam. There's a critical insight here that impact-maximizers might miss if they pattern-match "treat community as intrinsically valuable" to "prioritize feelings over outcomes." The actual claim is deeply pragmatic: authentic relationships are instrumentally superior for maximizing long-run expected impact.
Our current model optimizes for legible short-term proxies (fellowship completions, cause-area conversions) that fit neatly in grant reports but poorly predict what matters: who's still contributing meaningfully in 5-10 years, who's thinking independently rather than deferring, and who's building things that wouldn't exist otherwise. In expected-value terms, 20 people with genuine conviction working for a decade dominate 100 people weakly deferring for two years—especially when those 20 bring epistemic diversity and new ideas for impact rather than reproducing consensus.
If we're serious about maximizing impact, we should at least question whether our current metrics actually maximize it. What would it look like to measure success differently—tracking 36-month retention, independent project initiation, or comfort disagreeing with group consensus? If authentic community building could produce superior long-term outcomes (as history and successful movements suggest), then resisting it isn't principled; it's optimizing the wrong proxies. I'm curious what others think: are we measuring the right things, or are we leaving impact on the table?
Sam Smith 🔸 @ 2025-09-09T20:07 (+5)
Thanks for this Brad, I totally agree!
The thing that made utilitarian ethics click for me was the idea that often the problems stem from the proxies for impact rather than the impact itself and that nth order indirect effects of your direct actions should still be factored into the moral calculus of your decisions.
I also think a portfolio of proxies is useful in the space because of the moral uncertainty about what the most important things to track are. Even if fellowship completions were a strong metric (which I am very sceptical of), I think at least trying my social-first approach could give lots of valuable insights!
Aaron Gertler 🔸 @ 2025-09-09T19:39 (+3)
I didn't read the full post, but the gist of it aligns with what I did as an organizer (started Yale EA):
- I ran another organization (and joined many others) before founding YEA, which gave me some experience with logistics/keeping a group on task. I also talked to a few other new organizers to share ideas and observations (though we were all newbies at that point).
- The most important aspect of the group was that we had fun and became friends. The times I remember aren't the quixotic "EA" activities (which were pretty shambolic, since there were no intro courses back then), but the lunch conversations and movie nights and hard personal things we dealt with together. College is a very crowded time, but people returned to meetings and went to parties because they liked the nerdy, good-natured group we had formed.
- I never even met the person who became the leader for several years after I left — turns out (IIRC) that they attended one party we held and liked the atmosphere so much that they decided to join the year after. If we hadn't been hosting parties with very light theming, the group may not have lasted through the 2010s.
Sam Smith 🔸 @ 2025-09-09T20:11 (+1)
Ah that's awesome Aaron! I do feel like these social approaches can lead to really cool unexpected outcomes, which I think is harder to achieve with things like fellowships and a more "teachy" approach.
I also think this adds to my feeling that the best community builders are... the best at community building, not just the most knowledgeable. Good generalist abilities and intuition, prior experience, and strong social skills (approachability, likability, openness, etc.) all feel more important to me than a mental index of the forum or being really bought into the ideas (those things can be helpful, but don't feel nearly as important).
Kestrel🔸 @ 2025-09-09T13:43 (+3)
I agree with pretty much everything you've said, with the caveat that EA groups should still make an intro course offer (alongside social events - you're bang on about social events). I just don't think we have the right course offer.
Rather than trying to persuade people about cause areas you gotta run teaching which upskills people's effectiveness and their altruism. Then you just kinda set them free to do whatever, and some of them will figure the whole cause thing out for themselves - and these are the kinds of people you want as "highly-engaged EAs", not the kinds of people who yield to cause area marketing pressure.
This might look something like discussions on:
- Importance, Tractability, Neglectedness as an evaluation framework
- Marginals, counterfactuals and absorbency
- How to take risks, reflect and learn from experiences, and get community support
- How to avoid being taken in by marketing, digital content, the outrage machine in your pocket, etc.
- Navigating and resisting reputation, prestige, and power dynamics in life decisions
- Leadership and social skills development
- Balancing relational altruism with an effectiveness mindset
- Practical community skill-building in plant-based cooking and first-aid
You also need a very heavy focus on the people who are EAing, including (accurately) portraying them as flawed humans struggling with life decisions just like you.
Another note: Most EAs started altruistic and became more effective, so it's worth structuring your pipeline around this direction. But the ones who went the other way are disproportionately doing the hardest jobs around the place, so don't discount them.
Sam Smith 🔸 @ 2025-09-09T20:01 (+1)
I think I agree with this - I'm very pro agency multiplying (this is what I think, e.g., Non-Trivial and Leaf's fellows programmes do particularly well). With EA Bristol, which I run, the plan is to have the social group dynamics be the basis for these discussions, but I absolutely want to target stuff around soft-skill development through project sprints, discussions sparked by fun stuff (e.g. using media to kickstart conversations around ethics), etc. Rather than explicitly having a session where we "explore ITN", it would be more conversational in nature, with the emphasis on people exploring and discussing these ideas, and on learning by doing (e.g. projects where the ideation is supported by frameworks like ITN), rather than more teachy sessions that create some of the power imbalances I discuss in the post.
I also agree we shouldn't discount the "Effective-first" EAs. However, I think by their nature they are more likely to carve out a path for themselves and access upskilling resources, independent projects, and fellowships with less reliance on a central community. We could of course still benefit them with community, but it's harder to cater to both camps at the same time, and counterfactually I suspect the altruists benefit more.
Of course, feel free to push back on any of this or correct me if I misunderstood your point. Thanks for the comment! :)
Kestrel🔸 @ 2025-09-10T08:58 (+1)
Yup: attract on altruism, upskill on effectiveness. However I have sometimes noticed that altruist-first groups tend to have social ingroup/outgroup judgment criteria on how long one has been an altruist (see e.g. the vegan society's positions of power being occupied by people with long-time "vegan credentials") rather than focusing on the now and the future. They also can place additional value on ineffectively-altruist actions primarily as social signals (e.g. promoting veganism over lacto-vegetarianism really hard for animal welfare reasons despite the fact that all your lifetime dairy consumption equates to roughly one cow.) It's all part of group-bonding against a sort of "enemy". And it can be super off-putting to those who are effective but "upskilling on altruism", and possibly drive them away. It's a social dynamic you've gotta find a way through as an EA group organiser.
We might see each other sometime! My postdoc sometimes takes me to Bristol, and I'm keen to get you lot vaguely joined up with NTR-Net that hangs out at your uni. Chris Clay is starting at Bristol doing Maths this year - was up with me at EA in the Lakes, is keen.
Charlie Garfield 🔹 @ 2025-09-10T07:16 (+1)
I’ve been considering writing something similar for a while, so I’m really glad you posted this (I honestly lacked the courage to do it myself).
My own experience aligns with your altruism-first approach. I got involved with EA Oxford through that route, and when I took over organizing our socials (initially just out of willingness and due to my college’s booking policy), our primary organizer noted how surprisingly effective they were at engaging people.
I’d been planning to bring this social-first model back to my home university, but I’ve been hesitant to buck the conventional wisdom about what works for EA groups. Despite some current issues with my school’s activities council, I was defaulting to running an intro fellowship. Your post makes me reconsider - having evidence from different EA group models could be really valuable.
On pitching EA: I completely agree about reorienting our pitches. At this year’s activities fair, I essentially A/B tested different approaches. The most effective ones focused on “helping people do as much good as possible, whatever that ends up meaning to them,” then describing areas others have found effective through their own frameworks.
On “fellowships”: The term itself reinforces the exclusivity and hierarchy issues you mention. It positions itself as the path into EA, but I don’t think it’s particularly good at that role. We should be introducing ideas and encouraging exploration and sharing, not gatekeeping.
On philosophical grounding: EA often gets bogged down in philosophical prerequisites when our core appeal is simple: people want to do good effectively. We don’t need everyone to choose a philosophical framework first. The desire to save lives can come from virtue ethics, deontology, or just basic human compassion.
This reminds me of a conversation I had with a food service worker at Mission Burrito last year. He was drawn to GiveDirectly as an alternative to what he saw as a corrupt charity world. His entry point was completely different from the typical EA pathway, and it made me realize how many people we might be missing by not meeting them where they are.