Propose and vote on potential EA Wiki entries

By MichaelA🔸 @ 2020-08-04T23:49 (+49)

2022 update: This is now superseded by a new version of the same open thread.


(I have no association with the EA Forum team or CEA, and this idea comes with no official mandate. I'm open to suggestions of totally different ways of doing this.)

Update: Aaron here. This has our official mandate now, and I'm subscribed to the post so that I'll be notified of every comment. Please suggest tags!

2021 update: Michael here again. The EA Forum's tag system is now paired with the EA Wiki, and so proposals on this post are now for "entries", which can mean tags, EA Wiki articles, or (most often) pages that serve both roles.

The EA Forum now has tags, and users can now make tags themselves. I think this is really cool, and I've now made a bunch of tags. 

But I find it hard to decide whether some tag ideas are worth including, vs being too fine-grained or too similar to existing tags. I also feel some hesitation about taking too much unilateral action. I imagine some other forum users might feel the same way about tag ideas they have, some of which might be really good! (See also this thread.)

So I propose that this post becomes a thread where people can comment with a tag idea they're somewhat unsure about, and then other people can upvote or downvote it based on whether they think it should indeed be its own tag. Details:

Also feel free to use this as a thread to discuss (and upvote or downvote suggestions regarding) existing tags that might not be worth having, or might be worth renaming or tweaking the scope of, or what-have-you. For example, I created the tag Political Polarisation, but I've also left a comment here about whether it should be changed or removed.


MichaelA @ 2020-08-05T00:04 (+22)

Political Polarisation

I already made this tag, but maybe it should be removed.

Arguments against its existence: 

Arguments for its existence:

vaidehi_agarwalla @ 2020-11-23T23:47 (+10)

UPDATE: I've proposed the change to the tag.

Proposal: Change the EA Global tag to EA Conferences.

Many of the tagged posts are relevant to the EA Student Summit, EAGx events, etc., and the description itself refers to conference posts.

MichaelA @ 2020-08-11T04:22 (+10)

Now vs Later, or Optimal Timing, or Optimal Timing for Altruists, or some other name.

This would be intended to capture posts relevant to the debate over "giving now vs later" and "patient vs urgent longtermism", as well as related debates like whether to do direct work now vs build career capital vs movement-build, and how much to give/work now vs later, and when to give/work if not now ("later" is a very large category!). 

This tag would overlap with Hinge of History, but seems meaningfully distinct from that.

Not sure what the best name would be. 

JP Addison @ 2020-08-11T12:48 (+9)

Patient Philanthropy seems like the general category. Not all of it will be about the debate as to whether it's right, but a tag that encompasses questions like "given that I want to give later, how do I do that?" seems good.

MichaelA @ 2020-08-11T23:22 (+4)

Thanks for highlighting patient philanthropy as an option, and good point that it'd be good for this tag to not just be about the debate but also how to implement the patient approach.

I've now made this tag, though with the name Patient Altruism. I haven't heard that term used, but it makes sense to me as a generalisation of patient philanthropy to also account for how to use work, not just how to use donations. I've now also written a shortform post arguing for the term.

One worry I have is that by saying Patient Altruism rather than Patient vs Urgent Altruism, this tag puts virtuous connotations on one side but not the other. But the version with "vs Urgent" is longer, it perhaps doesn't as naturally include posts about how to take the patient approach, and I've only heard the term "urgent longtermism", not "urgent philanthropy" (though I do suggest use of the terms "urgent philanthropy" and "urgent altruism" in that shortform post).

Stefan_Schubert @ 2022-02-15T12:10 (+9)

Heavy tailed distributions of cost-effectiveness, or some variant thereof, would probably be good. I seem to recall there was such an entry on the old EA Concepts page.

Some examples of pages that would get this tag:

Pablo @ 2022-02-15T15:10 (+6)

The content of the old EA Concepts page is now part of the cost-effectiveness entry. However, it may be worth creating a separate entry on distribution of cost-effectiveness and moving that content there. I'll do that tomorrow if no one objects by then.

Stefan_Schubert @ 2022-02-15T15:17 (+6)

Sorry, I hadn't seen that. I've now added the "cost-effectiveness" tag to the first of these three articles, since that one even has "cost-effectiveness" in the title.

The other two articles are actually about differences in performance between people. Potentially that should have its own tag, but it's also possible that it's too small a topic to warrant one.

I'd also be happy for an article on distribution of cost-effectiveness.

Pablo @ 2022-02-15T15:23 (+11)

Thanks. I'll take a look at the articles later today. My sense is that discussion of variation in performance across people is mostly of interest insofar as it bears on the question of distribution of cost-effectiveness, so I'd be tempted to use the distribution of cost-effectiveness tag for those articles, rather than create a dedicated entry.

JasperGeh @ 2021-12-14T13:27 (+9)

Biosurveillance

A central pillar for biodefense against GCBRs and an increasingly feasible intervention with several EAs working on it and potentially cool projects emerging in the near future. Possibly too granular as a tag since there's not a high volume of biosecurity posts which would warrant the granular distinction. But perhaps valuable from a Wiki standpoint with a definition and a few references. I can create an entry, if the mods are okay with it.

Example posts:

Related: GCBR, Biosecurity

Pablo @ 2021-12-14T14:23 (+4)

Hi Jasper,

I agree that this would be a valuable Wiki article, and if you are willing to write it, that would be fantastic.

MichaelA @ 2021-06-24T14:15 (+8)

Update: I've now made this entry.

Surveillance

Some relevant posts:

Pablo @ 2021-06-24T15:16 (+2)

Agree we should have such an entry (I had it in my list of planned articles).

Emanuele_Ascani @ 2020-10-02T10:42 (+8)

I'm surprised that "cost-effectiveness evaluation" doesn't exist yet.

Some others that, weirdly enough, don't exist yet: "meta-charities", "advocacy", "pandemic preparedness".

A couple of tags that would apply to all of my posts: "aging research", "scientific research".

JP Addison @ 2020-10-02T12:29 (+6)

I'd be in favor of all of those tags, except "pandemic preparedness" which I currently think is too overlapping with "Biosecurity".

MichaelA @ 2020-10-02T16:21 (+3)

I'd say "scientific research" is probably covered by Scientific Progress, Research Methods, and tags about specific areas scientific research can be done in?

MichaelA @ 2020-10-02T16:19 (+3)

I think I'm in favour of a Cost-Effectiveness Evaluation tag. (Or maybe Cost-Effectiveness Analysis? I think that's the more common term?) 

That seems similar to Impact Assessment (a tag I made last month), so some of my thoughts on that tag might also be relevant. But I think Cost-Effectiveness Analysis is probably different enough from existing tags to be worth having.

MichaelA @ 2020-08-29T18:20 (+8)

(Update: I've now made this tag.)

Operations

Arguments against

Arguments for:

Larks @ 2020-08-26T15:47 (+8)

I like Lists, so get me a List of Lists for my tag List.

There are a number of good posts that are basically lists of links to different articles (like this one). It would be nice to be able to easily access them.

MichaelA @ 2020-08-26T17:46 (+6)

I very much share this affection for lists.

I think Collection and Resources might cover this? E.g., those reading lists from Richard Ngo have each been given that tag.

Do you think there's still a gap for a List tag, or a way the description of the Collection and Resources tag should be adjusted?

Larks @ 2020-08-26T18:27 (+6)

Ahh yes, that covers it. I looked through the list of tags to check if there was already something on there; I guess I missed that one.

saulius @ 2020-08-07T08:06 (+8)

When tags were introduced, the post said to "submit new tag ideas to us using this form." I made a bunch of suggestions (I don't remember what they were) and probably some other people did too. Could someone who has access to the results of that form paste all those suggestions here?

MichaelA @ 2020-08-07T09:25 (+4)

That sounds like a great idea!

I think ideally they'd be pasted as separate comments, so they can each be voted up or down separately. (Not saying you were suggesting otherwise.)

MichaelA @ 2021-09-26T18:52 (+7)

Independent impressions or something like that

We already have Discussion norms and Epistemic deference, so I think there's probably no real need for this as a tag. But I think a wiki entry outlining the concept could be good. The content could be closely based on my post of the same name and/or the things linked to at the bottom of that post.

Stefan_Schubert @ 2021-09-27T10:29 (+2)

I agree that it would be good to describe this distinction in the Wiki. Possibly it could be part of the Epistemic deference entry, though I don't have a strong view on that.

Pablo @ 2021-09-26T19:41 (+2)

How about something like beliefs vs. impressions?

MichaelA @ 2021-09-27T07:02 (+2)

Yeah, that title/framing seems fine to me

Pablo @ 2021-09-27T13:57 (+3)

After reviewing the literature, I came to the view that Independent impressions, which you proposed, is probably a more appropriate name, so that's what I ended up using.

MichaelA @ 2022-04-07T07:01 (+6)

Retreat or Retreats

I think there are a fair few EA Forum posts about why and how to run retreats (e.g., for community building, for remote orgs, or for increasing coordination among various orgs working in a given area). And I think there are a fair few people who'd find it useful to have these posts collected in one place.

Pablo @ 2022-04-07T21:24 (+8)

Makes sense; I'll create it.

By the way, we should probably start a new thread for new Wiki entries. This one has so many comments that it takes a long time to load.

MichaelA @ 2022-04-08T07:14 (+4)

Thanks!

And good idea - done

MichaelA @ 2022-02-04T12:59 (+6)

Red teaming or red teams or red team or something like that

Examples of posts that would get this tag:

Uncertainties:

Related entries

https://www.lesswrong.com/tag/epistemic-spot-check

https://www.lesswrong.com/tag/conservation-of-expected-evidence

https://forum.effectivealtruism.org/tag/community-epistemic-health

Pablo @ 2022-02-04T15:09 (+4)

Yes! Will create this later today.

MichaelA @ 2021-11-29T20:00 (+6)

ETA: Now created

Corporate governance

Example of a relevant post: https://forum.effectivealtruism.org/posts/5MZpxbJJ5pkEBpAAR/the-case-for-long-term-corporate-governance-of-ai

I've mostly thought about this in relation to AI governance, but I think it's also important for space governance and presumably various other EA issues. 

I haven't thought hard about whether this really warrants an entry, nor scanned for related entries - just throwing an idea out there.

MichaelA @ 2021-08-28T11:21 (+6)

United Kingdom policy & politics (or something like that)

This would be akin to the entry/tag on United States politics. An example of a post it'd cover is https://forum.effectivealtruism.org/posts/yKoYqxYxo8ZnaFcwh/risks-from-the-uk-s-planned-increase-in-nuclear-warheads 

But I wrote on the United States politics entry's discussion page a few months ago:

I suggest changing the name and scope to "United States government and politics". E.g., I think there should be a place to put posts about what actions the US government plans to take or can take, how it's thinking, how to influence that, how useful that is, etc. And it seems like this entry's name wouldn't currently naturally cover that, but I think this entry should cover that, suggesting the name should be broadened.

And I'd say similar here, hence the proposal to say "policy & politics" rather than just "politics". (I know some people favour entry names being single categories rather than "and"s, but I think often the most natural cluster for a tag is better pointed at via more than one term.)

Pablo @ 2021-08-28T13:01 (+6)

Yeah, makes sense. I just created the new article and renamed the existing one. There is no content for now, but I'll try to add something later.

Aaron Gertler @ 2021-07-19T02:21 (+6)

Career profiles (or maybe something like "job posts"?)

Basically, writeups of specific jobs people have, and how to get those jobs. Seems like a useful subset of the "Career Choice" tag to cover posts like "How I got an entry-level role in Congress", and all the posts that people will (hopefully) write in response to this.

EdoArad @ 2021-07-19T15:24 (+2)

What about posts that discuss personal career choice processes (like this)?

MichaelA @ 2021-07-19T18:55 (+2)

My personal, quick reaction is that that's a decently separate thing, that could have a separate tag if we feel that that's worthwhile. Some posts might get both tags, and some posts might get just one.

But I haven't thought carefully about this.

I also think I'd lean against having an entry for that purpose. It seems insufficiently distinct from the existing tags for career choice or community experiences, or from the intersection of the two.

MichaelA @ 2021-07-19T07:41 (+2)

Yeah, this seems worth having! And I appreciate you advocating for people to write these and for us to have a way to collect them, for similar reasons to those given in this earlier shortform of mine.

I think career profiles is a better term for this than job posts, partly because:

  • The latter sounds like it might be job ads or job postings
  • Some of these posts might not really be on "jobs" but rather things like being a semi-professional blogger, doing volunteering, having some formalised unpaid advisory role to some institution, etc.

OTOH, career profiles also sounds somewhat similar to 80k's career reviews. This could be good or bad, depending on whether it's important to distinguish what you have in mind from the career review format. (I don't have a stance on that, as I haven't read your post yet.)

MichaelA @ 2021-07-19T09:49 (+2)

Actually, having read your post, I now think it does sound more about jobs (or really "roles", but that sounds less clear) than about careers. So I might now suggest using the term job profiles.

Aaron Gertler @ 2021-07-22T08:29 (+4)

Thanks, have created this. (The "Donation writeup" tag is singular, so I felt like this one should also be, but LMK if you think it should be plural.)

Pablo @ 2021-07-19T13:20 (+2)

Either looks good to me. I agree that this is worth having.

MichaelA @ 2021-07-03T09:13 (+6)

Update: I've now made this entry.

Consultancy (or maybe Consulting or Consultants or Consultancies)

Things this would cover:

Related entries

career choice | Effective Altruism and Consulting Network | org strategy | working at EA vs. non-EA orgs

(maybe there are also other good choices for related entries)

Pablo @ 2021-07-03T12:18 (+6)

Yeah, I made a note to create an entry on this topic soon after Luke published his post. Feel free to create it, and I'll try to expand it next week (I'm a bit busy right now).

Stefan_Schubert @ 2021-06-07T16:48 (+6)

Effective Altruism on Facebook and Effective Altruism on Twitter (and more - maybe Goodreads, Instagram, LinkedIn, etc). Alternatively Effective Altruism on Social Media, though I probably prefer tags/entries on particular platforms.

A few relevant articles:

https://forum.effectivealtruism.org/posts/8knJCrJwC7TbhkQbi/ea-twitter-job-bots-and-more

https://forum.effectivealtruism.org/posts/6aQtRkkq5CgYAYrsd/ea-twitterbot

https://forum.effectivealtruism.org/posts/mvLgZiPWo4JJrBAvW/longtermism-twitter

https://forum.effectivealtruism.org/posts/BtptBcXWmjZBfdo9n/ea-facebook-group-greatest-hits-top-50-posts-by-total

Multiple articles about Giving Tuesday.

 

Also, quite a lot of EA discussion takes place, and has taken place, on Twitter and Facebook; there are many EA Facebook groups, etc. Therefore, it seems natural to have entries on EA Twitter and EA Facebook.

MichaelA @ 2021-06-07T19:35 (+3)

At first glance, I'd prefer to have Effective altruism on social media, or maybe actually just Social media, rather than the more fine-grained ones. (Also, I do think something in this vicinity is indeed worth having.) Reasoning:

  • I'm not sure if any of the specific platforms warrant an entry
  • If we have entries for the specific platforms, then what about posts relevant to effective altruism on some other platform?
    • We shouldn't just create an entry for every other platform there's at least one post relevant to, nor should we put them all under one of the other single-platform-focused tags.
    • But having an entry for Facebook, another for Twitter, and another for social media as a whole seems like too much?
  • Regarding dropping "Effective altruism on" and just saying "Social media":

Though also note that there's already a tag for effective altruism in the media, which has substantial overlap with this. But I think that's probably ok - social media seems a sufficiently notable subset of "the media" to warrant its own entry.

(Btw, for the sake of interpreting the upvotes as evidence: I upvoted your comment, though as I noted I disagree a bit on the best name/scope.)

MichaelA @ 2021-10-21T16:11 (+3)

(Just wanted to send someone a link to a tag for Social media or something like that, then realised it doesn't exist yet, so I guess I'll bump this thread for a second opinion, and maybe create this in a few days if no one else does)

Pablo @ 2021-10-21T16:26 (+2)

I don't have accounts on social media and don't follow discussions happening there, so I defer to you and others with more familiarity.

MichaelA @ 2021-06-05T09:32 (+6)

Maybe we should have an entry for each discipline/field that's fairly relevant to EA and fairly well-represented on the Forum? Like how we already have history, economics, law, and psychology research. Some other disciplines/fields (or clusters of disciplines/fields) that could be added:

Pablo @ 2021-06-05T13:31 (+2)

I'm overall in favor.

I wonder if we should take a more systematic approach to entries about individual disciplines. It seems that, from an EA perspective, a discipline may be relevant in a number of distinct ways, e.g. because it is a discipline in which young EAs may want to pursue a career,  because conducting research in that discipline is of high value, because that discipline poses serious risks, or because findings in that discipline should inform EA thinking. I'm not sure how to translate this observation into something actionable for the Wiki, though, so I'm just registering it here in case others have thoughts along these lines.

MichaelA @ 2021-06-05T14:48 (+2)

Yeah, I do think it seems worth thinking a bit more about what the "inclusion criteria" for a discipline should be (from the perspective of making an EA Wiki entry about it), and that the different things you mention seem like starting points for that. Without clearer inclusion criteria, we could end up with a ridiculously large number of entries, or with entries that are unwarranted or too fine-grained, or with entries that are too coarse-grained, or with hesitation and failing to create worthwhile entries.

I don't immediately have thoughts, but endorse the idea of someone generating thoughts :D

Stefan_Schubert @ 2021-06-05T09:58 (+2)

I agree that humanities disciplines tend to be less EA-relevant than the social sciences. But I think that the humanities are quite heterogeneous, so it feels more natural to me to have entries for particular humanities disciplines, than humanities as a whole.

But I'm not sure any such entries are warranted; it depends on how much has been written.

MichaelA @ 2021-05-13T21:02 (+6)

Maybe we should have a tag for each individual EA Fund, in addition to the existing Effective Altruism Funds tag? The latter could then be for posts relevant to EA Funds as a whole.

There are now 60 posts with the Effective Altruism Funds tag, and many readers may only be interested in posts relevant to one or two of the funds.

Pablo @ 2021-05-13T21:38 (+4)

Yes, good idea. Feel free to create them, otherwise I'll do it myself later today or tomorrow.

MichaelA @ 2020-11-15T12:33 (+6)

Markets for Altruism or Market Mechanisms for Altruism or Impact Certificates or Impact Purchases (or some other name)

Tentatively proposed description: 

The Markets for Altruism tag is for posts relevant to actual or potential market-like mechanisms for altruistic or charitable activities. An example would be certificates of impact.

See also EA Funding.

The posts listed here would fit this tag. Some other posts tagged EA Funding might fit as well.

I'm unsure precisely what the ideal scope and name of this tag would be. 

JP Addison @ 2020-11-16T07:39 (+2)

I like it. Impact Certificates is more recognizable, but Markets for Altruism is more general. I think I agree with your favoring it.

MichaelA @ 2020-11-16T10:36 (+2)

Cool, thanks for the input - given that, I've now made the tag, with the name Markets for Altruism :)

vaidehi_agarwalla @ 2020-10-31T00:56 (+6)

I think it would be useful to be able to see all the posts from a particular organisation at once on the Forum. For the most part, individuals from those organisations post rather than a single organisation account, so it can be difficult to see, e.g., all of Rethink Priorities' research on a given topic.

Curious to hear whether people think it's better to have tags or sequences to group these posts.

vaidehi_agarwalla @ 2021-01-15T16:22 (+2)

New issue: How do we deal with name changes? (E.g., EAF became CLR, .impact became Rethink Charity.)

I think it's nice to have a single tag (the new name) for continuity but sometimes an org had a different focus or projects associated with the old name.

Maybe it's enough to mention in the tag description "previously called X"?

MichaelA @ 2020-11-25T12:37 (+2)

Update: I've now made tags for Rethink Priorities, Future of Humanity Institute, and Global Priorities Institute. I believe I've tagged all RP posts. I wasn't very thorough in tagging FHI or GPI posts. Other people can tag additional FHI and GPI posts, and/or add tags for other orgs. 

MichaelA @ 2020-10-31T07:55 (+2)

I think something like this would be a good idea :) 

Some thoughts:

  • One downside could be that we might end up with quite a few of these tags, which then clutter up the tags page.
    • Maybe it'd be best if the Forum team can set it up so there's a separate, collapsable part of the tags page just for all the organisation tags?
      • That might also make it easier for someone who's looking for org tags in general (without knowing what specific orgs might have tags) to find them.
  • Most EA organisations probably already have pages on their site where you can find all/most their research outputs. E.g., Rethink Priorities' publications page.
  • But one thing tags allows you to do is (from the home page of the forum) filter by multiple tags at once. So you could e.g. filter by both the Rethink Priorities tag  and the Wild Animal Welfare tag, to find all of Rethink's posts related to that topic.
    • That said, I've never actually used the approach of filtering by multiple tags myself.
    • And the lists of publications on an org's site may often be organised by broad topic area anyway. Though this could still be useful if you want to see if an org wrote something related to a concept/topic they probably wouldn't organise their pages by (perhaps because it's cross-cutting, slightly obscure, or wasn't the main focus of the post) - e.g., if you want to see whether Rethink has written anything related to patient altruism.
  • I think tags might be better than sequences for this purpose. One reason is the above-mentioned benefit of allowing for filtering by both org and some tag. Another reason is that these posts usually won't really be sequences in the usual sense - it won't be the case that the order of publication is the most natural order of reading, and that one gains a lot from reading them all together. (Though some subset of each org's posts may be a sequence, e.g. Rethink's nuclear risk stuff.)
  • A complexity might be deciding which orgs should have tags - in particular, should orgs which aren't especially prominent or don't post often have tags?
    • Maybe some forum users can just make tags for orgs they want there to be tags for, and then orgs can make tags for themselves if they want, and we can see what results.

(It happens to be that I'll be working at Rethink soon, but this comment was just my own opinion, and I only used them as an example because Vaidehi did.)

vaidehi_agarwalla @ 2020-11-04T02:14 (+2)

I agree that tags seem better than sequences. 

I think rather than a special type of org tag, it may be better to just have them be regular tags. This would solve the issue of which organisations get org tags. I think it's okay for people to tag their own early-stage projects or orgs even if they aren't very big (I'm biased here, as I have some projects which I would like to be able to link people to).

I don't think there's a lot of risk - having a tag doesn't mean your project is endorsed by EA or anything; it's just an organisational tool.

Maybe some forum users can just make tags for orgs they want there to be tags for, and then orgs can make tags for themselves if they want, and we can see what results.
 

I think this is probably the best strategy!

 

Also congrats on starting at Rethink :) 

EdoArad @ 2020-10-03T08:49 (+6)

I've added a Meta-Science tag. I'd love some help with clarifying the distinction between it and Scientific Progress.

Generally, I imagine meta-science as being more focused on specific aspects of the academic ecosystem and scientific progress to be related more to the general properties of scientific advances. There is clearly an overlap there, but I'm not sure where exactly to set the boundaries. 

vaidehi_agarwalla @ 2020-11-04T02:16 (+2)

I think an example of the overlap would be if, say, in the field of survey methodology, someone discovers a new way to measure bias in surveys: this would be a meta-science improvement, but also scientific progress in the field of survey methodology.

MichaelStJules @ 2020-09-13T03:51 (+6)

Would be good if tags always had descriptions/definitions of the things they're for.

MichaelA @ 2020-09-13T07:04 (+4)

Agreed. I think people creating tags should probably always add those descriptions/definitions.

One thing I'd note is that anyone can add descriptions/definitions for tags, even if they didn't create them. This could be hard if you're not sure what the scope was meant to be, but if you think you know what the scope was meant to be, you could consider adding a description/definition yourself.

MichaelA @ 2020-08-05T00:14 (+6)

Update: I've now made this tag.

[Something about war, armed conflict, or great power conflict]

Arguments against:

Arguments for:

MichaelA @ 2021-12-04T10:59 (+5)

Megaprojects

Would want to have a decent definition. I feel like the term is currently being used in a slippery / under-defined / unnecessary-jargon way, but also that there's some value in it. 

Example posts: 

Related entries:

Constraints on effective altruism

Scalably using labour

nil @ 2021-05-19T22:04 (+5)

David Pearce (the tag will be removed if others think it's not warranted)

Arguments against:

I also should mention that I'm biased in proposing this tag, as Pearce's work played a major role in my becoming an EA.

Arguments for:

Pablo @ 2021-05-20T18:52 (+12)

Michael is correct that the inclusion criteria for entries on individual people haven't been made explicit. In deciding whether a person was a fit subject for an article, I haven't followed any conscious procedure, but have merely relied on my subjective sense of whether the person deserved a dedicated article. Looking at the list of people I ended up including, a few clusters emerge:

  1. people who have had an extraordinary positive impact, and that are often discussed in EA circles (Arkhipov, Zhdanov, etc.)
  2. people who have attained eminence in their fields and who are connected to EA to a significant degree (Pinker, Hassabis, Boeree, etc.)
  3. academics who have conducted research of clear EA relevance (Ng, Duflo, Parfit, Tetlock, etc.)
  4. historical figures that may be regarded as proto-EAs or that are seen as having inspired the EA movement (Bentham, Mill, Russell, etc.)
  5. "core figures" in the EA community (Shulman, Christiano, Tomasik, etc.)

Some people, such as Bostrom, MacAskill, and Ord, fit into more than one of these clusters. My sense is that David Pearce doesn't fit into any of the clusters. It seems relatively uncontroversial that he doesn't fit into clusters 1-4, so the relevant question, at least if one broadly agrees with the approach I've taken, is whether he is sufficiently close to the "core" to merit inclusion as part of cluster 5.

As someone who has been involved with EA since its inception and who has (I believe) a reasonably good sense of how central to the movement different people have been, my impression is that Pearce isn't central enough. If others have different (or similar) impressions, I would encourage them to post them here. We could, alternatively, try to go beyond impressionistic evidence and look at more objective measures, such as citation counts (broadly construed to include not just academic citations but links from the EA Forum and EA Blogs), though conducting that kind of analysis might be time consuming and may not be fully conclusive. Do others have thoughts on how to operationalize the relevant criteria?

MichaelA @ 2021-05-20T19:52 (+2)

Do others have thoughts on how to operationalize the relevant criteria?

FWIW, I think your comment is already a good step! I think I broadly agree that those people who fit into at least one of those clusters should typically have entries, and those who don't shouldn't. And this already makes me feel more of a sense of clarity about this.

I still think substantial fuzziness remains. This is mostly just because words like "eminence" could be applied more or less strictly. I think that that's hard to avoid and maybe not necessary to avoid - people will probably generally agree, and then we can politely squabble about the borderline cases and thereby get a clearer sense of what we collectively think the "line" is.

But I think "people who have had an extraordinary positive impact, and that are often discussed in EA circles (Arkhipov, Zhdanov, etc.)" may require further operationalisation, since what counts as extraordinary positive impact can differ a lot based on one's empirical, moral, epistemological, etc. views. E.g., I suspect that nil might think Pearce has been more impactful than most people who do have an entry, since Pearce's impacts are more targeted at suffering reduction. (nil can of course correct me if I'm wrong about their views.)

So maybe we should say something like "people who are widely discussed in EA and who a significant fraction of EAs see as having had an extraordinary positive impact (Arkhipov, Zhdanov, etc.)"? (That leaves the fuzziness of "significant fraction", but it seems a step in the right direction by not just relying on a given individual's view of who has been extraordinarily impactful.)

Then, turning back to the original example, there's the question: Would a significant fraction of EAs see Pearce as having had an extraordinary positive impact? I think I'd lean towards "no", though I'm unsure, both because I don't have a survey and because of the vagueness of the term "significant fraction". 

Pablo @ 2021-05-22T18:54 (+18)

I think there's a relatively clear sense in which Arkhipov, Borlaug, and similar figures (e.g. winners of the Future of Life Award, names included in Scientists Greater than Einstein, and related characters profiled in Doing Good Better or the 80,000 Hours blog) count as having had an extraordinary positive impact and Pearce does not, namely, a sense in which Ord, MacAskill, Tomasik, etc. also don't count. I think it's probably unnecessary to try to specify in great detail what the criterion is, but the core element seems to be that the former are all examples of do-gooding that is extraordinary from both an EA and a common-sense perspective, whereas if you wanted to claim that e.g. Shulman or Christiano are among humanity's greatest benefactors, you'd probably need to make some arguments that a typical person would not find very persuasive. (The arguments for that conclusion would also likely be very brittle and fail to persuade most EAs, but that doesn't seem to be so central.)

So I think it really boils down to the question of how core a figure Pearce is in the EA movement, and as noted, my impression is that he just isn't a core enough figure. I say this, incidentally, as someone who admires him greatly and who has been profoundly influenced by his writings (some of which I translated into Spanish a long time ago), although I have also developed serious reservations about various aspects of his work over the years.

MichaelA @ 2021-05-23T09:56 (+2)
  1. If you mean that the vast majority of EAs would agree that Arkhipov, Borlaug, Zhdanov, and similar figures count as having had an extraordinary positive impact, or that that's the only reasonable position one could hold, I disagree, for reasons I'll discuss below.
  2. But if you just mean that a significant fraction of EAs would agree that those figures count as having had an extraordinary impact, I agree. And, as noted in my previous comment, I think that using a phrasing like "people who are widely discussed in EA and who a significant fraction of EAs see as having had an extraordinary positive impact (Arkhipov, Zhdanov, etc.)" would probably work.
    1. And that phrasing also seems fine if I'm wrong about (1), so maybe there's no real need to debate (1)?
    2. (Relatedly, I also do ultimately agree that Arkhipov etc. should have entries.)

Expanding on (1):

  • This is mostly due to crucial considerations that could change the sign or (relative) magnitude of the moral value of the near-term effects that these people are often seen as having had. For example:
    • It's not obvious that a US-Russia nuclear war during the Cold War would've caused a negative long-term future trajectory change.
      • I expect it would, and, for related reasons, am currently focused on nuclear risk research myself.
      • But I think one could reasonably argue that the case for this view is brittle and the case for e.g. the extraordinary positive impact of some people focused on AI is stronger (conditioning on strong longtermism).
    • Some EAs think extinction risk reduction is or plausibly is net negative.
    • Some EAs think population growth is or plausibly is net negative, e.g. for reasons related to the meat-eater problem or to differential progress.
    • It's plausible that expected moral impact is dominated by effects on the long-term future, farm animals, wild animals, invertebrates, or similar, in which case it may be both less clear that e.g. Borlaug and Zhdanov had a net positive impact and less clear that it is "extraordinary" relative to the impact of people whose actions were more targeted to helping those populations.
  • But it's also because of uncertainties about whether they really had those near-term effects, whether similar things would've happened without them, and - at least in Zhdanov's case - whether they had other near-term effects that may have been very negative. For example:
    • My understanding is that it's not actually very clear whether Arkhipov played a crucial role in preventing a launch.
  • E.g., Baum, de Neufville, and Barrett write "The second captain, Vassily Arkhipov, has been credited with having vetoed the decision to launch the torpedo over the objections of the two other officers (Lloyd 2002). Sources conflict on whether the submarine crew had the authority to launch the torpedo without direct orders from Moscow. The submarine's communications officer later said in an interview that Arkhipov did play an important role in calming the captain down, but that while there was a danger of an accident or equipment malfunction, they were never close to intentionally launching the nuclear torpedo (Savranskaya 2007)."
    • Zhdanov also "chaired the Soviet Union's Interagency Science and Technology Council on Molecular Biology and Genetics, which among its many functions directed the Soviet biological weapons program" (Wikipedia), which I think makes it plausible that his expected impact (evaluated during the Cold War) on the long-term future was very negative.
  • My more basic point is just that it seems very hard to say with high confidence what actions had net positive vs net negative impacts and how to rank them, and there's room for reasonable disagreement.

Again, though, I think we can probably sidestep all of this by just saying "people who are widely discussed in EA and who a significant fraction of EAs see as having had an extraordinary positive impact (Arkhipov, Zhdanov, etc.)".

nil @ 2021-05-28T20:28 (+6)

For those who may want to see the deleted entry, I'm posting it below:


David Pearce is a philosopher and writer best known for his 1995 manifesto The Hedonistic Imperative and the associated ideas about abolishing suffering for all sentient life using biotechnology and other technologies.

Pearce argues that it is "technically feasible" and ethically rational to abolish suffering on the planet by replacing Darwinian suffering-based motivational systems with minds animated by "information-sensitive gradients of intelligent bliss" (as opposed to indiscriminate maxed-out bliss). He stresses that this "abolitionist project" is compatible with a diverse set of values and "intentional objects" (i.e. what one is happy "about").

In 1998 together with Nick Bostrom, Pearce co-founded the World Transhumanist Association, today known as Humanity+.

Pearce is the director of bioethics of Invincible Wellbeing and is on the advisory boards of the Organisation for the Prevention of Intense Suffering and since 2021 the Qualia Research Institute. He is also a fellow of the Institute for Ethics and Emerging Technologies and is on the futurist advisory board of the Lifeboat Foundation.

Pablo @ 2021-05-28T11:49 (+5)

Thanks again, nil, for taking the time to create this entry and outline your reasoning. After reviewing the discussion, and seeing that no new comments have been posted in the past five days, I've decided to delete the article, for the reasons I outlined previously.

Please do not let this dissuade you from posting further content to the Wiki, and if you have any feedback, feel free to leave it below or to message me privately.

Michael Huang @ 2021-05-31T05:41 (+4)

To add to arguments for inclusion, here's an excerpt from an EA Forum post about key figures in the animal suffering focus area.

"Major inspirations for those in this focus area include Peter Singer, David Pearce, and Brian Tomasik."

Four focus areas of effective altruism by Luke_Muehlhauser, 8th Jul 2013

David Pearce's work on suffering and biotechnology would be more relevant now than in 2013 due to developments in genome editing and gene drives.

MichaelA @ 2021-05-20T06:41 (+4)

I'm roughly neutral on this, since I don't have a very clear sense of what the criteria and "bars" are for deciding whether to make an entry about a given person. I think it would be good to have a discussion/policy regarding that. 

I think some people like Nick Bostrom and Will MacAskill clearly warrant an entry, and some people like me clearly don't, and there's a big space in between - with Pearce included in it - where I could be convinced either way. (This has to do with relevance and notability in the context of the EA Forum Wiki, not like an overall judgement of these people or a popularity contest.)

Some other people who are perhaps in that ambiguous space:

  • Nick Beckstead (no entry atm)
  • Elie Hassenfeld (no entry atm, but an entry for GiveWell)
  • Max Tegmark (no entry atm, but an entry for FLI)
  • Brian Tomasik (has an entry)
  • Stuart Russell (has an entry)
  • Hilary Greaves (has an entry)

(I think I'd lean towards each of them having an entry except Hassenfeld and maybe Tegmark. I think the reason for The Hassenfeld Exception is that, as far as I'm aware, the vast majority of his work has been very connected with GiveWell. So it's very important and notable, but doesn't need a distinct entry. Somewhat similar with Tegmark inasmuch as he relates to EA, though he's of course notable in the physics community for non-FLI-related reasons. But I'm very tentative with all those views.)

nil @ 2021-05-20T17:06 (+1)

... I think the reason for The Hassenfeld Exception is that, as far as I'm aware, the vast majority of his work has been very connected with GiveWell. So it's very important and notable, but doesn't need a distinct entry. Somewhat similar with Tegmark inasmuch as he relates to EA, though he's of course notable in the physics community for non-FLI-related reasons. ...

This makes sense to me, although one who is more familiar w/ their work may find their exclusion unwarranted. Thanks for clarifying!

In this light I still think an entry for Pearce is justified, to the degree that scientifically grounded proposals for abolishing suffering are an EA topic (and this is the main theme of Pearce's work). But I'm just one input, of course.

Regarding Tomasik, we have different intuitions here: if an entry for Tomasik may not be justified, then I would say this sets a high bar which only original EA founders could reach. (For Tomasik himself is a founder of an EA charity - the Foundational Research Institute / Center on Long-Term Risk - has written extensively on many topics highly relevant to EA, and is an advisor at the Center for Reducing Suffering, another EA org.) Anyway, this difference probably doesn't matter in practice since you added that you lean towards Tomasik having an entry.

Pablo @ 2021-05-20T17:34 (+9)

I agree with you that a Tomasik entry is clearly warranted. I would say that his entry is as justified as one on Ord or MacAskill; he is one of half a dozen or so people who have made the most important contributions to EA, in my opinion.

I will respond to your main comment later, or tomorrow.

MichaelA @ 2021-05-20T17:34 (+5)

As noted, I do lean towards Tomasik having an entry, but "co-founder of an EA org"  + "written extensively on many topics highly relevant to EA" + "is an advisor for another EA org", or 1 or 2 of those things plus 1 or 2 similar things, includes a fair few people, including probably like 5 people I know personally and who probably shouldn't have their own entries. 

I do think Tomasik has been especially prolific and his writings especially well-regarded and influential, which is a big part of why I lean towards an entry for him, but the criteria and cut-offs do seem fuzzy at this stage.

Aaron Gertler @ 2021-05-28T19:32 (+3)

As the head of the Forum, I'll second Pablo in thanking you for creating the entry. While I defer to Pablo on deciding what articles belong in the wiki, I thought Pearce was a reasonable candidate. I appreciate the time you took to write out your reasoning (and to acknowledge arguments against including him).

nil @ 2021-05-29T22:01 (+1)

Thank you for appreciating the contribution.

Since Pablo is trusted w/ deciding on the issue, I will address my questions about the decision directly to him in this thread.

MichaelA @ 2021-05-05T07:41 (+5)

Academia or something like that

This could cover things like how (in)efficient academia is, what influences it has had and could have, the best ways to leverage or direct academia, whether people should go into academic or academia-related careers, etc.

E.g., Open Phil's post(s) on field-building and this post on How to PhD.

Related entries

field-building | meta-science | research methods | research training programs | scientific progress

---

It's possible that this is made redundant by other tags we already have? 

And my current suggested name and scope are vague and just spitballing. 

Pablo @ 2021-05-05T14:56 (+4)

I think this would be a valuable article. Perhaps the title could be refined, but at the moment I can't think of any alternatives I like. So feel free to create it, and we can consider possible name variants in the future.

MichaelA @ 2021-05-05T16:41 (+4)

Ok, done!

MichaelA @ 2021-05-02T08:04 (+5)

Terrorism

Pablo @ 2021-05-02T12:22 (+4)

Makes sense. I created it (no content yet).

MichaelA @ 2021-04-20T12:01 (+5)

Update: I've now made this tag.

Charitable pledges or Altruistic pledges or Giving pledges (but that could be confused with the Giving Pledge specifically) or Donation pledges or similar

Maybe the first two names are good in that they could capture pledges about resources other than money (e.g., time)? But I can't off the top of my head think of any non-monetary altruistic pledges. 

This could serve as an entry on this important-seeming topic in general, and as a directory to a bunch of other entries or orgs on specific pledges (e.g., Giving Pledge, GWWC Pledge, Generation Pledge, Founders Pledge).

See also this post: https://forum.effectivealtruism.org/posts/W2f7AZEe2kCoZhwrf/a-list-of-ea-donation-pledges-gwwc-etc 

MaxRa @ 2021-01-19T08:46 (+5)

What do you think about a tag for posts that include Elicit predictions? I'd like to see all posts that include them and it might be a tiny further reminder to use them more.

MichaelA @ 2021-01-19T13:34 (+4)

This seems plausibly useful to me.

Obviously it'd overlap a lot with the Forecasting tag. But if it's the case that several posts include Elicit forecasts but most posts tagged Forecasting don't include Elicit forecasts, then I imagine a separate tag for Elicit forecasts could be useful. (Basically, what I'm thinking about is whether there would be cases in which it'd be useful for someone to find / be sent a collection of links to just posts with Elicit forecasts, with the Forecasting tag not covering their needs well.)

But maybe a better option would be to mirror LessWrong in having a tag for posts about forecasting and another tag for posts that include actual forecasts (see here)? (Or maybe the latter tag should only include posts that quite prominently include forecasts, rather than just including them in passing here and there.) Because maybe people would also want to see posts with Metaculus forecasts in them, or forecasts from Good Judgement Inc, or just forecasts from individual EAs but not using those platforms. And I'd guess it'd make more sense to have one tag where all of these things can be found than to try to have a separate tag for each.

(That's just my quick thoughts in a tired state, though.)

It could also be handy to have a tag for posts relevant to "Ought / Elicit" - I think it'd probably be good to bundle them together but note Elicit explicitly - similarly to how there are now tags for posts relevant to each of a few other orgs (e.g. Rethink Priorities, FHI, GPI, QURI). So maybe the combination of a tag for posts that contain actual forecasts and a tag for Ought / Elicit would serve the role a tag for posts containing Elicit forecasts would?

MichaelA @ 2022-03-14T18:45 (+4)

Quadratic voting or Uncommon voting methods or Approval voting or something like that or multiple of these

E.g., this post could get the first and/or second tag, and posts about CES could get the second and/or third tag

Pablo @ 2022-03-14T20:00 (+4)

Created.

I may try to expand the description to also cover quadratic funding. (Both quadratic voting and quadratic funding are instances of quadratic payments, at least in Buterin's framing, so we could use the latter for the name of the entry. I used 'quadratic voting' because this is the name that people usually associate with the general idea.)
 

MichaelA @ 2021-12-30T10:16 (+4)

Alignment tax

Here I'm more interested in the Wiki entry than the tag, though the tag is probably also useful. Basically I primarily want a good go-to link that is solely focused on this and gives a clear definition and maybe some discussion.

This is probably an even better fit for LW or the Alignment Forum, but they don't seem to have it. We could make a version here anyway, and then we could copy it there or someone from those sites could.

Here are some posts that have relevant content, from a very quick search:

Related entries:

The term "safety tax" should probably also be mentioned

Pablo @ 2021-12-30T13:56 (+6)

Here's the entry. I was only able to read the transcript of Paul's talk and Rohin's summary of it, so feel free to add anything you think is missing.

Pablo @ 2021-12-30T11:46 (+4)

Thanks, Michael. This is a good idea; I will create the entry.

(I just noticed you left other comments to which I didn't respond; I'll do so shortly.)

MichaelA @ 2021-11-29T12:29 (+4)

Brain-computer interfaces

See also the LW wiki entry / tag, which should be linked to from the Forum entry if we make one: https://www.lesswrong.com/tag/brain-computer-interfaces

Relevant posts:

Pablo @ 2021-11-29T13:11 (+4)

Looks good. I've now created the entry and will add content/links later.

MichaelA @ 2021-11-03T08:21 (+4)

Time-money tradeoffs or Buying time or something like that

For posts like https://forum.effectivealtruism.org/posts/g86DhzTNQmzo3nhLE/what-are-your-favourite-ways-to-buy-time and maybe a bunch of other posts tagged Personal development

Pablo @ 2021-11-03T14:10 (+4)

Cool, I created the entry here. I may add some text soon.

evelynciara @ 2021-10-18T10:16 (+4)

Criticism of the EA community

For posts about what the EA community is like, as opposed to the core ideas of EA themselves. Currently, these posts get filed under Criticism of effective altruism even though it doesn't quite fit.

Aaron Gertler @ 2021-10-18T10:21 (+6)

Seems like a good idea!

If we have three criticism tags covering "causes", "organizations", and "community", then having a general "criticism of EA" tag doesn't seem to make sense. The best alternative seems like "criticism of EA philosophy".

If I don't hear objections from Pablo/Michael, I'll make that change in a week or so and re-tag relevant posts.

MichaelA @ 2021-10-18T11:22 (+2)

So the plan is to have 4 tags, covering community, causes, organizations, and philosophy? Is so, that sounds good to me, I think.

If the idea was to have just three (without philosophy), I'd have said it feels like there's something missing, e.g. for criticism of the ITN framework or ~impartial welfarism or the way EA uses expected value reasoning or whatever.

evelynciara @ 2021-12-09T23:22 (+4)

Update: I have created Criticism of the effective altruism community.

MichaelA @ 2021-10-05T09:21 (+4)

Arms race or Technology race or Arms/technology race or something like that

Related entries

AI governance | AI forecasting | armed conflict | existential risk | nuclear warfare | Russell-Einstein Manifesto

--

I think such an entry/tag would be at least somewhat attention hazardous, so I'm genuinely unsure whether it's worth creating it. Though I think it'd also have some benefits, the cat is somewhat out of the bag attention-hazard-wise (at least among EAs, who are presumably the main readers of this site), and LessWrong have apparently opted for such a tag (focused solely and explicitly on AI, so a bit more attention hazardous in my view).

Pablo @ 2021-10-05T12:03 (+4)

Yes, I actually have a draft prepared, though it's focused on AI, just like the LW article. I'll try to finish it within the next couple of days and you can let me know when I publish it if you think we should expand it to cover other technological races (or have another article on that broader topic).

MichaelA @ 2021-10-05T09:10 (+4)

Survey or Surveys

For posts that: 

  1. discuss results from surveys,
  2. promote surveys, and/or
  3. discuss pros, cons, and best practices for using surveys in general and maybe for specific EA-relevant areas (e.g., how much can we learn about technology timelines from surveys on that topic? how best can we collect and interpret that info?). 

I care more about the first and third of those things, but it seems like in practice the tag would be used for the second. I guess we could discourage that, but it doesn't seem important.

"Survey" seems more appropriate for the first and second of those things, while "Surveys" seems more appropriate for the third.

Pablo @ 2021-10-05T12:07 (+2)

Yeah, makes sense. There's some overlap with Data, but my sense is that having this other entry is still justified. I don't have a preference for plural vs. singular.

MichaelA @ 2021-10-06T09:34 (+2)

Ok, now created.

MichaelA @ 2021-10-02T10:11 (+4)

Coaching or Coaching & therapy or something like that

Basically I think it'd be useful to have a way to collect all posts relevant to coaching and/or therapy as ways to increase people's lifetime impact - so as meta interventions/cause areas, rather than as candidates for the best way to directly improve global wellbeing (or whatever). So this would include things like Lynette Bye's work but exclude things like Canopie.

In my experience, it tends to make sense to think of coaching and therapy together in this context, as many people offer both services, the boundaries between these concepts/services seem fuzzy, and many of the relevant considerations seem similar. But it could make sense to have two tags.

Examples of posts that would be covered by a tag covering coaching:

Examples of posts that would be covered by a tag covering therapy:

Pablo @ 2021-10-02T15:41 (+2)

Yes, makes a lot of sense. Not sure why we don't have such a tag already.

Weak preference for coaching over coaching & therapy.

MichaelA @ 2021-10-05T09:05 (+4)

Ok, now created, with coaching as the name for now

MichaelA @ 2021-09-10T14:44 (+4)

Management/mentoring, or just one of those terms, or People management, or something like that

This tag could be applied to many posts currently tagged Org strategy, Scalably using labour, Operations, research training programs, Constraints in effective altruism, WANBAM, and effective altruism hiring. But this topic seems sufficiently distinct from those topics and sufficiently important to warrant its own entry.

Pablo @ 2021-09-10T16:45 (+2)

Sounds good. I haven't reviewed the relevant posts, so I don't have a clear sense of whether "management" or "mentoring" is a better choice; the latter seems preferable other things equal, since "management" is quite a vague term, but this is only one consideration. In principle, I could see a case for having two separate entries, depending on how many relevant posts there are and how much they differ. I would suggest that you go ahead and do what makes most sense to you, since you seem to have already looked at this material and probably have better intuitions. Otherwise I can take a closer look myself in the coming days.

MichaelA @ 2021-09-11T11:52 (+2)

Ok, I've now made this, for now going with just one entry called Management & mentoring, but flagging on the Discussion page that that could be changed later. 

Aaron Gertler @ 2021-08-22T20:37 (+4)

We've now redirected almost all of EA Concepts to Wiki entries. A few of the remaining concepts (e.g. "beliefs") don't seem like good wiki entries here, so we won't touch them.

However, there are a couple of entries I think could be good tags, or good additions to existing tags:

  1. Charity recommendations
  2. Focus area recommendations

It seems good to have wiki entries that contain links to a bunch of lists of charity and/or focus area recommendations. Maybe these are worked into tags like "Donation Choice"/"Donation Writeup", or maybe they're separate.

(Wherever the entries end up, they should probably link to GWWC's donation advice page, which is the most thorough I know of.)

Pablo @ 2021-08-23T13:08 (+4)

Charity evaluators, e.g. GiveWell and Animal Charity Evaluators, have Wiki entries with sections listing their current recommendations. One option is to make the charity recommendations entry a pointer to existing Wiki entries that include such sections. Alternatively, we could list the recommendations themselves in this new Wiki entry, perhaps organizing it as a table that shows, for each charity, which charity evaluators recommend it.

Stefan_Schubert @ 2021-08-16T01:05 (+4)

Adjacent communities or something like that is a potential entry/tag (though not very high priority).

Some posts on that theme:

https://forum.effectivealtruism.org/posts/XHHwTu2PCr9CGpLpa/what-is-the-closest-thing-you-know-to-ea-that-isn-t-ea

https://forum.effectivealtruism.org/posts/zA9Hr2xb7HszjtmMx/name-for-the-larger-ea-adjacent-ecosystem 

Pablo @ 2021-08-16T21:46 (+4)

Yeah, how about communities adjacent to effective altruism?

Stefan_Schubert @ 2021-08-16T21:48 (+4)

Sounds good! Thanks.

Pablo @ 2021-08-17T00:02 (+4)

I created a stub. As usual, feel free to revise or expand it.

MichaelA @ 2021-07-17T15:31 (+4)

Update: I've now made this entry.

Requests for proposals or something like that

To cover posts like https://forum.effectivealtruism.org/posts/EEtTQkFKRwLniXkQm/open-philanthropy-is-seeking-proposals-for-outreach-projects 

This would be analogous to the Job listings tags, and sort of the inverse of the Funding requests tag.

This overlaps in some ways with Get involved and Requests (open), but seems like a sufficiently distinct thing that might be sufficiently useful to collect in one place that it's worth having a tag for this.

This could also be an entry that discusses pros, cons, and best practices for Requests for proposals. Related entries include Grantmaking and EA funding. 

MichaelA @ 2021-07-17T14:19 (+4)

Update: I've now made this entry.

Defense in depth

Relevant links/tags:

Seems like a useful concept for risk analysis and mitigation in general.

MichaelA @ 2021-07-17T13:57 (+4)

Update: I've now made this entry.

Semiconductors or Microchips or Integrated circuit or something like that

The main way this is relevant to EA is as a subset of AI governance / AI risk issues, which could push against having an entry just for this.

That said, my understanding is that a bunch of well-informed people see this as a fairly key variable for forecasting AI risks and intervening to reduce those risks, to the point where I'd say an entry seems warranted.

Pablo @ 2021-06-13T14:51 (+4)

Meta: perhaps this entry should be renamed 'Propose and vote on potential entries' or 'Propose and vote on potential tags/Wiki articles'? We generally use the catch-all term 'entries' for what may be described as either a tag or a Wiki article.

MichaelA @ 2021-06-13T15:21 (+2)

Yeah, I considered that a few weeks ago but then (somewhat inexplicably) didn't bother doing it. Thanks for the prod - I have now done it :) 

MichaelA @ 2021-06-12T14:49 (+4)

Update: I've now made this entry

career advising or career advice or career coaching or something like that

We already have career choice. But that's very broad. It seems like it could be useful to have an entry with the more focused scope of things like:

This would be analogous to how we have an entry for donation choice but also entries for grantmaking, intervention evaluation, and charity evaluation.

MichaelA @ 2021-06-12T14:20 (+4)

Charter cities or special economic zones or whatever the best catchall term for those things + seasteading is

From a quick search for "charter cities" on the Forum, I think there aren't many relevant posts, but there are:

Maybe there are other posts that would come up for "special economic zones" or "seasteading".

Maybe this is too niche a topic to warrant its own entry, given that we already have entries like global health and development and economic growth?

Pablo @ 2021-06-12T14:53 (+4)

Yes, definitely. I already had some scattered notes on this. There's also the 80k podcast episode:

Wiblin, Robert & Keiran Harris (2019) The team trying to end poverty by founding well-governed 'charter' cities, 80,000 Hours, March 31.
An interview with Mark Lutter and Tamara Winter from the Charter Cities Institute.

MichaelA @ 2021-06-03T12:45 (+4)

Update: I've now made this entry

Charity evaluation or (probably less good) Charity evaluator

We already have entries for donation choice, intervention evaluation, and cause prioritisation. But charity evaluation is a major component of donation choice for which we lack an entry. This entry could also cover things about charity evaluation orgs like GiveWell, e.g. how useful a role they serve, what the best practices for them are, and whether there should be one for evaluating longtermist charities or AI charities or whatever.

Downside of this name: Really it might be better to speak of "funding opportunity evaluation" or "project evaluation", to capture things that aren't precisely charities. But "charity evaluation" seems like the standard term for this sort of thing, and the entry could just mention that those nearby things also exist and that things about them could get this tag too.

Pablo @ 2021-06-03T15:31 (+4)

I think this should clearly exist.

MichaelA @ 2021-06-03T11:47 (+4)

Update: I've now made this entry.

Effective altruism outreach in schools or High school outreach or something like that

Overlaps with https://forum.effectivealtruism.org/tag/effective-altruism-education , but that entry is broader, and it seems like now there's a decent amount of activity or discussion about high school outreach specifically. E.g.:

Pablo @ 2021-06-03T15:25 (+4)

I'm in favor.

MichaelA @ 2021-05-31T07:33 (+4)

Barriers to effective giving or Psychology of (in)effective giving or something like that

Bibliography

Why aren't people donating more effectively? | Stefan Schubert | EA Global: San Francisco 2018

EA Efficacy and Community Norms with Stefan Schubert [see description for why this is relevant]

[Maybe some other Stefan Schubert stuff]

[Probably some stuff by Lucius Caviola, David Reinstein, and others]

Related entries

cognitive bias | cost-effectiveness | donation choice | diminishing returns | effective giving | market efficiency of philanthropy | rationality | scope neglect | speciesism | temporal discounting

---

Relevant posts:

Pablo @ 2021-05-31T13:36 (+12)

Yeah, I think Psychology of effective giving is probably the best name. Stefan, Lucius and others have published a bunch of stuff on this, which would be good to cover in the article.

Pablo @ 2021-05-31T13:40 (+9)

This is one of many emerging areas of research at the intersection of psychology and effective altruism:

- psychology of effective giving (Caviola et al. 2014; Caviola, Schubert & Nemirow 2020; Burum, Nowak & Hoffman 2020)
- psychology of existential risk (Schubert, Caviola & Faber 2019)
- psychology of speciesism (Caviola 2019; Caviola, Everett & Faber 2019; Caviola & Capraro 2020)
- psychology of utilitarianism (Kahane et al. 2018; Everett & Kahane 2020)

I was thinking of covering all of this research in a general entry on the psychology of effective altruism, but we can also have separate articles for each.

MichaelA @ 2021-06-03T13:01 (+4)

I forgot that there was already an EA Psychology tag, so I've now just renamed that, added some content, and copied this comment of Pablo's on that Discussion page.

(It could still make sense for someone to also create entries on those other topics and/or on moral psychology - I just haven't done so yet.)

Pablo @ 2021-06-03T13:03 (+2)

Great, thanks.

MichaelA @ 2021-05-31T14:59 (+2)

Apparently there's a new review article by Caviola, Schubert, and Greene called "The Psychology of (In)Effective Altruism", which pushes in favour of roughly that as the name. 

I also think that, as you suggest, that can indeed neatly cover "psychology of effective giving" (i.e., that seems a subset of "psychology of effective altruism"), and maybe "psychology of utilitarianism".

But I'm less sure that that neatly covers the other things you list. I.e., the psychology of speciesism and existential risk are relevant to things other than how effective people will be in their altruism. But we can just decide later whether to also have separate entries for those, and if so I do think they should definitely be listed in the Related entries section from the "main entry" on this bundle of topics (and vice versa). 

So I think I currently favour:

  • Having an entry called psychology of (in)effective altruism
    • With psychology of effective altruism as a second-to-top pick
  • Probably not currently having a separate entry for psychology of (in)effective giving
    • But if people think there's enough distinctive stuff to warrant an entry/tag for that, I'm definitely open to it
  • Maybe having separate entries for the other things you mention
Pablo @ 2021-05-31T15:12 (+16)

Psychology of (in)effective altruism is adequate for a paper, where authors can use humor, puns, and other informal devices, but inappropriate for an encyclopedia, which should keep a formal tone.

(To elaborate:  by calling the field of study e.g. the 'psychology of effective giving' one is not confining attention only to the psychology of those who give particularly effectively: 'effective giving' is used to designate a dimension of variation, and the field studies the underlying psychology responsible for causing people to give with varying degrees of effectiveness, ranging from very effectively to very ineffectively. By analogy, the psychology of eating is meant to also study the psychology of people who do not eat, or who eat little. A paper about anorexia may be called "The psychology of (non-)eating", but that's just an informal way of drawing attention to its focus; it's not meant to describe a field of study called "The psychology of (non-)eating", and that's not an appropriate title for an encyclopedia article on such a topic.)

RyanCarey @ 2021-05-31T16:13 (+9)

Yeah, the ultra-pedantic+playful parenthetical is a very academic thing. "Psychology of effective altruism" seems to cover giving/x-risk/speciesism/career choice - i.e. it covers everything we want.

MichaelA @ 2021-05-31T18:35 (+2)

Given the fact you both say this and the upvotes on those comments, I think we should probably indeed go with "psychology of effective giving" rather than "psychology of (in)effective giving".[1]

I still don't think that actually totally covers psychology of speciesism, since speciesism is not just relevant in relation to altruism. Likewise, I wouldn't say the psychology of racism or of sexism are covered by the area "psychology of effective altruism". But I do think the entry on psychology of effective altruism should discuss speciesism and so on, and that if we later have an entry for psychology of speciesism they should link to each other.

[1] But FWIW:

  • I don't naturally interpret the "(in)" device as something like humour, a pun, or an informal device
  • I think "psychology of effective altruism" and "psychology of ineffective altruism" do call to mind two distinct focuses, even if I'd expect each thing to either cover (with less emphasis) or "talk to" work on the other thing
    • Somewhat analogously, areas of psychology that focus on what makes for an especially good life (e.g., humanistic psychology) are meaningfully distinct from those that focus on "dysfunction" (e.g., psychopathology), and I believe new terms were coined primarily to highlight that distinction

But I don't think this matters much, and I'm totally happy for "psychology of effective giving" to be used instead.

MichaelA @ 2021-05-31T21:14 (+2)

(Oh, just popping a thought here before I go to sleep: "moral psychology" is a relevant nearby thing. Possibly it'd be better to have that entry than "psychology of effective altruism"? Or to have both?)

MichaelA @ 2021-05-30T17:36 (+4)

Our World in Data

Some posts where this tag would be particularly relevant:

But from a quick search, it seems like at least 20 posts mention Our World in Data somewhere, and presumably some of them also say enough about it to warrant a tag.

Pablo @ 2021-05-31T13:33 (+4)

Thanks. Coincidentally this was published yesterday. But I haven't done any tagging yet.

MichaelA @ 2021-05-31T14:40 (+4)

Ah, nice. Maybe I searched for the entry shortly before it was published. I've now tagged those 3 posts I mentioned, but haven't checked and tagged other things that come up when you search "Our World in Data".

Pablo @ 2021-05-31T16:15 (+2)

There are lots of hits for 'EA updates'. The three results that I thought deserved to be tagged were precisely the ones you had already identified. I haven't looked at this exhaustively, though, so if you find other relevant articles, feel free to add the tag to those, too.

MichaelA @ 2021-05-23T12:38 (+4)

Some orgs it might be worth making entries about:

Pablo @ 2021-05-23T13:09 (+2)

Thanks, I'm in the process of compiling a master list of EA orgs and creating entries for the missing ones. Would you be interested in looking at the spreadsheet?

MichaelA @ 2021-05-23T13:35 (+2)

Yeah, I'll send you a DM

MichaelA @ 2021-04-25T18:58 (+4)

Epistemic challenge, or The epistemic challenge, or Epistemic challenges, or any of those but with "to longtermism" added

Relevant posts include the following, and presumably many more:

Related entries

MichaelA @ 2021-05-03T13:02 (+2)

Another idea: Long-range forecasting (or some other name covering a similar topic). 

See e.g. https://forum.effectivealtruism.org/posts/s8CwDrFqyeZexRPBP/link-how-feasible-is-long-range-forecasting-open-phil

Related entries: cluelessness | estimation of existential risk | forecasting | longtermism

Given how much the scope of this entry/tag would overlap with the scope of an epistemic challenge to longtermism tag, and how much both would overlap with other entries/tags we already have, I think we should probably only have one or the other. (I could be wrong, though. Maybe we should have both but with one being wiki-only. Or maybe we should have both later on, once the Wiki has a larger set of entries and is perhaps getting more fine-grained.)

Pablo @ 2021-05-03T13:53 (+4)

I agree with having this tag and subsuming epistemic challenge to longtermism under it. We do already have forecasting and AI forecasting, so some further thinking may be needed to avoid overlap.

MichaelA @ 2021-05-03T16:00 (+4)

Ok, I've now made a long-range forecasting tag, and added a note there that it should probably subsume/cover the epistemic challenge to longtermism as well.

And yeah, I'm open to people adjusting things later to reduce how many entries/tags we have on similar topics.

Pablo @ 2021-04-26T11:37 (+2)

Is the "epistemic challenge to longtermism" something like "the problem of cluelessness, as applied to longtermism", or is it something different?

MichaelA @ 2021-04-27T06:57 (+2)

People in EA sometimes use the term "cluelessness" in a way that's pretty much referring to the epistemic challenge or the idea that it's really really hard to predict long-term-future effects. But I'm pretty sure the philosophers writing on this topic mean something more specific and absolute/qualitative, and a natural interpretation of the word is also more absolute ("clueless" implies "has absolutely no clue"). I think cluelessness could be seen as one special case / subset of the broader topic of "it seems really really hard to predict long-term future effects". 

I write about this more here and here.

Here's an excerpt from the first of those links:

Hmm, looking again at Greaves' paper, it seems like it really is the case that the concept of "cluelessness" itself, in the philosophical literature, is meant to be something quite absolute. From Greaves' introduction:

"The cluelessness worry. Assume determinism.[1] Then, for any given (sufficiently precisely described) act A, there is a fact of the matter about which possible world would be realised – what the future course of history would be – if I performed A. Some acts would lead to better consequences (that is, better future histories) than others. Given a pair of alternative actions A1, A2, let us say that

(OB: Criterion of objective c-betterness) A1 is objectively c-better than A2 iff the consequences of A1 are better than those of A2.

It is obvious that we can never be absolutely certain, for any given pair of acts A1, A2, of whether or not A1 is objectively c-better than A2. This in itself would be neither problematic nor surprising: there is very little in life, if anything, of which we can be absolutely certain. Some have argued, however, for the following further claim:

(CWo: Cluelessness Worry regarding objective c-betterness) We can never have even the faintest idea, for any given pair of acts (A1, A2), whether or not A1 is objectively c-better than A2.

This 'cluelessness worry' has at least some more claim to be troubling."

So at least in her account of how other philosophers have used the term, it refers to not having "even the faintest idea" which act is better. This also fits with what "cluelessness" arguably should literally mean (having no clue at all). This seems to me (and I think to Greaves?) quite distinct from the idea that it's very very very* hard to predict which act is better, and thus even whether an act is net positive.

And then Greaves later calls this "simple cluelessness", and introduces the idea of "complex cluelessness" for something even more specific and distinct from the basic idea of things being very very very hard to predict.

Meanwhile, the epistemic challenge is the more quantitative, less absolute, and in my view more useful idea that: 

  • effects probably get harder to predict the further in future they are
  • this might mean we should focus on the near-term if that gradual decrease in our predictive power outweighs the increased scale of the long-term future compared to the nearer-term. 

On that, here's part of the abstract of Tarsney's paper:

Longtermists claim that what we ought to do is mainly determined by how our actions might affect the very long-run future. A natural objection to longtermism is that these effects may be nearly impossible to predict – perhaps so close to impossible that, despite the astronomical importance of the far future, the expected value of our present options is mainly determined by short-term considerations. This paper aims to precisify and evaluate (a version of) this epistemic objection to longtermism. To that end, I develop two simple models for comparing "longtermist" and "short-termist" interventions, incorporating the idea that, as we look further into the future, the effects of any present intervention become progressively harder to predict.

MichaelA @ 2021-04-24T12:51 (+4)

I think there should either be an entry for each of Accident risk, Misuse risk, and Structural risk, or a single entry that covers all three, or something like that.

Maybe these entries should just focus on AI, since that's where the terms were originally used (as far as I'm aware). On the other hand, I think the same concepts also make sense for other large-scale risks from technologies.

If the entries do focus on AI, maybe they should have AI in the name (e.g. AI accident risk or Accident risk from AI), or maybe not.

In this case, the reason I'm posting this here rather than just making it is that I'm not sure exactly what form this entry or set of entries should take; I'm quite confident that something like this should exist.

I think the first place these terms were all used may have been this post: https://www.lawfareblog.com/thinking-about-risks-ai-accidents-misuse-and-structure

They're also used here: https://forum.effectivealtruism.org/posts/42reWndoTEhFqu6T8/ai-governance-opportunity-and-theory-of-impact 

Pablo @ 2021-04-24T13:39 (+2)

There's an accidental harm article, which is meant to cover the risk of causing harm as an unintended effect of trying to do good, as discussed e.g. here. What you describe is somewhat different, since the risk results not so much from "attempts to do good" but from the development of a technology in response to consumer demand (or other factors driving innovation not directly related to altruism). Furthermore, misuse risk can involve deliberate attempts to cause harm, in addition to unintended harm. I guess all of these risks are instances of the broader category of "downside risk", so maybe we can have an article on that?

MichaelA @ 2021-04-24T15:51 (+2)

I think there are indeed overlaps between all these things.

But I do think that the application of these terms to technological risk specifically or AI risk specifically is important enough to warrant its own entry or set of entries. 

Maybe if you feel their distinctive scope is at risk of being unclear, that pushes in favour of sticking with the original AI-focused framing of the concepts, and maybe just mentioning in one place in the entry/entries that the same terms could also be applied to technological risk more broadly? Or maybe it pushes in favour of having a single entry focused on this set of concepts as a whole and the distinctions between them (maybe called Accident, misuse, and structural risks)?

I also wouldn't really want to say misuse risk is an instance of downside risk. One reason is that it may not be downside risk from the misuser's perspective, and another is that downside risk is often/usually used to mean a risk of a downside from something that is or is expected to be good overall. More on this from an older post of mine:

One definition of a downside is ā€œThe negative aspect of something otherwise regarded as good or desirableā€ (Lexico). By extension, I would say that, roughly speaking, a downside risk is a risk (or possibility) that there may be a negative effect of something that is good overall, or that was expected or intended to be good overall.

Also, I think I see "accidental harm" as sufficiently covering standard uses of the term "downside risk" that there's not a need for a separate entry. (Though maybe a redirect would be good?)

MichaelA @ 2021-03-26T06:35 (+4)

EA vs Non-EA Orgs

Proposed tag description:

The EA vs Non-EA Orgs tag is for posts that include arguments for or against pursuing jobs at organisations that explicitly identify with the "effective altruism" label, relative to jobs at other organisations. The tag is also for posts that include discussion of whether and why members of the EA community may be biased in one direction or the other on this question, and how to address that (e.g., how to raise the status - within the EA community - of high-impact work at non-EA orgs). 

This tag is not intended for posts that discuss things like how the effectiveness of EA orgs tends to differ from that of non-EA orgs, except when those posts connect this to the question of where EAs should work.

See also Career Choice, EA Hiring, Criticism (EA Orgs), Criticism (EA Movement). [I'll add the links if I actually make this tag; I'm just being lazy]

Alternative tag names: 

Some posts that would fit this tag:

JP Addison @ 2021-03-26T11:01 (+4)

I like it. Maybe "Working at EA vs Non-EA Orgs?"

MichaelA @ 2021-03-26T22:49 (+2)

Cool, done.

I think that that name is clearer, but thought brevity was substantially preferred for tag names. But I'm personally more inclined towards clarity than brevity here, so I'll use your suggested name. Someone can change it later anyway.

MichaelA @ 2021-03-26T06:14 (+4)

Scalably Using People or Scalably Using Labour or Task Y or something like that

Proposed description:

There are often discussions of how good or bad EA is at efficiently allocating many people to valuable work, the consequences of EA's strengths or weaknesses on that front, how to improve, and how this all might change as EA grows. 

See also Career Choice, Get Involved, Movement Strategy, Community Tools, Criticism (EA Movement), EA Hiring, and Markets for Altruism. 

Notes on that description:

Posts that would warrant this tag include:

JP Addison @ 2021-03-26T11:00 (+4)

I'm pro. I'd call it Task Y, though I wouldn't be surprised if there was a reason not to.

MichaelA @ 2021-03-26T23:13 (+2)

Cool, given that, I've now made the tag. I've called it Scalably Using People rather than Task Y, with the key reason being that Alex originally described Task Y as being a single task. More generally, I think that the description of Task Y wouldn't neatly cover things like the vetting-constrained discussion or Jan's discussion of hierarchical network structures, and I'm hoping for this tag to cover things like that as well. So I see Task Y as a subset of what I'm hoping this tag will cover.

I'm definitely open to people suggesting alternative names, though. 

MichaelA @ 2021-03-25T05:16 (+4)

Industrial Revolution

We already have a variety of related tags, like History, Economic Growth, and Persistence of Political/Cultural Variables. But the Industrial Revolution does seem like perhaps the single most notable episode of history for many/all EA cause areas, so maybe we should have a tag just for it?

Some posts that would warrant the tag:

JP Addison @ 2021-03-25T10:35 (+2)

My guess is that a better tag would be "History of Economic Growth". Because I can't picture a case where someone wants to find things about the industrial revolution but not all of economic growth. (Unless they're doing a specific research project, but that sounds pretty niche.)

But even still, I'd tentatively lean towards economic growth being enough. But I think that depends on how fine-grained our tagging system should be, which I don't have a strong opinion on.

MichaelA @ 2021-03-25T23:43 (+2)

This seems reasonable. I was also unsure about my suggestion, hence popping it here rather than making it. I'll hold off for now, at least.

MichaelA @ 2021-03-25T05:13 (+4)

Cultural Evolution

One relevant post: https://forum.effectivealtruism.org/posts/7QiXR2dv8KL4fkf9D/notes-on-henrich-s-the-weirdest-people-in-the-world-2020

I haven't searched my memory or the Forum for other relevant posts yet.

This would overlap somewhat with the tags for Memetics and Persistence of Political/Cultural Variables.

JP Addison @ 2021-03-25T10:36 (+2)

I'm in favor.

MichaelA @ 2021-03-26T01:08 (+2)

Cool - done.

MichaelA @ 2021-03-16T02:11 (+4)

Update: I've now made this tag.

Persistence of Political/Cultural Variables (or Cultural Persistence, or Cultural, Political, and Moral Persistence, or something like that)

First pass at a description: 

It's often important to have a sense of the persistence of political/cultural variables - such as democracy, authoritarianism, concern for human rights, a concern for animal welfare, or norms conducive to scientific progress or free markets. This can inform our predictions of what the future will be like and our views on the importance of changing those variables.

See also cluelessness, hinge of history, memetics, value drift (which focuses on individuals rather than societies), and academic work on cultural persistence (e.g. Giuliano and Nunn, 2016; but see also Kelly, 2019).

Posts that would warrant this tag include:

Aaron Gertler @ 2021-03-16T07:41 (+4)

Seems reasonable to me. Want to go ahead and create it?

MichaelA @ 2021-03-16T08:23 (+2)

Done!

Larks @ 2021-03-01T14:44 (+4)

One consideration I just thought of, which I do not recall seeing mentioned elsewhere, is that the optimal number of tags depends somewhat on the typical tag use case.

MichaelA @ 2021-03-01T23:31 (+2)

Good points.

Maybe the ideal for future will be to have hierarchies/categories of Forum tags? LessWrong now does this (though I haven't looked at their system in detail).

MichaelA @ 2020-11-26T04:46 (+4)

Update: I've now made this tag.

Fellowships or EA-Aligned Fellowships or Research Fellowships or something like that

Stefan Schubert writes:

It could be good if someone wrote an overview of the growing number of fellowships and scholarships in EA (and maybe also other forms of professional EA work). It could include the kind of info given above, and maybe draw inspiration from Larks' overviews of the AI Alignment landscape. I don't think I have seen anything quite like that, but please correct me if I'm wrong.

Maybe this would be partially addressed via a tag for posts about these things. I imagine that could be useful for people who are considering running or participating in such a fellowship, or people who definitely will and want to get some insights into how best to do so.

I think the sort of thing that'd obviously be covered are the Summer Research Fellowships offered by FHI and CLR, and the Early Career Conference Programme offered by GPI. 

I'm not sure whether this tag should also include: 

I'm therefore also not sure what the ideal name and description would be.

MichaelA @ 2020-11-22T01:23 (+4)

(Update: I've now made this tag.)

Institutions for Future Generations

This is arguably a subset of Institutional Decision-Making and/or Policy Change. It also overlaps with Longtermism (Philosophy) and Moral Advocacy / Values Spreading. But it seems like this is an important category that various people might want to learn about in particular (i.e., not just as part of learning about institutional decision-making more broadly), and like there are many EA Forum posts about this in particular.

MichaelA @ 2020-11-22T01:15 (+4)

(Update: I've now made this tag.)

China (or maybe something broader like BRICS or Rising Powers)

Rough proposed description:

The China tag is for posts that are about China, that address how China is relevant to various issues EAs care about, or that are relevant to how one could have an impact by engaging with China.

See also Global Outreach and International Relations.

It seems perhaps odd to single China out for a tag while not having tags for e.g. USA, Southeast Asia, ASEAN, United Nations, Middle Powers. But we do have a tag for posts relevant to the European Union. And China does seem like a particularly important topic, and one that it makes sense to have a specific tag for. And maybe we should indeed have tags for United Nations and Middle Powers.

I'd be interested in thoughts on whether BRICS, Rising Powers, or something else would be a better label/scope for this tag than China.

vaidehi_agarwalla @ 2020-11-04T02:04 (+4)

Update: I've created the tag "Discussion Norms"

Community Norms/Discussion Norms

Very Bad Description: Posts that discuss norms on how EAs interact with each other. 

Posts this tag could apply to: 

Issues with existing tags: 

MichaelA @ 2020-11-04T02:51 (+2)

My quick, personal take is that:

  1. A tag for Discussion Norms seems useful and distinct from the other tags you mention. It also wouldn't have to only be about discussion norms for intra-EA interactions - it could also be about discussion norms in other contexts. 
  2. "Community Norms" and "Posts that discuss norms on how EAs interact with each other" feel very broad to me, and it's harder for me to see precisely what that's trying to point at that isn't captured by one of the first three other tags you mention. 
  3. But I have a feeling that something like Community Norms/Discussion Norms could have a clear scope that's useful and distinct from the other tags. Maybe if you just try to flesh out what you mean a little more in the description it'd be clear to me?
  4. Maybe what you have in mind will often relate to things like being welcoming, supportive, and considerate? If so, maybe adjusting the tag label or description in light of that could help?
vaidehi_agarwalla @ 2020-11-23T23:45 (+2)

I think Discussion Norms makes sense!

Discussion Norms: Posts about suggested or encouraged norms within the EA community on how to interact with other EAs, which may often relate to being supportive, welcoming and considerate. 

It's still not great; if you have any feedback I'd be keen to hear it!

JP Addison @ 2020-10-31T17:00 (+4)

Anyone have thoughts on this tag? I'm skeptical, but might be more inclined if I saw more applications that were good. Also if it had a description that described its naturalness as a category in the EA-sphere. (If this were a business forum it would obviously be good, and maybe it is in this Forum – I'm not sure.)

MichaelA @ 2020-10-31T17:34 (+2)

My quick take is that it does seem like it at least needs a description that explains why it warrants an EA Forum tag. I'd wonder, for example, whether it's meant to just be about scaling organisations (e.g., EA orgs), or also about scaling things like bednet distribution programs. (Or maybe those two things are super similar anyway?) 

MichaelStJules @ 2020-09-13T03:50 (+4)

Do we need both Longtermism (Philosophy) and Long-Term Future?

MichaelA @ 2020-09-13T07:22 (+4)

Personally, I think those two tags have sufficiently large and separate scopes for it to make sense for the forum to have both tags. (I didn't create either tag, by the way.) 

But the Longtermism (Philosophy) tag has perhaps been used too liberally, including for posts that should've only been given tags like Long-Term Future or Existential Risk. Perhaps this is because the Longtermism (Philosophy) tag was around before Long-Term Future was created (not sure if that's true), and/or because the first two sentences of the Longtermism (Philosophy) tag didn't explicitly indicate that its scope was limited to philosophical aspects of longtermism only. Inspired by your comment, I've now edited the tag description to hopefully help a bit with that. 

The tag description used to be:

Longtermism is the idea that we can maximize our impact by working to ensure that the long-run future goes well (because it may contain an enormous number of people whose lives we may be able to improve).

This is a relatively new idea, and people in the EA movement currently work on a wide range of open questions related to different facets of longtermism. 

This tag is meant for discussion of longtermist philosophy, rather than specific longtermist cause areas (there are other tags for those, like Existential Risk).

The tag description is now:

The Longtermism (Philosophy) tag is for posts about philosophical matters relevant to longtermism, meaning, roughly, "an ethical view that is particularly concerned with ensuring long-run outcomes go well" (MacAskill). Longtermism is a relatively new idea, and people in the EA movement currently work on a wide range of open questions related to different facets of longtermism. 

For posts about what the long-term future might actually look like, see Long-Term Future. For posts about specific longtermist cause areas, see other tags such as Existential Risk.

(The second sentence could perhaps be cut.)

For comparison, the tag description of Long-Term Future is:

The Long-Term Future tag is meant for discussion of what the long-term future might actually look like. This doesn't necessarily overlap with the Longtermism (Philosophy) tag, because a post attempting to e.g. model the future of space travel won't necessarily discuss the philosophical implications of its model.

MichaelA @ 2020-08-29T11:44 (+4)

(Update: I've now made this tag.)

Cooperation & Coordination or [just one of those terms] or Moral Trade

(I think I lean towards the first option and away from Moral Trade.)

Proposed description: 

The Cooperation & Coordination tag is for posts about whether, when, and how people - especially effective altruists and others aiming to do good - should cooperate and coordinate. Such posts will often draw on ideas related to game theory, moral trade, moral uncertainty, and how to think about and measure counterfactual impact. 

See also Movement Strategy, Moral Advocacy / Values Spreading, and Epistemic Humility.

Some posts this would cover:

Arguments against this:

Alternative idea:

JP Addison @ 2020-08-30T08:00 (+4)

I lightly think both is better than either one on its own.

MichaelA @ 2020-08-30T09:15 (+2)

Ok, I've now made this tag and used the name that includes both terms :)

MichaelA @ 2020-08-25T07:51 (+4)

Maybe some of the existing tags related to politics & policy should be deleted, and a tag for Politics & Policy should replace them?

Some relevant tags that might be on the chopping block: Improving Institutional Decision-Making, Policy Change, Political Polarisation,  International Relations, Direct Democracy, and Global Governance.

I think I'm moderately against this idea, as I think the sub-topics are large/important enough to warrant their own tags, even if there's a lot of overlap. But I thought I'd throw this idea out there anyway. 

evelynciara @ 2021-07-26T04:51 (+2)

I like this idea, sort of. I think we should create a politics and policy "mega-tag" (the tags that show up in white, like Existential risk) while keeping the others as sub-tags.

Pablo @ 2021-07-26T12:04 (+4)

What do you think about the policy change entry? One option is to rename it to just policy and use it as the "mega-tag" you propose.

evelynciara @ 2021-07-26T16:36 (+2)

That's a good idea.

MichaelA @ 2020-08-25T07:52 (+1)

(If you hate the above idea but also hate disrupting my delicious karma, feel free to downvote that comment and upvote this one to keep the universe in order.

Or vice versa, I guess, if you're a maverick.)

MichaelA @ 2020-08-23T17:17 (+4)

(Update: I've now made this tag.)

Improving Institutional Decision-Making (or similar)

Argument against:

Arguments for:

The post that prompted this, because it's clearly relevant to IIDM but doesn't seem very relevant to Policy Change: Should We Prioritize Long-Term Existential Risk?

MichaelA @ 2020-08-20T13:54 (+4)

(Update: I've now made this tag, with the name Epistemic Humility and a description noting it can be about other, broadly related things as well.)

Social Epistemology & Epistemic Humility or [just one of those terms] or [some other label]

Some posts that might fit this tag:

JP Addison @ 2020-08-20T14:22 (+2)

I really like Social Epistemology except for the crucial flaw that I haven't heard it called that before. Without the ability for people to recognize it, I think it's worse than Epistemic Humility. (Normally I'd prefer the more general term, rather than a term for one strategy within the space.)

MichaelA @ 2020-08-20T15:11 (+2)

I haven't heard it called that before

Do you mean you haven't heard the term social epistemology, or that you haven't heard epistemic humility specifically (or debates around that) referred to by the term social epistemology?

I'd envision this tag including not just things like "How epistemically humble should we be, and how should we update given other people's statements/beliefs?", but also things like when we should give just our conclusions vs also our reasoning if we're concerned about information cascades, and to what extent publicly stating explicit estimates will cause anchoring by others. Those things could arguably be seen as about epistemic humility in that they're about how to communicate given how other people might handle epistemic humility, but saying they're about social epistemology (or something else) seems more natural to me. 

(That said, I think I'm only familiar with the term social epistemology from how it's occasionally used by EAs, and the Wikipedia article's lead section makes me uncertain if they're using the term in the standard way.)

Maybe the best tag label would be Epistemic Humility & Social Epistemology, to put the term that's more common in EA first? That's a longer label than average, though.

FWIW, both my suggestion of this tag and my suggestion of the term social epistemology for it were prompted by the following part of Owen Cotton-Barratt's recent post:

Learning can be much more efficient if we allow the transmission of heuristics between people, but if you don't require people to have any grounding in their own experience or cases they've directly heard about, it's possible for heuristics to be propagated without regard for whether they're still useful, or if the underlying circumstances have changed enough that they shouldn't be applied. Navigating this tension is an interesting problem in social epistemology.

JP Addison @ 2020-08-20T15:10 (+2)

I have now read the post that contains Social Epistemology.

I also wasn't clear before, but I was biasing towards one shorter label or another.

Max_Daniel @ 2020-08-13T10:05 (+4)

Global priorities research and macrostrategy.

I wanted to use these tags when asking this question, but they don't seem to exist.

There is a tag on cause prioritization. But I think it'd be more useful if that tag was focused on content that is directly relevant for prioritizing between causes, e.g. "here is why I think cause A is more tractable than cause B" or "here's a framework for assessing the neglectedness of a cause". Some global priorities or macrostrategy research has this property, but not all of it. E.g. I think it'd be a bit of a stretch to apply the cause prioritization label to this (amazing!) post on Quantifying anthropic effects on the Fermi paradox.

MichaelA @ 2020-12-14T01:52 (+4)

I've now made a tag for Global Priorities Research. I currently think that anything we would've wanted to give a Macrostrategy tag to can just be given a Global Priorities Research tag instead, such that we don't need a Macrostrategy tag, but feel free to discuss that in the "Discussion" page attached to the GPR tag.

MichaelA @ 2020-08-14T01:08 (+2)

I'm tentatively in favour of Macrostrategy. A big issue is that I don't have a crisp sense of what macrostrategy is meant to be about, and conversations I've had suggest that a lot of people who work on it feel the same. So I'd have a hard time deciding what to give that tag to. But I do think it's a useful concept, and the example post you mention does seem to me a good example of something that is macrostrategy and isn't cause prioritisation.

I feel like a tag for Global Priorities Research is probably unnecessary once we have tags for both Cause Prioritisation and Macrostrategy? But I could be wrong. (Also I'm just offering my views as inputs; I have no gate-keeping role and anyone can make whatever tags they want.)

MichaelA @ 2020-08-12T00:12 (+4)

(Update: I've now made this tag.)

Moral Uncertainty

Argument against:

Argument for:

JP Addison @ 2020-08-12T08:08 (+4)

I'd be in favor.

JP Addison @ 2020-08-12T11:39 (+2)

What's the intended difference between Meta-Ethics and Moral Philosophy?

MichaelA @ 2020-08-13T08:17 (+2)

As I understand it, ethics is often split into the branches meta-ethics, normative ethics, and applied ethics. I'm guessing the Moral Philosophy tag is meant to cover all of those branches, or maybe just the latter two. Meta-Ethics would just cover questions about "the nature, scope, and meaning of moral judgment" (Wikipedia).

So some questions that wouldn't fit in Meta-Ethics, but would fit in Moral Philosophy, include:

  • Should we be deontologists or consequentialists?
  • What should be considered intrinsically valuable (e.g., suffering, pleasure, preference satisfaction, achievement, etc.)?
  • What beings should be in our moral circles?

Whereas Meta-Ethics could include posts on things like arguments for moral realism vs moral antirealism. (I'm not sure whether those posts should also go in Moral Philosophy.)

MichaelA @ 2020-08-08T06:32 (+4)

I noticed there's no Consciousness tag, so I was going to create one, but then I saw the Sentience tag. Perhaps that should be renamed "Sentience / Consciousness", and/or its description should be tweaked to mention consciousness?

(I'm putting this here so it can be up- or down-voted to inform whether this change should be made. I think the tag pages will later have the equivalent of Wikipedia's "Talk" pages, at which point I'd put comments like this there instead.) 

(Update: This got 2 upvotes, and continues to seem to me like a good idea, so I updated the name and description of this tag accordingly.)

Aaron Gertler @ 2020-08-07T07:17 (+4)

I've edited this post to include our official mandate at the top. Thanks for creating it, MichaelA!

MichaelA @ 2020-08-05T00:17 (+4)

[Any 80,000 problem areas and career paths - or the additional problem areas and career ideas they mention - that are not directly covered by existing tags]

I haven't yet looked through these problem areas and career paths/ideas with this in mind, to see what's not covered by existing tags and what the arguments for and against creating new tags for these things would be. 

(Feel free to comment yourself with specific tag ideas drawn from the 80k problem areas and career paths, or the additional ones they mention.)

MichaelA @ 2020-08-25T07:44 (+8)

(Update: I've now made this tag.)

Nanotechnology or Atomically Precise Manufacturing

Arguments against:

Arguments for:

  • Not super niche
  • 80k highlight this as a potentially important area (though it's not one of their top priorities)
  • The small set of (maybe-not-trustworthy) estimates we have suggests nanotech/APM is decently likely to be among the top 10 largest existential risks we know of (given usual ways of classifying things), and perhaps smaller only than AI and bio

MichaelA @ 2021-06-12T07:08 (+2)

Update: I've now made this entry

Grantmaking

Overlaps with EA funding and probably some other entries. But that entry is quite broad, and this entry could also cover things like how useful grantmaking is, how to test fit for it, best practices for grantmaking, etc. (which I'm not sure fit perfectly in EA funding).

Would overlap with vetting constraints if we make that entry (I proposed it elsewhere on this post).

Pablo @ 2021-06-12T12:45 (+2)

I think this entry makes sense. Maybe effective altruism funding should be made more precise, but that's a separate issue.

MichaelA @ 2021-05-16T10:29 (+2)

Just want to note that:

  • I still think it'd probably be good for someone to go through 80k articles and see which of the topics covered warrant a Forum wiki entry
    • In many cases, the entry's name and scope might differ a little from the 80k one. E.g. we might want to go with academia and think tanks rather than 80k's academic research and think tank research
  • I now realise that, while doing that, it'd also be cool if the person could add 80k links in the Bibliography / Further reading sections of relevant entries
    • E.g., I just added a link to an 80k article from our Academia entry

MichaelA @ 2021-05-16T10:27 (+2)

(Update: I've now made this tag.)

Think tanks

Could draw on and link to this 80k article: https://80000hours.org/career-reviews/think-tank-research/

MichaelA @ 2020-08-08T06:50 (+2)

(Update: I've now made this tag.)

Space (or maybe Space Governance, or Space Governance & Colonisation, or something along those lines)

"Governance of outer space" is mentioned by 80k here.

Would perhaps just be a subset of Long-Term Future. But perhaps a sufficiently large and important subset to warrant its own tag.

Some posts this should include:

MichaelA @ 2020-08-04T23:57 (+4)

(Update: I've now created this tag.)

Meta-Ethics

Argument against: This is arguably a subset of the tag Moral Philosophy.

Arguments for: This seems like an important subset, which there are several Forum posts about, and which some people might appreciate a specific tag about (e.g., if they're beginning to grapple with meta-ethics and are less focused on moral philosophy as a whole right now).

Some posts this should include:

MichaelA @ 2021-05-05T16:33 (+3)

It might be worth going through the Effective Altruism Hub's resource collections and the old attempts to build EA Wikis (e.g., the Cause Prioritization wiki), to:

I assume some of this has been done already, but someone doing it thoroughly seems worthwhile.

cafelow @ 2021-05-06T23:18 (+5)

Thanks Michael! 
I manage the EA Hub Resources, but much of the content has been slowly getting outdated. 

I think the best action will be to incorporate the content in the Learn and Take Action sections of the EA Hub Resources into the EA Forum wiki, and redirect Hub visitors to the wiki. I'm unlikely to have the time to do this soon, so I would be delighted if someone else was keen to do this. Get in touch if you are keen to do this and I can assist + set up redirects when ready! Message me through the forum private messaging.

The rest of the resources are designed for EA group organisers and my current plan is to keep this outside of the wiki (but I'm happy for folks to try to change my mind!). I plan to move this content onto a new website in the next few months as the EA Hub team have decided to narrow their focus to the community directories.

Pablo @ 2021-05-05T16:45 (+2)

I did this systematically for all the relevant wikis I was aware of, back when I started working on this project in mid 2020. Of course, it's likely that I have missed some relevant entries or references.

MichaelA @ 2021-05-05T16:49 (+2)

Ah, nice. What about for the EA Hub stuff?

E.g., they've got a bunch of stuff on how to talk about EA, running EA-related events, and movement-building. And also curated collections for cause areas. And I don't think I've seen those things linked to from tag pages?

Pablo @ 2021-05-06T10:28 (+4)

What about for the EA Hub stuff?

I actually wasn't aware of their resources section (EA Hub has changed a lot over the years and I haven't stayed abreast of the latest changes). They used to have a wiki, which I did review, though some pages were not indexed by the Internet Archive. I wonder if they have migrated their old wiki content to the new resources page. In any case, I've made a note to investigate this further.

cafelow @ 2021-05-06T23:24 (+5)

Hey Pablo! 
You are right that the wiki is long dead. The current resources section was written independently from the wiki.
As I just commented up the thread, with the new EA Forum wiki (which is wonderful!), I think the content on the EA Hub intended for all EAs should be merged into the wiki, and then I can retire those pages and set up redirects. More than happy to chat more about this!

Pablo @ 2021-05-06T23:41 (+2)

Thanks for your message! Can you email me at stafforini.com preceded by MyName@, or share an email address where I can reach you?

(EDIT: We have now contacted each other.)

MichaelA @ 2021-05-07T06:56 (+2)

Great that you two have connected!

In the other thread, Catherine says:

The rest of the resources are designed for EA group organisers and my current plan is to keep this outside of the wiki (but I'm happy for folks to try to change my mind!).

Yeah, I don't think the EA Forum Wiki needs to eat everything else - other options include:

  • Just include a link in Further reading or Bibliography to the external collection of resources
    • See e.g. the link to my own collection of resources from here
  • Look through the collection, give the appropriate tag to the Forum posts that are in that collection, and maybe include links to some other specific things in the Further reading or Bibliography section

MichaelA @ 2021-05-06T12:16 (+2)

Sounds good!

MichaelA @ 2021-05-02T09:32 (+3)

Mind uploads, or Whole brain emulation, or maybe Digital minds

I think that:

But I could be wrong about either of those things.

Further reading

Age of Em

https://www.fhi.ox.ac.uk/brain-emulation-roadmap-report.pdf

Related entries

artificial sentience | consciousness research | intelligence and neuroscience | long-term future | moral patienthood | non-humans and the long-term future | number of future people

Pablo @ 2021-05-02T12:36 (+4)

Definitely. I already was planning to have an entry on whole brain emulation and have some notes on it... wait, I now see the tag already exists. Mmh, it seems we missed it because it was "wiki only". Anyway, I've removed the restriction now. Feel free to paste the 'further reading' and 'related entries' sections (otherwise I'll do it myself; I just didn't want to take credit for your work).

MichaelA @ 2021-05-02T14:43 (+4)

Cool, I've now added those related entries and the "roadmap" report (Age of Em was already cited). 

MichaelA @ 2021-05-02T08:11 (+3)

Non-longtermist arguments for GCR reduction, or Non-longtermist arguments for prioritising x-risks, or similar but with "reasons" instead of arguments, or some other name like that

The main arguments I have in mind are the non-longtermist 4 of the 5 arguments Toby Ord mentions in The Precipice, focusing on the past, the present, civilizational virtues, and cosmic significance.

Ideally, the entry would cover both (a) such arguments and (b) reasons why those arguments might be much weaker than the longtermist arguments and thus might not by themselves justify placing a strong focus on GCR/x-risk reduction (at least if reducing such risks seems much less tractable or neglected than other things).

Examples of posts that this would cover:

I'm not sure whether the label should focus on GCRs or x-risks. I think the present- and civilizational-virtue-focused arguments apply to GCR reduction, but the past- and cosmic-significance-focused arguments probably don't.

Pablo @ 2021-05-02T12:29 (+2)

I think this would be a very useful article to have. It seems challenging to find a name for it, though. How about short-termist existential risk prioritization? I am not entirely satisfied with it, but I cannot think of other alternatives I like more. Another option, inspired by the second of your proposals, is short-termist arguments for prioritising existential risk. I think I prefer 'risk prioritization' over 'arguments for prioritizing' because the former allows for discussion of all relevant arguments, not just arguments in favor of prioritizing.

MichaelA @ 2021-05-02T14:39 (+2)

Hmm, I don't really like "short-termist" (or "near-termist"), since that only seems to cover what Ord calls the "present"-focused "moral foundation" for focusing on x-risks, rather than also the past, civilizational virtue, or cosmic significance perspectives. 

Relatedly, "short-termist" seems like it implies we're still assuming a broadly utilitarianian-ish perspective but just not being longtermist, whereas I think it'd be good if these tags could cover more deontological and virtue-focused perspectives. (You could have deontological and virtue-focused perspectives that prioritise x-risk in a way that ultimately comes down to effects on the near-term, but not all such perspectives would be like that.)

Some more ideas: 

  • Existential risk prioritization for non-longtermists
  • Alternative perspectives on existential risk prioritization
    • I don't really like tag names that say "alternative" in a way that just assumes everyone will know what they're alternative to, but I'm throwing the idea out there anyway, and we do have some other tags with names like that

Pablo @ 2021-05-02T14:54 (+2)

The reasons for caring about x-risk that Toby mentions are relevant from many moral perspectives, but I think we shouldn't cover them on the EA Wiki, which should be focused on reasons that are relevant from an EA perspective. Effective altruism is focused on finding the best ways to benefit others (understood as moral patients), and by "short-termist" I mean views that restrict the class of "others" to moral patients currently alive, or whose lives won't be in the distant future. So I think short-termist + long-termist arguments exhaust the arguments relevant from an EA perspective, and therefore think that all the arguments we should cover in an article about non-longtermist arguments  are short-termist arguments.

Pablo @ 2021-05-02T15:28 (+2)

It's not immediately obvious that the EA Wiki should focus solely on considerations relevant from an EA perspective. But after thinking about this for quite some time, I think that's the approach we should take, in part because providing a distillation of those considerations is one of the ways in which the EA Wiki could provide value relative to other reference works, especially on topics that already receive at least some attention in non-EA circles.

MichaelA @ 2021-05-02T16:08 (+2)

Hmm. I think I agree with the principle that "the EA Wiki should focus solely on considerations relevant from an EA perspective", but have a broader notion of what considerations are relevant from an EA perspective. (It also seems to me that the Wiki is already operating with a broader notion of that than you seem to be suggesting, given that e.g. we have an entry for deontology.)

I think the three core reasons I have this view are:

  1. effective altruism is actually a big fuzzy bundle of a bunch of overlapping things
  2. we should be morally uncertain
  3. in order to do good from "an EA perspective", it's in practice often very useful to understand different perspectives other people hold and communicate with those people in terms of those perspectives

On 1 and 2:

  • I think "Effective altruism is focused on finding the best ways to benefit others (understood as moral patients)" is an overly strong statement.
    • Effective altruism could be understood as a community of people or as a set of ideas, and either way there are many different ways one could reasonably draw the boundaries.
    • One definition that seems good to me is this one from MacAskill (2019):
      • "Effective altruism is: (i) the use of evidence and careful reasoning to work out how to maximize the good with a given unit of resources, tentatively understanding ā€˜the goodā€™ in impartial welfarist terms, and (ii) the use of the findings from (i) to try to improve the world. [...]
      • The definition is: [...] Tentatively impartial and welfarist. As a tentative hypothesis or a first approximation, doing good is about promoting wellbeing, with everyoneā€™s wellbeing counting equally." (emphasis added, and formatting tweaked)
  • I think we should be quite morally uncertain.
    • And many seemingly smart and well-informed people have given non-welfarist or even non-consequentialist perspectives a lot of weight (see e.g. the PhilPapers survey).
    • And I myself see some force in arguments or intuitions for non-welfarist or even non-consequentialist perspectives.
    • So I think we should see at least consideration of non-welfarist and non-consequentialist perspectives as something that could make sense as part of the project to "use evidence and reason to do the most good possible".
  • Empirically, I think the above views are shared by many other people in EA
    • Including two of the main founders of the movement
      • MacAskill wrote a thesis and book on moral uncertainty (though I don't know his precise stance on giving weight to non-consequentialist views)
      • Ord included discussion of the previously mentioned 5 perspectives in his book, and has given indication that he genuinely sees some force in the ones other than present and future
      • These views also seem in line with the "long reflection" idea that both of those people see as quite important
        • For long-reflection-related reasons, I'd actually be quite concerned about the idea that we should, at this stage of (in my view) massive ignorance, totally confidently commit to the ideas of consequentialism and welfarism
        • Though one could support the idea of the long reflection while being certain about consequentialism and welfarism
    • Also, Beckstead seemed open to non-consequentialism in a recent talk at the SERI conference
    • Relatedly, I think many effective altruists put nontrivial weight on the idea that they should abide by certain deontological constraints/duties, and not simply because that might be a good decision procedure for implementing utilitarianism in practice
      • Maybe the same is true in relation to virtue ethics, but I'm not sure
      • I think the same is at least somewhat true with regards to the "past"-focused moral foundation Ord mentions
        • I find that framing emotionally resonant, but I don't give it much weight
        • Jaan Tallinn seemed to indicate putting some weight on that framing in a recent FLI podcast episode (search "ancestors")

On 3:

  • EA represents/has a tiny minority of all the people, money, political power, etc. in the world
    • The other people can block our actions, counter their effects, provide us support, become inspired to join us, etc.
    • How much of each of those things happen will have a huge influence on the amount of good we're ultimately able to do
  • One implication is that what other people are thinking and why is very decision-relevant for us
    • Just as many other features of the world that doesn't adopt an EA mindset (e.g., the European Union) could still be decision-relevant enough to warrant an entry
    • One could see speciesism as a more extreme version of this; that's of course not in line with an impartial welfarist mindset, but impartial welfarists may be more effective if they know about speciesism
  • Another implication is that being able to talk to people in ways that connect to their own values, epistemologies, etc. (or show them resources that do this, e.g. parts of the Precipice) can be very valuable for advocacy purposes

Pablo @ 2021-05-02T18:49 (+4)

I'll respond quickly because I'm pressed for time.

  1. I don't think EA is fuzzy to the degree you seem to imply. I think the core of EA is something like what I described, which corresponds to the Wikipedia definition (a definition which is itself an effort to capture the common features of the many definitions that have been proposed).
  2. I don't understand your point about moral uncertainty. You mention the fact that Will wrote a book about moral uncertainty, or the fact that Beckstead is open to non-consequentialism, as relevant in this context, but I don't see their relevance. EA, in the sense captured by the above Wikipedia definition, is not committed to welfarism, consequentialism, or any other moral view. (Will uses the term 'welfarism', but I don't think he is using it in a moral sense, since he states explicitly that his definition is non-normative.) (ADDED: there is one type of moral uncertainty that is relevant for EA, namely uncertainty about population axiology, because it concerns the class of beings whom EA is committed to helping, at least if we interpret 'others' in "helping others effectively" as "whichever beings count morally". Relatedly, uncertainty about what counts as a person's wellbeing is also relevant, at least if we interpret 'helping' in "helping others effectively" as "improving their wellbeing". So it would be incorrect to say that EA has no moral commitments; still, it is not committed to any particular moral theory.)
  3. I agree it often makes sense to frame our concerns in terms of reasons that make sense to our target audience, but I don't see that as the role of the EA Wiki. Instead, as noted above, one key way in which the EA Wiki can add value is by articulating the distinctively EA perspective on the topic of interest. If I consult a Christian encyclopedia, or a libertarian encyclopedia, I want the entries to describe the reasons Christians and libertarians have for holding the views that they do, rather than the reasons they expect to be most persuasive to their readers.

MichaelA @ 2021-05-02T19:35 (+6)

I think you make some good points, and that my earlier comment was a bit off. But I still basically think it should be fine for the EA Wiki to include articles on how moral perspectives different from the main ones in EA intersect with EA issues. 

---

I think the core of EA is something like what I described, which corresponds to the Wikipedia definition (a definition which is itself an effort to capture the common features of the many definitions that have been proposed).

Yeah, I think the core of EA is something like what you described, but also that EA is fuzzy and includes a bunch of things outside that core. I think the "core" of EA, as I see it, also doesn't include anti-ageing work, and maybe doesn't include a concern for suffering subroutines, but the Wiki covers those things and I think that it's good that it does so.

(I do think a notable difference between that and the other moral perspectives is that one could arrive at those focus areas while having a focus on "helping others". But my basic point here is that the core of EA isn't the whole of EA and isn't all that EA Wiki should cover.)

Going back to "the EA Wiki should focus solely on considerations relevant from an EA perspective", I think that that's a good principle but that those considerations aren't limited to "the core of EA".

---

My understanding of EA, captured in the above Wikipedia definition, is not committed to welfarism, consequentialism, or any other moral view.

Was the word "not" meant to be in there? Or did you mean to say the opposite?

If the "not" is intended, then this seems to clash with you saying that discussion from an EA perspective would omit moral perspectives focused on the past, civilizational virtue, or cosmic significance? If discussion from an EA perspective would omit those things, then that implies that the EA perspective is committed to some set of moral moral views that excludes those things. 

Maybe you're just saying that EA could be open to certain non-consequentialist views, but not so open that it includes those 3 things from Ord's book? (Btw, I do now recognise that I made a mistake in my previous comment - I wrote as if "helping others" meant the focus must be welfarist and impartial, which is incorrect.)

---

I think moral uncertainty is relevant inasmuch as a big part of the spirit of EA is trying to do good, whatever that turns out to mean. And I think we aren't in a position to rule out perspectives that don't even focus on "helping others", including virtue-ethical perspectives or cosmic significance perspectives. 

I don't think I'd want the cosmic significance thing to get its own wiki entry, but it seems fair for it to be something like 1 of 4 perspectives that a single entry covers, and in reality emphasised much less in that entry than another of those 4 (the present-focused perspective), especially if that entry is applying these perspectives to a topic many EAs care about anyway.

---

Your point 3 sounds right to me. I think I should retract the "advocacy"-focused part of my previous comment. 

But I think the "understanding these other actors" part still seems to me like a good reason to include entries on things along the lines of moral views that might be pretty foreign to EA (e.g., speciesism or the 3 not-really-helping-others perspectives Ord mentions).

---

Also, I just checked the 2019 EA survey, and apparently 70% of respondents identified with "consequentialism (utilitarian)", but 30% didn't, including some people identifying with virtue ethics or deontology. But I'm not sure how relevant that is, given that they might have flavours of virtue ethics or deontology that are still quite distinct from the related perspectives Ord mentions.

---

(Apologies if the amount I've written gave a vibe of me trying to batter you into giving up or something - it's more just that it'd take me longer to be concise.)

Pablo @ 2021-05-02T19:54 (+4)

(Typing from my phone; apologies for any typos.)

Thanks for the reply. There are a bunch of interesting questions I'd like to discuss more in the future, but for the purposes of making a decision on the issue that triggered this thread, on reflection I think it would be valuable to have a discussion of the arguments you describe. The reason I believe this is that existential risk is such a core topic within EA that an article on the different arguments that have been proposed for mitigating these risks is of interest even from a purely sociological or historical perspective. So even if we may not agree on the definition of EA, or on the relevance of moral uncertainty or other issues, luckily that doesn't turn out to be an obstacle to agreeing on this particular issue.

Perhaps the article should be simply called arguments for existential risk prioritization and cover all the relevant arguments, including longtermist arguments, and we could in addition have a longer discussion of the latter in a separate article, though I don't have strong views on this. (As it happens, I have a document briefly describing about 10 such arguments that I wrote many years ago, which I could send if you are interested. I probably won't be able to work on the article within the next few weeks, though I think I will have time to contribute later.)

MichaelA @ 2021-05-03T08:39 (+4)

Ok, I've gone ahead and made the tag, currently with the name Moral perspectives on existential risk reduction. I'm still unsure what the ideal scope and name would be, and have left a long comment on the Discussion page, so we can continue adjusting that later.

Pablo @ 2021-05-03T11:20 (+2)

Great, I like the name.

MichaelA @ 2020-11-30T01:25 (+3)

Longtermism (Cause Area)

We have various tags relevant to longtermism or specific things that longtermists are often interested in (e.g., Existential Risk). But we don't have a tag for longtermism as a whole. Longtermism (Philosophy) and Long-Term Future don't fit that bill; the former is just for "posts about philosophical matters relevant to longtermism", and the latter is "meant for discussion of what the long-term future might actually look like".

One example of a post that's relevant to longtermism as a cause area but that doesn't seem to neatly fit in any of the existing longtermism-related tags is Should marginal longtermist donations support fundamental or intervention research? An analogous post that was focused on global health & dev or mental health could be given the tags that cover those cause areas, and one focused on animal welfare could be given the Farm Animal Welfare and Wild Animal Welfare tags (which seem to me to together fill the role of a tag for that whole cause area).

EdoArad @ 2020-11-30T08:27 (+2)

Agreed. Perhaps Longtermism (Philosophy) is redundant because it could be Longtermism (Cause Area) + Moral Philosophy - if so, I'd suggest changing the name instead of opening a new tag

MichaelA @ 2020-12-01T00:30 (+4)

Hmm, I think I'd agree that most things which fit in both Longtermism (Cause Area) and Moral Philosophy would fit Longtermism (Philosophy). (Though there might be exceptions. E.g., I'm not sure stuff to do with moral patienthood/status/circles would be an ideal fit for Longtermism (Philosophy) - it's relevant to longtermism, but not uniquely or especially relevant to longtermism. But those things tie in to potential longtermist interventions.)

But now that you mention that, I realise that there might not be a good way to find and share posts at the intersection of two tags (which would mean that tags which are theoretically redundant are currently still practically useful). I've just sent the EA Forum team the following message about this:

[...]

I think the way one would currently [find and share posts at the intersection of two tags] is going to the frontpage, selecting two tags to filter by, and choosing +25 or required.

But when I do that for Longtermism (Philosophy) and Existential Risk at the same time (as a test), at Required no posts come up at all. But I expect there are many relevant posts with both tags, and I know at least "Crucial questions for longtermists" has both tags.

And when I do that at +25, I think what I get is just the regular frontpage. Or at least it's all pretty recent posts, most with neither of those tags.

Also, it'd be cool to be able to filter from a second tag from a tag page. E.g., to be on https://forum.effectivealtruism.org/tag/longtermism-philosophy , and then filter by another tag, like one could on the frontpage.

Finally, I think it'd be cool to have the filtering used come up in the url, so I can share a url with someone to direct them right to the intersection of two tags.

Currently I send multiple people multiple tag pages after EA conferences (and sometimes at other times), as they serve as handy collections of posts relevant to what the people expressed interest in. It'd be cool to be able to do the equivalent thing for intersections of tags as well.

Just some suggestions - not sure how easy or high-priority to implement they should be :)

So I'll hold off on making a Longtermism (Cause Area) tag or converting the Longtermism (Philosophy) tag into that until I hear back from the Forum team, and/or think more or get more input on what the best approach here would be.

EdoArad @ 2020-12-01T05:55 (+2)

šŸ‘

TrenchFloat @ 2020-10-03T18:56 (+3)

Change My View!

I found r/ChangeMyView recently and I think it's the bee's knees. "A place to post an opinion you accept may be flawed, in an effort to understand other perspectives on the issue."

There are already a good number of questions and posts inviting criticism on this forum, and this tag could organize them all for the people who enjoy a good, clean disagreement/discussion. It could be used especially (or only) for ideas with <50% certainty.

The subreddit itself is a cool place to go, but many issues are more fruitfully discussed among fellow EAs, or would just work better on the EA Forum.


I'm happy to learn if Change My View is actually not a good format for discussion - I just found out about it, so no harm done.

MichaelA @ 2020-08-31T06:48 (+3)

(Update: I've now made this tag.)

Law

Some posts this could cover:

Arguments for:

Arguments against:

MichaelA @ 2020-08-29T06:01 (+3)

Economics

The Economics tag would be for posts focusing on topics in the domain of economics, making particularly heavy use of concepts or tools from economics, or highlighting ways for people with economics backgrounds to do good.

Some posts that would fit:

Arguments against this tag:

Arguments for:

evelynciara @ 2021-01-23T03:19 (+1)

Yeah, this would work. A general econ tag could focus on values other than economic growth, including equity and preventing hyperinflation.

evelynciara @ 2020-08-19T01:43 (+3)

How about a tag for global governance and/or providing global public goods? This is arguably one of the most pressing problems there is, because many of the problems EA works on are global coordination problems, including existential risk (since existential security is a global public good).

MichaelA @ 2020-08-19T12:51 (+3)

I'd agree that a tag for Global Governance would be good (thanks for suggesting it!). This could cover things like: 

  • how much various moves towards more global governance would help with existential risks and other global and/or transgenerational public goods issues
  • how much various moves towards more global governance could increase risks of totalitarianism
  • how to best implement or prevent various moves towards global governance
  • etc.

Personally, I don't see much value in a tag for something like providing global public goods. This is partly because that matter is common to so many different EA issues. Relatedly, I don't think many posts are especially focused on global public goods provision, relative to a huge portion of other posts. But that's just my tentative two cents.

If no one suggests otherwise or does it themselves, I'll probably create a Global Governance tag in a couple days.

MichaelA @ 2020-08-25T07:25 (+2)

Update: I've now made this tag.

Timothy_Liptrot @ 2020-08-15T18:33 (+3)

Please separate global development from global health.

Global health is one part of global development, which can include political, economic and humanitarian interventions. I write on politics in developing countries, but I'm probably the only one on the forum so I don't need my own tag.

Pablo @ 2021-04-27T12:19 (+2)

I agree this should be separated. I've made a note to split the articles (and rearrange the content/tags accordingly).

MichaelA @ 2020-08-12T09:13 (+3)

Cluelessness

Arguments against:

Arguments for:

MichaelA @ 2022-03-24T08:42 (+2)

Something like Crisis response 

Posts that would get this tag:

Update: Someone else seemingly independently created a tag with basically the same scope: https://forum.effectivealtruism.org/tag/emergency-response-teams/ 

MichaelA @ 2021-12-26T08:08 (+2)

READI Research

https://www.readiresearch.org/ 

My guess is that this org/collective/group doesn't (yet) meet the EA Wiki's implicit notability or number-of-posts-that-would-be-tagged standards, but I'm not confident about that. 

Here are some posts that would be given this tag if the tag was worth making:

MichaelA @ 2021-12-26T07:58 (+2)

Tags for some local groups / university groups

I'd guess it would in theory be worth having tags for EA Cambridge and maybe some other uni/local groups like EA Oxford or Stanford EA. I have in mind groups that are especially "notable" in terms of level and impact of their activities and whether their activities are distinct/novel and potentially worth replicating. E.g., EA Cambridge's seminar programs seem to me like an innovation other groups should perhaps consider adopting a version of, and with more confidence they seem like a good example of a certain kind of innovative thinking in local group organising. 

But I think it would probably be bad if this opened the floodgates to huge numbers of tags for local/uni groups. 

So I'm not sure if the best move is to have no such tags, use our best but somewhat "conservative"/"deletionist" judgement on which such tags to create, or try to come up with criteria that are more objective than my comments above and then follow that. 

(Also note that I'm no expert on local/university EA groups, my comments above are tentative, and my knowledge of which groups are doing which seemingly cool things is somewhat haphazardly arrived at.)

EDIT:

MichaelA @ 2021-10-05T09:05 (+2)

Diplomacy

Might overlap too much with things like international relations and international organizations?

Would partly be about diplomacy as a career path.

Pablo @ 2021-10-05T12:10 (+2)

Probably worth it, if there are enough relevant posts and/or if there's discussion here or elsewhere about diplomacy as a career path. 

evelynciara @ 2021-07-26T04:49 (+2)

Open society

The ideal of an open society - a society with high levels of democracy and openness - is related to many EA causes and policy goals. For example, open societies are associated with long-run economic growth, and an open society is conducive to the "long reflection." This tag could host discussion about the value of open societies, the meaning of openness, and how to protect and expand open societies.

Pablo @ 2021-07-26T12:00 (+4)

I agree that the concept of an open society as you characterize it has a clear connection to EA. My sense is that the term is commonly used to describe something more specific, closely linked to the ideas of Karl Popper and the foundations of George Soros (Popper's "disciple"), in which case the argument for adding a Wiki entry would weaken. Is my sense correct? I quickly checked the Wikipedia article, which broadly confirmed my impression, but I haven't done any other research.

Stefan_Schubert @ 2021-07-26T18:42 (+4)

Yes, I think your sense is correct.

evelynciara @ 2021-07-28T15:23 (+2)

Yeah, maybe something broader like "democracy" or "liberal democracy." Perhaps we could rename the "direct democracy" tag to "democracy"?

Aaron Gertler @ 2021-07-28T21:21 (+6)

The direct democracy tag is meant for investments in creating specific kinds of change through the democratic process. But people are using it for other things now anyway -- probably it's good to have a "ballot initiatives" tag and rename this tag to "democracy" or something else. Good catch!

Pablo @ 2021-07-29T15:03 (+6)

Here's what I did:

  • I renamed direct democracy to ballot initiative.
  • I added two new entries: democracy and safeguarding liberal democracy. The first covers any posts related to democracy, while the second covers specifically posts about safeguarding liberal democracy as a potentially high-impact intervention.

I still need to do some tagging and add content to the new entries.

Pablo @ 2021-07-28T22:54 (+2)

I agree. I'll deal with this tomorrow (Thursday), unless anyone wants to take care of it.

MichaelA @ 2021-07-26T10:24 (+2)

I do see this concept as relevant to various EA issues for the reasons you've described, and I think high-quality content covering "the value of open societies, the meaning of openness, and how to protect and expand open societies" would be valuable. But I can't immediately recall any Forum posts that do cover those topics explicitly. Do you know of posts that would warrant this tag?

If there aren't yet posts that'd warrant this tag, then we have at least the following (not mutually exclusive) options:

  1. This tag could be made later, once there are such posts
  2. You could write a post of those topics yourself
  3. An entry on those topics could be made
    • It's ok to have entries that don't have tagged posts
    • But it might be a bit odd for someone other than Pablo to jump to making an entry on a topic as one of the first pieces of EA writing on that topic?
      • Since wikis are meant to do things more like distilling existing work.
      • But I'm not sure.
      • This is related to the question of to what extent we should avoid "original research" on the EA Wiki, in the way Wikipedia avoids it
  4. Some other entry/tag could be made to cover similar ground
MichaelA @ 2021-06-26T12:57 (+2)

Update: I've now made this entry.

Alternative foods or resilient foods or something like that

A paragraph explaining what I mean (from Baum et al., 2016):

nuclear war, volcanic eruptions, and asteroid impact events can block sunlight, causing abrupt global cooling. In extreme but entirely possible cases, these events could make agriculture infeasible worldwide for several years, creating a food supply catastrophe of historic proportions. This paper describes alternative foods that use non-solar energy inputs as a solution for these catastrophes. For example, trees can be used to grow mushrooms; natural gas can feed certain edible bacteria. Alternative foods are already in production today, but would need to be dramatically scaled up to become the primary food source during a global food supply catastrophe

This is what ALLFED focus on, but other research or implementation work has been done on this topic separately from ALLFED (e.g.), and I currently think the topic is sufficiently plausibly important to get its own entry+tag rather than just being seen as just "that thing ALLFED does". 

Considerations regarding what to call this:

Pablo @ 2021-06-26T14:59 (+4)

I'm in favor. Very weak preference for alternative foods until resilient foods becomes at least somewhat standard.

Pablo @ 2021-06-25T20:26 (+2)

I now feel that a number of unresolved issues related to the Wiki ultimately derive from the fact that tags and encyclopedia articles should not both be created in accordance with the same criterion. Specifically, it seems to me that a topic that is suitable for a tag is sometimes too specific to be a suitable topic for an article.

I wonder if this problem could be solved, or at least reduced, by allowing article section headings to also serve as tags. I think this would probably be most helpful for articles that cover particular disciplines, such as psychology or computer science. Here it seems that it makes most sense to have a single article covering each discipline, yet multiple tags discussing different aspects of the discipline, such as research on that discipline, careers in that discipline, or applications of that discipline. Currently we take a hybrid approach, sometimes having entries for the discipline as a whole and sometimes for specific aspects of it.

Another advantage of allowing article sections to be used as tags is that some tags are currently associated with a very large number of posts. This suggests that a more fine-grained taxonomy of tags would organize the contents of Forum better, and allow users to find the material they want more easily.

A complication is that not all section headings will be suitable for tags. This issue could be solved in various ways. For example, the search field that opens when the user clicks on 'Add tag' could by default only show the tags corresponding to article titles, just as it does currently. However, the user could be given the choice of expanding the tag to display the corresponding headings, and be allowed to select among any of these. Perhaps headings already selected as tags by previous users could be shown by default in future searches.

I'm not particularly confident that this is a good idea. But it does seem like something at least worth discussing further.

Aaron Gertler @ 2021-07-04T22:31 (+2)

These are reasonable concerns, but adding hundreds of additional tags and applying them across relevant posts seems like it will take a lot of time.

As a way to save time and reduce the need for new tags, how many of your use cases do you think would be covered if multi-tag filtering was supported? That is, someone could search for posts with both the "psychology" and "career choice" tags and see posts about careers in psychology. This lets people create their own "fine-grained taxonomy" without so many tags needing to have a bunch of sub-tags.

MichaelA @ 2021-06-26T09:14 (+2)

I think something along these lines feels promising, but I feel a bit unsure precisely what you have in mind. In particular, how will users find all posts tagged with an article section heading tag? Would there still be a page for (say) social psychology like there is for psychology, and then it's just clear somehow that this page is a subsidiary tag of a larger tag?

Inspired by that question, I think maybe a more promising variant (or maybe it's what you already had in mind) is for some article section headings to be hyperlinked to a page whose title is the other page's section heading and whose contents is that section from the other page, below which is shown all the tags with that section heading tag. Then if a user edits the section or the "section's own page", the edit automatically occurs in the other place as well. 

And from "the section's own page" there's something at the top that makes it clear that this entry is a subsidiary entry of a larger entry and people can click through to get back to the larger one. Maybe the "something at the top" would look vaguely like the headers of posts that are in sequences? Maybe then you could even, like with sequences, click an arrow to the right or left to go to the page corresponding to the previous or following section of the overarching entry?

Stepping back, this seems like just one example of a way we could move towards more explicitly having a nested hierarchy of entries where the different layers are in some ways linked together. I imagine there are other ways to do that too, though I haven't brainstormed any yet.

Pablo @ 2021-06-13T14:49 (+2)

I am considering turning a bunch of relevant lists into Wiki entries. Wikipedia allows for lists of this sort (see e.g. the list of utilitarians) and some (e.g. Julia Wise) have remarked that they find lists quite useful. The idea occurred to me after a friend suggested a few courses I may want to add to my list of effective altruism syllabi. It now seems to me that the Wiki might be a better place to collect this sort of information than some random blog. Thoughts?

MichaelA @ 2021-06-13T15:14 (+2)

Quick thoughts:

  • I think more lists/collections would be good
  • I think it's better if they're accessible via the Forum search function than if they're elsewhere
  • I think it's probably better if they're EA wiki entries than EA Forum posts or shortforms because that makes it easier for them to be collaboratively built up
    • And this seems more important for and appropriate to a list than an average post
      • Posts are often much more like a particular author's perspective, so editing beyond copyediting would typically be a bit odd (that said, a function for making suggestions could be cool - but that's tangential to the main topic here)
  • I don't think I see any other advantage of these lists being wiki entries rather than posts or shortforms
  • I think the only disadvantages of them being wiki entries are that then we might have too many random or messy lists that have an air of official-ness or that the original list creator gets less credit for their contributions (their name isn't attached to the list)
    • But the former disadvantage can apply to entries in general and so we already need sufficient policies, other editors, etc. to solve it, so doesn't seem a big deal for lists specifically
    • And the latter disadvantage can also apply to entries in general and so will hopefully be partially solved by things like edit counters, edit karma, "badges", or the like
  • So overall this seems worth doing

Less important:

  • Various "collections" on my own shortform might be worth making into such entries
    • Though I think actually most of them are better fits for the bibliography pages of existing entries
      • (And ~ a month ago I added a link to those collections, or to all relevant items from the collections, to the associated entries that existed at the time)
MichaelA @ 2021-06-05T16:34 (+2)

Something like regulation

Intended to capture discussion of the Brussels effect, the California effect, and other ways regulation could be used for or affect things EAs care about.

Would overlap substantially with the entries on policy change and the European Union, as well as some other entries, but could perhaps be worth having anyway.

MichaelA @ 2021-06-05T10:08 (+2)

Update: I've now made this entry.

software engineering

Some relevant posts:

Related entries

artificial intelligence | public interest technology | SparkWave

[Though I think there was a discussion about how often we should include org tags in Related entries, and I can't remember what was said, so maybe SparkWave should be excluded.]

Pablo @ 2021-06-08T15:13 (+2)

Looks good to me.

MichaelA @ 2021-06-05T09:25 (+2)

Vetting constraints

Maybe this wouldn't add sufficient value to be worth having, given that we already have scalably using labour and talent vs. funding constraints.

Pablo @ 2021-06-12T12:48 (+4)

I think there should definitely be a place for discussing vetting constraints. My only uncertainty is whether this should be done in a separate article and, if so, whether talent vs. funding constraints should be split. Conditional on having an article on vetting constraints, it looks to me that we should also have articles on talent constraints and funding constraints. Alternatively, we could have a single article discussing all of these constraints.

MichaelA @ 2021-06-12T13:36 (+2)

I think I agree that we should either have three separate entries or one entry covering all three. I'm not sure which of those I lean towards, but maybe very weakly towards the latter?

MichaelA @ 2021-06-12T14:42 (+4)

Just discovered Vaidehi made a collection of discussions of constraints in EA, which could be helpful for populating whatever entries get created and maybe for deciding on scopes etc.

Pablo @ 2021-06-12T21:52 (+14)

Mmh, upon looking at Vaidehi's list more closely, it now seems to me that we should have a single article: people have proposed various other constraints besides the three mentioned, and I don't think it would make sense to have separate articles for each of these, or to have an additional article for "other constraints". So I propose renaming talent vs. funding constraints to constraints in effective altruism. Thoughts?

MichaelA @ 2021-06-13T06:41 (+6)

I think that that probably makes sense.

Pablo @ 2021-06-15T14:51 (+4)

Done. (Though I used the name constraints on effective altruism, which seemed more accurate. I don't have strong views on whether the preposition should be 'in' or 'on', however, so feel free to change it.)

The article should be substantially revised (it was imported from EA Concepts), I think, but at least its scope is now better defined.

Pablo @ 2021-06-12T14:55 (+4)

Great. Let's have three articles then. Feel free to split the existing one, otherwise I'll do that tomorrow. [I know you like this kind of framing. ;) ]

Stefan_Schubert @ 2021-06-05T10:13 (+4)

Vetting constraints dovetails nicely with talent vs. funding constraints. I'm not totally convinced by the scalably using labour entry, though. One possibility would be to just replace it by a vetting constraints entry. Alternatively, it could be retained but renamed/reconceptualised.

Pablo @ 2021-06-08T14:26 (+2)

Yeah, scalably using labor just doesn't strike me as a natural topic for a Wiki entry, though I'm not sure exactly why. Maybe it's because it looks like the topic was generated by considering an interesting question - "how should the EA community allocate its talent?" - and creating an entry around it, rather than by focusing on an existing field or concept.

I'd be weakly in favor of merging it with vetting constraints.

MichaelA @ 2021-06-08T15:36 (+2)

I'm currently in favour of keeping scalably using labour, though I also made the entry so this shouldn't be much of an update (it's not like a "second vote", just a repeat of the first vote after hearing the new arguments). 

One consideration I'd add is that maybe it's a more natural topic for a tag than a wiki entry? It seems to me like having a tag for posts relevant to a (sufficiently) interesting and recurring question makes sense?

Stefan_Schubert @ 2021-06-08T16:03 (+4)

Fwiw, I think that "scalably using labour" doesn't sound quite like a wiki entry. I find virtually no article titles including the term "using" on Wikipedia.

If one wants to retain the concept, I think that "Large-scale use of labour" or something similar would be better. There are many Wikipedia article titles including the term "use of [noun]". (Potentially nouns are generally better than verbs in Wikipedia article titles? Not sure.)

MichaelA @ 2021-05-30T14:44 (+2)

Intelligence assessment or Intelligence (military and strategy) or Intelligence agencies or Intelligence community or Intelligence or something

I don't really like any of those specific names. The first is what Wikipedia uses, but sounds 100% like it means IQ tests and similar. The second is my attempt to put a disambiguation in the name itself. The third and fourth are both too narrow, really - I'd want the entry to not just be about the agencies or community but also about the type of activity they undertake. The fifth is clearly even more ambiguous than the first, but is also the term that's most commonly used, I think - it's usually just clear from context, which doesn't work as well for an entry/tag.

MichaelA @ 2021-05-29T10:39 (+2)

(Edit: I've now made this entry.)

Independent research

Proposed text:

Independent research is research conducted by an individual who is not employed by any organisation or institution, or who is employed but is conducting this research separately from that. This person may or may not have funding for this research (e.g., via grants). Research that is done by two or more people collaborating, but still separate from an organisation or institution, could arguably be considered independent research.

There are various advantages and disadvantages of independent research relative to research conducted as part of an organisation or institution. [Ideally someone will in future edit the text to say more about what those pros and cons actually are.]

This tag is intended for posts relevant to things like tips for doing independent research or advantages and disadvantages of independent research, not just for posts that are examples of independent research.

Related entries

EA funding | org strategy | research methods | research training programs | scalably using labour 

Uncertainties about that text:

Examples of posts that would warrant this tag:

MichaelA @ 2021-05-29T11:16 (+2)

Some text from the latest LTFF report that could be drawn on when discussing advantages and disadvantages within this entry:

The largest pitfall in my mind for long-term independent researchers is one's research becoming detached from the actual concerns of a field and thereby producing negligible value. Tegan seems to have avoided this pitfall so far, thanks to her research judgment and understanding of the relevant areas, and I see no evidence that she's headed towards it in the future.

Another potential pitfall of independent research is a general lack of feedback loops, both for specific research projects and for the individual's research skills. One way that independent researchers may be able to produce stronger feedback loops for their work is by sharing more intermediate work. While Tegan has shared (and received feedback from) some senior longtermist researchers on some of her intermediate work, I think she would probably benefit from sharing intermediate work more broadly, such as on the EA forum.

Finally, independent research can struggle to get as much traction as other work (keeping quality constant), as it's less likely to be connected to organizations or networks where it will naturally be passed around. My sense is that Tegan's research hasn't gotten as much attention as it "deserves" given its level of quality, and that many who would find value in the research aren't aware of it. Fixing such a dynamic generally requires a more active promotion strategy from the researcher. Again, I think posting more intermediate work could help here, as it would create more instances where others see the work, learn about what the researcher is working on, and perhaps even offer feedback.

Pablo @ 2021-05-29T13:46 (+4)

Looks good, thanks!

MichaelA @ 2021-05-29T09:44 (+2)

Edit: I've now made this entry.

Longtermist Entrepreneurship Fellowship

I think this is only mentioned in three Forum posts so far[1], and I'm not sure how many (if any) would be added in future. 

It's also mentioned in this short Open Phil page: https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-jade-leung

I'm also not sure if the name is fully settled - different links seem to use different names, or to not even use a capitalised name.

[1] https://forum.effectivealtruism.org/posts/diZWNmLRgcbuwmYn4/long-term-future-fund-may-2021-grant-recommendations#Longtermist_Entrepreneurship_Fellowship

https://forum.effectivealtruism.org/posts/6x2MjPXhpPpnatJFQ/some-promising-career-ideas-beyond-80-000-hours-priority#Nonprofit_entrepreneurship

https://forum.effectivealtruism.org/posts/SppupBEiPCAYA5nLW/cea-s-2020-annual-review#Internal

Pablo @ 2021-05-29T13:56 (+4)

I'm in favor, though there's so little public information at this stage that inevitably the entry won't have any substantive content for the time being.

MichaelA @ 2021-05-27T08:11 (+2)

Weapons of mass destruction

Related entries

anthropogenic existential risk | armed conflict | biosecurity | global governance | Nuclear Threat Initiative | nuclear warfare | peace and conflict studies | terrorism

Pablo @ 2021-05-27T11:35 (+4)

Looks good.

MichaelA @ 2021-05-27T12:18 (+4)

Cool - given that, I've now made this (though without adding body text or tagging things, for time reasons). 

MichaelA @ 2021-04-26T19:47 (+2)

Make entries for many of the concepts featured on Conceptually

I read the content on that site in 2019 and found it useful. I haven't looked through what concepts are on there to see which ones we already have and which ones might be worth adding, but I expect it'd be useful for someone to do so. So I'm noting it here in case someone else can do that (that'd be my preferred outcome!), or to remind myself to do it in a while if I have time. 

Pablo @ 2021-05-02T12:42 (+2)

I like Conceptually, and during my early research I went through their list of concepts one by one, to decide which should be covered by the EA Wiki, though I may have missed some relevant entries. Thoughts on which ones we should include that aren't already articles or listed in our list of projected entries?

MichaelA @ 2021-04-23T07:10 (+2)

Update: I've now made this entry.

Fermi estimation or Fermi estimates

Overlaps with some other things in the Decision Theory and Rationality cluster of the Tags Portal.

Pablo @ 2021-04-26T11:39 (+4)

I agree that this should be added. I weakly prefer 'Fermi estimation'.

MichaelA @ 2021-04-21T14:27 (+2)

Demandingness objection

I'd guess there are at least a few Forum posts quite relevant to this, and having a place to collect them seems nice, but I could be wrong about either of those points.

Pablo @ 2021-04-21T14:32 (+4)

I agree it's relevant. But we already have an article: demandingness of morality.

(It's likely you haven't seen it because many of these articles were Wiki-only until very recently.)

MichaelA @ 2021-04-21T14:41 (+2)

Yeah, I just spotted that and the fact I had a new notification at the same time, and hoped it was anything other than a reply here so I could delete my shamefully redundant suggestion before anyone spotted it :D

(I think what happened is that I used command+f on the tags portal before the page had properly loaded, or something.)

MichaelA @ 2021-04-20T11:53 (+2)

Antimicrobial resistance or Antibiotic resistance

Not sure enough EAs care about this and/or have written about this on the Forum for it to warrant an entry/tag?

(I don't personally have much interest in this topic, but I'm just one person.)

MichaelA @ 2021-06-24T16:00 (+2)

A couple relevant posts I stumbled upon:

MichaelA @ 2021-04-20T07:44 (+2)

Update: I've now made this tag.

Something like Bayesianism

Arguments against having this entry/tag:

Pablo @ 2021-04-20T10:57 (+2)

Yeah, perhaps name it Bayesian reasoning or Bayesian epistemology?

MichaelA @ 2021-04-19T15:15 (+2)

Cognitive biases/Cognitive bias, and/or entries for various specific cognitive biases (e.g. Scope neglect)

I feel unsure whether we should aim to have just a handful of entries for large categories of biases, vs one entry for each of the most relevant biases (even if this means having 5+ or 10+ entries of this type)

Pablo @ 2021-04-26T11:43 (+4)

My sense is that it would be desirable to have both an overview article about cognitive bias, discussing the phenomenon in general (e.g. the degree to which humans can overcome cognitive bias, the debate over how desirable it is to overcome them, etc.) as well as articles about specific instances of it.

MichaelA @ 2021-04-26T19:26 (+2)

I think you mean it'd be desirable to have both a general article on cognitive bias and one article each for various specific instances of it?

Rather than having just one general article that covers both the topic as a whole and specific instances of it?

Given my assumed interpretation of what you meant, I've now made an entry for Cognitive biases and another for Scope neglect. People could later add more, or delete some, or whatever.

(I've now copied the content of this thread to the Discussion page on the Cognitive biases entry. If you or others would like to reply, please do so there.)

MichaelA @ 2021-04-18T09:01 (+2)

Update: I've now made this entry.

Instrumental vs. epistemic rationality

Some brief discussion here.

These terms may be used basically only in the LessWrong community, and may not be prominent or useful enough to warrant an entry here. Not sure.

Pablo @ 2021-04-26T11:43 (+2)

I think this would be useful to have.

MichaelA @ 2021-04-18T09:00 (+2)

Metaethical uncertainty and/or Metanormative uncertainty

These concepts are explained here.

I think it's probably best to instead have an entry on "Normative uncertainty" in general that has sections for each of those concepts, as well as sections that briefly describe (regular) Moral uncertainty and Decision-theoretic uncertainty and link to the existing tags on those concepts. (Also, the entry on Moral uncertainty could discuss the question of how to behave when uncertain what approach to moral uncertainty is best, which is metanormative uncertainty.) This is because I think there are relatively few posts specifically on metaethical and metanormative uncertainty, and some of the posts that do exist are also relevant to other types of normative uncertainty in a broad sense.

But it's possible that "Normative uncertainty" is best defined as uncertainty just about regular normative ethics, such that it shouldn't be seen as covering metaethical and metanormative uncertainty. And it's also possible that, in any case, those concepts are important enough to warrant their own entries.

MichaelA @ 2021-04-18T08:55 (+2)

Subjective vs. objective normativity

See here and here.

MichaelA @ 2021-04-17T15:32 (+2)

Update: I've now made this entry.

Disentanglement research

Defined here: https://forum.effectivealtruism.org/posts/RCvetzfDnBNFX7pLH/personal-thoughts-on-careers-in-ai-policy-and-strategy

Off the top of my head, I'm not sure how many posts would get this tag. But I know at least that one would, and I'd guess we'd find several more if we looked.

And in any case, this seems to be a useful concept that's frequently invoked in the EA community, so having a short wiki entry on it might be good (even ignoring tagging).

Related entries:

https://forum.effectivealtruism.org/tag/scalably-involving-people
https://forum.effectivealtruism.org/tag/research-methods

ETA: I've just seen this post: How would you define "disentanglement research"? The existence of that post updates me towards slightly more confidence that this entry would be worth having. And the content in that post could be useful for this entry.

MichaelA @ 2021-04-17T15:56 (+4)

Another suggestion: Research distillation or Research debt or similar

We could have:

  1. an entry for this and another for disentanglement research (with links between them)
  2. one entry covering both
  3. one entry that's mainly on one topic but briefly mentions/links to the other
  4. neither

What I have in mind is what's discussed here: https://distill.pub/2017/research-debt/

Off the top of my head, I'm not sure how many posts would get this tag. But maybe some would?

And in any case, this seems to me to be a useful concept that's sometimes invoked in the EA community, so having a short wiki entry on it might be good (even ignoring tagging). But I'm less confident I've heard this mentioned a lot in EA than I am with disentanglement research.

This is obviously very similar to the idea of a research summary. But I think that these terms and the Distill article add some value. And the research summary tag is currently only for research summaries, not for discussion of the value of or best practices for distilling research or making summaries.

Related entries:

https://forum.effectivealtruism.org/tag/scalably-involving-people
https://forum.effectivealtruism.org/tag/research-methods

MichaelA @ 2021-04-17T07:54 (+2)

Tag portal question/suggestion:

Many tags are probably relevant for more than one of the categories/clusters used on the tag portal. For example, Economic growth is currently listed under global health & development, but it's also relevant to Long-Term Risk and Flourishing and to Economics & Finance and probably some other things.

Currently, I think each tag is only shown in one place on the portal. That might be the best move.

But maybe they should instead be mentioned in every place where they're (highly) relevant, and where people might expect to find them? E.g., if I checked Economics & Finance and saw no tag for Economic growth, I might assume that that tag doesn't exist and so try to make it.

Maybe there's some elegant third option to handle this?

MichaelA @ 2021-03-31T00:44 (+2)

Crypto or something like that

Some EAs are working on or interested in things like crypto and blockchain, either as investment opportunities or as tools that might be useful for accomplishing things EAs care about (e.g., mechanism design, solving coordination problems). Maybe there should be a tag for posts relevant to such things. I'd guess that there are at least 3 relevant Forum posts, though I haven't checked. 

There are also at least two 80,000 Hours episodes that I think are relevant:

EdoArad @ 2021-03-31T08:07 (+2)

I would prefer Blockchain, as it is more general than cryptocurrency and doesn't confuse people with the field of cryptology

MichaelA @ 2021-03-31T10:28 (+2)

Good points. I've now created the tag and used the name Blockchain.

JP Addison @ 2021-03-31T08:14 (+2)

Reasonable because of the generality, though I think the cryptography ship has long, long since sailed.

EdoArad @ 2021-03-31T11:15 (+2)

šŸ˜¢

JP Addison @ 2021-03-31T07:16 (+2)

Seems good. Maybe we should crosspost one of the recent articles on Sam Bankman-Fried. 

MichaelA @ 2021-03-31T10:31 (+2)

I've now created the tag. Feel free to make those crossposts and give them the tag, of course :) 

(I won't do it myself, as I have little knowledge about or personal interest in blockchain stuff myself.)

MichaelA @ 2021-03-12T08:20 (+2)

Update: I've now made this entry

Non-Humans and the Long-Term Future

Why I propose this:

Examples of posts that would warrant this tag:

Alternative tag name options:

MichaelA @ 2021-02-17T07:40 (+2)

Update: I've now made this entry

Positive futures (or Utopias, or Ideal futures, or something like that)

Proposed description:

The positive futures tag is for posts that discuss things like what a particularly good long-term future might look like and what sorts of ideal long-term futures we might want to aim towards. Reasons to care about this topic include that: 

  • How positive the future might be influences how important reducing existential risk is
  • Positive visions for the future could motivate work to reduce existential risks
  • Thinking about what we want the future to look like might help us work out what scenarios might constitute (non-extinction) existential catastrophes

See also Long-Term Future, Longtermism (Philosophy), and Existential risk.

Or maybe the "reasons to care about this topic" part is too long/my-own-opinion-y to include in a tag description?

Here's an example of a post that would warrant this tag: Characterising utopia. I think many/most posts tagged Fun Theory on LessWrong would also fit this tag.

There are two people I would've sent this tag page to this week, if it existed and was populated with a few posts, and I think their upcoming work may warrant this tag. This is what prompts me to suggest this.

A bit more on my thinking on this, from a shortform post of mine:

Efforts to benefit the long-term future would likely gain from better understanding what we should steer towards, not merely what we should steer away from. This could allow more targeted actions with better chances of securing highly positive futures (not just avoiding existential catastrophes). It could also help us avoid negative futures that may not appear negative when superficially considered in advance. Finally, such positive visions of the future could facilitate cooperation and mitigate potential risks from competition (Dafoe, 2018 [section on "AI Ideal Governance"]). Researchers have begun outlining particular possible futures, arguing for or against them, and surveying peopleā€™s preferences for them. 

MichaelA @ 2021-01-23T02:07 (+2)

Fermi Paradox

Arguments for having this tag:

Arguments against:

Aaron Gertler @ 2021-03-03T07:07 (+4)

This is currently a wiki-only tag. I doubt many posts are relevant to this, and I suspect that "Space" should work for all of them, but we're still in the process of figuring out how useful a tag has to be to be worth adding to the tagging menu.

MichaelA @ 2021-01-21T11:48 (+2)

EA fellowships

I think it might be useful to have a tag on EA fellowships, meaning things like the EA Virtual Programs, which "are opportunities to engage intensively with the ideas of effective altruism through weekly readings and small group discussions over the course of eight weeks. These programs are open to anyone regardless of timezone, career stage, or anything else." (And not meaning things like summer research fellowships, for which there's the Research Training Programs tag.)

I think this'd be a subset of the Event strategy tag.

But I'm not sure if there are enough posts that are highly relevant to EA fellowships for it to be worth having this tag in addition to the Event strategy tag. And maybe a somewhat different scope would be better (i.e., maybe something else should be bundled in with this).

MichaelA @ 2020-12-05T08:42 (+2)

Update: I've now made this tag.

ITN

Proposed description: 

The ITN tag is for posts about the Importance, Tractability, Neglectedness framework that is frequently used in effective altruism, or about highly related matters. This could include posts critiquing the ITN framework, discussing in abstract terms how it should and shouldn't be applied, and discussing other factors that could be considered alongside or instead of ITN. 

This tag is not necessarily meant to capture the much larger set of posts which in some way use the ITN framework. 

See also Cause Prioritization, Career Choice, Criticism (EA Movement), and Impact Assessment.

I think this tag would be entirely a subset of Cause Prioritization, but it seems like an important subset with a bunch of posts in it, and one which people might sometimes want to seek out specifically. 

Two posts that would fit in this tag, and that contain links to a bunch of other posts that'd fit, are [WIP] Summary Review of ITN Critiques and Factors other than ITN?

These posts would also fit: 

MichaelA @ 2020-11-22T01:21 (+2)

Advanced Military Technology (or some other related name)

Proposed description:

The Advanced Military Technology tag is for posts about military technologies that are on the cutting-edge, that are in the process of development, that appear to be on the horizon, or that could plausibly be developed in future. This could include both "entirely new" technologies and substantial advances in existing technologies.

See also Armed Conflict, Autonomous Weapons, and Differential Progress.

Other tags that this overlaps with include: AI Governance, Atomically Precise Manufacturing, Biosecurity, Space, Scientific Progress, and Nuclear Weapons.

I think that this tag would be entirely a subset of Armed Conflict, but in my view an important subset. I think it would be a superset of Autonomous Weapons. I don't think it would be a subset or superset of any of the other tags I mentioned (as each of those areas can but won't always be about advanced military technologies).

One post that would fit here but might not fit any of the others of those tags except Armed Conflict is What's the big deal about hypersonic missiles?

JP Addison @ 2020-11-23T16:05 (+2)

I agree with whoever upvoted the other of the two tags you made this day but not this one. I would want to see more posts that formed a natural cluster around this concept. The one example is good, but I can't recall any others.

MichaelA @ 2020-11-24T01:02 (+2)

Yeah, that makes sense. I'll hold off unless I encounter additional relevant posts.

MichaelA @ 2020-09-08T09:22 (+2)

(Update: I've now made this tag.)

Impact Assessment (or maybe something like Impact Measurement or Measuring Impact)

Proposed rough description: 

The Impact Assessment tag is for posts relevant to "measuring the effectiveness of organisational activities and judging the significance of changes brought about by those activities" (source). This could include posts which include impact assessments; discuss pros, cons, and best practices for impact assessment; or discuss theories of change or forecasts of impacts, against which measured impacts could later be compared. 

See also Org Strategy, Statistical Methods, and Research Methods.

A handful of the many posts that this tag would fit: 

MichaelA @ 2020-09-08T09:23 (+2)

In addition to the three tags mentioned as ā€œSee alsoā€, this tag would perhaps overlap a bit with the tags:

  • Forecasting
  • Org Update
  • Cause Prioritization
  • Community Projects
  • Criticism (EA Cause Areas)
  • Criticism (EA Movement)
  • Criticism (EA Orgs)
  • Data (EA Community)
  • EA Funding
MichaelA @ 2020-08-05T00:23 (+2)

Global Catastrophic Risk

Argument against:

Argument for:

Some posts that might fit this tag but not the Existential Risk tag:

MichaelA @ 2021-04-18T09:23 (+1)

Nonlinear Fund

Maybe it's too early to make a tag for that org?

evelynciara @ 2021-01-23T03:17 (+1)

"Economic Policy" or "Macroeconomic Stabilization"

Pros:

Cons:

MichaelA @ 2021-01-23T02:07 (+1)

Simulation Argument

Arguments for having this tag:

Arguments against:

Aaron Gertler @ 2021-03-03T07:08 (+4)

This is currently a wiki-only tag. I doubt many posts are relevant to this, but we might make it usable again ā€” we're still in the process of figuring out how useful a tag has to be to be worth adding to the tagging menu.

BrianTan @ 2021-01-04T06:08 (+1)

Can I create a tag called "EA Philippines", for posts by people related to EA Philippines, such as about our progress or research?  I'd like to easily see a page compiling posts related to EA Philippines. I could create a sequence for this, but a sequence usually implies things are in a sequential order and more related to each other. But our posts will likely be not that related to each other, so a tag would likely be better.

A counterargument is that I currently don't see any tags for any EA chapter, except for EA London updates. But these aren't about EA London specifically - they're just the updates they compile on the EA movement. Adding one tag for one chapter seems harmless, but if eventually 50-100 chapters do this, things might get disorganized. Curious to hear others' thoughts on this!

MichaelA @ 2021-01-04T06:21 (+3)

Quick thoughts:

  • There's been some discussion of "country-specific tags" (and region-specific tags) here
  • I think perhaps decisions about general principles for country-specific tags and general principles for EA-chapter-specific tags should be made in tandem
    • E.g., because it'd be a bit weird to have both a tag for the Philippines as a country (e.g., about the relevance of that country for EA cause areas) and a tag for EA Philippines
      • Maybe the best option would be to just have country- or region-specific tags that also serve sort-of like EA-chapter-specific tags, unless there are e.g. more than 10 posts relevant to that EA chapter specifically, or more than 20 posts that'd be in the whole tag?
        • (This is just one possible, quickly thought up principle)
  • But I'm not actually sure what the principles should be
    • E.g., if something like the above principle is adopted, I'm not sure what numbers should be used (I chose 10 and 20 pretty randomly)
    • And I'm not sure how that sort of principle should interact with the option of region-specific tags
      • E.g., maybe it'd be best to just have a tag like Southeast Asia, and let that play roles similar to that that would be played by country-specific and EA-chapter-specific tags for each country in that region?
        • Or maybe if there's a tag for Southeast Asia, that's so broad that it then becomes useful to have an EA Philippines tag (but without there being need for a Philippines tag)?
BrianTan @ 2021-01-05T00:28 (+1)

I think it's a good idea to go with a Philippines tag rather than an EA Philippines tag. Both are quite interchangeable because 100% of past posts (there are 5 of them) related to the Philippines are also written by people in EA Philippines, and 100% of past posts by EA Philippines are related to the Philippines. 

I think this will continue for quite a few years for ~80-100% of posts, since we expect only a few people to not be affiliated with EA Philippines but still be writing about the Philippines. I think that 90-100% of posts by EA Philippines will relate to the Philippines.

I also agree that for national EA groups, rather than have an EA-chapter-specific tag as well as a country-specific tag, we should just have the country-specific tag.

I don't understand how a post related specifically to an EA chapter wouldn't also be related to the country, so I think one country tag (rather than a country and a chapter tag) is enough. 

I would prefer to just have a Philippines tag already rather than a Southeast Asia tag. This is because:

  1. I think we'll hit 10 posts soon, i.e. by the midpoint of 2021
    1. We already have 5 past posts that could be tagged under Philippines
    2. I have ~3 more posts coming up (likely this month) that would also be tagged under Philippines
  2. Therefore, rather than tagging these posts under Southeast Asia and then having to move them to Philippines after we hit 10 posts, I'd prefer we just tag them under Philippines now.

I think the principle should be like "If there are 5 or more posts already for a specific country or EA national chapter, and if you would want to create a tag for easier visibility of posts related to that country/chapter, then you should create a tag for that specific country already." Let me know what you think of this principle!

MichaelA @ 2021-01-05T01:43 (+3)

I think the principle should be like "If there are 5 or more posts already for a specific country or EA national chapter, and if you would want to create a tag for easier visibility of posts related to that country/chapter, then you should create a tag for that specific country already." Let me know what you think of this principle!

That sounds good to me :)

(Though of course this is just one person's thoughts - I have no official role in the EA Forum; I'm just a nerd for tags.)

BrianTan @ 2021-01-05T10:00 (+3)

Alright. I've gone ahead and made the Philippines tag here, along with a description for it. I've also tagged all 5 past posts on this topic already. The description I wrote could be a template for what other country-specific tags should look like. I felt that the description you wrote for China didn't apply as much to the Philippines tag. 

If you or anyone else wants to let me know if the description is alright, or if I should change anything, let me know!

MichaelA @ 2021-01-05T14:50 (+3)

The description looks good to me!

And I agree that it seems like it could be a useful example/template for other country-specific tags to draw on.

MaxRa @ 2020-12-25T21:58 (+1)

Country-specific tags

I just saw "creation of country specific content" as an example among the higher-rated meta EA areas in the recent article What areas are the most promising to start new EA meta charities - A survey of 40 EAs. What do you think about introducing tags for specific countries? E.g. I'd already have a couple of articles in mind that would be specifically interesting for members of German/Austrian/Swiss communities.

MichaelA @ 2020-12-26T00:12 (+3)

Personally, I think: 

  • it probably makes sense to have at least some tags to mark that posts are relevant to particular countries/regions
  • but that this should probably be something like 2-20 tags, just in the cases where there are several posts for which the tag would be useful
    • Rather than e.g. a tag for every country (which I'm not saying you proposed!)

Relevant prior tags and discussion

There are already tags for China and the European Union. The tag description for the China one (which I wrote) could perhaps be used as a model/starting point for other such tags:

The China tag is for posts that are about China, that address how China is relevant to various issues EAs care about, or that are relevant to how one could have an impact by engaging with China.

See also Global Outreach and International Relations.

And when I proposed the China tag, I wrote:

It seems perhaps odd to single China out for a tag while not having tags for e.g. USA, Southeast Asia, ASEAN, United Nations, Middle Powers. But we do have a tag for posts relevant to the European Union. And China does seem like a particularly important topic, and one that it makes sense to have a specific tag for. And maybe we should indeed have tags for United Nations and Middle Powers.

I'd be interested in thoughts on whether BRICS, Rising Powers, or something else would be a better label/scope for this tag than China.

MaxRa @ 2020-12-26T17:16 (+1)

Yes, I also had something like 5-15 tags in mind. Your proposal for China makes sense to me, though I had a more "internal" perspective in mind, where EAs from the US/UK/Australia/Germany/Canada/etc. could get an overview of articles that are relevant for their specific country and are maybe indirectly encouraged to add something. So I'd write it as

The [country/region] tag is for posts that are about [country/region], that are especially relevant for EAs from and EA communities in [country/region] or that are relevant for projects that involve [country/region].

Looking at the EA Survey results on geographic distribution, I'd maybe do

  • US
  • UK
  • Australia-NZ
  • Germany-Austria-Switzerland
  • Canada
  • Netherlands
  • France
  • Scandinavia
  • Southeast Asia
  • Latin America
BrianTan @ 2020-12-18T14:56 (+1)

Should we have a tag for "Feedback Request"?

We in EA Philippines have made 2 posts (and have another upcoming one) already that were specifically for requesting feedback from the global EA community on an external document we wrote, before we post this document for the general public. See here and here as examples from EA PH, and this other example from a different author. 

I think it happens quite often that EAs or EA orgs ask for feedback on an external document or on a writeup they have rough thoughts on, so I think it's worth having this tag.

A potential counterargument to this being a tag is that lots of authors (or most authors) would want feedback on their posts anyway, and it's hard to separate which ones are feedback requests and which ones aren't. I guess this tag would ideally be for posts where authors specifically want answers to a few questions, or want feedback on an external document, rather than just general feedback on their article. Would appreciate any thoughts on this!

BrianTan @ 2020-12-18T15:01 (+1)

Another potential argument in favor of having a tag for Feedback Request is it might encourage EAs to share work with each other and get feedback more often, which is likely a good thing. 

In my workplace at First Circle, we have a process called "Request for Comment" or "RFC", where we write documents in a specific format and share them in an #rfc Slack channel, so that people know we want feedback on a proposal or writeup in order to move forward with our work. This was very effective in getting people to share work, get feedback asynchronously rather than via synchronous meetings, and in housing all feedback requests in one streamlined place. Maybe a tag for "Feedback Request" could also streamline things?

For example, if an EA wants to see what they could give feedback on, they could click this tag to check out things they could give feedback on. 

It could also be good practice for authors of feedback requests to put a deadline on when they need feedback on something by. This is so people backreading know if they should still give feedback if a deadline has passed.

EdoArad @ 2020-12-18T15:31 (+2)

I made a tag for requests, which I think applies here if there is a specific request for feedback with a timeframe. I'll write a short post about it now.

MichaelA @ 2020-12-19T00:17 (+2)

Yeah, I think I'd personally lean towards letting the thing Brian is describing be covered by the Requests (Open) tag. This is partly because, as Brian notes, "lots of authors (or most authors) would want feedback on their posts anyway, and it's hard separating which ones are feedback requests and which ones aren't."

I'm also not really sure I understand the distinction, or the significance of the distinction, between wanting feedback on an external doc before sharing it beyond the EA community and wanting feedback on a post before that post, or an adapted form of it, is shared beyond the EA community. (One part of my thoughts here is that I think a decent portion of posts may ultimately make their way into things shared beyond the EA community, and sometimes the authors won't be sure in advance which posts those are. E.g., MacAskill's hinge of history post is now an academic working paper.)

That said, I've also appreciated the existence of Slack channels where people can solicit feedback from colleagues. (I've appreciated that both as an author and as a person who enjoys being helpful by giving feedback.) And the EA Editing & Review facebook group seems to demonstrate some degree of demand for this sort of thing in EA. So maybe there's a stronger case for the tag than I'm currently seeing.

(OTOH, maybe the need could be well-met just by using the Requests (Open) tag and posting in EA Editing & Review?)

If a Feedback Request tag is made, perhaps it'd be worth linking in the tag description to Giving and receiving feedback, Asking for advice, and/or Discussion Norms?

BrianTan @ 2020-12-19T04:17 (+1)

Oh cool, yeah I guess this works!

fmoreno @ 2020-08-08T17:50 (+1)

Sorry if off-topic, but how do I remove a tag after wrongly using it?

MichaelA @ 2020-08-08T22:52 (+5)

If you mean un-tagging a page, you vote down its relevance by hovering over the tag on the page and clicking the < arrow. If the relevance score gets to or below 0, the tag is removed.

If you mean deleting a tag entirely (not just from one page), I think you'd have to message the EA Forum team?

More info on tags here and here.