Truthseeking is the ground in which other principles grow

By Elizabeth @ 2024-05-27T01:11 (+104)

This is a crosspost from LessWrong; the full post can be read there.
Brad West @ 2024-05-28T17:40 (+10)

Thank you for this insightful post. While I resonate with the emphasis on the necessity of truthseeking, it's important to also highlight the positive aspects that often get overshadowed. Truthseeking is not only about exposing flaws and maintaining a critical perspective; it's also about fostering open-mindedness, generating new ideas, and empirically testing disagreements. These elements require significantly more effort and resources than criticism does, which leads to an oversupply of criticism and can stifle innovation if not balanced with constructive effort.

Generating new ideas and empirically testing them involves substantial effort and investment, including developing hypotheses, designing experiments, and analyzing results. Despite these challenges, this expansive aspect of truthseeking is crucial for progress and understanding. Encouraging open-mindedness and fostering a culture of curiosity and innovation are essential. This aligns with your point about the importance of embracing unconventional, “weird” ideas, which often lie outside the consensus and require a willingness to explore and challenge the status quo.

Your post reflects a general EA attitude that emphasizes the negative aspects of epistemic virtue while often ignoring the positive. A holistic approach that includes both the critical and constructive dimensions of truthseeking can lead to a more comprehensive understanding of reality and drive meaningful progress. Balancing criticism with creativity and empirical testing, especially for unconventional ideas, can create a more dynamic and effective truthseeking community.

Elizabeth @ 2024-05-29T18:47 (+6)

Your post reflects a general EA attitude that emphasizes the negative aspects [...]

Something similar has been on my mind for the last few months. It's much easier to criticize than to do, and criticism gets more attention than praise. So criticism is oversupplied and good work is undersupplied. I tried to avoid that in this post by giving positive principles and positive examples, but it sounds like it still felt too negative for you.

Given that, I'd like to invite you to be the change you wish to see in the world by elaborating on what you find positive and who is implementing it[1].

  1. ^

    This goes for everyone: even if you agree with the entire post, it's far from comprehensive.

Ulrik Horn @ 2024-05-27T08:09 (+9)

I generally really resonate with this piece. At the same time, when reading it, I keep wondering if this advice is perhaps more appropriate in LW/rationalist circles than in EA. In general, I do not clearly see why truth-seeking outside cause areas is super helpful in EA. I actually see downsides, as we can quickly descend into less action-relevant discussion, e.g. around religion, gender/sexuality, or even more hot-button topics (of course, gender/sexuality is crucial in GH interventions centred on this topic!). I kind of feel truth-seeking in such domains is perhaps net negative in EA, especially as we need to work across a range of cultures and with people from all walks of life. I think this is a difficult topic, but I wanted to mention it here as I could imagine some readers taking especially the heading on hurting other people's feelings as an encouragement in EA to say things that are mostly action-irrelevant but get them lots of attention. My current stance is to refrain from such truth-seeking in public until I have a very strong conviction that it could actually change priorities, and also have broad support in making public such potentially controversial takes.

Will Bradshaw @ 2024-05-27T17:34 (+33)

I think you might be using "truth-seeking" a bit differently here from how I and others use it, which might be underlying the disagree-votes you're getting. In particular, I think you might be using "truth-seeking" to refer to an activity (engaging in a particular kind of discourse) rather than an attitude or value, whereas I think it's more typically used to refer to the latter.

I think it's very important to the EA endeavor to adopt a truth-seeking mindset about roughly everything, including (and in some cases especially) hot-button political issues. At the same time, I think it's often not helpful to try to hash those issues out in public in EA spaces, unless they're directly relevant to cause prioritisation or the cause area under discussion.

Ulrik Horn @ 2024-05-28T03:30 (+4)

Hi Will, thanks for the comment. I agree 100% that it is very good for people to even look at hot-button topics, but to keep such explorations offline.

Perhaps something I should have clarified above, at the risk of being perceived as speaking on behalf of others, which is not my intention (instead, I am trying to think of the least harmful example here): I was thinking that if I were someone really passionate about global health and doing it right, and coming from a strong Christian background, I might feel alienated from EA if it were required of me to frequently challenge my Christian faith.

So I think I was talking in terms of an attitude or value. For the above example of a Christian EA, and using another example of an atheist or at least agnostic EA who is super truth-seeking across the board, I could see the latter using this post to conclude that the Christian EA is not really EA, as that person refuses to dive deep into the epistemics of their religious belief. This is what I wanted to highlight. And personally, I think the Christian EA above is super helpful, even for EAs who think they are not 100% truth-seeking: they have connections to lots of other Christians who want to do good and could influence them to do even better. They also understand large swaths of the global population, and can be effective communicators and ensure various initiatives, from Pause AI to bed nets, go well when delivered to Christian populations. Or they might just be a super good alignment researcher and not care too much about knowing the truth of everything. And the diversity of thought they bring also has value.

That said, I think "global truth-seekers" are also really important to EA - I think we would be much worse off if we did not have any people who were willing to go into every single issue trying to get ground contact with truth. 

If helpful, and very simplistically, I guess I am wondering which of the two alternatives below we think is ideal.

Brad West @ 2024-05-28T20:31 (+4)

Of course, one subset of Christians and other religious believers hold that their religious beliefs follow from (or at least accord with) reason. This contrasts with the position you seem to be describing, which I believe is called fideism: the view that some religious beliefs cannot be reached by rational thinking. I would be interested in seeing what portion of EAs hold their religious beliefs explicitly in violation of what they believe to be rational, but I suspect it would be few.

In any case, I believe truthseeking is generally a good way to live, even for religious people who hold certain beliefs in spite of what they take to be good reason. Ostensibly, they would simply not apply it to that one set of their beliefs.

titotal @ 2024-05-28T10:48 (+5)

I mostly agree with the article, but I think truth-seeking should take into account the large fallibility of the movement. For example:

On the negative side: I can make an argument for any given inclusion or exclusion on the 80,000 hours job board, but I’m certain the overall gestalt is too normal. When I look at the list, almost every entry is the kind of things that any liberal cultivator parent would be happy to be asked about at a dinner party. Almost all of the remaining (and most of the liberal-cultivator-approved) jobs are very core EA. I don’t know what jobs in particular are missing but I do not believe high impact jobs have this much overlap with liberal cultivator parent values. 

I don't see the problem with this. Ideas like "we should stop poor people dying of preventable illnesses" are robust: they have stood the test of time and scrutiny, and the reason most people are on board with them is that they are correct and have significant evidence backing them up.

Conversely, "weirder" ideas have significantly less evidence backing them up, and are often based on shaky assumptions or controversial moral opinions. The most likely explanation for a weird new idea not being popular is that it's wrong.

If you score "truth-seeking" by being correct on average about the most things, then the strategy "agree with the majority of subject-level scientific experts in every single field" is extremely hard to beat. I guess the hope is that by encouraging contrarianism you can find a hidden gem that pays off for everything else, but there is a cost to that.

Habryka @ 2024-05-28T20:10 (+19)

If you score "truth-seeking" by being correct on average about the most things, then the strategy "agree with the majority of subject-level scientific experts in every single field" is extremely hard to beat. I guess the hope is that by encouraging contrarianism you can find a hidden gem that pays off for everything else, but there is a cost to that.

There is no scientific field dedicated to figuring out how to help people best. I agree that deferring to a global expert consensus on claims that thousands of people do indeed study and dedicate their lives to is often a good idea, but I don't think there exists any such reference class for the question of what jobs should show up on the 80k job board.

Nick K. @ 2024-05-28T12:01 (+3)

"The most likely explanation for a weird new idea not being popular is that it's wrong. "

I agree with much of the rest of the comment, but this seems wrong - it seems more likely that these things just aren't very correlated.

Jason @ 2024-05-29T14:37 (+11)

I think there are (at least) two reasons popular ideas might be, on average, less wrong than unpopular ones. One possibility is that, while popular opinion isn't great at coming to correct conclusions, it has at least some modicum of correlation with correctness. The second is that popular ideas benefit from a selection effect of having many eyeballs on the idea (especially over a period of time). One would hope that the scrutiny would dethrone at least some popular ideas that are wrong, while the universe of weird ideas has received very little scrutiny.

dogmatic rationalist @ 2024-06-06T10:18 (+1)

"Popular being more likely to be true" is only a good heuristic under certain circumstances where there is some epistemically reliable group expertise and you are not familiar with their arguments.

Modesty epistemology, if taken to an extreme, is self-defeating. For example, the majority of Earth's population is still theist; without a memetic immune system, radical modesty epistemology can lead people to execute stupid but popular ideas to their stupid logical conclusions. I also take issue with this idea along the lines of "what if Einstein had never tried to challenge Newtonian mechanics because, from the outside view, he was more likely to be wrong, given the number of times crackpots have failed to move the ratchet of science forward?" And I personally, psychologically, cannot function within the framework of "what if I am a crackpot going against the general consensus?"; after a certain number of hours spent studying the material, I think one should be able to suggest potentially true new ideas.

Rebecca @ 2024-05-28T14:34 (+7)

'New' is probably a lot of the reason.

Will Howard🔹 @ 2024-10-24T12:45 (+2)

I'm curating this post. The facets listed are values that I believe in, but that are easy to forget due to short-term concerns about optics and so on. I think it's good to be reminded of the importance of these things sometimes. I particularly liked the examples in the section on Open sharing of information, as these are things that other people can try to emulate.

SummaryBot @ 2024-05-27T13:31 (+1)

Executive summary: Truthseeking and maintaining contact with reality is the most important meta-principle for achieving consequentialist goals, and there are many facets and techniques for pursuing it that are underutilized in the Effective Altruism community.

Key points:

  1. Avoid deferring to "epistemic daddies" and instead delegate opinions while maintaining ultimate responsibility for decisions.
  2. Stick to projects small enough to comprehend to maintain the ability to update beliefs.
  3. Actively seek out and create new information, with short feedback loops.
  4. Protect the epistemic commons by pushing back against anti-truthseeking behavior and supporting others' truthseeking.
  5. Cultivate an emotional appreciation for contact with reality, even when it involves losing bets or receiving criticism.
  6. Share information openly, including negative information about yourself and others, despite the personal costs, as it benefits the community.


This comment was auto-generated by the EA Forum Team. Feel free to point out issues with this summary by replying to the comment, and contact us if you have feedback.