Why defensive writing is bad for community epistemics
By Emrik @ 2022-10-08T23:38 (+76)
I see this as a big problem that makes communication and learning really inefficient. In a culture where defensive writing is the norm, readers learn to expect that their reputation is greatly at stake if they publish anything themselves.
I advocate writing primarily based on what you think will help the reader. I claim this is an act of introspection that's harder than it seems at first. I'm not trying to judge anyone here. I find this hard myself, so I hope to help others notice what I've noticed in myself.[1]
TL;DR: A summary is hard, but: You may be doing readers a disservice by not being aware of when you're optimising your writing purely for helping yourself vs at-least-partially helping readers as well. Secondly, readers should learn to interpret charitably, lest they perpetuate an inefficient and harmfwl culture of communication.
Definitions
Self-centered writing is when you optimise your writing based on how it reflects on your character, instead of on your expectation of what will help your readers.
Defensive writing is a subcategory of the above, where you're optimising for making sure no one will end up having a bad impression of you.
Judgmental reading is when you optimise your reading for making inferences about the author rather than just trying to learn what you can learn from the content.
Naturally, these aren't mutually exclusive, and you can optimise for more than one thing at once.[2]
Takeaways
- A culture of defensive writing and judgmental reading makes communication really inefficient, and makes it especially scary for newcomers to write anything. Actually, it makes it scary for everyone.
- There's a difference between trying to make your statements safe to defer to (minimising false positives) and not directly optimising for that, e.g. because you're just sharing tools that readers can evaluate for themselves (minimising false negatives). Where appropriate, writers should be upfront about which they're doing.
- As an example of this, I'm not optimising this post for being safe to defer to. I take no responsibility for whatever nonsense of mine you end up actually believing. :p
- You are not being harmed when someone, according to you, uses insufficiently humble language. Downvoting them for it is tantamount to bullying someone for harmless self-expression.
What does a good epistemic community look like?
In my opinion,[3] an informed approach to this is multidisciplinary and should ideally draw on wisdom from e.g. social epistemology, game theory, metascience, economics of science, graph theory, rationality, several fields in psychology, and can be usefwly supplemented with insights from distributed & parallel computing, evolutionary biology, and more. There have also been relevant discussions on LessWrong over the years.
I'm telling you this because the first principle of a good epistemic community is:
Community members should judge practices based on whether the judgment, when universalised, will lead to better or worse incentives in the community.
And if we're not aware that there even exists a depth of research on what norms a community can try to encourage in order to improve its epistemic health, then we might have insufficient humility and forget to question our judgments. I'm not saying we should be stifled by uncertainty, but I am advocating that we at least think twice about how to encourage positive norms, and not rely too much on cached sensibilities.
I'll summarise what I think are some of the most basic problems.
1) We judge people much too harshly for what they don't know
Remember, this isn't an academic prestige contest, and the only thing that matters is whether we have the knowledge we need in order to do the best that we can. I'm not talking about how we should care about each others' feelings more and pretend that we're all equally competent and know an equal amount of stuff. No, I'm saying that if we have a habit of judging people for what they don't know, we'll be incentivised to waste time learning all the same things, and we lose out on diversity of knowledge.
And who are you to judge whether knowing some particular thing is truly essential for what they're trying to do, given that they are the one trying to do it?
“His ignorance was as remarkable as his knowledge. Of contemporary literature, philosophy and politics he appeared to know next to nothing.”
A community where people reveal their uncertainty, ask dumb questions, and reward each other for the courage and sagacity to admit their ignorance, will be more effective than this.
2) Celebrate innocence & self-confidence; stop rewarding modesty
This one is a little sad but... when I see someone declare that they have important things to teach the rest of the community, or that they think they can do more good than Peter Singer, I may not always agree that they have a chance--but I'll tear up and my heart will beam with joy and empathy for them, because I know they're sacrificing social comfort and perceived humility in order to virtuously reveal their true beliefs.
There are too many reasons for this, but the first is fundamentally about being a kind and decent human being. When a kid eagerly comes up to you and offers to show you a painting they're proud of, will you not be happy for the kid? Or will you harshly tell them to fall back in line and never call attention to themselves ever again? Or maybe you only think it's "inappropriate" when it's an upstanding adult who does it. After all, they've outgrown their right to be innocent. No. How can a community so full of kind people be so eager to put people down for unfolding their unconquered personalities openly into the world?
But if you insist on stealing the sunlight from their smiles... maybe arguments about effectiveness will convince you instead. It ties back to how people, in ordinary cases, vastly overestimate the downsides of not knowing particular things or not having "credentials", and underestimate the advantage of learning to think independently as early as possible.
It’s important that people be allowed to try ambitious things without feeling like they need to make a great production out of defending their hero license.
When we have a community in which people are afraid to stick out, and they're always timidly questioning whether they know enough to start actually Doing The Thing, we're losing out on an enormous amount of important projects, motivation, ambition, and untroubled mental health. It may not feel as urgent because we can't see the impact we miss out on.
3) Charitable interpretations by default
When you interpret someone with charity, you start out with the tentative assumption that they are a decent human being that means well, and means something coherent that you can learn from if you listen attentively. And you keep curiously looking for reasons not to abandon these assumptions.
EA is not perfect, but there is something different going on here than in the other parts of the world I experience. I don't know about you, but I come from a world in which "what's in it for me?", "it's not my responsibility", and careless exploitation are the norm. If my intuitions about what people are likely to intend have been trained on a world like that, then I'm not going to be very charitable.
Steelmanning is the act of taking a view, or opinion, or argument and constructing the strongest possible version of it. It is the opposite of strawmanning.
The EA forum is not perfect, but it is different. If you're bringing with you the same reflexive heuristics while you're reading the forum, and you don't put in effort to adjust to the new distribution of actual intentions, then you risk incrementally regressing the forum down to your expectations.
If writers have to spend five paragraphs on disclaimers and clarification just to make sure we don't accuse them of nazism, well, we've wasted everyone's time, and we've lost a hundred good writers who didn't want to take the risk in the first place.
But it's worse, because if we already have the norm of constantly suspecting nazism or whatever, then the first writer to not correct for that misunderstanding is immediately going to look suspicious. This is what's so vicious about refusing to interpret with a little more recursive wisdom. If you have an equilibrium of both expecting and writing disclaimers about X, it self-perpetuates regardless of whether anyone actually means or supports X.
You might point out that disclaiming X isn't really a big efficiency loss, but X is just the tip of an iceberg.
Part of the problem is just laziness and impatience. If people have trained their intuitions on the level of discourse found in the world at large, they may not be used to the level of complexity commonly found in arguments on the forum. So they--at first reasonably--think they can get away with the few quick assumptions they've learned to expect elsewhere, and that the point will immediately snap into understanding after the first paragraph. But that often doesn't work for the level of complexity and unusualness of meaning on the forum.
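The self-perpetuating disclaimer equilibrium described two paragraphs up can be sketched as a toy best-response model. To be clear, everything here--the cost numbers and the suspicion function--is a made-up assumption purely to illustrate the mechanism, not anything claimed in the post itself:

```python
# Toy best-response model of a disclaimer equilibrium.
# Assumption: each writer disclaims X iff the reputational cost of
# looking suspicious exceeds the (fixed) cost of writing the disclaimer,
# and suspicion toward non-disclaimers grows with how common
# disclaimers already are.

DISCLAIMER_COST = 0.1  # assumed fixed cost of writing the disclaimer

def suspicion_cost(frac_disclaiming: float) -> float:
    """Assumed reputational cost of omitting the disclaimer:
    the more common disclaimers are, the more an omission sticks out."""
    return frac_disclaiming  # simplest monotone choice

def step(frac_disclaiming: float) -> float:
    """One round of best-response dynamics: every writer disclaims
    iff omission is currently costlier than disclaiming."""
    return 1.0 if suspicion_cost(frac_disclaiming) > DISCLAIMER_COST else 0.0

def settle(frac: float, rounds: int = 10) -> float:
    """Iterate the dynamics until the community's norm stabilises."""
    for _ in range(rounds):
        frac = step(frac)
    return frac

# A community that starts out mostly disclaiming stays stuck there...
assert settle(0.9) == 1.0
# ...while one that starts out mostly not disclaiming never picks the
# habit up--even though nobody's actual support of X differs between
# the two communities.
assert settle(0.05) == 0.0
```

The point of the toy: two communities with identical underlying views about X can settle at opposite disclaimer norms, because each writer's best response depends only on what everyone else is already doing, not on anyone's actual beliefs.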
So what's wrong with defensive writing?
(If you've missed it, there are examples of defensive writing in the footnotes.)
Ultimately, my complaint is that it claims to be about benevolence, yet it is not actually about benevolence.[4] If the lack of benevolence was all it was, then I would have no gripe with it. But because it's routinely confused for benevolence, good people end up being fooled by it and it causes all sorts of problems.
Remember, you're not harming other people when they believe you are mistaken, or when you say a wrong thing in a way that they can easily detect, or when they think you're a nazi because you mentioned something about DNA and didn't explicitly disclaim that you're not a nazi.
See the above section for the problems with defensive disclaimers.
It's sort of the opposite of writing ostentatiously just to show off (e.g. using math or technical terms that signal impressiveness but don't actually help with understanding). But not because it's any less self-centered. Wasting the reader's time by trying to influence their impression of you is more or less the same regardless of whether you're defending against negative impressions or inviting positive ones.
Another, perhaps more damaging, form of defensive writing is unnecessarily trying to demonstrate that you've done your research due diligence--perhaps as a way to establish some kind of "license" for writing about the topic at all, or at least to make it clear that you've "done work". But if I only have one interesting thing to teach, it doesn't matter to the reader whether I had to read 1 book or 50 books to find it. And it'd be especially indulgent to then go into detail about the contents of every one of those 49 books when you know that's not where the value is.
I think it's easy to misunderstand me here. What I'm essentially advocating is that every word you write should flow from intentions in your brain that you are aware of and approve of.
And what you optimise for should depend on what you're writing. Sometimes I primarily optimise for A) making readers have true beliefs about what I believe. Other times I primarily optimise for B) providing readers with tools that help them come to their own conclusions.
When I'm doing B, it doesn't matter what they believe I believe. If out of ten things I say, nine will be plain wrong, but one thing will be so right that it helps the reader do something amazing with their lives, I consider that a solid success.
Purpose A is necessary if you're somewhat of an authority figure and you're letting people know what you believe so they can defer to you. E.g. if you're a doctor, you don't want your patients to believe false things about your beliefs.
As it happens, I'm mostly trying to optimise this post for B, but didn't I say that this means I shouldn't care whether I'm misunderstood? So why am I spending time in this section on defending against misunderstandings?
Because I think this is complicated enough that readers may end up not only misunderstanding what I believe, but also misunderstanding the tools I'm trying to teach. I think people can benefit from these tools, so I want to make sure they're understood.[5]
But also just because... well, there are limits to how brave you are required to be.
I want to be liked, so I sometimes signal modesty for no further reason than to be liked. This is ok. And sometimes I have to affect humility in order to make my writing bearable to read, because we already live in a world where that's expected, and I can't unilaterally defect from every inadequate equilibrium I see--the personal cost is too high.
So I try to do what I can on the margin, show the world that I can be happy about my achievements ("bragging"), do less modesty, say more wrong things, etc. But when the cost is cheap, as is the case when you see others who defect, there are fewer excuses to not at least respect and try to feel happy about them nudging the equilibrium.
- ^
Ironic, yes, but this will be our first example of defensive writing! First, the altruistic value of this paragraph is that it points out that noticing defensiveness is a difficult act of introspection. That's usefwl for the readers to know. The rest of the paragraph only serves the purpose of making sure no one thinks I'm being arrogant.
Notice also how when I write "I claim this is" instead of "this is", I change absolutely nothing about what the readers learn, but I am changing readers' impressions about how humble I'm being. If I can seem more humble by adding two more words, I usually say it's worth the cost--but I'm aware that I am profiting only myself, and it is my readers that pay the cost.
- ^
Another example. I don't want readers to think that I believe you cannot optimise for two things at once, so I point that out. But no one in their right mind would actually think that! So by pointing it out, I'm not helping them understand anything differently about the content. The only purpose of the sentence is to prevent people who are actively looking to misunderstand me from misunderstanding me.
- ^
Part of my motivation for writing this whole paragraph was to provide an example of something that could seem like it was optimised for ostentation (name-dropping a bunch of scientific fields), but was in reality optimised for something else. The point is that people are generally much too quick to judge people for perceived intentions, and if I were afraid of this then I wouldn't have felt permitted to write this sentence. A culture of judgment and no charity will prevent people from writing sentences they actually think are helpfwl.
In addition, notice how this usage of "in my opinion" actually does communicate important-to-the-reader information about the value of deferring to me, so it's not just a ploy to seem more humble.
- ^
Norwegian has the right word here, and it's "nestekjærlighet". The best I can do as a direct translation is "otherpersongoodintentions". (Sorry.) It makes it unfailingly clear that it's not about raising one's own moral character--that would just be selfpersongoodintentions.
- ^
I actually think nearly all the value of these exercises is in things I haven't made the case for. Those explanations would be longer and require more stuff. But I hoped the value present would make it interesting enough, and I didn't want to make the post longer than it already is.
Lukas_Gloor @ 2022-10-11T18:09 (+19)
2) Celebrate innocence & self-confidence; stop rewarding modesty
This one is a little sad but... when I see someone declare that they have important things to teach the rest of the community, or that they think they can do more good than Peter Singer, I may not always agree that they have a chance--but I'll tear up and my heart will beam with joy and empathy for them, because I know they're sacrificing social comfort and perceived humility in order to virtuously reveal their true beliefs
Confidence is part of the skillset of successful conmen and cult leaders. In conmen, it's fake confidence; in cult leaders, it's often "real" overconfidence. Such people will gather altruistically motivated followers and volunteers and waste those people's efforts on an ultimately doomed, self-serving mission.
Those probably weren't the cases you had in mind when you pictured someone being excited about an idea or about some project's impact. But it's important that what you want to reward/encourage is (probably) something like "genuine confidence that's different from an always-grandiose personality."
Emrik @ 2022-10-11T19:45 (+2)
Idk, I think we're much more bottlenecked by people daring to stick out than by people blindly following overconfident people. EAs/rationalists already have strong defenses against the latter. So much so that they'll be very suspicious of anyone openly claiming to be a hero. Even if encouraging confidence increases the risk of conmen, it'd be worth it. Besides, one of the reasons conmen can successfwly amass followers is that the followers lack confidence.
Lukas_Gloor @ 2022-11-12T00:12 (+14)
This aged somewhat poorly IMO even just 1 month later. Maybe it's easier for others to see now why I feel strongly about what I said.
Emrik @ 2022-11-12T00:27 (+5)
I haven't updated appreciably in this direction after this, but I appreciate you calling attention to it.
Lukas_Gloor @ 2022-10-11T21:22 (+5)
EAs/rationalists already have strong defenses against the latter.
I think it could be stronger! While there's a pronounced norm in EA against overconfidence, that's not the same as having "strong defenses." Social signalling is anti-inductive in the sense that the signals that work to trick us into being unduly impressed or intimidated by others' perceived competence cannot be easily summarized with simple rules for what to avoid or look out for. (In other words, even with strong modesty norms, a con man or cult leader could score modesty points wherever doing so is cheap while subtly working in confidence tricks with the way they frame things and with their narrative.)
Besides, one of the reasons conmen can successfwly amass followers is that the followers lack confidence.
That doesn't sound right to me. I think you can be confident and still follow someone you think is deserving of it. Besides, confidence is empty/shallow without the domain-appropriate competence to back it up. In the case of "who would you follow?," the skill in question seems to be people judgment. One part of people judgment is being confident in your own judgment; another part is being the right amount of cynical.
In any case, I think we might be discussing a false dilemma. I'm personally reading your message as "it's important to encourage/nurture people's confidence when they have a vision and seem willing to take on ambitious or even heroic projects to make the world better." I'm all for it! I just want to add "and make sure to check if the person seems to have high integrity." ("Checking" in the sense of "have your eyes open" rather than "be mistrustful from the get go / don't give them a chance.")
I agree with you that EA aims seem bottlenecked by people with vision. I am skeptical that people capable of becoming people with vision are easily deterred by social norms (I'm reminded of the chapters in HPMOR around Hermione wanting to become a heroine). But, admittedly, that doesn't mean that the effect is zero.
Or maybe you don't see it as a false dilemma and you disagree with me about where we're at with the social pendulum swing, so maybe you consider it important to push the pendulum back as hard as possible. If so, we may find ourselves on ~25% opposed sides on the battlefield over norms. Oh well. Disagreements over "social pendulum dynamics" seem notoriously intractable to me. (My pet theory is that people's strongest opinions on social norms, their "most sacred virtues," are often influenced by small-t traumas from things that went particularly poorly for them or for people they care about--and there's so much variation that opposite types of advice are good for different people.)
Emrik @ 2022-10-11T22:32 (+5)
"Social signalling is anti-inductive in the sense that the signals that work to trick us into being unduly impressed or intimidated by others' perceived competence cannot be easily summarized with simple rules for what to avoid or look out for."
Interesting! Good point.
- - - - -
If you follow someone because you have confidence in your ability to evaluate whether they're worth following, then maybe you should follow them. If you follow someone to compensate for your lack of confidence, that's less likely to produce good consequences. I "follow" some people because I'm confident in their abilities, e.g. I think Eliezer is worth trying to learn from. I think he's also worth deferring to if you choose not to be an explorer, and instead wish to spend more of your time Doing and less time figuring stuff out.
"I am skeptical that people capable of becoming people with vision are easily deterred by social norms"
I'd like to believe this, but I strongly disagree. I read a comment from Steven Byrnes saying that he (iirc) was held back by being really uncertain for a long time about whether he had anything usefwl to contribute to the community. I know some other people I think have the power to guide themselves to great things, but who don't fully trust themselves enough.
Personally, I spent years thinking I was purely "catching up" to what everyone else already knows. Every new idea I had was "ah, so this is what the smart people think already". It took someone who believed in me to make me feel safe pursuing my own path. It's really hard to be motivated when you're constantly questioning yourself, even when you try to pursue independence.
I want to push the confidence pendulum way further than it has ever been. I want people to pursue independence, ambition, and self-sacrifice until the road is littered with failed or delusional projects. This is how you sample for outliers on a fat-tailed distribution and double the number of Bankman-Frieds, Yudkowskys, and Borlaugs.
But I have a bit more nuance in what I mean by "confidence". I don't mean unjustified probability estimates. I don't mean lack of curiosity in other people's viewpoints. I don't mean the stereotypical social role, I just mean the stuff that's usefwl for independently pursuing ambitious things with motivation and less wasted motion.
Ivy_Mazzola @ 2022-10-10T07:09 (+17)
I like it and agree, but idk if I find it as easy as that. I'm pretty sure I get downvoted when I don't make modesty overt. It seems key on EA facebook and EA Slack too. Without it, does a controversial point get integrated into the discussion? Idk.
In footnote 1, you say. "If I can seem more humble by adding two more words, I usually say it's worth the cost--but I'm aware that I am profiting only myself, and it is my readers that pay the cost."
But if an average EA will absorb your argument better when written in humble-mode, then it is the average EA who is helped by your humble writing. Or, the writer pays a tax, not the reader. The EA forum to me seems like it has an arms race of humble writing, where if you don't convey your ideas humbly, your ideas look worse in comparison to others.
If this is true, can we even deescalate? Personally, I care too much about what I write to risk screwing myself and my argument by writing plainly.[1] I'd guess others do too. If modesty is what the forum readers and EA decision-makers seem to reward, that is what they will get. [2]
Hopefully your post is the start of changing it though. Maybe it's time the EA forum commenting guidelines were edited? The community might need reminders or Moloch might swallow this one up. "Be chill" or "be yourself" or "it's okay to be confident" might be nice to see every time I open the commenting field.
- ^
If I spend a truly disgusting amount of time editing each comment for modesty, I can screw only me. See, big brain right there.
- ^
Especially from women/AFAB, who are already trained in it, and also have the cultural disincentive from trying anything else. Personally, I'm half-expecting to be seen as a bitch or at least unrelatable even when I make modesty or warmth a priority, and I can't be the only one.
Emrik @ 2022-10-10T07:20 (+5)
"The EA forum to me seems like it has an arms race of humble writing, where if you don't convey your ideas humbly, your ideas look worse in comparison to others."
Precisely!
And yeah, because of this, I do a lot of humility which I know will only contribute to the arms race but I still do it because I think the object-level message is important enough in that case. I was calling it "self-profiteering" only at recursion level 1.
I'm less optimistic about this post being "the start of a change", but maybe it can cause some people to be less judgmental in their reading, and thereby notice when some seemingly brazen people are being exceedingly kind--like me!
Ivy_Mazzola @ 2022-10-10T07:32 (+3)
Hm. Yeah maybe solution is to just write more comments. If you write more, you don't have to risk your best arguments to pepper the forum culture with non-modesty. Like, for every comment we write with a natural vibe, at least we are bringing down the modesty-arms-race average.
Emrik @ 2022-10-10T07:36 (+3)
Good idea! Inundate the forum! Already on it! ^^
david_reinstein @ 2022-10-10T03:00 (+12)
I think “lack of modesty” may often be shorthand for other limitations that stifle discussion, or make readers think the author is not open to input from others.
E.g., if an author writes something like “my research proves that all charity is driven by selfish motives”.
Emrik @ 2022-10-10T04:59 (+4)
Definitely lack of actual humility is a problem, but I think pretty much everyone recognises that it's a problem. Even the people who actually do lack humility to the extent that they think no one can teach them anything--do recognise that humility is good epistemics in general... for other people. So I don't think going around and reminding everyone that they need to be humble actually helps much. It's already in the water. Especially in EA.
david_reinstein @ 2022-10-10T14:53 (+6)
Generally yes but there are always new people coming in who are not aware of this.
So, the 'when to remind' game is a challenge of finding the right balance of precision and recall (type 1/2 errors) and hitting the best frontier.
Yonatan Cale @ 2022-10-11T16:42 (+11)
I'd like to offer this as an example of defensive writing that I don't like:
It's so hard to argue with it because all the claims are so vague [like "it's not obvious that..."], but the general impression [that I get] is that 80k is encouraging working on building AGIs.
If you agree with me, consider pushing back on that, maybe here.
Emrik @ 2022-10-11T19:31 (+3)
Hmm, I see what you mean, but I think I disagree.[1] It's not obvious that "it's not obvious that" is bad. And I think it makes sense sometimes to write a post that appeals to authority--but you should be very aware of when you're doing that.
What I somewhat dislike about this tweet is that it mixes up trying to make you defer to it while also providing gears-level arguments. It invites the mistake of getting woozled by the technical arguments just because they came from "experts".[2]
Additionally, on the object level, I kinda think alignment researchers should be trying to theoretically innovate on the capabilities frontier, just keep it secret and don't tell anyone you don't trust very highly. It's hard to try to align something you don't have a deep understanding of.
- ^
Notice how this whole comment is largely about my state of belief/agreement, rather than about the patterns themselves.
- ^
This paragraph is mostly about communicating the patterns, but I still lead with "what I somewhat dislike"--referring to my beliefs rather than communicating content.
We should ideally have a standard back-to-object-level codeword that can help defuse discussions from increasingly becoming more and more about belief-states rather than about communicating patterns. Something like... if you say "blueberry muffins", your interlocutor will know to stop asking or revealing about (dis)agreement or estimates.
Amber Dawn @ 2022-10-10T16:35 (+9)
I strongly agree with this post. Particularly enjoyed this sentence:
"Wasting the reader's time by trying to influence their impression of you is more or less the same regardless of whether it is for defending against negative impressions or inviting positive impressions."
I think defensive writing is often stylistically bad. I've thought about this quite a bit since I've been doing writing/editing work for EAs: I'm often tempted to use far fewer hedge-words and caveats, but it's understandable why people sometimes don't feel comfortable without them.
I also think that defensive writing pulls against an element of reasoning transparency that I think is underrated. Some of my best writing is done when I'm actively thinking through things. This produces a kind of... 'emotional reasoning transparency', where readers can see exactly how the thought got from the writer's brain to the page. If the writer is extremely preoccupied with never writing anything that could in principle be criticized or nitpicked, or that might make someone think they are over-confident, it's very hard to do this kind of writing.
Amber Dawn @ 2022-10-10T16:37 (+3)
some similar/related thoughts here: https://medium.com/@contemplatonist/on-inhibition-612a2ec9b380
Emrik @ 2022-10-10T16:47 (+3)
Love it. Btw, do you have any top writing advice or bullet points/resources? I know Scott's Nonfiction Writing Advice and... that's pretty much it.
Amber Dawn @ 2022-10-10T16:55 (+5)
Some things that have influenced my writing have been Steven Pinker's 'The Sense of Style' and George Orwell's 'Politics and the English Language'. I think quite a few of the tips from those are in Scott's Nonfiction Writing Advice too!
Emrik @ 2022-10-10T17:28 (+3)
Huh, coincidental choices. I read both! I think Pinker's book influenced something but I can't remember any specifics.
zeshen @ 2022-10-10T23:16 (+8)
I really like this post, especially as someone who is fairly anxious when writing for fear of being judged as ignorant. I definitely agree that we should promote an environment conducive for people to say wrong things.
However, I don't fully agree with the notion of celebrating the self-confidence of people who "declare that they think they can do more good than Peter Singer". I'm quite likely misinterpreting what you mean by confidence as over-confidence, but just on the face of it, I prefer claims to carry an appropriate level of confidence. When someone makes a strong claim, I'd like to know whether the person is a domain expert with good epistemics who has done extensive research to arrive at the conclusion, or an innocent kid who claims to have discovered the most important thing in the world. Perhaps just stating epistemic status upfront would solve the problem. And perhaps over-confident people who are just about to take your advice to celebrate confidence and stop rewarding modesty should just reverse the advice.
On a tangential note, I sometimes find myself doing 'defensive writing' not just for defensive reasons, but also to try to convey what I mean to the reader as accurately as I can by ruling out everything else.
Emrik @ 2022-10-11T06:38 (+3)
My take on self-confidence and innocence has several parts to it, and I think it's one of the most important things I want to communicate, for many reasons I don't go into here. But with self-confidence, you don't need to be falsely sure that you'll do more good than Peter Singer, but you can be very confident that trying is the right way to go. Confidence-in-path rather than confidence-in-results.
I know I'm following a trail that I know also goes to crazytown at some crossroad, and I see the skulls, but I'm pretty confident it's the right way to go for me rn, so I'll continue onwards at full speed until I learn that it's not the right path.
At no point in your journey, no matter how uncertain you are, no matter how numerous the options before you, will dragging your feet help you get where you want to go any faster.
Sometimes I call it "hubris", but it's not really about that. If you make a bet that you'll do more good than Peter Singer or whatever, you don't need to be certain of the outcome to think it's good EV. But others who see you betting on it might mistake it for certainty--it's as if they forgot for a moment that probabilities exist.
This is one of the most important reasons why hubris is so undervalued. People mistakenly think the goal is to generate precise probability estimates for frequently-discussed hypotheses (a goal in which deference can make sense). In a common-payoff-game research community, what matters is making new leaps in model space, not converging on probabilities. We (the research community) are bottlenecked by insight-production, not marginally better forecasts or decisions. Feign hubris if you need to, but strive to install it as a defense against model-dissolving deference.
--
Oh, and it was nice to meet you in Berlin! Stay awesome. ^^
zeshen @ 2022-10-11T08:53 (+1)
Confidence-in-path rather than confidence-in-results.
Nicely said.
--
See you at GatherTown soon!
EdoArad @ 2022-10-09T05:09 (+8)
Thanks for writing this! The section on "stop rewarding modesty" was especially interesting and will stick with me
Emrik @ 2022-10-09T19:12 (+4)
Thanks! I expect that section is high-variance. Wasn't sure whether I should be so overtly miffed about it. :p
Arjun Panickssery @ 2022-10-09T19:15 (+6)
And what you optimise for should depend on what you're writing. Sometimes I primarily optimise for A) making readers have true beliefs about what I believe. Other times I primarily optimise for B) providing readers with tools that help them come to their own conclusions.
When I'm doing B, it doesn't matter what they believe I believe. If out of ten things I say, nine will be plain wrong, but one thing will be so right that it helps the reader do something amazing with their lives, I consider that a solid success.
cf. https://slatestarcodex.com/2019/02/26/rule-genius-in-not-out/
sphor @ 2022-10-09T23:13 (+3)
Great post :-)
EcologyInterventions @ 2022-10-09T18:32 (+3)
Very useful and illustrative. I especially like how you manage to tie both the personal perspective and the group dynamics together. I was acquainted with this idea but your write up was definitely illuminating of aspects I missed. I expect this to be useful to me and others!