Status Regulation and Anxious Underconfidence

By EliezerYudkowsky @ 2017-11-16T21:52 (+12)

Previous: Against Modest Epistemology


 

I’ve now given my critique of modesty as a set of explicit doctrines. I’ve tried to give the background theory, which I believe is nothing more than conventional cynical economics, that explains why so many aspects of the world are not optimized to the limits of human intelligence in the manner of financial prices. I have argued that the essence of rationality is to adapt to whatever world you find yourself in, rather than to be “humble” or “arrogant” a priori. I’ve tried to give some preliminary examples of how we really, really don’t live in the Adequate World where constant self-questioning would be appropriate, the way it is appropriate when second-guessing equity prices. I’ve tried to systematize modest epistemology into a semiformal rule, and I’ve argued that the rule yields absurd consequences.

I was careful to say all this first, because there’s a strict order to debate. If you’re going to argue against an idea, it’s bad form to start off by arguing that the idea was generated by a flawed thought process, before you’ve explained why you think the idea itself is wrong. Even if we’re refuting geocentrism, we should first say how we know that the Sun does not orbit the Earth, and only then pontificate about what cognitive biases might have afflicted geocentrists. As a rule, an idea should initially be discussed as though it had descended from the heavens on a USB stick spontaneously generated by an evaporating black hole, before any word is said psychoanalyzing the people who believe it. Otherwise I’d be guilty of poisoning the well, also known as Bulverism.

But I’ve now said quite a few words about modest epistemology as a pure idea. I feel comfortable at this stage saying that I think modest epistemology’s popularity owes something to its emotional appeal, as opposed to being strictly derived from epistemic considerations. In particular: emotions related to social status and self-doubt.

Even if I thought modesty were the correct normative epistemology, I would caution people not to confuse the correct reasoning principle with those particular emotional impulses. You’ll observe that I’ve written one or two things above about how not to analyze inadequacy, and mistakes not to make. If we’re going to take modest epistemology seriously as a basic reasoning mode, technique, or principle, we hear far too little from its advocates about its potential misuses and distortions.

And I’ll now try to describe the kinds of feelings that I think modesty’s appeal rests on. I’ve come to increasingly appreciate that human beings are genuinely different from one another, so don’t be surprised if it seems to you like this is not how you work. I claim nonetheless that many people do work like this.

 

i.

Let’s start with the emotion—not restricted to cases of modesty, just what I suspect to be a common human emotion—of “anxious underconfidence.”

As I started my current writing session, I had returned just ten minutes earlier from the following conversation with someone looking for a job in the Bay Area that would give them relevant experience for running their own startup later:

Eliezer: Are you a programmer?

Aspiring Founder: That’s what everyone asks. I’ve programmed at all of my previous jobs, but I wouldn’t call myself a programmer.

Eliezer: I think you should try asking (person) if they know of any startups that could use non-super programmers, and look for a non-doomed startup that’s still early-stage enough that you can be assigned some business jobs and get a chance to try your hand at that without needing to manage it yourself. That might get you the startup experience you want.

Aspiring Founder: I know how to program, but I don’t know if I can display that well enough. I don’t have a GitHub account. I think I’d have to spend three months boning up on programming problems before I could do anything like the Google interview—or maybe I could do one of the bootcamps for programmers—

Eliezer: I’m not sure if they’re aimed at your current skill level. Why don’t you try just one interview and see how that goes before you make any complicated further plans about how to prove your skills?

This fits into a very common pattern of advice I’ve found myself giving, along the lines of, “Don’t assume you can’t do something when it’s very cheap to try testing your ability to do it,” or, “Don’t assume other people will think poorly of you when it’s cheap to test that belief.”

I try to be careful to distinguish the virtue of avoiding overconfidence, which I sometimes call “humility,” from the phenomenon I’m calling “modest epistemology.” But even so, when overconfidence is such a terrible scourge according to the cognitive bias literature, can it ever be wise to caution people against underconfidence?

Yes. First of all, overcompensation after being warned about a cognitive bias is also a recognized problem in the literature; and the literature on that talks about how bad people often are at determining whether they’re undercorrecting or overcorrecting.1 Second, my own experience has been that while, yes, commenters on the Internet are often overconfident, it’s very different when I’m talking to people in person. My more recent experience seems more like 90% telling people to be less underconfident, to reach higher, to be more ambitious, to test themselves, and maybe 10% cautioning people against overconfidence. And yes, this ratio applies to men as well as women and nonbinary people, and to people considered high-status as well as people considered low-status.

Several people have now told me that the most important thing I have ever said to them is: “If you never fail, you’re only trying things that are too easy and playing far below your level.” Or, phrased as a standard Umeshism: “If you can’t remember any time in the last six months when you failed, you aren’t trying to do difficult enough things.” I first said it to someone who had set themselves on a career track to becoming a nurse instead of a physicist, even though they liked physics, because they were sure they could succeed at becoming a nurse.

I call this “anxious underconfidence,” and it seems to me to share a common thread with social anxiety. We might define “social anxiety” as “experiencing fear far in excess of what a third party would say are the reasonably predictable exterior consequences, with respect to other people possibly thinking poorly of you, or wanting things from you that you can’t provide them.” If someone is terrified of being present at a large social event because someone there might talk to them and they might be confused and stutter out an answer—when, realistically, this at worst makes a transient poor impression that is soon forgotten because they are not at the center of the other person’s life—then this is an excess fear of that event.

Similarly, many people’s emotional makeup is such that they experience what I would consider an excess fear—a fear disproportionate to the non-emotional consequences—of trying something and failing. A fear so strong that you become a nurse instead of a physicist because that is something you are certain you can do. Anything you might not be able to do is crossed off the list instantly. In fact, it was probably never generated as a policy option in the first place. Even when the correct course is obviously to just try the job interview and see what happens, the test will be put off indefinitely if failure feels possible.

If you’ve never wasted an effort, you’re filtering on far too high a required probability of success. Trying to avoid wasting effort—yes, that’s a good idea. Feeling bad when you realize you’ve wasted effort—yes, I do that too. But some people slice off the entire realm of uncertain projects because the prospect of having wasted effort, of having been publicly wrong, seems so horrible that projects in this class are not to be considered.

This is one of the emotions that I think might be at work in recommendations to take an outside view on your chances of success in some endeavor. If you only try the things that are allowed for your “reference class,” you’re supposed to be safe—in a certain social sense. You may fail, but you can justify the attempt to others by noting that many others have succeeded on similar tasks. On the other hand, if you try something more ambitious, you could fail and have everyone think you were stupid to try.

The mark of this vulnerability, and the proof that it is indeed a fallacy, would be not testing the predictions that the modest point of view makes about your inevitable failures—even when they would be cheap to test, and even when failure doesn’t lead to anything that a non-phobic third party would rate as terrible.

 

ii.

The other emotions I have in mind are perhaps easiest to understand in the context of efficient markets.

In humanity’s environment of evolutionary adaptedness, an offer of fifty carrots for a roasted antelope leg reflects a judgment about roles, relationships, and status. This idea of “price” is easier to grasp than the economist’s notion; and given that somebody doesn’t have the economist’s very specific notion in mind when you speak of “efficient markets,” they can end up making what I would consider an extremely understandable mistake.

You tried to explain to them that even if they thought AAPL stock was underpriced, they ought to question themselves. You claimed that they couldn’t manage to be systematically right on the occasions where the market price swung drastically. Not unless they had access to insider information on single stocks—which is to say, they just couldn’t do it.

But “I can’t do that. And you can’t either!” is a suspicious statement in everyday life. Suppose I try to juggle two balls and succeed, and then I try to juggle three balls and drop them. I could conclude that I’m bad at juggling and that other people could do better than me, which comes with a loss of status. Alternatively, I could heave a sad sigh as I come to realize that juggling more than two balls is just not possible. Whereupon my social standing in comparison to others is preserved. I even get to give instruction to others about this hard-won life lesson, and smile with sage superiority at any young fools who are still trying to figure out how to juggle three balls at a time.

I grew up with this fallacy, in the form of my Orthodox Jewish parents smiling at me and explaining how when they were young, they had asked a lot of religious questions too; but then they grew out of it, coming to recognize that some things were just beyond our ken.

At the time, I was flabbergasted at my parents’ arrogance in assuming that because they couldn’t solve a problem as teenagers, nobody else could possibly solve it going forward. Today, I understand this viewpoint not as arrogance, but as a simple flinch away from a painful thought and toward a pleasurable one. You can admit that you failed where success was possible, or you can smile with gently forgiving superiority at the youthful enthusiasm of those who are still naive enough to attempt to do better.

Of course, some things are impossible. But if one’s flinch response to failure is to perform a mental search for reasons one couldn’t have succeeded, it can be tempting to slide into false despair.

In the book Superforecasting, Philip Tetlock describes the number one characteristic of top forecasters, who show the ability to persistently outperform professional analysts and even small prediction markets: they believe that outperformance in forecasting is possible, and work to improve their performance.2

I would expect this to come as a shock to people who grew up steeped in academic studies of overconfidence and took away the lesson that epistemic excellence is mostly about accepting your own limitations.3 But I read that chapter of Superforecasting and laughed, because I was pretty sure from my own experience that I could guess what had happened to Tetlock: he had run into large numbers of respondents who smiled condescendingly at the naive enthusiasm of those who thought that anyone could get good at predicting future events.4

Now, imagine you’re somebody who didn’t read Superforecasting, but did at least grow up with parents telling you that if they’re not smart enough to be a lawyer, then neither are you. (As happened to a certain childhood friend of mine who is now a lawyer.)

And then you run across somebody who tries to tell you, not just that they can’t outguess the stock market, but that you’re not allowed to become good at it either. They claim that nobody is allowed to master the task at which they failed. Your uncle tripled his savings when he bet everything on GOOG, and this person tries to wave it off as luck. Isn’t that like somebody condescendingly explaining why juggling three balls is impossible, after you’ve seen with your own eyes that your uncle can juggle four?

This isn’t a naive question. Somebody who has seen the condescension of despair in action is right to treat this kind of claim as suspicious. It ought to take a massive economics literature examining the idea in theory and in practice, and responding to various apparent counterexamples, before we accept that a new kind of near-impossibility has been established in a case where the laws of physics seem to leave the possibility open.

Perhaps what you said to the efficiency skeptic was something like:

If it’s obvious that AAPL stock should be worth more because iPhones are so great, then a hedge fund manager should be able to see this logic too. This means that this information will already be baked into the market price. If what you’re saying is true, the market already knows it—and what the market knows beyond that, neither you nor I can guess.

But what they heard you saying was:

O thou, who burns with tears for those who burn,
In Hell, whose fires will find thee in thy turn
Hope not the Lord thy God to mercy teach
For who art thou to teach, or He to learn?5

This again is an obvious fallacy for them to suspect you of committing. They’re suggesting that something might be wrong with Y’s judgment of X, and you’re telling them to shut up because Y knows far better than them. Even though you can’t point to any flaws in the skeptic’s suggestion, can’t say anything about the kinds of reasons Y has in mind for believing X, and can’t point them to the information sources Y might be drawing from. And it just so happens that Y is big and powerful and impressive.

If we could look back at the ages before liquid financial markets existed, and record all of the human conversations that went on at the time, then practically every instance in history of anything that sounded like what you said about efficient markets—that some mysterious powerful being is always unquestionably right, though the reason be impossible to understand—would have been a mistake or a lie. So it’s hard to blame the skeptic for being suspicious, if they don’t yet understand how market efficiency works.

What you said to the skeptic about AAPL stock is justified for extremely liquid markets on short-term time horizons, but—at least I would claim—very rarely justified anywhere else. The claim is, “If you think you know the price of AAPL better than the stock market, then no matter how good the evidence you think you’ve found is, your reasoning just has some hidden mistake, or is neglecting some unspecified key consideration.” And no matter how valiantly they argue, no matter how carefully they construct their reasoning, we just smile and say, “Sorry, kid.” It is a final and absolute slapdown that is meant to be inescapable by any mundane means within a common person’s grasp.

Indeed, this supposedly inescapable and crushing rejoinder looks surprisingly similar to a particular social phenomenon I’ll call “status regulation.”

 

iii.

Status is an extremely valuable resource, and was valuable in the ancestral environment.

Status is also a somewhat conserved quantity. Not everyone can be sole dictator.

Even if a hunter-gatherer tribe or a startup contains more average status per person than a medieval society full of downtrodden peasants, there’s still a sense in which status is a limited resource and letting someone walk off with lots of status is like letting them walk off with your bag of carrots. So it shouldn’t be surprising if acting like you have more status than I assign to you triggers a negative emotion, a slapdown response.

If slapdowns exist to limit access to an important scarce resource, we should expect them to be cheater-resistant in the face of intense competition for that resource.6 If just anyone could find some easy sentences to say that let them get higher status than God, then your system for allocating status would be too easy to game. Escaping slapdowns should be hard, generally requiring more than mere abstract argumentation.

Except that people are different. So not everyone feels the same way about this, any more than we all feel the same way about sex.

As I’ve increasingly noticed of late, and contrary to beliefs earlier in my career about the psychological unity of humankind, not all human beings have all the human emotions. The logic of sexual reproduction makes it unlikely that anyone will have a new complex piece of mental machinery that nobody else has… but absences of complex machinery aren’t just possible; they’re amazingly common.

And we tend to underestimate how different other people are from ourselves. Once upon a time, there used to be a great and acrimonious debate in philosophy about whether people had “mental imagery” (whether or not people actually see a little picture of an elephant when they think about an elephant). It later turned out that some people see a little picture of an elephant, some people don’t, and both sides thought that the way they personally worked was so fundamental to cognition that they couldn’t imagine that other people worked differently. So both sides of the philosophical debate thought the other side was just full of crazy philosophers who were willfully denying the obvious. The typical mind fallacy is the bias whereby we assume most other people are much more like us than they actually are.

If you’re fully asexual, then you haven’t felt the emotion others call “sexual desire”… but you can feel friendship, the warmth of cuddling, and in most cases you can experience orgasm. If you’re not around people who talk explicitly about the possibility of asexuality, you might not even realize you’re asexual and that there is a distinct “sexual attraction” emotion you are missing, just like some people with congenital anosmia never realize that they don’t have a sense of smell.

Many people seem to be the equivalent of asexual with respect to the emotion of status regulation—myself among them. If you’re blind to status regulation (or even status itself) then you might still see that people with status get respect, and hunger for that respect. You might see someone with a nice car and envy the car. You might see a horrible person with a big house and think that their behavior ought not to be rewarded with a big house, and feel bitter about the smaller house you earned by being good. I can feel all of those things, but people’s overall place in the pecking order isn’t a fast, perceptual, pre-deliberative thing for me in its own right.

For many people, I gather that the social order is a reified emotional thing separate from respect, separate from the goods that status can obtain, separate from any deliberative reasoning about who ought to have those goods, and separate from any belief about who consented to be part of an implicit community agreement. There’s just a felt sense that some people are lower in various status hierarchies, while others are higher; and overreaching by trying to claim significantly more status than you currently have is an offense against the reified social order, which has an immediate emotional impact, separate from any beliefs about the further consequences that a social order causes. One may also have explicit beliefs about possible benefits or harms that could be caused by disruptions to the status hierarchy, but the status regulation feeling is more basic than that and doesn’t depend on high-level theories or cost-benefit calculations.

Consider, in this context, the efficiency skeptic’s perspective:

Skeptic: I have to say, I'm baffled at your insistence that hedge fund managers are the summit of worldly wisdom. Many hedge fund managers—possibly most—are nothing but charlatans who convince pension managers to invest money that ought to have gone into index funds.

Cecie: Markets are a mechanism that allow and incentivize a single smart participant to spot a bit of free energy and eat it, in a way that aggregates to produce a global equilibrium with no free energy. We don’t need to suppose that most hedge fund managers are wise; we only need to suppose that a tiny handful of market actors are smart enough in each case to have already seen what you saw.

Skeptic: I’m not sure I understand. It sounds like what you’re saying, though, is that your faith is not in mere humans, but in some mysterious higher force, the “Market.”

You consider this Market incredibly impressive and powerful. You consider it folly for anyone to think that they can know better than the Market. And you just happen to have on hand a fully general method for slapping down anyone who dares challenge the Market, without needing to actually defend this or that particular belief of the Market.

Cecie: A market’s efficiency doesn’t derive from its social status. True efficiency is very rare in human experience. There’s a very good reason that we had to coin a term for the concept of “efficient markets,” and not “efficient medicine” or “efficient physics”: in ecologies like medicine and physics, not just anyone can come along and consume a morsel of free energy.

If you personally know better than the doctors in a hospital, you can’t walk in off the street tomorrow and make millions of dollars saving more patients’ lives. If you personally know better than an academic field, you can't walk in off the street tomorrow and make millions of dollars filling the arXiv with more accurate papers.

Skeptic: I don’t know. The parallels between efficiency and human status relations seem awfully strong, and this “Market moves in mysterious ways” rejoinder seems like an awfully convenient trick.

Indeed, I would be surprised if there weren’t at least some believers in “efficient markets” who assigned them extremely high status and were tempted to exaggerate their efficiency, perhaps feeling a sense of indignation at those who dared to do better. Perhaps there are people who feel an urge to slap down anyone who starts questioning the efficiency of Boomville’s residential housing market.

So be it; Deepak Chopra can’t falsify quantum mechanics by being enthusiastic about a distorted version of it. The efficiency skeptic should jettison their skepticism, and should take care to avoid the fallacy fallacy—the fallacy of taking for granted that some conclusion is false just because a fallacious argument for that conclusion exists.7

I once summarized my epistemology like so: “Try to make sure you’d arrive at different beliefs in different worlds.” You don’t want to think in such a way that you wouldn’t believe a conclusion in a world where it was true, just because a fallacious argument could also be made for it. Emotionally appealing mistakes are not invincible cognitive traps that nobody can ever escape from. Sometimes they’re not even that hard to escape.

The remedy, as usual, is technical understanding. If you know in detail when a phenomenon switches on and off, and when the “inescapable” slapdown is escapable, you probably won’t map it onto God.

 

iv.

I actually can’t recall seeing anyone make the mistake of treating efficient markets like high-status authorities in a social pecking order.8 The more general phenomenon seems quite common, though: heavily weighting relative status in determining odds of success; responding to overly ambitious plans as though they were not merely imprudent but impudent; and privileging the hypothesis that authoritative individuals and institutions have mysterious unspecified good reasons for their actions, even when these reasons stubbornly resist elicitation and the actions are sufficiently explained by misaligned incentives.

From what I can tell, status regulation is a second factor accounting for modesty’s appeal, distinct from anxious underconfidence. The impulse is to construct “cheater-resistant” slapdowns that can (for example) prevent dilettantes who are low on the relevant status hierarchy from proposing new SAD treatments. Because if dilettantes can exploit an inefficiency in a respected scientific field, then this makes it easier to “steal” status and upset the current order.

In the past, I didn’t understand that an important part of status regulation, as most people experience it, is that one needs to already possess a certain amount of status before it’s seen as acceptable to reach up for a given higher level of status. What could be wrong (I previously thought) with trying to bestow unusually large benefits upon your tribe? I could understand why it would be bad to claim that you had already accomplished more than you had—to claim more respect than was due the good you’d already done. But what could be wrong with trying to do more good for the tribe, in the future, than you already had in the present?

It took me a long time to understand that trying to do interesting things in the future is a status violation because your current status determines what kinds of images you are allowed to associate with yourself, and if your status is low, then many people will intuitively perceive an unpleasant violation of the social order should you associate with yourself an image of possible future success above some level. Only people who already have something like an aura of pre-importance are allowed to try to do important things. Publicly setting out to eventually do valuable and important things is above the status you already have now, and will generate an immediate system-1 slapdown reaction.

I recognize now that this is a common lens through which people see the world, though I still don’t know how it feels to feel that.

Regardless, when I see a supposed piece of epistemology that looks to me an awful lot like my model of status regulation, but which doesn’t seem to cohere with the patterns of correct reasoning described by theorists like E. T. Jaynes, I get suspicious. When people cite the “outside view” to argue that one should stick to projects whose ambition and impressiveness befit one’s “reference class,” and announce that any effort to significantly outperform the “reference class” is epistemically suspect “overconfidence,” and insist that moving to take into account local extenuating factors, causal accounts, and justifications constitutes an illicit appeal to the “inside view” and we should rely on more obvious, visible, publicly demonstrable signs of overall auspiciousness or inauspiciousness… you know, I’m not sure this is strictly inspired by the experimental work done on people estimating their Christmas shopping completion times.

I become suspicious as well when this model is deployed in practice by people who talk in the same tone of voice that I’ve come to associate with status regulation, and when an awful lot of what they say sounds to me like an elaborate rationalization of, “Who are you to act like some kind of big shot?”

I observe that many of the same people worry a lot about “What do you say to the Republican?” or the possibility that crackpots might try to cheat—like they’re trying above all to guard some valuable social resource from the possibility of theft. I observe that the notion of somebody being able to steal that resource and get away with it seems to inspire a special degree of horror, rather than just being one more case of somebody making a mistaken probability estimate.

I observe that attempts to do much better than is the norm elicit many heated accusations of overconfidence. I observe that failures to even try to live up to your track record or to do as well as a typical member of some suggested reference class mysteriously fail to elicit many heated accusations of underconfidence. Underconfidence and overconfidence are symmetrical mistakes epistemically, and yet somehow I never see generalizations of the outside view even-handedly applied to correct both biases.

And so I’m skeptical that this reflects normative probability theory, pure epistemic rules such as aliens would also invent and use. Sort of like how an asexual decision theorist might be skeptical of an argument saying that the pure structure of decision theory implies that arbitrary decision agents with arbitrary biologies ought to value sex.

This kind of modesty often looks like the condescension of despair, or bears the “God works in mysterious ways” property of attributing vague good reasons to authorities on vague grounds. It’s the kind of reasoning that makes sense in the context of an efficient market, but it doesn’t seem to be coming from a model of the structure or incentives of relevant communities, such as the research community studying mood disorders.

No-free-energy equilibria do generalize beyond asset prices; markets are not the only ecologies full of motivated agents. But sometimes those agents aren’t sufficiently motivated and incentivized to do certain things, or the agents aren’t all individually free to do them. In this case, I think that many people are doing the equivalent of humbly accepting that they can’t possibly know whether a single house in Boomville is overpriced. In fact, I think this form of status-oriented modesty is extremely common, and is having hugely detrimental effects on the epistemic standards and the basic emotional health of the people who fall into it.

 

v.

Modesty can take the form of an explicit epistemological norm, or it can manifest in more quiet and implicit ways, as small flinches away from painful thoughts and towards more comfortable ones. It’s the latter that I think is causing most of the problem. I’ve spent a significant amount of time critiquing the explicit norms, because I think these serve an important role as canaries piling up in the coalmine, and because they are bad epistemology in their own right. But my chief hope is to illuminate that smaller and more quiet problem.

I think that anxious underconfidence and status regulation are the main forces motivating modesty, while concerns about overconfidence, disagreement, and theoreticism serve a secondary role in justifying and propagating these patterns of thought. Nor are anxious underconfidence and status regulation entirely separate problems; bucking the status quo is particularly painful when public failure is a possibility, and shooting low can be particularly attractive when it protects against accusations of hubris.

Consider the outside view as a heuristic for minimizing the risk of social transgression and failure. Relying on an outside view instead of an inside view will generally mean making fewer knowledge claims, and the knowledge claims will generally rest on surface impressions (which are easier to share), rather than on privileged insights and background knowledge (which imply more status).

Or consider the social utility of playing the fox's part. The fox can say that they rely only on humble data sets, disclaiming the hedgehog’s lofty theories, and disclaiming any special knowledge or special powers of discernment implied thereby. And by sticking to relatively local claims, or only endorsing global theories once they command authorities’ universal assent, the fox can avoid endorsing the kinds of generalizations that might encroach on someone else’s turf or otherwise disrupt a status hierarchy.

Finally, consider appeals to agreement. As a matter of probability theory, perfect rationality plus mutual understanding often entails perfect agreement. Yet it doesn’t follow from this that the way for human beings to become more rational is to try their best to minimize disagreement. An all-knowing agent will assign probabilities approaching 0 and 1 to all or most of its beliefs, but this doesn’t imply that the best way to become more knowledgeable is to manually adjust one’s beliefs to be as extreme as possible.

The behavior of ideal Bayesian reasoners is important evidence about how to become more rational. What this usually involves, however, is understanding how Bayesian reasoning works internally and trying to implement a causally similar procedure, not looking at the end product and trying to pantomime particular surface-level indicators or side-effects of good Bayesian inference. And a psychological drive toward automatic deference or self-skepticism isn’t the mechanism by which Bayesians end up agreeing to agree.

Bayes-optimal reasoners don’t Aumann-agree because they’re following some exotic meta-level heuristic. I don’t know of any general-purpose rule like that for quickly and cheaply leapfrogging to consensus, except ones that do so by sacrificing some amount of expected belief accuracy. To the best of my knowledge, the outlandish and ingenious trick that really lets flawed reasoners inch nearer to Aumann’s ideal is just the old-fashioned one where you go out and think about yourself and about the world, and do what you can to correct for this or that bias in a case-by-case fashion.
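
To make the contrast concrete, here is a minimal sketch (the agents, the coin, and all the numbers are purely illustrative assumptions, not anything from the text above) of how agreement falls out of actually sharing evidence and updating a common prior, rather than out of a deference heuristic. Pooling the underlying evidence yields one shared posterior; merely splitting the difference between the two agents’ stated probabilities generally yields a different number.

```python
# A toy Bayesian-updating sketch (illustrative numbers only): two agents share a
# prior over a coin's bias and each privately observes some flips. Exchanging the
# actual evidence and updating produces one shared posterior; averaging their
# stated probabilities is a different (and generally non-equivalent) operation.

def posterior(prior_weights, biases, heads, tails):
    """Posterior over candidate biases after observing the given flip counts."""
    unnorm = [p * (b ** heads) * ((1 - b) ** tails)
              for p, b in zip(prior_weights, biases)]
    total = sum(unnorm)
    return [u / total for u in unnorm]

def prob_next_heads(weights, biases):
    """Predictive probability that the next flip comes up heads."""
    return sum(w * b for w, b in zip(weights, biases))

biases = [0.3, 0.5, 0.7]        # candidate coin biases (hypothetical)
prior = [1 / 3, 1 / 3, 1 / 3]   # common prior shared by both agents

# Agent A privately sees 8 heads, 2 tails; Agent B sees 3 heads, 7 tails.
p_a = prob_next_heads(posterior(prior, biases, heads=8, tails=2), biases)
p_b = prob_next_heads(posterior(prior, biases, heads=3, tails=7), biases)

# Genuine convergence: pool the underlying evidence, update the common prior.
p_shared = prob_next_heads(posterior(prior, biases, heads=11, tails=9), biases)

# "Faked" agreement: just split the difference between the stated probabilities.
p_average = (p_a + p_b) / 2

print(f"Agent A alone:       {p_a:.3f}")
print(f"Agent B alone:       {p_b:.3f}")
print(f"Evidence pooled:     {p_shared:.3f}")   # what both end up believing
print(f"Averaged confidence: {p_average:.3f}")  # generally not the same number
```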

Whether applied selectively or consistently, the temptation of modesty is to “fake” Aumann agreement—to rush the process, rather than waiting until you and others can actually rationally converge upon the same views. The temptation is to call an early halt to risky lines of inquiry, to not claim to know too much, and to not claim to aspire to too much; all while wielding a fully general argument against anyone who doesn’t do the same.

And now that I’ve given my warning about these risks and wrong turns, I hope to return to other matters.

 

My friend John thought that there were hidden good reasons behind Japan’s decision not to print money. Was this because he thought that the Bank of Japan was big and powerful, and therefore higher status than a non-professional-economist like me?

I literally had a bad taste in my mouth as I wrote that paragraph.9 This kind of psychologizing is not what people epistemically virtuous enough to bet on their beliefs should spend most of their time saying to one another. They should just be winning hundreds of dollars off of me by betting on whether some AI benchmark will be met by a certain time, as my friend later proceeded to do. And then later he and I both lost money to other friends, betting against Trump’s election victory. The journey goes on.

I’m not scheming to taint all humility forever with the mere suspicion of secretly fallacious reasoning. That would convict me of the fallacy fallacy. Yes, subconscious influences and emotional temptations are a problem, but you can often beat those if your explicit verbal reasoning is good.

I’ve critiqued the fruits of modesty, and noted my concerns about the tree on which they grow. I’ve said why, though my understanding of the mental motions behind modesty is very imperfect and incomplete, I do not expect these motions to yield good and true fruits. But cognitive fallacies are not invincible traps; and if I spent most of my time thinking about meta-rationality and cognitive bias, I'd be taking my eye off the ball.10

 


 

Cross-posted to Less Wrong. Conclusion: Against Shooting Yourself in the Foot.

The full book is now available in print and electronic formats at equilibriabook.com.

 


 

  1. From Bodenhausen, Macrae, and Hugenberg (2003):

    [I]f correctional mechanisms are to result in a less biased judgment, the perceiver must have a generally accurate lay theory about the direction and extent of the bias. Otherwise, corrections could go in the wrong direction, they could go insufficiently in the right direction, or they could go too far in the right direction, leading to overcorrection. Indeed, many examples of overcorrection have been documented (see Wegener & Petty, 1997, for a review), indicating that even when a bias is detected and capacity and motivation are present, controlled processes are not necessarily effective in accurately counteracting automatic biases. 
  2. From Superforecasting: “The strongest predictor of rising into the ranks of superforecasters is perpetual beta, the degree to which one is committed to belief updating and self-improvement. It is roughly three times as powerful a predictor as its closest rival, intelligence.” 

  3. E.g., Alpert and Raiffa, “A Progress Report on the Training of Probability Assessors.” 

  4. Or rather, get better at predicting future events than intelligence agencies, company executives, and the wisdom of crowds. 

  5. From Edward FitzGerald’s Rubaiyat of Omar Khayyám. 

  6. The existence of specialized cognitive modules for detecting cheating can be seen, e.g., in the Wason selection task. Test subjects perform poorly when asked to perform a version of this task introduced in socially neutral terms (e.g., rules governing numbers and colors), but perform well when given an isomorphic version of the task that is framed in terms of social rules and methods for spotting violators of those rules. See Cosmides and Tooby, “Cognitive Adaptations for Social Exchange.” 

  7. Give me any other major and widely discussed belief from any other field of science, and I shall paint a picture of how it resembles some other fallacy—maybe even find somebody who actually misinterpreted it that way. It doesn’t mean much. There’s just such a vast array of mistakes human minds can make that if you rejected every argument that looks like it could maybe be guilty of some fallacy, you’d be left with nothing at all.

    It often just doesn’t mean very much when we find that a line of argument can be made to look “suspiciously like” some fallacious argument. Or rather: being suspicious is one thing, and being so suspicious that relevant evidence cannot realistically overcome a suspicion is another. 

  8. It’s a mistake that somebody could make, though, and people promoting ideas that are susceptible to fallacious misinterpretation do have an obligation to post warning signs. Sometimes it feels like I’ve spent my whole life doing nothing else. 

  9. Well, my breakfast might also have had something to do with it, but I noticed the bad taste while writing those sentences. 

  10. There’s more I can say about how I think modest epistemology and status dynamics work in practice, based on past conversations; but it would require me to digress into talking about my work and fiction-writing. For a supplemental chapter taking a more concrete look at these concepts, see https://equilibriabook.com/hero-licensing


null @ 2017-11-17T00:16 (+8)

It strikes me as much more prevalent for people to be overconfident in their own idiosyncratic opinions. If you see half of people are 90% confident in X and half of people are 90% confident in not-X, then you know on average they are overconfident. That's how most of the world looks to me.
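
As a rough check on the arithmetic in that first sentence, here is a toy sketch with made-up numbers: whichever way X turns out, at most half of those 90%-confident claims can be right, so stated confidence exceeds accuracy on average.

```python
# Toy illustration: ten people, all 90% confident, split evenly between X and not-X.
confidences = [0.9] * 10
believes_x = [True] * 5 + [False] * 5

for x_is_true in (True, False):            # whichever way the world turns out
    correct = [b == x_is_true for b in believes_x]
    accuracy = sum(correct) / len(correct)
    avg_confidence = sum(confidences) / len(confidences)
    print(f"X true? {x_is_true}: accuracy {accuracy:.0%} "
          f"vs stated confidence {avg_confidence:.0%}")

# Either way: accuracy 50%, stated confidence 90% -- overconfident on average.
```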

But no matter - they probably won't suffer much, because the meek do not inherit the Earth, at least not in this life.

People follow confidence in leaders, generating the pathological start-up founder who is sure they're 100x more likely to succeed than the base rate; someone who portrays themselves as especially competent in a job interview is more likely to be hired than someone who accurately appraises their merits; and I don't imagine deferring to a boring consensus brings more romantic success than elaborating on one's exciting contrarian opinions.

Given all this, it's unsurprising evolution has programmed us to place an astonishingly high weight on our own judgement.

While there are some social downsides to seeming arrogant, people who preach modesty here advocate going well beyond what's required to avoid triggering an anti-dominance reaction in others.

Indeed, even though I think strong modesty is epistemically the correct approach on the basis of reasoned argument, I do not and can not consistently live and speak that way, because all my personal incentives are lined up in favour of me portraying myself as very confident in my inside view.

In my experience it requires a monastic discipline to do otherwise, a discipline almost none possess.

null @ 2017-11-18T22:19 (+3)

Cross-posting a reply from FB:

> It strikes me as much more prevalent for people to be overconfident in their own idiosyncratic opinions. If you see half of people are 90% confident in X and half of people are 90% confident in not-X, then you know on average they are overconfident. That's how most of the world looks to me.

This seems consistent with Eliezer's claim that "commenters on the Internet are often overconfident" while EAs and rationalists he interacts with in person are more often underconfident. In Dunning and Kruger's original experiment, the worst performers were (highly) overconfident, but the best performers were underconfident.

Your warnings that overconfidence and power-grabbing are big issues seem right to me. Eliezer's written a lot warning about those problems too. My main thought about this is just that different populations can exhibit different social dynamics and different levels of this or that bias; and these can also change over time. Eliezer's big-picture objection to modesty isn't "overconfidence and power-grabbing are never major problems, and you should never take big steps to try combat them"; his objection is "biases vary a lot between individuals and groups, and overcorrection in debiasing is commonplace, so it's important that whatever debiasing heuristics you use be sensitive to context rather than generically endorsing 'hit the brakes' or 'hit the accelerator'".

He then makes the further claim that top EAs and rationalists as a group are in fact currently more prone to reflexive deference, underconfidence, fear-of-failure, and not-sticking-their-neck-out than to the biases of overconfident startup founders. At least on Eliezer's view, this should be a claim that we can evaluate empirically, and our observations should then inform how much we push against overconfidence v. underconfidence.

The evolutionary just-so story isn't really necessary for that critique, though it's useful to keep in mind if we were originally thinking that humans only have overactive status-grabbing instincts, and don't also have overactive status-grab-blocking instincts. Overcorrection is already a common problem, but it's particularly likely if there are psychological drives pushing in both directions.