A Million Is Made Out Of A Million Ones

By Bentham's Bulldog @ 2025-08-08T20:15 (+39)

Maybe my favorite quote ever is from a recent Scott Alexander article, but in order to explain it, I will have to provide some context.

Scott’s recent piece, My Heart of Hearts, is broadly about why we should value ethical consistency. In it, he describes his emotional reaction to the Gaza conflict in the following way:

A few months ago, I read an article by an aid worker in Gaza recounting the horrors he’d seen. Among a long litany, one stood out. A little kid came into the hospital with a backpack. The doctors told him he had to put it down so they could treat him, and he refused. The doctors insisted. The kid fought back. Finally someone opened the bag. It was some body part fragments from the kid’s dead brother. He couldn’t bear to leave him, so he carried them everywhere he went.

I am a Real Man and therefore do not cry. But I confess to getting a little misty at this story, and I know exactly why. When my 1.5-year-old son wakes up early, the first words out of his mouth when I extract him from his crib are “Yaya? Yaya?” which is how he says his sister Lyra’s name. No matter how I distract him, he’ll keep saying “Yaya? Yaya?” and pointing at the door to her room until she wakes up, at which point he’ll get a big smile and run over to her. It’s impossible for me to read this story without imagining her body parts in the backpack and him saying “Yaya? Yaya?” in an increasingly distressed voice, over and over again, until the doctors drag him away.

So my absolute most honest and deepest opinion on the war in Gaza, the one I hold in my heart of hearts, is: I would kill everyone in the entire region, on both sides, if it would give that kid his brother back.

He goes on to say that even though this is his emotional reaction, it’s not his overall considered judgment. There are a great many people in the world undergoing tragedies as great as that young boy’s. In Gaza, there are thousands of parents who have lost children. Even though this single incident happened to affect him emotionally more than everything else did, in objective terms it’s not any worse than tens of thousands of other horrendous things happening in Gaza and Israel (though at this point, there is vastly more suffering in Gaza). Scott ends the essay in the following way:

No, I can’t actually feel emotions about everyone in Gaza, and I’m not sure anyone else can either. This doesn’t mean concern must be virtue signaling or luxury beliefs. It just means that it requires principle rather than raw emotion. One death is a tragedy, a million deaths is a statistic. But if you’re interested in having the dignity of a rational animal (a perfectly acceptable hobby! no worse than trying to get good at Fortnite or whatever!) then eventually you notice that a million is made out of a million ones and try to act accordingly.

I might etch on my tombstone the phrase “A million is made out of a million ones.”

None of us can really have fitting emotional reactions to the world’s evils. Even our sadness about the story Scott describes falls vastly short of the genuine horror of the situation. We almost never feel as bad about situations as they really are. We have about the same emotional reaction to 1,000 people dying as to 1,000,000 people dying, and a much stronger emotional reaction to one person dying in a sad way than to 1,000,000 dying in a less sad way. None of us have sat in a room and cried about the abstract tragedy of tuberculosis, even though it kills over a million people every year. Yet many of us have cried at sad movies depicting objectively far less terrible events, like the death of a single dog.

When you come to see that a million is made of a million ones, even if you can’t modulate your emotional reaction to care a million times as much about a million deaths as about a single death, you should recognize intellectually that it’s a million times worse and act accordingly. We should not let people needlessly die because our ability to feel emotions is poorly calibrated. We should not be jerked around like puppets by whichever cause happens to have upset us most that day.

I think in some sense the core intuition of effective altruism is that a million is made of a million ones.

Paul Bloom (with whom I recently spoke) has a book called Against Empathy. In it, he makes the case that empathy isn’t a great guide to figuring out which things are really important. We have empathy primarily for those who are like us. Someone who relied only on empathy would not care about shrimp or insects—or much about non-humans at all, because it’s hard to have empathy for such creatures. They would mostly care about whoever they find it easy to imagine themselves as, and would mostly ignore counterintuitive causes, or causes that are very bad in absolute terms but don’t pull on the heartstrings as much.

I don’t have a negative emotional reaction to shrimp suffering. When I read about shrimp farming, I don’t feel sad the way I do when I read about the scenario Scott described. But being a serious person involves recognizing that some things in the world are terrible, and worth acting to stop, even if they don’t upset you personally.

Humans can’t really get their heads around big numbers. We can just barely imagine a thousand deaths by imagining ten big lecture halls full of students being wholly exterminated. Larger numbers are statistics that we can’t intuitively grok.

A million is made out of a thousand thousands—as if every student in those ten lecture halls were replaced by ten more full lecture halls. And a billion is a thousand times that.
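To make the scaling explicit, here is the arithmetic spelled out (a quick sketch, assuming roughly a hundred students per lecture hall):

$$10 \text{ halls} \times 100 \text{ students} = 10^{3}, \qquad 10^{3} \times 10^{3} = 10^{6}, \qquad 10^{6} \times 10^{3} = 10^{9}.$$

Each step multiplies by a thousand, and each step past the first takes us beyond anything we can directly picture.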

There are about half a million annual deaths from malaria.

There are about 90 billion factory-farmed land animals killed each year.

There are about 440 billion shrimp tortured in farms each year.

I don’t have much emotional reaction to the last of these things. But I can imagine what it’s like to be in pain. When the shrimp have their eyes crushed and get placed into vast overcrowded containers of feces and filth and disease, I can abstractly recognize that something horrible is happening. Experiences that I’d give up quite a lot to avoid could very well be happening to each of those shrimp.

And there are 440 billion of them.

A billion is made of a thousand millions. A million is made of a million ones. 440 billion is made of 440 billion ones. If shrimp are conscious, which they probably are, then something awful is happening on a scale that can scarcely be fathomed, to a number of creatures that truly defies comprehension.

It sometimes feels crass to care about and give money to shrimp in a world full of suffering people. But we shouldn’t be guided by our emotions on this subject. When we recognize that in expectation horrible things are happening hundreds of billions of times, and that thousands of instances of this horror can be prevented per dollar, it becomes clear just how important preventing it is. We should not rely merely on empathy, because it’s a radically defective gauge of importance.
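To see what “thousands of instances per dollar” comes to, here is a back-of-the-envelope sketch (the 1,500-shrimp-per-dollar rate below is an illustrative assumption consistent with that claim, not a figure from this post):

$$\$100 \times 1{,}500 \ \frac{\text{shrimp}}{\$} = 150{,}000 \ \text{shrimp}.$$

Even if the true rate were ten times lower, a single modest donation would still reach tens of thousands of animals.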

But there are other numbers vastly larger than 440 billion. Bostrom estimated, as a lower bound under optimistic technological assumptions, that the far future could contain 10^52 people. Now, you shouldn’t take that number too seriously—it’s mostly just illustrative. But even a low probability of such vast numbers of future people existing means that the future is, in expectation, very large.

My emotional reaction to 10^52 happy flourishing people is just about the same as my emotional reaction to 10^30 flourishing people, even though the first has 10^22 times as many people, a factor roughly equal to the square of the number of humans who have ever lived. Ethical action in the world shouldn’t be held captive to our defective emotional responses and our brain’s inability to multiply.
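Since our brains won’t do the multiplication, here it is spelled out (taking the standard rough estimate of 10^11, about a hundred billion, humans ever having lived):

$$\frac{10^{52}}{10^{30}} = 10^{22} \approx \left(10^{11}\right)^{2}.$$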

I think effective altruism is, in some deep sense, about untethering our efforts to do good from bias, from emotion, and from the fact that we lack empathy for everyone whose stories don’t make it into the papers we read.

People sometimes say that effective altruists fetishize numbers. But I think this charge is exactly backwards. Effective altruists care about everyone in the world who is suffering and dying. Behind every number in a DALY calculation, there is a real person whose suffering and death is on the line. It’s ethically mandatory, not crass or numerically fetishistic, to care 100 times more about something that affects a hundred million people than about something that affects a million people. It has 100 times as many victims, and if we care seriously about victims, then we must not think of them merely as tiny and morally insignificant portions of a statistic. Something that affects 100 million people really is 100 million times as bad as something that affects only a single person.

If we really care about people, we care just as much about the one child who dies tragically in a way that upsets us as about the 1.1 millionth child who dies of tuberculosis. It’s not effective altruists who fetishize numbers. It’s non-effective-altruists who fetishize their own emotional reactions, caring more about them than about the victims of injustice.

In a sentence, effective altruism is about seeing that a million is made out of a million ones and trying to act accordingly. It’s about caring about problems in proportion to how severe they are, rather than only caring about what pulls on the heartstrings. It is about genuine altruism, rather than altruism directed only at sympathetic victims. It goes against our naive empathic feelings only because those feelings are wildly miscalibrated.

It is not the numbers that are cold and uncaring: it is us when we neglect the number of others suffering and dying. 


Denkenberger🔸 @ 2025-08-10T04:27 (+7)

As I learned from someone on this forum, "EAs are not cold and calculating - they are warm and calculating."

K.F. Martin @ 2025-08-21T02:22 (+1)

I like this sentiment. It's good to care about the one, I feel, and to take opportunities to help many, because they're all just like that one, with suffering and joy that's worth caring about, even alone.