Who's at fault for FTX's wrongdoing
By EliezerYudkowsky @ 2022-11-16T04:47 (+177)
Caroline Ellison, co-CEO and later CEO of Alameda, had a now-deleted blog, "worldoptimization" on Tumblr. One does not usually post excerpts from deleted blogs - the Internet has, of course, saved it by now - but it looks like Caroline violated enough deontology to be less protected than usual in turn, and also I think it's important for people to see what signals are apparently not reliable signs of honesty and goodness.
In a post on Oct 10 2022, Caroline Ellison crossposted her Goodreads review of The Golden Enclaves, book 3 of the Scholomance trilogy by Naomi Novik. Caroline Ellison writes, including very light / abstract spoilers only:
A pretty good conclusion to the series.
Biggest pro was the resolution of mysteries/open questions from the first two books. It wrapped everything up in a way that felt very satisfying.
Biggest con was … I think I felt less bought into the ethics of the story than I had for the previous two books?
The first two books often have a vibe of “you can either do the thing that’s easy and safe or you can do the thing that’s hard and scary but right, and being a good person is doing the right thing.” And I’m super on board with that.
Whereas if I had to sum up the moral message of the third book I might go with “there is no ethical consumption under late capitalism.”
For someone like myself, this is a pretty shocking thing to hear somebody say, on a Tumblr blog not then associated with their main corporate persona, not in a way that sounds like the usual performativity, not like it's meant to impress anybody (because then you're probably not writing about anything as undignified as fantasy fiction in the first place). It sounds like - Caroline might have been under the impression, as late as Oct 10, that what she was doing at FTX was the thing that's hard and scary but right? That she was doing, even, what Naomi Novik would have told her to do?
The Scholomance novels feature a protagonist, Galadriel Higgins, with unusually dark and scary powers, with a dark and scary prophecy about herself, trying to do the right thing anyways and being misinterpreted by her classmates, in an incredibly hostile environment.
The line of causality seems clear - Naomi Novik, by telling her readers to do the right thing, probably contributed to Caroline Ellison doing what she thought was the right thing - misusing Alameda's customer deposits. Furthermore, the Scholomance novels romanticized people with dark and scary powers, and those people not just immediately killing themselves in the face of a prophecy that they'd do immense harm later, i.e., sending the message that it's okay for them to take huge risks with other people's interests.
I expect this to be a very serious blow to Naomi Novik's reputation, possibly the reputation of fantasy fiction in general. The now-deleted Tumblr post is tantamount to a declaration that Caroline Ellison was doing this because she thought Naomi Novik told her to. We can infer that probably at least $30 of Scholomance sales are due to Caroline Ellison, and with the resources that Ellison commanded as co-CEO of Alameda, some unknown other fraction of Scholomance's entire revenues could have been due to phantom purchases that Ellison funded in order to channel customer deposits to her favorite author.
My moral here? It can also be summed up in an old joke that goes as follows: "He has no right to make himself that small; he is not that great."
The best summary of the FTX affair that I've read so far is Milky Eggs's "What Happened at Alameda Research?" If you haven't read it already, and you're at all interested in this affair, I recommend that you go read it right now.
Piecing together various sources, including some accounts allegedly shared by FTX employees (and some comments posted by those to the Effective Altruism Forum), Milky Eggs tells a harrowing story of how Alameda Research probably lost in excess of $15 billion. Primary causative factors:
- Their actual arb strategies stopped working, and were frog-boilingly gradually replaced with long bets on crypto that paid out during the boom and exploded during the bust;
- Poor accounting, possibly no real global accounting or sense of where the money was going;
- Excessive use of stimulants, including those known to result in compulsive gambling behavior;
- A corporate acquisitions spree, possibly partially motivated by buying up corporate entities that held the FTT token and could have tanked the market by dumping it, maybe even raiding those companies for their own customer deposits;
- A general lack of spending discipline: for example, buying naming rights to the e-sports organization TSM for $210M, which was way out of line with comparable deals in e-sports.
Completely missing from Milky Eggs's account: Any mention of effective altruism, except that the EA Forum is listed as a source for some of their alleged-ex-FTX-employee accounts.
Why?
Because - and I say this meaning it gently, and with kindness - you were not that fucking important.
The amount that FTX spent on e-sports naming rights for TSM was greater than everything they donated to effective altruism.
Can you imagine how you'd judge it if, rather than my writing it as a joke, Naomi Novik had gone online and sincerely tried to accept blame for FTX's fall, because she thought she hadn't been careful enough to put messages about good corporate governance and careful accounting into her fantasy novels, and Novik had talked about how she was planning to donate an appropriate portion of her Scholomance book royalties back to FTX's ruined customers? Depending on her state of mind, you might either try to gently console her and somehow get her to realize that she was being way too scrupulous and might possibly want to try standard meds for OCD at some point; or, on another hypothesis about Novik's state of mind, you might try to gently explain that she's not the center of the universe and that this wasn't mostly about her.
This would be true even if Sam Bankman-Fried himself had presented as a Naomi Novik fan, if he had told others that he wanted to be a Novik-style DoTheRightThingist just like Galadriel Higgins the Scholomance protagonist, and he had funneled $140M to causes having to do with things that were on-theme for some of Novik's books. The $140M would still be less than FTX had spent on e-sports naming rights. SBF calling himself a Novikian RightThingist would not have been much of a factor in why he was trusted, compared to FTX's claims of being the first GAAP-audited crypto exchange and so on.
There probably would be some sort of weird blowup in the Novik fandom, in that case; it would make more sense for them to wonder if they were responsible. But I'd expect people in the Novik fandom to also vastly overestimate how much it was all about them, in that case; because they would know all about Novik, but have less daily exposure to the much wider world in which FTX operated. They'd have heard about the money donated to RightThingism but not about the e-sports naming rights. They would not realize that there were other and bigger fish in the pond.
(Be it clear, I'm not analogizing myself to Novik in that metaphor. I'm analogizing Peter Singer and classical GiveWell-style EA to Novik. I asked SBF if he wanted to meet with me ever; he never got around to it. I do not think he was a Yudkowsky fan, and he hung out with some EAs who definitely weren't.)
(ADDED: I am not saying that EA influence on Alameda was comparable in magnitude to Novik's influence on Caroline Ellison; I am giving an example of the mental motion of trying to grab too much responsibility because you don't know about all the parts of the universe that aren't yourself.)
It wouldn't, even, reflect all that badly on the spirit running across many fantasy novels of RightThingism. Not just because "no true Scotsman", not even because SBF would have really actually missed the point of fantasy-novel RightThingism. But because the amount that FTX spent on e-sports naming rights vs the amount they gave to RightThingist causes, and how they didn't take a billion off the table for RightThingism while they still had a billion, maybe belied a bit the idea that RightThingism was in fact that central to their mental lives.
Also Milky Eggs's account says that FTX's own employees were encouraged to keep all their salaries on the exchange, which... I don't really have words. It's not - what you'd expect somebody to do if they still had even fantasy-novel RightThingism inside them. The Milky Eggs account says that Caroline Ellison was one of four FTX employees who knew. I wish I had a reliable printout of what Caroline Ellison was actually thinking at the time she wrote that Tumblr post. I would bet that, even without the benefit of hindsight on how it turned out, Naomi Novik wouldn't have agreed with it at the time.
And whatever Caroline Ellison was thinking when she wrote that, it is obvious - when you look at it from safely outside - that it wasn't Naomi Novik's fault.
If Caroline Ellison had worn a Naomi Novik T-shirt and put the Scholomance books in her Twitter profile and told her crypto clients "Trust me, I read fantasy novels and I know what the Right Thing is," it would still not have been Naomi Novik's fault.
It wouldn't have been the fault of the abstract concept of “you can either do the thing that’s easy and safe or you can do the thing that’s hard and scary but right, and being a good person is doing the right thing”. Plenty of people have read fantasy novels like that and not wrecked depository institutions. Not just in terms of moral responsibility but actual causality: I'd be surprised if that was really, in actual fact, a key driver in the decisions that Caroline Ellison made; maybe she used it to rationalize things afterwards, but I doubt it's what was going through her mind on the fatal day that FTX used customer deposits to pay back Alameda creditors (if that's in fact when FTX first touched customer deposits). Pride did it, I'd sooner guess, or the desire to not, not, not be in a universe going this badly, and to take the only step that preserved the feeling that everything could still be okay.
Who's at fault for FTX's wrongdoing?
FTX.
Ask a simple question, get a simple answer.
You have no right to blame yourself any more than that. You weren't that important.
If there's anyone other than FTX who's really to blame, here, it's me. I've written some fiction that tries to walk people through the experience of abandoning sunk costs and facing reality. Including my most recent work.
Caroline Ellison, according to her Tumblr, had even started reading it...
But her liveblogs cut out before she got very far in.
I just wasn't a good-enough writer; I lost my reader's attention, and with it, perhaps, the world.
Now, some people might say here: "But Eliezer, aren't you co-writing that story with another author?" And to this I can only reply: I see no reason why the existence of any other people in the universe ought to detract from my own sole accountability for everything that anyone does inside it.
Habryka @ 2022-11-16T19:18 (+131)
DM conversation I had with Eliezer in response to this post. Since it was a private convo and I was writing quickly, I somewhat exaggerated in a few places, which I've now indicated with edits.
Hmm, I do feel like I maybe want to have some kind of public debate about whether indeed we could have noticed that a bunch of stuff about FTX was noticeable, and whether we have some substantial blame to carry.
Like, to be clear, I think the vast majority of EAs had little they could have or should have done here. But I think that I, and a bunch of people in the EA leadership, had the ability to actually do something about this.
I sent emails in which I warned people of SBF. I had messages drafted that I never sent, which seem to me like, if I had sent them, they would have actually caused people to realize a bunch of inconsistencies in Sam's story. I had sat my whole team down, sworn them to secrecy, and told them various pretty clearly illegal things that I heard Sam had done [sadly all unconfirmed, asking for confidentiality and only in rumors] that convinced me that we should avoid doing business with him as much as possible (this was when we were considering whether to do a bunch of community building in the Bahamas). Like, in my worldview, I did do my due diligence, and FTX completely failed my due diligence, and I just failed to somehow propagate that knowledge.
Also, ultimately Sam's social group was my social group. The author of the Scholomance books did not also happen to hang out with Sam at multiple 5-day retreats in the last year. They are not close friends with metamours of Sam and Caroline. They do not share 75% of their friends, do not hang out in the same office and do not read the same forums, all in addition to not subscribing explicitly to the same ethical philosophy that's centered around a few thousand pages of writing. [To clarify some exaggerations, I think I share more like 30% of friends with FTX leadership, and Caroline visited Constellation, a Berkeley coworking space that I sometimes work out of, a number of times in the last year, though neither she nor Sam usually work there]
Like, I am open to there being nothing I could have done, but the conclusion doesn't seem obvious to me at the moment.
I asked Sam a couple of times if we could schedule a long conversation, and he never got back to me, go figure. I passed through the Bahamas and ended up meeting with the Future Fund people but not, iirc, the FTX people. I don't have the sense that SBF was one of My People. If Caroline was one of My People and not just somebody who read some EY fiction but ultimately a Singer/GiveWell type, I haven't yet heard the account of it.
The question isn't whether there's anything we could've done, but anything we could've done in a more meaningful sense than I "could've" recced Bitcoin to HPMOR readers in 2010.

I mean, I am a lot more embedded in the EA space beyond MIRI than you are, and I mean, I think there is an important sense in which I don't think you had the relevant info, but I do think a bunch of other people had.
I do also think it is pretty unlikely we could have prevented FTX exploding, though I do think we could have likely prevented FTX being super embedded in the EA Community, having a ton of people rely on its funding, and having Sam be held up in tons of places as a paragon of the EA community.
Like, I think we fucked up pretty hard by saying for a few years that we think Sam did great by our values, when I think it was already pretty clear by that point that he quite likely wasn't.
I wouldn't mind if you wanted to post that, or if you wanted to post this whole conversation. My experience was much more of FTX being some weird distant people who were doing a kind of longtermism that never intersected with much I considered useful until their regranting program started.
Cool, I might post this chat.
Evan R. Murphy @ 2022-11-17T06:22 (+42)
Commending Habryka for being willing to share about these things. It takes courage, and I think reflections/discussions like this could be really valuable (perhaps essential) to the EA community having the kind of reckoning about FTX that we need.
Holly_Elmore @ 2022-11-17T05:57 (+26)
Great points, all. Even if most people could do nothing and Sam was not motivated by a core problem with EA philosophy, that doesn’t mean there was nothing that EAs close to the situation could have done differently. I would love to see a public airing of what genuine evidence people think they might have had that should have changed those people’s behavior around Sam.
strawberry @ 2022-11-18T06:42 (+10)
I think I share more like 30% of friends with FTX leadership
Assuming that this means that the FTX leadership is friends with prominent EAs, I think that this fact raises some questions that many people might consider important.
For instance, I think some people might find it important to know what those friends have been doing with respect to this situation for the past week. What sort of communication have they had with the FTX leadership? Do they still feel loyalty toward SBF/Caroline/etc.? Are they in any way aiding or abetting them to commit crimes or avoid the legal or reputational consequences of their actions?
These might be dumb questions, and I apologize if so. They occurred to me because I model people as being quite likely to aid and abet with their close friends' criminal or malicious activity, but I acknowledge that that model could be wrong and/or not very applicable to this situation.
pseudonym @ 2022-11-16T06:00 (+126)
I'm analogizing Peter Singer and classical Givewell-style EA to Novik.
What about the parts of EA that aren't Peter Singer and classical GiveWell-style EA? If those parts of EA were somewhat responsible, would it be reasonable to call that EA as well?
I don't think the analogy is helpful. Naomi Novik presumably does not claim to emphasize the importance of understanding tail risks. Naomi presumably didn't meet Caroline and encourage her to earn a lot of money so she can donate to fantasy authors, nor did Caroline say "I'm earning all of this money so I can fund Naomi Novik's fantasy writing". Naomi Novik did not have Caroline on her website as a success story of "this is why you should earn money to buy fantasy books or support other fantasy writers". Naomi didn't have a "Fantasy writer's fund" with the FTX brand on it.
I think it's reasonable to preach patience if you think people are jumping too quickly to blame themselves. I think it's reasonable to think that EA is actually less responsible than the current state of discourse on the forum suggests. And I'm not making a claim about the extent to which EA is in fact responsible for the events. But the analogy as written is pretty poor, and doesn't really make a good case for saying EA has zero responsibility here (emphasis added):
Who's at fault for FTX's wrongdoing?
FTX.
Ask a simple question, get a simple answer.
You have no right to blame yourself any more than that. You weren't that important.
EliezerYudkowsky @ 2022-11-16T06:21 (+37)
I agree that if I, personally, had steered SBF into crypto, and uncharacteristically failed to add on a lot of "hey but please don't scam people, only do this if you find a kind of crypto you can feel good about" I might consider myself more at fault. I even think that the Singer side of EA in fact does less talking about deontology, less writing of fiction that exemplifies the feelings and reasoning behind that deontology, less cautioning of people against twisting up their brains by chasing good ideas; on my view, the Singer side explicitly starts by trying to twist people's brains up internally, and at some point we should all maybe have a conversation about that.
The thing is, if you want to be sane about this sort of thing, even so and regardless I think Peter Singer himself would not have approved this, would obviously not have approved this. When somebody goes that far off the rails, I just don't see how you could reasonably hold responsible people who didn't tell them to do that and would've obviously not wanted them to do that.
Sam Elder @ 2022-11-16T12:22 (+27)
I agree that if I, personally, had steered SBF into crypto, and uncharacteristically failed to add on a lot of "hey but please don't scam people, only do this if you find a kind of crypto you can feel good about" I might consider myself more at fault.
Given how big of a role EA apparently had in the origin of Alameda (Singh says in the Sequoia puff piece that it wouldn’t have started without EA), there very likely are many members of the community who offered more encouragement and/or didn’t give as many warnings as they should have.
I don't know at what point that fault transcends the individual and attaches to the community, but at the very least, adding up other individuals' culpabilities in steering SBF to crypto without appropriate caution would seem to put a lot of the blame you say you personally avoid on EA as a whole.
Milan_Griffes @ 2022-11-16T18:21 (+100)
Here are some excerpts from Sequoia Capital's profile on SBF (published September 2022, now pulled).
On career choice:
Not long before interning at Jane Street, SBF had a meeting with Will MacAskill, a young Oxford-educated philosopher who was then just completing his PhD. Over lunch at the Au Bon Pain outside Harvard Square, MacAskill laid out the principles of effective altruism (EA). The math, MacAskill argued, means that if one’s goal is to optimize one’s life for doing good, often most good can be done by choosing to make the most money possible—in order to give it all away. “Earn to give,” urged MacAskill.
...
It was his fellow [fraternity members] who introduced SBF to EA and then to MacAskill, who was, at that point, still virtually unknown. MacAskill was visiting MIT in search of volunteers willing to sign on to his earn-to-give program.
At a café table in Cambridge, Massachusetts, MacAskill laid out his idea as if it were a business plan: a strategic investment with a return measured in human lives. The opportunity was big, MacAskill argued, because, in the developing world, life was still unconscionably cheap. Just do the math: At $2,000 per life, a million dollars could save 500 people, a billion could save half a million, and, by extension, a trillion could theoretically save half a billion humans from a miserable death.
MacAskill couldn’t have hoped for a better recruit. Not only was SBF raised in the Bay Area as a utilitarian, but he’d already been inspired by Peter Singer to take moral action. During his freshman year, SBF went vegan and organized a campaign against factory farming. As a junior, he was wondering what to do with his life. And MacAskill—Singer’s philosophical heir—had the answer: The best way for him to maximize good in the world would be to maximize his wealth.
SBF listened, nodding, as MacAskill made his pitch. The earn-to-give logic was airtight. It was, SBF realized, applied utilitarianism. Knowing what he had to do, SBF simply said, “Yep. That makes sense.” But, right there, between a bright yellow sunshade and the crumb-strewn red-brick floor, SBF’s purpose in life was set: He was going to get filthy rich, for charity’s sake. All the rest was merely execution risk.
His course established, MacAskill gave SBF one last navigational nudge to set him on his way, suggesting that SBF get an internship at Jane Street that summer.
In 2017, everything was going great for SBF. He was killing it at Jane Street... He was giving away 50 percent of his income to his preferred charities, with the biggest donations going to the Centre for Effective Altruism and 80,000 Hours. Both charities focus on building the earn-to-give idea into a movement. (And both had been founded by Will MacAskill a few years before.) He had good friends, mostly fellow EAs. Some were even colleagues.... [much further down in the profile]
So when, that next summer, MacAskill sat with SBF in Harvard Square and carefully explained, in the way only an Oxford-educated philosopher can, that the practice of effective altruism boils down to “applied utilitarianism,” Snipe’s arrow hit SBF hard. He’d found his path. He would become a maximization engine. As he wrote in his blog, “If you’ve decided that some of your time—or money—can be better spent on others than on yourself, well, then, why not more of it? Why not all of it?”
On deciding what to do after leaving Jane Street:
SBF made a list of possible options, with some notes about each:
- Journalism—low pay, but a massively outsized impact potential.
- Running for office—or maybe just being an advisor?
- Working for the movement—EA needs people!
- Starting a startup—but what, exactly?
- Bumming around the Bay Area for a month or so—just to see what happens.
On setting up the initial Japanese Bitcoin arbitrage at Alameda:
Fortunately, SBF had a secret weapon: the EA community. There’s a loose worldwide network of like-minded people who do each other favors and sleep on each other’s couches simply because they all belong to the same tribe. Perhaps the most important of them was a Japanese grad student, who volunteered to do the legwork in Japan. As a Japanese citizen, he was able to open an account with the one (obscure, rural) Japanese bank that was willing, for a fee, to process the transactions that SBF—newly incorporated as Alameda Research—wanted to make.
The spread between Bitcoin in Japan and Bitcoin in the U.S. was “only” 10 percent—but it was a trade Alameda found it could make every day. With SBF’s initial $50,000 compounding at 10 percent each day, the next step was to increase the amount of capital.
At the time, the total daily volume of crypto trading was on the order of a billion dollars. Figuring he wanted to capture 5 percent of that, SBF went looking for a $50 million loan. Again, he reached out to the EA community. Jaan Tallinn, the cofounder of Skype, put up a good chunk of that initial $50 million.
On the early days at Alameda:
The first 15 people SBF hired, all from the EA pool, were packed together in a shabby, 600-square-foot walk-up, working around the clock. The kitchen was given over to stand-up desks, the closet was reserved for sleeping, and the entire space overrun with half-eaten take-out containers. It was a royal mess. But it was also the good old days, when Alameda was just kids on a high-stakes, big-money, earn-to-give commando operation. Fifty percent of Alameda’s profits were going to EA-approved charities.
“This thing couldn’t have taken off without EA,” reminisces Singh, running his hand through a shock of thick black hair. He removes his glasses to think. They’re broken: A chopstick has been Scotch taped to one of the frame’s sides, serving as a makeshift temple. “All the employees, all the funding—everything was EA to start with.”
On how he was thinking about future earnings:
“Am I,” [reporter asks], “talking to the world’s first trillionaire?”
...
“Maybe let’s take a step back,” he says, only to launch into an explanation of his own, personal utility curve: “Which is to say, if you plot dollars-donated on the X axis, and Y is how-much-good-I-do-in-the-world, then what does that curve look like? It’s definitely not linear—it does tail off, but I think it tails off pretty slowly.”
His point seems to be that there is, out there somewhere, a diminishing return to charity. There’s a place where even effective altruism ceases to be effective. “But I think that, even at a trillion, there’s still really significant marginal utility to dollars donated.”
...
“So, is five trillion all you could ever use to help the world?”
...
“Okay, at that scale, I think the answer might be yes. Because, if your spending is on the scale of the U.S. government, it might have too weird and distortionary an impact on things.”
... so, money spent now will be more effective at making the world a better place than money spent later. “I think there are some things that are pretty urgent,” SBF says. “There’s just a long series of crucial considerations, and all of them matter a lot—and you can’t fuck any of them up, or you miss most of the total value that you could ever get.”
To be clear, SBF is not talking about maximizing the total value of FTX—he’s talking about maximizing the total value of the universe. And his units are not dollars: In a kind of GDP for the universe, his units are the units of a utilitarian. He’s maximizing utils, units of happiness. And not just for every living soul, but also every soul—human and animal—that will ever live in the future. Maximizing the total happiness of the future—that’s SBF’s ultimate goal. FTX is just a means to that end.
On what differentiates FTX in crypto:
The FTX competitive advantage? Ethical behavior. SBF is a Peter Singer–inspired utilitarian in a sea of Robert Nozick–inspired libertarians. He’s an ethical maximalist in an industry that’s overwhelmingly populated with ethical minimalists. I’m a Nozick man myself, but I know who I’d rather trust my money with: SBF, hands-down. And if he does end up saving the world as a side effect of being my banker, all the better.
On the EA community in the Bahamas that congealed around FTX:
A cocktail party is in full swing, with about a dozen people I don’t recognize standing around. It turns out to be a mixer for the local EA community that’s been drawn to Nassau in the hopes that the FTX Foundation will fund its various altruistic ideas. The point of the party is to provide a friendly forum for the EAs who actually run EA-aligned nonprofits to meet the earn-to-give EAs at FTX who will fund them, and vice versa. The irony is that, while FTX hosts the weekly mixer—providing the venue and the beverages—it’s rare for an actual FTX employee to ever show up and mix. Presumably, they’re working too hard.
...
“Imagine nerds invented a religion or something,” says Woods, stabbing at my question with vigor, “where people get to argue all day.”
“It’s… an ideology,” counters Morrison. The argument has begun.
Woods amiably disagrees: “EA is not an ideology, it’s a question: ‘How do I do the most good?’ And the cool thing about EA, compared to other cause areas, is that you can change your views constantly—and still be part of the movement.”
...
Woods serves up an answer to my question. (Fittingly, she’s wearing tennis whites.) “EA attracts people who really care, but who are also really smart,” she says. “If you are altruistic but not very smart, you just bounce off. And if you’re smart but not very altruistic,” she continues, “you can get nerd sniped!”
...
“This ties into the way FTX is doing its foundation,” Morrison says, helpfully knocking the ball back to my true interest. “The foundation wants to get a lot of money out there in order to try a lot of things quickly. And how can you do that effectively?” It’s a rhetorical question, a move worthy of a preppy debate champ who went to a certain finishing school in Cambridge—which is exactly what Morrison is. “Part of the answer is to give money to someone in the EA community.”
“Because EA is different from other communities,” Woods continues, picking up right where Morrison left off. “They’re like, ‘This is the ethical thing, and this is the truth.’ And we’re like, ‘What is the ethical thing? What is the truth?’”
Following your analogy, if a fan of Novik had:
- been convinced by Novik to dedicate their career to the Novikian ethic
- been pointed by Novik to a promising first job in that career path
- decided to leave that promising first job on the basis of Novikian reasoning, framing the question of what to do next in Novikian terms
- worked with a global network of Novikians to implement an international crypto arbitrage
- received seed funding from a prominent Novikian to scale up this arbitrage
- exclusively hired Novikians to continue scaling the arbitrage once it started working
- thought about forward-facing professional decisions strictly in terms of the Novikian ethic
- used their commitment to Novikianism to garner a professional edge in their industry
- used a large portion of the proceeds of their business to fund Novikian projects, overseen by a foundation staffed exclusively by elite Novikians and advised by Novik herself
- fostered a community of Novikians around their lavish corporate headquarters
... then I think it would be fair to attribute some of the impact of their actions to Novikianism.
Richard Möhn @ 2022-11-17T01:55 (+36)
Some corrections of the Sequoia info:
- I've never been a grad student.
- I'm neither Japanese nor a Japanese citizen.
- I ‘volunteered’ in the sense that people at Alameda reached out to me, I said ok and then got paid by the hour for my help.
- ‘(obscure, rural)’ is an exaggeration. ‘provincial’ would be a more apt adjective for the location. The main bank we used was SMBC, the second-largest bank in Japan.
- ‘for a fee’ sounds as if it was some sort of bribe to get them to do what we wanted. But we only paid the usual transaction fees and margin that any bank would charge.
But mostly, if https://forum.effectivealtruism.org/posts/xafpj3on76uRDoBja/the-ftx-future-fund-team-has-resigned-1?commentId=hpP8EjEt9zTmWKFRy is accurate, I'm bummed that the money I helped earn was squandered right away.
Dr. David Mathers @ 2022-11-16T18:49 (+19)
Definitely: you are obviously right and Eliezer obviously wrong about this, imho.
BUT
I do think it is hindsight bias to some degree to think that "EA" as a collective or Will MacAskill as an individual can be regarded as doing something wrong, in the sense of "predictably a bad idea", at any point in the passages you quote. (I know you didn't actually claim that!) It's not immoral to tell someone to found a business, so it's definitely not immoral to tell someone to found a business and give to charity. It's not immoral to help someone make a legal, non-scammy trade, as the anonymous Japanese EA apparently did ("buy low and sell high" is not poor business ethics as far as I know, though I'm prepared to be corrected about that by someone who actually knows finance). It's a bit more controversial to say it's not wrong to take very rich people's money to do the sort of work EA charities do, but it's certainly not obvious that it is, and nothing in the quoted passages actually shows that any individual had evidence that FTX were a bad org to be associated with. (They may well have; I'm not saying no one did wrong, I'm just saying no wrongdoing is suggested by the information quoted here.) Furthermore, "take money from rich people for philanthropy and speculative academic research" isn't exactly a uniquely EA practice!
That leaves suggesting FTX think in utilitarian terms about maximizing, but I think it is obviously a complicated question whether that was a knowably bad idea when it was done, and depends on the details of how it was done.
Of course, there may well have been wrongdoing at some point, but we need proper investigation before we decide. And furthermore, we can't just assume that preventing any wrongdoing, even severe wrongdoing, that did occur would have saved the depositors SBF stole from, who are the main victims of this whole mess. My guess is that once the early decision to encourage SBF to found Alameda was made by Will, and SBF received some early help from the community, withdrawing our support later would not have done very much to prevent FTX from becoming a successful business that stole from its customers. But those early decisions are probably the least morally suspicious, in that they were taken early, when there was the least information available about the business ethics of SBF and FTX/Alameda. To repeat: I don't think telling someone to found a business to earn to give, or helping a business make a legal, non-scammy trade, is itself immoral. (Again, I'm assuming the trade was legal and non-scammy, but very willing to be corrected!) The suspicious decision that might have been decisive was maybe "get SBF and other FTX/Alameda high-ups to think in a utilitarian way". But as I say, I don't think it's reasonable to hold that was clearly wrong at the time.
Milan_Griffes @ 2022-11-16T19:17 (+14)
Thanks for this comment.
I'm more interested in reflecting on the foundational issues in EA-style thinking that contributed to the FTX debacle than in ascribing wrongdoing or immorality (though I agree that the whole episode should be thoroughly investigated).
Examples of foundational issues:
- FTX was an explicitly maximalist project, and maximization is perilous
- Following a utilitarian logic, FTX/Alameda pursued a high-leverage strategy (Caroline on leverage); the decision to pursue this strategy didn't account for the massive externalities that resulted from its failure
- The Future Fund failed to identify an existential risk to its own operation, which casts doubt on their/our ability to perform risk assessment
- EA's inability and/or unwillingness to vet FTX's operations (lack of financial controls, lack of board oversight, no ring-fence around funds committed to the Future Fund) and SBF's history of questionable leadership point to overeager power-seeking
- MacAskill's attempt to broker an SBF <> Elon deal re: purchasing Twitter also points to overeager power-seeking
- Consequentialism straightforwardly implies that the ends justify the means at least sometimes; protesting that the ends don't justify the means is cognitive dissonance
- EA leadership's stance of minimal communication about their roles in the debacle points to a high weight placed on optics / face-saving (Holden's post and Oli's commenting are refreshing counterexamples though I think it's important to hear more about their involvement at some point too)
RobBensinger @ 2022-11-16T19:43 (+8)
Sounds right to me!
I agree with Eliezer that a lot of EAs are over-blaming EA for the FTX implosion, based on the facts currently known. But the Scholomance case is obviously a lot weaker than the EA case in real life, and this is a great summary of why.
EliezerYudkowsky @ 2022-11-16T23:57 (+12)
The point is not "EA did as little to shape Alameda as Novik did to shape Alameda" but "here is an example of the mental motion of trying to grab too much responsibility for yourself".
RobBensinger @ 2022-11-17T00:32 (+2)
Fair!
MichaelPlant @ 2022-11-16T11:15 (+92)
This seems to be a false equivalence. There's a big difference between asking "did this writer, who wrote a bit about ethics and this person read, influence this person?" vs "did this philosophy and social movement, which focuses on ethics and this person explicitly said they were inspired by, influence this person?"
I agree with you that the question
Who's at fault for FTX's wrongdoing?
has the answer
FTX
But the question
Who else is at fault for FTX's wrongdoing?
Is nevertheless sensible and cannot have the answer FTX.
Rina @ 2022-11-16T16:47 (+13)
Couldn't agree more strongly.
The inferential jump from someone reading a book in their spare time, making a pretty superficial Goodreads review about a main takeaway, to
It sounds like - Caroline might have been under the impression, as late as Oct 10, that what she was doing at FTX was the thing that's hard and scary but right?
Is a pretty big one, and kinda egregious honestly.
Holly_Elmore @ 2022-11-16T19:45 (+41)
Agree, we shouldn't give a pass to irrational (frankly, egocentric) thinking just because it feels like taking responsibility.
I feel especially irritated with people who are ready to change their entire utilitarian philosophy just because someone associated with our movement (probably) committed a major crime and got caught, as if they didn't understand last week that they lived in a world where surprises like that can happen. I don't understand how else they could update their moral philosophy so fast based on the info we have.
EliezerYudkowsky @ 2022-11-17T23:33 (+15)
Maybe they weren't familiar with the overwhelming volume of previous historical incidents, hadn't had their brains process history or the news as real events rather than mythology, or were genuinely unsure about how often these sorts of things happened in real life rather than becoming available on the news. I'm guessing #2.
RobBensinger @ 2022-11-16T22:43 (+13)
I feel especially irritated with people who are ready to change their entire utilitarian philosophy just because someone associated with ours (probably) committed a major crime and got caught
I agree that this is pretty weird. There were presumably a bunch of historical contingencies that went into whether the FTX implosion occurred; it seems weird if we should endorse some moral philosophy X in the world where all those contingencies occurred, and some different moral philosophy Y in the world where not all of those contingencies occurred.
And it also seems weird if we should endorse the same moral philosophy in both worlds, but this one data point -- an important data point EV-wise, but still a single event, historical contingency and all -- is crucial evidence about such a high-level proposition. Evidence that we somehow didn't acquire via looking at the entirety of human history, the entire psychology and sociology literature, etc.
The least-weird versions of this update I can imagine are:
- "This isn't a large update about high-level questions like that, but it's at least an interesting case study. We shouldn't treat it as a huge deal evidentially, but having a Schelling case study we can all drill down on is still a useful exercise, since we usually don't take the time to be this thorough."
- "This is a large update for me, exactly because my perspective on the world is heavily influenced by things like the status hierarchies I perceive, which things are seen as socially acceptable or unacceptable, which people I personally like or dislike, etc. Events that cause a realignment in the status hierarchy are a bit like taking antidepressants, and observing that some of my world-models change when I'm on the antidepressants.
There's no a priori guarantee that my epistemics are more accurate on antidepressants versus off them; but having the extra vantage point can help me reflect on these two perspectives, and it's not weird if I end up deciding that one vantage point is better than the other, and thereby updating my object-level world-models to better match that vantage point."
Rebecca @ 2022-11-17T00:04 (+11)
There’s also a 3rd option - we should have been updating based on what was already talked about re SBF before the implosion (his pathological behaviour, his public statements essentially agreeing he’s running a Ponzi scheme, and people warning other people about these). So the implosion makes us realise that, in a world where FTX didn’t implode, we still should have disassociated from SBF very early on, and be doing some soul searching about why UK EA leaders were [/are, in this hypothetical world] choosing to hype up someone with a track record of being so terrible.
Holly_Elmore @ 2022-11-17T15:27 (+14)
I think it’s very worth reflecting on strategic decisions that were made around Sam. I just don’t think what happened is very significant to whether utilitarianism is the correct moral philosophy.
RyanCarey @ 2022-11-17T15:55 (+6)
I agree that these events are separate from arguments for & against utilitarianism as a criterion of rightness. But they do undermine the viability of the act utilitarian calculus as a decision procedure. Sam seems to have thought of himself as an act utilitarian, but by neglecting to do the utilitarian calculus correctly or at all, he did massive harm, making it clear that we can't rely on this decision procedure to avoid such harms. Instead, we need utilitarians to adopt a decision procedure that includes constraints on certain behaviour.
Jan_Kulveit @ 2022-11-17T16:48 (+15)
- In practice I think utilitarians should adopt mostly a skillful combination of virtue ethics, deontic rules, and explicit calculations.
- I think what the FTX case does provide some evidence for is some fraction of smart EAs exposed to utilitarianism being prone to attempting to rely on explicit act utilitarianism, despite the warnings.
I think part of the story here is a weird status dynamic where...
1. I would basically trust some people to try the explicit direct utilitarian thing: eg I think it is fine for Derek Parfit or Toby Ord.
2. This creates some weird correlation where the better you are on some combination of (smartness/understanding of ethics/power in modelling the world), the more you can try to be actually guided by consequences
3. This can make being 'hardcore' consequentialist ...sort of cool and "what the top people do"
4. ... which is a setup where people can start goodhart/signal on it
EliezerYudkowsky @ 2022-11-17T23:40 (+21)
Yeah, I think it's a severe problem that if you are good at decision theory you can in fact validly grab big old chunks of deontology directly out of consequentialism including lots of the cautionary parts, or to put it perhaps a bit more sharply, a coherent superintelligence with a nice utility function does not in fact need deontology; and if you tell that to a certain kind of person they will in fact decide that they'd be cooler if they were superintelligences so they must be really skillful at deriving deontology from decision theory and therefore they can discard the deontology and just do what the decision theory does. I'm not sure how to handle this; I think that the concept of "cognitohazard" gets vastly overplayed around here, but there's still true facts that cause a certain kind of person to predictably get their brain stuck on them, and this could plausibly be one of them. It's also too important of a fact (eg to alignment) for "keep it completely secret" to be a plausible option either.
Holly_Elmore @ 2022-11-17T16:30 (+10)
Sam seems to have thought of himself as an act utilitarian, but by neglecting to do the utilitarian calculus correctly or at all, he did massive harm, making it clear that we can't rely on this decision procedure to avoid such harms.
I completely agree that a motivated person could easily believe that any decision is the right act utilitarian decision because there aren't clear rules for determining the right act utilitarian decision and checking your answer. Totally.
But idk if it's even fair to say Sam was using act utilitarianism as a decision procedure. It's not clear to me if he even believed that while he was (allegedly) committing the fraud.
RyanCarey @ 2022-11-17T16:35 (+8)
I totally agree. But even if we conservatively say that it's a 50% chance that he was using act utilitarianism as his decision procedure, that's enough to consider it compromised, because it could lead to bad consequences: multiple billions of dollars of damages (edited).
There are also subtler issues: if you intend to be act utilitarian but aren't and do harm, that's still an argument against intending to use the decision procedure. And if someone says they're act utilitarian but isn't and does harm, that's an argument against trusting people who say they're act utilitarian.
Holly_Elmore @ 2022-11-17T16:49 (+18)
Not trying to take this out on you, but I'm annoyed by how much all this advocacy of deontology all of sudden overlaps with covering our own asses. I don't buy it as a massive update about morality or psychology from the events themselves but a massive update about optics.
Jan_Kulveit @ 2022-11-18T23:28 (+14)
Reposting from twitter: It's a moderate update on the prevalence of naive utilitarians among EAs.
Expanded:
A classical problem with this debate on utilitarianism is that the vocabulary used makes a motte-and-bailey defense of utilitarianism too easy.
1. Someone points to a bunch of problems with an act consequentialist decision procedure / cases where naive consequentialism tells you to do bad things
2. The default response is "but this is naive consequentialism, no one actually does that"
3. You may wonder whether, while people don't advocate for or self-identify as naive utilitarians ... they actually make the mistakes
The case provides some evidence that the problems can actually happen in practice in important enough situations to care. [*]
Also, you have the problem that sophisticated naive consequentialists could be tempted to lie to you about their morality ("no worries, you can trust me, I'm following the sensible deontic constraints!"). Personally, before the recent FTX happenings, I would have been more of the opinion "nah, this sounds too much like an example from a philosophical paper, unlikely with typical human psychology". Now I take it as a more real problem.
[*] What I'm actually worried about ...
Effective altruism motivated thousands of people to move into highly leveraged domains, with large and potentially deadly consequences - powerful AI stuff, pandemics, epistemic tech. I think that if just 15% of them believe in some form of hardcore utilitarianism where you drop integrity constraints and trust your human brain's ability to evaluate when to be constrained and when not, it's ... actually a problem?
EliezerYudkowsky @ 2022-11-17T23:41 (+11)
I'd agree with this statement more if it acknowledged the extent to which most human minds have the kind of propositional separation between "morality" and "optics" that obtained financially between FTX and Alameda.
Linch @ 2022-11-18T23:43 (+3)
I don't buy it as a massive update about morality or psychology from the events themselves but a massive update about optics.
This will be a relief if true. I am much more worried about people not having principles (or their principles being guided by something other than morality) than about people being overly concerned with optics. The latter is a tactical concern (albeit a big one) and hopefully fixable; the former is evidence that people in our movement are too conformist or otherwise too weak or too evil to confront moral catastrophes.
Holly_Elmore @ 2022-11-24T20:53 (+2)
I don’t think they know they are concerned about optics. My suspicion was that the bad optics suddenly made utilitarian ideas seem false or reckless.
EliezerYudkowsky @ 2022-11-17T23:37 (+15)
This strikes me as a bad play of "if there was even a chance". Is there any cognitive procedure on Earth that passes the standard of "Nobody ever might have been using this cognitive procedure at the time they made $mistake?" That more than three human beings have ever used? I think when we're casting this kind of shade we ought to be pretty darned sure, preferably in the form of prior documentation that we think was honest, about what thought process was going on at the time.
RyanCarey @ 2022-11-17T23:50 (+13)
Why require surety, when we can reason statistically? There've been maybe ten comparably-sized frauds ever, so on expectation, hardline act utilitarians like Sam have been responsible for 5% of the worst frauds, while they represent maybe 1/50M of the world's population (based on what I know of his views 5-10yrs ago). So we get a risk ratio of about a million to 1, more than enough to worry about.
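(A minimal sketch of that arithmetic, with every input being a rough guess rather than an established figure:)

```python
# Back-of-the-envelope risk ratio; all inputs are rough guesses, not data.
comparable_frauds = 10             # guess: roughly ten frauds of this size, ever
p_sam_act_util = 0.5               # guess from upthread: 50% chance Sam used act utilitarianism
frac_frauds_by_act_utils = p_sam_act_util / comparable_frauds   # 0.05, i.e. 5% on expectation

frac_population_act_utils = 1 / 50_000_000   # guess: hardline act utilitarians per capita

risk_ratio = frac_frauds_by_act_utils / frac_population_act_utils
print(f"risk ratio ~ {risk_ratio:,.0f} to 1")   # ~2,500,000 to 1, i.e. order of a million
```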
Anyway, perhaps it's not worth arguing, since it might become clearer over time what his philosophical commitments were.
Holly_Elmore @ 2022-11-17T16:43 (+9)
I guess it's some new evidence that one person was maybe using act utilitarianism as a decision procedure and messed up? Also not theoretically impossible he was correct in his assessment of the possible outcomes, chose the higher EV option, and we just ended up in one of the bad outcome worlds.
RobBensinger @ 2022-11-18T07:27 (+7)
Sam seems to have thought of himself as an act utilitarian, but by neglecting to do the utilitarian calculus correctly or at all, he did massive harm, making it clear that we can't rely on this decision procedure to avoid such harms.
This seems to me like it's overstating the strength of evidence, as though FTX is a disproof rather than one data point among many.
It is a disproof for extremely strong claims like "people who endorse act utilitarianism never do unethical things", but those claims should have had extremely low probability pre-FTX.
Holly_Elmore @ 2022-11-17T16:13 (+4)
How much of an update is this really, though? Am I wrong that it's already the majority utilitarian view that act utilitarianism may be theoretically correct, but individual humans don't have the foresight to know the full consequences of every act, and humans trying to work together need to be able to predict what others will do --> something like rule utilitarianism or observing constraints? Seems like the update should be about how much you can know how things will turn out and whether you can get away with cutting corners.
It does seem like Sam had pathological beliefs re: the St. Petersburg paradox, but that seems like more than wanting to maximize EV too much -- it's not caring enough about the longterm future (where everyone's inevitably dead after enough coin flips). I really don't see how that can be attributed to act utilitarianism either.
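(As a toy illustration of the coin-flip point, with made-up numbers: a double-or-nothing bet that is positive-EV on every flip still leads, when repeated, to losing everything with probability approaching 1.)

```python
# Toy model of repeated double-or-nothing flips (illustrative numbers only):
# each flip doubles the stake with probability 0.51, loses it all otherwise.
p_win, n_flips = 0.51, 100

ev_multiplier = (2 * p_win) ** n_flips   # 1.02^100 ~ 7.2: expected value keeps growing
p_survival = p_win ** n_flips            # 0.51^100 ~ 6e-30: ruin is almost certain

print(f"EV multiplier after {n_flips} flips: {ev_multiplier:.2f}")
print(f"chance of never having busted: {p_survival:.1e}")
```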
RyanCarey @ 2022-11-17T16:26 (+2)
I agree that most utilitarians already thought act utilitarianism as a decision procedure was bad. Still, it's important that more folks can see this, with higher confidence, so that this can be prevented from happening again.
I think I agree that the St Petersburg paradox issue is orthogonal to choice of decision procedure (unless placing the bet requires engaging in a norm-violating activity like fraud).
Holly_Elmore @ 2022-11-17T16:33 (+2)
Risking the entire earth seems like a norm violation to me
Milan_Griffes @ 2022-11-16T20:09 (+2)
Here are some jumping-off points for reflecting on how one might update their moral philosophy given what we know so far.
Sabs @ 2022-11-16T10:11 (+29)
idk, when people explicitly endorse your ideology as why they endorse "high leverage and double-or-nothing flips" I think it's at least worth taking a look at yourself. Now quite probably the person in question has misunderstood your ideology and doesn't understand why EAs do in fact care about the risk of ruin and why stealing money isn't ok, but then perhaps try to correct them?
Fwiw I think it very unlikely that the decision to use customer funds was a one-off decision made in 2022. My view is that FTX was set up from the start to use customer money as a source of cheap capital for Alameda. In 2018 Alameda was offering potential investors a 15% guaranteed return on loans. It seems fairly likely that at some point SBF figured "fuck this, why are we offering these dorks 15% when we can just set up our own exchange and access huge amounts of capital at 0%". Never mind the fact that privileged information from the exchange may well have opened up more ways for Alameda to make money!
The plan, imo, was always to accrue as much wealth as possible as fast as possible with as few ethical constraints as possible. This worked for a while because Alameda's trades were profitable and crypto was in a bull market. This plan may or may not have been EA-aligned, but if you have short enough AI/pandemic timelines (I don't), it doesn't seem obviously non-compatible; and given the career backgrounds and interest set of all the major people involved, yes, I think they were committed and sincere EAs who really believed this stuff. SBF's own weird version of EA, at least, seems to have played a fairly large role in why they took on so much risk, as he himself explained in an overly long and boring twitter thread somewhere and Caroline also mentioned on her blog.
It also makes zero sense to compare FTX's spending on stadiums vs the Future Fund as a sign for how much they cared about these respective things. The Future Fund would almost certainly have got way more money in subsequent years, while the stadium rights purchase was a form of advertising designed to help grow the business faster. I can't imagine SBF is a big sports fan and was doing that sort of thing because he really enjoyed seeing the FTX logo on umpire shirts.
Not to Godwinpost, but this isn't really "were Nietzsche and Wagner at fault for the Nazis", it's more "were Nietzsche and Wagner at fault for the Nazis if they'd actually lived throughout the 1930s and worked in prominent cultural education posts in the German state bureaucracy."
timunderwood @ 2022-11-16T12:49 (+24)
I mean he is a big sports fan, at least baseball, at least when he was younger. I got linked to his blog from 10 years ago from something, and the number one and two sets of posts were about baseball statistics.
callum @ 2022-11-16T13:40 (+26)
The role of the EA movement in the case of FTX surely seems to meet the level of influence behind some of the impact wins that EA has claimed so far.
Perhaps most prominently, the movement:
- Gave the idea of 'Earning to Give' to Sam
- Provided a primary motivation to Sam and other FTX leadership to build the exchange
For example, when comparing to the case of Sendwave, the influence seems at least comparable, if not larger: e.g. EA played a motivational role in founding a company for the purpose of improving the world. (I'm not familiar with Wave's founders' motivations, so could be wrong here.)
In welfare terms alone, the impact of FTX's collapse on its customers seems plausibly comparable to some of the impact wins of the movement to date, i.e. on the order of $1bn in lost funds. Given this, I think that an honest impact evaluation of the EA movement would include the harm caused to customers through FTX's collapse.
This is relevant not for blame assignment, but because it's very decision-relevant to EA's mission of improving the world. For example, when in the future deciding how much to emphasise harm avoidance when encouraging the (good and novel) idea of Earning to Give.
Dancer @ 2022-11-16T15:11 (+20)
I think that an honest impact evaluation of the EA movement would include the harm caused to customers through FTX's collapse.
Agreed. However:
In welfare terms alone, the impact of FTX's collapse on its customers seems plausibly comparable to some of the impact wins of the movement to date, i.e., on the order of $1bn in lost funds.
Are you talking about welfare terms or financial terms? Because $1bn in lost savings of FTX customers seems very different in welfare terms to $1bn spent on bed-nets etc. I think there are strong reasons FTX shouldn't have acted the way it did, but suggesting these two things are comparable in welfare terms because they are similar in financial terms seems like an error to me.
callum @ 2022-11-18T16:07 (+1)
Yeah I agree, I just mean that $1bn in funds lost to customers across the world is plausibly comparable in welfare terms to other wins on that list. E.g. dividing by 10 to account for differences in income of those affected, it would be around the amount attributed to GiveDirectly on the EA impact page.
(without wanting to make a very direct crude comparison, or getting into the details of that)
Dancer @ 2022-11-18T17:20 (+2)
Okay yes, they may well be.
I'm also pretty hesitant to attempt direct crude comparisons - and I'll say again that I think there are strong reasons FTX shouldn't have acted as it did, in addition to the direct harm to customers - but I'll just say that I seem to remember 100x or 1000x multipliers being more common than 10x in similar scenarios.
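To make explicit how much the comparison turns on that multiplier, here's a back-of-envelope sketch (all numbers are illustrative assumptions, not estimates from this thread):

```python
# Welfare-adjusted value of ~$1bn in FTX customer losses under different
# assumed income multipliers (the multiplier itself is an assumption).
lost_funds = 1e9  # rough order of magnitude of customer losses, USD
for multiplier in (10, 100, 1000):
    print(f"{multiplier:>5}x multiplier -> welfare-equivalent ${lost_funds / multiplier:,.0f}")
```

At 10x the loss looks GiveDirectly-scale; at 100x or 1000x it's one to two orders of magnitude smaller, which is why the choice of multiplier does most of the work in this comparison.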
Sam Elder @ 2022-11-16T12:06 (+16)
It is well-written, but I am not particularly convinced by the fantasy fiction analogy — it feels a lot more like “Here’s this very different situation, and you agree that the conclusions would be different. That would even be true if we modify it in several hard-to-imagine ways.”
In particular, I don’t see any reasonable analogies for:
- EA’s “Earning to Give” career path, up to and including 80k featuring a profile on SBF as an exemplar.
- The specific logic of "my marginal money is going to be donated" => "I should be closer to risk-neutral", which I haven't really seen rebutted on the facts (most instead argue that in reality, SBF/FTX/Alameda went too far and were risk-seeking). A sketch of this logic follows below.
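To spell that logic out, here's a minimal simulation sketch (the 53% win probability and the double-or-nothing bet structure are illustrative assumptions, not anyone's actual trades). An agent with linear utility of money maximizes expected final wealth, which favors betting everything every round; a log-utility agent bets only the small Kelly fraction.

```python
import random

P_WIN, ROUNDS, TRIALS = 0.53, 100, 10_000

def simulate(fraction):
    """Bet `fraction` of current wealth each round on an even-odds coin."""
    finals = []
    for _ in range(TRIALS):
        wealth = 1.0
        for _ in range(ROUNDS):
            stake = wealth * fraction
            wealth += stake if random.random() < P_WIN else -stake
        finals.append(wealth)
    return finals

kelly = 2 * P_WIN - 1  # optimal fraction under log utility: 0.06
for f in (kelly, 1.0):  # Kelly bettor vs. all-in (the risk-neutral optimum)
    finals = simulate(f)
    expected = (1 + f * (2 * P_WIN - 1)) ** ROUNDS  # exact expected final wealth
    sample_mean = sum(finals) / TRIALS
    p_ruin = sum(w < 0.01 for w in finals) / TRIALS
    print(f"f={f:.2f}: E[final]={expected:7.1f}, sample mean={sample_mean:7.2f}, P(ruin)={p_ruin:.2f}")
```

Going all-in maximizes the expectation (roughly 339x here) even though nearly every run ends in ruin. The usual counterargument is not that the expectation is computed wrong, but that the utility of donations isn't actually linear at these scales - and that FTX/Alameda were even more risk-seeking than this toy model.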
That SBF ultimately contributed such a paltry share of his apparent fortune is a stronger point, but mainly as a reminder of how small and vulnerable EA actually is. It might very well be true that we didn't mean that much to SBF, but he meant a lot to us.
demirev @ 2022-12-05T17:04 (+2)
I also think this was a not-so-good and somewhat misleading analogy - the association between Novik and Caroline in the example is strictly one-way (Caroline likes Novik, Novik has no idea who Caroline is), whereas the association between FTX and EA is clearly two-way (e.g. various EA orgs endorsing and promoting SBF, SBF choosing to earn-to-give after talking with 80k etc).
samuel @ 2022-11-16T08:31 (+14)
I don’t [currently] view EA as particularly integral to the FTX story either. Usually, blaming ideology isn’t particularly fruitful because people can contort just about anything to suit their own agendas. It’s nearly impossible to prove causation, we can only gesture at it.
However, I’m nitpicking here but - is spending money on naming rights truly evidence that SBF wasn’t operating under a nightmare utilitarian EA playbook? It’s probably evidence that he wasn’t particularly good at EA, although one could argue it was the toll to further increase earnings to eventually give. It’s clearly an ego play but other real businesses buy naming rights too, for business(ish) reasons, and some of those aren’t frauds… right?
I nitpick because I don't find it hard to believe that an EA could also 1) be selfish, 2) convince themselves that ends justify the means and 3) combine 1&2 into an incendiary cocktail of confused egotism and lumpy, uneven righteousness that ends up hurting people. I’ve met EAs exactly like this, but fortunately they usually lack the charm, knowhow and/or resources required to make much of a dent.
In general, I’m not surprised with the community's reaction. Best case scenario, it had no idea that the fraud was happening (and looks a bit naïve in hindsight) and its dirty laundry is nonetheless exposed (it’s not so squeaky clean after all). Even if EA was only a small piece in the machinery that resulted in such a [big visible] fraud, the community strives to do *important* work and it feels bad for potentially contributing to the opposite.
EliezerYudkowsky @ 2022-11-16T09:36 (+12)
The point there isn't so much "he could not have had any EA thoughts in his head at all", which I doubt is really true - though also there could've just been pressure from coworkers, and office politics around it, resolving in something like the Future Fund so that they'd be doing anything at all. My point is just that this nightmare is probably not one of a True Sincere Committed EA Act Utilitarian doing these things; that person would've tried to take more money off the table, earlier, for the Future Fund. Needing an e-sports site named after your company - that's indeed something that other businesses do for business reasons; and if it feeds your business, that's real, that's urgent, that has to happen now. The philanthropy side was evidently not like that.
samuel @ 2022-11-16T15:59 (+3)
"My point is just that this nightmare is probably not one of a True Sincere Committed EA Act Utilitarian doing these things" - I agree that this is most likely true, but my point is that it's difficult to suss out the "real" EAs using the criteria listed. Many billionaires believe that the best course of philanthropic action is to continue accruing/investing money before giving it away.
Anyways, my point is more academic than practical; the FTX fraud seems pretty straightforward and I appreciate your take. I wonder if this forum would be having the same sorts of convos after Thanos snaps his fingers.
Ronny Fernandez @ 2022-11-16T06:24 (+14)
I still think that this incident should overall update most EAs in the direction of 1) ethical injunctions are important for humans and 2) more EAs should read the ethical injunctions section of the sequences. I agree that there is no system of ethics, or cultural movement, so awesome that it will stop its most loyal adherents from doing terrible things, but some do better than others. Nobody should feel guilty except for the people who committed the crime, but it would be great if EAs thought the right amount about how to lower the prob of events like this in the future, and that amount is not zero.
I'm also not sure how to square your advice about how I should relate to this incident with heroic responsibility.
AllAmericanBreakfast @ 2022-11-16T07:32 (+25)
Which ethical systems do you think have a better track record and why? Does virtue ethics, the preferred moral system of Catholics, have to take responsibility for pedophile priests? Does the rule-based ethics of deontology have to take responsibility for mass incarceration in the USA?
I can understand people claiming that this ethics implies that crazy conclusion, or assigning blame to an idea that seems clearly to have inspired a particular person to do a particular act. But I have no confidence that anybody on this earth has a clue about which ethical system is most or least disproportionately to blame for common-sense forms of good or bad behavior.
Ronny Fernandez @ 2022-11-16T15:30 (+18)
I think liberalism has a better track record than communism, for instance. No, but I do think Catholics should spend some time thinking about what's up with Catholic priests molesting children, particularly if that Catholic has any control over what goes on in the church. In general I do not think blaming this or that ethical system or social movement makes much sense, but noticing that the adherents of some social movement or ethical system tend to do some particular kind of bad thing more often than others can be useful, particularly if you are a part of that social movement.
RobBensinger @ 2022-11-16T06:38 (+7)
more EAs should read the ethical injunctions section of the sequences
Ronny is talking about https://www.lesswrong.com/s/AmFb5xWbPWWQyQ244.
Gideon Futerman @ 2022-11-16T13:32 (+12)
There are however a number of things we ARE at fault for here.
- We as a community idolised SBF, including promoting him in many presentations and a relatively fawning interview by 80K which continued to promote the idea that SBF was living frugally (surely people knew by then that was bs). We could have chosen not to do this.
- Will MacAskill made the introduction to Elon to try and get SBF to help buy Twitter. We still have no public information as to why, but this would have given SBF more power and spent a lot of money that could have been used on doing good.
- Carrick Flynn campaign: we as a community hugely supported this campaign, which was quite blatantly SBF and GBF trying to buy a seat for their interests. Sure, we as a community thought this was also in our interests (and I still assume Carrick would have done a good job?), but once again this was a way the community encouraged and didn't question SBF's power.
- Will MacAskill knew SBF for 9 years, seemingly relatively closely. It's not Will's fault SBF committed fraud, but it is partially Will's fault that SBF became such a face for the community, within and outside of it. Maybe no ordinary person could have known SBF was a fraudster. But then, if we only expect from Will what we expect of "ordinary people", why are we happy trusting him with so much power in the community? The only justification I can think of is that he is just so, so much better at decision making, having a reliably positive impact, and avoiding risks to the community and the project of EA. It's clear that Will isn't this uniquely good. So why do we trust him (and others) with so much power in the community?
RobBensinger @ 2022-11-16T22:24 (+20)
There are however a number of things we ARE at fault for here.
Yes, assuming that these were foreseeably bad calls. Seems good to separately ask "what responsibility do EAs bear for Sam's bad decisions?" and "what did we otherwise do wrong, or right?". E.g., if it were true that Sam would have made all the same missteps in the absence of EA, it could still be the case that we made Sam-related mistakes like "failing to propagate info about Sam's past bad behavior".
2. Will MacAskill made the introduction to Elon to try and get SBF to help buy Twitter. We still have no public information as to why, but this would have given SBF more power and spent a lot of money that could have been used on doing good.
It would have given SBF a different kind of power. I'm skeptical of the claim that SBF would be more powerful if he'd poured his money into Twitter, since that implies that Twitter is a more useful, leveraged thing to spend money on than SBF's other alternatives.
It seems more likely to me that either buying Twitter would reduce SBF's power/influence (because Twitter isn't very important), or that buying Twitter is a not-crazy sort of thing for EAs to try to do (because Twitter is very important).
Of course, SBF owning Twitter could have been bad insofar as SBF's judgment and character were flawed. But then we're just repeating the critique "EAs should have known that SBF was a bad guy", not separately critiquing Will for thinking the Twitter buy was a good idea.
I think more of an argument needs to be given for "buying Twitter was a dumb idea" in order to include this on a list of "things EAs are at fault for".
3. Carrick Flynn campaign: we as a community hugely supported this campaign, which was quite blatantly SBF and GBF trying to buy a seat for their interests. Sure, we as a community thought this was also in our interests (and I still assume Carrick would have done a good job?), but once again this was a way the community encouraged and didn't question SBF's power.
This seems totally wrong to me. First, because I knew Carrick pre-campaign, I think he's awesome and would make an amazing elected official, and it doesn't update me at all to know that Carrick (like a ton of excellent, well-intentioned EAs) got FTX funding.
And second, because AFAIK Carrick is an FHI guy who SBF later decided to support in his primary race (because he's an EA and SBF wanted more EAs in politics), not someone with close ties to SBF. Quoting Carrick in a Vox interview:
First, I’ve never met [Sam Bankman-Fried], I’ve never talked to him. I don’t have any information that anyone else doesn’t have. I actually don’t have any information that’s not public with, I guess, one exception, which is information I think other people think they have, which is they think I’m involved in crypto or something. That is not the case. I’m not a crypto person. I don’t know very much about it. I’ve never looked at regulations for it. I don’t think it’s a priority.
Left with that information, my take is speculative, but what I will say is it seems to me like Sam Bankman-Fried is someone who legitimately wants to prevent pandemics from happening again. I am on board. I love that, great goal. Let’s do it. I see why he would want to support me for that, since I’ve made this my first priority and I’ve got a history in this. He’s also supported other candidates and sitting congresspersons who have good pandemic prevention policies, with less money, but I can see why he’d want to give more to the person with more background in it.
Zvi @ 2022-11-16T15:41 (+11)
From an upcoming post I am drafting: I would point out that ‘heroes put the entire group, many innocent people, ‘the city,’ planet Earth or even the whole damn universe or multiverse in grave danger to save any main character or other thing that We Cannot Bear To Lose, because That’s What Heroes Do’ is ubiquitous in our fantasy media. It might be a majority of DC comics plots. Villains invoke it because decision theory, they know it will work, and even without that it is rather mind-bogglingly awful. That kind of thinking needs to be widely condemned and fall in status at least via What The Hell Hero moments, and I worry it has more influence in these situations than we think.
RedStateBlueState @ 2022-11-16T15:25 (+11)
First a disclaimer that I’ve never got anywhere close to interacting with SBF personally; I’m very much an outsider to this situation. However, from everything I have read, I think it’s pretty ridiculous to suggest that EA wasn’t the main reason SBF tried so hard to maximize profit (poorly, I might add, but it seems like that was his goal) to the point of committing fraud. As far as I understand EA was SBF’s primary guiding ideology; it is why he went down this career path of Jane Street and then starting his own companies. This post seems overly reliant on the fun fact that SBF paid more for e-sports naming rights than on EA donations to show why actually Sam didn’t care about EA that much. But these are two completely separate things! E-sports naming rights is just a means of advertising, with the goal of making FTX more money which will eventually allow SBF to donate more to EA. I think there’s also decent evidence that SBF was looking to ramp up donations in the future, as Effective Altruism continues to grow and is able to use more funding. Once you take out this fun fact about SBF’s current EA spending, I think this whole argument kind of falls apart.
RobBensinger @ 2022-11-16T22:47 (+6)
Seems like a reasonable objection to me. (Though it's still weird that SBF overpaid so much for that particular form of advertising; and it's weird that SBF didn't set aside money for FTX FF.)
zeshen @ 2022-11-16T07:42 (+11)
I like this post much more than your previous post.
Greg_Colbourn @ 2022-11-16T15:46 (+8)
Is there a source for the $140M figure?
Douglas Knight @ 2022-11-17T03:21 (+5)
My guess is that this is the June figure for the FTX Future Fund grant commitments. The current figure is $160M as of September 1st. Some of these grants were in installments, especially the multi-year ones, and not all of the money was transferred. This Fund was "longtermist" and I do not see a dollar figure on other FTX charitable giving. This does not include $500M in equity in Anthropic.
Added, weeks later: Or maybe he got it from NYT:
As recently as last month, the umbrella FTX Foundation said it had given away $140 million, of which $90 million went through the FTX Future Fund dedicated to long-term causes. It is unclear how much of that money made it to the recipients and how much was earmarked for giving in installments over several years.
which seems to be sourced from NYT a month ago:
Mr. Bankman-Fried makes his donations through the FTX Foundation, which has given away $140 million, of which $90 million has gone through the group’s Future Fund toward long-term causes.
I suspect that these numbers are actually delivered, not promises. My guess is that the Future Fund pledged $190 million, 160 directly and 30 through regranting, delivered 100 and failed to deliver $90 million (a). (Plus $50 million not through the Future Fund, at least some of which counts as EA.)
Greg_Colbourn @ 2022-11-17T07:43 (+2)
Thanks, there is also $32M from the regrants tab. But yes, difficult to know the actual total of payouts without word from the staff. Or payouts not subject to clawback without further details on legal proceedings.
Jason @ 2022-11-16T21:01 (+7)
When comparing the size of SBF/FTX outlays on EA vs. stuff like naming rights, I think it is important to compare apples to apples. From the victims' perspective, the key question is "how much money went out the door?" as opposed to "how much did SBF/FTX plan or commit to spend in the future?" Although I don't know how the naming-rights deals were set up, I suspect that much of the money was to be paid in the future. That means the stadiums, teams, etc. are now general unsecured creditors on any claims. I am hearing that depositor claims may be valued on the distressed-debt market at 3-5 cents on the dollar, so the claims of naming-rights counterparties are likely worth even less.
EliezerYudkowsky @ 2022-11-16T23:56 (+3)
Fair point.
Manuel Del Río Rodríguez @ 2022-12-15T17:31 (+6)
The question that heads this post obviously answers itself, in that only actual perpetrators of bad deeds and their direct instigators (intellectual or otherwise) are to be held accountable for them; nevertheless, I must admit that I found Eliezer Yudkowsky's analogy unconvincing, and (not quite, but feeling a little bit) disingenuous. Whenever we see examples of adherents of some creed, ideology, or religious or thought system going to nefarious places, it is natural to wonder whether said ideas (whether properly or mistakenly interpreted) influenced or condoned the path they took. Some articles I have read lately have pointed the finger towards the hubristic hazards of miscalculating for optimal results, and the concomitant dangers of risky betting and of cutting corners. As is well known, the road to hell is paved with good intentions. And besides, as has been stated, a lot of the people involved in this weren't just 'fellow travelers' or occasional readers of EA material. A lot of them were very visibly engaged in and seen as poster children for the movement. And I am sure most of them were innocent victims, especially the rank-and-file workers of FTX and Alameda.
Having said that, I do not find it reasonable either to go to masochistic extremes of self-flagellation. Humans being what they are, there will always be cases of wolves in sheep's clothing, and never enough controls to catch them in advance. Which is humbling, in a not necessarily bad way. My impression is that the EA community and its members are a wonderful group of people and they will probably come out of this situation wiser, if sadder. And that obviously, it is wrong to blame EA for what has happened.
As Eliezer Yudkowsky mentions Caroline Ellison's blog, I would like to say that I have been reading it of late, and even taking into account the potential deceitfulness of words and the pictures we build with them, I do not get from its contents or her general trajectory that she could be a morally bankrupt person. On the contrary, the impression I got is of a true believer, and a good person. This does not preclude the possibility that, given a certain naiveté and inexperience in a field as murky as crypto, she might have let herself go along with what she perceived as temporary, 'bad' expedient means. But to believe this person ever intended to purposely and maliciously scam people out of their money, or to be knowingly privy to a fraud, is, for me, completely out of the question. I believe the best option is to be charitable and wait to see what the courts of law have to say once the dust has settled. As for SBF, after reading some of the things he has said and done, that's a completely different story.
Geoffrey Miller @ 2022-11-16T19:36 (+6)
In case anyone wants a reference for the $210 million that FTX committed to spend on esports naming rights for TSM, a Washington Post article from today is here
Missy Maserati @ 2022-11-16T17:27 (+6)
I'm very new here - I just signed up today, so I'm unfamiliar with all the formats at this point - but I often seek to explain how biological factors can play a role in morality and decision making, because it can be useful to understand our brain's limitations among all the other factors.
Stress, isolation and positions of power have consequences for the brain. The less cooperatively one has to function within one's society, the more damage accrues in the areas of the brain responsible for decision making, the anterior cingulate cortex being the key area. This erodes a person's ability to manage their own emotions, and impulse control becomes a problem over time. I haven't followed SBF's career closely, but there were perhaps signs of his brain struggling; addiction is a typical symptom that this is occurring.
We are but human, all of us, and we can't supersede how our brains evolved to operate. It's a very tricky position to have so much responsibility and power; that consolidation of power can become harmful, as it did with FTX. This is by no means an excuse for SBF; it's just a potential problem to consider when engaging in effective altruism through large amounts of wealth. Managing one's own ego is maybe a lot harder than some anticipated - perhaps even the most difficult task they'll ever face, because of the design of the brain. Lots and lots of emotional self-care would help, but whether we can beat our own brains is a tricky free-will question, since we make decisions before we become aware that we did. Self-compassion is a very important piece, maybe one SBF did not have.
andrewpei @ 2022-11-16T08:58 (+6)
I agree that EA likely wasn't a major causal factor in FTX/SBF's likely fraud. Unfortunately, it's a situation where even if it's not our fault, it is our problem. People are trashing EA across the internet because of Sam's position in the movement. His Twitter profile pic still has him wearing an EA shirt, for Christ's sake!
Davidmanheim @ 2022-11-16T10:07 (+18)
So are people who never attacked EA before suddenly doing so? That isn't what I've seen. I've seen lots of bad-faith takes about how this is proof of what they always thought, and news reporting which is about as accurate as you'd expect - that is, barely correct on the knowable facts, and misleading or confused about anything more complicated than that.
Justin Helps @ 2022-11-16T20:07 (+3)
EA is a brand, and people on the outside don't have much information about it, so a negative association matters on the margin for recruiting. The main post makes a fair point about not going overboard with self blame, but it seems good for EA folks to be publicly concerned about how they could have acted better, or to publicly discuss the lessons they're taking. At the very least, I don't think it's worth much effort to stop people from doing so.
Ramiro @ 2022-11-17T04:10 (+2)
Can you imagine how you'd judge it if, rather than my writing it as a joke, Naomi Novik had gone online and sincerely tried to accept blame for FTX's fall, because she thought she hadn't been careful enough to put messages about good corporate governance and careful accounting into her fantasy novels, and Novik had talked about how she was planning to donate an appropriate portion of her Scholomance book royalties back to FTX's ruined customers?
Even so, I'm still recommending that people read Terry Pratchett instead of Novik. Something something low probability, large impact.
But seriously, I think the problem is less how SBF self-identified with EA, and more the way EAs saw him as The Hero.
Anyway, maybe EAs do have a problem of egocentrism.
Ramiro @ 2022-11-17T03:48 (+2)
We can infer that probably at least $30 of Scholomance sales are due to Caroline Ellison, and with the resources that Ellison commanded as co-CEO of Alameda
C'mon, if she's a true maximizer using depositors money, I guess she'd just download it from z-library
OMG, is this why z-lib was recently seized by FBI?
quinn @ 2022-11-20T19:47 (+1)
Poor accounting, possibly just no really global accounting or sense of where the money was going;
I chatted with an Alameda python dev for about an hour. I tried to get a sense of their testing culture, QA practices, etc. Lmao: there didn't seem to be any. Soups of scripts, no time for tests, no internal audits. Just my impression.
My type-driven and property-based testing zealot/pedant side has harvested some bayes points, unfortunately.
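For concreteness, here's a minimal sketch of the kind of property-based test being gestured at, using the hypothesis library; the Ledger class is entirely made up for illustration, not anything from Alameda's actual codebase:

```python
from dataclasses import dataclass, field
from hypothesis import given, strategies as st

@dataclass
class Ledger:
    """Toy internal ledger (hypothetical example, not real exchange code)."""
    balances: dict = field(default_factory=dict)

    def deposit(self, account: str, amount: int) -> None:
        self.balances[account] = self.balances.get(account, 0) + amount

    def transfer(self, src: str, dst: str, amount: int) -> None:
        if self.balances.get(src, 0) < amount:
            raise ValueError("insufficient funds")
        self.balances[src] -= amount
        self.balances[dst] = self.balances.get(dst, 0) + amount

# Property: no sequence of internal transfers, valid or rejected,
# may change the total of all balances.
@given(st.lists(st.tuples(st.sampled_from("abc"), st.sampled_from("abc"),
                          st.integers(min_value=0, max_value=100))))
def test_transfers_conserve_total(transfers):
    ledger = Ledger()
    for account in "abc":
        ledger.deposit(account, 100)
    total_before = sum(ledger.balances.values())
    for src, dst, amount in transfers:
        try:
            ledger.transfer(src, dst, amount)
        except ValueError:
            pass  # a rejected transfer is fine; it just must not corrupt state
    assert sum(ledger.balances.values()) == total_before
```

hypothesis then searches for counterexample sequences automatically; an invariant like "customer balances are conserved" is exactly the kind of property a globally-audited accounting system would enforce.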
specbug @ 2022-11-16T07:58 (+1)
If there's anyone other than FTX who's really to blame, here, it's me. I've written some fiction that tries to walk people through the experience of abandoning sunk costs and facing reality. Including my most recent work.
Caroline Ellison, according to her tumblr, had even started reading it...
But her liveblogs cut out before she got very far in.
I just wasn't a good-enough writer; I lost my reader's attention, and with it, perhaps, the world.
We do not know her exact state of mind when FTX (mis)used customer deposits. But, for all their worthiness, I wouldn't have predicted ahead of time that EY's writings carried enough weight to be the deciding factor. I think the simple answer, on the current corpus of evidence, is just: 'FTX.'
EliezerYudkowsky @ 2022-11-16T09:37 (+17)
I was not being serious there. It was meant to show - see, I could blame myself too, if I wanted to be silly; now don't be that silly.
Davidmanheim @ 2022-11-16T10:04 (+9)
I think you probably need to label your account "EliezerYudkowsky (parody)" because otherwise a few people might not realize you're occasionally being sarcastic, and then you might get banned from Twitter.