The FTX Future Fund team has resigned

By Nick_Beckstead, leopold, ab, ketanrama @ 2022-11-11T02:04 (+680)

We were shocked and immensely saddened to learn of the recent events at FTX. Our hearts go out to the thousands of FTX customers whose finances may have been jeopardized or destroyed.

We are now unable to perform our work or process grants, and we have fundamental questions about the legitimacy and integrity of the business operations that were funding the FTX Foundation and the Future Fund. As a result, we resigned earlier today.

We don’t yet have a full picture of what went wrong, and we are following the news online as it unfolds. But to the extent that the leadership of FTX may have engaged in deception or dishonesty, we condemn that behavior in the strongest possible terms. We believe that being a good actor in the world means striving to act with honesty and integrity. 

We are devastated to say that it looks likely that there are many committed grants that the Future Fund will be unable to honor. We are so sorry that it has come to this. We are no longer employed by the Future Fund, but, in our personal capacities, we are exploring ways to help with this awful situation. We joined the Future Fund to support incredible people and projects, and this outcome is heartbreaking to us. 

We appreciate the grantees' work to help build a better future, and we have been honored to support it. We're sorry that we won't be able to continue to do so going forward, and we deeply regret the difficult, painful, and stressful position that many of you are now in.

To reach us, grantees may email grantee-reachout@googlegroups.com. We know grantees must have many questions, and in our personal capacities we will try to answer them as best as we can given the circumstances.


Nick Beckstead

Leopold Aschenbrenner

Avital Balwit

Ketan Ramakrishnan

Will MacAskill


Sharmake @ 2022-11-11T15:09 (+141)

What do EA and the FTX Future Fund team think of Kerry Vaughan's claim that Sam Bankman-Fried engaged in severely unethical behavior in the past, and that EA and FTX covered it up and laundered his reputation, effectively letting him get away with it?

I'm posting because, if true, this suggests big changes to EA norms are necessary to deal with bad actors like him, and that Sam Bankman-Fried should be outright banned from the forum and from EA events.

Link to tweets here:

https://twitter.com/KerryLVaughan/status/1590807597011333120

Kerry_Vaughan @ 2022-11-12T00:20 (+225)

I want to clarify the claims I'm making in the Twitter thread.

I am not claiming that EA leadership or members of the FTX Future Fund knew Sam was engaging in fraudulent behavior while they were working at the Future Fund.

Instead, I am saying that friends of mine in the EA community worked at Alameda Research during the first 6 months of its existence. At the end of that period, many of them suddenly left all at once. In talking about this with people involved, my impression is:

1) The majority of staff at Alameda were unhappy with Sam's leadership of the company. Their concerns about Sam included his taking extreme and unnecessary risks and losing large amounts of money; poor safeguards around moving money; poor capital controls, including a lack of distinction between money owned by investors and money owned by Alameda itself; and Sam generally being extremely difficult to work with.

2) The legal ownership structure of Alameda did not reflect the ownership structure that had been agreed to by the parties involved. In particular, Sam registered Alameda under his sole ownership and not as jointly owned by him and his cofounders. This was not thought to be a problem because everyone trusted each other as EAs.

3) Eventually, the conflict got serious enough that a large cohort of people decided to quit all at once. Sam refused to honor the agreed ownership structure of Alameda and used his legal ownership to screw people out of their rightful ownership stake in Alameda Research.

4) Several high-ranking and well-respected members of the EA community with more familiarity with the situation believed that Sam had behaved unethically in his handling of the situation. I heard this from multiple sources personally.

5) I believe the basic information above circulated widely among EA leadership and was known to some members of FTX Future Fund.

A person who did work at Alameda during this period described Sam's behavior as follows:

ruthless, immoral backstabbing, and exploitation of EA’s willingness to always play cooperate. it was also extreme risk-taking (including totally unnecessary EV- risks), and a cavalier disregard for any of the standard legal structures or safeguards around managing money. current events are entirely unsurprising and follow from the exact same pattern of behaviour.

Additionally, Jeffrey Ladish has a Twitter thread that further suggests that concerns about Sam's business practices were somewhat widespread.

Information about pre-2018 Alameda is difficult to obtain because the majority of those directly involved signed NDAs before their departure in exchange for severance payments. I am aware of only one employee who did not. The other people who can speak freely on the topic are early investors in Alameda and members of the EA community who heard about Alameda from those directly involved before they signed their NDAs.

I want to add that I am reasonably afraid that my discussing this will make me some powerful enemies in the EA community. I didn't raise this issue sooner because of this concern. If anyone else wants to step up and reveal their information or otherwise help agitate strongly to ensure all the information about this situation comes to light, that would be most appreciated. 

I think this is really important.

ftxthrowaway @ 2022-11-12T05:27 (+230)

I was one of the people who left at the time described. I don't think this summary is accurate, particularly (3).

(1) seems the most true, but anyone who's heard Sam on a podcast could tell you he has an enormous appetite for risk. IIRC he's publicly stated they bet the entire company on FTX despite thinking it had a <20% chance of paying off. And yeah, when Sam plays League of Legends while talking to famous investors he seems like a quirky billionaire; when he does it to you he seems like a dick. There are a lot of bad things I can say about Sam, but there's no elaborate conspiracy.

Lastly, my severance agreement didn't have a non-disparagement clause, and I'm pretty sure no one's did. I assume that you are not hearing from staff because they are worried about the looming shitstorm over FTX now, not some agreement from four years ago.

When said shitstorm dies down I might post more and under my real name, but for now the phrase "wireless mouse" should confirm to anyone else who was also there that I worked there at the time.

nbouscal @ 2022-11-12T10:51 (+165)

I'm the person that Kerry was quoting here, and am at least one of the reasons he believed the others had signed agreements with non-disparagement clauses. I didn't sign a severance agreement for a few reasons: I wanted to retain the ability to sue, I believed there was a non-disparagement clause, and I didn't want to sign away rights to the ownership stake that I had been verbally told I would receive. Given that I didn't actually sign it, I could believe that the non-disparagement clauses were removed and I didn't know about it, and people have just been quiet for other reasons (of which there are certainly plenty).

I think point 3 is overstated but not fundamentally inaccurate. My understanding was that a group of senior leadership offered to buy Sam out, he declined, and he bought them out instead. My further understanding is that his negotiating position was far stronger than it should have been due to him having sole legal ownership (which I was told he obtained in a way I think it is more than fair to describe as backstabbing). I wasn't personally involved in those negotiations, in part because I clashed with Sam probably worse than anyone else at the company, which likely would have derailed them.

That brings me to my next point, which is that I definitely had one of the most negative opinions of Sam and his actions at the time, and it's reasonable for people to downweight my take on all of this accordingly. That said, I do feel that my perspective has been clearly vindicated by current events.

I want to push back very strongly against the idea that this was primarily about Sam's appetite for risk. Yes, he has an absurd appetite for risk, but what's more important is what kinds of risks he has an appetite for. He consistently displayed a flagrant disregard for legal structures and safeguards, a belief that rules do not apply to him, and an inclination to see the ends as justifying the means. At this stage it's clear that what happened at FTX was fraud, plain and simple, and his decision to engage in that fraud was entirely in character.

(As a minor note, I can confirm that the "wireless mouse" phrase does validate ftxthrowaway as someone who was there at the time, though of course now that it has been used this way publicly once it will no longer be valid in the future.)

John_Maxwell @ 2022-11-12T17:07 (+45)

I'm curious if you (or any other "SBF skeptic") has any opinion regarding whether his character flaws should've been apparent to more people outside the organizations he worked at, e.g. on the basis of his public interviews. Or alternatively, were there any red flags in retrospect when you first met him?

I'm asking because so far this thread has discussed the problem in terms of private info not propagating. But I want to understand if the problem could've been stopped at the level of public info. If so, that suggests that a solution of just getting better at propagating private info may be unsatisfactory -- lots of EAs had public info about SBF, but few made a stink.

I'm also interested to hear "SBF skeptic" takes on the extent to which his character flaws were a result of his involvement in EA. Or maybe something about being raised consequentialist as a kid? Like, if we believe that SBF would've been a good person if it weren't for exposure to consequentialist ideas, that suggests we should do major introspection.

nbouscal @ 2022-11-12T17:58 (+104)

One of the biggest lessons I learned from all of this is that while humans are quite good judges of character in general, we do a lot worse in the presence of sufficient charisma, and in those cases we can't trust our guts, even when they're usually right. When I first met SBF, I liked him quite a bit, and I didn't notice any red flags. Even during the first month or two of working with him, I kind of had blinders on and made excuses for things that in retrospect I shouldn't have.

It's hard for me to say about what people should have been able to detect from his public presence, because I haven't watched any of his public interviews. I put a fair amount of effort into making sure that news about him (or FTX) didn't show up in any of my feeds, because when it did I found it pretty triggering.

Personally, I don't think his character flaws are at all a function of EA. To me, his character seems a lot more like what I hear from friends who work in politics about what some people are like in that domain. Given his family is very involved in politics, that connection seems plausible to me. This is very uncharitable, but: from my discussions with him he always seemed a lot more interested in power than in doing good, and I always worried that he just saw doing good as an opportunity to gain power. There's obviously no way for me to have any kind of confidence in that assessment, though, and I don't think people should put much weight on it.

John_Maxwell @ 2022-11-12T18:50 (+27)

Thanks for the reply!

In terms of public interviews, I think the most interesting/relevant parts are him expressing willingness to bite consequentialist/utilitarian bullets in a way that's a bit on the edge of the mainstream Overton window, but I believe would've been within the EA Overton window prior to recent events (unsure about now). BTW I got these examples from Marginal Revolution comments/Twitter.

  • This one seems most relevant -- the first question Patrick asks Sam is whether the ends justify the means.

  • In this interview, search for "So why then should we ever spend a whole lot of money on life extension since we can just replace people pretty cheaply?" and "Should a Benthamite be risk-neutral with regard to social welfare?"

In any case, given that you think people should put hardly any weight on your assessment, it seems to me that as a community we should be doing a fair amount of introspection. Here are some things I've been thinking about:

  • We should update away from "EA exceptionalism" and towards self-doubt. (EDIT: I like this thread about "EA exceptionalism", though I don't agree with all the claims.) It sounds like you think more self-doubt would've been really helpful for Sam. IMO, self-doubt should increase in proportion to one's power. (Trying to "more than cancel out" the normal human tendency towards decreased self-doubt as power increases.) This one is tricky, because it seems bad to tell people who already experience Chidi Anagonye-style crippling self-doubt that they should self-doubt even more. But it certainly seems good for our average level of self-doubt to increase, even if self-doubt need not increase in every individual EA. Related: Having the self-awareness to know where you are on the self-doubt spectrum seems like an important and unsolved problem.

  • I'm also wondering if I should think of "morality" as being two different things: A descriptive account of what I value, and (separately) a prescriptive code of behavior. And then, beyond just endorsing the abstract concept of ethical injunctions, maybe it would be good to take a stab at codifying exactly what they should be. The idea seems a bit under-operationalized, although it's likely there are relevant blog posts that aren't coming to my mind. Like, I notice that the EA who's most associated with the phrase "ethical injunctions" is also the biggest advocate of drastic unilateral action, and I'm not sure how to reconcile that (not trying to throw shade -- genuinely unsure). EDIT: This is a great tweet; related.

Institutional safeguards are also looking better, but I was already very in favor of those and puzzled by the lack of EA interest, so I can't say it was a huge update for me personally.

Arepo @ 2022-11-12T23:30 (+19)

This one is tricky, because it seems bad to tell people who already experience Chidi Anagonye-style crippling self-doubt that they should self-doubt even more.

EA self-doubt has always seemed weirdly compartmentalized to me. Even the humblest people in the movement are often happy to dismiss considered viewpoints by highly intelligent people on the grounds that they don't satisfy EA principles. This includes me - I think we are sometimes right to do so, but probably do so far too much nonetheless.

John_Maxwell @ 2022-11-13T03:23 (+1)

Seems plausible, I think it would be good to have a dedicated "translator" who tries to understand & steelman views that are less mainstream in EA.

Wasn't sure about the relevance of that link?

Arepo @ 2022-11-13T09:56 (+15)

(from phone) That was an example of an EA being highly upvoted for dismissing multiple extremely smart and well-meaning people's life's work as 'really flimsy and incredibly speculative', because he wasn't satisfied that they could justify their work within a framework that the EA movement has decided is one of the only ones worth contemplating. As if that framework itself isn't incredibly speculative (and therefore, if you reject any of its many suppositions, really flimsy)

John_Maxwell @ 2022-11-13T18:15 (+7)

Thanks!

I'm not sure I share your view of that post. Some quotes from it:

...he just believed it was really important for humanity to make space settlements in order for it to survive long-term... From what I could tell, [my professor] probably spend less than 10 hours seriously figuring out if space settlements would actually be more valuable to humanity than other alternatives.

...

Take SpaceX, Blue Origin, Neurolink, OpenAI. Each of these started with a really flimsy and incredibly speculative moral case. Now, each is probably worth at least $10 Billion, some much more. They all have very large groups of brilliant engineers and scientists. They all don't seem to have researchers really analyzing the missions to make sure they actually make sense.

...

My impression is that Andrew Carnegie spent very little, if anything, to figure out if libraries were really the best use of his money, before going ahead and funding 3,000 libraries.

...

I rarely see political groups seriously red-teaming their own policies, before they sign them into law, after which the impacts can last for hundreds of years.

I don't think any of these observations hinge on the EA framework strongly? Like, do we have reason to believe Andrew Carnegie spent a significant amount trying to figure out if libraries were a great donation target by his own lights, as opposed to according to the EA framework?

The thing that annoyed me about that post was that at the time it was written, it seemed to me that the EA movement was also fairly guilty of this! (It was written before the criticism/red teaming contest.)

Arepo @ 2022-11-14T11:54 (+7)

I'm not familiar enough with the case of Andrew Carnegie to comment and I agree on the point of political tribalism. The other two are what bother me. 

On the professor, the problem is there explicitly: you omitted a key line, 'I tried asking for his opinion on existential threats', which is a strongly EA-identifying approach, and one which many people feel is too simplistic. E.g. see Gideon Futurman's EAGx Rotterdam talk when it's up - he argues that the way EAs think about x-risk is far too simplified, focusing on single-event narratives and ignoring countless possible trajectories that could end in extinction or similar, any one of which is vanishingly unlikely, but which collectively we should take much more seriously. Whether or not one agrees with this view, it seems to me to be one a smart person could reasonably hold, and it shows that by asking someone 'his opinion on existential threats, and which specific scenarios these space settlements would help with', you're pigeonholing them into an EA-aligned, specific-single-event way of thinking.

As for Elon Musk, I think the same problem is there implicitly: he's written a paper called 'Making Humans a Multiplanetary Species', spoken extensively on the subject and spent his life thinking that it's important, and while you could reasonably disagree with his arguments, I don't see any grounds for dismissing them as 'really flimsy and incredibly speculative' without engagement, unless your reason for doing so is 'there exists a pool of important research which contradicts them and which I think is correct'. There are certainly plenty of other smart people who think as he does, some of them EAs (though maybe that doesn't contribute to my original complaint). Since there's a very clear mathematical argument that it's harder to kill all of a more widespread and numerous civilisation, to say that the case is 'really flimsy', you basically need to assume the EA-aligned narrative that AI is highly likely to kill us all.

John_Maxwell @ 2022-11-15T02:37 (+4)

Thanks!

Simon Bazelon @ 2022-11-12T19:14 (+3)

What's interesting about this interview clip though is that he seems to explicitly endorse a set of principles that directly contradict the actions he took! 

John_Maxwell @ 2022-11-12T19:23 (+4)

Well that's the thing -- it seems likely he didn't see his actions as contradicting those principles. Suggesting that they're actually a dangerous set of principles to endorse, even if they sound reasonable. That's what's really got me thinking.

I wonder if part of the problem is a consistent failure of imagination on the part of humans to see how our designs might fail. Kind of like how an amateur chess player devotes a lot more thought to how they could win than how their opponent could win. So if the principles Sam endorsed are at all recoverable, maybe they could be recovered via a process like "before violating common-sense ethics for the sake of utility, go down a massive checklist searching for reasons why this could be a mistake, including external observers in the decision if possible".

Sharmake @ 2022-11-12T19:37 (+3)

My guess is standard motivated reasoning explains why he thought he wasn't in violation of his stated principles.

Question, though: why exactly do you think the principles were dangerous? I am confused about the danger you're pointing to.

John_Maxwell @ 2022-11-12T19:41 (+4)

I think your first paragraph provides a potential answer to your second :-)

There's an implicit "Sam fell prey to motivated reasoning, but I wouldn't do that" in your comment, which itself seems like motivated reasoning :-)

(At least, it seems like motivated reasoning in the absence of a strong story for Sam being different from the rest of us. That's why I'm so interested in what people like nbouscal have to say.)

Sharmake @ 2022-11-12T19:53 (+4)

So you think there's too much danger of cutting yourself and everyone else via motivated reasoning, à la Dan Luu's "Normalization of Deviance", and that the principles have little room for errors in implementing them - is that right?

Here's a link to it:

https://danluu.com/wat/

And a quote:

most human beings perceive themselves as good and decent people, such that they can understand many of their rule violations as entirely rational and ethically acceptable responses to problematic situations. They understand themselves to be doing nothing wrong, and will be outraged and often fiercely defend themselves when confronted with evidence to the contrary.

John_Maxwell @ 2022-11-12T19:59 (+3)

I'm not sure what you mean by "the principles have little room for errors in implementing them".

That quote seems scarily plausible.

EDIT: Relevant Twitter thread

Sharmake @ 2022-11-12T20:03 (+3)

Specifically, I was saying that wrong results would come up if you failed at one of the steps of reasoning, and that there's no self-correction mechanism for the kind of bad reasoning Sam Bankman-Fried was engaged in.

ftxthrowaway @ 2022-11-12T19:19 (+34)

I do feel that my perspective has been clearly vindicated by current events.

Can I ask the obvious question of whether you made money by shorting FTT? You were both one of the most anti-FTX people and one of those still most involved in crypto trading, so I suspect that if you didn't, no one did.

PS: apologies for burning the "wireless mouse" commons. If others want to make throwaways, feel free to DM me what that phrase refers to and I will publicly comment my verification.

arthrowaway @ 2022-11-13T17:45 (+27)

Also no non-disparagement clause in my agreement. FWIW I was one of the people who negotiated the severance stuff after the 2018 blowup, and I feel fairly confident that that holds for everyone. (But my memory is crappy, so that's mostly because I trust the FB post about what was negotiated more than you do.) 

DM'd you.

ftxthrowaway @ 2022-11-13T18:23 (+10)

Confirming that this account made an Alameda Research reference in my DMs.

nbouscal @ 2022-11-13T18:05 (+5)

… I assume you realise that that narrows you down to one of two people (given it's safe to assume Nishad is not currently spending his time on the EA Forum)

I do think I was probably just remembering incorrectly about this to be honest, I looked back through things from then and it looks like there was a lot of back-and-forth about the inclusion of an NDA (among other clauses), so it seems very plausible that it was just removed entirely during that negotiation (aside from the one in the IP agreement).

arthrowaway @ 2022-11-13T18:31 (+4)

… I assume you realise that that narrows you down to one of two people (given it's safe to assume Nishad is not currently spending his time on the EA Forum)

yep, not too worried about this. thanks for flagging :) 

nbouscal @ 2022-11-12T20:08 (+11)

Can I ask the obvious question of whether you made money by shorting FTT? You were both one of the most anti-FTX people and one of those still most involved in crypto trading, so I suspect that if you didn't, no one did.

 

I've been on leave from work due to severe burnout for the last couple months (and still am), and was intentionally avoiding seeing anything about SBF/FTX outside of work until recent events made that basically impossible. So no, I didn't personally trade on any of this at all.

ftxthrowaway @ 2022-11-12T20:20 (+12)

Fair. Sorry to hear that, I hope you can go back to ignoring the situation soon!

Anonymous (for unimpressive reasons =[ ) @ 2022-11-13T23:07 (+10)

Can you answer two questions related to the source of SBF's early business wealth?

Were the Kimchi arb returns real?

As you know, the "Kimchi premium" was this difference in BTC price between Korea (Japan?) and the rest of the world.

The narrative is that SBF arbed this price difference to make many millions and create his early wealth.

The Sequoia puff piece tells this cute story:

Curious, SBF had started looking into crypto—and almost immediately noticed something strange. Bitcoin was trading at a higher price in Japan and Korea than it was in the U.S. In theory, this should never happen because it represents a riskless profit opportunity—in other words, a free lunch. One simply buys Bitcoin at the lower price, sells it at the higher price, and pockets the difference. Jane Street built an empire on high-frequency trades that took advantage of fraction-of-a-cent price differences. But here was Bitcoin, trading at around $15,000 in South Korea: an unheard-of 50 percent price premium.

After SBF's fall, Twitter speculation says this is dubious. 

This is because the cause of the Kimchi premium was strict legal capital controls, and the liquidity was orders of magnitude too small to produce the wealth SBF later used. At best, SBF was actively breaking laws to make this trade. The amount of money he could make may have been too small to justify the narratives around his early success.

Do you have any comments on the above? 

 

Jaan Tallinn investment

Tallinn later ended up funding SBF with $50M. 

What would you say to the speculation that it was this funding, and not the Kimchi arb, that really launched SBF's career?

 

If this is mostly true, the takeaway is that there's little cleverness or competence being expressed here?

It seems like power, money, and access led to SBF's success. This theme would fit with SBF's later behavior, with its bluffing and overawing spend.

That tradition seems hollow and bad, maybe contagious to the things that SBF created or touched.

This could be useful in some way? It seems like a vector that EA, or EA PR, could take to counter this.

nbouscal @ 2022-11-14T00:31 (+112)

I don't mind sharing a bit about this. SBF desperately wanted to do the Korea arb, and we spent quite a bit of time coming up with any number of outlandish tactics that might enable us to do so, but we were never able to actually figure it out. The capital controls worked. The best we could do was predict which direction the premium would go and trade into KRW and then back out of it accordingly.

Japan was different. We were able to get a Japanese entity set up, and we did successfully trade on the Japan arb. As far as I know we didn't break any laws in doing so, but I wasn't directly involved in the operational side of it. My recollection is that we made something like 10-30 million dollars (~90% CI) off of that arb in total, but I'm not at all confident on the exact amount.

Is that what created his early wealth, though? Not really. Before we all left, pretty much all of that profit had been lost to a series of bad trades and mismanagement of assets. Examples included: some number of millions lost to a large directional bet on ETH (that Sam made directly counter to the predictions of our best event trader); a few million more on a large OTC trade in some illiquid shitcoin that crashed long before we could get out of it; another couple million in a series of XRP transfers that nobody noticed had never arrived, and that had fallen in value by something like 90% when they finally showed up much later; and various other random small things, like a junior trader accidentally transferring half a million dollars of USDT to a BTC address (or something like that) due to a complete lack of safeguards on transfers. Not to mention absurd levels of expenditures, e.g. an AWS bill that at one point reached about a quarter million dollars per month.

My knowledge of the story ends when we left, and my recollection is that at that point the Japan arb had long been closed and most of our profits from it had been squandered. I don't know how he achieved his later success, but if I were to guess, I'd say it probably has a lot more to do with setting up FTX, launching highly predatory instruments like leveraged ETF tokens on it, and doing similarly shady stuff to the things that brought it all crashing down, but during a bull market that made all of those risks pay off. That's entirely guesswork though, I have no inside knowledge about anything that happened after April 2018.

Note: All of this is purely from memory, I have not cross-checked it with anyone else who was there, and it could be substantially wrong in the details. It has been a long time, and I spent most of that time trying to forget all about it. I'm sharing this because I believe the broad strokes of it to be accurate, but please do not update too strongly from it, nor quote it without mentioning this disclaimer.

Sabs @ 2022-11-14T08:01 (+4)

What about the GBTC arb trade? Did Alameda get into that during your time there?

nbouscal @ 2022-11-14T08:23 (+4)

Good question, but tbh I just don’t remember the answer.

Yitz @ 2022-11-14T04:17 (+4)

Thank you for sharing, I can understand why you might be feeling burnt out!! I've been in a workplace environment that reminds me of this, and especially if you care about the people and projects there...it's painful.

John G. Halstead @ 2022-11-12T12:15 (+12)

Thanks for sharing this, nbouscal. How many people did you tell about this at the time?

nbouscal @ 2022-11-12T12:57 (+54)

Personally, I remember telling at least a handful of people at the time that Sam belonged in a jail cell, but I expect that people thought I was being hyperbolic (which was entirely fair, I was traumatised and was probably communicating in a way that signalled unreliability).

I was told that conversations were had with people in leadership roles in EA. I wasn’t part of those conversations and don’t know the full details of what was discussed or with whom.

Asa Cooper Stickland @ 2022-11-12T19:12 (+21)

It would be awesome for the names of senior people who knew to be made public, plus the exact nature of what they were told and their response or lack thereof.

Ozzie Gooen @ 2022-11-12T20:37 (+5)

I think this could be a nice-to-have, but really, I think it's too much to ask: "For every senior EA, we want a long list of exactly each thing they knew about SBF."


This would probably be a massive pain, and much of the key information will be confidential (for example, informants who want to remain anonymous). 

My guess is that there were a bunch of flags that were more apparent than the ones in nbouscal's stories.

I do think we should have really useful summaries of the key results. If there were a few people who were complicit or highly negligent, then that should be reported, and appropriate actions taken. 

Devon Fritz @ 2022-11-13T12:14 (+9)

I strongly believe it is hyperrelevant to know who knew what, and when, so that these people are held to account. I don't think this is too much to ask, nor does it have to be arduous in the way you described of getting every name with max fidelity. I see so many claims that "key EA members knew what was going on" and never any name associated with them.

Ozzie Gooen @ 2022-11-13T15:27 (+8)

I agree this is really important and would really, really want it to be figured out, and key actions taken. I think I'm less focused on all of the information of such a discovery being public, as opposed to much of it being summarized a bit.

Isaac King @ 2022-11-15T01:05 (+2)

A summary of sorts is being compiled here:

pseudonym @ 2022-11-13T04:18 (+2)

What would you suggest might be appropriate actions for complicity or negligence? 

Ozzie Gooen @ 2022-11-13T05:19 (+2)

I don't feel like I'm in a good place to give a good answer. First, I haven't really thought about it, nor am I an expert in these sorts of matters.

Second, I'm like several layers deep in funding structures that start with these people. It's sort of like asking me to publicly write what I love/hate, objectively, about my boss.

I think I could say that I'd expect appropriate actions to look a lot like they do with top companies (mainly ones without lots of known management integrity problems). At these companies, I believe that when some officials are investigated for potential issues, often they're given no punishment, and sometimes they're fired. It really depends on the details of the findings.

John G. Halstead @ 2022-11-12T10:38 (+116)

I think it is very important to understand what was known about SBF's behaviour during the initial Alameda breakup, and for this to be publicly discussed and to understand if any of this disaster was predictable beforehand. I have recently spoken to someone involved who told me that SBF was not just cavalier, but unethical and violated commonsense ethical norms. We really need to understand whether this was known beforehand, and if so learn some very hard lessons. 

It is important to distinguish different types of risk-taking here: (1) risk-taking that promises high payoffs but with a high chance of the bet falling to zero, without violating commonsense ethical norms; (2) risk-taking in the sense of being willing to risk it all by secretly violating ethical norms to get more money. One flaw in SBF's thinking seemed to be that risk-neutral altruists should take big risks because the returns can only fall to zero. In fact, the returns can go negative - e.g. all the people he has stiffed, and all of the damage he has done to EA.

Kerry_Vaughan @ 2022-11-12T10:47 (+17)

I have recently spoken to someone involved who told me that SBF was not just cavalier, but unethical and violated commonsense ethical norms.

Are you in a position to be more specific about what SBF did that this is referring to?

John G. Halstead @ 2022-11-12T12:11 (+8)

no

EliezerYudkowsky @ 2022-11-12T02:16 (+121)

In 2021 I tried asking about SBF among what I suppose you could call "EA leadership", trying to distinguish whether to put SBF into the column of "keeps compacts but compacts very carefully" versus "un-Lawful oathbreaker", based on having heard that early Alameda was a hard breakup.  I did not get a neatly itemized list resembling this one on either points 1 or 2, just heard back basically "yeah early Alameda was a hard breakup and the ones who left think they got screwed" (but not that there'd been a compact that got broken) (and definitely not that they'd had poor capital controls), and I tentatively put SBF into column 1.  If "EA leadership" had common knowledge of what you list under items 1 or 2, they didn't tell me about it when I asked.  I suppose in principle that I could've expended some of my limited time and stamina to go and inquire directly among the breakup victims looking for one who hadn't signed an NDA, but that's just a folly of perfect hindsight.

My own guess is that you are mischaracterizing what EA leadership knew.

Habryka @ 2022-11-12T17:21 (+165)

Huh, I am surprised that no one responded to you on this. I wonder whether I was part of that conversation, and if so, I would be interested in digging into what went wrong. 

I definitely would have put Sam into the "un-lawful oathbreaker" category, and I have warned many people I have been working with that Sam has a reputation for dishonesty and that we should limit our engagement with him (and more broadly I have been complaining to many of the current leadership about an erosion of honesty norms among EA leadership, conversations in which I often brought up Sam directly as one of the sources of my concern).

I definitely had many conversations with people in "EA leadership" (which is not an amazingly well-defined category) where people told me that I should not trust him. To be clear, nobody I talked to expected wide-scale fraud, and I don't think this included literally everyone, but almost everyone I talked to told me that I should assume that Sam lies substantially more than population-level baseline (while also being substantially more strategic about his lying than almost everyone else).

I do want to add to this that in addition to Sam having a reputation for dishonesty, he also had a reputation for being vindictive, and almost everyone who told me about their concerns about Sam did so while seeming quite visibly afraid of retribution from Sam if they were to be identified as the source of the reputation, and I was never given details without also being asked for confidentiality. 

Milan_Griffes @ 2022-11-12T18:40 (+57)

Can you give some context on why Lightcone accepted an FTX Future Fund grant (a) given your view of his trustworthiness?

Habryka @ 2022-11-12T20:11 (+94)

So far I have been running on the policy that I will accept money from people who seem immoral to me, and indeed I preferred getting money from Sam instead of Open Philanthropy or other EA funders because I thought this would leave the other funders with more marginal resources that could be used to better ends (Edit: I also separately thought that FTX Foundation money would come with more freedom for Lightcone to pursue its aims independently, which I do think was a major consideration I don't want to elide).

To be clear, I think there is a reasonable case to be made for the other end of this tradeoff, but I currently still believe that it's OK for EAs to take money from people whose values or virtues they think are bad (and that indeed this is often better than taking money from the people who share your values and virtues, as long as it's openly and willingly given). I think the actual tradeoffs are messy, and indeed I ended up encouraging us to go with a different funder for a loan arrangement for a property purchase we ended up making, since that kind of long-term relationship seemed much worse to me, and I was more worried about that entangling us more with FTX.

To be again clear, I was not suspecting large-scale fraud. My sense was that Sam was working in a shady industry while being pretty dishonest in the way the crypto industry often is, but was primarily making money by causing tons of people to speculate in crypto while also being really good at trading against them and eating their lunch, which I think is like, not a great thing to do, but was ultimately within the law and was following reasonable deontological constraints in my opinion. 

I am seriously considering giving back a bunch of the money we received. I also for pretty similar reasons think that giving that money back does definitely not entail giving that money back to FTX right now, who maybe are just staging a hack on their own servers (or are being hacked) and should not be trusted with more resources. I expect this will instead require some kind of more sophisticated mechanism of actually helping the people who lost funds (conditional on the bankruptcy proceedings not doing clawbacks, which I think is reasonable given that I think clawbacks are unlikely).

I think it personally might have been better to have a policy of refusing funds from institutions that I think are bad and have power in my social ecosystem, so that I feel more comfortable speaking out against them. I personally prefer the policy of taking their money while also having a policy of just speaking out against them anyways (Dylan Matthews did this in one of his Future Perfect articles in a way I find quite admirable), but I do recognize this is setting myself up for a lot of trust in my own future integrity, and it might be better to tie myself to a mast here. 

I think the key damage caused by people in my reference class receiving funds from FTX was that they felt less comfortable criticizing FTX, and I think indeed in-retrospect I was more hesitant than I wish I would have been to speak out against Sam and FTX for this reason, and am currently spending a lot of my time trying to understand how to update and learn from this. It's pretty plausible to me that I fucked up pretty badly here, though I currently think my fuckup was not being more public about my concerns, and not the part where I accepted Sam's money. I also think confidentiality concerns were a major problem here, and it's pretty plausible another component of my fuckup was to agree to too much confidentiality in a way that limited what I could say here.

RobBensinger @ 2022-11-13T01:33 (+28)

In situations like this, it might be a good habit to state reservations publicly at the same time you receive the grant? Then your accepting the grant isn't a signal that you endorse the grantmaker, and you can be less worried about your relationship with the grantmaker damaging your future ability to be candid. Either they stop giving you money, or they continue giving you money even though you badmouthed them (which makes it more clear that you have impunity to do so again in the future).

Geoffrey Miller @ 2022-11-14T18:16 (+3)

Interesting idea. 

But it seems unrealistic to expect a recipient of a grant, upon receiving it, to publicly announce ethical and legal reservations about the grant-giver... and then for the grant-giver to be OK with that, and to follow through on providing the grant funding.

'Biting the hand that feeds you' doesn't typically result in good outcomes.

RobBensinger @ 2022-11-14T21:30 (+12)

Sure, though I think altruistic grantmakers should want their grantees to criticize them (because an altruistic grantmaker should care more about getting useful and actionable criticism than about looking good in the moment), and I think a lot of EA grantmakers walk the walk in that respect. E.g., MIRI has written tons of stuff publicly criticizing Open Phil, even though Open Phil is by far our largest all-time funder; and I don't think this has reduced our probability of getting future Open Phil funding.

One advantage of the norm I proposed is that it can help make this a more normal and expected practice, and (for that reason) less risky than it currently is.

And since everything's happening in public, grantmakers can accumulate track records. If you keep defunding people when they criticize you (even when the criticisms seem good and the grant recipients seem worthy, as far as others can tell), others can notice this fact and dock the grantmaker reputational points. (Which should matter to grantmakers who are optimizing this hard for their reputation in the first place.)

Geoffrey Miller @ 2022-11-14T21:38 (+11)

Fair points. I guess if any community can create a norm where it's OK for grant receivers to criticize grantmakers, it's the EA community. 

I was really just pointing out that creating and maintaining such an open, radically honest, self-reflective, criticism-welcoming culture is very much an uphill struggle, given human nature.

tcheasdfjkl @ 2022-11-12T20:02 (+31)

That's very surprising!!

Do you know if anybody attempted to propagate this information to any of the EAs who were promoting SBF publicly? (If so, do you know if they succeeded in conveying that information to them?)

And just to check, did any of the people who warned you privately promote SBF/FTX publicly?

I ask because it seems weird for a lot of EAs to be passing around warnings about SBF being untrustworthy while a lot of (other?) EAs are promoting him publicly; I very much hope these sets were disjoint, but also it's weird for them to be so disjoint, I would have expected better information flow.

Habryka @ 2022-11-12T20:25 (+64)

Yep, I was and continue to be confused about this. I did tell a bunch of people that I think promoting SBF publicly was bad, and e.g. sent a number of messages when some news article that people were promoting (or maybe an 80k interview?) was saying that "Sam sleeps on a bean bag" and "Sam drives a Corolla" when I was quite confident that they knew that Sam was living in one of the most expensive and lavish properties in the Bahamas and was definitely not living a very frugal lifestyle. This was just at the same time as the Carrick stuff was happening, and I would have likely reached out to more people if I hadn't spent a lot of my social energy on pushing back on the Carrick stuff at the time (e.g. ASB's piece on Carrick's character).

Overall, I did not message many people, and I personally did not speak out very publicly about my broader concerns. I also think a lot of that promotion was happening in a part of the EA ecosystem I interface much less with (80k, UK EAs, Will, etc.), and I've had historically somewhat tense relationships to that part of the ecosystem, so I did not have many opportunities to express my concerns.

David Mears @ 2022-11-13T15:33 (+8)

It would be useful to say whether any of the people you told would be considered 'EA leadership'; and if so, who.

Devon Fritz @ 2022-11-13T12:26 (+13)

How can both of these be true:

  1. You (and others, if all of the accounts I've been reading about are true) told EA leadership about a deep mistrust of SBF.
  2. EA decided to hold up and promote SBF as a paragon of EA values and one of the few prominent faces in the EA community.

If both of those are true, how many logical possibilities are there?

  1. The accounts that people told EA leadership are false.
  2. The accounts are true and EA leadership didn't take these accounts seriously.
  3. EA leadership took the accounts seriously, but still proceeded to market SBF.

     

I find them all super implausible so I don't know what to think!

nbouscal @ 2022-11-13T12:35 (+39)

My understanding is that the answer is basically 2.

I'd love to share more details but I haven't gotten consent from the person who told me about those conversations yet, and even if I were willing to share without consent I'm not confident enough of my recollection of the details I was told about those conversations when they happened to pass that recollection along. I hope to be able to say more soon.

EDIT: I've gotten a response and that person would prefer me not to go into more specifics currently, so I'm going to respect that. I do understand the frustration with all of the vagueness. I'm very hopeful that the EA leaders who were told about all of this will voluntarily come forward about that fact in the coming days. If they don't, I can promise that they will be publicly named eventually.

Habryka @ 2022-11-14T01:29 (+20)

My guess is different parts of leadership. I don't think many of the people I talked to promoted SBF a lot. E.g. see my earlier paragraph on a lot of this promotion being done by the more UK-focused branches that I talk much less to.

Devon Fritz @ 2022-11-14T10:10 (+4)

That could very well be, and there are a lot of moving parts. That is why I think it is important for people who supposedly warned leadership to say who was told and what they were told. If we are going to unravel this, all of that feels like necessary information.

nbouscal @ 2022-11-14T11:21 (+29)

The people who are staying quiet about who they told have carefully considered reasons for doing so, and I'd encourage people to try to respect that, even if it's hard to understand from outside.

My hope is that the information will be made public from the other side. EA leaders who were told details about the events at early Alameda know exactly who they are, and they can volunteer that information at any time. It will be made public eventually one way or another.

Devon Fritz @ 2022-11-14T11:26 (+5)

I respect that people who aren't saying what they know have carefully considered reasons for doing so. 

I am not confident it will come from the other side as it hasn't to date and there is no incentive to do so. 

May I ask why you believe it will be made public eventually? I truly hope that is the case.

nbouscal @ 2022-11-14T11:37 (+30)

The incentives for them to do so include 1) modelling healthy transparency norms, 2) avoiding looking bad when it comes out anyway, 3) just generally doing the right thing.

I personally commit to making my knowledge about it public within a year. (I could probably commit to a shorter time frame than that, that's just what I'm sure I'm happy to commit to having given it only a moment's thought.)

pseudonym @ 2022-11-13T13:54 (+11)

What do you find super implausible about 2?

NunoSempere @ 2022-11-12T13:21 (+57)

I found this comment annoying enough to read that I felt compelled to give a simplified version:

In 2021, I asked about SBF among some "senior EA people". I had heard that Alameda had had a hard breakup, and I didn't know whether SBF cheated his partners or whether he was merely a punctilious negotiator who nonetheless keeps his word. Based on what I heard, I classified SBF more as a punctilious negotiator than as a cheater, and I definitely didn't hear about them having poor capital controls at the beginning. 

My own guess is that you are mischaracterizing what EA leadership knew.

This removes some nuance, but maybe adds some clarity.

Edit: Reworded, see original here.

EliezerYudkowsky @ 2022-11-13T02:30 (+35)

I did not say that it'd be good if somebody was a ruthless negotiator.

If you're going to paraphrase somebody, please be more careful to paraphrase things they actually said, by dereferencing, and not add implications you thought they meant.

NunoSempere @ 2022-11-13T11:01 (+4)

I didn't say I was paraphrasing you, I said I was giving a simplified version. I also pointed out the sentence was not in the original.

interstice @ 2022-11-13T17:22 (+7)

Adding in an unflattering sentiment that was not said or clearly implied in the original is not "simplifying".

NunoSempere @ 2022-11-13T20:44 (+3)

Ok, fine, reworded. You can still find the original here.

Kerry_Vaughan @ 2022-11-12T02:32 (+17)

I consider this credible.

It suggests that my categorization of "EA leadership" was probably too broad and that fewer people knew the details of the situation than I believed.

That means there is a question of how many people knew. I am confident that Nick Beckstead and Will MacAskill knew about the broken agreement and other problems at Alameda. I am confident they are not the only ones that knew.

EliezerYudkowsky @ 2022-11-12T03:46 (+45)

Why are you confident of that?  In general, I think there's just less time and competence and careful checking to go around, in this world, than people would want to believe.  This isn't Hieronym's To The Stars or the partially Hieronym-inspired world of dath ilan.

tcheasdfjkl @ 2022-11-12T07:18 (+18)

Huge thanks for spelling out the specific allegations about SBF's behavior in early Alameda; for the past couple days I'd been seeing a lot of "there was known sketchy stuff at Alameda in 2017-18" and it was kind of frustrating how hard it was to get any information about what is actually alleged to have happened, so I really appreciate this clear point-by-point summary.

Yitz @ 2022-11-14T04:25 (+2)

Same here - this is really helping me understand the (at least perceived) narrative flow of events.

Sharmake @ 2022-11-12T00:57 (+7)

I decided to speak about it because, if true, it would suggest that EA hasn't remembered the lessons of the last time things went wrong.

In many senses, this is EA's first adversarial interaction, where we can't rely on internal norms of always cooperating anymore.

Jason @ 2022-11-12T01:37 (+4)

After the involved EAs consult with their lawyers, they may find a receptive audience to tell their stories at the Department of Justice or another federal agency. I would be shocked if the NDAs were effective as against cooperating with a federal investigation. If the quoted description is true, it seems relevant to the defense SBF seems to be trying to set up.

ftxthrowaway33 @ 2022-11-13T22:14 (+124)

I knew about Sam's bad character early on, and honestly I'm confused about what people would have expected me to do.

Should I have told people that Sam has a bad character and can't be trusted, and that FTX is risky? Well, I did those things, and as far as I can tell, that has made the current situation less bad than it would have been otherwise (yes, it could have been worse!). In hindsight, though, I should have done more of this.

Should I have told the authorities that Sam might be committing fraud? All I had were vague suspicions about his character and hints that he might be dishonest, but no convincing evidence or specific worries about fraud. (Add jurisdictional problems, concerns about the competence of regulators, etc.)

Should I not have "covered up" the early scandal? Well, EAs didn't, and I think Kerry's claim is wrong.

Should I have publicly spread concerns about SBF's character? That borders on slander. Also, I was concerned that SBF would permanently hate me after that (you might say I'm a coward, but hey, try it yourself).

Should I have had SBF banned from EA? Personally, I'm all for a tough stance, but the community is usually against complete bans of bad actors, so it just wasn't feasible. (E.g., if I were in charge, Jacy and Kerry would be banned, but many wouldn't like that.)

SBF was powerful and influential. EA didn't really have power over him.

What could have been done better? I am sincerely curious to get suggestions.

Habryka @ 2022-11-14T22:32 (+87)

My current, extremely tentative, sense of the situation is that the problem was not that individuals who were aware of some level of dishonesty and shadiness weren't open enough about it. I think individuals acted in pretty reasonable ways, and I heard a good amount of rumors.

I think the error likely happened at two other junctions: 

  1. Some part of EA leadership ended up endorsing SBF very publicly and very strongly despite having very likely heard about the concerns, and without following up on them (In my model of the world Will fucked up really hard here)
  2. We didn't have any good system for aggregating rumors and related information, and we didn't have anyone who was willing to just make a public post about the rumors (I think this would have been a scary and heroic thing to do, I am personally ashamed that I didn't do it, but I don't think it's something that we should expect the average person to do)

I think if we had some kind of e.g. EA newspaper where people try to actively investigate various things that seem concerning, then I think this would have helped a bunch. This kind of thing could even be circulated privately, though a public version seems also good. 

I separately also think that we should just much more deeply embed the virtues of honesty and truth-seeking into the core idea of EA. I think it shouldn't be possible to be seen as "an effective EA" without also being actually good at truth-seeking and helping other people orient to the world. 

I think when a billionaire shows up with billions of dollars, or an entrepreneur builds a great company, it should just be a strict requirement that they are also honest and good at truth-seeking in order to gain status and reputation within the community, in the same way that no matter how much money you make, people are not going to think you are a "good scientist" without your actually having discovered new verifiable regularities in the natural world (you might be a "great supporter of science", but I think that doesn't usually mean you would get invited to all the scientific conferences, or get the Nobel Prize, or something, and I think people would have a healthy understanding of your relationship to the rest of the scientific ecosystem).

Pablo @ 2022-11-15T07:54 (+30)

Agree with much of what you say here. (Though I don't think we currently have strong enough evidence to single out specific EA leaders as being especially responsible for the recent tragic events; at least I don't think I personally have that kind of information.)

As a substitute, or complement, to an investigative EA newspaper, what do you think about an "EA rumours" prediction market?[1] Some attractive features of such a market:

  • It turns private information held by individual people with privileged access to sources into public information available to the entire EA community, increasing the likelihood that the information will reach those for whom it is most valuable and actionable.
  • It potentially reduces community drama by turning "hot" debates influenced by tribal allegiances and virtue signaling into "cold" assignments of probability and assessments of evidence.
  • It makes rumours more accurate, by incentivizing users to estimate their probability correctly.
  • It makes false rumours less damaging to their targets, by explicitly associating them with a low probability.

I think this market would need judicious moderation to function well and avoid being abused. But overall it seems to me like it might be an idea worth exploring further, and of the sort that could make future events in the same reference class as the FTX debacle less likely to happen.

  1. ^

    By 'market', I do not necessarily mean a real-money prediction market like Polymarket or PredictIt; it could also be a play-money market like Manifold Markets or a forecasting platform like Metaculus.
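
For concreteness, here is a minimal sketch of how such a market could turn scattered private beliefs into one public probability, assuming a standard logarithmic market scoring rule (LMSR) market maker. The class, liquidity parameter, and numbers below are purely illustrative, not any real platform's API:

    import math

    class LMSRMarket:
        """Toy LMSR market maker for a yes/no question ("is the rumour true?")."""

        def __init__(self, liquidity: float = 100.0):
            self.b = liquidity  # higher b = prices move less per trade
            self.yes_shares = 0.0
            self.no_shares = 0.0

        def _cost(self, yes: float, no: float) -> float:
            # LMSR cost function: C(q) = b * ln(e^(q_yes/b) + e^(q_no/b))
            return self.b * math.log(math.exp(yes / self.b) + math.exp(no / self.b))

        def price_yes(self) -> float:
            # Current implied probability that the rumour is true
            e_yes = math.exp(self.yes_shares / self.b)
            e_no = math.exp(self.no_shares / self.b)
            return e_yes / (e_yes + e_no)

        def buy_yes(self, shares: float) -> float:
            # Returns the trader's cost; buying YES pushes the probability up
            before = self._cost(self.yes_shares, self.no_shares)
            self.yes_shares += shares
            return self._cost(self.yes_shares, self.no_shares) - before

    market = LMSRMarket()
    print(market.price_yes())            # 0.5 before anyone trades
    market.buy_yes(50)                   # one trader acts on private information
    print(round(market.price_yes(), 3))  # ~0.622 after the trade

One nice property relative to the manipulation worry: the liquidity parameter b sets how much capital it takes to move the displayed probability, so a deep-pocketed subject of a rumour can be made to pay dearly for distorting it (though, as Habryka notes below, not prevented outright).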

Habryka @ 2022-11-15T08:27 (+23)

Yeah, I feel excited about something in this space. Generally I feel like prediction markets have a lot of good things going for them in situations like this, though I do worry that they will somehow just end up gamed when the stakes are high. Like, my guess is Sam could have likely moved the probability of a market here a lot, either with money, or by encouraging other people to move it.

Ben Snodin @ 2022-11-15T11:55 (+11)

Should EA people just be way more aggressive about spreading the word (within the community, either publicly or privately) about suspicions that particular people in the community have bad character?

(not saying that this is an original suggestion, you basically mention this in your thoughts on what you could have done differently)

ftxthrowaway @ 2022-11-14T03:29 (+10)

Confirming that this account DM'd me with information indicating that they worked at Alameda.

ZekeFaux @ 2022-11-14T19:46 (+4)

I met Sam in February and wrote a profile of him for Bloomberg. In hindsight, there are a lot of red flags that everyone missed, myself included. Of course, it all looked different when he was on top.

At the time, I tried to research Alameda's early years and the dispute that led to the big breakup, but didn't get anywhere. I'm now working on a book - on the off chance that any insiders from Alameda or FTX read this, please DM me here or on Twitter.

Arepo @ 2022-11-11T16:42 (+89)

I'm unclear how to update on this, but note that Kerry Vaughan was at CEA for 4 years, and a managing director there for one year before, as I understand it, being let go under mysterious circumstances. He's now the program manager at a known cult that the EA movement has actively distanced itself from. So while his comments are interesting, I wouldn't treat him as a particularly credible source, and he may have his own axe to grind.

sphor @ 2022-11-11T19:25 (+70)

All this conversation about Leverage and Kerry's motives and character misses the point that he's talking about events that have little to nothing to do with him. He's saying that there was a blowup at Alameda early on, reflecting badly on SBF, that lots of EA leaders knew about and turned a blind eye to. This can be investigated and confirmed or denied without delving into conversations about Leverage or Kerry that are beside the point at hand.

Holly_Elmore @ 2022-11-11T20:31 (+81)

To the extent that Kerry's allegation involves his own judgment of Sam's actions as bad or shady, I think it matters that there's reason not to trust Kerry's judgment or possibly motives in sharing the information. However we should definitely try to find out what actually happened and determine whether it was truly predictive of worse behavior down the line.

RobBensinger @ 2022-11-11T20:23 (+34)

Agreed! IMO it's good for people to be aware that Kerry has an axe to grind; but the thing to do with that information is to look into the matter further.

ftxthrowaway @ 2022-11-12T10:26 (+50)

I commented above that I think Kerry's comment is incorrect, so I feel obligated to state that I have no reason to think this is the result of bias. I am inclined to think he's doing the best he can in an information-scarce environment.

ftxthrowaway @ 2022-11-13T16:48 (+53)

I retract this comment. Kerry has continued repeating the same claim on Twitter without noting that there's disagreement about its truth. This does not seem like unbiased behavior.

Kerry_Vaughan @ 2022-11-13T19:31 (+3)

The claim on Twitter is different.

Can you clarify what you think is unfair? Happy to issue a correction.

https://twitter.com/KerryLVaughan/status/1591508739236188160?t=qL-dGKXar3b7EQ4EHs597Q&s=19

Edit: if anyone else wants to take a stab at explaining why the Twitter thread is unfair given this thread feel free. Would want to issue a correction sooner rather than later.

richard_ngo @ 2022-11-11T16:52 (+45)

-1 on this comment. In particular, being at CEA for 4 years seems like something which makes criticism more plausible. And it's not surprising that EA has distanced itself from groups critical of us (while I have some concerns about Leverage, I think there are a bunch of ways that they've been treated unfairly).

Arepo @ 2022-11-11T17:33 (+82)

Hard disagree on Leverage. They've absorbed a tonne of philanthropic funding over the years to produce nothing but pseudoscience and multiple allegations of emotional abuse.

I'm not saying Kerry wouldn't know about this stuff - I think he likely does. I'm saying a) that he was one of the 'top leaders' he refers to, so had ample chance to do something about this himself, b) he has a track record of questionable integrity, and c) he has potential motive to undermine the people he's criticising.

richard_ngo @ 2022-11-12T01:45 (+47)

I think this comment is a pretty clear example of one way in which Leverage has been treated unfairly, which is that people lump "not very productive" and "abusive" into a single criticism. The latter is much more serious, but the former is much easier to quickly verify, and so the former ends up lending credibility to the latter even though I personally think we probably have too few groups taking philanthropic funding to do crazy research that may end up looking like pseudoscience.

To be very clear, I'm not claiming that Leverage was not an abusive environment, and I take the allegations you mention very seriously. I've just also seen people piling onto Leverage in not-very-careful ways that I'm not very happy about.

RobBensinger @ 2022-11-11T18:06 (+73)

I'm not a fan of Leverage, but I agree with Richard here. I think Kerry is better modeled as "normal philosophy-friendly EA" with the modifications "less conflict-averse than the average EA" and "mad at EA (for plenty of good reasons and also plenty of bad reasons, IMO) and therefore pretty axe-grindy". If you model him with a schema closer to "crazy cultist" than to "bitter ex-EA", I expect you to make worse predictions.

elifland @ 2022-11-11T17:11 (+49)

I’m guessing I have a lower opinion of Leverage than you based on your tone, but +1 on Kerry being at CEA for 4 years making it more important to pay serious attention to what he has to say even if it ultimately doesn’t check out. We need to be very careful to minimize tribalism hurting our epistemics.

CarolineJ @ 2022-11-11T17:27 (+59)

For what it's worth, these different considerations can be true at the same time:

  1. "He may have his own axe to grind.": that's probably true, given that he's been fired by CEA.
  2. "Kerry being at CEA for four years makes it more important to pay serious attention to what he has to say even if it ultimately doesn’t check out.": it also seems like he may have particularly useful information and contexts.
  3. "He's now the program manager at a known cult that the EA movement has actively distanced itself from": it does seem like Leverage is shady and doesn't have a very good culture and epistemic, which doesn't reflect greatly on Kerry.

So I would personally be inclined to pay close attention to his criticisms of CEA. At the same time, I would need more "positive" contexts from others to be able to trust what he says.

elifland @ 2022-11-11T17:30 (+24)

I agree that these can technically all be true at the same time, but I think the tone/vibe of comments is very important in addition to what they literally say, and the vibe of Arepo's comment was too tribalistic.

I'd also guess re: (3) that I have less trust in CEA's epistemics to necessarily be that much better than Leverage's, though I'm uncertain here (edited to add: tbc my best guess is it's better, but I'm not sure what my prior should be on who's telling the truth if there's a "he said / she said" situation. My guess is closer to 50/50 than 95/5 in log odds, at least).

CarolineJ @ 2022-11-11T17:52 (+21)

I agree that the tone was too tribalistic, but the content is correct.

(Seems a bit like a side-topic, but you can read more about Leverage on this EA Forum post and, even more importantly, in the comments. I hope that's useful for you! The comments definitely changed my views - negatively - about the utility of Leverage's outputs and some cultural issues.)

elifland @ 2022-11-11T17:56 (+19)

I've read it. I'd guess we have similar views on Leverage, but different views on CEA. I think it's very easy for well-intentioned, generally reasonable people's epistemics to be corrupted via tribalism, motivated reasoning, etc.

But as I said above I'm unsure.

Edited to add: Either way, might be a distraction to debate this sort of thing further. I'd guess that we both agree in practice that the allegations should be taken seriously and investigated carefully, ideally by independent parties.

Arepo @ 2022-11-11T23:28 (+12)

Mea culpa for not being clear enough. I don't think handwavey statements from someone whose credibility I doubt have much evidential value, but I strongly think CEA's epistemics and involvement should be investigated - possibly including Vaughan's.

I find it bleakly humourous to be interpreted as tribalistically defending CEA when I've written gradually more public criticisms of them and their lack of focus - and honestly, while I don't understand thinking they're as bad as Leverage, I think they've historically probably been a counterfactual negative for the movement, and I don't have a good sense of whether things have improved.

elifland @ 2022-11-11T23:34 (+10)

Thanks for clarifying. To be clear, I didn't say I thought they were as bad as Leverage. I said "I have less trust in CEA's epistemics to necessarily be that much better than Leverage's, though I'm uncertain here".

Dancer @ 2022-11-12T11:57 (+2)

while I don't understand thinking they're as bad as Leverage, I think they've historically probably been a counterfactual negative for the movement

I thought CEA started the movement?

Arepo @ 2022-11-12T23:10 (+31)

As I understood it, CEA was originally just a legal entity to save 80k and GWWC from having to individually get charitable status - GWWC had been around in some form since maybe 2007ish, and 80k for a year or two (and GiveWell, which had started at about the same time as CEA and arguably has as good a claim to having started the movement, had no formal association with any of these orgs). The emerging movement might have taken its name from the new org, or maybe just started using the phrase in response to the poll result.

At some stage, IIRC, CEA started taking on more responsibilities and distanced itself, and eventually split from its child orgs. From that point on, I feel like they have generally not been well run - the staff seem to have been hired for enthusiasm and allegiance to the cause, and sometimes apparent nepotism (they seem to have hired internally for quite a few positions), rather than competence. As far as I can tell, staff have neither a carrot to motivate them nor a stick: I know of only two examples of CEA employees being pushed out, one of whom was CEO, and those were, as I understand it, for behaviour that was unambiguously termination-worthy (CEA may not want to disclose details of specific individuals being let go, and if that has happened those individuals might understandably not want to talk about it either, but the org doesn't e.g. have a clear policy for expecting high standards). Meanwhile they run multiple programs whose nature is constantly changing and which lack meaningful outcome metrics, meaning both that it's hard to gauge how well they do what they do, and hard for alternative organisations to offer them high-fidelity competition.

(excuse all the self-citations - I don't know anyone else who's been publicly writing anything highly critical of CEA since the funds criticism, though I've had a number of conversations with people who're also cynical about the org. I've been fairly reluctant to go on record with these views myself, and suspect I'm harming myself in expectation by doing so, since I'm interested in doing future EA-funded work)

To be clear: a) I don't think all CEA staff have been bad - some I think highly of, and the vast majority I have no specific opinion of, just that the overall org has generally functioned ineffectively; b) most of the specific actions I have in mind date back at least a couple of years, before Max Dalton became ED; and c) I had a recent conversation with him and gave him these concerns, which he seemed somewhat open to. So it may be that they're in a much better state under him. But I'm also wary of under-new-management-itis, under which a nonprofit org can't be criticised for a couple of years after a change - which potentially puts the org beyond reproach if it cycles EDs often enough.

Dancer @ 2022-11-12T12:28 (+12)

But good on you for being brave enough to publicly criticise your funding sources ("I have received EA funding in multiple capacities, and feel quite constrained in my ability to criticise CEA publicly") or people you like ("I like everyone I've interacted with from CEA").

RobBensinger @ 2022-11-11T18:06 (+1)

I really like this comment, and I agree with it.

Will Bradshaw @ 2022-11-11T16:46 (+8)

++++

BrownHairedEevee @ 2022-11-12T08:13 (+2)

Why would being dismissed from CEA and being part of Leverage mean he has an axe to grind regarding SBF?

Arepo @ 2022-11-12T08:45 (+19)

Regarding 'top EA leaders' knowing about it (see further in the thread).

Kerry_Vaughan @ 2022-11-12T01:52 (+2)

He's now the program manager at a known cult that the EA movement has actively distanced itself from.


If you'd like to investigate whether Leverage was a cult, there are now several additional sources of information available.

One source is Cathleen's post, which is detailed, extensive, and written directly by a former employee. A board member also ran their own investigation into what Leverage could have done better between 2012 and 2019, interviewing former members of Leverage staff.

You can also view Leverage's website to learn more about what we've been working on post-2019. The fact that I work at Leverage is best explained by my having a very different view of the organization's history and current work than you do.   

In any case, I don't see why disagreements about the value of Leverage's current or past work have anything to do with the specific claims I've made about what happened at Alameda in 2018.  

hummus @ 2022-11-13T10:10 (+14)

I’d also recommend reading Zoe Curzi’s essay about her own (traumatic) experience at Leverage, the publishing of which was publicly supported by Leverage founder Geoff Anders.

jimrandomh @ 2022-11-11T22:50 (+55)

I heard the same claim, from a different source: that SBF did something unethical at Alameda Research prior to founding FTX, that some EAs had left Alameda saying that SBF was unethical and no one should work with him, and that there were privately circulated warnings to this effect. (The person I heard this from hasn't spoken publicly about it yet as far as I know. They are someone with no previous or current involvement with FTX or Alameda Research, who I think is reporting honestly and is well positioned to have heard such things.)

(EDIT: others along the rumor-path via which I heard this have now spoken on this thread, in greater detail than I have; so this comment is a duplicate report and should not be counted.)

CarolineJ @ 2022-11-11T17:58 (+33)

+1 for way more investigations and background checks for major donations, megaprojects, and association with EA.

DaneelO @ 2022-11-11T23:57 (+6)

I think this suggests that the EA orgs which had close ties to FTX and SBF should have investigations performed by outside parties. If this is true, it makes the situation even worse than it appears at the moment, since it could have been prevented by higher ethical standards.

Rona Tobolsky @ 2022-11-11T03:28 (+120)

Thank you so much for your time, dedication, and efforts.

It seems like, for many of us, difficult times lie ahead. Let us not forget the power of our community - a community of brilliant, kind-hearted, caring people trying to do good better together.

This is a crisis - but we have the ability to overcome it.

DonyChristie @ 2022-11-11T05:58 (+106)

I was really looking forward to maybe implementing impact markets in collaboration with Future Fund plus FTX proper if you and they wanted, and feel numb with regard to this shocking turn. I really believed FTX had some shot at 'being the best financial hub in the world', SBF 'becoming a trillionaire', and this longshot notion I had of impact certificates being integrated into the exchange, funding billions of dollars of EA causes through it in the best world. This felt so cool and far out to imagine. I woke up two days ago and this dream is now ash. I have spiritually entangled myself with this disaster.

I don't want to be the first commenter to be that guy, and forgive me if I'm poking a wound, but when you have the time and slack can you please explain to us to what extent you guys grilled FTX leadership about the integrity of the sources of money they were giving you? Surely you had an inside view model of how risky this was if it blew up? If it's true SBF has had a history of acting unethically before (rumors, I don't know), isn't that something to have thoroughly questioned and spoken against? If there was anyone non-FTX who could have pressured them to act ethically, it would have been you. As an outsider it felt like y'all were in a highly trusted concerted relationship with each other going back a decade.

In any case, thank you for what you've done.

Greg_Colbourn @ 2022-11-11T08:36 (+142)

Sven Rone should've won a prize in the Red Teaming contest[1]:

The Effective Altruism movement is not above conflicts of interest 

[published Sep 1st 2022]

Summary

Sam Bankman-Fried, founder of the cryptocurrency exchange FTX, is a major donator to the Effective Altruism ecosystem and has pledged to eventually donate his entire fortune to causes aligned with Effective Altruism. By relying heavily on ultra-wealthy individuals like Sam Bankman-Fried for funding, the Effective Altruism community is incentivized to accept political stances and moral judgments based on their alignment with the interests of its wealthy donators, instead of relying on a careful and rational examination of the quality and merits of these ideas. Yet, the Effective Altruism community does not appear to recognize that this creates potential conflicts with its stated mission of doing the most good by adhering to high standards of rationality and critical thought.

In practice, Sam Bankman-Fried has enjoyed highly-favourable coverage from 80,000 Hours, an important actor in the Effective Altruism ecosystem. Given his donations to Effective Altruism, 80,000 Hours is, almost by definition, in a conflict of interest when it comes to communicating about Sam Bankman-Fried and his professional activities. This raises obvious questions regarding the trustworthiness of 80,000 Hours’ coverage of Sam Bankman-Fried and of topics his interests are linked with (quantitative trading, cryptocurrency, the FTX firm…).

In this post, I argue that the Effective Altruism movement has failed to identify and publicize its own potential conflicts of interest. This failure reflects poorly on the quality of the standards the Effective Altruism movement holds itself to. Therefore, I invite outsiders and Effective Altruists alike to keep a healthy level of skepticism in mind when examining areas of the discourse and action of the Effective Altruism community that are susceptible to being affected by incentives conflicting with its stated mission. These incentives are not just financial in nature; they can also be linked to influence or prestige, or even emerge from personal friendships or other social dynamics. The Effective Altruism movement is not above being influenced by such incentives, and it seems urgent that it acts to minimize conflicts of interest.

(Note that this issue was commented on here a month ago.) This whole thing is now starting to look like the classic "ends justify the means" criticism of Utilitarianism writ large :(

  1. ^

    although it looks like it wasn't actually entered? Edit: it was, but not posted as a top-level post on the EA Forum (see comments below).

Stuart Buck @ 2022-11-11T14:16 (+183)

I wrote that comment from over a month ago. And I actually followed it up with a more scathing comment that got downvoted a lot, and that I deleted out of a bit of cowardice, I suppose. But here's the text: 

Consider this bit from the origin story of FTX:

In 2019, he took some of the profits from Alameda and $8 million raised from a few smaller VC firms and launched FTX. He quickly sold a slice to Binance, the world’s biggest crypto exchange by volume, for about $70 million. 

Binance, you say? This Binance?

During this period, Binance processed transactions totalling at least $2.35 billion stemming from hacks, investment frauds and illegal drug sales, Reuters calculated from an examination of court records, statements by law enforcement and blockchain data, compiled for the news agency by two blockchain analysis firms. Two industry experts reviewed the calculation and agreed with the estimate.

Separately, crypto researcher Chainalysis, hired by U.S. government agencies to track illegal flows, concluded in a 2020 report that Binance received criminal funds totalling $770 million in 2019 alone, more than any other crypto exchange. Binance CEO Changpeng Zhao accused Chainalysis on Twitter of “bad business etiquette.”

Or consider FTX's hiring of Daniel Friedberg as a chief compliance officer. This article claims that he had been involved in previous cheating/fraud at other businesses:  

Crypto’s ongoing addiction to the Tether stablecoin is nearly as alarming as the sector’s questionable embrace of lawyers linked to online gambling fraud. . . . 

If one ever doubted the insincerity of SBF’s compliance commitment, . . . the company’s former GC, Daniel S. Friedberg, is now FTX’s new chief compliance officer, a role for which Friedberg is almost comically inappropriate. . . . 

Friedberg’s presence on FTX’s payroll means Sam Bankman-Fried (SBF) either didn’t do his due diligence before hiring, or he knew of Friedberg’s past sins and didn’t care. Neither of these options paints Sam Bankman-Fried in an overly flattering light.

Then there are all the recent examples of FTX trying to buy up other crypto players. For example, in July, FTX signed a deal to buy BlockFi for up to $240 million, and to give it $400 million in revolving credit. BlockFi is most famous for having agreed to pay $100 million in penalties for its securities fraud. It's not clear why FTX would want to spend this amount of money on buying a fraudulent firm. 

Just last week, there was a story that FTX is thinking about buying Celsius, another fraudulent firm. 

Another story from July had the remarkable claim that SBF is even thinking of putting his own cash into bailing out other crypto firms: 

On one or two occasions, Bankman-Fried, who made billions arbitraging cryptocurrency prices in Asia beginning in 2017, said he has used his own cash to backstop failing crypto companies when it didn’t make sense for FTX to do so.

“FTX has shareholders and we have a duty to do reasonable things by them and I certainly feel more comfortable incinerating my own money,” he said.

Why are FTX, and perhaps SBF himself, putting so much money into buying up other people's scams? I would hope it's because they intend to reform the crypto industry and put it on more of a moral footing, although that would reduce the market size by an order of magnitude or two.

***

At least, SBF and FTX ought to provide more transparency into where exactly all the wealth came from, and what (if anything) they are actively doing to prevent crypto frauds/scams. And one might argue that FTX Foundation has a particular moral duty to establish a fund to help out all of the people whose lives were ruined by falling for crypto's many Ponzi schemes and other assorted scams. 

There are about a hundred different letters to the judge, many of them speaking of the fear and depression caused by having money that they desperately need to help keep their families fed locked up in Celsius’ coffers. Each one feels more desperate than the last, with some speaking of money they’d lost because - ironic, considering why Celsius is insolvent - they weren’t able to cover their margin calls, as the platform had stopped accepting new collateral, meaning that they had loans liquidated through no fault of their own. One woman from Australia spoke of needing money to pay for the hospital when her third child was born, her life savings locked up in the platform. In fact, many people talk about losing their life savings, and others speak of having little cash to their name, the majority trapped in Celsius.

IanDavidMoss @ 2022-11-11T14:41 (+182)

Wow, I didn't see it at the time but this was really well written and documented. I'm sorry it got downvoted so much and think that reflects quite poorly on Forum voting norms and epistemics.

Danny Donabedian @ 2022-11-11T18:45 (+48)

Moreover, Sven Rone is a pseudonym. The author used a pen name as their views were unpopular and underappreciated at the time; they likely feared career repercussions if they went public with it. It's unfortunate that this was the environment they found themselves in.

Arepo @ 2022-11-11T16:19 (+43)

Seconded. This whole saga has really made me sour on some already mixed views on EA epistemics.

Sharmake @ 2022-11-11T17:50 (+17)

I find myself having a mixed opinion of how EA responded. It wasn't outright terrible epistemics, unlike most of the world's reaction to a similar event, but there were real failures of epistemics.

On the other hand, there were also successes in EA epistemics.

Lukas_Gloor @ 2022-11-11T20:57 (+26)

I think the post ended up around 0 or 1 karma, is that right? (I mean before people changed their voting based on hindsight!) I think it's important to distinguish between "got downvoted a lot but ended up at neutral karma" vs. "got downvoted double digits into no longer being visible." The former reflects somewhat poorly on EA, the latter very poorly. 

sphor @ 2022-11-11T21:33 (+28)

I think the most informative signal here is not the exact karma that comment ended up with but rather that the author ended up deleting it despite believing that what he was saying was potentially important and not receiving any reasons to think he was wrong. A culture where people feel compelled to silence themselves is worse than one where some comments are wrongly downvoted without much consequence to the author. 

RobBensinger @ 2022-11-12T02:08 (+21)

I think the most important data points here are any comments that were left, and the net karma of the comment. People have in fact been known to overreact, or react in idiosyncratic ways, in forum discussions; I haven't seen the thread in question, but if the responses were friendly and the comment got ~0 net karma, then that would be a large update for me.

I definitely took "that got downvoted a lot" to mean that the comment got a lot of net downvotes, not just that people offset its upvotes to keep it around a neutral 0. I think it's pretty bad to describe vote patterns that misleadingly, if it was hovering around 0.

Lukas_Gloor @ 2022-11-11T21:38 (+3)

Good point. :S

Sam Elder @ 2022-11-14T03:06 (+2)

Are we talking about this deleted comment? It has 6 overall karma in 9 votes, and -3 agreement in 5 votes.

Lukas_Gloor @ 2022-11-14T10:28 (+2)

No, I was talking about Stuart Buck's initial comment in that same thread, which is still up and now has high upvotes.

But Stuart also mentioned he deleted a second comment after it got downvoted too, so that must be the one you're linking to. (We also don't know if some people retroactively upvoted the deleted comment; it's at +6 now but could've been negative at the time of deletion. I think I'm still able to vote on the deleted comment – though maybe that's just because I had already voted on it before it got deleted [strong upvote and weak agree vote].)

Sam Elder @ 2022-11-14T12:17 (+3)

Either way it seems highly unlikely that the deleted comment I linked to had lots of negative votes. It had a few disagree votes but very likely not more than 1-2 karma downvotes.

John_Maxwell @ 2022-11-13T04:16 (+20)

I like how Hacker News hides comment scores. Seems to me that seeing a comment's score before reading it makes it harder to form an independent impression.

I fairly frequently find myself thinking something like: "this comment seems fine/interesting and yet it's got a bunch of downvotes; the downvoters must know something I don't, so I shouldn't upvote". If others also reason this way, the net effect is herd behavior? What if I only saw a comment's score after voting/opting not to vote?

Maybe quadratic voting could help, by encouraging everyone to focus their voting on self-perceived areas of expertise? Commenters should be trying to impress a narrow & sophisticated audience instead of a broad & shallow one?

EDIT: Another thought: If there were a way I could see my recent votes, I could go back and reflect on them to ensure I'm voting in a consistent manner across threads.
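
For what it's worth, here is a minimal sketch of the quadratic voting idea, assuming the standard rule that casting n votes on a single item costs n^2 credits out of a fixed budget. The budget size and comment names are made up for illustration:

    BUDGET = 100  # voice credits per user per period (illustrative)

    def vote_cost(votes: int) -> int:
        # Quadratic rule: each additional vote on one item gets pricier
        return votes ** 2

    def credits_spent(allocations: dict[str, int]) -> int:
        """Total credits spent across comments; must stay within BUDGET."""
        return sum(vote_cost(v) for v in allocations.values())

    # Spreading votes thinly is cheap; piling onto one comment is expensive:
    focused = {"comment_a": 9}                       # 81 credits buy 9 votes
    spread = {c: 3 for c in ("a", "b", "c", "d")}    # 36 credits buy 12 votes
    assert credits_spent(focused) <= BUDGET
    assert credits_spent(spread) <= BUDGET

Because marginal votes get more expensive, voters are nudged to concentrate their strong votes only where they care or know most, which is exactly the "narrow & sophisticated audience" effect described above.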

EliezerYudkowsky @ 2022-11-12T02:25 (+38)

I think that what FTX is accused of in this comment is legitimately way more the kind of thing where a charitable recipient is not morally obliged to demand this level of careful checking of everything, because our civilization is just not actually able to support this level of competency pornography.

Stealing your customers' funds is a very different matter from "some of the people who use our services are criminals".  Why, MIRI has in the past accepted matching funds from Google, which I'm sure profits a whole lot off criminals using their services!  And some of those criminals may even be bad people!

But you can't, actually, run a post-agricultural civilization on the principle of everybody who engages in every transaction checking out the full moral character of everybody who transacts with them.  If you did try to build clever infrastructure for that, its first use on the margin would be by the right to hunt down sex workers (as already occurs with Visa) and by the left to hunt down people who said bad things on Twitter.

In a hunter-gatherer tribe it maybe makes sense to demand that people not transact with that bad guy over there; it scales as far as it needs to scale.  And MIRI would not take money from somebody who we knew had stolen in charity's name.  But to figure it all out - if you want to read about Civilizations that have the basic infrastructure and competence to run those kinds of traces, go read science fiction; here on Earth you've got VC firms trying to run six months of due diligence and then they invest in FTX.

Sarah Levin @ 2022-11-12T02:42 (+30)

IMO the amount of diligence someone ought to perform on their counterparties' character is different in different circumstances. "This person is one of hundreds of people I transact with every week" carries different obligations than "This person is one of the four big donors who fund my organization" carries different obligations than "This person has been my only source of income for the past two years". Different EAs were at different points along this spectrum.

Stuart Buck @ 2022-11-12T18:20 (+6)

I generally agree with you, but in this case SBF 

1) hired a high-level person with a long history of fraud (you don't see Asana or Stripe doing this); and 

2) described his own business as a Ponzi scheme (see https://www.bloomberg.com/news/articles/2022-04-25/sam-bankman-fried-described-yield-farming-and-left-matt-levine-stunned ). 

It was obvious that he was up to no good. 

"But Sequoia" -- I'm not convinced that they did any due diligence, judging by what they published on their own website: https://twitter.com/zebulgar/status/1590394857474109441  It's not the only occasion when it looks to me like "top" VC firms leapt into investments out of FOMO, with zero effort at due diligence: https://medium.com/swlh/why-are-investors-eager-to-lose-money-on-health-tech-f8c678ccc417 

sphor @ 2022-11-11T14:54 (+18)

Quoting from the article you linked about the involvement of Daniel Friedberg, FTX's Chief Regulatory Officer, in a previous scandal:

In 2008, online poker site Ultimate Bet (UB) publicly confirmed rumors that certain individuals had utilized a little-known feature of the site’s software to view players’ hole cards during hands. This so-called ‘god mode’ allowed a number of ‘super users’ to cheat opponents out of tens of millions in poker winnings. The site’s operators begrudgingly paid out a few million to the loudest complainers and folded the site’s operations into a sister site (which was dealing with its own scandals).

In 2013, an audio recording surfaced that made mincemeat of UB’s original version of events. The recording of an early 2008 meeting with the principal cheater (Russ Hamilton) features Daniel S. Friedberg actively conspiring with the other principals in attendance to (a) publicly obfuscate the source of the cheating, (b) minimize the amount of restitution made to players, and (c) force shareholders to shoulder most of the bill.

On the tape, Daniel S. Friedberg tells Hamilton that he doesn’t want news of the cheating scandal to get out, but if it must, the “ideal thing” would be for the public to be told that a “former consultant to the company, uh, took advantage of a server flaw by hacking into the [software] client.” Friedberg advises Hamilton to publicly claim that he was among the victims of this cheating, “otherwise [the cover story’s] not going to fly.”

Regarding how many millions the site would have to cough up—both in returns to players and regulatory penalties—Friedberg says “if we could get it down to five, I’d be happy.” This is despite Friedberg knowing the real sum owed was many multiples of that number. Friedberg later says that achieving this $5 million target is possible, “depending how creative we get.”

Friedberg also emphasizes the need to shift responsibility for the payout to Excapsa, the holding company that owned UB’s software during the period in which some of the cheating took place. Friedberg discusses naming an Excapsa employee as having prior knowledge of the cheating, because “in order to get to Excapsa’s money legally you almost have to show fraud.”

Nathan Young @ 2022-11-11T21:56 (+14)

Sorry, this text got heavily downvoted? If so, we should be ashamed.

Danny Donabedian @ 2022-11-12T01:15 (+11)

"They tell you to do your thing but they don't mean it. They don't want you to do your own thing, not unless it happens to be their thing, too. It's a laugh, Goober, a fake. Don't disturb the universe, Goober, no matter what the posters say.” - Robert Cormier, the Chocolate War

Yes, we should. People hesitate or are averse to bringing issues up with authorities/communities due to fears of being punished. As groups collectivize and become increasingly memetically homogeneous - which coincides with the solidification of power/influence/financial structures and hierarchies - dissent of any form becomes decreasingly tolerated. It becomes safer/easier to criticize EA as an outsider than as a member who simultaneously wants to grow in EA, be well received by potential EA organization employers, and rise up the oft-unstated hierarchies that developed as EA blossomed.

Until this debacle, SBF was lionized beyond comparison by the major community organizations. Moreover, he was closely associated with EA giants via the foundation/future fund and other projects. He had an excellent PR presence due to the constant EA-affiliated media attention. He was 80k's paragon of earning to give.

That's not to say figures like him were untouchable (nothing in EA is untouchable, fortunately), but criticizing the most popular embodiment of success would result in online backlash at best or, at worst, damage to the critic's career capital. That is precisely why, in a situation similar to Stuart's, Sven's essay on conflicts of interest in EA was anonymous. It's also why it didn't even get an honorable mention in the essay competition. Even if the criticisms themselves were valid and justified, the PR risks of promoting dissent made sure it wasn't given a prize. Demands for greater transparency or accountability from EA vanguards in the wake of recent developments may also be viewed instinctively or intuitively as threats to harmony.

Not everyone enjoys having beloved paragons and prophets criticized. Not everyone likes having their faith or trust in institutions shattered, let alone challenged. Not everyone maintains a cynical, skeptical attitude towards those in authority positions. During EA training newcomers certainly aren't prepared for such developments, perhaps because events like such are not expected to ever come up in the first place. 

It remains a problem the community has faced since day one, although much of it is attributable to hierarchical and tribalistic human psychology rather than to EA itself. While EA has better epistemics and remains more open to criticism than the average ideological movement, harshness or cynical sternness used to be (in EA's early days) much more commonplace and welcomed than it is now. As EA has grown and become more of a community, intra-group harmonic cohesion became increasingly prized and promoted. Those who elicit controversy by means of intellectual dissent (rather than conforming) are at a higher likelihood of being downvoted.


Spouting off this stuff isn't productive on my end. I don't have a solution, but there need to be better ways to increase receptiveness towards contrarian/unpopular takes, minimizing unjustified repercussions for dissenters. Those who are harshest or most skeptical among EAs should not be dismissed as impediments to progress. I have faith that EA has the capacity to ameliorate this.

Denkenberger @ 2022-11-13T22:44 (+3)

By the way, it looks like the comment is now heavily upvoted. I've seen this happen quite a few times, so it seems like it might be good to withhold judgment about the net votes for a day or two. But of course it could be that it became highly upvoted because of reactions like this, so I'm not sure what the best course of action is.

Sharmake @ 2022-11-11T14:42 (+9)

I don't follow crypto, or its space, but this seems like a bad habit or norm, to downvote pieces that criticize EA's questionable relationship to crypto.

Cryptocurrency doesn't actually work, and only is there for scams and fraud. Not surprising that FTX collapsed.

Zach Furman @ 2022-11-11T16:41 (+53)

I think you may be getting a lot of disagree-votes because I don't think crypto was the issue here. People who just have USD sitting in FTX right now lost their money too.

FTX shouldn't have been risky. It wasn't a DAO, or based entirely off some token or chain; it was an exchange. It should have just been connecting people who wanted to buy crypto with people who wanted to sell crypto, and taking a fee for doing this. The exchange itself shouldn't be taking any risk.

How this happened looks, at least in part, to be about leveraged transactions: allowing customers to buy more crypto by supplementing their purchase with a loan. But we've let leveraged transactions happen with stock for a hundred years. This looks a lot more like garden-variety financial crime than some problem with crypto.
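
To make the mechanics concrete, here is a minimal sketch of how a leveraged purchase shifts risk onto the exchange once the customer's collateral is exhausted. All the numbers are illustrative:

    deposit = 1_000.0          # customer's own collateral (USD)
    leverage = 5               # customer buys 5x their deposit
    position = deposit * leverage
    loan = position - deposit  # 4,000 borrowed from the exchange

    # If the asset falls, losses come out of the customer's collateral first:
    for drop in (0.05, 0.10, 0.20, 0.25):
        value = position * (1 - drop)
        equity = value - loan  # what's left of the customer's stake
        status = "wiped out; further losses fall on the exchange" if equity <= 0 else "ok"
        print(f"{drop:.0%} drop: customer equity ${equity:,.0f} ({status})")

At a 20% drop the $1,000 of collateral is gone; past that point the exchange itself is underwater unless it liquidated the position in time, which is why a properly run exchange force-closes leveraged positions well before equity hits zero.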

Kevin_Cornbob @ 2022-11-12T22:14 (+15)

Here's a quote from former US Treasury Secretary Larry Summers in a recent Bloomberg interview that backs up some of the claims in this comment: 

A lot of people have compared this to Lehman. I would compare it to Enron - the smartest guys in the room, not just financial error, but certainly from the reports, whiffs of fraud. Stadium namings very early in a company's history. Vast explosion of wealth that nobody quite understands where it comes from. 

[...] I think this is probably less about the complexities of the nuances of the rules of crypto regulation and more about some very basic financial principles that go back to financial scandals that took place in ancient Rome.

Sharmake @ 2022-11-11T16:43 (+15)

Sorry for misfiring here, I'll retract my comment.

Jason @ 2022-11-11T18:35 (+4)

The relation to crypto is that the bulk of crypto is poorly regulated. Some of that is solvable -- well regulated exchanges should be possible. The extreme volatility also increases the temptation toward fraud. So the fraud risk is higher than in a well-regulated industry.

I'd submit that a well-regulated and managed exchange is going to find it much harder to achieve a stratospheric valuation, and other parts of crypto are harder to regulate well. So some skepticism toward huge crypto-linked donors is warranted.

Geoffrey Miller @ 2022-11-11T19:24 (+3)

More crypto regulation is coming, and many crypto protocols have worked hard already to be regulatory-compliant. But regulation won't be uniform across jurisdictions; there will always be loopholes that allow regulatory arbitrage.

Some exchanges, such as Coinbase and Kraken, are based and regulated within the US, and are subject to much stricter oversight than FTX -- which seems to have been deliberately based in Hong Kong and then the Bahamas precisely in order to avoid US regulatory oversight. (Arguably, this should have been a red flag in terms of EA's relationship with FTX.)

The US, UK, or EU can regulate all they want, but crypto finance is a global business, and there are plenty of less-regulated havens willing to host crypto businesses. 

Hopefully crypto investors, traders, and users will become savvier about checking where businesses are operating, and what regulatory scrutiny they're subject to.

Jason @ 2022-11-11T19:58 (+1)

Agreed on that. My point was that it would be a lot harder for an individual to get super-rich quick in a regulated market. No sane regulator is going to allow a regulated party to risk customer assets for the party's benefit, and few will allow crazy leverage. And the whole thing will require significantly more of a buffer in fiat currency, again limiting any single person's ability to get megarich.

In short, I think there are few ways for a well-regulated exchange to be stratospherically profitable. So people should not expect the rise of new crypto megadonors who hail from regulated backgrounds.

Zach Furman @ 2022-11-11T21:31 (+2)

I would agree with this. Separate from the object-level causes of the current crisis, crypto as an industry has accepted and normalized a lack of accountability that other industries haven't. And I agree that lack of regulation and high volatility make fraud more likely.

I would want to avoid purely focusing on crypto, because I think the meta-lesson I might take away is less "crypto bad" and more "make sure donors and influential community members are accountable," whether that be to regulators, independent audits, or otherwise. (And accountable in a real due diligence sense, because it's easy for that word to just be an applause light.) But yes, skepticism of crypto-linked donors would be justified under this framework.

Sabs @ 2022-11-12T09:59 (+2)

I have no idea why this comment is no longer endorsed by its author because it's entirely correct. Not only is crypto a great way to scam people because transactions can't be reversed & there's virtually no regulation for most of the space, the fact that it's so hard to make money in crypto across an entire cycle means that entities have a huge incentive to resort to scamming. 

Isaac King @ 2022-11-15T03:34 (+14)

I can tell you why I downvoted it.

Cryptocurrency doesn't actually work

False, it works just fine. It's a token that can't be duplicated and people can send to each other without any centralized authority.

and only is there for scams and fraud.

There are indeed a lot of those, but scams and fraud were very clearly not the intention of its creators. Realistically they were cryptography nerds who wanted to make something cool, or libertarians with overly-idealistic visions of the future.

Not surprising that FTX collapsed.

Clear hindsight bias. This person should have made some money betting against FTX before it collapsed and then I'd take them more seriously.

Basically, the comment is just your standard "cryptocurrency bad" take, without any attempt at justifying their claims or even saying much of anything other than expressing in an inflammatory way that they don't like cryptocurrency.

Sabs @ 2022-11-15T07:39 (+16)

"This person should have made some money betting against FTX before it collapsed and then I'd take them more seriously."

this is naive EMH fundamentalism

Not everything can be shorted, not everything can be shorted easily, not everything should be shorted, and markets can be manipulated - especially the crypto market. It can simultaneously be the case that people 100% think X is a fraud, that X collapses, and that shorting X would have been a losing trade over most timeframes. "Never short" is an oversimplification, but honestly not a bad one.
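
As a minimal sketch of the point (all prices and fees made up), a short position can be force-closed by a rally even when the fraud thesis is eventually vindicated:

    entry = 100.0       # short 1 unit at $100; thesis: it's a fraud, going to ~0
    collateral = 50.0   # margin posted; position dies if losses exceed it
    borrow_fee = 0.5    # per period, cost of borrowing the asset to short it

    path = [100, 140, 160, 90, 10]  # the asset rallies hard before collapsing

    for t, price in enumerate(path, start=1):
        pnl = entry - price  # a short profits as the price falls
        equity = collateral + pnl - borrow_fee * t
        print(f"t={t} price={price} equity={equity:+.1f}")
        if equity <= 0:
            print("margin call: force-closed before the collapse ever arrives")
            break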

Czynski @ 2022-11-13T22:42 (+8)

Most of that isn't even clearly bad, and I find it hard to see good faith here. 

Your criticism of Binance amounts to "it's cryptocurrency". Everyone knows crypto can be used to facilitate money laundering; this was, for Bitcoin, basically the whole point. Similarly the criticism of Ponzi schemes; there were literally dozens of ICOs for things that were overtly labeled as Ponzis - Ponzicoin was one of the more successful ones, because it had a good name. Many people walked into this with eyes open; many others didn't, but they were warned, they just didn't heed the warnings. Should we also refuse to take money from anyone who bets against r/wallstreetbets and Robinhood? Casinos? Anyone who runs a platform for sports bets? Prediction markets? Your logic would condemn them all.

It's not clear why FTX would want to spend this amount of money on buying a fraudulent firm.

FTX  would prefer that the crypto sector stay healthy, and backstopping companies whose schemes were failing serves that goal. That is an entirely sufficient explanation and one with no clear ethical issues or moral hazard.

Even in retrospect, I think this was bad criticism and it was correct to downvote it.

Stuart Buck @ 2022-11-13T23:55 (+9)

My criticism of Binance was not "it's cryptocurrency." My criticism of Binance was that at the very time that SBF allied with Binance, it was a "hub for hackers, fraudsters and drug traffickers." Apparently your defense of SBF is that "everyone knows" crypto is good for little else . . . but perhaps if someone enters a field that is mostly or entirely occupied by criminal activity, that isn't actually an excuse.

As for backstopping other scams and frauds, that isn't a way to make sure that the "crypto sector stays healthy" (barring very unusual definitions of the word "healthy"), and in actuality, we're now seeing evidence that FTX was just trying to extract assets from other companies in a desperate attempt to shore up their own malfeasance and fraud. https://twitter.com/AutismCapital/status/1591569275642589184 

Stuart Buck @ 2022-11-14T14:33 (+6)

The only flaw in my earlier comment is that I was too charitable towards SBF in suggesting that there might be some plausible excuse for the multiple red flags I noticed. 

bruce @ 2022-11-11T13:18 (+42)

Thanks for this! I echo Lizka's comment about linkposting.

In light of the recent events I'm struggling a bit with taking my hindsight-bias shades off, and while I scored it reasonably highly, I don't think I can fairly engage with whether it should have received a prize over other entries even if I had the capacity to (let alone speak for other panelists). I do remember including it in the comment mainly because I thought it was a risk that didn't receive enough attention and was worth highlighting (though I have a pretty limited understanding of the crypto space and had ~0 clue that things would happen in the way they did).

I think it's worth noting that there has been at least one other post on the forum that engaged with this specifically, but unfortunately didn't receive much attention. (Edit: another one here)

Ultimately though, I think it's more important to think about what actionable and constructive steps the EA community can take going forward. I think there are a lot of unanswered questions wrt accountability from EA leaders - what due diligence was done, what was known or could have been suspected prior to Nov 9th this year, what systems or checks/balances were in place, etc. - that need to be answered so the community can work out what the best next steps are in order to minimise the likelihood of something like this happening again.

I also think there are questions around how these kinds of decisions are made when benefits affect one part of the EA community but the risks are pertinent to all, and how to either diversify these risks, or make decision-making more inclusive of more stakeholders, keeping in mind the best interests of the EA movement as a whole.

This is something I'm considering working on at the moment and will try and push for - do feel free to DM me if you have thoughts, ideas, or information.

(Commenting in personal capacity etc)

Czynski @ 2022-11-13T22:46 (+10)

Strongly disagree. That criticism is mostly orthogonal to the actual problems that surfaced. Conflicts of interest were not the problem here.

Random @ 2022-11-11T11:02 (+6)

It seems it was entered, according to the (second) comment from Bruce here: Winners Red Teaming

Greg_Colbourn @ 2022-11-11T11:26 (+4)

Thanks (link to the comment). I think those entries really should've been put on the EA Forum as posts to be interacted with (like with the Future Fund AI Worldview Prize[1])

  1. ^

    Which I imagine is no longer happening :(

Lizka @ 2022-11-11T11:56 (+8)

Yeah, I can confirm that we evaluated that submission. 

Re: putting them on the Forum — we didn't have the capacity to do that (and I'm not sure it would have been helpful to do that for all the submissions), but in general, I really encourage people to link-post relevant content to the EA Forum. So, you could link-post this (or similar content in the future).

[I should note that I have low capacity right now and might not reply to this thread. Apologies in advance!]

Jason @ 2022-11-11T13:15 (+54)

FTX had received several billion dollars in funding from major investors. One was a provincial pension fund, so it wasn't just crypto folks. That generally involves having the investors' accountants do substantial due diligence on the target firm's financials. That tells me that either the books were fairly clean at the time of investment or they were cooked in a way that even the due diligence specialists didn't detect. It's not clear to me how the Future Fund people, who to my knowledge are not forensic accountants or crypto experts, would have had a better ability to pick up on funny business. So I don't see why it would be unreasonable for them to have relied on third-party expert vetting.

Geuss @ 2022-11-11T13:36 (+18)

From what I understand (please correct me if I'm wrong), FTX didn't have a CFO, its COO was a friend with no experience, and it didn't have a proper board of directors. Clearly, that flimsy corporate governance would not pass a standard due diligence test.

EDIT: This flow chart of shells nested in shells, like Russian dolls, speaks to why the company's governance should have been a red flag.

https://i.redd.it/078p4g7m6cz91.jpg 

Jeff Kaufman @ 2022-11-11T17:56 (+28)

this flow chart of shells nested in shells, like Russian dolls, speaks to why the company's governance should have been a red flag.

I don't think a highly branched company structure is a red flag: my understanding is that to operate a financial business legally across many jurisdictions you generally need to have subsidiaries in each jurisdiction. Ex: https://wise.com/help/articles/2974131/what-are-the-wise-group-entities

Jason @ 2022-11-11T18:56 (+20)

In the autopsy, the biggest red flag will probably be the lack of appropriate internal controls. One should not be able to move that kind of money without vetting by staff with an appropriate background, independence, and no ownership interest. Based on the reported en masse resignation of the bulk of legal and compliance staff, it seems that it was technically possible to transfer billions in customer assets to the CEO's company without legal/compliance involvement.

Jason @ 2022-11-11T16:38 (+10)

I think the class of issues that would make it inappropriate to accept donations is much narrower than the issues that would and should make a public investor (like a province pension fund) decline to invest.

Few private businesses are going to let an outsider come in on a regular basis, conduct a hard look at sensitive internal documents, and potentially publish derogatory information to the public. Even for investors, this kind of scrutiny is generally done under a heavy NDA, and for good reason. That would make it extremely difficult to repeat regularly -- so any scrutiny would at best catch fraud that existed at the time it was performed.

Stuart Buck @ 2022-11-11T14:55 (+14)

I wouldn't be very confident in the level of due diligence undertaken by supposedly sophisticated investors: 

https://twitter.com/zebulgar/status/1590394857474109441

Pablo @ 2022-11-12T18:54 (+10)

This just isn't plausible on reasonable priors. You need to assume that multiple investment firms working in different sectors, whose survival in a highly competitive environment in large part depends on being skilled at scrutinizing a company's financials, would miss warning signs that should have been apparent to folks with no relevant domain expertise. See also Eliezer's Twitter thread.

ETA: Alexander:

Some people are asking whether people who accepted FTX money should have “seen the red flags” or “done more due diligence”. Sometimes this is from outsider critics of effective altruism. More often it’s been effective altruists themselves, obsessively beating themselves up over dumb things like “I met an FTX employee once and he seemed to be frowning, why didn’t I realize that this meant they were all committing fraud?!” Listen: there’s a word for the activity of figuring out which financial entities are better or worse at their business than everyone else thinks, maximizing your exposure to the good ones, and minimizing your exposure to the bad ones. That word is “finance”. If you think you’re better at it than all the VCs, billionaires, and traders who trusted FTX - and better than all the competitors and hostile media outlets who tried to attack FTX on unrelated things while missing the actual disaster lurking below the surface - then please start a company, make $10 billion, and donate it to the victims of the last group of EAs who thought they were better at finance than everyone else in the world. Otherwise, please chill.

DaneelO @ 2022-11-12T19:57 (+3)

I would disagree; there are numerous examples, such as Theranos and WeWork, which show that sophisticated investors do not necessarily scrutinize potential investments thoroughly. Thus I don't think assuming they do is a good prior. I think this is actually a reason these problems happen, since everyone else assumes that Respectable Company/Person X has scrutinized it.

Pablo @ 2022-11-12T20:19 (+17)

I am making a comparative, not an absolute, claim: however bad the professionals may be, it is unreasonable to expect outsiders to do better.

DaneelO @ 2022-11-12T20:56 (+2)

I agree with the point that in general one should expect less from "unsophisticated" investors/parties than from sophisticated ones. I do not disagree with that. 

I was disagreeing with "This just isn't plausible on reasonable priors." which seemed to mean that you disagreed with Stuart's comment.

But I also don't think VC scrutiny is necessarily a high bar in general in the absolute sense, and Stuart has posted some warning signs here in other comments, such as the hiring of Friedberg. And considering how important FTX and SBF were to the EA community, it could have been investigated more, i.e. the low VC-scrutiny bar could have been surpassed by hiring experts or something similar. To a VC firm, this is just another losing bet among the many they expect to make. This is why I don't think the comparison with VC firms is very apt.

Pablo @ 2022-11-12T21:24 (+9)

I was disagreeing with "This just isn't plausible on reasonable priors." which seemed to mean that you disagreed with Stuart's comment.

Stuart's comment was in reply to the claim that "It's not clear to me how the Future Fund people, who to my knowledge are not forensic accountants or crypto experts, would have had a better ability to pick up on funny business." I disagreed with Stuart's comment in the sense that I disputed the reasonableness of expecting unsophisticated outsiders to do better merely because sophisticated investors sometimes perform poorly. I did not mean to dispute that sophisticated investors sometimes perform poorly; indeed, there's plenty of evidence of that, including the evidence you provide in your comment.

DaneelO @ 2022-11-12T21:46 (+4)

Yeah that makes sense, I think I overinterpreted your comments.

Pablo @ 2022-11-12T21:52 (+7)

In retrospect, I think my original comment was insufficiently clear. Anyway, thanks for the dialogue.

Pat Myron @ 2022-11-12T19:00 (+5)

And that's what Sequoia proudly and publicly posted themselves.

Sharmake @ 2022-11-11T13:26 (+5)

Or the best auditors are inadequate, and overlooked fairly obvious flaws for some reason.

projectionconfusion @ 2022-11-11T10:54 (+51)

Please feel free to "be that guy" as hard as possible when we are talking about massive financial fraud. 

RobBensinger @ 2022-11-11T06:12 (+49)

can you please explain to us to what extent you guys grilled FTX leadership about the integrity of the sources of money they were giving you? Surely you had an inside view model of how risky this was if it blew up?

This sounds a bit hindsight-bias-y to me; we know to poke at this specific topic now because we know what happened. SBF claims to not have known himself that this was happening, which I take to mean that either this info was super siloed or buried somehow, or that Sam is lying. (And is relying on few-to-no people knowing the truth, or someone would immediately call him out on the lie.)

Geuss @ 2022-11-11T10:57 (+60)

The idea that SBF didn't know what was happening is farcical. You don't unknowingly loan out $10bn of customer funds, which you then lose on bad bets, and then try and cover up your insolvency. I think it's healthy to wait for a clearer picture of what happened before making any summary judgement, but we know enough to say that SBF was not an honest actor.

Greg_Colbourn @ 2022-11-11T07:49 (+53)

To be honest, I'm at a point now where I'm putting significant weight on lying. Some evidence here that FTX bailed out Alameda for ~$4B in FTT on Sep 28th. There are the blockchain transactions (disclaimed by SBF at the time), and the resignation of a high-profile figure (President of FTX.US) the day before. (Note that whilst this doesn't look good, it's still inconclusive. I'm sure the truth will come out eventually.)

PeterMcCluskey @ 2022-11-11T16:42 (+19)

I agree that there's a lot of hindsight bias here, but I don't think that tweet tells us much.

My question for Dony is: what questions could we have asked FTX that would have helped? I'm pretty sure I wouldn't have detected any problems by grilling FTX. Maybe I'd have gotten some suspicions by grilling people who'd previously worked with SBF, but I can't think of what would have prompted me to do that.

Lukas_Gloor @ 2022-11-11T21:04 (+9)

There were IMO some orange flags (such as the connection to the questionable lawyer who also works with Tether), but admittedly I think it's difficult to notice such things when there's an aura of success around someone. I think it isn't just hindsight, though. I think people need to get a lot better at being cynical, because it's important. For instance, it was odd how FTX positioned itself as the savior of crypto by proposing to buy out entities like Voyager and BlockFi, and then it comes out that Alameda owed them money. They said they could "pay anytime," but it still looked weird.

Chris Leong @ 2022-11-11T06:07 (+32)

Hope you're feeling okay Dony.

PeterSlattery @ 2022-11-11T04:16 (+76)

[on phone] Thank you so much for all of your hard work managing the fund. I really appreciated it and I think that it did a lot of good. I doubt that you could ever have reasonably expected this outcome, so I don't hold you responsible for it.

Reading this announcement was surprisingly emotional for me. It made me realise how many exceptionally good people who I really admire are going to be deeply impacted by all of this. That's really sad in addition to all the other stuff to be sad about. I probably don't have much to offer other than my thoughts and sympathy but please let me know if I can help.

I suppose that I should disclose that I recently received a regrant from FTX which I will abstain from drawing on for the moment. I don't think that this has much, if any, relevance to my sentiments however.

Sabs @ 2022-11-11T06:47 (+13)

I would not draw on that grant for quite some time, if ever: you should be worried about clawbacks.

Leon_Lang @ 2022-11-11T09:56 (+9)

I have no idea under what circumstances clawbacks can happen. If you have good reasons to believe this is plausible, then it seems worth it to write a top level post on it.

Lorenzo Buonanno @ 2022-11-11T10:02 (+31)

https://forum.effectivealtruism.org/posts/BesfLENShzSMeb7Xi/community-support-given-ftx-situation?commentId=y7hEdxGhjsYzpg6p3 Open Philanthropy expects to put out an explainer about clawbacks tomorrow

Lawrence Newport @ 2022-11-11T14:33 (+11)

The comment below saying that this is like Bernie Madoff is not right as far as I can see. This is a different situation, with different facts - including that we have, as yet, no idea what those facts are! Your situation will also be individual - taking the funds as a limited company is different from taking them individually, for example, and will most likely have different effects. It is also entirely unknown what is happening. Nothing has been made clear officially, no one knows what's going on, and you - importantly - had nothing to do with any of the conduct that is being potentially alleged (not yet actually alleged by any authority).

I'm not giving legal advice here. I'm just saying that staying calm is the right response, and that googling Bernie Madoff (as suggested below) most likely won't be of any help.

Alex Catalán Flores @ 2022-11-11T09:28 (+54)

I commend you on your moral leadership and I join everyone else in the comments in expressing gratitude for the tremendous good you've done so far. However, I'm curious about your decision to resign. I get the moral justification, but surely there are many grantees with many questions who'd be able to get better answers were you still within the Future Fund. Something as simple as access to documents or previous emails would enable you to better support grantees who are likely in significant distress. Why did you see it as imperative to resign effective immediately? Why not at the very least see out your notice period?

sphor @ 2022-11-11T14:05 (+9)

How does it take moral leadership to distance yourself from and condemn massive fraud? Even entirely selfish actors would do the same. 

Markus Amalthea Magnuson @ 2022-11-11T10:15 (+7)

I'm curious about this as well. Does leaving immediately not impede the chances of getting a better (I'd never dare say "full") picture of what went down? Additionally, in terms of accountability, I guess now we'll never know or have records (from emails etc.) of who knew what and when.

Jason @ 2022-11-11T13:32 (+5)

I don't think staying on would add to what the insolvency trustee, regulatory authorities, and likely criminal prosecutors will uncover. The court has already appointed a liquidation trustee whose mission is preserving assets and does not include working with EA. It's unclear to me whether the trustee is in control of the FTX Foundation now, but the statement did say related entities. The FTX principals are doubtless preoccupied and are presumably attuned enough to their legal exposure not to be having unnecessary conversations.

juanbenet @ 2022-11-11T03:58 (+50)

Hey team -- thank you for all the work you did. The Future Fund has been tremendously inspiring to see. I'll reach out to you about how we (myself or Protocol Labs) might be able to help.

Dawn Drescher @ 2022-11-11T12:49 (+3)

Pooling the expertise of the Future Fund team and Protocol Labs would be amazing! <3

Abby Hoskin @ 2022-11-11T02:34 (+46)

Thanks very much for posting this update! 

Greg_Colbourn @ 2022-11-12T12:09 (+45)

My main question re the Future Fund at the moment is: why does it seem like there weren't any ring-fenced funds under legal ownership by the Future Fund or the FTX Foundation? Are there any? Were there any when it was founded last year (i.e. presumably when FTX/Alameda was still solvent)? If not, why not? Did this not raise suspicions amongst any of you?

I can imagine maybe SBF saying something like the max-EV thing to do is keeping all the funds in the for-profit companies to maximise their growth, and you going along with it because you trusted him (or you just independently agreed and didn't put any significant weight on FTX/Alameda collapsing or even just becoming less rich). Obviously an error in hindsight. Or maybe you kept asking about getting (more) ring-fenced funds, and kept getting fobbed off? That should've raised alarm bells if so!

Sorry if this is a bit ranty and speculative, or too soon, or too accusatory, but I'm grasping for answers here. I'm grateful for everything you've done for the world and EA in your careers, but can't help feeling that you might've messed up a bit here.

Milan_Griffes @ 2022-11-12T18:46 (+5)

I asked some further questions in this direction here

Dancer @ 2022-11-15T11:41 (+3)

Asked whether he had set up any kind of endowment for his giving, Mr. Bankman-Fried said in the Times interview last month: “It’s more of a pay-as-we-go thing, and the reason for that, frankly, is I’m not liquid enough for it to make sense to do an endowment right now.”

https://www.nytimes.com/2022/11/13/business/ftx-effective-altruism.html

Greg_Colbourn @ 2022-11-15T11:48 (+3)

Thanks for linking. That should've raised alarm bells, in hindsight. Could he not at least have donated illiquid assets to the Foundation, for them to liquidate as they see fit (and put the Foundation under independent control)? Although I guess that still might not've helped much in this case, with FTT and FTX stock collapsing.

Aleks_K @ 2022-11-15T18:47 (+3)

I think this (the fact that there was no endowment) was (or at least should have been) pretty well known in the EA community from the point at which the FTX Future Fund started paying out grants, as these came from all kinds of sources, but not from an endowed foundation. And it obviously would have been known to the people working for the FTX Foundation from when they started working there.

(And I would guess one reason that it didn't raise more alarm bells for lots of people in the EA community that learned about this, is probably that they put high trust in the people working for FTX Foundation.)

ZekeFaux @ 2022-11-15T20:14 (+2)

How much did the Future Fund actually pay out? The website lists $160 million in committed grants.

Aleks_K @ 2022-11-15T20:30 (+1)

I agree this would be very useful information. In theory, the FTX Future Fund team should know this information but they probably are not allowed to share it.

Of course, someone could try to collect this information by contacting all named FTX Future Fund grantees and it might be worth the effort to try to do this. (Though it's unclear who might be best suited to do that, given that they'd have to be trusted enough by all grantees for them to share their individual details with them.) Maybe the largest recipients (I think these are CEA and Longview) could start by stating how much they received. 

Greg_Colbourn @ 2022-11-15T20:01 (+1)

Made this into a post: Why didn't the FTX Foundation secure its bag?

Greg_Colbourn @ 2022-11-15T18:01 (+2)

Can anyone find the original source for the "interview last month"? Clicking that link from the link above takes me to https://www.nytimes.com/2022/10/08/business/effective-altruism-elon-musk.html (a) which doesn't contain the quote.

Dawn Drescher @ 2022-11-11T02:51 (+39)

Please let us know if there is anything we at GoodX can do to help. Our main project is to build an impact marketplace, but ultimately we want to get resources to where they are needed (as efficiently as possible).

(E.g., it wouldn’t be my first time running an emergency fundraiser to bail out customers of a failed venture.)

RAB @ 2022-11-11T02:58 (+34)

Strikes me as…premature? We’ll have a lot more clarity in the coming days, and resigning + questioning the ethics at FTX when we still fundamentally don’t know what happened doesn’t seem particularly productive.

If FTX just took risks and lost, this will look very dumb in hindsight. And if there turn out to be lots of unethical calls, we’ll have more than enough time to criticize them all to our hearts’ content. But at least we’ll have the facts.

Jason @ 2022-11-11T03:48 (+50)

Looking dumb is an acceptable risk. If the team prematurely resigned and there is still usable money, that money is presumably locked in the FTX Foundation and in DAFs; it is not lost.

Premature send, ETA: As far as "questioning the ethics at FTX," it would be very easy for FTX to have denied raiding customer funds if they hadn't done it as reported. It's appropriate to draw the obvious inference that they did, and that alone is more than enough to "question[] the ethics at FTX," which is a pretty mild response to the news in my book.

The PR attention is at its height this week; the risk of "looking dumb" (which I think is very unlikely) is outweighed by the need to engage in damage control. No one will be listening if EA waits a few weeks to start distancing itself....

vaniver @ 2022-11-11T20:30 (+29)

From The Snowball, dealing with Warren Buffett's son's stint as a director and PR person for ADM:

The second the FBI agents left, Howie called his father, flailing, saying, I don't know what to do, I don't have the facts, how do I know if these allegations are true? My name is on every press release. How can I be the spokesman for the company worldwide? What should I do, should I resign?

Buffett refrained from the obvious response, which was that, of his three children, only Howie could have wound up with an FBI agent in his living room after taking his first job in the corporate world. He listened to the story non-judgmentally and told Howie that it was his decision whether to stay at ADM. He gave only one piece of advice: Howie had to decide within the next twenty-four hours. If you stay in longer than that, he said, you'll become one of them. No matter what happens, it will be too late to get out.

That clarified things. Howie now realized that waiting was not a way to get more information to help him decide, it was making the decision to stay. He had to look at his options and understand as of right now what they meant.

If he resigned and they were innocent, he would lose friends and look like a jerk.

If he stayed and they were guilty, he would be viewed as consorting with criminals.

The next day Howie went in, resigned, and told the general counsel that he would take legal action against the company if they put his name on any more press releases. Resigning from the board was a major event. For a director to resign was like sending up a smoke signal that said the company was guilty, guilty, guilty. People at ADM did not make it easy for Howie. They pushed for reprieve, they asked how he could in effect convict them without a trial. Howie held firm, however, and got out.

t3tsubo @ 2022-11-11T03:04 (+6)

The facts are plenty clear (with respect to the type of criminal activity taking place, if not the specifics or the quantum) if you do some digging on Twitter. Crypto-forensics analysts have been having a field day, and SBF himself has, surprisingly, been continuing to dig his grave deeper.

RAB @ 2022-11-11T03:13 (+24)

I would highly, highly recommend that people just wait up to 72 hours for more information, rather than digging through Twitter or Reddit threads.

Edit: This is not to imply that I have secret information - just that this is unfolding very quickly and I expect to learn a lot more in the coming days.

Geuss @ 2022-11-11T11:03 (+6)

Why? CoinDesk's leak - which set off the death spiral - is clear enough. Multiple investors that SBF tried to get bail-out funds from have told the WSJ and FT that SBF admitted to loaning out customer funds to Alameda. Binance pulled out of the deal for a reason. There is plenty of data online about FTX's movements on the blockchain. And, of course, there's the obvious fact that SBF is now very publicly looking for $8bn of funding to cover FTX's liabilities.

Nathan Young @ 2022-11-11T04:11 (+5)

Feels like you are implying you have secret info, but it just seems extremely unlikely to me that this was anything other than huge mismanagement of customer funds against their wishes.

What odds are you willing to bet that we will see it differently in 72 hours?

Guy Raveh @ 2022-11-11T10:15 (+28)

I don't think the bet suggestions (not just from you - there were a bunch in others' comments on your own post) are helping make the situation any less tense.

Edit: I also think the interpretation of "implying to have secret information" rather than "trying to de-escalate" is not really grounded, and results in your comment being combative in my eyes.

Benjamin Cosman @ 2022-11-11T14:01 (+18)

I think bets with real stakes can be a good de-escalation procedure! It's easy to fire increasingly heated claims back and forth when there are no concrete consequences, but when there's money on the line you have to back off and figure out what you actually believe; and once the bet is made, there is less incentive to keep arguing while you wait for resolution.

RAB @ 2022-11-11T15:34 (+3)

Didn’t mean to imply secret info, edited the comment above.

That said, seeing most of their legal and compliance teams quit gives me much more serious reservations about illegal or unethical behavior.

Edit: I think I retract this second part - I don’t know if everyone’s quitting now that they can’t pay salaries, or just the legal/compliance teams.

SaraAzubuike @ 2022-11-11T17:35 (+33)

I've made this into a post on the forum, because I'm afraid it'll get buried in the comments here. Please comment on the forum post instead.

https://forum.effectivealtruism.org/posts/9YodZj6J6iv3xua4f/another-ftx-post-suggestions-for-change

I suggested that we would have trouble with FTX and funding around 6 months ago.  

SBF has been giving lots of money to EA. He admits crypto is a massively speculative bubble. A crypto crash hurts the most vulnerable, because poor, uneducated people put lots of money into it (Krugman). Crypto is currently small, but should be regulated and has potential contagion effects (BIS). EA as a whole is getting loose with its money due to large crypto inflows (MacAskill). An inevitable crypto crash leads to either a) bad optics and less interest in EA or b) lots of dead projects.

It was quite obvious that this would happen--although the specific details with Alameda were not obvious. Stuart Buck was the only one who took me seriously at the time.

Below are some suggestions for change.
 

1. The new button of "support" is great, but I think the EA forum should have a way to *sort* by controversiality. And, have the EA forum algorithm occasionally (some ϵ% of the time) punt controversial posts back upwards to the front page. If you're like me, you read the forum sorted by Magic (New and Upvoted). But this promotes herd mentality. The red-teaming and self-criticism are excellent, but if the only way we aggregate how "good" red-teaming is is by up-votes, that is flawed. Perhaps the best way to know that criticism has touched a nerve is to compute a fraction: how many members of the community disagree vs how many agree (or, even better, if you are in an organization, use a weighted fraction, where you put lower weight on the people in the organization who are in positions of power (obviously difficult to implement in practice)). A toy sketch of such a scoring rule is given after this list.
 

2. More of you should consider anonymous posts. This is the EA forum. I cannot believe that some of you delete your posts simply because they end up being downvoted. Especially if you're working higher up in an EA org, you ought to be actively voicing your dissent and helping to monitor EA.
 

For example, this is not good: 
 

"Members of the mutinous cohort told me that the movement’s leaders were not to be taken at their word—that they would say anything in public to maximize impact. Some of the paranoia—rumor-mill references to secret Google docs and ruthless clandestine councils—seemed overstated, but there was a core cadre that exercised control over public messaging; its members debated, for example, how to formulate their position that climate change was probably not as important as runaway A.I. without sounding like denialists or jerks." (New Yorker)

What makes EA, *EA*, what makes EA antifragile, is its ruthless transparency. If we are self-censoring because we have *already concluded something is super effective*, then there is no point in EA. Go do your own thing with your own money. Become Bill Gates. But don't associate with EA.
 

3. Finances should be partially anonymized. If an EA org receives some money above a certain threshold from an individual contribution, we should be transparent in saying that we will reject said money if it is not donated anonymously. You may protest that this would decrease the number of donations by rich billionaires. But take it this way: if they donate to EA, it's because they believe that EA can spend it better. Thus, they should be willing to donate anonymously, to not affect how EA spends money. If they don't donate to EA, then they can establish a different philanthropic organization and hire EA-adjacent staff, making for more competition. 
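A minimal sketch of how the controversiality score from suggestion 1 might work, assuming simple agree/disagree vote counts and a made-up weighting scheme (none of this reflects how the forum actually computes anything):

```python
def controversiality(agrees: float, disagrees: float) -> float:
    """Return a score in [0, 1]: 0 when votes are unanimous (or absent),
    1 when the community is evenly split."""
    total = agrees + disagrees
    if total == 0:
        return 0.0
    # An even split gives min/total = 0.5; scale so a 50/50 split scores 1.
    return 2 * min(agrees, disagrees) / total

def weighted_controversiality(votes: list[tuple[bool, float]]) -> float:
    """votes: (agree, weight) pairs; e.g. weight < 1.0 could down-weight
    voters who hold positions of power in the org under discussion."""
    agree = sum(w for a, w in votes if a)
    disagree = sum(w for a, w in votes if not a)
    return controversiality(agree, disagree)

# Example: 60 agree votes vs 40 disagree votes -> 0.8 (heavily contested)
print(controversiality(60, 40))
```

Posts scoring above some threshold could then be the ones eligible for the occasional ϵ-probability bump back onto the front page.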

Sharmake @ 2022-11-11T17:47 (+9)

"Members of the mutinous cohort told me that the movement’s leaders were not to be taken at their word—that they would say anything in public to maximize impact. Some of the paranoia—rumor-mill references to secret Google docs and ruthless clandestine councils—seemed overstated, but there was a core cadre that exercised control over public messaging; its members debated, for example, how to formulate their position that climate change was probably not as important as runaway A.I. without sounding like denialists or jerks." (New Yorker)

What makes EA, EA, what makes EA antifragile, is its ruthless transparency. If we are self-censoring because we have already concluded something is super effective, then there is no point in EA. Go do your own thing with your own money. Become Bill Gates. But don't associate with EA.

Being honest, I do genuinely think that climate change is less important than runaway AI, primarily because of both option value issues and the stakes of the problem. One is a big problem that could hurt or kill millions, while AI could kill billions.

But I'm concerned that they didn't simply state why they believe AI is more important than climate change, rather than resorting to this over-complicated messaging scheme.

  3. Finances should be partially anonymized. If an EA org receives some money above a certain threshold from an individual contribution, we should be transparent in saying that we will reject said money if it is not donated anonymously. You may protest that this would decrease the number of donations by rich billionaires. But take it this way: if they donate to EA, it's because they believe that EA can spend it better. Thus, they should be willing to donate anonymously, to not affect how EA spends money. If they don't donate to EA, then they can establish a different philanthropic organization and hire EA-adjacent staff, making for more competition.

Disagree, this would make transparency worse without providing much benefit.

  1. The new button of "support" is great, but I think EA forum should have a way to sort by controversiality. And, have the EA forum algorithm occasionally (some ϵ % of the time), punt controversial posts back upwards to the front page. If you're like me, you read the forum sorted by Magic (New and Upvoted). But this promotes herd mentality. The red-teaming and self-criticism are excellent, but if the only way we aggregate how "good" red-teaming is is by up-votes, that is flawed. Perhaps the best way to know that criticism has touched a nerve is to compute a fraction: how many members of the community disagree vs how many agree. (or, even better, if you are in an organization, use a weighted fraction, where you put lower weight on the people in the organization that are in positions of power (obviously difficult to implement in practice))

Disagree here because I don't want to see an EA forum that values controversial posts.

SaraAzubuike @ 2022-11-11T17:50 (+1)

Hi, thanks for replying! I've made this into an EA forum post, instead because I'm afraid it'll get buried in the comments here. https://forum.effectivealtruism.org/posts/9YodZj6J6iv3xua4f/another-ftx-post-suggestions-for-change

Remmelt @ 2022-11-11T11:06 (+25)

Question just to double-check: are posts no longer going to be evaluated for the AI Worldview Prize? That is, given that the FTX Future Fund team has resigned.

Greg_Colbourn @ 2022-11-11T11:36 (+10)

I think it would be good if others stepped in to help see it through (perhaps offering smaller prizes), given how critical the answers are to determining EA resource allocation. Have asked Holden re OpenPhil fulfilling this role.

Geuss @ 2022-11-11T11:47 (+8)

Why do you think it's any more important than the FTX Fund's other obligations? If there's to be a settlement matching partial assets to all of the fund's liabilities, it should be done in an open and fair way. Maybe the assets are 0, in which case that becomes moot. My own view is that there are many other projects of equal or greater merit with funding commitments from the FTX Fund.

Greg_Colbourn @ 2022-11-11T11:58 (+8)

That's reasonable. I guess from my perspective, I think the top EA grantmakers need persuading that p(doom|AGI) is significantly greater than 35%. If OpenPhil already think this, then that's great, but if they don't (and their probabilites are similar to the Future Fund's), then the Worldview prize is very important. Even if your probabilities are the same, or much lower, it's still very high Value of Information imo.

RobBensinger @ 2022-11-11T15:40 (+12)

In the survey I did last year, four Open Phil staff respectively gave probability 0.5, 0.5, 0.35, and 0.06 to "the overall value of the future will be drastically less than it could have been, as a result of AI systems not doing/optimizing what the people deploying them wanted/intended".

That's just four people, and isn't necessarily representative of the rest of longtermist Open Phil, but it at least shows that "higher than 35%" isn't an unrepresented view there.

Greg_Colbourn @ 2022-11-11T15:44 (+4)

Interesting, thanks. What about short timelines? (p(AGI by 2043) in Future Fund Worldview Prize terms)

RobBensinger @ 2022-11-11T20:34 (+8)

Ajeya Cotra's median guess is that AGI is 18 years away; the last time I talked to a MIRI person, their median guess was 14 years. So the Cotra and MIRI camps seem super close to me in timelines (though you can find plenty of individuals whose median year is not in the 2036-2040 range).

If you look at (e.g.) animal welfare EAs vs. AI risk EAs, I expect a much larger gap in timeline beliefs.

Jason @ 2022-11-11T13:01 (+2)

One could also argue for prioritizing funding for work that has already been done over work that has been approved but not yet done. If someone was going to receive a grant to do certain work and has had it pulled, that is unfair and a loss to them . . . but it's not as bad (or as damaging to the community / future incentives) as denying people payment for work they have already done.

How this logic translates to a prize program is murky. But unless you believe that the prize's existence did not cause people to work more (i.e., that the prize program was completely ineffective), its cancellation would mean people are not going to be paid for work already performed.

Of course, it might be possible to honor the commitment made for that work in some fashion that doesn't involve awarding full prizes. 

Neil Chilson @ 2022-11-11T20:29 (+20)

Potential Help for FF Grantees. I work at a major philanthropic organization, Stand Together, on technology- and innovation-related efforts. I was a big fan of the Future Fund's ambition and methods, even where I didn't share your priors.

At Stand Together, we work on a wide range of issues, all seeking to break the barriers that prevent individuals from reaching their true potential. On technology, we think technological innovation has been the primary driver of widespread human prosperity and we are looking to promote both a culture that embraces innovation rather than fears it and a regulatory environment that enables it. 

If you are a Future Fund grantee interested in alternative funding and any of the above seems to line up with your work, please reach out: nchilson@standtogether.org. 

And best of luck to everyone.
Jelle Donders @ 2022-11-11T13:43 (+19)

Wishing much strength to everyone affected by this. Let's support each other and get through this together.

throwaway123 @ 2022-11-15T13:21 (+12)

As a non-EA, reading this thread, on balance, makes me really happy. You guys just have some good old-fashioned cleansing to do and you'll be fine.

FWIW, everyone who's had any dealings with the Alameda crew knew that they were the worst kind of trash - we just thought that meant they had so much money that surely they didn't need to steal ours.

cheers.

Pagw @ 2022-11-11T22:22 (+11)

It seems like there are quite a lot of people/orgs who made plans based on promised money that now seems unlikely to arrive. Is there a lesson that can be learned about how to reduce risk in grant awarding e.g. by waiting until funds are securely in the foundation's hands? Or is there no way to avoid this risk given potential clawbacks, even in cases of bankruptcy that don't involve any fraud?

Jakob @ 2022-11-11T09:07 (+11)

Thank you for your good work over the last months, and thank you for your commitment to integrity in these hard times. I'm sure this must also be hard for you on a personal level, so I hope you're able to find consolation in all the good that will come from the projects you helped get off the ground, and that you still find a home in the EA community.

alexflint @ 2022-11-11T20:38 (+10)

I trust you guys to decide that this is the right time to resign, but I do hope that, as a community, we are able to hold the value of our friendships together with the importance of holding people who made mistakes to account, without either one negating the other. We don't yet know what kind of ethical errors Sam made, but the larger those mistakes are, the more important it is that we offer friendship of a kind that is compatible with holding people to account.

guyi @ 2022-11-11T18:05 (+9)

In his post announcing the newfound wealth of the EA movement stemming from FTX, Will included this argument for why charitable enterprises are more dangerous than for-profit companies:

There’s one huge difference between aiming to do good and aiming to make profit. If you set up a company aiming to make money, generally the very worst that can happen is that you go bankrupt; there’s a legal system in place that prevents you from getting burdened by arbitrarily large debt. However, if you set up a project aiming to do good, the amount of harm that you can do is basically unbounded.

At the time I remarked on how wrongheaded this seemed to me. Of course for-profit companies can do a large amount of harm! In fact, because for-profit companies have the ability to use their profits to increase their scale, they have the potential to do immense harm.

Hopefully, the FTX fallout makes the original point I was trying to make abundantly clear, and encourages some deeper reflection in this community about how the "earn" part of earning to give has the potential to cause great harm.

Sam Glover @ 2022-11-11T21:50 (+25)

This feels like a weird interpretation of Will's comment, which doesn't (in my view) imply that for-profit companies can't do a lot of harm, but rather that if you start a company with the sole goal of making a profit, usually the worst outcome (with regards to your goal of making a profit) is that you go bankrupt. 

Jonathan Paulson @ 2022-11-11T23:05 (+6)

As FTX just spectacularly demonstrated, Will was wrong. This is because even though FTX was ostensibly started with the sole goal of making a profit, it turns out there were other important implicit goals like “don’t steal billions of dollars from thousands of people”, implicit goals like that always exist, and failure to meet those implicit goals is very bad.

Sharmake @ 2022-11-12T17:56 (+2)

This sounds like a human form of alignment failure; specifically, Part I of the "What Failure Looks Like" story.

Here's a link to it:

https://www.lesswrong.com/posts/HBxe6wdjxK239zajf/what-failure-looks-like

Sharmake @ 2022-11-11T21:56 (+5)

In his post announcing the newfound wealth of the EA movement stemming from FTX, Will included this argument for why charitable enterprises are more dangerous than for-profit companies:

There’s one huge difference between aiming to do good and aiming to make profit. If you set up a company aiming to make money, generally the very worst that can happen is that you go bankrupt; there’s a legal system in place that prevents you from getting burdened by arbitrarily large debt. However, if you set up a project aiming to do good, the amount of harm that you can do is basically unbounded.

At the time I remarked on how wrongheaded this seemed to me. Of course for-profit companies can do a large amount of harm! In fact, because for-profit companies have the ability to use their profits to increase their scale, they have the potential to do immense harm.

Hopefully, the FTX fallout makes the original point I was trying to make abundantly clear, and encourages some deeper reflection in this community about how the "earn" part of earning to give has the potential to cause great harm.

I should have called this out earlier, but I'll do it now: a double standard is being applied here, so the comparison doesn't show what you think it shows.

Brad West @ 2022-11-11T21:59 (+4)

Exactly: just as charities might unintentionally do harm, so can for-profit entities. Will's statement erred in assuming that financial viability is the only dimension on which companies can be assessed.

Linda Linsefors @ 2022-11-13T13:58 (+8)

How much money was committed in grants that will now not be paid out? Additionally, it would be useful to know how this money is distributed among cause areas.

While some people are focused on figuring out what went wrong with FTX and why, the rest of us need to focus on mitigating the immediate damage from broken funding promises. It would be helpful to know the total scale of this situation.

Viktoria Malyasova @ 2022-11-12T20:32 (+6)

Is there any reason why, when you commit to a grant, you cannot set aside the money in gold or index funds or some other reliable asset, instead of having to rely on a single company's ability to pay in the future?

Miguel @ 2022-11-11T12:22 (+6)

If you guys need help with ideas on improving oversight or adding internal audit reviews for future EA projects, let me know. It's always the lack of governance measures that leads to these unfortunate events.

smoskal @ 2022-11-15T20:09 (+1)

Recently, I listened to a podcast with Douglas Rushkoff about his new book describing his recent encounters with billionaires, and, combined with this incident, it raises a question: how does the EA movement take into account harm that may be caused by those in high-wealth-generating endeavors? Is that a factor in EA's calculation of total good? Would philanthropy, society, and Sam Bankman-Fried have been better off if he had pursued his original interest in animal welfare instead of going into finance to fund the EA movement?