Sam Harris and William MacAskill on SBF & EA

By AnonymousTurtle @ 2024-04-01T23:58 (+64)

This is a linkpost to https://www.samharris.org/podcasts/making-sense-episodes/361-sam-bankman-fried-effective-altruism


JWS @ 2024-04-03T17:57 (+55)

After listening, here are my thoughts on the podcast (times refer roughly to YouTube timestamps):[1]

Recap[2]

 

Personal Thoughts

So there's not an actual deep-dive into what happened with SBF and FTX, or how much Will or other figures in EA actually knew. Perhaps the podcast was trying to cover too much ground in 80 minutes, or perhaps Sam didn't want to come off as too hostile a host? I feel like both are talking about the whole thing at an oddly abstract level, without referencing the evidence that came out in court.

While I agree with both that EA principles are still good, and that most EAs are doing good in the world, there's clearly a connection between EA - or at least a bastardised, naïvely maximalist view of it - and SBF. The prosecution and the judge seemed to take the view that SBF was at high risk of doing the same or a similar thing again in the future, and that he has not shown remorse. This makes sense if SBF was acting the way he did because he thought he was doing the right thing; the fact that it was an attitude rather than a 'rational calculation' doesn't make it any less driven by ideas.

So I think that's where I've ended up on this (I'm not an expert on finance, on what precise laws FTX broke, on how their scheme operated, or on how long they were brazenly lying for. It does feel like those with an outside view are pretty damn negative on SBF). I think trying to attach the label 'EA' or 'not EA' to what SBF and the FTX team believed is pretty unhelpful. I think Ellison and SBF had a very naïve, maximalist view of the world and their actions. They believed they had special ability and knowledge to act in the world, and to break existing rules and norms to make the world better where they saw fit, even if this incurred high risks, so long as they expected it to work out in EV terms. An additional error here, and perhaps where the 'Hubris' theory does play in, is that there was no error-correction mechanism for these beliefs. Even after the whole collapse, and a 25-year sentence, it still seems to me that SBF thinks he made the 'right' call and got unlucky.

My takeaway is that this cluster of beliefs[5] is dangerous and the EA community should develop an immune system to reject these ideas. Ryan Carey refers to this as 'risky beneficentrism', and I think part of 'Third-Wave' EA should be about rejecting this cluster of ideas, making this publicly known, and disassociating EA from the individuals, leaders, or organisations who still hold on to it in the aftermath of this entire debacle.

  1. ^

    For clarity Sam refers to Sam Harris, and SBF refers to Sam Bankman-Fried

  2. ^

    Not necessarily in order, I've tried to group similar points together

  3. ^

    I think this makes some sense if you view EA as a set of ideas/principles, less so if you view EA as a set of people and organisations

  4. ^

    During this section especially I kinda wanted to shout at my podcast: when Will asked rhetorically, "was he lying to me that whole time?", the answer is yes, Will, it seems like they were. The code snippets from Nishad Singh and Gary Wang that the prosecution shared are pretty damning, for example.

  5. ^

    See the link in the text to Ryan Carey's post. The main dangerous ideas, to my mind, are:

    1) Naïve consequentialism

    2) The ability and desire to rapidly change the world

    3) A rejection of existing norms and common-sense morality

    4) No uncertainty about either the values above or the empirical consequences of one's actions

    5) Most importantly, no feedback or error correction mechanism for any of the above.

samuel @ 2024-04-05T23:44 (+22)

Thanks for this summary. I listened to this yesterday & browsed through the SH subreddit discussion, and I'm surprised that this hasn't received much discussion on here. Perhaps the EA community is talked out about this subject, which, fair enough. But as far as I can tell, it's one of Will's first at-length public remarks on SBF, so it feels discussion-worthy to me.

I agree that the discussion was oddly vague given all the actual evidence we have. I don't feel like going into much detail but a few things I noticed:

  • It seems that Will is still somewhat in denial that SBF was a fraud. I guess this is a perfectly valid opinion, since Will knows SBF, but I can't help but feel that this is naive (or measured, if we're being charitable). We can quibble over his reasoning, but fraud is fraud, and SBF committed a lot of it, repeatedly and at scale. He doesn't have to be good at fraud to be one.
  • They barely touch on the dangers of a "maximalist" EA. If Will doesn't believe SBF was fraudulent for regular greedy reasons, then EA may have played a part, and that's worth considering. No need to be overly dramatic here, the average EA is well-intentioned and not going to do anything like this... but as long as EA is centralized and reliant on large donors, it's something we need to further think through.
  • The podcast is a reminder of how difficult it is to understand motivations, and how difficult it is for "good" people to understand what motivates "bad" actions (adding "" to acknowledge the big vast gray areas here). Given that there are a lot of neuro-atypical EAs, the community seems desensitized to potential red flags like claiming not to feel love. This is my hobbyhorse -- like any community, EA has smart capable people that I wouldn't ever want to have power. It sounds like there were people who felt this way about SBF, including a cofounder. It's a bit shocking to me that EA leadership didn't see SBF as much of a liability.
SteadyPanda @ 2024-04-09T13:37 (+7)

Thanks for the summary.  One nitpick:

During this section especially I kinda wanted to shout at my podcast: when Will asked rhetorically, "was he lying to me that whole time?", the answer is yes, Will, it seems like they were. The code snippets from Nishad Singh and Gary Wang that the prosecution shared are pretty damning, for example.

To be fair to Will, he does acknowledge that Nishad's insurance fund code "seems like really quite clear fraud", if comparatively minor.

As for the other code snippets in your link -- the "backdoor" -- Nishad and Gary said the intention was to support Alameda's role as a backstop liquidity provider (which FTX was heavily dependent on in its early days to function):

  • “[Wang] testified about several changes Bankman-Fried asked him to make to FTX's software code to allow Alameda to withdraw unlimited funds from the exchange. . . . Wang agreed that the changes were necessary for Alameda to provide liquidity on the exchange” (Reuters)
  • "Singh also acknowledged that he originally thought some of the special treatment Bankman-Fried’s trading firm Alameda Research received on FTX was meant to protect customers by allowing it to more effectively ‘backstop’ some trades. ‘My view at the time [was that] it would be helpful for customers,’ Singh said" (Financial Times)
Stuart Buck @ 2024-04-05T02:58 (+14)

I did think Harris could have been slightly more aggressive in his questioning (as in, some level above zero). E.g., why would MacAskill even suggest that SBF might have been altruistic in his motivations, when we now know about the profligate and indulgent lifestyle that SBF led? MacAskill had to have known about that behavior at the time (why didn't it make him suspicious?).

 And why was MacAskill trying to ingratiate himself with Elon Musk so that SBF could put several billion dollars (not even his in the first place) towards buying Twitter? Contributing towards Musk's purchase of Twitter was the best EA use of several billion dollars? That was going to save more lives than any other philanthropic opportunity? Based on what analysis?

Habryka @ 2024-04-05T04:44 (+17)

FWIW, buying Twitter still seems plausibly like a good idea to me. It sure seems to be the single place that is most shaping public opinion on a large number of topics I care a lot about (like AI x-risk attitudes), and making that go better seems worth a lot.

Jason @ 2024-04-05T22:21 (+8)

I'm struggling to draw the line from owning (a minority stake in) Twitter to having public opinion on certain topics meaningfully flow in a desired direction.

Habryka @ 2024-04-05T23:43 (+7)

Yeah, I think this would only make sense if you would somehow end up majorly shaping the algorithms and structure of Twitter. I don't think just being a shareholder really does much here.

Stuart Buck @ 2024-04-05T21:36 (+7)

I could imagine making that case, but what's the point of all the GiveWell-style analysis of evidence, or all the detailed attempts to predict and value the future, if in the end what would have been the single biggest allocation of EA funds of all time was being proposed based on vibes?

AnonymousTurtle @ 2024-04-09T16:57 (+13)

Like with Wytham Abbey, I'm really surprised by people in this thread confusing investments with donations.

If SBF had invested some billions in Twitter, the money wouldn't be burned, see e.g. what happened with Anthropic.

From his (and most people's) perspective, SBF was running FTX with ~1% of the employees of comparable platforms, so it seemed plausible he could buy Twitter, cut 90% of the workforce like Musk did, and make money while at the same time steering it to be more scout-mindset and truth-seeking oriented.

Jason @ 2024-04-10T18:52 (+8)

I've never seen a good business case for valuing Twitter at anywhere near the $44B it took to acquire it. SBF didn't have nearly that much available, so he'd still be looking at Musk as the majority owner, and it was 100 percent foreseeable that Musk had his own ideological axes to grind. That SBF ran FTX lean is weak evidence that he could have cut 90 percent of Twitter's staff without serious difficulties, and the train wreck caused by Musk's cuts suggests that was never realistic.

Finally, the idea that SBF could somehow make Twitter significantly "more scout-mindset and truth-seeking oriented" has never been fleshed out AFAIK. Also, it would be a surprising and suspicious convergence that the way to run Twitter profitably would also have been the way to run it altruistically.

Stuart Buck @ 2024-04-12T18:43 (+5)

Is the consensus currently that the investment in Twitter has paid off or is ever likely to do so? 

AnonymousTurtle @ 2024-04-14T07:26 (+7)

No, but in expectation it wasn't very far from the stock market valuation. I think it's very possible that it was positive EV even if it didn't work out.

Jason @ 2024-04-02T01:48 (+10)

Where's SummaryBot when you really need/want it?

Simon_M @ 2024-04-02T21:05 (+14)

Claude's Summary:

Here are a few key points summarizing Will MacAskill's thoughts on the FTX collapse and its impact on effective altruism (EA):

  • He believes Sam Bankman-Fried did not engage in a calculated, rational fraud motivated by EA principles or long-termist considerations. Rather, it seems to have stemmed from hubris, incompetence and failure to have proper risk controls as FTX rapidly grew.
  • The fraud and collapse have been hugely damaging to EA's public perception and morale within the community. However, the core ideas of using reason and evidence to do the most good remain valid.
  • Leadership at major EA organizations has essentially turned over in the aftermath. Will has stepped back from governance roles to allow more decentralization.
  • He does not think the emphasis on long-termism within EA was a major driver of the FTX issues. If anything, near-term considerations like global health and poverty reduction could provide similar motivation for misguided risk-taking.
  • His views on long-termism have evolved to be more focused on short-term AI risk over cosmic timescales, given the potential for advanced AI systems to pose existential risks to the current generation within decades.
  • Overall, while hugely damaging, he sees the FTX scandal as distinct from the valid principles of effective altruism rather than undermining them entirely. But it has prompted substantial re-evaluation and restructuring within the movement.
Stephen Clare @ 2024-04-02T17:19 (+2)

Sam said he would un-paywall this episode, but it still seems paywalled for me here and on Spotify. Am I missing something? (The full thing is available on YouTube.)

Simon_M @ 2024-04-02T17:49 (+2)

If you click preview episode on that link you get the full episode. I also get the whole thing on my podcast feed (PocketCasts, not Spotify). Perhaps it's a Spotify issue?