Holden Karnofsky’s recent comments on FTX

By Lizka @ 2023-03-24T11:44 (+149)

Holden Karnofsky has recently shared some reflections on EA and FTX, but they’re spread out and I’d guess that few people have seen them, so I thought it could be useful to collect them here. (In general, I think collections like this can be helpful and under-supplied.) I've copied some comments in full, and I've put together a simpler list of the links in this footnote.[1]

These comments come a few months after the FTX collapse; there's some explanation of why that is in this post and in this comment.

Updates after FTX

I found the following comment (a summary of updates he’s made after FTX) especially interesting (please note that I’m not sure I agree with everything): 

Here’s a followup with some reflections.

Note that I discuss some takeaways and potential lessons learned in this interview.

Here are some (somewhat redundant with the interview) things I feel like I’ve updated on in light of the FTX collapse and aftermath:

Other recent comments

And there is more in his interview with Vox from January (here are edited highlights). 

(Thanks to the folks who suggested making this post & helped.)

  1. ^

    On why these comments didn't come earlier — post & comment

    Updates post FTX — comment (see also the interview with Vox, edited highlights)

    Responsibility — comment

    SBF — comment

    Claim from the TIME article — comment


NewLeaf @ 2023-03-25T01:48 (+49)

Thanks, Lizka, for highlighting these comments! I'd really like to see others in the EA community, and especially leaders of EA orgs, engage more in public conversations about how EA should change in light of the FTX collapse and other recent events.

I think the events of the last few months should lead us to think carefully about whether future efforts inspired by EA ideas might cause significant harm or turn out to be net-negative in expectation, after accounting for downside risks. I'd like to see leaders and other community members talking much more concretely about how organizations' governance structures, leadership teams, cultural norms, and project portfolios should change to reduce the risk of causing unintended harm.

Holden's reflections collected here, Toby Ord's recent address at EAG, and Oliver Habryka’s comments explaining the decision to close the Lightcone Offices feel to me like first steps in the right direction, but I'd really like to see other leaders, including Will MacAskill and Nick Beckstead, join the public conversation. I’d especially like to see these and other leaders identify the broad changes they would like to see in the community, commit to specific actions they will take, and respond to others’ proposals for reform. (For the reasons Jason explains here, I don't think the ongoing investigation presents any necessary legal impediment to Will or Nick speaking now, and waiting at least another two months to join the conversation seems harmful to the community's ability to make good decisions about potential paths forward.)

My guess is that leaders’ relative silence on these topics is harming the EA community's ability to make a positive difference in the world. I and others I know have been taking steps back from the EA community over the past several months, partly because many leaders haven’t been engaging in public conversations about potential changes that seem urgently necessary. I've personally lost much of the confidence I once had in the ability of the EA community’s leaders, institutions, and cultural norms to manage risks of serious harm that can result from trying to put EA ideas into practice. I’m now uncertain about whether engaging with the EA community is the right way for me to spend time and energy going forward. I think leaders of EA orgs can help restore confidence and chart a better course by starting or joining substantive, public conversations about concrete steps toward reform.

(To end on a personal note: I've been feeling pretty discouraged over the last few months, but in the spirit of Leaning into EA Disillusionment, I aim to write more on this topic soon. I hope others will, too.)

Chris Leong @ 2023-03-24T14:10 (+6)

Thanks for gathering these comments!