Against the Guardian's hit piece on Manifest

By Omnizoid @ 2024-06-19T23:24 (+60)

Crosspost of this on my blog 

The Guardian recently released the newest edition in its smear-the-rationalists-and-effective-altruists series, this time targeting the Manifest conference. The piece, titled “Sam Bankman-Fried funded a group with racist ties. FTX wants its $5m back,” is filled with bizarre factual errors, one of which was so egregious that it merited a correction. It’s the standard sort of journalist hitpiece on a group: find a bunch of members saying things that sound bad, and then sneeringly report on that as if it discredits the group.

It reports, for example, that Scott Alexander attended the conference, and links to the dishonest New York Times smear piece criticizing Scott, as well as a similar hitpiece calling Robin Hanson creepy. It then smears Razib Khan, on the grounds that he once wrote for magazines that are paleoconservative and anti-immigration (like around half the country). The charges against Steve Hsu are the most embarrassing—they can’t even find something bad that he did, so they just mention half-heartedly that there were protests against him. And it just continues like this—Manifest invited some person who said a bad thing once, or is friends with a bad person, or has written for some nefarious group.

If you haven’t seen it, I’d recommend checking out Austin’s response. I’m not going to go through and defend each of these people in detail, because I think that’s a lame waste of time. I want to make a more meta point: articles like this are embarrassing and people should be ashamed of themselves for writing them.

Most people have some problematic views. Corner people in a dark alleyway and start asking them why it’s okay to kill animals for food and not people (as I’ve done many times), and about half the time they’ll suggest it would be okay to kill mentally disabled orphans. Ask people why one would be required to save children from a pond but not to give to effective charities, and a sizeable portion of the time they’ll suggest that one wouldn’t have an obligation to wade into a pond to save drowning African children. Ask people about population ethics, and they’ll start rooting for a nuclear holocaust.

Many people think their worldview doesn’t commit them to anything strange or repugnant. They only have the luxury of thinking this because they haven’t thought hard about anything. Inevitably, if one thinks hard about morality—or most topics—in any detail, one will have to accept all sorts of very unsavory implications. In philosophy, there are all sorts of impossibility proofs showing that we must give up at least one of a few widely shared intuitions.

Take the accusations against Jonathan Anomaly, for instance. He was smeared for supporting what’s known as liberal eugenics—gene editing to make people smarter or to make sure they don’t get horrible diseases. Why is this supposed to be bad? Sure, it has a nasty word in the name, but what’s actually bad about it? A lot of people who think carefully about the subject will come to the same conclusions as Jonathan Anomaly, because there isn’t anything objectionable about gene editing to make people better off. If you’re a conformist who bases your opinion about so-called liberal eugenics (a terrible term for it) on the fact that it’s a scary term, you’ll find Anomaly’s position unreasonable, but if you actually think it through, it’s extremely plausible, and most philosophers even agree with it. Should philosophy conferences be disbanded because too many philosophers have offensive views?

I’ve elsewhere remarked that cancel culture is a tax on being interesting. Anyone who says a lot of things and isn’t completely beholden to social consensus will eventually say some things that sound bad. The only people safe from cancel culture are those who never have interesting thoughts, who never step outside of the Overton window, who never advance beyond irrational and immoral societal norms, for they are beholden to the norms of their society.

Lots of people seem to treat associations like a disease—if you associate with people who think bad things, they’ll infect you with the badness bug, and then you’ll become bad too (this seems to be the reasoning behind the Guardian smearpiece). If I accepted this, I’d have to be a hermit in the wilderness, because I think almost everyone either thinks or does bad things—specifically, people who eat meat, I think, either have repugnant views or do things they know to be very wrong.

The association-as-disease model is crazy! It’s valuable to associate with people who think bad things. Has Hanania said some things I regard as objectionable? Of course. Does this mean I think Hanania should be permanently shunned? No—he’s an interesting guy who I can learn a lot from.

No one has ever convincingly explained why one shouldn’t interact with bad people or invite them to one’s conferences (even though it’s taken as axiomatic by lots of people). Suppose the Manifest crew really did invite some bad hombres. So what? Why not have bad people give talks? Maybe the bad people will bring the good people over to the dark side, but maybe the good people will bring the bad people over to the light side. For this reason, I’d expect an event like Manifest to depolarize people with radical views, if it has any impact on their views at all.

The Guardian hitpiece was written by Jason Wilson and Ali Winston. Maybe Wilson and Winston only go to conferences where no one thinks anything offensive (and perhaps where everyone’s last name starts with “Wi” and has an “o” as the second-to-last letter). But if that’s so, then they’ve only ever hung out with prudish bores. Anyone who thinks for themselves about issues will think some things that they wouldn’t want to utter at a liberal dinner party.

This shouldn’t be surprising. Social norms are often wrong. Just like old social norms were racist and sexist and homophobic, we should expect modern consensus views to often be similarly in error. This means that even if a person believed all and only true things, they’d end up constantly disagreeing with social norms. They’d end up thinking things that the ~Wilsons wouldn’t like—that they’d think are worthy of cancellation.

Are there any philosophers who don’t think any offensive things about ethics? I can’t think of any. Singer, one of the most influential ethicists, has been so controversial that he’s drawn protests, and supports infanticide in some cases. Should we want groups that censor people like Singer—people who diverge from mainstream groupthink?

If, as I’ve argued before, people who are interesting and write a lot will generally say controversial things, then stifling those who have controversial views will produce either people who self-censor or people who are not interesting. It will produce a world devoid of free thinkers who write a lot, a world filled with the type of midwit who determines their beliefs by what sounds good rather than what is true.

The people at Manifest weren’t even disproportionately right-wing. Scott isn’t right-wing—neither were most of the attendees. But they provided enough fodder for a Guardian hitpiece because they had the unfortunate property of being interesting, of thinking for themselves. If we don’t want a society of boring conformists, we’ll have to tolerate that sometimes conferences will have people who we disagree with. The fact that in 2024, the Guardian is still churning out these misleading, low-info hitpieces in an attempt to cancel people is shameful.

HjalmarWijk @ 2024-06-20T23:24 (+76)

Certainly the Guardian article had a lot of mistakes and issues, but I don't at all buy that there's nothing meaningfully different between someone like Hanania and most interesting thinkers, just because forcing consistency of philosophical views will inevitably lead to some upsetting conclusions somewhere. If I were to "corner someone in a dark alleyway" about population ethics until I caught them in a gotcha that implied they would prefer the world was destroyed, this updates me ~0 on the likelihood of this person actually going out and trying to destroy the world or causing harm to people. If I see someone consistently tweet and write in racist ways despite a lot of criticism and pushback, this shows me important things about what they value on reflection, and provides fairly strong evidence that this person will act in exclusionary and hateful ways. Trying to say that such racist comments are fine because of impossibility theorems showing everyone has to be committed to some weird views doesn't at all engage with the empirical track record of how people who write like Hanania tend to act.

David Mathers @ 2024-06-21T09:41 (+13)

Even IF Hanania is not personally discriminatory, he is campaigning for the repeal of the single most famous piece of American legislation designed to outlaw racist discrimination. 

titotal @ 2024-06-20T16:01 (+38)

I think posts like this exhibit the same thought-terminating cancel-culture behaviour that you are supposedly complaining about, in a way that is often inaccurate or uncharitable.

For example, take the mention of Scott Alexander:

It reports, for example, that Scott Alexander attended the conference, and links to the dishonest New York Times smear piece criticizing Scott, as well as a similar hitpiece calling Robin Hanson creepy. 

Now, compare this to the actual text of the article:

Prediction markets are a long-held enthusiasm in the EA and rationalism subcultures, and billed guests included personalities like Scott Siskind, AKA Scott Alexander, founder of Slate Star Codex; misogynistic George Mason University economist Robin Hanson; and Eliezer Yudkowsky, founder of the Machine Intelligence Research Institute (Miri).

Billed speakers from the broader tech world included the Substack co-founder Chris Best and Ben Mann, co-founder of AI startup Anthropic.

Now, I get the complaint about the treatment of Robin Hanson here, and I feel that "accused of misogyny" would be more appropriate (outside of an oped). But with regard to Scott Alexander, there was literally no judgement call included.

When it comes to the NYT article, very few people outside this sphere know who he is. Linking to an article about him in one of the most well known newspapers in the world does not seem like a major crime! People linking to articles you don't like is not cancel culture. Or if it is, then I guess I'm pro cancel culture, because the word has lost all meaning. 

It feels like you want to retreat into a tiny, insular bubble where people can freely be horribly unpleasant to each other without receiving any criticism at all from the outside world. And I'm happy for those bubbles to exist, but I have no obligation to host your bubble or hide out there with you. 

Omnizoid @ 2024-06-21T01:11 (+23)

Linking to hitpieces is not cancel culture, but if your objection to some group is "look at all these bad people they associate with," and then you link to poorly reasoned and poorly informed hitpieces, that is bad.

David Mathers @ 2024-06-21T09:19 (+4)

I think the NYT's criticisms of Scott were basically fair even if some of the details were off, but I don't think you can reasonably imply that someone linking to it while writing a scathing criticism of groups and views Scott is associated with is linking it just because it is in the NYT. They are obviously trying to get the reader to draw negative inferences about Scott and people and movements associated with him.

Maniano @ 2024-06-21T07:17 (+31)

I am surprised by some of the things written here, and this line especially stood out to me:

The people at Manifest weren’t even disproportionately right-wing

Based on the discussions at the Yarvin afterparty (which was organised by Curtis Yarvin, not Manifest), I'd say there was a significant overrepresentation of very, very right-wing people at Manifest (as in, the right-wing tail of the political distribution was overrepresented; I'm not making a statement about more moderate right-wingers or left-wingers). This sentence felt especially surprising since you were there at the afterparty.[1] To be fair, there were also people there who weren't right-wing at all, and when I reached out to you to ask about this, you said that you didn't find many people saying right-wing things, and that only a small percentage of Manifest attendees were invited to the afterparty.

There is a chance that the people around me said an unrepresentatively large number of bigoted things, but I think it is more likely that your experience is explained by people avoiding more incendiary topics around an in-group-famous, non-right-wing blogger such as yourself. I am not very confident in this, though.

  1. ^

    I asked Omnizoid for his permission to mention this.

Nathan Young @ 2024-06-27T19:53 (+4)

Depends what baseline we're comparing to? I guess most of them will vote Democrat.

And again, the Yarvin afterparty seems importantly different.

Scott Alexander @ 2024-06-20T06:10 (+16)

The article was obviously terrible, and I hope the listed mistakes get corrected, but I haven't seen a request for correction on the claim that CFAR/Lightcone has $5 million of FTX money and isn't giving it back. Is there any more information on whether this is true and, if so, what their reasoning is?

Habryka @ 2024-06-20T08:53 (+22)

Lightcone doesn't have $5M of FTX money! I've generally been very transparent about this; e.g., you can see a breakdown of our FTX funding in this old comment of mine (and also some others that I could probably dig up).

Lightcone Infrastructure (fiscally sponsored by CFAR) has received around $4M in grants from FTX. By the time FTX collapsed, almost all of the grant funding had been spent on the programs that FTX wanted to support (the relevant work was mostly on the Lightcone Offices, LessWrong, and the AI Alignment Forum). We offered FTX a settlement of a large fraction of Lightcone's assets and cash reserves (~$700k / ~$900k, and more than what wasn't already spent or legally committed by the time FTX collapsed), which they rejected without any kind of counteroffer. They have now filed a formal complaint, which we'll fight.

The article and the FTX complaint also includes an additional $1M, which was an escrow deposit that FTX covered for us. We never had any ownership over that money and it's just sitting with the title company somewhere, and I don't know why the FTX estate hasn't picked it up. We have tried to put them in contact. I am sad to see they are still including it in their complaint, since as far as I can tell there is really no way in which Lightcone has or ever had that money.

Happy to try to answer any other questions people might have (though commenting on ongoing litigation is a bit messy).

bullfinch076 @ 2024-06-20T14:56 (+24)

  1. Why is the escrow deposit still sitting somewhere? Some quick online research (so take it with a grain of salt) makes it sound like the escrow process usually takes 4 to 8 weeks in California—so this seems significantly long, in comparison.
  2. Can you clarify when you received these grants and the escrow money? The complaint filed by FTX (documents here, for anyone interested) has the dates of transfers as March 3, July 8, July 13, August 18, September 20, and October 3, all in 2022—so well within the timeframe that might be subject to clawbacks, and well within the bankruptcy lookback period. (For a comparison point, EV US and EV UK paid the FTX estate an amount equal to all the funds the entities received in 2022.)
  3. Why would you not proactively return this money or settle with the FTX estate, given the money came from FTX and could have been originally obtained in fraudulent ways? My prior is that you (Oliver Habryka) have written multiple times on the Forum about the harm EA may have caused related to FTX and wish it could have been prevented, so somehow it seems strange to me that you wouldn't take the opportunity to return money that came from FTX, especially when it could have been obtained in harmful, unethical ways. 
  4. Did you in fact ignore FTX's attempts to contact you in 2023, as the complaint says? And if so, why?

I also think it's worth pointing out that in bankruptcy cases, especially regarding clawbacks, the question of whether you have a legal obligation to return the money isn't a question of whether you currently have the $5M of FTX money sitting around or whether you've already allocated or used it. Demonstrating that you've spent the funds on legitimate charitable activities might strengthen your case, but that doesn't guarantee protection from clawback attempts.

Habryka @ 2024-06-20T16:14 (+16)

Why is the escrow deposit still sitting somewhere? Some quick online research (so take it with a grain of salt) makes it sound like the escrow process usually takes 4 to 8 weeks in California—so this seems significantly long, in comparison.

I am also confused (and very frustrated by this). The key thing to understand here is that the escrow was due to be returned right around the time when FTX went bankrupt (the sale was completed on the 4th of November; FTX filed for bankruptcy on November 11), which meant that none of my contacts at FTX were there to facilitate the return of the escrow, and there was presumably enough chaos for multiple weeks that the escrow company's attempts to reach out to North Dimension Inc. at their usual address and contact information were unsuccessful. After a few weeks the escrow company asked Lightcone for advice on how to return the funds, and we gave them the contact information we had.

Can you clarify when you received these grants and the escrow money? The complaint filed by FTX (documents here, for anyone interested) has the dates of transfers as March 3, July 8, July 13, August 18, September 20, and October 3, all in 2022—so well within the timeframe that might be subject to clawbacks, and well within the bankruptcy lookback period. (For a comparison point, EV US and EV UK paid the FTX estate an amount equal to all the funds the entities received in 2022.)

Yes, the rough timeline here is accurate (I didn't double check the exact dates and am not confirming that in detail here). All the funds were received in 2022.

Why would you not proactively return this money or settle with the FTX estate, given the money came from FTX and could have been originally obtained in fraudulent ways? My prior is that you (Oliver Habryka) have written multiple times on the Forum about the harm EA may have caused related to FTX and wish it could have been prevented, so somehow it seems strange to me that you wouldn't take the opportunity to return money that came from FTX, especially when it could have been obtained in harmful, unethical ways. 

Well, the key problem was that by the time FTX went bankrupt and it became clear there was a lot of fraud at FTX, the money had been spent or committed in contracts, so there wasn't much opportunity left to return the funds. Indeed, by early 2023, when the liabilities from our renovation project had cleared and everything was paid, Lightcone had completely run out of money and was financially in the red until around Q3 2023.

I did fundraise explicitly for money to return to the FTX creditors during our 2023 fundraising, from both Open Philanthropy and the Survival and Flourishing Fund, our two biggest funders. Open Philanthropy declined to give us any funds for settlement or return purposes. SFF didn't explicitly tell us whether the money they gave us was for settlement or return purposes, but we only received barely enough money from them during the 2023 grant round to cover our existing liabilities (and the settlement we offered FTX was greater than the amount I think one could conceivably say we fundraised for it).

If Lightcone had been in a position to return funds proactively I likely would have done it. 

Did you in fact ignore FTX's attempts to contact you in 2023, as the complaint says? And if so, why?

Yes, or like, some of them. The central reason here was just that everyone I talked to told me to get representation by a lawyer before talking to FTX, since given that the funds had already been spent, there was a quite high chance there would be some kind of suit or more complicated settlement. 

I decided I would be very thorough in my choice of lawyer due to the high stakes, and so I took a long time (multiple months IIRC) interviewing different bankruptcy lawyers. During that time I asked every lawyer I interviewed how we should respond to the FTX communications. I think literally every lawyer said that we should wait on responding, on the basis that there was still a huge amount of uncertainty and lack of clarity about whether the FTX estate was actually in a position to settle these claims, and that until that issue was cleared up, there wouldn't be much use in talking to them and any information I gave them would be used against me.

I now honestly think the lawyers' advice not to respond was kind of a mistake (and more broadly I think that "my lawyer told me so" is a bad excuse for immoral behavior in general, though I personally don't feel that much guilt about my decision-making process here, since it is a very high-stakes situation, there was consensus among the many lawyers I talked to about this, and I did not have any experience whatsoever in navigating legal situations like this).

I also think it's worth pointing out that in bankruptcy cases, especially regarding clawbacks, the question of whether you have a legal obligation to return the money isn't a question of whether you currently have the $5M of FTX money sitting around or whether you've already allocated or used it. Demonstrating that you've spent the funds on legitimate charitable activities might strengthen your case, but that doesn't guarantee protection from clawback attempts.

Yep, I am well aware. My current take is that bankruptcy law is kind of broken here; indeed, there are multiple judges who, upon delivering judgements against nonprofits that seemed unfair even to them (but where bankruptcy law gave them little choice), have called for bankruptcy law to be changed to be more protective of nonprofits.

Jason @ 2024-06-20T17:40 (+11)

The legal situation for nonprofits is unfortunate, but I think the potentially workable patches wouldn't help an org in Lightcone's shoes very much. IIRC, one state shortened its lookback period for charities after many of them got burnt in a fraud.

But all these transfers were within ~7 months. Most of us would prefer our monies go to charity rather than our creditors, so a superfast lookback period would incentivize throwing tons of money to charity once the business or person realized the ship was gonna sink.

Protection based on a donor's good faith wouldn't help. Protection up to a percentage of profits wouldn't help given FTX claimed tons of losses on its taxes. Protection based on consistency with a well-established pattern of giving from that donor wouldn't help.

Equitably, my general take in these situations is that the charity got some value toward its charitable purpose out of the expended donation (although perhaps not the full dollar value). The victims got $0 out of the transaction. So I'd be hesitant to endorse any reforms that didn't produce some meaningful recoveries for victims in a case like this.

Habryka @ 2024-06-20T17:48 (+2)

(I have lots of takes here, but my guess is I shouldn't comment. Overall, I agree with you that it's a tricky area of the law. I disagree that there aren't small changes that would help. For example, I think if the Religious Liberty and Charitable Donation Protection Act of 1998 had included foundations or corporations in its definition of "natural person", that would have been a substantial improvement. But again, I sadly can't comment here much, which I find really annoying, in part because I find this part of the law quite fascinating and would love to talk about it.)

Jason @ 2024-06-21T01:44 (+15)

We may not disagree: I had specific elements of Lightcone's situation in mind when I said "help an org in Lightcone's shoes very much." That situation is unfortunately not rare, given the charities that ended up with Madoff-tainted money and Petters-tainted money.

So in that context, the RLCDAP amendments to 11 USC 548 won't help a charity with an SBF/Madoff/Petters-type problem, because they don't protect charities where the debtor had an "actual intent to hinder, delay, or defraud" creditors under (a)(1)(A). Another reason a small fix might not help here: if Congress were to extend RLCDAP protections to corporations, it would need to decide how big the safe harbor should be. Although RLCDAP gives individuals some room to play bad-faith games, that room is usually fairly limited by the typical relationship between individuals' incomes and assets. I don't think it would be reasonable to protect nearly as much as FTXFF was handing out under the circumstances. Whatever formula you choose, it has to work for low-margin, high-volume companies (think grocery stores) as well as tech-like companies.

I would have to think more about the extent to which -- at least where large donations are involved -- strong protection should depend on the existence of an acceptable comprehensive audit of the company-donor. Where that isn't the case, and the donations are fairly large, I might focus relatively more on educating the nonprofit sector about the risks and relatively less on writing them an insurance policy on the creditors' backs.

In part, I think I'm much more accepting of charitable donations by insolvent individuals than by insolvent corporations. A decent number of individuals are insolvent; I certainly would not suggest that they should not donate to charity and instead have some sort of ethical duty to run their lives in a way that maximizes creditor recoveries. In contrast, I am more willing to assign an insolvent corporation much more rigorous duties to creditors, and so am considerably more willing to call out dissipation of assets away from creditors.

Habryka @ 2024-06-21T03:43 (+2)

I mean, I would really love to discuss this stuff with you, but I think I can't. Maybe in a year or so we can have a call and discuss bankruptcy law.

Jason @ 2024-06-21T12:19 (+4)

Yeah, I agree with that. Mainly, I think I want to signal to the audience that the situation in which orgs find themselves reflects thorny policy tradeoffs rather than a simple goof by Congress. Especially since the base rate of goofs is so high!

Jason @ 2024-06-20T16:03 (+2)

Are you able to say whether the other relevant defendants -- CFAR and Rose Garden LLC -- also made offers, or whether accepting LI's offer would have required the estate to surrender its claims against them?

I'm obviously not going to get into the legal side, but your comment hints at various ethical or equitable arguments for why ~17.5 cents on the dollar was a fair offer for the estate's claim. To the extent it would be a global settlement, LI's own ability to pay the potential judgment seems of little relevance without additional relevant facts.

Given litigation, I will obviously understand and not draw adverse inferences if you decide not to answer.

Habryka @ 2024-06-20T16:19 (+4)

I don't think I can comment on this because it risks breaking legal privilege, though I am not confident (also, sidenote: I really, really hate the fact that discussing legal strategy in the US risks breaking privilege; it makes navigating this whole situation so much worse).

As a relevant clarification: Lightcone Infrastructure is a fiscally sponsored project of CFAR. In general, FTX has directed all of its communications at CFAR, and made no distinction between CFAR and the fiscally sponsored projects within it.

Jason @ 2024-06-20T17:02 (+2)

Makes sense -- I wouldn't have even asked about possible details if you hadn't mentioned the settlement offer.

The complaint sues a Lightcone Infrastructure, Inc., "a Delaware non-profit corporation with its principal place of business at 270 Telegraph Avenue, Berkeley, California, 94705." Am I correct in thinking that, as of 10/13/22, Lightcone now possesses a separate corporate existence (which many fiscally sponsored projects do not)?

Habryka @ 2024-06-20T17:32 (+6)

Lightcone Infrastructure Inc. has so far never done anything. It's a nonprofit that I incorporated with the intent of being a home for future projects of mine, but doing anything with it was delayed because of the whole FTX thing. The most real thing it has done is me depositing $50 in its bank account.

Lukas_Gloor @ 2024-06-20T12:41 (+13)

I agree the article was pretty bad and unfair, and I agree with most things you say about cancel culture.

But then you lose me when you imply that racism is no different than taking one of the inevitable counterintuitive conclusions in philosophy thought experiments. (I've previously had a lengthy discussion on this topic in this recent comment thread.)

If I were an organizer of a conference where I wanted interesting and relevant ideas to be discussed, I'd still want there to be a bar for attendees, to avoid the problem Scott Alexander pointed out (someone else recently quoted this in this same context, so hat tip to them, but I forget the person's name):

The moral of the story is: if you’re against witch-hunts, and you promise to found your own little utopian community where witch-hunts will never happen, your new society will end up consisting of approximately three principled civil libertarians and seven zillion witches. It will be a terrible place to live even if witch-hunts are genuinely wrong.

I'd be in favor of having the bar be significantly lower than many outrage-prone people are going to be comfortable with, but I don't think it's a great idea to have a bar that is basically "if you're interesting, you're good, no matter what else."

In any case, that's just how I would do it. There are merits to having groups with different bars.

(In the case of going for a very low one, I think it could make sense to think about the branding and whether it's a good idea to associate forecasting in particular with a low filter.)

Basically, what I'm trying to say is I'd like to be on your side here because I agree with many things you're saying and see where you're coming from, but you're making it impossible for me to side with you if you think there's no difference between biting inevitable bullets in common EA thought experiments vs "actually being racist" or "recently having made incredibly racist comments."

I don't think I'm using the adjective 'racist' here in a sense that is watered down or used in an inflationary sort of way; I think I'm trying to be pretty careful about when I use that word. FWIW, I also think that the terminology "scientific racism" that some people are using is muddying the waters here. There's a lot of racist pseudoscience going around, but it's not the case that you can say that every claim about group differences is definitely pseudoscience (it would be a strange coincidence if all groups of all kinds had no statistical differences in intelligence-associated genes). However, the relevant point is that group differences don't matter (it wouldn't make a moral difference no matter how things shake out, because policies should be about individuals and not groups) and that a lot of people who get very obsessed with these questions are actually racist, while the ones who aren't (like Scott Alexander, or Sam Harris when he interviewed Charles Murray on a podcast) take great care to distance themselves from actual racists in what they say about the topic and what conclusions they want others to draw from discussion of it. So, I think if someone were to call Scott Alexander and Sam Harris "scientifically racist," that would seem like it's watering down racism discourse, because I don't think those people's views are morally objectionable, even though it is the case that many people's views in that cluster are morally objectionable.

Rían O.M @ 2024-06-20T05:28 (+4)

is filled with bizarre factual errors, one of which was so egregious that it merited a connection.

Small nitpick: either this is a typo, or 'connection' is a usage I'm not familiar with in this context.