Who would you have on your dream team for solving AGI Alignment?

By Greg_Colbourn @ 2022-08-25T13:34 (+10)

Any living person (or list of people). Assume they can be persuaded that the problem of existential risk from AI is real and important.


Jack R @ 2022-08-25T18:53 (+8)

My primary reaction to this was "ah man, I hope this person doesn't inadvertently annoy important people about AI safety being important, hurting the reputation of AI safety/longtermism/EA etc"

Greg_Colbourn @ 2022-08-25T21:50 (+3)

:( As far as I know, no one from EA has annoyed (i.e. randomly emailed) Terry Tao about it, despite many people saying he would be a great person to have on board.

Obviously I'm not in favour of random EAs annoying important people (and hurting the reputation of EA/AI Alignment), but I do think given the urgency of the situation we are in, at some point, some high up people in EA/AI Alignment have to make some serious attempt at putting together such a dream team (more).

harfe @ 2022-08-26T21:59 (+1)

You are probably aware, but someone recently drafted an email and intended to send it, but was convinced not to.

Greg_Colbourn @ 2022-09-07T09:43 (+3)

Yes, I think the fact that they didn't go through with it is some evidence that such a list need not be counterproductive to our goal (and the EV is probably positive). Ultimately the Dream Team needs to be approached, but I'm optimistic that this can be done in a careful and coordinated manner by the relevant senior people in EA/Alignment.

MatthewDahlhausen @ 2022-08-25T15:39 (+8)

One design ideation method is to think of the worst possible idea instead of trying to think of good ideas.

With that in mind, encourage the writers of "It's Always Sunny in Philadelphia" to do an episode "The Gang Solves AGI Alignment".

JulianHazell @ 2022-08-25T14:08 (+8)

I feel like this question is so much more fun if we can include dead people, so I’m gonna do just that.

Off the top of my head:

Peter Wildeford @ 2022-08-30T03:23 (+5)

Here's what GPT-3 thinks:

Greg_Colbourn @ 2022-09-07T09:34 (+4)

No surprises there (although a bit surprised that GPT-3 doesn't know that Alan Turing is dead, and can't spell Eliezer).

Peter Wildeford @ 2022-09-08T18:55 (+3)

Actually, that was me who misspelled Eliezer, ugh.

Greg_Colbourn @ 2022-09-07T09:40 (+4)

A bit sad that no one has actually answered the object level question and nearly all the discussion is meta. I can understand why. But I also think that we are at crunch time with this, and the stakes are as high as they can be. So this is actually a very serious question that serious people should be considering. Maybe (some) people high up in EA are considering it. I hope so!

jskatt @ 2022-09-12T04:45 (+3)

I think the question is basically "who are the most talented researchers in fields at least vaguely related to AI?" The EA community is probably not the best group for answering this question. But it's an important question for sure!

jskatt @ 2022-09-12T04:40 (+3)

Any standard list of "top AI researchers" will do. Also look at top researchers in CS, math, stats, physics, philosophy (note the new CAIS philosophy fellowship as an example of how you might attract people from other fields). Edward Witten comes to mind. But you'll get better answers if you ask professors within these subjects or even turn to Reddit, Quora, etc.

Greg_Colbourn @ 2022-11-02T20:24 (+2)

Some ideas for identifying dream team members:

Phil Tanny @ 2022-08-25T14:30 (+2)

Hmm... Who are the leading thinkers/speakers who argue we should not further develop AI? Such folks would not need to be persuaded, and would perhaps be willing to consider the full range of options.

People who have invested heavily in AI careers are not likely to be receptive to proposals which don't include the continuation of AI development; that is, they are not open to the full range of options.

One way to solve AI alignment would be to stop developing AI. I know, very challenging, but then so are all the other options, and none of them would seem to offer such a definitive solution.