What's going on in video in AI Safety these days? (A list)
By ChanaMessinger @ 2025-09-15T20:30 (+55)
My write-up as of September 2025 - feel free to ask me for a more updated Source of Truth if you're interested in this space - and please let me know what I’m missing!
Key Players Making Video:
Organizations
- Palisade Research:
- Does research on AI for policymakers and communicators.
- Has started its own video program, run by Petr Lebedev, an ex-producer/scriptwriter for Veritasium - check out their videos!
- Future of Life Institute (FLI): Has an accelerator program to fund creators making AI-related videos.
- AI Digest: Produces explainers on AI and, as far as I know, might be interested in someone running a video program for them
- Control AI is getting great clips from podcasts and putting them up on Twitter. I've suggested they create a spreadsheet of these so everyone can use them in their videos.
- "Fancier Crowd" that I’m going to be vague about: People with experience in Netflix pitches and successful political social media campaigns are becoming more involved in communicating AI safety through video, often independent of EA organizations
- Conor Axiotes: Making an AI documentary
- Michael Trazzi: Made the SB 1047 Documentary and makes TikToks
- 80,000 Hours Video Program (AI in Context):
- Two full-time people (Chana Messinger, Aric Floyd) and contractors.
- Our main effort is the YouTube channel "AI in Context."
- Eventually we might look to expand - more videos, more channels, maybe pitching to streaming services
- Scriptwriting is our current biggest bottleneck
- We contract out production and filming (lights, camera work, sound)
- The 80,000 Hours podcast is also released as videos
Organizations that might be interested in doing more video
- Giving What We Can
- Seismic Foundation: Aims to raise public awareness of and engagement with AI safety through broad-reach, relatable content (such as series and documentaries) for targeted groups of the general public. They're also starting up a content engine: they're looking for creator partnerships, amplifying content and ideas from other AI safety organizations, and they're on the hunt for a host / talent for some of their projects.
YouTubers / TikTokers
Not a complete list of people in this space
- Rob Miles
- Siliconversations: A channel that makes AI safety videos; has received funding from the Future of Life Institute (FLI)
- John Leaver: Runs Digital Engine, Pindex, and Inside AI; has been very helpful, is looking to start a new channel, and is often seeking hosts
- Drew Spartz runs a very successful YouTube channel on AI safety and had a multi-million-view video on AI 2027
- Rational Animations
- Mithuna Yoganathan, a physics YouTuber, recently made a video on AI 2027
- Computerphile has posted many videos about AI safety
- Spencer Greenberg is ramping up production of YouTube and short-form videos and has a team of people working with him
- Zac of the Neural Guide recently pivoted to talking about AI safety
- There are also points of connection to CGP Grey, Veritasium and Kurzgesagt
- Kris Gasteratos is making excellent videos about cultivated meat; I think he's learned a lot that I'd like to learn from
- (Copied from Matrice's excellent comment below): There's a decent amount of French-speaking ~AI safety content on YouTube:
- @Shaïman Thürler's channel Le Futurologue
- @Gaetan_Selle 🔷 's channel The Flares
- @len.hoang.lnh's channel Science4All and Thibaut Giraud's channel Monsieur Phi, the two channels cited by 9 of the 17 respondents who named a YouTube channel as where they first heard of EA in the 2020 EA Survey
- The CeSIA advised on the production of this video, which had reached nearly five million views as of this writing (i.e. plausibly >10% of the French adult population)
- The CeSIA also gave an interview to the popular scientific-skeptic channel La Tronche en Biais, which has recently expressed interest in posting more on AI safety
- David Louapre, who runs Science Étonnante, one of the most popular French-language science channels, announced just this week that he is pivoting to work as an AI safety researcher at Hugging Face, so it's possible more will come from that direction too
Filmmakers
- Kate Hammond, of https://www.softcutfilms.com/, is making an AI safety documentary
- Elizabeth Cox, of Should We Studios, is producing Ada, about pressing problems
- Other points of connection to Hollywood
Retreats / coordination / upskilling
- Mox populi
- I've heard about two other people who are considering putting on coordination retreats for people interested in this space.
Lots of people are interested in working on this / helping out, e.g.:
- Another ex-big channel producer/scriptwriter might be interested in getting involved
- We know several people from the ex-Veritasium crowd
- Some of the people who worked on Michael Trazzi’s documentary
- People in the “EA Youtuber” WhatsApp group chat
- I know of two videographers interested in x-risk looking for work
- Someone who's done videography on both sides of the camera, has scriptwriting experience, and is poking around the space
- Someone making tools for video creators
- And others!
- I also have a long list of folks from the expression of interest forms I have out
Links where you can find more information
- Where can I find videos about AI safety? (tries to be quite thorough and is a great resource, but didn't satisfy my personal use case for the kind of video content that is becoming more common now, with more urgency and higher production values)
- How cost-effective are AI safety YouTubers? — EA Forum
Reach out if you'd like to be involved, have experience in video, or know people who do!
Matrice Jacobine @ 2025-09-16T14:29 (+12)
There's a decent amount of French-speaking ~AI safety content on YouTube:
- @Shaïman Thürler's channel Le Futurologue
- @Gaetan_Selle 🔷 's channel The Flares
- @len.hoang.lnh's channel Science4All and Thibaut Giraud's channel Monsieur Phi, the two channels cited by 9 of the 17 respondents who named a YouTube channel as where they first heard of EA in the 2020 EA Survey
- The CeSIA advised on the production of this video, which had reached nearly five million views as of this writing (i.e. plausibly >10% of the French adult population)
- The CeSIA also gave an interview to the popular scientific-skeptic channel La Tronche en Biais, which has recently expressed interest in posting more on AI safety
- David Louapre, who runs Science Étonnante, one of the most popular French-language science channels, announced just this week that he is pivoting to work as an AI safety researcher at Hugging Face, so it's possible more will come from that direction too
ChanaMessinger @ 2025-09-16T14:54 (+2)
Oh, nice!
Jian Xin Lim🔹 @ 2025-09-17T17:17 (+4)
On CGP Grey: he has 6.8M YouTube subs and seems to get the alignment concerns. He recently conveyed the alignment risk in this episode of his tech podcast (Revisiting Humans Need Not Apply):
"AI is more like biological weapons because they can act autonomously and evolve beyond what you built. Nuclear bombs don't walk out of factories on their own, pathogens do."
Might be worth someone reaching out about e.g. sponsorship.
Michaël Trazzi @ 2025-09-16T10:36 (+4)
Glad you're working with some of the people I recommended to you. I'm very proud of that SB-1047 documentary team.
I would add Suzy Shepherd, who made Writing Doom, to the list. I believe she will be starting another film relatively soon. I wrote more about her work here.
ChanaMessinger @ 2025-09-16T14:54 (+2)
Thank you!
alex lawsen @ 2025-09-16T06:19 (+4)
FAR AI posts recordings on YouTube of talks from the events they organise.
ChanaMessinger @ 2025-09-16T14:55 (+3)
IMO that's a different category - there's a lot of that kind of thing as well, and I'm glad it exists, but I think it's useful to separate it out.