Statement on Superintelligence - FLI Open Letter
By plex @ 2025-10-22T22:27 (+46)
This is a linkpost to https://superintelligence-statement.org/
We call for a prohibition on the development of superintelligence, not lifted before there is
- broad scientific consensus that it will be done safely and controllably, and
- strong public buy-in.
Sign on the main website or Eko[1].
- ^
It looks like the latter has less signature verification and is more privacy-preserving, and its signatures get added to the total figure?
Greg_Colbourn ⏸️ @ 2025-10-23T14:02 (+9)
- There is widespread discontent at the current trajectory of advanced AI development, with only 5% in support of the status quo of fast, unregulated development;
- Almost two-thirds (64%) feel that superhuman AI should not be developed until it is proven safe and controllable, or should never be developed;
- There is overwhelming support (73%) for robust regulation of AI; only 12% oppose strong regulation.
[Source]. I imagine global public opinion is similar. What we need to do now is mobilise a critical mass of that majority. If you agree, please share the global petition far and wide (use this version for people who want their name to be public).