Launching Foresight Institute’s AI Grant for Underexplored Approaches to AI Safety – Apply for Funding!

By elteerkers, Allison Duettmann @ 2023-08-17T07:27 (+48)

Summary 

We are excited to launch our new grant programme, which will fund areas of AI safety that we consider underexplored. In light of the potential for shorter AGI timelines, we will re-grant $1-1.2 million per year to support much-needed development in the following areas:

  1. Neurotechnology, Whole Brain Emulation, and Lo-fi uploading for AI safety
  2. Cryptography and Security approaches for Infosec and AI security
  3. Safe Multipolar AI scenarios and Multi-Agent games

Apply for funding; applications will be reviewed on a rolling basis.

See below, or visit our website to learn more! 

Areas We're Excited to Fund

Neurotechnology, Whole Brain Emulation, and Lo-fi uploading for AI safety

We are interested in exploring the potential of neurotechnology, particularly Whole Brain Emulation (WBE) and cost-effective lo-fi approaches to uploading. If progress in these areas could be significantly sped up, the resulting re-ordering of technology arrival might reduce the risk of unaligned AGI through the presence of aligned software intelligence.

We are particularly excited by the following: 

Cryptography and Security approaches for Infosec and AI security

We want to explore the potential benefits of cryptography and security technologies in securing AI systems. This includes:

Safe Multipolar AI scenarios and Multi-Agent games

We are interested in exploring the potential of safe Multipolar AI scenarios, such as:

Interested in Applying? 

We look forward to receiving your submissions. Applications will be reviewed on a rolling basis; apply here.

For the initial application, you’ll be required to submit:

We will aim to get back to applicants within 8 weeks of receiving their application. 

If you are interested, please find more information about the grant here.