Center for AI Safety

The Center for AI Safety (CAIS) is a research and advocacy organization focused on reducing societal-scale risks from artificial intelligence. CAIS conducts technical research on AI safety, provides compute clusters for safety researchers, and published the widely signed 2023 Statement on AI Risk, which argued that mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.