ARC (Alignment Research Center)
ARC (Alignment Research Center) is an AI safety organization founded by Paul Christiano, who previously led the language model alignment team at OpenAI. ARC develops evaluations of AI model alignment and capabilities, and has conducted evaluations for frontier AI labs to assess whether their models can autonomously perform dangerous tasks.