<aside>
We are seeking graduate students, faculty, and professionals to advise research, engineering, or governance projects in AI safety. If you are interested, please fill out our project proposal form.
</aside>
SPAR is an opportunity to run a research, engineering, or governance project related to AI safety. Since AI safety is a nascent field, many students do not have opportunities at their institutions to contribute to relevant projects. Meanwhile, there are plenty of ideas and research directions that have yet to be fully explored. This program enables teams of people from across the world to collaborate on projects in AI safety. The primary value of this program is to facilitate project formation by centralizing both the project proposal and application process.
As a project mentor, you will direct or advise a team working on a project you propose. SPAR program organizers will promote your project, handle the bulk of the administrative work, and manage the application process, while you will have complete control over who you work with and how you manage your project. For technical projects, we envision CS/ML graduate students and academics, as well as technical staff in the AI industry, as good fits to be mentors. For governance projects, we envision social science graduate students and academics, as well as policy researchers in government, non-profits, and think tanks, as good fits to be mentors. However, we’re willing to accept project proposals from anyone who is legibly qualified to lead the project they suggest.
We expect most projects to last a minimum of 3 months. Your involvement in the project does not have to be intensive; we require a minimum of only 1 hour per week from mentors. You are free to be heavily involved in managing and contributing to the project, or you may take a consultative approach and allow students to manage the project among themselves. Your team can consist of as many or as few students as you’d like.
**Project proposals open `March 28`**
**Project proposals due `April 26`**
**Mentee applications open `April 29`**
**Mentee applications due `May 25`**
**Projects start `June 14`**
---
**Spring Midterm report due `April 12`**
**Spring Midterm feedback report due `April 15`**
**Earliest Spring projects finish `May 20`**
**Spring Final report due `May 31`**
**Poster session (tentative) `June 1`**
**Final feedback report due `June 3`**
We are excited to host a broad range of projects on AI safety. There are a few guidelines we expect projects to follow, listed below. By default, we expect to host and promote every project proposal that is submitted (i.e. there is no “application process” to be a SPAR mentor), but we’ll privately reach out to mentors whose projects do not meet these criteria:
Project topics must address catastrophic risk from AI. They may involve alignment research, governance research, or safety engineering.