Our mission

By Jessica P. Wang (Director of SAIGE), April 2026
SAIGE (Safe AI Germany) is a national research and field-building initiative, started in January 2026. We believe the development of advanced AI will likely be one of the most consequential events in history, and Germany has a critical role to play: its world-class talent is vital to the global effort to reduce catastrophic risks from artificial intelligence. SAIGE exists to make that contribution possible. Through our incubator program, resources, professional support, and events, we help German researchers, engineers, and policymakers work on ensuring AI goes well.

Note: SAIGE was entirely self-funded by its director (me) until March 15th, 2026, when we were granted $80k by the AI Safety Tactical Opportunities Fund (AISTOF). We are extremely grateful for this opportunity to get SAIGE's initial journey under way! In the meantime, we are looking for further support to scale our activities. If you like what we have been doing so far and have any funding leads, please contact me at info@safeaigermany.org. If you don't like what we have been doing, or have any feedback, please also contact me at info@safeaigermany.org.

A Summary of SAIGE

We aim to address an urgent inefficiency in the current landscape: the shortage of people from Germany positioned to positively influence the trajectory of advanced AI development. Geopolitically, Germany possesses the political and economic weight to shape the EU AI ecosystem. For example, during the final stages of the EU AI Act, Germany acted as the ultimate swing vote while some major member states pushed back against the provisional agreement, ensuring the successful adoption of the legislation.

On technical talent alone: according to the Federal Statistical Office (Destatis), Germany holds the highest share of STEM Master’s degrees in the EU (35%), significantly outperforming the EU average of 25%. Germany also possesses a world-class engineering sector, with approximately 300,000 students in STEM (source) and over 110,000 students in Law (source) each year. Yet global capacity in technical safety and governance remains critically limited. We see a massive structural bottleneck in the local ecosystem: virtually none of this top-percentile talent is funneled into AGI safety. Instead, this hidden reserve of industrial expertise flows almost exclusively into traditional roles (e.g. mechanical engineering, with 1.3 million employees), simply because people lack the context and infrastructure to apply their skills to AGI safety.

Our mission is to build the centralised infrastructure required to bridge this gap. We are moving beyond volatile student initiatives to create a stable national organisation that supports both students and professionals through:

  • Upskilling: We have launched our inaugural SAIGE incubator program, providing coverage for cities that currently lack local hubs. This gives high-potential students and professionals a clear path into the field. We received 66 mentorship applications for the Spring 2026 cohort but, due to capacity (we only started in January 2026!), could include only 21 of them (acceptance rate ≈ 32%). This was by no means an easy decision. Projects were reviewed with the help of our board advisors, each of whom is an expert within their field, spanning technical and governance research, communication, and field-building.

    Alongside the incubator program, we also organise events such as discussions on AI middle powers, networking meet-ups, and talks from global experts, giving our community up-to-date information and networking opportunities in AI Safety.

  • Career support: For career professionals, we have partnered with High Impact Professionals and Impact Academy to provide networking and career guidance.

In total, we also received 226 mentee applications for the incubator, 142 of which progressed to Stage 2 of the selection process. Mentee selection is still under way, and we look forward to seeing who makes it into the final cohort!

On the career-support side, see our Pivot Track for more details. We are also collaborating with Successif on workshops on how to transition one's career into AI Safety, such as this.


Theory of Change: What is the plan, and how will it lead to our desired outcome?

Currently, the path of least resistance for high-potential German talent leads into standard industry roles. Our theory of change therefore focuses on expanding the AI Safety talent pool by redirecting that flow.

A link to our Theory of Change diagram can be found here. Note that “Sufficient funding” is still pending at the time of writing.

We define a successful “AI Safety role” outcome to include any of the following:

  • Employment: full-time permanent positions, short-term fixed positions, or project-based contractor positions at established labs and organisations (e.g. the MATS fellowship);

  • Entrepreneurial roles: founding new AI Safety initiatives or non-profits;

  • Civic & ecosystem contribution: high-impact pro-bono work such as advising policymakers or giving educational talks.

Note: Since SAIGE is just starting its journey, although we have plenty of activities listed in our Theory of Change, we need to determine which ones to prioritise first, according to our goal. See the planned activities below for more details.

Our Activities

Due to funding constraints, we separate our activities into two phases. Phase I activities, which are relatively low-budget, are already being carried out. Phase II activities mean scaling and institutionalisation, and are contingent on funding.

  • Phase I:
    - The SAIGE incubator program,
    - Pivot Track for career professionals,
    - low-budget online events, and
    - basic infrastructure support for local groups. We are currently supporting new local groups being set up in Frankfurt, Bonn and Nuremberg.

  • Phase II:
    - In-person events/retreats, incl. a national retreat for local group leaders every 6 months, to provide feedback to each other and to SAIGE,
    - SAIGE Day,
    - in-person hackathons (already agreed collaboration with Apart Research),
    - deployment of a centralised tech stack to relieve local organisers of administrative burdens.

Depending on capacity, Phase II could also include events that would likely broaden our outreach but are not currently on our priority list, such as an introductory course run with AIS Collab and timed to fit the German semester dates, and a weekend-intensive program for career professionals to better suit their schedules and capacity for time commitment. These are not listed in Phase I, since the incubator program already aims to include an introductory course, and we do not yet know the exact, quantitative impact of such a program. However, if we see positive results and receive sufficient funding, we will consider these for Phase II as well.

Our Team (in more detail)

See the "Our Team" page for an overview of our core team and board advisors. Below is more detailed information on everyone.

Core Team

  • Jessica P. Wang, Director

    Background:
    Educational background in mathematics. Worked at Epoch AI on the FrontierMath project, helping develop it and later co-organising it: served as Outreach Coordinator sourcing talent for Tier 4, and co-organised the 2025 FrontierMath Symposium, held at Constellation. Top-9 global contributor to Humanity's Last Exam. Previously worked as a reviewer for the $18 million AI for Math Fund at Renaissance Philanthropy, and as a Global Operations Analyst at Calastone, the largest global funds network. Served as the sole official photographer at the International Mathematical Olympiad (1,300+ attendees). Also President of the Durham University Maths Society and Ambassador for the Institute of Physics.

    Responsibilities:
    Oversees the overall progress, design, and execution of activities; communicates with existing and potential collaborators to ensure activities run smoothly. Also responsible for outreach, fundraising, and the entire website.

  • Manon Kempermann, Tech Lead

    Background: Educational background in data science and artificial intelligence. Founder of AI Safety Saarland. Currently writing a thesis at the Max Planck Institute for Software Systems on red-teaming for misalignment in AI agents. A Pathfinder mentor at Kairos. Has organised AI Safety events, including a talk with Anthropic with 300+ attendees. Also works as a research assistant at the Interdisciplinary Institute for Societal Computing. Current research focuses on context-sensitivity in AI safety evaluations. Presented her paper, "Challenges of Evaluating LLM Safety for User Welfare", at IASEAI26 in Paris.

    Responsibilities: Works with the Director on the nationwide rollout of the Interdisciplinary Research Incubator model, adapting the successful AIS Saarland framework for a much broader German context. Oversees the strategic pairing of technical mentors with participants to maximise research output.

  • Jessie Kelly, Governance Lead

    Background: Educational background in law. Designed and implemented realignment programs and national policies for governments, including the Australian Government. Over 15 years' experience helping governments with new programs and policies, including analysis of technological trends. Alongside her work at SAIGE, she is currently working on a project with the UN and a scientific institute to consider what the ground rules for AI Governance in agriculture should be. She has previously worked with Australia's national science agency (CSIRO), the Australian Embassy in Berlin, and the German Red Cross.

    Responsibilities: Oversees and manages the AI Governance track of the SAIGE Research Incubator. Identifies high-quality mentors and helps governance research fellows progress in their projects and careers. Works with the Tech Lead and the Director to ensure the SAIGE incubator runs smoothly.

Board Advisors

Since our core team is relatively new to AI Safety, we are very grateful to have a group of experts across different fields who help us make good judgement calls (including, but not restricted to: reviewing mentorship projects for our incubator, advising on program management, and shaping our leadership structure):

Leadership advisor:
Has regular calls with the Director to provide feedback and to ensure SAIGE's activities are aligned with the broader AI Safety ecosystem. Also makes sure that planned activities are realistic given the core team's capacity.

Operations advisors:
Advise the core team on the practical execution and logistical planning of SAIGE's activities. Provide concrete guidance when operational uncertainties arise, such as determining the optimal format for programs or advising on resource allocation.

Technical advisors:
Advise the core team on the technical direction of SAIGE's initiatives, drawing on years of in-depth experience in AI alignment. Provide expert evaluation of technical project proposals for the incubator to ensure mission alignment, identify the most critical and relevant AI Safety topics for today's ecosystem, and resolve any technical uncertainties the core team encounters.

Governance advisors:
Play a role analogous to that of the technical advisors, but for the governance and technical-governance directions of SAIGE's initiatives.

While we are proud of the traction our Incubator and Pivot Tracks have already achieved (including nearly 300 registrations for our launch event), this is only the beginning of Phase I. The window to positively shape transformative AI is narrow, and leaving Europe's top talent on the sidelines is a systemic failure we can no longer afford. Whether you are interested in exploring AI Safety, a professional looking to pivot your career, an expert willing to mentor the next generation, or a funder ready to help us scale our activities, please join us and/or reach out!
