The Survival and Flourishing Fund AI Safety Grant is sponsored by the Survival and Flourishing Fund. It supports organizations working to improve the long-term prospects for humanity, with AI safety as a core focus.
The Survival and Flourishing Fund (SFF) has organized ~$152MM in philanthropic gifts and grants. The S-Process is the algorithm and recurring meeting procedure we follow once or twice per year to produce grant recommendations to our participating Funders.
Speculation Grants are an additional type of grant we make, using an experimental procedure for making decisions more quickly than the S-Process, that is retroactively funded by recommendations from the S-Process. Matching Pledges are commitments made by Funders as part of an S-Process round to match outside donations to a recipient at some rate (e.g. 2-to-1), up to the pledged amount.
The Initiative Committee is a small group, made up of Jaan Tallinn, SFF Advisors, and 2-5 anonymous voters, that makes grants on its own initiative, without receiving or requiring proposals or applications.
SFF grant recommendations are made by a group of independent assessors whose membership changes regularly. The philanthropic priorities of SFF’s Funders play a guiding role (e.g. see Jaan Tallinn’s philanthropic priorities), but assessors are free to support applications outside of these priority areas and will sometimes do so.
SFF-2026 S-Process Grant Round Application Announcement SFF is announcing its 2026 grant round, featuring three new themed S-Process Grant Rounds in addition to the Main Round. We estimate $20MM–$40MM in funding will be distributed across all rounds and tracks.
SFF-2026 Application Announcement
Survival and Flourishing Corp is growing its team to support the continued development and scaling of the grant evaluation software and processes used by SFF. Open positions include: Full-Stack Software Engineer, $250,000-$350,000. All positions are fully remote with competitive benefits including health insurance, 401(k) matching, and 9 weeks of PTO per year. Visit the SFC careers page to learn more about these opportunities and apply.
SFF has finalized its grant recommendations for 2025: SFF-2025 Recommendations Announcement. The total funding expected to be distributed in association with this round is $34.92MM, exceeding our $10MM-$20MM estimate.
SFF’s new application-sharing agreement with Effective Institutions Project SFF is partnering with the Effective Institutions Project (EIP) to increase the funding pool available to SFF applicants by sharing applications to our SFF Funding Rolling Application with EIP. EIP anticipates recommending $5–15 million in grants to SFF applicants in 2025.
Learn more about this partnership here: SFF Application-Sharing Agreement with Effective Institutions Project
SFF-2025 application announcement (round closed): SFF is announcing its 2025 grant round, here: SFF-2025 Application Announcement
SFF’s new Matching Pledge Program: Starting in 2025, SFF will be making some of its S-Process grant recommendations as Matching Pledges through the new Matching Pledge Program.
SFF Matching Pledges are commitments made by Funders of an S-Process round to match outside donations to a recipient at some rate (e.g. 2-to-1), up to the pledged amount. If you would like to apply for a Matching Pledge, you can complete the Matching Pledge portion of the SFF Funding Rolling Application to be considered for the program.
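As an illustration only (not SFF’s actual implementation), the matching rule described above — outside donations matched at a fixed rate, up to the pledged amount — can be sketched as a small function; the names `matched_amount`, `rate`, and `pledge_cap` are hypothetical:

```python
def matched_amount(outside_donations: float, rate: float, pledge_cap: float) -> float:
    """Funder match under a Matching Pledge: outside donations are matched
    at `rate` (e.g. 2.0 for a 2-to-1 pledge), capped at the pledged amount."""
    return min(outside_donations * rate, pledge_cap)

# A 2-to-1 pledge capped at $100,000:
print(matched_amount(30_000, 2.0, 100_000.0))  # $30k of donations -> $60,000.0 match
print(matched_amount(80_000, 2.0, 100_000.0))  # $80k of donations hits the $100,000.0 cap
```

The cap means that once outside donations reach pledge_cap / rate, further donations draw no additional match.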
Job opening: Full-Stack Software Engineer (S-Process). Annual compensation: $250,000-$350,000.
SFF-2024 Flexible Hardware-Enabled Guarantees (flexHEGs) recommendations: SFF has finalized its grant recommendations for the 2024 Flexible Hardware-Enabled Guarantees (flexHEGs) round: SFF-2024 flexHEGs Recommendations Announcement. The total funding expected to be distributed in association with this round is $4.1MM, in excess of our $1MM-$4MM estimate.
SFF has finalized its grant recommendations for 2024: SFF-2024 Recommendations Announcement. The total funding expected to be distributed in association with this round is $19.86MM, in excess of our $5MM-$15MM estimate.
SFF-2024 Flexible Hardware-Enabled Guarantees (flexHEGs) application announcement (round closed): SFF is launching an S-Process round that specifically targets proposals aiming to advance the technical maturity of flexHEGs, here: SFF-2024 flexHEGs Grant Round Application Announcement
SFF-2024 application announcement (round closed): SFF is announcing its 2024 grant round, here: SFF-2024 Application Announcement
SFF’s new Initiative Committee: In 2024, in parallel with at least one S-Process grant round, SFF will also be making some grants through another process called the SFF Initiative Committee.
The Initiative Committee makes grants on its own initiative, without receiving or requiring proposals or applications. If you would like to propose a grant for SFF to make, the S-Process combined with Speculation Grants is probably still the most efficient route.
SFF-2023-H2 recommendations SFF has finalized its grant recommendations for 2023-H2: SFF-2023-H2 Recommendations Announcement The total funding expected to be distributed in association with this round is $21.29MM, in excess of our $9MM-$21MM estimate. This total includes $9.62MM in grants organized by Lightspeed Grants, which Funder Jaan Tallinn requested to incorporate in our announcement.
SFF-2023-H2 application announcement (round closed) SFF is announcing its 2023-H2 grant round, here: SFF-2023-H2 Application Announcement SFF-2023-H1 recommendations SFF has finalized its grant recommendations for 2023-H1: SFF-2023-H1 Recommendations Announcement The total funding distributed was $21MM, higher than our $10MM estimate.
SFF-2023-H1 application announcement (round closed) SFF is announcing its 2023-H1 grant round, here: SFF-2023-H1 Application Announcement SFF-2022-H2 recommendations SFF has finalized its grant recommendations for 2022-H2: SFF-2022-H2 Recommendations Announcement The total funding distributed was $10MM, higher than our $8MM estimate.
SFF-2022-H2 application announcement (round closed) SFF is announcing its 2022-H2 grant round, here: SFF-2022-H2 Application Announcement SFF-2022-H1 recommendations SFF has finalized its grant recommendations for 2022-H1: SFF-2022-H1 Recommendations Announcement The total funding distributed was $8.063MM, near the middle of our $5-$10MM estimate.
SFF-2022-H1 application announcement (round closed) SFF is announcing its 2022-H1 grant round, here: SFF-2022-H1 Application Announcement SFF-2021-H2 recommendations SFF has finalized its grant recommendations for 2021-H2: SFF-2021-H2 Recommendations Announcement The total funding distributed was $9.609MM, near the middle of our expected $8-12MM estimate.
SFF-2021-H2 application announcement (round closed) SFF’s 2021-H2 grant round announcement can be found here: SFF-2021-H2 Application Announcement SFF-2021-H1 recommendations SFF has finalized its grant recommendations for 2021-H1: SFF-2021-H1 S-Process Recommendations Announcement The total funding distributed was $9.756MM, at the high end of our expected $9-10MM estimate.
SFF-2021-H1 application announcement (round closed) SFF’s 2021-H1 grant round announcement can be found here: SFF-2021-H1 Application Announcement SFF-2020-H2 recommendations SFF has finalized its grant recommendations for 2020-H2: SFF-2020-H2 Recommendations Announcement The total funding distributed was $3.625MM, above the high end of our expected $2.5-$3MM estimate.
SFF-2020-H2 application announcement (round closed) SFF’s 2020-H2 grant round announcement can be found here: SFF-2020-H2 Application Announcement SFF-2020-H1 recommendations SFF has finalized its grant decisions for 2020-H1: SFF-2020-H1 S-Process Recommendations Announcement The total funding distributed was $1.82MM, above the high end of our expected $0.8MM-$1.5MM estimate.
SFF-2019-Q4 recommendations SFF has finalized most of its grant decisions for 2019-Q4: SFF-2019-Q4 S-Process Recommendations Announcement The total funding distributed was $2.01MM, at the high end of our expected $1MM-$2MM estimate.
Survival and Flourishing Fund is a “virtual fund”: we organize application submission and evaluation processes to help donors decide where to make donations. Our goal is to bring financial support to organizations working to improve humanity’s long-term prospects for survival and flourishing. We use this website to host announcements about our plans to investigate grant-making opportunities.
SFF was initially funded in 2019 by a grant of approximately $2 million from the Organizational Grants Program of the Berkeley Existential Risk Initiative (BERI), which in turn was funded by donations from philanthropist Jaan Tallinn. We maintain a DAF at the Silicon Valley Community Foundation under the same name (SFF), and we also occasionally process grant recommendations through other DAFs.
Initiative Committee grants and occasionally other grants are processed through SFC.
Andrew Critch is CEO and Co-founder of HealthcareAgents, a company that advocates for early patient access to AI-enhanced healthcare services. Prior to founding HealthcareAgents, he worked as a full-time research scientist at UC Berkeley within CHAI, where he retains a part-time appointment.
Andrew also co-founded BERI, where he now volunteers as President, established SFF with the support of Jaan Tallinn, and co-developed the S-Process for philanthropic grant-making. Andrew earned his Ph.D. in mathematics at UC Berkeley, studying applications of algebraic geometry to machine learning models. He was offered university faculty and research positions in mathematics, mathematical biosciences, and philosophy, cofounded CFAR and SPARC, worked as an algorithmic stock trader at Jane Street Capital, and worked as a Research Fellow at MIRI.
His current research interests include logical uncertainty, open source game theory, "boundary theory", and avoiding arms race dynamics between nations and companies in AI development.
Eric is a software engineer and technical leader who led development of CoinList's cryptocurrency exchange and derivatives platform, processing billions in trading volume. Previously, he was a founding engineer at EverMarkets (acquired by CoinList), co-founder of Arbital.com (a platform for collaborative knowledge-sharing developed partly to address reasoning about existential risk), and did engineering at Amazon and Microsoft. Eric holds a BSE in Electrical Engineering and Computer Science from Duke University.
In 2026, he will join the Anthropic Fellows Program for AI Safety Research.
Ethan is President & CEO of Survival and Flourishing Corp (SFC), where he oversees software development, philanthropy, and operations supporting the Survival and Flourishing Fund and its S-Process grant-making rounds. He joined SFC as a Product Manager in 2023, led the development of OpenLetter.net, and became CEO later that year. Previously, Ethan worked as a software engineer at TuneIn, Trifacta, and Lyft in the San Francisco Bay Area, and volunteered at CFAR. He holds a BA and MA in Linguistics from the University of Arizona.
Recent Grant Recommendations Below is a list of grants we have recommended using the S-Process and the Initiative Committee. Recommendations made after the final listed date might not appear yet.
Round | Funder | Grant | Amount | Recipient Organization | Purpose
SFF-2025 | Jaan Tallinn | Agent Foundations Field Network (AFFINE) [Algorithm Design] | $79,000 | Ashgro, Inc. | General support of Algorithm Design
SFF-2025 | Jaan Tallinn | Agent Foundations Field Network (AFFINE) [Technical Research] | $165,000 | Ashgro, Inc. | General support of Technical Research
SFF-2025 | Jaan Tallinn | AI & Democracy Foundation | $195,000 | Thoughtful Tech Project, Inc. | General support
SFF-2025 | Jaan Tallinn | AI Futures Project | $1,535,000 +$500,000‡ | AI Futures Project | General support
SFF-2025 | Jaan Tallinn | AI Lab Watch | $371,000 | Lightcone Infrastructure, Inc. | General support of AI Lab Watch
SFF-2025 | Jaan Tallinn | AI Policy Institute (AIPI) | $1,635,000 | The Hack Foundation | General support of AI Policy Institute
SFF-2025 | Jaan Tallinn | AI Safety Camp | $90,000 +$110,000‡ | Ashgro, Inc. | General support of AI Safety Camp
SFF-2025 | Jaan Tallinn | AI Standards Lab and Holtman Systems Research | $228,000 +$100,000‡ | Players Philanthropy Fund, Inc. | General support of AI Standards Lab and Holtman Systems Research
SFF-2025 | Jaan Tallinn | Alignment Ecosystem Development | $91,000 | Ashgro, Inc. | General support of Alignment Ecosystem Development
SFF-2025 | Jaan Tallinn | Alignment in Complex Systems Research Group (ACS Research) | $400,000 +$306,000‡ | Epistea, z. s. | General support of Alignment in Complex Systems Research Group
SFF-2025 | Jaan Tallinn | Alliance to Feed the Earth in Disasters (ALLFED) | $30,000 | ALLFED Institute | General support
SFF-2025 | Jaan Tallinn | Amplifying AI Safety | $30,000 | Epistea, z. s. | General support of Amplifying AI Safety
SFF-2025 | Jaan Tallinn | Association for Long Term Existence and Resilience (ALTER) | $60,000 +$22,000‡ | Association for Long Term Existence and Resilience | General support
SFF-2025 | Jaan Tallinn | Augur | $160,000 | Augur LLC | General support
SFF-2025 | Jaan Tallinn | BERI-CLTC Collaboration | $141,000 | Berkeley Existential Risk Initiative | General support of BERI-CLTC Collaboration
SFF-2025 | Jaan Tallinn | Catalyze Impact | $74,000 | Ashgro, Inc. | General support of Catalyze Impact
SFF-2025 | Jaan Tallinn | Center for AI Safety (CAIS) | $289,000 | Center for AI Safety, Inc. | General support
SFF-2025 | Jaan Tallinn | Center for AI Safety Action Fund (CAIS AF) | $772,000 | Center for AI Safety Action Fund, Inc. | General support
SFF-2025 | Jaan Tallinn | Center for Humane Technology (CHT) | $468,000 | Center for Humane Technology | General support
SFF-2025 | Jaan Tallinn | Center on Long-Term Risk (CLR) | $200,000 | Center on Long-Term Risk | General support
SFF-2025 | Jaan Tallinn | Centre for AI Security and Access (CASA) | $325,000 | Rethink Priorities | General support of Centre for AI Security and Access
SFF-2025 | Jaan Tallinn | Centre for Effective Altruism (CEA) | $117,000 | Effective Ventures Foundation USA, Inc. | General support of Centre for Effective Altruism
SFF-2025 | Jaan Tallinn | Centre for Long-Term Resilience (CLTR) | $527,000 +$38,000‡ | Founders Pledge, Inc. | General support of Centre for Long-Term Resilience
SFF-2025 | Jaan Tallinn | Centre for the Governance of AI (GovAI) | $60,000 +$696,000‡ | Centre for the Governance of AI, Inc. | General support
SFF-2025 | Jaan Tallinn | Civic AI Security Program (CivAI) | $600,000 | Civic AI Security Program, Inc. | General support
SFF-2025 | Jaan Tallinn | Compassion in Machine Learning (CaML) | $20,000 +$63,000‡ | Players Philanthropy Fund, Inc. | General support of Compassion in Machine Learning
SFF-2025 | Jaan Tallinn | Computational Rational Agents Laboratory (CORAL) | $140,000 | Ashgro, Inc. | General support of Computational Rational Agents Laboratory
SFF-2025 | Jaan Tallinn | Cornell University [Angelina Wang’s Responsible AI Lab] | $170,000 | Cornell University | General support of Angelina Wang’s Responsible AI Lab
SFF-2025 | Jaan Tallinn | David Lorell | $230,000 | Lightcone Infrastructure, Inc. | General support of research led by David Lorell
SFF-2025 | Jaan Tallinn | Effective Institutions Project (EIP) | $20,000 +$358,000‡ | Effective Institutions Project, Inc. | General support
SFF-2025 | Jaan Tallinn | Eisenstat Research Program | $593,000 | Machine Intelligence Research Institute (1/2) & Ashgro, Inc. (1/2) § | General support of Eisenstat Research Program
SFF-2025 | Jaan Tallinn | Encode AI Corporation | $516,000 | Encode AI Corporation | General support
SFF-2025 | Jaan Tallinn | Epistemic Garden | $100,000 | Sentinel Research | General support of Epistemic Garden
SFF-2025 | Jaan Tallinn | EthicsNet Creed.Space | $80,000 | Players Philanthropy Fund, Inc. | General support of EthicsNet Creed.Space
SFF-2025 | Jaan Tallinn | FABRIC | $10,000 +$75,000‡ | FABRIC Labs, z. s. | General support
SFF-2025 | Jaan Tallinn | FAR AI | $200,000 +$719,000‡ | FAR AI, Inc. | General support
SFF-2025 | Jaan Tallinn | Flourishing Future Foundation (FFF) | $30,000 +$270,000‡ | Flourishing Future Foundation | General support
SFF-2025 | Jaan Tallinn | Foresight Institute | $175,000 | The Foresight Institute | General support
SFF-2025 | Jaan Tallinn | Forethought Research | $103,000 | Forethought Research | General support
SFF-2025 | Jaan Tallinn | Foundation for American Innovation (FAI) | $777,000 | Foundation for American Innovation | General support
SFF-2025 | Jaan Tallinn | ILINA Program | $369,000 | Berkeley Existential Risk Initiative | General support of ILINA Program
SFF-2025 | Jaan Tallinn | James Payor | $202,000 | Good Forever Foundation | General support of research led by James Payor
SFF-2025 | Jaan Tallinn | John Wentworth | $258,000 | Lightcone Infrastructure, Inc. | General support of research led by John Wentworth
SFF-2025 | Jaan Tallinn | Legal Advocates for Safe Science and Technology (LASST) | $100,000 +$50,000‡ | Legal Advocates for Safe Science and Technology, Inc. | General support
SFF-2025 | Jaan Tallinn | Lightcone Infrastructure | $661,000 +$650,000‡ | Lightcone Infrastructure, Inc. | General support
SFF-2025 | Jaan Tallinn | Live Theory | $58,000 +$70,000‡ | Epistea, z. s. | General support of Live Theory
SFF-2025 | Jaan Tallinn | Machine Intelligence Research Institute (MIRI) | $1,607,000‡ | Machine Intelligence Research Institute | General support
SFF-2025 | Jaan Tallinn | Macrostrategy Research Initiative (MRI) | $60,000 | Macrostrategy Research Initiative Limited | General support
SFF-2025 | Jaan Tallinn | Mathematical Metaphysics Institute | $85,000 | Mathematical Metaphysics Institute | General support
SFF-2025 | Jaan Tallinn | Meaning Alignment Institute | $149,000 | The Hack Foundation | General support of Meaning Alignment Institute
SFF-2025 | Jaan Tallinn | Medronho | $564,000 | Ashgro, Inc. | General support of Medronho
SFF-2025 | Jaan Tallinn | Metaculus | $750,000 | Metaculus, Inc. | General support
SFF-2025 | Jaan Tallinn | Mindstream Project | $100,000 | Mindstream Project, Ltd. | General support
SFF-2025 | Jaan Tallinn | Missing Measures | $338,000 | Lightcone Infrastructure, Inc. | General support of Missing Measures
SFF-2025 | Jaan Tallinn | ML Alignment & Theory Scholars Research (MATS Research) | $289,000 | MATS Research, Inc. | General support
SFF-2025 | Jaan Tallinn | Model Evaluation & Threat Research (METR) | $120,000 +$428,000‡ | Model Evaluation and Threat Research, Inc. | General support
SFF-2025 | Jaan Tallinn | Modeling Cooperation | $40,000 +$26,000‡ | Convergence Analysis | General support of Modeling Cooperation
SFF-2025 | Jaan Tallinn | OAISIS | $251,000 | Whistleblower Netzwerk, e. V. | General support of OAISIS
SFF-2025 | Jaan Tallinn | Odyssean Institute | $79,000 | Odyssean Institute | General support
SFF-2025 | Jaan Tallinn | Ovelle Bio | $667,000 | Ovelle Bio Corp. | General support
SFF-2025 | Jaan Tallinn | Oxford China Policy Lab | $480,000 +$239,000‡ | Berkeley Existential Risk Initiative | General support of Oxford China Policy Lab
SFF-2025 | Jaan Tallinn | Oxford Martin AI Governance Initiative | $311,000 | University of Oxford Development Trust Fund | General support of Oxford Martin AI Governance Initiative
SFF-2025 | Jaan Tallinn | Palisade Research | $10,000 +$1,123,000‡ | Palisade Research | General support
SFF-2025 | Jaan Tallinn | Panoplia Laboratories | $60,000 | Panoplia Laboratories, Inc. | General support
SFF-2025 | Jaan Tallinn | Plurality Institute | $100,000 | Plurality Institute | General support
SFF-2025 | Jaan Tallinn | Principles of Intelligent Behavior in Biological and Social Systems (PIBBSS) | $20,000 +$994,000‡ | Principles of Intelligence | General support
SFF-2025 | Jaan Tallinn | RAND Corporation [Technology and Security Policy Center] | $1,000,000 +$22,000‡ | RAND Corporation | General support of Technology and Security Policy Center
SFF-2025 | Jaan Tallinn | Rethink Priorities (RP) [AI Strategy Team] | $28,000 | Rethink Priorities | General support of AI Strategy Team
SFF-2025 | Jaan Tallinn | Rethink Priorities (RP) [Worldview Investigation Team] | $166,000 | Rethink Priorities | General support of Worldview Investigation Team
SFF-2025 | Jaan Tallinn | SaferAI | $200,000 +$111,000‡ | SaferAI | General support
SFF-2025 | Jaan Tallinn | Sage Future | $170,000 | Sage Future, Inc. | General support
SFF-2025 | Jaan Tallinn | Secure AI Project | $383,000 | Secure AI Project, Inc. | General support
SFF-2025 | Jaan Tallinn | SecureBio | $754,000 | SecureBio | General support
SFF-2025 | Jaan Tallinn | SecureDNA | $1,500,000 | SecureBio | General support of SecureDNA
SFF-2025 | Jaan Tallinn | Seldon Labs | $53,000 | Seldon Labs, PBC | General support
SFF-2025 | Jaan Tallinn | Simon Institute for Longterm Governance | $400,000 | Simon Institute for Longterm Governance | General support
SFF-2025 | Jaan Tallinn | Singapore AI Safety Hub | $300,000 +$36,000‡ | Impact Academy Limited | General support of Singapore AI Safety Hub
SFF-2025 | Jaan Tallinn | Stichting Legal Safety Lab | $101,000 | Stichting Legal Safety Lab | General support
SFF-2025 | Jaan Tallinn | Tarbell Center for AI Journalism | $583,000 +$200,000‡ | Tarbell Center for AI Journalism, Inc. | General support
SFF-2025 | Jaan Tallinn | The Future Society (TFS) | $336,000 | The Future Society, Inc. | General support
SFF-2025 | Jaan Tallinn | The Institute for AI Policy and Strategy (IAPS) | $272,000 +$300,000‡ | Rethink Priorities | General support of The Institute for AI Policy and Strategy
SFF-2025 | Jaan Tallinn | The Millennium Project | $15,950 | The Millennium Project | General support
SFF-2025 | Jaan Tallinn | The Vitalism Charity Project | $220,000 | Less Death, Inc. | General support of The Vitalism Charity Project
SFF-2025 | Jaan Tallinn | Timaeus Research | $70,000 +$206,000‡ | Timaeus Research Inc. | General support
SFF-2025 | Jaan Tallinn | Topos Institute | $200,000 +$20,000‡ | Topos Institute | General support
SFF-2025 | Jaan Tallinn | Ulyssean PBC | $186,000 | Ulyssean PBC | General support
SFF-2025 | Jaan Tallinn | University of Toronto & University of Michigan [Toronto and Michigan NLP Group for AI Safety] | $50,000 +$51,000‡ | The Governing Council of the University of Toronto (1/2) & University of Michigan (1/2) § | General support of Toronto and Michigan NLP Group for AI Safety
SFF-2025 | Jaan Tallinn | Zig Zag Technology | $30,000 | Zig Zag Technology | General support
Jaan Tallinn and Blake Borgeson Center for AI Safety, Inc.
Jaan Tallinn and Blake Borgeson Earendil [Security Layers Project] General support of Security Layers Project
Jaan Tallinn and Blake Borgeson Earendil [Standards Infrastructure Project] General support of Standards Infrastructure Project
MILA - Institut quebecois d’intelligence artificielle
Jaan Tallinn and Blake Borgeson University of Southern California University of Southern California
Worcester Polytechnic Institute & University of Massachusetts Amherst Worcester Polytechnic Institute (2/3) & University of Massachusetts Amherst (1/3)
Players Philanthropy Fund General support of AGI Inherent Non-Safety
General support of AI Futures Project
Centre For Effective Altruism Usa Inc. General support of AI Risk Mitigation Fund
General support of AI Safety Info
AI Standards Lab and Holtman Systems Research Players Philanthropy Fund General support of AI Standards Lab and Holtman Systems Research
AI: Futures and Responsibility Programme General support of AI: Futures and Responsibility Programme
Alignment of Complex Systems Research Group (ACS) General support of Alignment of Complex Systems Research Group (ACS)
Alignment Research Center Alignment Research Center
Alliance to Feed the Earth in Disasters (ALLFED)
General support of Apart Research
General support of Apollo Research
Balsa Policy Institute Inc. Balsa Policy Institute Inc.
Berkeley Existential Risk Initiative General support of BERI-DMIP Collaboration
Berkeley Existential Risk Initiative Berkeley Existential Risk Initiative
Brown University AI Governance Lab General support of Brown University AI Governance Lab
Center for AI Policy, Inc.
Center for AI Safety Action Fund, Inc. Center for AI Safety Action Fund, Inc.
Center for AI Safety, Inc. Center for AI Safety, Inc.
Center for Law and AI Risk General support of Center for Law and AI Risk
General support of Convergence: AI Clarity
General support of Decode Research
Effective Institutions Project Effective Institutions Project Inc.
Eisenstat Research Directions Machine Intelligence Research Institute Inc. & Ashgro Inc.‡ General support of Eisenstat Research Directions
General support of Encode Justice
Foundation for American Innovation General support of Foundation for American Innovation
Global Catastrophic Risk Institute Social and Environmental Entrepreneurs General support of Global Catastrophic Risk Institute
Social and Environmental Entrepreneurs General support of Global Shield
Good Ancestors Policy Ltd.
The Trustees of Columbia University in the City of New York General support of History Lab
Stiftelsen Impact Academy General support of Impact Academy Limited
Institute for AI Policy and Strategy General support of Institute for AI Policy and Strategy
Langsikt - Centre for Long-Term Policy Senter for langsiktig politikk AS General support of Langsikt - Centre for Long-Term Policy
Legal Advocates for Safe Science and Technology, Inc. Legal Advocates for Safe Science and Technology, Inc.
Legal Advocates for Safe Science and Technology Inc. General support of Legal Safety Lab
Lightcone Infrastructure Inc. Lightcone Infrastructure Inc.
Longview Philanthropy USA Inc.
Machine Intelligence and Normative Theory lab, ANU Australian National University General support of Machine Intelligence and Normative Theory lab, ANU
Machine Intelligence Research Institute Machine Intelligence Research Institute Inc.
Machine Learning for Socio-technical Systems Lab The University of Rhode Island Foundation & Alumni Engagement General support of Machine Learning for Socio-technical Systems Lab
Macrostrategy Research Initiative Lightcone Infrastructure Inc. General support of Macrostrategy Research Initiative
Mathematical Metaphysics Institute Mathematical Metaphysics Institute
Meaning Alignment Institute General support of Meaning Alignment Institute
Model Evaluation and Threat Research Model Evaluation and Threat Research, Inc.
Science and Technology Futures, Inc. General support of MSEP Project
Berkeley Existential Risk Initiative General support of Oxford China Policy Lab
American Governance Foundation Inc. General support of Palladium Magazine
Psychosecurity Ethics @ EURAIO Players Philanthropy Fund General support of Psychosecurity Ethics @ EURAIO
Regents of the University of California at Berkeley Regents of the University of California at Berkeley General support of Prof. Emma Pierson
General support of Safe AI Forum
General support of SecureDNA
General support of Sentinel
Simon Institute for Longterm Governance Simon Institute for Longterm Governance
General support of Simplex
Players Philanthropy Fund General support of Tarbell Fellowship
Technology Strategy Roleplay Technology Strategy Roleplay
General support of The AI Policy Institute
The Centre for Long-Term Resilience (CLTR) General support of The Centre for Long-Term Resilience (CLTR)
The Quantified Uncertainty Research Institute General support of The Quantified Uncertainty Research Institute
General support of Timaeus
The Vanderbilt University
Initiative Committee 2024 Center for Applied Rationality General support of flexHEG prototyping
Initiative Committee 2024 Economics of Transformative AI Gift in support of Erik Brynjolfsson’s work at Stanford
Initiative Committee 2024 General support of Rootclaim
Initiative Committee 2024 General support of Our World in Data
Initiative Committee 2024 International Dialogues on AI Safety (IDAIS) Safe Artificial Intelligence Forum Institute General support of International Dialogues on AI Safety (IDAIS)
Initiative Committee 2024
Initiative Committee 2024
Initiative Committee 2024 Center for AI Safety, Inc. To support research and analysis of relevance to the Bureau of Industry and Security (BIS).
Initiative Committee 2024 Center for Applied Rationality General support of flexHEG prototyping
Initiative Committee 2024 General support of the AI Policy Institute
Initiative Committee 2024 HitRecord AI Safety Project LLC
Initiative Committee 2024 Institute for Security and Technology Institute for Security and Technology
Initiative Committee 2024
Initiative Committee 2024
Initiative Committee 2024 Collective Intelligence Project Collective Intelligence Project Inc.
Initiative Committee 2024 Scholarship top-up funding to support scholarship on international governance of technologies with catastrophic potential, including AI
Initiative Committee 2024 Center for Applied Rationality General support of flexHEG prototyping
Initiative Committee 2024 Center for Applied Rationality General support of Lightcone Infrastructure
Initiative Committee 2024 Signal Technology Foundation
Initiative Committee 2024 To support work on prototyping flexible hardware-enabled [chip] governors, or “FlexHEGs”
Initiative Committee 2024 General support of Safe AI for Humanity, a portion of which may be allocated for related overhead costs.
Initiative Committee 2024 Center for AI Safety, Inc.
Initiative Committee 2024 Collective Intelligence Project General support of CIP’s Global Representation/Constitutions Project
Initiative Committee 2024 Center for AI Safety, Action Fund Center for AI Safety Action Fund, Inc.
Initiative Committee 2024 HitRecord AI Safety Project LLC
Initiative Committee 2024
Initiative Committee 2024 Effective Ventures Foundation USA, Inc. General support of 80,000 hours
Initiative Committee 2024 Association for Computing Machinery General support of ACM FAccT
Effective Ventures Foundation (UK) General Support of 80,000 Hours
Machine Intelligence Research Institute General Support of AI Impacts
Alignment of Complex Systems Research Group (ACS)
Association for Long Term Existence and Resilience (ALTER) Association for Long Term Existence and Resilience (ALTER)
Berkeley Existential Risk Initiative General Support of BERI-SRL Collaboration
Berkeley Existential Risk Initiative Berkeley Existential Risk Initiative
Center for AI Safety Action Fund Center for AI Safety, Inc. Center for Artificial Intelligence Safety, Inc.
Centre for the Governance of AI (GovAI) Effective Ventures Foundation
Community Health and Special Projects team at the Centre for Effective Altruism Effective Ventures Foundation USA, Inc. General Support of Community Health and Special Projects team at the Centre for Effective Altruism
Center for Applied Rationality General Support of Coordination Project
Effective Altruism Sweden Rationality Research Project Effektiv Altruism Sverige (Effective Altruism Sweden) General Support of Effective Altruism Sweden Rationality Research Project
Effektiv Altruism Sverige
Foundation for American Innovation Lincoln Network (dba The Foundation for American Innovation)
Center for Applied Rationality General Support of Lightcone Infrastructure
Effective Ventures Foundation USA General Support of Long-Term Future Fund
Massachusetts Institute of Technology (Tegmark group) Massachusetts Institute of Technology
Berkeley Existential Risk Initiative General Support of MATS London Ltd
Online Team at the Centre for Effective Altruism Effective Ventures Foundation USA, Inc. General Support of the Online Team at the Centre for Effective Altruism
General Support of Orthogonal
Berkeley Existential Risk Initiative General Support of Oxford China Policy Lab
Principles of Intelligent Behavior in Biological and Social Systems (PIBBSS) General Support of PIBBSS
SERI ML Alignment Theory Scholars Program Berkeley Existential Risk Initiative General Support of SERI ML Alignment Theory Scholars Program
Center for Applied Rationality General Support of SLT Summit organizers
Technologies for Pandemic Defense General Support of Technologies for Pandemic Defense
The AI Governance & Strategy team within Rethink Priorities General Support of The AI Governance & Strategy team within Rethink Priorities
The Australian Responsible Autonomous Agents Group Federation University Australia General Support of the Australian Responsible Autonomous Agents Group
The Center for Election Science The Center for Election Science
The Centre for Long-Term Resilience (Alpenglow Group Limited) General Support of The Centre for Long-Term Resilience (Alpenglow Group Limited)
The Events team at the Centre for Effective Altruism Effective Ventures Foundation USA, Inc. General Support for The Events team at the Centre for Effective Altruism
The Intrinsic Perspective General Support of The Intrinsic Perspective
The Rethink Priorities Existential Security Team (XST) General Support of The Rethink Priorities Existential Security Team (XST)
General Support of Timaeus
Effective Ventures Foundation General Support of Wytham Abbey
General Support of AI Objectives Institute
Alignment Research Center (Evals Team) Alignment Research Center General Support of Alignment Research Center (Evals Team)
Alliance to Feed the Earth in Disasters (ALLFED) Alliance to Feed the Earth in Disasters (ALLFED)
Players Philanthropy Fund General Support of Arkose
Berkeley Existential Risk Initiative General Support of BERI-CHAI Collaboration
Berkeley Existential Risk Initiative General Support of BERI-CLTC Collaboration
Center for AI Safety, Inc. General Support of Center for AI Safety
Center for Strategic and International Studies (AI Governance Project) Center for Strategic and International Studies General Support of Center for Strategic and International Studies (AI Governance Project)
Centre for the Governance of AI (GovAI) Effective Ventures Foundation General Support of Centre for the Governance of AI (GovAI)
Center for Innovative Governance (d/b/a Charter Cities Institute) General Support of Center for Innovative Governance (d/b/a Charter Cities Institute)
Hansjorg Wyss Institute For Biologically Inspired Engineering General Support of Church Lab Meiosis Team
AI Safety Support Ltd - Equivalency Determination General Support of Holtman Systems Research
Institute for Advanced Consciousness Studies (IACS) Institute for Advanced Consciousness Studies (IACS)
Center for Applied Rationality General Support of Lightcone Infrastructure
General Support of Manifold Markets
Effective Ventures Foundation General Support of Project Solve
Redwood Research Group Inc. Redwood Research Group Inc.
Senter for langsiktig politikk/Centre for Long-Term Policy General Support of Senter for langsiktig politikk/Centre for Long-Term Policy
The Collective Intelligence Project The Collective Intelligence Project
Open Collective Foundation General Support of The Unjournal
University of Chicago Existential Risk Laboratory The University of Chicago General Support of University of Chicago Existential Risk Laboratory
University of Louisville (Dr. Roman Yampolskiy’s Research Group (Cybersecurity Lab)) University of Louisville Foundation, Inc. General Support of University of Louisville (Dr. Roman Yampolskiy’s Research Group (Cybersecurity Lab))
Machine Intelligence Research Institute General Support of AI Impacts
Alliance to Feed the Earth in Disasters (ALLFED) Alliance to Feed the Earth in Disasters (ALLFED)
Technical Alignment Impossibility Proofs Ronin Institute for Independent Scholarship Incorporated General Support of Technical Alignment Impossibility Proofs
Alignment Research Center Alignment Research Center
Berkeley Existential Risk Initiative General Support of BERI-ALL Collaboration
Berkeley Existential Risk Initiative General Support of BERI-CHAI collaboration
Constructive Dialogue Institute (formerly called OpenMind Platform) Constructive Dialogue Institute Inc.
Centre for Enabling EA Learning & Research (CEEALAR) [formerly the EA Hotel] Centre for Enabling EA Learning & Research Center for Applied Rationality (CFAR) Center for Applied Rationality (CFAR) The Center for Election Science The Center for Election Science Fund for Alignment Research (FAR) Center for Applied Rationality General Support of Lightcone Infrastructure Effective Ventures Foundation General Support of Effective Ventures Foundation Elizabeth Van Nostrand c/o Lightcone Center for Applied Rationality General Support of Elizabeth Van Nostrand c/o Lightcone General Support of Modeling Cooperation Berkeley Existential Risk Initiative General Support of Oxford China Policy Lab General Support of Pronatalist.
org Center for Applied Rationality General Support of Rationality Meetups Players Philanthropy Fund (PPF) General Support of Scholarship Workshop The Benjamin Franklin Society Library Inc. General Support of The Society Library University of Wisconsin - Madison University of Wisconsin Foundation General Support of University of Wisconsin - Madison Unite America Institute Inc.
Based on current listing details, eligibility includes nonprofit organizations and research institutions. Applicants should confirm final requirements in the official notice before submission.
Current published award information indicates that funding amounts vary based on project scope and sponsor guidance. Always verify allowable costs, matching requirements, and funding caps directly in the sponsor documentation.
This opportunity uses rolling deadlines or periodic funding windows rather than a single fixed target date. Build your timeline backwards from the relevant window to cover registrations, approvals, attachments, and final submission checks.
Federal grant success rates typically range from 10-30%, varying by agency and program. Build a strong proposal with clear objectives, measurable outcomes, and a well-justified budget to improve your chances.
Requirements vary by sponsor, but typically include a project narrative, budget justification, organizational capability statement, and key personnel CVs. Check the official notice for the complete list of required attachments.
Yes — AI tools like Granted can help research funders, draft proposal sections, and check compliance. However, always review and customize AI-generated content to reflect your organization's unique strengths and the specific requirements of the solicitation.
Review timelines vary by funder. Federal agencies typically take 3-6 months from submission to award notification. Foundation grants may be faster, often 1-3 months. Check the program's timeline in the official solicitation for specific dates.
Many federal programs offer multi-year funding or allow competitive renewals. Check the official solicitation for continuation and renewal policies. Non-competing continuation applications are common for multi-year awards.
The Survival and Flourishing Fund (SFF) supports organizations working on the long-term survival and flourishing of sentient life, with a strong focus on AI safety, AI governance, biosecurity, and institutional resilience. Founded by Jaan Tallinn (co-founder of Skype), SFF has distributed approximately $152 million since 2019, with $34.9 million in 2025 alone. The 2026 round is estimated at $20–40 million. SFF uses the S-Process evaluation method involving multiple independent assessors and offers three funding mechanisms: Speculation Grants (rolling basis, smaller amounts for quick-turnaround projects), full S-Process grants (annual competitive round), and Initiative Committee funding. Individual grants range from $10,000 to $4,000,000. Assessors may fund proposals across AI safety, pandemic preparedness, governance, and other existential risk areas.
SFF-2026 S-Process Grant Round is a large philanthropic grant program from the Survival and Flourishing Fund (SFF), organized in collaboration with Jaan Tallinn, that distributes $20 million to $40 million or more across a Main Round and three Theme Rounds focused on Climate Change, Animal Welfare, and Human Self-Enhancement and Empowerment. The Main Round includes three tracks — Main, Freedom, and Fairness — with a combined $14–28 million, while Theme Rounds provide $2–4 million each with dedicated expert recommenders. Eligible applicants are incorporated for-profit or nonprofit organizations globally, excluding adversarial nations; applicants must first receive a Speculation Grant (awarded automatically to over 95% of applicants). Main Round applications were due April 22, 2026; Theme Round supplemental deadlines run from June to July 2026. Recommendations for all rounds are expected in Fall 2026.