The AI Risk Mitigation Fund (ARM Fund) is a nonprofit spun out of the Long-Term Future Fund that funds technical research, policy work, and training programs aimed at reducing catastrophic risks from advanced AI systems.
Launched in December 2023, the ARM Fund actively supports projects across AI safety research, governance, and early-career researcher development, with a track record of funding researchers early in their careers who have gone on to prominent work in the field. Eligible applicants include researchers and organizations working on technical AI safety, AI policy, or AI safety training programs.
Contact the ARM Fund directly through their website for current grant opportunities and application guidelines.
Extracted from the official opportunity page/RFP to help you evaluate fit faster.
Reduce catastrophic risks from advanced AI
The AI Risk Mitigation Fund (ARM Fund) is a non-profit aiming to reduce catastrophic risks from advanced AI through grants for technical research, policy, and training programs for new researchers.
→ Our views on risks from advanced AI
→ The case for independent AI Safety funding
→ Our grantmaking process
The ARM Fund launched in December 2023.
The ARM Fund is now actively supporting projects across our focus areas. We are currently preparing detailed grant announcements, which will be published here soon. Below are some grants made by the ARM Fund team as part of their past grantmaking for the Long-Term Future Fund.
We are proud to have funded many of these grantees early in their careers; many subsequently went on to make significant contributions in technical AI safety and AI governance.
Past grants include:
- Building research capacity: start-up funds for computing resources for a deep learning and AI alignment research group at the University of Cambridge
- Centre for the Governance of AI: two-year funding to conduct public and expert surveys on AI governance and forecasting
- Building research capacity: 4-month stipend for a research visit to collaborate with academics in Cambridge on evaluating non-myopia in language models and RLHF systems
- Building research capacity: year-long stipend for research into shard theory and mechanistic interpretability in reinforcement learning
- Technical AI safety research: seed funding for a new AI interpretability research organization
- University College London: compute for empirical work on AI Safety Via Debate

Technical AI alignment research
Technical research can uncover dangerous capabilities before it's too late, or enable us to design future AI systems that are easier to understand, monitor, and control.

AI policy
Good policy can ensure that governments and corporations appropriately guard against catastrophic risks.

Building AI safety research capacity
Investment has poured into AI capabilities development, yet strikingly few researchers are working on key problems in AI safety, particularly outside of major industry labs. Grants in this area aim to bring new talent into the AI safety field.
This fund was spun out of the Long-Term Future Fund (LTFF), which makes grants aiming to reduce existential risk. Over the last five years, the LTFF has made hundreds of grants, specifically in AI risk mitigation, totalling over $20 million. Our team includes AI safety researchers, expert forecasters, policy researchers, and experienced grantmakers.
We are advised by staff from frontier labs, AI safety nonprofits, leading think tanks, and others.
Help reduce catastrophic risks from advanced AI
Based on current listing details, eligibility includes: Organizations and individuals engaged in technical AI safety research, policy, and training programs. Applicants should confirm final requirements in the official notice before submission.
Current published award information: not specified (past grants ranged from $12,321 to $300,000). Always verify allowable costs, matching requirements, and funding caps directly in the sponsor documentation.
There is no fixed target date; the fund operates on rolling deadlines or periodic funding windows. Build your timeline backwards from your intended submission date to cover registrations, approvals, attachments, and final submission checks.
Federal grant success rates typically range from 10% to 30%, varying by agency and program. Build a strong proposal with clear objectives, measurable outcomes, and a well-justified budget to improve your chances.
Requirements vary by sponsor, but typically include a project narrative, budget justification, organizational capability statement, and key personnel CVs. Check the official notice for the complete list of required attachments.
Yes — AI tools like Granted can help research funders, draft proposal sections, and check compliance. However, always review and customize AI-generated content to reflect your organization's unique strengths and the specific requirements of the solicitation.
Review timelines vary by funder. Federal agencies typically take 3-6 months from submission to award notification. Foundation grants may be faster, often 1-3 months. Check the program's timeline in the official solicitation for specific dates.
Many federal programs offer multi-year funding or allow competitive renewals. Check the official solicitation for continuation and renewal policies. Non-competing continuation applications are common for multi-year awards.
Research on Circular Economy, Smart Manufacturing, and Energy-Efficient Microelectronics is sponsored by U.S. Department of Energy (DOE) Advanced Materials & Manufacturing Technologies Office (AMMTO). This funding opportunity supports innovative technology R&D across the manufacturing sector with a focus on circular economy, smart manufacturing, and energy-efficient microelectronics. While the stated deadline for full applications has passed, AMMTO frequently issues similar solicitations, and this highlights a relevant area of interest for the DOE.
NIST Small Business Innovation Research (SBIR) Phase II Program - Quantum Information Science is sponsored by National Institute of Standards and Technology (NIST). This program allocates funding to small businesses for prototyping innovative technologies in areas including quantum information science, artificial intelligence, and semiconductors. These Phase II awards follow successful Phase I feasibility studies.