1,000+ Opportunities
Find the right grant
Search federal, foundation, and corporate grants with AI — or browse by agency, topic, and state.
Long-Term Future Fund (LTFF) is sponsored by Manifund, which acts as a regranting platform for various funders. The fund supports projects and individuals working to positively influence the long-term trajectory of civilization, with a focus on global catastrophic risks from advanced artificial intelligence and pandemics.
Get alerted about grants like this
Save a search for “Manifund (acting as a regranting platform for various funders)” or related topics and get emailed when new opportunities appear.
Search similar grants →
Extracted from the official opportunity page/RFP to help you evaluate fit faster.
Long-Term Future Fund | Effective Altruism Funds

We make grants that address global catastrophic risks, especially potential risks from advanced artificial intelligence and pandemics. We also seek to promote longtermist ideas and to increase the likelihood that future generations will flourish.

Donate on every.org (recommended for donors outside the UK or Netherlands)
Donate on Giving What We Can (recommended for donors in the UK or Netherlands)

The Long-Term Future Fund has recommended several million dollars' worth of grants to a range of organizations, including projects that:
- Created an instruction-generalization benchmark for LLMs
- Built and maintained digital infrastructure for the AI safety ecosystem
- Conducted public and expert surveys on AI governance and forecasting
- Ran an AI safety independent research program

The Fund has historically supported researchers in areas such as cause prioritization, existential risk identification and mitigation, and technical research on the development of safe and secure artificial intelligence, where it was among the first funders.
Most of our fund managers have built their careers working full time in areas directly relevant to the Fund's mission. The fund managers can be contacted at longtermfuture[at]effectivealtruismfunds.org. The Fund has a broad remit to make grants that promote, implement, and advocate for longtermist ideas.
Many of our grants aim to address potential risks from advanced artificial intelligence and to build infrastructure and advocate for longtermist projects. However, we welcome applications related to long-term institutional reform or other global catastrophic risks (e.g., pandemics or nuclear conflict).
We intend to support:
- Projects that directly contribute to reducing existential risks through technical research, policy analysis, advocacy, and/or demonstration projects
- Training for researchers or practitioners who work to mitigate existential risks, help with relevant recruitment efforts, or build infrastructure for people working on longtermist projects
- Promoting long-term thinking

Featured grants with outstanding outcomes:
- 6-month stipend to create language model (LM) tools to aid alignment research through feedback and content generation
- 1-year stipend to make videos and podcasts about AI safety/alignment, and build a community to help new people get involved
- 6-month salary and operational expenses to start a cybersecurity and alignment risk assessment organization
- 5-month part-time stipend for collaborating on a research paper analyzing the implications of compute access with Epoch, FutureTech (MIT CSAIL), and GovAI

View more Long-Term Future Fund grants
The future could include a large number of flourishing humans (or other beings). However, it is possible that certain risks could make the future much worse, or wipe out human civilization altogether.
Actions taken to reduce these risks today might have large positive returns over long periods of time, greatly benefiting future people by making their lives much better, or by ensuring that there are many more of them. Donations to this fund might help to fund some of these actions and increase the chance of a positive long-term future.
Many people believe that we should care about the welfare of others, even if they are separated from us by distance, country, or culture. The argument for the long-term future extends this concern to those who are separated from us through time. Most people who will ever exist, exist in the future.
However, the emergence of new and powerful technologies puts the potential of these future people at risk. Of particular concern are global catastrophic risks. These are risks that could affect humanity on a global scale and could significantly curtail its potential, either by reducing human civilization to a point where it could not recover, or by completely wiping out humanity.
For example, tech companies are pouring money into the development of advanced artificial intelligence systems; while the upside could be enormous, there are significant potential risks if humanity ends up creating AI systems that are many times smarter than we are, but that do not share our goals.
As another example, previous disease epidemics, such as the bubonic plague in Europe, or the introduction of smallpox into the Americas were responsible for many millions of deaths. A genetically-engineered pathogen to which few humans had immune resistance could be devastating on a global scale, especially in today’s hyper-connected world.
In addition to supporting direct work, it’s also important to advocate for the long-term future among key stakeholders. Promoting concern for the long-term future of humanity — within academia, government, industry, and elsewhere — means that more people will be aware of these issues, and can act to safeguard and improve the lives of future generations.
Why you might choose not to donate to this fund:
1. You don't think that we should focus on the long-term future
2. You don't think that future or possible beings matter, or that they matter significantly less
3. You have a preference for supporting more established organizations
4. You are pessimistic about room for more funding
5. You have identified projects or interventions that seem more promising to you than our recommendations
6. You are skeptical of the risks posed by advanced artificial intelligence
7. You have different views about how to improve the long-term future

Donors might conclude that improving the long-term future is not sufficiently tractable to be worth supporting.
It is very difficult to know whether actions taken now are actually likely to improve the long-term future. To gain feedback on their work, organizations must rely on proxy measures of success: Has the public become more supportive of their ideas? Are their researchers making progress on relevant questions?
Unfortunately, there is no robust way of knowing whether succeeding on these proxy measures will cause an improvement to the long-term future. Donors who prefer tractable causes with strong feedback loops should consider giving to the Global Health and Development Fund.
Fund Advisor at the Effective Altruism Infrastructure and Long-Term Future Funds, Centre for Effective Altruism

Frequently asked questions:
- How do I make a donation to an EA Fund?
- What is the risk profile of the Long-Term Future Fund?
- Why donate to the Long-Term Future Fund instead of donating directly to individual organizations?
- Can I apply for funding to the Long-Term Future Fund?

Rigorous grantmaking for high-impact projects
Based on current listing details, eligibility includes: Individuals and small projects focused on positively influencing the long-term trajectory of civilization. Applicants should confirm final requirements in the official notice before submission.
Current published award information indicates that the award amount varies. Always verify allowable costs, matching requirements, and funding caps directly in the sponsor documentation.
The current target date follows rolling deadlines or periodic funding windows. Build your timeline backwards from the relevant date to cover registrations, approvals, attachments, and final submission checks.
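As an illustrative sketch of backward planning, the target date, step names, and lead times below are all hypothetical; substitute the sponsor's actual requirements:

```python
from datetime import date, timedelta

# Hypothetical target submission date and lead times (in days):
# each preparatory step must be finished this long before the deadline.
target = date(2026, 3, 1)
lead_times = {
    "registrations (e.g., SAM.gov) complete": 45,
    "internal approvals": 14,
    "attachments assembled": 7,
    "final submission checks": 2,
}

# Work backwards from the target to find each step's own deadline.
schedule = {step: target - timedelta(days=days) for step, days in lead_times.items()}
for step, deadline in sorted(schedule.items(), key=lambda kv: kv[1]):
    print(f"{deadline.isoformat()}: {step}")
```

Printing the schedule in date order gives a simple back-plan: the longest-lead item (registrations) surfaces first, which is typically the step applicants underestimate.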
Federal grant success rates typically range from 10-30%, varying by agency and program. Build a strong proposal with clear objectives, measurable outcomes, and a well-justified budget to improve your chances.
Requirements vary by sponsor, but typically include a project narrative, budget justification, organizational capability statement, and key personnel CVs. Check the official notice for the complete list of required attachments.
Yes — AI tools like Granted can help research funders, draft proposal sections, and check compliance. However, always review and customize AI-generated content to reflect your organization's unique strengths and the specific requirements of the solicitation.
Review timelines vary by funder. Federal agencies typically take 3-6 months from submission to award notification. Foundation grants may be faster, often 1-3 months. Check the program's timeline in the official solicitation for specific dates.
Many federal programs offer multi-year funding or allow competitive renewals. Check the official solicitation for continuation and renewal policies. Non-competing continuation applications are common for multi-year awards.
Small Business Innovation Research Program (SBIR) Phase II is sponsored by the Administration for Community Living and is a forecasted funding opportunity on Grants.gov. Fiscal Year: 2026. Assistance Listing Number(s): 93.433. The purpose of the Federal SBIR program is to stimulate technological innovation in the private sector, strengthen the role of small business in meeting Federal research or research and development (R/R&D) needs, and improve the return on investment from Federally-funded research for economic and social benefits to the nation. The specific purpose of NIDILRR's SBIR program is to improve the lives of people with disabilities through R/R&D products generated by small businesses, and to ...
The J.M.K. Innovation Prize is a grant from The J.M. Kaplan Fund recognizing early-stage social entrepreneurs working on environmental, heritage, and social justice challenges. The prize rewards individuals and organizations demonstrating innovative, entrepreneurial approaches to enduring problems. Applications for the 2025 prize were accepted February 11 through April 25, 2025 via an online portal. Spanish-language applications are welcomed, and a Spanish application form is available for download. The prize is biennial and open to a broad range of applicants across the United States working on forward-thinking solutions at the intersection of environment, community, and cultural heritage.