1,000+ Opportunities
Find the right grant
Search federal, foundation, and corporate grants with AI — or browse by agency, topic, and state.
Scaling Trust - ARIA Funding is a grant from the Advanced Research and Invention Agency (ARIA), a UK government R&D funding body established to unlock scientific and technological breakthroughs. The Scaling Trust programme funds research and development focused on trust and trustworthiness in emerging technologies.
ARIA supports work from the full R&D ecosystem — including startups, universities, and research organisations — and actively encourages cross-disciplinary and cross-sector collaboration. Eligible applicants are UK-based researchers and organisations.
ARIA operates as an open funding agency, reaching across disciplines and institutions to break down research silos and discover new pathways at the frontier of what is scientifically or technologically possible.
Get alerted about grants like this
Save a search for “Advanced Research and Invention Agency (ARIA)” or related topics and get emailed when new opportunities appear.
Search similar grants →
Extracted from the official opportunity page/RFP to help you evaluate fit faster.
Advanced Research and Invention Agency (ARIA)
ARIA is an R&D funding agency built to unlock scientific and technological breakthroughs that benefit everyone. We empower scientists and engineers to pursue research at the edge of what is technologically or scientifically possible. We will reach across disciplines, sectors and institutions to shape, fund and manage projects across the R&D ecosystem, from startups to universities, to break down silos and discover new pathways.
Live funding opportunities
Rolling Opportunity Seeds
Accelerated Adaptation (Concept Papers)
Massively Scalable Neurotechnologies (Concept Papers)
Universal Fabricators (Concept Papers)
Enduring Atmospheric Platforms (Full Proposals)
Based on current listing details, eligibility includes UK-based researchers and organisations. Applicants should confirm final requirements in the official notice before submission.
Current published award information indicates that funding amounts vary based on project scope and sponsor guidance. Always verify allowable costs, matching requirements, and funding caps directly in the sponsor documentation.
The current target date is March 24, 2026. Build your timeline backwards from this date to cover registrations, approvals, attachments, and final submission checks.
Federal grant success rates typically range from 10–30%, varying by agency and program. Build a strong proposal with clear objectives, measurable outcomes, and a well-justified budget to improve your chances.
Requirements vary by sponsor, but typically include a project narrative, budget justification, organizational capability statement, and key personnel CVs. Check the official notice for the complete list of required attachments.
Yes — AI tools like Granted can help research funders, draft proposal sections, and check compliance. However, always review and customize AI-generated content to reflect your organization's unique strengths and the specific requirements of the solicitation.
Review timelines vary by funder. Federal agencies typically take 3-6 months from submission to award notification. Foundation grants may be faster, often 1-3 months. Check the program's timeline in the official solicitation for specific dates.
Many federal programs offer multi-year funding or allow competitive renewals. Check the official solicitation for continuation and renewal policies. Non-competing continuation applications are common for multi-year awards.
ARIA's Scaling Trust programme is a £50 million (~$63 million) initiative to create the capability for AI agents to securely coordinate, negotiate, and verify with one another on behalf of humans. The programme's Phase 1 seeks to fund teams developing open-source coordination infrastructure and performing fundamental research that advances from empirical to theory-driven guarantees in agentic coordination. The Opportunity Space is titled 'Trust Everything, Everywhere' and is led by Programme Director Alex Obadia. ARIA does not run rolling grant competitions; instead, Programme Directors define bold scientific missions and build tailored cohorts of researchers and innovators to deliver them. This programme addresses the critical challenge of enabling trustworthy multi-agent AI systems that can autonomously coordinate while maintaining security and verifiability guarantees. Research areas include cryptographic protocols for agent verification, game-theoretic coordination mechanisms, formal methods for trust guarantees, and open-source infrastructure for agent-to-agent communication.
ARIA's Rolling Opportunity Seeds provide up to £500,000 (~$630,000) each to individual research teams exploring new pathways across multiple AI-related opportunity spaces. The 'Mathematics for Safe AI' opportunity space leverages mathematical approaches to ensure powerful AI systems operate safely with real-world systems and populations. Other opportunity spaces include 'Nature Computes Better,' 'Scoping Our Planet,' and 'Scalable Neural Interfaces.' These seed awards are designed to fund high-risk, high-reward exploratory research that could open entirely new directions in AI safety, bio-inspired computing, and neural interface technology. ARIA's overall budget allocation is £184 million for 2025-26. As part of the Safeguarded AI programme (backed by £59 million), ARIA has funded major projects including Oxford University research on developing novel technical approaches to safe AI deployment. Seeds are accepted on a rolling basis with no fixed deadline.
ARIA Scaling Trust Programme is a grant from ARIA (Advanced Research and Invention Agency) providing up to £50 million for research teams to develop secure coordination infrastructure for AI agents. The programme seeks to enable AI agents to securely coordinate, negotiate, and verify with one another on behalf of humans. Phase 1 funding targets open-source coordination infrastructure and fundamental research moving from empirical to theory-driven guarantees in agentic coordination. ARIA also offers opportunity seeds up to £500,000 for individual research teams. Eligible applicants include research organizations, universities, and technology companies working on AI safety and coordination challenges.