This listing may be outdated. Verify details at the official source before applying.
Texas Advanced Computing Center (TACC) Research Programs is sponsored by Texas Advanced Computing Center (TACC) at The University of Texas at Austin. The program offers access to advanced computing resources and support for research in areas such as machine learning, data analytics, and high-performance computing.
Extracted from the official opportunity page/RFP to help you evaluate fit faster.
Addressing the most complex problems in science, engineering, and society. Through collaborations with institutions and domain scientists around the globe, TACC aims to advance scientific discovery, innovation, and education. The projects in this evolving list, conducted by TACC staff members, utilize the center's advanced computing resources and robust scientific software environment and draw on the staff's broad expertise in research computing.
Project categories: Cloud Computing & Interface Technologies; Cyberinfrastructure Systems; Cyberinfrastructure Community Services; Data Management & Collections; Health Informatics & Compliance; Machine Learning & Analytics; Research Computing Infrastructure; Software Defined Visualization.

TACC's software programs provide algorithms and standard libraries that perform calculation, data processing, and automated reasoning.
TACC's programmers also create custom libraries. Most modern software systems provide libraries that implement the majority of system services. Such libraries have commoditized the services which a modern application requires.
As such, most code used by modern applications is provided in these system libraries.

Derives Krylov methods in the FLAME framework to show how FLAME can simplify the process of deriving iterative methods, and makes a case that a FLAME-based environment for deriving new iterative methods is a distinct, and attractive, possibility.
Sparse Direct Factorizations through Unassembled HyperMatrices
A novel strategy for sparse direct factorizations geared toward the matrices that arise from hp-adaptive Finite Element Methods.

Bioinformatics is the computer-assisted data management discipline that helps researchers gather, analyze, and represent information to understand life's processes in healthy and disease states, and to find new or better drugs.
TACC's Life Sciences Group is committed to providing the research community with the bioinformatics computational tools and expertise needed to address modern biological research questions. The TACC team is focused on ensuring that TACC maintains the hardware, software, and domain expertise to support a wide variety of bioscience research. Its mission: to develop and operate computational resources in support of grand challenges in the life sciences.
Domain Information & Vocabulary Extraction (DIVE)
This project aims to develop new methods and tools to extract biological entities from publications and other text documents. NSF CyVerse Grant #DBI-1265383

Economic, Environmental and Agricultural Impacts in the Texas-Mexico Border Region
This project will create coupled fuel, electricity, water, and air models for Texas and eastern Mexico to develop a quantitative understanding of the implications of increased natural gas usage in the Texas-Mexico border region.
This NeuroNex Technology Hub for enhanced resolution three-dimensional electron microscopy (3DEM) will enable the discovery of new details in how brain synapses function across different regions and species.
NeuroTechnology Hub award - NSF DBI-1707356

Cloud Computing & Interface Technologies
As a complement to TACC's HPC and visualization resources, TACC offers cloud services to give researchers access to on-demand or persistent hosting of virtual servers, data sets, and gateways. TACC conducts research into cloud architectures and interfaces to improve next-generation cloud platforms.
Learning to Use Essential Tools
This project aims at the research and development of a dynamic, reconfigurable tool that bridges users and remote computing resources to facilitate educational tools for data science. With this tool, curricula and educational materials on essential tools for data science will be developed and disseminated.
Cyberinfrastructure Systems
Solving the world's biggest problems requires the world's biggest solutions, and TACC's advanced computing systems are among the most powerful in the world. These systems are fundamental to science and society, allowing researchers to push the boundaries of human knowledge and tackle the biggest scientific challenges we face today.
UT System Research Cyberinfrastructure (UTRC)
Provides superior, comprehensive scientific capabilities that enable breakthrough results, impact science, attract superior faculty and students, and bring funding to UT System institutions.
Cyberinfrastructure Community Services
Today, conducting scientific research relies on the analysis and interpretation of tremendous amounts of data. To find meaning in this sea of information, researchers must first be able to organize, store, and share their data.
Once data are stored, researchers can leverage advances in data analysis and the computing capabilities of TACC's systems to bring understanding and structure to these complex problems and highlight interesting results.

Data Management & Collections (DMC) Group
Meeting the needs of faculty and researchers for data collection services, and contributing to the potential of data-driven research to make discoveries.
Student Cluster Competition
A real-time, 48-hour race between teams of undergraduate or high-school students to build and deploy small cluster computers on the exhibit floor at the annual SC conference.

A web interface for users to manage their TACC account, projects, and allocations.
Building Research Innovation at Community Colleges (BRICCs)
Examines the research and educational needs for advanced cyberinfrastructure in two-year colleges and smaller institutions of higher education. Contact: cockerill@tacc.utexas.edu

SouthWest Expertise in Expanding, Training, Education and Research (SWEETER)
A web portal and shared cyberinfrastructure that will help remove resource barriers faced by community colleges and smaller institutions. Contact: cockerill@tacc.utexas.edu

Data Management & Collections
TACC works to meet the needs of faculty and researchers for data collection services and contributes to the potential of data-driven research to make discoveries. The center builds and maintains large data-management and storage resources and consults with collections' creators in all aspects of the data lifecycle, from creation to long-term preservation and access.
The DMC group also provides special support, such as workflow consulting and preservation pipelines, to the center-wide initiatives listed below. If you have a data management challenge and wish to contact us, please email data@tacc.utexas.edu.

Development of curation and publication pipelines and of the preservation and metadata back-end for natural hazards engineering data.
Deployment of Fedora-based repository with linked-data support using BlazeGraph for RDF query. Collaboration with Experimental Facilities to automate terabyte-scale data transfers to DesignSafe infrastructure at TACC. Support for diverse aspects of the Cyverse data store and Data Commons, a platform for sharing of research data in the life sciences.
Contributions include metadata model and minimum requirements, replication strategy for reliable data access, long-term preservation strategy and pipelines, integration with DOI framework for persistent references to data. Hosting and support for petabyte-scale data collections directly and indirectly incorporated into CyVerse.
Development of web-based portal for storage, management and retrieval of images and related experimental measurements of diverse porous materials. Contributions include development of customized metadata model, analysis backend, search optimization and long-term preservation strategy.
Natural History Collections at TACC
Maintains an array of ongoing collaborations with the Texas Natural Sciences Center.
An open research system to investigate the use of field-programmable gate arrays (FPGAs) as data center accelerators to improve performance, reduce power consumption, and open new avenues of investigation. Discovery is a testbed for benchmarking and testing hardware; TACC uses it to experiment with and evaluate new technologies.
Fabric is an experimental system combining IBM's latest POWER8 processors with NVIDIA GPUs and FPGAs from Altera and Xilinx, for researchers exploring alternate computer architectures.

Health Informatics & Compliance
Health informatics is the study of resources and methods for the management of health information.
TACC is capable of handling research with data sets containing personalized health information through HIPAA and FISMA compliance. This enables deeper connections to clinical studies and research in human health.

Center for Lung Development Imaging and Omics
Creates a detailed spatial-temporal molecular atlas of the developing lung in preterm infants to young children.
Allows researchers to perform drug screening using the Autodock Vina (Scripps) program on the Lonestar5 supercomputer to search more than 700,000 drug-like compounds in only a few hours.
Individualizing Care in Pregnancy and Childbirth through Digital Phenotyping Brings together researchers in medicine, computation, visualization and data to deploy passive monitoring to a cohort of 1000 pregnant women in Central Texas via ubiquitous cell phone apps to develop a digital phenotype of pregnancy.
Machine Learning & Analytics
At the intersection of statistics, computer science, and emerging applications in industry, machine learning and analytics focus on developing fast and efficient algorithms for real-time processing of data, with the goal of delivering accurate predictions of various kinds.
Experimental Data Architectures
Simplifying data management for administrators and researchers while reducing the overall cost of next-generation storage systems.
Memory Error Impact on Deep Learning Training
This project quantifies the impact of silent data corruption on deep learning training.

Scalable and Efficient I/O for Distributed Deep Learning
This project enables scalable and efficient I/O for distributed deep learning training on computer clusters with the existing hardware/software stack.
Scalable Deep Learning on Supercomputers
This project enables deep learning training at supercomputer scale without losing test accuracy.

The effective use of advanced computing resources is a challenge for researchers. Providing powerful and intuitive interfaces accelerates the rate at which science can be performed by allowing researchers to focus on scientific exploration rather than technological hurdles.
A resource-sharing Web platform that will enable computer models and simulations of natural hazards that can be validated against real-world data, creating an easily accessible resource for natural hazards researchers across the country.
A cyberinfrastructure that captures the "whole tale" of data-driven computational research in an easy-to-use web-based environment.
By capturing the input data, the processing and analysis steps, and the final data products, all three can be published along with any research publication to allow others to reproduce results or expand upon others' findings directly.

What will TACC build next?
TACC leadership and staff constantly monitor, evaluate, and develop new and innovative technologies in the marketplace to determine which next-generation computer systems will be most effective for science and engineering. Projects in this category are early, experimental technology evaluations, usually in collaboration with industry partners, that examine the efficacy of future system designs.
TACC maintains its own expertise in this area, creating new frontiers and new challenges for next-generation computing.

Advanced Computing and Evaluation Laboratory (ACELab)
The ACELab develops and packages open-source benchmarks appropriate for the HPC environment.
SODA (Scalable Object and Analysis)
This project aims to develop new methods to process video data collected from traffic monitoring cameras to support traffic analyses, including traffic counts and vehicle-pedestrian interactions.
Research Computing Infrastructure
The Chameleon testbed is deployed at the University of Chicago (UC) and the Texas Advanced Computing Center (TACC) and consists of 650 multi-core cloud nodes and 5 PB of total disk space, and leverages a 100 Gbps connection between the sites.
National Science Foundation, award #1419152 The goal of this project and system is to open up new possibilities in science and engineering by providing computational capability that makes it possible for investigators to tackle much larger and more complex research challenges across a wide spectrum of domains.
National Science Foundation, award #1818253 Jetstream is a new type of computational research resource for the national (non-classified) research community - a data analysis and computational resource that scientists and engineers use interactively to conduct their research anytime, anywhere.
National Science Foundation, award #1445604 Stampede 2 offers a powerful new system that builds on the success of Stampede and will continue to enable groundbreaking science.
National Science Foundation, award #1540931

Accelerating Computing for Emerging Sciences (ACES)
A platform that removes significant bottlenecks in advanced computing by introducing the flexibility to aggregate various components (i.e., processors, accelerators, and memory) on an as-needed basis. National Science Foundation, award #2112356. Contact: cockerill@tacc.utexas.edu

Software Defined Visualization
Software-defined visualization libraries provide performant rendering on general-purpose processors, including both traditional rasterization methods and ray tracing methods that can provide improved visual fidelity and expanded opportunities for simulation integration.
By removing the dependence on particular hardware for performant rendering, these libraries enable visualization capabilities for both in situ and post-process analysis across the various hardware architectures found in high performance computing.

Software Defined Visualization
This project develops a dynamic ray scheduling algorithm that effectively manages both ray state and data accesses.
Accelerated 3D Reconstruction and Visualization of Compressible Flow This project aims to create efficient shadowgraph and Schlieren visualizations of simulated airflow leveraging commodity hardware and open-source rendering platforms.
US Air Force SBIR Topic AF151-182
This project investigates making visual analysis of data more effective by using time-tested methods of visual artists to investigate, reframe, and generally make sense of their world. Imagine if data-intensive computing could be a more innately human and creative process, like sculpting (visual, tangible, active, physical).
National Science Foundation To reduce the overall cost of the visual analysis of large-scale simulation and observational data by replacing brute-force techniques with data sampling based on the actual requirements and capabilities of the user.
Department of Energy Office of Advanced Scientific Computing Research

Other projects:
The Arabidopsis Informational Portal
Causes and Consequences of Epigenetic Variation in Maize
Control of Display Environments with Depth-based Skeletal Tracking
Dynamic Changes in the Chick Developing Heart in Response to Altered Hemodynamics
Epigenome Dynamics During DNA Replication
eXtreme Digital (XD) Technology Investigation Service (TIS)
IBP: Integrated Breeding Platform (portal)
Interactive Anatomic System Dashboard (IASD)
Interactive Parallelization Tool (IPT)
LANIC: Mining Information & Web Archive
Sustainable Places Project
Synergistic Discovery and Design Environment (SD2E)
Texas Test Server Problem
A Thousand Words: Advanced Visualization for the Humanities
Topology-Aware MPI and Job Scheduling
Touch Interface Techniques for Collaborative and Interactive Visualization
Virtual Reality for Journalists
VisIT Performance Enhancements
XSEDE Technology Audit Service
Based on current listing details, eligibility includes: Researchers and organizations engaged in advanced computing research. Applicants should confirm final requirements in the official notice before submission.
Current published award information indicates: Varies. Always verify allowable costs, matching requirements, and funding caps directly in the sponsor documentation.
The current target date is rolling deadlines or periodic funding windows. Build your timeline backwards from this date to cover registrations, approvals, attachments, and final submission checks.
Federal grant success rates typically range from 10-30%, varying by agency and program. Build a strong proposal with clear objectives, measurable outcomes, and a well-justified budget to improve your chances.
Requirements vary by sponsor, but typically include a project narrative, budget justification, organizational capability statement, and key personnel CVs. Check the official notice for the complete list of required attachments.
Yes — AI tools like Granted can help research funders, draft proposal sections, and check compliance. However, always review and customize AI-generated content to reflect your organization's unique strengths and the specific requirements of the solicitation.
Review timelines vary by funder. Federal agencies typically take 3-6 months from submission to award notification. Foundation grants may be faster, often 1-3 months. Check the program's timeline in the official solicitation for specific dates.
Many federal programs offer multi-year funding or allow competitive renewals. Check the official solicitation for continuation and renewal policies. Non-competing continuation applications are common for multi-year awards.