Writing an NIH T32 Training Grant: Building the Institutional Case for Your Program
March 19, 2026 · 15 min read
Jared Klein
The T32 Is Not a Research Grant
The Ruth L. Kirschstein National Research Service Award Institutional Research Training Grant — the T32 — is fundamentally different from every other NIH application you have written. An R01 funds a discrete scientific project. A K-award funds an individual investigator's development. The T32 funds an environment. Your job is to convince reviewers that your institution has built something greater than the sum of its individual labs, and that this collective training infrastructure produces scientists who would not exist without it.
This distinction trips up even experienced investigators. Program Directors who approach the T32 like a multi-PI research grant — foregrounding faculty research portfolios and treating the training plan as an afterthought — produce applications that score poorly. Reviewers want to see a coherent training philosophy, a curriculum that adds measurable value beyond standard graduate or postdoctoral education, and hard evidence that your program changes career trajectories.
T32 funding has contracted substantially over the past decade. From FY2014 to FY2022, the number of trainees on T32 grants fell by 20 percent, and the number of active T32 projects declined by 21 percent. Some NIH institutes have shifted support to alternative mechanisms like the TL1. In this tightening landscape, every section of your application must demonstrate that your program delivers outcomes that justify continued investment.
Understanding the Five Scored Review Criteria
Since January 2025, NIH has retained but refined five scored review criteria for T32 applications. Reviewers assign individual criterion scores (1 to 9) for each, then provide an overall impact score reflecting their judgment of the program's potential to develop a highly skilled, diverse biomedical research workforce.
Training Program and Environment. This is where you make the institutional case. Reviewers evaluate whether the proposed program provides training experiences beyond what is already available at your institution. They want to see a clear rationale for the program's existence, specific goals and measurable objectives, a structured curriculum of didactic and experiential components, and evidence that the institutional environment supports rigorous research training. The training program must add value — it cannot simply be a funding conduit for students and postdocs who would receive identical training without it.
Training Program Director(s)/Principal Investigator(s). The PD/PI must be an established investigator with both scientific credibility and demonstrated administrative and mentoring capacity. Reviewers look for a track record of training successful scientists, sufficient protected effort to lead the program, and — critically since the 2025 updates — evidence that the PD/PI has received formal mentor training or has a concrete plan to obtain it before the program starts.
Preceptors/Mentors. Every participating faculty member is individually evaluated. Reviewers assess the strength of each mentor's research funding, publication record, history of training success, and commitment to the program. Faculty without active R01-equivalent funding will draw scrutiny. Early-stage investigators can participate as co-mentors but generally should not serve as primary mentors for T32 trainees.
Trainees. For renewal applications, this criterion examines who has been appointed, how they were selected, and how diverse the trainee pool is. For new applications, reviewers evaluate the recruitment plan, selection criteria, and the anticipated trainee pool. The quality and diversity of trainees reflect directly on the program's reach and selectivity.
Training Record. For renewals, this is often the most consequential criterion. Reviewers evaluate trainee publications (especially first-author papers), subsequent funding success (particularly transition to K-awards or R-series grants), career placement, and time to degree. New applications must present the training track record of the PD/PI and participating faculty as proxy evidence.
A sixth element — Training in the Responsible Conduct of Research — moved from "Additional Review Considerations" to "Additional Review Criteria" in 2025. This means RCR training now directly influences the overall impact score. A perfunctory RCR plan will hurt you.
Designing the Training Program
Define the Gap Your Program Fills
Before writing a single word of the application, answer one question: what does your program provide that your institution's existing graduate programs and postdoctoral appointments do not? If you cannot articulate a clear, specific answer, you do not yet have a T32 program — you have a funding request.
Strong T32 programs are built around an identifiable training gap. Perhaps your institution has deep expertise in computational genomics but no structured pathway for wet-lab trainees to acquire quantitative skills. Perhaps your medical school trains excellent physician-scientists but lacks a formal translational research curriculum that bridges bench and bedside. The gap defines the program. Everything else — the curriculum, the faculty roster, the recruitment strategy — flows from it.
Build a Structured Curriculum with Measurable Milestones
Reviewers distinguish between programs that offer a genuine curriculum and those that list loosely connected seminars and journal clubs. A competitive T32 curriculum typically includes several integrated components.
Core coursework. Formal courses or modules that address the specific training gap. These should be distinct from standard departmental requirements. If your program emphasizes rigor and reproducibility in preclinical research, offer a dedicated course on experimental design, statistical power analysis, and transparent reporting — do not simply point to the department's existing biostatistics requirement.
Research rotations or structured mentored experiences. Define how trainees are matched with mentors, what the expectations are for each rotation, and how research progress is evaluated. Include milestone checkpoints — qualifying exams, committee meetings, annual progress reviews — with explicit criteria for advancement and remediation.
Professional development. Grant writing workshops, manuscript preparation seminars, oral presentation coaching, teaching practica, and career exploration panels. NIH expects trainees to develop six core competencies: discipline-specific conceptual knowledge, research skill development, communication skills, professionalism, leadership and management skills, and responsible conduct of research. Map your professional development offerings to these competencies explicitly.
Exposure to interdisciplinary science. T32 programs that confine trainees to a single department score lower than those that facilitate cross-departmental or cross-school collaboration. If your program is housed in a biochemistry department, describe how trainees interact with clinical investigators, bioengineers, or computational scientists. Structured co-mentoring arrangements between faculty in different departments are particularly valued.
Design a Rigorous RCR Training Plan
The elevation of Responsible Conduct of Research to a scored review criterion means your RCR plan needs real substance. NIH requires that RCR instruction address five components.
Format. A combination of didactic and small-group discussion; solely online modules are not acceptable.
Subject matter. Data management, authorship, peer review, mentoring, conflicts of interest, research misconduct, and human subjects and animal welfare as applicable.
Faculty participation. Senior researchers must participate in and lead discussions.
Duration. Instruction must be substantive, typically eight or more contact hours.
Frequency. At least once during each training period, with different or updated content for repeat participants.
Do not treat RCR as a compliance checkbox. Programs that integrate ethical reasoning into the research training experience — for example, by incorporating case studies drawn from the program's own scientific domain rather than generic scenarios — demonstrate a genuine commitment to research integrity that reviewers notice.
Assembling and Documenting the Faculty
Selecting Participating Faculty
The faculty roster is one of the most scrutinized components of a T32 application. Every proposed mentor must be individually justified, and the collective group must form a coherent training team rather than a random assortment of funded investigators in the same department.
Start with non-negotiable requirements. At most institutes, primary T32 mentors should hold active R01-equivalent funding. NCI explicitly states this expectation. NIGMS and other institutes are similarly rigorous. Faculty whose grants have lapsed or who hold only R21s or industry funding will raise flags. Early-stage investigators with strong startup packages and K-awards can participate as co-mentors, and including them signals program vitality, but they should be paired with established investigators.
Beyond funding, evaluate each faculty member's mentoring record. How many doctoral students and postdocs have they trained? Where are those former trainees now? Do they hold faculty positions, industry research roles, or other research-intensive careers? A brilliant scientist with an active R01 portfolio but no history of successful trainee mentorship is a weaker candidate for your roster than a slightly less prolific investigator who has launched multiple independent research careers.
Preparing Faculty Biosketches and Documentation
Every participating faculty member must submit a biosketch tailored to the training grant. This is not a copy of the biosketch from their latest R01. The T32 biosketch should emphasize three things: the relevance of the faculty member's research to the training program's scientific focus, their approach to teaching scientific rigor and mentoring trainees, and their training outcomes (names of former trainees, their current positions, their publications from the training period).
All faculty biosketches must be updated within six months of the application submission date and compiled into a single PDF. For a program with 15 to 20 mentors, this document alone can exceed 80 pages. Start collecting and reviewing biosketches at least six months before the submission deadline. Faculty who submit generic biosketches at the last minute will weaken your application.
In addition to biosketches, each faculty member must provide Other Support documentation listing all current and pending research funding. This information helps reviewers assess whether mentors have sufficient research infrastructure to support trainee projects and whether the program draws on genuinely active research environments.
Describing Mentor Training
The 2025 application updates place new emphasis on mentor training. Your application must describe how the PD/PI and participating faculty have received — or will receive before the program starts — formal training in effective mentorship. Acceptable approaches include completion of evidence-based mentor training curricula (such as Entering Mentoring or the CIMER framework), institutional mentoring workshops, or structured mentoring communities of practice.
This is not optional language. Reviewers are specifically instructed to evaluate mentor training, and an application that omits it or offers vague assurances ("faculty are committed to excellent mentoring") will lose points.
Trainee Selection, Diversity, and Outcomes
Recruitment and Selection
Describe your recruitment strategy in concrete operational terms. Where do you advertise positions? How do you reach candidates from underrepresented backgrounds? What criteria must applicants meet to be considered? How does the selection committee evaluate candidates, and who serves on it?
Strong programs recruit nationally, not just from their own institution's pipeline. They use holistic review processes that consider research potential, fit with the program's scientific focus, and contribution to trainee diversity. They describe explicit outreach to minority-serving institutions, participation in diversity recruitment events, and partnerships with undergraduate research programs that serve underrepresented populations.
Tracking Outcomes with xTRACT
NIH requires all T32 programs to track trainee outcomes for 15 years after completion of support. The Extramural Trainee Reporting and Career Tracking (xTRACT) system is the mandated platform for generating the data tables that appear in your application and progress reports. These tables include trainee demographics, publications during and after the training period, subsequent grant support, and career outcomes.
For renewal applications, the data tables are the evidentiary backbone of the Training Record criterion. Reviewers will calculate publication rates per trainee, examine the proportion of former trainees who obtained independent research funding, and evaluate career placement patterns. Programs where a high percentage of former trainees remain in research-intensive positions score well. Programs where trainees leave research or cannot be tracked score poorly.
For new applications, present analogous data from the PD/PI's and participating faculty's individual training records. If your proposed PD/PI has mentored 20 doctoral students over the past decade, reviewers want to know how many published first-author papers during training, how many obtained postdoctoral positions, and how many are now in research careers.
Benchmarking Your Outcomes
Contextualize your trainee outcomes against national benchmarks. NIH publishes workforce data through the Biomedical Research Workforce reports. If your program places 70 percent of predoctoral trainees into research-intensive postdoctoral positions compared to a national average of 50 percent, say so explicitly. If your postdoctoral trainees secure K-awards or R01s at rates above the national median, provide the numbers. Reviewers evaluate outcomes comparatively, and programs that present data without context make the reviewer do unnecessary work.
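Before drafting the Training Record narrative, it can help to compute your comparative rates explicitly. The sketch below is purely illustrative: the alumni records, the research-intensive outcome categories, and the 0.50 national benchmark are hypothetical placeholders, not real program data.

```python
# Illustrative sketch: comparing a program's placement rate to a national
# benchmark. All records and the benchmark value are hypothetical examples.

alumni = [
    {"name": "Trainee A", "level": "predoc", "outcome": "research postdoc"},
    {"name": "Trainee B", "level": "predoc", "outcome": "industry research"},
    {"name": "Trainee C", "level": "predoc", "outcome": "left research"},
    {"name": "Trainee D", "level": "predoc", "outcome": "research postdoc"},
]

# Outcome labels counted as research-intensive (assumed categorization).
RESEARCH_INTENSIVE = {"research postdoc", "industry research", "faculty"}
NATIONAL_BENCHMARK = 0.50  # assumed national rate, for comparison only

predocs = [a for a in alumni if a["level"] == "predoc"]
placed = sum(a["outcome"] in RESEARCH_INTENSIVE for a in predocs)
rate = placed / len(predocs)

print(f"Program rate: {rate:.0%} vs national {NATIONAL_BENCHMARK:.0%} "
      f"({rate - NATIONAL_BENCHMARK:+.0%} difference)")
```

Presenting the resulting comparison as an explicit percentage difference in the application saves the reviewer from doing the arithmetic themselves.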
The Institutional Commitment Letter
The institutional commitment letter is required for all T32 applications — new, renewal, resubmission, and revision. It must be signed by a president, provost, dean, or equivalent institutional leader and may not exceed 10 pages. This letter is not a formality. It is the institutional counterpart to the PD/PI's training plan, and reviewers read it carefully.
What the Letter Must Address
The letter must describe specific activities and resources the institution commits to ensuring the program's success. This includes several mandatory elements.
Research infrastructure. Describe core facilities, shared equipment, and technology platforms available to trainees. Be specific about access policies — reviewers want to know whether trainees can use the confocal microscopy core, the sequencing facility, or the biostatistics consulting service without charge or at subsidized rates.
Faculty support. Describe how the institution enables faculty participation in training. Does the department provide protected time for mentors? Are there reduced clinical or teaching loads for faculty who take on T32 trainees? What happens if a mentor's research funding lapses — does the institution provide bridge funding to protect the trainee's research continuity?
Culture and rigor. Since 2025, the institutional letter must explicitly describe the institution's commitment to developing a culture of scientific rigor, reproducibility, and responsible conduct. This is not boilerplate — describe specific policies, training requirements, and oversight mechanisms.
Anti-discrimination policies. A separate mandatory component requires the letter to describe institutional commitment to preventing discriminatory harassment and ensuring equitable treatment of all trainees. Describe specific policies, reporting mechanisms, and oversight structures.
Trainee contingency planning. The letter should describe what happens to trainees appointed in the final year of funding if the competing renewal is not awarded. Reviewers want assurance that the institution will support trainees through completion of their training regardless of the grant's fate.
Strategic Use of the Letter
Beyond the mandatory elements, use the institutional commitment letter to differentiate your program. If the provost is willing to commit new faculty lines in the program's scientific area, include that. If the institution is investing in a new research building or core facility that will benefit trainees, describe the timeline and relevance. Concrete, quantifiable commitments — dollar amounts, FTE allocations, space assignments — carry far more weight than general expressions of enthusiasm.
Avoid the common mistake of using the institutional letter to repeat information from the training plan. The letter should complement the application by providing the institutional perspective on resource commitment, not summarize what the PD/PI has already described.
Budget Considerations
T32 budgets follow NRSA stipend levels set annually by NIH, not negotiated by the applicant. For FY2025, predoctoral stipends are set at $28,224 per year and postdoctoral stipends range from $56,484 to $68,604 depending on years of experience; confirm current levels in the annual NIH Guide notice, as they change each fiscal year. The grant also provides an allowance for tuition and fees and a training-related expenses allowance covering items such as trainee travel and health insurance.
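The stipend portion of the budget is simple arithmetic once slot counts are chosen. This sketch uses the FY2025 stipend figures quoted above; the slot counts are hypothetical, and tuition/fees and training-related expenses, which vary by institution, are omitted.

```python
# Rough annual stipend estimate for a proposed slot mix. Stipend levels
# are the FY2025 figures quoted in the text; slot counts are hypothetical.
# Tuition/fees and training-related expenses are excluded from this sketch.

PREDOC_STIPEND = 28_224       # FY2025 predoctoral stipend
POSTDOC_STIPEND_YR0 = 56_484  # FY2025 postdoctoral entry-level stipend

predoc_slots = 4              # hypothetical slot request
postdoc_slots = 2

stipend_total = (predoc_slots * PREDOC_STIPEND
                 + postdoc_slots * POSTDOC_STIPEND_YR0)
print(f"Annual stipend request: ${stipend_total:,}")
```

Postdoctoral stipends step up with years of experience, so a real budget would assign each postdoctoral slot its experience-appropriate level rather than the entry rate used here.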
Importantly, formal cost sharing is almost never required for T32 awards. NIH program announcements for T32s do not include cost sharing as a selection factor. However, the institutional commitment letter should describe resources the institution provides beyond what the grant covers — not as formal cost sharing, but as evidence of institutional investment. There is a meaningful difference between required cost sharing (rare) and demonstrated institutional support (essential).
Request enough trainee slots to sustain a critical mass of program participants. A program with two predoctoral and one postdoctoral slot will struggle to create the cohort dynamics and peer interactions that define a genuine training program. Most competitive T32 programs support four to eight trainees, though the right number depends on the program's scope, the size of the eligible trainee pool, and the number of qualified mentors.
Common Weaknesses That Sink T32 Applications
The program is indistinguishable from the existing graduate program. If reviewers cannot identify what the T32 adds beyond stipend support, the application will not score well. Every curricular element, mentoring structure, and professional development activity must be clearly identified as T32-specific or T32-enhanced.
Faculty without active research funding. Even one or two unfunded mentors on the roster will prompt reviewer concern about the program's research environment. Vet every faculty member's funding status within three months of submission and have contingency plans for mentors whose grants are in no-cost extension or pending renewal.
Weak or missing trainee outcome data. For renewals, incomplete tracking data is devastating. Invest in systematic alumni tracking well before the renewal is due. For new applications, if the PD/PI's personal training record is thin, consider whether a co-PD/PI with a stronger mentoring history would strengthen the application.
Generic RCR plan. Listing online CITI modules and an annual seminar is insufficient. Describe a substantive, interactive, domain-specific RCR curriculum with documented faculty participation.
Institutional commitment letter that reads as boilerplate. A one-page letter from a dean who clearly did not read the application undermines the entire program. Work with institutional leadership to produce a detailed, specific, and credible commitment letter.
No diversity recruitment strategy. Programs that describe no concrete outreach to underrepresented populations or that rely solely on their institution's existing pipeline will lose points on the Trainees criterion.
Timeline for a Competitive Submission
T32 applications require far more lead time than research grants because of the number of contributors and documents involved. A realistic timeline looks like the following.
12 months before submission. Identify the PD/PI team, define the program's scientific focus and training gap, and contact your NIH program officer to discuss fit with institute priorities. Begin identifying participating faculty.
9 months before submission. Finalize the faculty roster. Begin collecting biosketches and other support documents. Draft the training plan outline and identify curricular components. Engage institutional leadership for the commitment letter.
6 months before submission. Circulate biosketch guidelines and deadlines to all faculty. Draft the training program and environment section. Begin compiling trainee outcome data (for renewals) or faculty training records (for new applications). Share the draft training plan with your program officer for informal feedback.
3 months before submission. Complete drafts of all narrative sections. Review and edit all faculty biosketches for consistency and relevance. Finalize the institutional commitment letter. Assemble the RCR plan and diversity recruitment strategy.
6 weeks before submission. Internal review by colleagues and, if possible, a mock study section. Incorporate feedback, finalize all documents, and begin uploading to the electronic submission system.
2 weeks before submission. Final proofreading, compliance checks, and institutional sign-off.
Frequently Asked Questions
Can early-stage investigators serve as T32 mentors?
Early-stage investigators and new investigators can participate as co-mentors, typically paired with a senior, funded primary mentor. Most institutes expect primary mentors to hold active R01-equivalent research support. Including a few early-career faculty signals program growth and succession planning, but they should not constitute a large share of your mentor roster. Check your target institute's specific expectations — NCI, for example, explicitly requires R01-equivalent funding for primary mentors.
How many trainee slots should we request?
There is no single right number, but the program needs enough trainees to create meaningful cohort dynamics. Most competitive programs support four to eight trainees across predoctoral and postdoctoral levels. The request should be proportional to your mentor pool (roughly one to two trainees for every three to four mentors), the size of your eligible applicant pool, and your ability to deliver the proposed curriculum to all appointees. Requesting too many slots relative to your infrastructure is as risky as requesting too few.
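The mentor-pool ratio above can be turned into a quick back-of-envelope range. This is only a heuristic sketch of that rule of thumb; the mentor count is a hypothetical example, and the right request still depends on applicant pool and curriculum capacity.

```python
# Back-of-envelope slot sizing from the rule of thumb above: roughly one
# to two trainees for every three to four mentors. Mentor count is a
# hypothetical example, not guidance from NIH.

def slot_range(mentors: int) -> tuple[int, int]:
    low = mentors // 4         # conservative end: 1 trainee per 4 mentors
    high = (2 * mentors) // 3  # upper end: 2 trainees per 3 mentors
    return low, high

low, high = slot_range(16)
print(f"16 mentors supports roughly {low} to {high} trainee slots")
```

A 16-mentor roster lands comfortably around the four-to-eight-slot range most competitive programs request; numbers near the upper end warrant a hard look at whether the curriculum can actually be delivered to that many appointees.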
What trainee outcome metrics matter most to reviewers?
For renewals, reviewers focus on first-author publication rates during training, time to degree completion, subsequent grant funding (especially transition to individual fellowships or K-awards), and career placement in research-intensive positions. Programs that can demonstrate a high percentage of former trainees in tenure-track faculty roles, research-intensive industry positions, or government research careers score best on the Training Record criterion. The 15-year tracking window means you need robust alumni tracking systems well before the renewal application is due.
How do the 2025 review changes affect our application strategy?
The most significant 2025 change is the elevation of Responsible Conduct of Research from an additional review consideration to a scored additional review criterion that directly influences overall impact scores. This means your RCR plan needs to be substantive, specific, and integrated with your training philosophy — not an appendix-quality afterthought. The 2025 updates also emphasize mentor training documentation and the institutional commitment to anti-discrimination policies. Review the updated PA-25-168 announcement and the NIH guide notice on training grant updates carefully, as the specific language and instructions have changed in ways that affect how you organize your application.
Should we contact the NIH program officer before submitting?
Absolutely, and you should do it early — ideally 12 months before submission. Program officers can tell you whether your proposed focus aligns with institute training priorities, whether the institute is accepting new T32 applications or only renewals, and whether your faculty roster and institutional environment meet threshold expectations. They cannot review drafts or guarantee funding, but a 30-minute conversation can prevent you from investing months of effort in an application that does not fit the institute's portfolio. For renewal applications, your existing program officer can provide invaluable feedback on what the previous review panel flagged as strengths and weaknesses.
A T32 application is one of the most labor-intensive grants in the NIH portfolio, requiring coordination across dozens of faculty, institutional leaders, and administrative staff — Granted can help you organize that complexity and keep every component aligned from first draft through submission.