NSF Just Gutted Its Peer Review Process — What It Means for Your Next Proposal

February 24, 2026 · 4 min read

Claire Cummings

For decades, the deal was straightforward: submit a proposal to the National Science Foundation, and a panel of three to five outside scientists would read it, debate its merits, and recommend funding. That system is now being dismantled.

Under sweeping changes announced this month, NSF proposals may receive as few as one external review — down from the longstanding minimum of three. Expert panels will no longer routinely discuss individual reviews in detail. Review summaries that once ran a full page will be compressed to three to five sentences. And program officers, the NSF staff who manage portfolios, will gain substantially more authority to recommend which proposals get funded and which don't.

NSF leadership frames this as a modernization effort that will speed up time-to-award and reduce administrative burden. Critics — including many working scientists and former program directors — see something more troubling: a concentration of decision-making power that weakens the scientific input that has defined NSF's identity since 1950.

Fewer Reviewers, More Discretion

The core change is arithmetic, but its effects are structural. When a proposal gets three to five independent reviews, outlier opinions get balanced by the group. A reviewer who misunderstands the methodology is counterweighted by two who don't. A panel discussion surfaces disagreements and forces reviewers to defend their assessments.

With a single review, none of that happens. One person's reading of your proposal may be the only external scientific input the program officer receives. And with the new guidelines expanding program officer discretion, that officer can weigh — or disregard — that single review as they see fit.

This doesn't mean program officers will make arbitrary decisions. Many are rigorous scientists themselves. But the structural safeguard of multiple independent assessments, the mechanism that kept any single viewpoint from dominating, is being weakened at exactly the moment when trust in institutional decision-making is already fragile.

The "Administration Priority Areas" Signal

Buried in the restructuring is a telling detail: NSF's updated guidance explicitly calls out "Administration priority areas" as factors in funding decisions. The named priorities are artificial intelligence, quantum information science, biotechnology, nuclear energy, and translational science.

This isn't entirely new — agencies have always had strategic priorities. But stating them this explicitly alongside changes that give program officers more unilateral power creates a clear message about where money will flow. Researchers working in these areas should see a tailwind. Researchers in fields that don't map cleanly onto these priorities — much of the social sciences, ecology, pure mathematics, humanities-adjacent work — should be paying very close attention.

The practical question is whether program officers will feel pressure to align funding decisions with these stated priorities even when the science doesn't warrant it. Without the counterbalance of robust panel deliberation, that pressure has fewer checks.

What This Means for Proposal Strategy

If you're writing an NSF proposal in 2026, you're writing for a different audience than you were a year ago. Here's what changes:

Your program officer matters more than ever. When panels drove decisions, a strong proposal could succeed even if the program officer was lukewarm. Now the officer's assessment carries disproportionate weight. Understanding your program officer's research interests, reading their recent awards, and — where appropriate — having a pre-submission conversation become not just helpful but essential.

Your Broader Impacts section needs real substance. With fewer reviewers providing detailed critiques, program officers need easy justification for funding decisions. A compelling Broader Impacts section gives them that justification. Generic statements about "training the next generation of scientists" won't cut it. Specific, measurable plans with institutional support letters will.

Clarity beats cleverness. When three to five specialists read your proposal, you could afford some technical density — at least one reviewer probably worked in your subfield. With potentially just one reviewer, the odds of landing a deep specialist drop. Write so that any scientist in your broad discipline can follow the argument and see its significance.

Connect to stated priorities where honest. If your work genuinely touches AI, quantum, biotech, nuclear energy, or translational science, make that connection explicit. Don't contort your research to fit — reviewers and program officers can smell opportunistic framing — but don't leave legitimate connections unstated either.

The Speed Trade-Off

NSF's stated rationale deserves fair consideration. Time-to-award at NSF has been a chronic problem, with some programs taking 12 to 18 months from submission to decision. The reviewer recruitment crisis is real — senior scientists are drowning in review requests and increasingly declining. Reducing the number of required reviews and streamlining panel processes could genuinely accelerate funding.

But speed and rigor aren't equivalent, and the history of science funding is littered with examples of what happens when evaluation shortcuts lead to concentration of authority. The question isn't whether the old system was perfect — it wasn't — but whether the replacement preserves enough scientific input to maintain credibility.

What to Watch Next

These changes are rolling out across NSF programs now, but implementation will vary by directorate and division. Some program officers will continue seeking multiple reviews even when one is technically sufficient. Others will embrace the new flexibility aggressively.

The next six months of award decisions will reveal how these changes play out in practice. Researchers should track award patterns in their programs, talk to their program officers, and compare notes with colleagues. If certain programs begin showing patterns that suggest reduced scientific input is affecting quality, the research community needs to document and raise those concerns while the policy is still being implemented.

For researchers navigating this shifting landscape, tools like Granted can help you identify the right funding opportunities and craft proposals calibrated to the new reality — because when the rules change, the first movers who adapt their strategy have a measurable edge.
