Introduction
Getting reviewer feedback on your EPA SBIR proposal can be hard to take, especially if the outcome was a rejection. In reality, though, this feedback is one of your most valuable tools. It tells you exactly what a panel of experts thought, what gave them pause, and what you need to fix to earn funding next time. Many successful EPA SBIR awardees didn’t win on their first try; they won on a resubmission informed by targeted revisions.
This guide will show you how to interpret common reviewer comments, identify what they’re really saying, and use those insights to build a stronger, more fundable proposal. Whether you’re planning a Phase I or Phase II resubmission, learning how to decode the review language is your first step to turning “not recommended” into “award funded.”
Understand the Structure of EPA Reviewer Comments
After your EPA SBIR proposal is reviewed, you’ll receive written critiques from multiple reviewers. These comments are not random—they’re guided by specific evaluation criteria laid out by the EPA. Understanding how these criteria shape the feedback is key to decoding what reviewers are really saying.
Evaluation Framework
Reviewer comments generally map to five core categories used by the EPA SBIR program:
- Technical merit and approach
- Innovation and originality
- Commercial potential and market understanding
- Environmental relevance and benefits
- Qualifications of the team and adequacy of facilities
For Phase I proposals, reviewers want to see a tightly scoped, credible proof-of-concept with potential for real-world impact. For Phase II, the bar is higher: you must demonstrate strong Phase I results and show a clear path to market.
Common Feedback Types
Despite the variety of projects submitted, certain critiques show up again and again. Comments like “the work plan is overly ambitious,” “the commercialization strategy lacks specificity,” or “the environmental impact is not quantified” are standard signals that your proposal fell short in predictable areas. That’s good news, because predictable weaknesses can be fixed.
Technical Merit: Fixing Weaknesses in Your R&D Plan
Many EPA SBIR proposals are rejected not because the idea is bad, but because the research plan isn’t convincing. When reviewers flag your technical merit, they’re often telling you that your proposal lacks clarity, feasibility, or rigor.
Methodology and Scope
A common critique is that your experimental plan is too vague or attempts too much within the Phase I timeframe. If you see comments like “unclear how the tasks will be completed” or “objectives are too ambitious for a 6-month project,” that’s a red flag. Reviewers need to see that your technical goals are specific, measurable, and achievable with the resources available.
Design and Innovation
EPA reviewers also assess scientific rigor. Comments such as “sample size appears insufficient” or “missing control conditions” indicate concerns about whether your methods will generate reliable results. Similarly, “overlaps with existing solutions” usually means your innovation isn’t well differentiated. If your technology offers only a minor improvement over what’s already out there, it’s your job to explain why that improvement matters.
Risk Mitigation
Another frequent concern is the failure to address technical risks. If your proposal doesn’t mention known hurdles, such as scalability, material compatibility, or environmental durability, reviewers may assume you haven’t thought them through. Your resubmission should include a section that names these risks and outlines how you’ll address them, even if your mitigation plans are still preliminary.
Strengthening the Commercialization Strategy
Even with a solid technical plan, your proposal won’t advance if reviewers question the business case. The EPA SBIR program weighs commercial potential just as heavily as technical merit, and many proposals fall short in this area.
Market Size and Fit
One of the most common critiques is that the commercialization plan “does not clearly define the target market.” This signals that reviewers were left guessing about who will buy your solution, how large the market is, or why your timing is right. Strengthen this by quantifying the market size, identifying key customer segments, and specifying how your solution solves a real pain point.
Partnerships and Distribution
If a reviewer notes “lack of identified partnerships,” that usually means they’re skeptical about your go-to-market strategy. In your revision, name specific collaborators—such as pilot customers, manufacturers, or distribution partners. Letters of support can provide strong validation here.
Financial Projections
A vague or inflated revenue forecast undermines credibility. Phrases like “overly optimistic” or “unsupported projections” suggest you need to back up your numbers with clear assumptions. Tie your forecasts to real sales channels, pricing models, and customer acquisition plans.
Making the Environmental Case Stronger
EPA SBIR proposals must do more than promise a marketable innovation—they must also demonstrate meaningful environmental impact. If reviewers flagged this part of your proposal, it’s often because the benefits weren’t clearly articulated, quantified, or aligned with EPA priorities.
Relevance to EPA Priorities
A comment like “unclear connection to the EPA topic” or “weak alignment with environmental goals” is a serious warning. Reviewers expect your proposal to explicitly address the problem identified in the EPA solicitation. Be specific: state which environmental issue your solution addresses and cite EPA language where possible.
Quantifying Benefits
Reviewers often criticize proposals that “generally describe benefits” without offering concrete metrics. If you’re claiming pollutant reduction, cost savings, or energy efficiency, include estimates: how much, compared to what, and how will it be measured?
Lifecycle Considerations
Some reviewers also expect proposals to account for the product’s lifecycle—especially if there are potential trade-offs. If your solution involves chemicals, materials, or disposal impacts, acknowledge those concerns and explain how they’ll be mitigated.
By clearly linking your innovation to EPA’s environmental mission—and backing that link with measurable impact—you give reviewers a stronger reason to recommend funding.
Responding to Team and Budget Critiques
Reviewers also evaluate whether your team has the expertise and resources to carry out the project. If this part of your proposal drew criticism, it often points to a perceived mismatch between your goals and your team’s capabilities—or questions about how your budget supports the work.
Team Credentials
Comments like “team lacks expertise in X” or “roles are not clearly defined” are common, and they suggest that reviewers doubt your ability to execute. Address this by highlighting each team member’s relevant experience more effectively, or by adding an advisor or subcontractor to cover any gaps.
Facility Readiness
If a reviewer questions your access to basic equipment or lab space, your proposal may have left the impression that you aren’t ready to begin the work. Clarify what you have in-house and what you’ll access through partnerships, and include letters confirming access to shared labs or university equipment where appropriate.
Budget Realism
Sometimes reviewers flag costs that seem out of proportion or unjustified for a Phase I or Phase II effort. For example, if a large portion of your budget goes toward purchasing expensive equipment, that can trigger skepticism. In your revision, tie every major line item to a specific task in the work plan and explain its necessity in the budget justification.
When reviewers raise concerns about your team or budget, it’s not just about money or resumes—it’s about trust. They need to be convinced you can deliver on your proposed work without unnecessary risk.
Final Checklist Before You Resubmit
Once you’ve interpreted the reviewer feedback and made your revisions, it’s time to step back and double-check that your resubmission covers all the key bases. Use this checklist as a final sweep before uploading your next EPA SBIR proposal.
Technical Revisions
- Have you clarified your research objectives and methodology?
- Did you address concerns about feasibility, risk, or innovation?
- Have you added detail to your experimental design or included preliminary data?
Commercialization Enhancements
- Did you better define your target market and customer needs?
- Have you specified your path to market, including partners or pilots?
- Are your revenue projections backed by realistic assumptions?
Environmental Impact
- Does your proposal clearly tie into an EPA topic area?
- Have you quantified the environmental benefits?
- Did you address lifecycle or sustainability concerns?
Team and Budget Improvements
- Are all key roles clearly assigned with relevant expertise?
- Have you clarified access to necessary facilities and equipment?
- Is your budget aligned with your scope—and fully justified?
Communication and Tone
- Is your language clear and persuasive without being exaggerated?
- Have you used section headers, white space, and graphics (where allowed) to improve readability?
- Where allowed, did you clearly explain what changed from your previous submission?
Resubmitting can feel like starting over, but it’s actually a second chance with a major advantage: specific insights into what didn’t land the first time. Use those insights well, and your odds of success increase significantly.
Conclusion
Reviewer feedback may be tough to read, but it’s one of the most valuable resources you have in the EPA SBIR process. It doesn’t just point out what went wrong—it gives you a roadmap to a stronger, more fundable proposal. The key is to treat that feedback as actionable intelligence: dissect it, respond to it, and revise accordingly.
Many SBIR awardees didn’t succeed on their first submission. What sets them apart is how they used reviewer comments to improve. With clear revisions, sharper alignment to EPA priorities, and a focused resubmission strategy, you can turn a “no” into a well-earned “yes.”