How Federal Proposal Evaluation Works
Federal agencies evaluate proposals using criteria published in the solicitation. Understanding this process helps you write proposals that score well.
The evaluation framework:
- Section L — Instructions for proposal content
- Section M — Evaluation criteria and their relative importance
- Evaluation team — Government personnel who score proposals
- Source Selection Authority — Official who makes award decision
Evaluation process overview:
- Proposals received by deadline
- Initial compliance check (all required elements present?)
- Technical evaluation against criteria
- Price/cost evaluation
- Past performance evaluation
- Best value tradeoff (if applicable)
- Award decision
Key principle:
Evaluators can only score what you write. They won't assume capabilities you don't describe. Make it easy for them to find and credit your strengths.
Evaluation Factors and Subfactors
Typical evaluation structure:
Factor 1: Technical Approach
- Subfactor 1.1: Understanding of requirements
- Subfactor 1.2: Technical solution
- Subfactor 1.3: Innovation/value-added features
Factor 2: Management Approach
- Subfactor 2.1: Program management
- Subfactor 2.2: Quality control
- Subfactor 2.3: Risk management
Factor 3: Key Personnel
- Subfactor 3.1: Qualifications
- Subfactor 3.2: Relevant experience
Factor 4: Past Performance
Factor 5: Price/Cost
Relative importance:
Section M specifies how factors relate:
- "Factor 1 is significantly more important than Factor 2"
- "Factors 1 and 2 combined are more important than Factor 3"
- "All non-price factors combined are significantly more important than price"
These statements guide how you allocate proposal effort and resources.
Adjectival Rating Systems
Common rating scales:
| Rating | Definition |
|---|---|
| Outstanding/Exceptional | Exceeds requirements, significant strengths, very low risk |
| Good/Very Good | Meets and may exceed some requirements, strengths outweigh weaknesses |
| Acceptable/Satisfactory | Meets requirements, no significant strengths or weaknesses |
| Marginal | Doesn't fully meet requirements, weaknesses offset strengths |
| Unacceptable | Fails to meet requirements, significant deficiencies |
Color coding (some agencies):
- Blue — Exceptional
- Green — Acceptable
- Yellow — Marginal
- Red — Unacceptable
Numerical scoring:
Some agencies use point systems:
- Outstanding: 90-100 points
- Good: 70-89 points
- Acceptable: 50-69 points
- Marginal: 30-49 points
- Unacceptable: 0-29 points
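The point bands above can be sketched as a simple lookup. This is an illustrative sketch only — band edges and labels vary by agency, and the function below simply encodes the example ranges listed above.

```python
def adjectival_rating(points: int) -> str:
    """Map a 0-100 point score to an adjectival rating.

    Bands follow the illustrative example above; real agencies
    define their own scales in the solicitation.
    """
    if not 0 <= points <= 100:
        raise ValueError("score must be between 0 and 100")
    if points >= 90:
        return "Outstanding"
    if points >= 70:
        return "Good"
    if points >= 50:
        return "Acceptable"
    if points >= 30:
        return "Marginal"
    return "Unacceptable"

print(adjectival_rating(85))  # prints "Good"
```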
Strengths, Weaknesses, and Deficiencies
Strength:
An aspect of the proposal that exceeds requirements or enhances the probability of successful performance.
Examples:
- Innovative approach that reduces risk
- Key personnel with directly relevant experience
- Additional capabilities beyond requirements
- Proven past performance on similar work
Weakness:
A flaw in the proposal that increases the risk of unsuccessful performance.
Examples:
- Approach that doesn't fully address a requirement
- Personnel with limited relevant experience
- Unclear or incomplete descriptions
- Missing information
Significant Weakness:
A flaw that appreciably increases the risk of unsuccessful performance.
Deficiency:
A material failure to meet a government requirement, or a combination of significant weaknesses that increases the risk of unsuccessful performance to an unacceptable level.
Impact on ratings:
- Multiple strengths, no weaknesses → Outstanding
- Some strengths, minor weaknesses → Good
- Meets requirements, no significant issues → Acceptable
- Weaknesses that create doubt → Marginal
- Deficiencies → Unacceptable
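The rough decision rules above can be expressed as ordered checks. This is a hypothetical sketch of how findings drive ratings, not an official methodology — real evaluators weigh the significance of each finding, not just the counts.

```python
def rate_proposal(strengths: int, weaknesses: int, deficiencies: int) -> str:
    """Apply the rough rating rules above to counts of findings.

    Illustrative only: actual evaluations judge the significance
    of each strength/weakness, not merely how many there are.
    """
    if deficiencies > 0:
        return "Unacceptable"          # any deficiency is disqualifying
    if weaknesses > strengths:
        return "Marginal"              # weaknesses offset strengths
    if strengths >= 2 and weaknesses == 0:
        return "Outstanding"           # multiple strengths, no weaknesses
    if strengths > 0:
        return "Good"                  # some strengths, minor weaknesses
    return "Acceptable"                # meets requirements, nothing notable
```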
What Evaluators Look For
Understanding requirements:
- Do you demonstrate you understand the problem?
- Have you addressed all requirements?
- Does your approach fit the context?
Technical approach:
- Is your solution technically sound?
- Will it actually work?
- Have you identified and mitigated risks?
- Is the approach realistic and feasible?
Management:
- Can you actually execute what you propose?
- Do you have adequate controls and processes?
- How will you handle problems?
Personnel:
- Are proposed staff qualified?
- Do they have relevant experience?
- Are they available?
- Is the team structured appropriately?
What earns high scores:
- Exceeding requirements with valuable enhancements
- Demonstrating deep understanding
- Proposing innovative, lower-risk approaches
- Strong, specific past performance
- Clear, well-organized presentation
Writing for High Scores
Address every requirement:
- Use a compliance matrix to track
- Explicitly address each "shall" statement
- Don't assume evaluators will infer
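A compliance matrix can be as simple as a table of "shall" statements mapped to proposal sections. The sketch below is illustrative — the requirement IDs and section numbers are made up — but it shows the basic tracking idea: every requirement gets a row, and any row without a proposal section is a gap.

```python
# Hypothetical compliance matrix: requirement IDs and sections are
# invented for illustration; yours come from the actual RFP.
compliance_matrix = [
    {"req": "C.3.1 Contractor shall provide 24/7 help desk support",
     "proposal_section": "2.1", "addressed": True},
    {"req": "C.3.2 Contractor shall maintain 99.9% system uptime",
     "proposal_section": "2.4", "addressed": True},
    {"req": "C.3.3 Contractor shall submit monthly status reports",
     "proposal_section": None, "addressed": False},
]

# Any unaddressed "shall" is a compliance gap to close before submission.
gaps = [row["req"] for row in compliance_matrix if not row["addressed"]]
print(f"{len(gaps)} unaddressed requirement(s): {gaps}")
```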
Lead with benefits:
- Don't just describe what you'll do — explain why it's better
- "Our approach reduces schedule risk by..." not just "We will..."
- Connect features to government benefits
Be specific:
- Generalities don't earn strengths
- Quantify where possible
- Name names, cite specific experience
- "Our team has 500 years of combined experience" is less compelling than specific project examples
Organize for evaluators:
- Follow RFP structure exactly
- Use headings that match evaluation criteria
- Make strengths easy to find
- Use graphics to reinforce key points
Avoid weaknesses:
- Address potential concerns proactively
- Don't leave gaps or ambiguities
- Explain anything unusual
- If you're light somewhere, acknowledge and mitigate
Best Value Tradeoffs
How tradeoff analysis works:
In best value procurements, the government compares technical scores against price to determine best value:
- Is a higher-rated, higher-priced proposal worth the extra cost?
- What is the government getting for the price difference?
- Are the strengths meaningful to successful performance?
Tradeoff documentation:
The Source Selection Authority must document:
- What discriminators justify the decision
- Why higher price is (or isn't) worth the added value
- How the decision serves the government's best interest
Implications for proposals:
- Your strengths must be worth paying for
- Generic strengths don't justify price premiums
- Discriminators must reduce risk or increase value
Example tradeoff:
"Offeror A is rated Outstanding and priced at $10M. Offeror B is rated Good and priced at $8M. Offeror A's strengths include proven experience with the exact system being replaced and a technical approach that eliminates integration risk. The $2M premium is justified by reduced risk and higher probability of on-time delivery."
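The arithmetic in that example can be laid out explicitly. The sketch below uses the same ratings and prices as the narrative; the rating ranks are an assumption for comparison purposes, and the "premium" is simply the price difference — the actual tradeoff judgment remains a documented, qualitative SSA decision.

```python
# Assumed ordinal ranks for comparing adjectival ratings (illustrative).
RANK = {"Unacceptable": 0, "Marginal": 1, "Acceptable": 2,
        "Good": 3, "Outstanding": 4}

# Figures mirror the example tradeoff in the text above.
offerors = [
    {"name": "Offeror A", "rating": "Outstanding", "price": 10_000_000},
    {"name": "Offeror B", "rating": "Good", "price": 8_000_000},
]

best_technical = max(offerors, key=lambda o: RANK[o["rating"]])
lowest_price = min(offerors, key=lambda o: o["price"])
premium = best_technical["price"] - lowest_price["price"]

print(f"{best_technical['name']} costs ${premium:,} more than "
      f"{lowest_price['name']}; the SSA must document whether its "
      f"strengths justify that premium.")
```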
Common Evaluation Pitfalls
Proposal problems that hurt scores:
Compliance failures:
- Not addressing all requirements
- Missing required sections
- Exceeding page limits
- Wrong format
Technical issues:
- Approach doesn't fully meet requirements
- Solution is technically flawed
- Risks not identified or mitigated
- Unrealistic claims
Presentation issues:
- Hard to follow or poorly organized
- Vague, generic language
- Inconsistencies between sections
- Poor quality (typos, formatting errors)
Strategic errors:
- Not understanding what evaluators value
- Focusing on wrong things
- Underinvesting in important areas
- Not differentiating from competitors
How to avoid these:
- Thorough compliance reviews
- Color team reviews (Pink, Red, Gold)
- Write to evaluation criteria, not just requirements
- Give evaluators what they need to score you highly
Frequently Asked Questions
Q: Can I see my evaluation scores after award?
Yes. You're entitled to a debriefing that includes your scores, strengths, weaknesses, and how you compared to the winner. Request a debriefing whether you win or lose — it's valuable feedback.
Q: How do evaluators handle missing information?
They typically can't give credit for what you don't write. Missing information becomes a weakness. If you make the competitive range, you may get questions to clarify, but don't count on it.
Q: Do all evaluation factors matter equally?
No. Section M specifies relative importance. Some factors are "significantly more important" than others. Allocate your effort accordingly — invest most in the highest-weighted factors.
Q: What if I disagree with my evaluation?
Request a detailed debriefing to understand the rationale. If you believe the evaluation was flawed, you may have grounds for a GAO protest — but protests are expensive and rarely successful.
Q: How do I know what will be considered a "strength"?
Strengths typically exceed requirements in ways that benefit the government — reducing risk, improving quality, saving time/money. Think "what will make evaluators say this is better than just meeting the requirement?"
Q: Is "acceptable" a bad score?
Acceptable means you meet requirements but don't stand out. In best value procurements, acceptable may lose to proposals with strengths. In LPTA, acceptable is sufficient if your price is lowest.
Q: Can I contact evaluators during the process?
No. All communication must go through the contracting officer. Attempting to contact evaluators could disqualify your proposal and potentially violate procurement integrity rules.
Q: How long does evaluation typically take?
It varies widely — from weeks to months depending on complexity, number of proposals, and agency workload. Complex best-value procurements take longer than simple LPTA evaluations.
Maximize Your Evaluation Scores
Understanding how evaluators score proposals is key to winning. Our team helps you write proposals that earn high technical scores and differentiate from competitors.