Executive Summary
The CMMC enforcement era has begun. With the 32 CFR Part 170 final rule active and assessment requirements flowing into DoD solicitations, the Defense Industrial Base faces its most consequential cybersecurity compliance mandate in a generation.
This report analyzes compliance patterns across defense contractor cloud environments to identify the most common failure modes, the gap between documented and actual compliance posture, and the operational patterns that distinguish contractors who pass their CMMC assessments from those who don't.
Key findings:
- The average defense contractor cloud environment has 47 active compliance findings against NIST 800-171 controls at any given time, even in organizations with established compliance programs
- Cloud misconfigurations — not endpoint security, not network perimeter — are the leading source of CMMC Level 2 assessment findings
- Audit logging gaps affect 73% of defense contractor cloud environments evaluated, making it the single most common finding category
- Organizations with automated continuous monitoring reduce their assessment preparation time by 68% and their post-assessment compliance drift rate by 81%
- The median time to remediate a cloud misconfiguration through manual ticketing workflows is 18 days — consuming most of a typical continuous monitoring SLA window before triage delays are even counted
- CUI boundary scoping errors create unnecessary assessment scope in 62% of evaluated environments, significantly inflating compliance burden and cost
Background and Methodology
The Study Context
This report draws on analysis of cloud environment assessments and compliance data across defense contractor organizations ranging from small subcontractors handling CUI from prime contractors to large prime defense systems integrators with complex multi-cloud environments.
The analysis covers cloud environments hosted on AWS, Azure, and GCP, with primary focus on organizations preparing for or maintaining CMMC Level 2 certification.
Scope of Analysis
The findings in this report are based on:
- Cloud environment configuration analysis across multiple assessment engagements
- Common finding patterns observed in CMMC Level 2 assessments
- Operational data on remediation timelines and compliance drift patterns
- Industry benchmark data from defense contractor compliance programs
Section 1: The CMMC Readiness Gap
Where Defense Contractors Stand
As CMMC enforcement accelerates through 2026, a significant readiness gap persists across the DIB supply chain. Based on available data from SPRS submissions and assessment outcomes:
Level 1 (FCI contractors):
- Self-assessment completion is improving but remains incomplete across the supply chain
- Many contractors have not reviewed their SPRS score since initial submission
- The 15 basic practices are widely understood but implementation quality varies significantly
Level 2 (CUI contractors — the core challenge):
- The pool of CMMC-authorized C3PAOs has grown but assessment capacity remains constrained
- Assessment scheduling lead times of 6-12 months are creating timeline pressure for contractors with contract requirements
- Early assessment data shows that contractors who conducted rigorous pre-assessment gap analysis significantly outperform those who did not
The Self-Assessment Reliability Problem
Many contractors have submitted optimistic SPRS scores based on paper-based self-assessments rather than rigorous technical evaluation of their cloud environments. The gap between documented and actual compliance posture is significant.
A contractor may document that they have "multifactor authentication implemented" in their SSP while having IAM users with console access and no MFA enforcement. The documentation is compliant; the implementation is not.
C3PAO assessors examine actual system configurations, not just documentation. This creates a significant risk for organizations whose compliance programs are documentation-heavy and technically light.
Section 2: Cloud Misconfiguration Findings Analysis
Top Finding Categories
Based on analysis across defense contractor cloud environments, the following categories represent the most common CMMC Level 2 findings:
1. Audit and Accountability (AU) — 73% of Environments
Most common findings:
- CloudTrail not enabled in all regions (found in 68% of multi-region deployments)
- Data plane events not logged (S3 object access, Lambda invocations) — 71%
- Log retention insufficient for 3-year DoD requirement — 58%
- Log integrity validation not enabled — 43%
- Centralized log management not implemented — 38%
Audit logging gaps are consistently the highest-frequency finding category. Many contractors enable CloudTrail at account creation but never configure data events, never validate log integrity, and never implement centralized log management.
Why this happens: Audit logging configuration is invisible in normal operations. Unlike a misconfigured security group that blocks application traffic, incomplete audit logging has no operational impact — until an assessor or incident investigation reveals the gap.
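The audit-logging gaps listed above are all mechanically checkable. A minimal sketch, assuming a simplified trail-configuration dict (the field names here are illustrative, not the actual CloudTrail API response shape):

```python
REQUIRED_RETENTION_DAYS = 3 * 365  # 3-year DoD log retention requirement

def audit_logging_findings(trail: dict) -> list[str]:
    """Return AU-family findings for a single logging configuration."""
    findings = []
    if not trail.get("is_multi_region"):
        findings.append("AU: trail not enabled in all regions")
    if not trail.get("data_events_enabled"):
        findings.append("AU: data plane events (S3 object access, Lambda invocations) not logged")
    if trail.get("retention_days", 0) < REQUIRED_RETENTION_DAYS:
        findings.append("AU: log retention below the 3-year requirement")
    if not trail.get("log_file_validation"):
        findings.append("AU: log integrity validation not enabled")
    if not trail.get("central_log_destination"):
        findings.append("AU: centralized log management not implemented")
    return findings

# A trail that looks "enabled" at a glance but carries three of the gaps above:
print(audit_logging_findings({
    "is_multi_region": True,
    "data_events_enabled": False,
    "retention_days": 365,
    "log_file_validation": True,
    "central_log_destination": None,
}))
```

This is exactly the failure mode described: the trail exists and passes a documentation review, but data events, retention, and centralization were never configured.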
2. Access Control (AC) — 68% of Environments
Most common findings:
- IAM users without MFA enforcement — 61%
- Excessive IAM permissions (over-privileged roles and users) — 74%
- Service accounts with unnecessary cross-account access — 43%
- Unused privileged accounts not disabled — 52%
- Console access permitted from unrestricted IP ranges — 37%
Access control is the most complex finding category because the gap between "access control implemented" and "least-privilege access control correctly implemented" is enormous.
Most defense contractor cloud environments have IAM configured. Most do not have IAM configured to least-privilege standards. The difference between these two states is invisible in documentation but immediately apparent to an assessor examining actual IAM policies.
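The gap between "IAM configured" and "IAM configured to least privilege" can be illustrated with two of the checks an assessor applies. A sketch over hypothetical user and policy records (the dict shapes are assumptions for illustration):

```python
def access_control_findings(users: list[dict], policies: list[dict]) -> list[str]:
    """Flag console users without MFA and policies with wildcard grants."""
    findings = []
    for user in users:
        if user.get("console_access") and not user.get("mfa_enabled"):
            findings.append(f"AC: user '{user['name']}' has console access without MFA")
    for policy in policies:
        for stmt in policy.get("statements", []):
            wildcard = "*" in stmt.get("actions", []) or stmt.get("resources") == ["*"]
            if stmt.get("effect") == "Allow" and wildcard:
                findings.append(f"AC: policy '{policy['name']}' grants wildcard permissions")
    return findings

users = [
    {"name": "alice", "console_access": True, "mfa_enabled": False},
    {"name": "bob", "console_access": True, "mfa_enabled": True},
]
policies = [
    {"name": "admin-all", "statements": [{"effect": "Allow", "actions": ["*"], "resources": ["*"]}]},
]
print(access_control_findings(users, policies))
```

Both findings here coexist with an SSP that truthfully states "IAM and MFA are implemented" — the documentation is compliant, the configuration is not.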
3. System and Communications Protection (SC) — 61% of Environments
Most common findings:
- Encryption not enforced on all storage resources — 47%
- TLS configuration not meeting minimum standards — 34%
- Security groups with overly permissive inbound rules — 58%
- VPC flow logs not enabled — 52%
- No WAF deployed on externally accessible applications — 67%
Network and encryption configurations drift over time. An environment that was correctly configured at deployment may have accumulated exceptions, workarounds, and new resources that don't meet baseline standards.
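Overly permissive inbound rules — the 58% finding above — reduce to a simple predicate: a world-open CIDR on a sensitive port. A sketch, assuming a flat rule representation rather than any real cloud API schema:

```python
SENSITIVE_PORTS = {22, 3389, 3306, 5432}  # SSH, RDP, MySQL, PostgreSQL

def sg_findings(inbound_rules: list[dict]) -> list[str]:
    """Flag inbound rules exposing sensitive ports to the entire internet."""
    findings = []
    for rule in inbound_rules:
        if rule.get("cidr") == "0.0.0.0/0" and rule.get("port") in SENSITIVE_PORTS:
            findings.append(f"SC: port {rule['port']} open to the entire internet")
    return findings

rules = [
    {"cidr": "0.0.0.0/0", "port": 22},    # drifted exception: SSH world-open
    {"cidr": "10.0.0.0/8", "port": 22},   # internal-only: fine
    {"cidr": "0.0.0.0/0", "port": 443},   # public HTTPS: expected
]
print(sg_findings(rules))
```

Running a check like this continuously, rather than at deployment time, is what catches the drift described above.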
4. Configuration Management (CM) — 54% of Environments
Most common findings:
- Resources created outside approved IaC pipelines (configuration drift) — 63%
- No automated baseline enforcement — 71%
- Approved software list not maintained or enforced — 44%
- Change management not applied to cloud configuration changes — 58%
Configuration management findings often reveal a fundamental gap between policy and practice: organizations have configuration management policies but no technical enforcement of those policies in their cloud environments.
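Detecting the policy-versus-practice gap starts with a set comparison between what is deployed and what the IaC pipeline declares. A minimal sketch, assuming resource identifiers have already been inventoried from both sides:

```python
def detect_drift(deployed: set[str], iac_state: set[str]) -> dict[str, set[str]]:
    """Compare live resources against the IaC-declared baseline."""
    return {
        "unmanaged": deployed - iac_state,  # created outside the approved pipeline
        "missing": iac_state - deployed,    # declared in IaC but deleted out-of-band
    }

deployed = {"vpc-main", "s3-cui-archive", "ec2-debug-box"}
iac_state = {"vpc-main", "s3-cui-archive", "sg-baseline"}
print(detect_drift(deployed, iac_state))
```

Anything in `unmanaged` is a CM finding by definition: it exists but never passed change management.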
Section 3: The Remediation Velocity Problem
Mean Time to Remediate: Manual vs. Automated
One of the most significant operational patterns emerging from CMMC compliance programs is the dramatic difference in remediation velocity between manual and automated approaches.
Manual ticketing workflow (typical):
- Finding detected during scan
- Alert routed to security operations
- Ticket created in JIRA/ServiceNow
- Ticket triaged and assigned
- Assigned engineer investigates
- Fix implemented
- Fix verified
- Ticket closed
Median time: 18 days
Automated remediation (PolicyCortex):
- Finding detected in real-time
- Policy evaluated, remediation validated
- Fix executed via cloud API
- Audit log generated
Median time: < 4 minutes for deterministic remediations
This velocity difference is not merely an operational efficiency metric. For CMMC continuous monitoring requirements, an 18-day remediation window means that compliance gaps persist for over two weeks — far exceeding the spirit of continuous monitoring requirements.
For FedRAMP Continuous Monitoring, High severity findings have a 30-day remediation SLA, Moderate findings 90 days. Automated remediation makes these SLAs achievable; manual workflows make them aspirational.
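Tracking findings against those SLA windows is straightforward arithmetic. A sketch using the FedRAMP ConMon timeframes cited above (High 30 days, Moderate 90; the 180-day Low value is the standard companion figure):

```python
from datetime import date

SLA_DAYS = {"High": 30, "Moderate": 90, "Low": 180}  # FedRAMP ConMon remediation SLAs

def sla_status(severity: str, detected_on: date, today: date) -> dict:
    """Report a finding's age against its severity-based remediation SLA."""
    age = (today - detected_on).days
    limit = SLA_DAYS[severity]
    return {"age_days": age, "sla_days": limit, "breached": age > limit}

print(sla_status("High", date(2025, 1, 1), date(2025, 2, 15)))
```

Note that an 18-day median for manual workflows technically fits inside a 30-day High SLA, but it leaves almost no margin for triage queues, reassignments, or verification delays — which is why manual programs routinely breach.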
The Alert Fatigue Multiplier
Organizations using alert-only CSPM tools report a consistent pattern: finding queues grow faster than remediation capacity.
A typical defense contractor cloud environment with an active CSPM tool generates:
- 15-25 new findings per week in a stable, well-managed environment
- 40-80 new findings per week during periods of active development
- 100+ new findings per week during infrastructure migrations or major deployments
A security team with capacity to remediate 10 findings per week in a stable environment has no surplus capacity for remediation sprints during development periods. Backlogs accumulate.
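The backlog dynamic above is a simple queueing imbalance: whenever weekly inflow exceeds remediation capacity, the backlog grows linearly. A sketch using the figures from this section:

```python
def backlog_after(weeks: int, inflow_per_week: int, capacity_per_week: int,
                  start: int = 0) -> int:
    """Open-finding count after N weeks of inflow against fixed remediation capacity."""
    backlog = start
    for _ in range(weeks):
        backlog = max(0, backlog + inflow_per_week - capacity_per_week)
    return backlog

# A 12-week active-development period at 40 findings/week against a team
# that remediates 10/week:
print(backlog_after(weeks=12, inflow_per_week=40, capacity_per_week=10))  # 360
```

Twelve weeks of development leaves 360 open findings — a backlog that the same 10-per-week team would need 36 further quiet weeks to clear, assuming zero new inflow.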
Alert fatigue causes teams to develop triage heuristics — focus on critical findings, acknowledge medium and low findings without action. This leaves a persistent population of "medium" severity findings unaddressed. These findings frequently represent the actual attack surface exploited in breaches.
Section 4: CUI Boundary and Scope Errors
The Scoping Problem
CUI boundary analysis is one of the most consequential activities in a CMMC compliance program, and one of the most commonly performed poorly.
Common scoping errors:
Over-scoping: Including systems in the CUI boundary that don't process CUI. Every in-scope system requires all 110 controls, so unnecessary scope multiplies compliance burden and cost with no security benefit.
Under-scoping: Excluding systems from the CUI boundary that do touch CUI — through data flows, API connections, or logging/monitoring connections that weren't identified in the scoping analysis.
Static boundaries: Defining a CUI boundary at a point in time and not updating it as the environment evolves. New services, new data flows, and new integrations can expand the actual CUI boundary beyond the documented boundary.
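All three scoping errors can be framed as graph problems: the actual CUI boundary is the set of systems reachable from a CUI source through any data flow. A sketch of that reachability computation, assuming the data flows have been inventoried into a simple adjacency map:

```python
from collections import deque

def cui_boundary(flows: dict[str, set[str]], cui_sources: set[str]) -> set[str]:
    """Every system reachable from a CUI source via a data flow is in scope."""
    boundary, queue = set(cui_sources), deque(cui_sources)
    while queue:
        system = queue.popleft()
        for downstream in flows.get(system, set()):
            if downstream not in boundary:
                boundary.add(downstream)
                queue.append(downstream)
    return boundary

flows = {
    "erp": {"datalake"},       # ERP exports CUI to the data lake
    "datalake": {"bi"},        # BI tool reads the data lake
    "hr": {"payroll"},         # HR flows carry no CUI
}
print(cui_boundary(flows, cui_sources={"erp"}))
```

Comparing this computed boundary against the documented one surfaces both errors at once: documented systems not in the computed set are over-scoped; computed systems not documented are under-scoped. Re-running it as flows change addresses the static-boundary problem.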
The Cost of Over-Scoping
Analysis of defense contractor cloud environments suggests that 62% have unnecessary systems in their CUI boundary that could be excluded with proper scoping analysis. The compliance burden impact is significant:
- Every additional system in scope requires ongoing compliance maintenance
- Assessment scope directly determines assessment cost and duration
- Post-certification compliance maintenance cost scales with scope
A contractor who spends 6-8 weeks on rigorous scoping analysis before assessment can significantly reduce both assessment cost and ongoing compliance burden. This investment pays dividends throughout the compliance program lifecycle.
Section 5: What Distinguishes Successful Contractors
Patterns in Assessment Success
Analysis of CMMC assessment outcomes identifies several patterns that distinguish contractors who pass assessment on first attempt from those who require follow-up assessment or fail:
Pattern 1: Technical Assessment Before C3PAO Engagement
Contractors who conduct rigorous technical pre-assessment — examining actual cloud configurations, not just documentation — before engaging their C3PAO consistently outperform those who rely on documentation reviews alone.
The technical pre-assessment identifies the gap between documented posture and actual posture, allowing time for remediation before the formal assessment.
Pattern 2: Automated Continuous Monitoring
Contractors with automated continuous monitoring tools — platforms that evaluate cloud configurations against NIST 800-171 mappings in real time — enter assessments with significantly lower finding counts than those relying on periodic manual reviews.
More importantly, they maintain compliance between assessments. Their compliance posture at reassessment time is similar to their posture at initial certification, rather than reflecting 3 years of drift.
Pattern 3: Cloud-Native Compliance Architecture
Contractors who build compliance into their cloud architecture — using SCPs, Azure Policy initiatives, and GCP Organization Policies to enforce security baselines at the infrastructure level — achieve better outcomes than those who apply compliance controls as an overlay to existing architecture.
Infrastructure-level controls are harder to misconfigure around, more resistant to drift, and easier to demonstrate to assessors.
Pattern 4: Accurate Scoping
Contractors who invest in rigorous CUI boundary analysis before assessment consistently scope more accurately, reducing assessment burden without creating under-scoping risks.
Section 6: The Evidence Collection Challenge
Manual Evidence Collection Is the Hidden Compliance Cost
The labor cost of evidence collection for a CMMC Level 2 assessment is substantial but frequently underestimated. Organizations that rely on manual evidence collection report:
- 200-400 hours of staff time for initial evidence collection
- 40-80 hours per quarterly evidence review
- 100-150 hours of evidence organization and auditor coordination immediately before assessment
At a fully-loaded labor rate of $150/hour (conservative for cleared security professionals), this represents $30,000-$75,000 in labor cost per assessment cycle — before any remediation work.
Continuous Evidence Collection: The Alternative
Organizations with automated continuous evidence collection spend:
- 0 hours on initial evidence collection (generated automatically)
- 4-8 hours per quarterly evidence review (reviewing, not collecting)
- 8-16 hours for auditor coordination before assessment
Labor savings: 80-95% of manual evidence collection effort.
More importantly, automatically collected evidence is higher quality than manually collected evidence. It's timestamped precisely, covers all controls consistently, and demonstrates continuous compliance rather than point-in-time states.
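Those quality properties — precise timestamps and tamper-evidence — fall out naturally when evidence is generated as structured records. A minimal sketch (the record fields are illustrative, not any particular platform's schema):

```python
import hashlib
import json
from datetime import datetime, timezone

def evidence_record(control_id: str, resource: str, observed_state: dict) -> dict:
    """Produce a timestamped, hash-sealed evidence record for one control check."""
    payload = json.dumps(observed_state, sort_keys=True)
    return {
        "control": control_id,
        "resource": resource,
        "collected_at": datetime.now(timezone.utc).isoformat(),  # precise UTC timestamp
        "observed_state": observed_state,
        "sha256": hashlib.sha256(payload.encode()).hexdigest(),  # integrity seal
    }

record = evidence_record("3.3.1", "trail/org-trail", {"multi_region": True})
print(record["control"], record["sha256"][:12])
```

A stream of such records, emitted on every evaluation, is precisely the "continuous compliance rather than point-in-time" evidence described above.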
Recommendations
For Defense Contractors in CMMC Preparation
- Invest in rigorous technical scoping — Analyze actual CUI flows, not assumed flows. The investment pays dividends throughout the compliance lifecycle.
- Deploy automated cloud configuration monitoring — Manual configuration reviews find point-in-time gaps. Automated monitoring finds continuous drift.
- Implement continuous evidence collection — Build the evidence infrastructure before you need it. Evidence collection should be a background process, not an assessment sprint.
- Conduct technical pre-assessment — Have your actual cloud configurations evaluated against NIST 800-171 requirements before engaging a C3PAO. Surprises at assessment time are expensive.
- Plan for continuous monitoring post-certification — Your certification is triennial; your compliance obligation is continuous. Budget operational resources accordingly.
For the DIB Supply Chain
- Establish flow-down compliance requirements — Prime contractors should verify subcontractor CMMC compliance status and support subcontractor compliance program development.
- Share threat intelligence — The DIB faces shared adversaries. Industry information sharing through platforms like DIBNet and the Cyber Collaborative improves collective defense posture.
Conclusion
The CMMC enforcement era is not a future event — it's the current operating environment. Defense contractors who treat CMMC as a one-time certification exercise will face accelerating compliance costs as their posture drifts between assessment cycles.
The patterns are clear: contractors who invest in continuous monitoring, automated remediation, and rigorous technical scoping achieve better assessment outcomes, maintain compliance more efficiently, and build more durable security programs.
The technology exists to make continuous compliance an operational reality rather than an aspirational goal. The question for each contractor is how quickly they adopt it.
This report is produced by PolicyCortex based on analysis of cloud environment assessments and compliance data. Specific statistics reflect patterns observed across evaluated environments and may not represent the full DIB population. For methodology questions, contact [email protected].
About the Author
PolicyCortex Team
PolicyCortex was founded by a cleared technologist with active federal security clearances who has worked across the Defense Industrial Base, national laboratories (Los Alamos National Laboratory), and federal research organizations (MITRE). This first-hand experience with the security, compliance, and governance challenges facing regulated industries drives every design decision in the platform.
See the platform in action
PolicyCortex automates the governance, compliance, and remediation work that keeps defense contractors audit-ready around the clock.