CMMC Level 2 Readiness Scorecard
A scorecard across all 14 NIST SP 800-171 control families (110 controls in total). For each family you get a selection of example controls, the evidence a CMMC C3PAO assessor expects, and the common interpretation failures that send organizations back for a second attempt. Use the scorecard as a working document during gap assessment and as briefing material during the final readiness review.
What CMMC Level 2 certification actually means
CMMC Level 2 is the Department of Defense assessment regime for organizations handling Controlled Unclassified Information (CUI) in the defense industrial base. The technical control set is identical to NIST SP 800-171, which has been a contractual requirement under DFARS 252.204-7012 since 2017. What CMMC adds is third-party verification. Level 2 introduces assessments by a Certified Third-Party Assessment Organization (C3PAO) accredited by the Cyber-AB (the CMMC Accreditation Body).
The policy framework for CMMC lives on the DoD CIO CMMC page. The details shift over time, but the practical implication has been stable for several years: if your contracts reference DFARS 252.204-7012 and you handle CUI, you need a System Security Plan, a POA&M, and evidence that all 110 controls are either implemented or have an approved compensating control.
The strategic decision most contractors need to make early is whether the enclave that handles CUI will be the whole enterprise or a scoped subset. A scoped enclave (a dedicated tenant, a separate environment, or a dedicated network zone) dramatically reduces the number of endpoints, users, and systems in assessment scope. The trade-off is operational friction when work flows between the enclave and the broader environment. Deciding this in month one saves months of rework later.
Level 2 certification is contract-driven. If you do not have a contract that requires Level 2, you do not need to pursue certification. If you do, the assessment is a gating item for award (or renewal) and the timeline is non-negotiable.
How the 110 controls are structured
The 110 controls in NIST SP 800-171 are organized into 14 control families. Access Control (AC) is the largest at 22 controls and System and Communications Protection (SC) is the second largest at 16. At the other end, Personnel Security (PS) and Awareness and Training (AT) are small but appear disproportionately in findings because they are often treated as policy exercises rather than operational ones.
Each control in the list reads as a requirement statement. The assessor does not grade you on how the statement was written; they grade you on whether you have implemented it. The evidence you produce is what connects the requirement to reality. The most common shortfall is evidence that documents intent but not operation: a policy that says MFA is required, without the identity provider export that shows MFA is actually enrolled for every user in scope.
NIST SP 800-171A is the companion assessment guide that tells assessors which objectives to evaluate for each control. Any serious readiness preparation reads 800-171A alongside 800-171. The scorecard below summarizes our operating interpretation; it is not a substitute for either document.
The 14 control families: a readiness scorecard
Each card lists the family, the control count, a handful of example controls, what evidence a C3PAO typically wants, and the failure modes we see most often on readiness reviews. Use these as anchors during your self-assessment and as briefing material when you walk leadership through the remaining gaps.
Access Control (AC)
22 controls
Example controls
- Limit system access to authorized users, processes, and devices (3.1.1).
- Separate the duties of individuals to reduce risk of malevolent activity (3.1.4).
- Employ the principle of least privilege (3.1.5).
- Control remote access sessions with authorized cryptographic mechanisms (3.1.13).
Evidence expectations
Access request and approval records, user access review attestations, PAM session logs, MFA enrollment data, remote access policy, and VPN or ZTNA configuration export.
Common failure modes
Privileged accounts used for routine work. Remote access over weak cryptography. User access reviews skipped across one or more quarters of the assessment window.
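Where evidence automation helps, it is in checks like the quarterly access review. Below is a minimal sketch, assuming reviews are exported as a CSV with user and last_review columns (both names hypothetical), that flags anyone whose review has lapsed:

```python
import csv
from datetime import datetime, timedelta

REVIEW_WINDOW = timedelta(days=90)  # quarterly review cadence

def stale_reviews(path: str, as_of: datetime) -> list[str]:
    """Users whose last completed access review falls outside the window."""
    stale = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # Column names are assumptions about the export format.
            last = datetime.fromisoformat(row["last_review"])
            if as_of - last > REVIEW_WINDOW:
                stale.append(row["user"])
    return stale

if __name__ == "__main__":
    for user in stale_reviews("access_reviews.csv", datetime.now()):
        print(f"REVIEW OVERDUE: {user}")
```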
Awareness and Training (AT)
3 controls
Example controls
- Ensure managers and users are aware of the security risks associated with their activities (3.2.1).
- Train personnel to carry out their assigned security-related duties (3.2.2).
- Provide security awareness training on recognizing and reporting potential indicators of insider threat (3.2.3).
Evidence expectations
Training platform completion reports, role-based curriculum, insider threat module, HR records showing new-hire completion within 30 days.
Common failure modes
Generic annual training that does not cover CUI handling. No separate insider threat module. Contractors excluded from training.
Audit and Accountability (AU)
9 controls
Example controls
- Create and retain system audit logs and records (3.3.1).
- Ensure actions of individual users can be uniquely traced (3.3.2).
- Review and update logged events (3.3.3).
- Alert in the event of an audit logging process failure (3.3.4).
Evidence expectations
SIEM architecture diagram, retention policy, sample extracts, alerting rules for log ingestion failure, documented log review cadence.
Common failure modes
Logging pipeline drops events under load without alerting. Shared admin accounts that defeat individual attribution. No documented review cadence.
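The ingestion-failure alerting that 3.3.4 asks for can be approximated with a simple watchdog. A sketch under the assumption that the SIEM can return (source, timestamp) pairs for recent events; the source names are placeholders:

```python
from datetime import datetime, timedelta

# Known log sources that must always be heard from (names hypothetical).
EXPECTED_SOURCES = {"dc01", "fw-edge", "vpn-gw", "idp"}
SILENCE_THRESHOLD = timedelta(minutes=30)

def silent_sources(recent_events, now: datetime) -> set[str]:
    """Sources with no event inside the threshold -- candidates for an alert."""
    last_seen: dict[str, datetime] = {}
    for source, ts in recent_events:
        if source not in last_seen or ts > last_seen[source]:
            last_seen[source] = ts
    return {
        s for s in EXPECTED_SOURCES
        if s not in last_seen or now - last_seen[s] > SILENCE_THRESHOLD
    }
```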
Configuration Management (CM)
9 controls
Example controls
- Establish and maintain baseline configurations (3.4.1).
- Establish and enforce security configuration settings (3.4.2).
- Track, review, approve or disapprove, and log changes (3.4.3).
- Restrict, disable, or prevent the use of nonessential programs (3.4.7).
Evidence expectations
Baseline configuration documents, CIS benchmark scan output, change management ticket sample, software inventory with allow-list policy.
Common failure modes
Baselines exist but are not enforced (drift detection absent). Change tickets missing approver identity. Software allow-lists untested.
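Drift detection is the piece most often missing. One minimal approach, assuming baselines are stored as flat JSON key/value files (a simplification of real benchmark formats):

```python
import json

def config_drift(baseline_path: str, current: dict) -> dict:
    """Settings where a host diverges from the approved baseline (3.4.1/3.4.2)."""
    with open(baseline_path) as f:
        baseline = json.load(f)
    return {
        key: {"expected": expected, "actual": current.get(key)}
        for key, expected in baseline.items()
        if current.get(key) != expected
    }

# Example (collect_settings is a hypothetical collector for the host):
# drift = config_drift("baseline_win11.json", collect_settings(host))
```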
Identification and Authentication (IA)
11 controls
Example controls
- Identify system users, processes, and devices (3.5.1).
- Authenticate identities as a prerequisite to access (3.5.2).
- Use multifactor authentication for local and network access to privileged and non-privileged accounts (3.5.3).
- Employ replay-resistant authentication mechanisms (3.5.4).
Evidence expectations
Identity provider export showing MFA enrollment per user, privileged account list with MFA status, password policy or SSO standard.
Common failure modes
MFA enforced on the IdP but bypassable through legacy protocols. Service accounts without rotation. Shared accounts for shared mailboxes or kiosks.
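The enrollment-gap check itself is a set difference, though a passing result says nothing about legacy-protocol bypass, which must be verified at the IdP separately. A sketch with hypothetical usernames:

```python
def mfa_gaps(in_scope_users: set[str], mfa_enrolled: set[str]) -> set[str]:
    """Users who can authenticate without a second factor (3.5.3)."""
    return in_scope_users - mfa_enrolled

# Inputs are assumed to already be plain sets of usernames;
# real IdP exports need parsing first.
for user in sorted(mfa_gaps({"alice", "bob", "svc-backup"}, {"alice"})):
    print(f"NO MFA: {user}")
```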
Incident Response (IR)
3 controls
Example controls
- Establish an operational incident-handling capability (3.6.1).
- Track, document, and report incidents to designated officials (3.6.2).
- Test the organizational incident response capability (3.6.3).
Evidence expectations
Incident response plan, annual tabletop exercise report, incident ticket sample, escalation matrix, notification records to the DoD where required.
Common failure modes
Plan exists but no tabletop in the assessment window. No documented DoD notification path. Tickets do not capture lessons learned.
Maintenance (MA)
6 controls
Example controls
- Perform maintenance on organizational systems (3.7.1).
- Provide controls on the tools, techniques, mechanisms, and personnel used (3.7.2).
- Sanitize equipment removed for off-site maintenance (3.7.3).
- Supervise maintenance activities of maintenance personnel without required access authorization (3.7.6).
Evidence expectations
Maintenance records, vendor access approvals, sanitization logs for returned hardware, remote maintenance session records.
Common failure modes
Vendor laptops connected to maintenance ports without sanitization logs. Remote maintenance sessions not recorded. Sanitization policy exists but not followed.
Media Protection (MP)
9 controls
Example controls
- Protect system media containing CUI (3.8.1).
- Limit access to CUI on system media to authorized users (3.8.2).
- Sanitize or destroy system media before disposal or reuse (3.8.3).
- Control the use of removable media (3.8.7).
Evidence expectations
Media inventory, destruction certificates, removable media policy, USB control settings on endpoints, encrypted drive enforcement.
Common failure modes
USB ports allowed by default. Destruction certificates missing serial numbers. Encryption policy not enforced at the endpoint level.
Personnel Security (PS)
2 controls
Example controls
- Screen individuals prior to authorizing access to systems containing CUI (3.9.1).
- Ensure systems are protected during and after personnel actions (terminations, transfers) (3.9.2).
Evidence expectations
Background check policy, HR termination procedure with same-day access removal, transfer records showing access adjusted.
Common failure modes
Termination access removal delayed beyond 24 hours. Transfers retain prior access. Background check gaps for contractors.
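The 24-hour termination window is easy to verify when both HR and the directory export timestamps. A sketch assuming records with hr_terminated and account_disabled ISO-format fields (both hypothetical names):

```python
from datetime import datetime

MAX_LAG_HOURS = 24

def late_terminations(records: list[dict]) -> list[dict]:
    """Terminations where account disablement lagged HR action (3.9.2)."""
    late = []
    for r in records:
        terminated = datetime.fromisoformat(r["hr_terminated"])
        disabled = datetime.fromisoformat(r["account_disabled"])
        lag = (disabled - terminated).total_seconds() / 3600
        if lag > MAX_LAG_HOURS:
            late.append({**r, "lag_hours": round(lag, 1)})
    return late
```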
Physical Protection (PE)
6 controls
Example controls
- Limit physical access to systems and equipment to authorized individuals (3.10.1).
- Protect and monitor the physical facility (3.10.2).
- Escort visitors and monitor visitor activity (3.10.3).
- Maintain audit logs of physical access (3.10.4).
Evidence expectations
Badge system exports, visitor logs, camera retention policy, facility assessment, alternate work site policy for remote employees handling CUI.
Common failure modes
Visitor logs paper-only with gaps. Tailgating not addressed. Remote worker locations not assessed for physical security.
Risk Assessment (RA)
3 controls
Example controls
- Assess risk to organizational operations (3.11.1).
- Scan for vulnerabilities in systems and applications (3.11.2).
- Remediate vulnerabilities in accordance with risk assessments (3.11.3).
Evidence expectations
Annual risk assessment, vulnerability scan reports (monthly at minimum), remediation SLAs by severity, exception log with compensating controls.
Common failure modes
Scans run but remediation SLAs untracked. Exceptions approved indefinitely without periodic review. No explicit risk tolerance.
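SLA tracking is where most RA programs fall down, and it is straightforward to automate. A sketch with assumed SLA values; substitute the numbers your risk assessment actually commits to:

```python
from datetime import datetime

# Days allowed per severity -- assumed values, not a regulatory mandate.
SLA_DAYS = {"critical": 15, "high": 30, "medium": 90, "low": 180}

def overdue(findings: list[dict], as_of: datetime) -> list[dict]:
    """Open findings past their severity SLA (3.11.3)."""
    out = []
    for f in findings:
        age = (as_of - datetime.fromisoformat(f["opened"])).days
        if f["status"] == "open" and age > SLA_DAYS[f["severity"]]:
            out.append({**f, "age_days": age})
    return out
```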
Security Assessment (CA)
4 controls
Example controls
- Periodically assess the security controls (3.12.1).
- Develop and implement plans of action (POA&M) to correct deficiencies (3.12.2).
- Monitor security controls on an ongoing basis (3.12.3).
- Develop, document, and update the System Security Plan (SSP) (3.12.4).
Evidence expectations
System Security Plan (the most commonly missing artifact), POA&M with open items tracked to closure, self-assessment schedule, continuous monitoring strategy.
Common failure modes
SSP drafted once and never updated. POA&M items sit open past target dates with no justification. No formal self-assessment cadence.
System and Communications Protection (SC)
16 controls
Example controls
- Monitor, control, and protect communications at external boundaries (3.13.1).
- Employ architectural designs and software development techniques that promote effective information security (3.13.2).
- Separate user functionality from system management functionality (3.13.3).
- Implement cryptographic mechanisms to prevent unauthorized disclosure of CUI during transmission (3.13.8).
Evidence expectations
Network architecture diagram, FIPS 140-validated cryptography inventory, TLS configuration scans, DNS security posture, egress filtering rules.
Common failure modes
Cryptography in use but not FIPS 140 validated. Management plane mixed with user plane. TLS downgrade vectors unaddressed.
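A protocol-floor check is one small piece of the TLS evidence. The sketch below, using Python's standard ssl module, reports the version a host negotiates; it deliberately does not claim to verify FIPS 140 validation, which depends on the module behind the implementation:

```python
import socket
import ssl

def negotiated_tls(host: str, port: int = 443) -> str:
    """Connect and report the negotiated TLS protocol version (3.13.8)."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.version()  # e.g., "TLSv1.2" or "TLSv1.3"

# Example: print(negotiated_tls("example.com"))
```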
System and Information Integrity (SI)
7 controls
Example controls
- Identify, report, and correct system flaws in a timely manner (3.14.1).
- Provide protection from malicious code (3.14.2).
- Monitor system security alerts and advisories (3.14.3).
- Monitor organizational systems, including inbound and outbound communications, to detect attacks (3.14.6).
Evidence expectations
Patch management reports, endpoint protection coverage, SIEM detection rule inventory, threat intelligence subscription records.
Common failure modes
Patch SLAs documented but not met on critical systems. EDR deployed but on a fraction of endpoints. No tuning of detection rules over time.
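EDR coverage gaps reduce to a set difference between the asset inventory and the EDR console's enrolled-host export. A sketch with naive hostname normalization and hypothetical host names:

```python
def _norm(hosts: set[str]) -> set[str]:
    # Deliberately naive normalization; real inventories need more care.
    return {h.strip().lower() for h in hosts}

def edr_gaps(inventory: set[str], edr_enrolled: set[str]) -> set[str]:
    """Hosts in the asset inventory not reporting into the EDR console (3.14.2)."""
    return _norm(inventory) - _norm(edr_enrolled)

missing = edr_gaps({"WS-014", "ws-015", "srv-db01"}, {"ws-014", "srv-db01"})
print(f"{len(missing)} endpoint(s) without EDR: {sorted(missing)}")
```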
Total controls across the 14 families: 110.
Evidence that C3PAO assessors actually want to see
C3PAO assessors evaluate each control against the assessment objectives in 800-171A. For most controls the assessor wants three things: a policy or procedure that documents intent, a sample of the artifact that shows the control operates, and an interview that confirms the person running the control understands it. Missing any one of the three is an open finding. Missing two is an unsatisfactory score.
Evidence should be pre-organized by control identifier. A folder structure keyed to 3.1.1, 3.1.2, and so on (matching the 800-171 control numbering) makes the assessor's job dramatically easier and reduces the chance of back-and-forth during fieldwork. Include a System Security Plan that explicitly maps each control to the implementing system, the responsible person, and the artifact location. A well-structured SSP is the single most impactful document in the entire engagement.
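Generating that skeleton is a one-time script. A sketch that creates one directory per control identifier, using the per-family control counts from 800-171 rev 2; the root folder name is an arbitrary choice:

```python
from pathlib import Path

# Family chapter number -> control count (3.1 AC through 3.14 SI, 110 total).
FAMILIES = {
    1: 22, 2: 3, 3: 9, 4: 9, 5: 11, 6: 3, 7: 6,
    8: 9, 9: 2, 10: 6, 11: 3, 12: 4, 13: 16, 14: 7,
}

def build_skeleton(root: str = "evidence") -> None:
    """Create one evidence folder per control identifier."""
    for family, count in FAMILIES.items():
        for n in range(1, count + 1):
            Path(root, f"3.{family}.{n}").mkdir(parents=True, exist_ok=True)

if __name__ == "__main__":
    build_skeleton()  # creates evidence/3.1.1 ... evidence/3.14.7
```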
Use our manufacturing CMMC Level 2 case study as a reference for how evidence packages look when they are production-ready. The case captures the System Security Plan structure, the POA&M approach, and the enclave scoping decisions that a mid-sized manufacturer made during preparation.
Common interpretation gaps that fail assessments
The most frequent cause of an unsatisfactory score is a gap between stated practice and operational practice. The policy says CUI is encrypted at rest with FIPS 140-validated modules. The implementation uses TLS 1.2 in transit and native disk encryption at rest that has not been configured to use a FIPS 140-validated module. Both encrypt data, but neither satisfies 3.13.11, which requires FIPS-validated cryptography wherever cryptography is used to protect the confidentiality of CUI. Assessors read the requirement language precisely.
Another recurring gap is boundary definition. If the assessor cannot identify the boundary of the system that handles CUI (which networks, which endpoints, which cloud tenants), every control becomes harder to evaluate. A boundary diagram that matches the System Security Plan narrative is worth more than any number of additional policies. Work this out early and keep it current.
A third gap is separating administrator functions from user functions. Many Level 2 environments run management consoles, monitoring tools, and production workloads in the same network zone, accessed by the same accounts. 3.13.3 requires separation. Achieving it often means a dedicated management network, a privileged access workstation program, or both. See zero trust architecture for the identity and network decisions that address this cleanly.
The 90-day path to certification
A 90-day path is realistic only when the starting position includes an already-operating SP 800-171 program with a self-assessment score in the 100 range or above, and a System Security Plan that matches reality. The first 30 days close outstanding POA&M items, tighten evidence organization, and run a mock assessment. The next 30 days address findings from the mock. The final 30 days are the formal C3PAO assessment and report issuance.
A more common starting position is a self-score in the 80s or below, in which case the path is six to nine months, not ninety days. Leadership that promises a 90-day path from a cold start is setting up the team for failure and the contract for delay.
For sector-specific guidance see our CMMC compliance for manufacturers and the manufacturing industry page.
What the C3PAO assessment looks like
A C3PAO Level 2 assessment spans several days, sometimes longer depending on scope. The assessors review documentation, interview personnel, observe controls in operation, and sample artifacts. Each control is scored MET, NOT MET, or NOT APPLICABLE. A finding can sometimes be closed in-engagement if the gap is small and the artifact can be produced on the spot. A finding that cannot be closed in-engagement must be remediated and reassessed, which costs time and money.
Before the assessors arrive, run a full readiness review with an independent assessor who is not your C3PAO. Treat the review as adversarial: produce evidence in the exact order and format the assessment will use, and time the response. The goal is to identify every gap in advance of the billable engagement.
The assessment culminates in a report that becomes part of the DoD Supplier Performance Risk System record. This is the artifact contracting officers rely on to verify compliance, and it is visible for three years.
Ongoing attestation and annual self-assessment
Level 2 certification is valid for three years, but the obligations do not pause. Annually, a senior official attests to continued compliance. The POA&M continues to track open items. Any material change in scope, architecture, or personnel needs to be reflected in the System Security Plan and communicated to the contracting officer as appropriate. Treating certification as a point-in-time event is how organizations fail their next assessment cycle.
Maintain a continuous monitoring cadence. Monthly vulnerability scans, quarterly access reviews, annual penetration tests, annual tabletop exercises, and an annual risk assessment should all run on a schedule. The artifacts produced by those activities become the evidence for the next cycle. Organizations that do this have an easier renewal than organizations that let the program go dormant until the next assessment approaches.
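A freshness check over those artifacts keeps the cadence honest. A sketch that flags any recurring activity whose latest artifact is older than its interval; activity names, intervals, and the input shape mirror the cadence above but are still assumptions:

```python
from datetime import date

# Interval in days for each recurring activity (assumed schedule).
CADENCE_DAYS = {
    "vulnerability_scan": 30,
    "access_review": 90,
    "penetration_test": 365,
    "tabletop_exercise": 365,
    "risk_assessment": 365,
}

def overdue_activities(last_run: dict[str, date], today: date) -> list[str]:
    """Activities whose most recent artifact is older than its cadence."""
    return [
        name for name, days in CADENCE_DAYS.items()
        if (today - last_run.get(name, date.min)).days > days
    ]
```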
How CMMC maps to NIST SP 800-171 and NIST SP 800-172
Level 2 is effectively a third-party assessment of NIST SP 800-171. Level 3 introduces enhanced requirements from NIST SP 800-172 that target advanced persistent threat activity. Level 3 applies to a narrower set of contractors handling the most sensitive CUI and carries a government-led assessment process.
For the majority of defense industrial base suppliers, Level 2 is the relevant target. Do not scope to Level 3 unless your contracts require it, and do not dismiss Level 1 if your handling is limited to Federal Contract Information (FCI) rather than CUI. The level flows from the data, and the data flows from the contract.
If the scorecard has surfaced gaps that need to close before a C3PAO assessment makes sense, the free assessment is where to start the conversation.