williamzujkowski

# NIST SP 800-53 Compliance Validator

```shell
# Install this skill:
npx skills add williamzujkowski/cognitive-toolworks --skill "NIST SP 800-53 Compliance Validator"
```

Installs this specific skill from the multi-skill repository.

# Description

Validate NIST SP 800-53 control implementation with evidence mapping, gap analysis, automated testing, and compliance reporting across 20 control families.

# SKILL.md


```yaml
name: NIST SP 800-53 Compliance Validator
slug: compliance-nist-validator
description: Validate NIST SP 800-53 control implementation with evidence mapping, gap analysis, automated testing, and compliance reporting across 20 control families.
capabilities:
  - NIST SP 800-53 Rev 5 control validation (1,189 controls across 20 families)
  - Evidence collection and mapping to controls
  - Gap analysis with missing/partial control identification
  - Automated compliance testing (60-80% effort reduction)
  - Control family assessment (AC, SI, CM, IA, AU, etc.)
  - OSCAL format integration (SSP, SAP, SAR, POA&M)
  - Continuous monitoring configuration
  - Remediation recommendations with prioritization
inputs:
  - NIST baseline (LOW, MODERATE, HIGH, or custom control set)
  - System documentation (architecture, data flows, configurations)
  - Existing controls implementation (technical, operational, management)
  - Evidence sources (logs, configs, policies, procedures, test results)
  - Compliance scope (control families, specific controls, or full catalog)
  - Assessment type (initial, periodic, continuous monitoring)
outputs:
  - Control validation results (implemented, partial, not-implemented)
  - Evidence mapping matrix (controls → evidence artifacts)
  - Gap analysis report with missing controls and recommendations
  - Compliance score by control family and overall
  - OSCAL-formatted SAR (Security Assessment Report)
  - Prioritized remediation plan with effort estimates
  - Continuous monitoring configuration
keywords:
  - nist
  - sp-800-53
  - compliance
  - security-controls
  - privacy-controls
  - evidence
  - gap-analysis
  - oscal
  - automation
  - assessment
  - nist-800-53a
version: 1.0.0
owner: cognitive-toolworks
license: MIT
security: public
links:
  - title: "NIST SP 800-53 Rev 5.2.0 (August 2025)"
    url: "https://csrc.nist.gov/pubs/sp/800/53/r5/upd1/final"
    accessed: "2025-10-26T18:38:09-0400"
  - title: "NIST SP 800-53A Rev 5.2.0 Assessment Procedures"
    url: "https://csrc.nist.gov/pubs/sp/800/53/a/r5/final"
    accessed: "2025-10-26T18:38:09-0400"
  - title: "NIST SP 800-53 Release 5.2.0 Announcement (August 27, 2025)"
    url: "https://csrc.nist.gov/News/2025/nist-releases-revision-to-sp-800-53-controls"
    accessed: "2025-10-26T18:38:09-0400"
  - title: "Automated Security Testing for NIST 800-53"
    url: "https://satinetech.com/2025/05/28/automated-security-testing-for-nist-800-53-controls-a-practical-guide/"
    accessed: "2025-10-26T18:38:09-0400"
```


## Purpose & When-To-Use

Invoke this skill when validating NIST SP 800-53 control implementation, conducting compliance assessments, identifying gaps, or generating evidence-based compliance reports for federal systems, contractors, or organizations adopting NIST frameworks.

Trigger Conditions:
- "Validate NIST SP 800-53 compliance for [system]"
- "Generate gap analysis for MODERATE baseline"
- "Map evidence to NIST controls for [control family]"
- "Assess [control family] implementation (e.g., AC, SI, CM)"
- "Create OSCAL SAR for NIST compliance assessment"
- "Automate NIST 800-53 control testing"
- "Continuous monitoring for NIST controls"

Out of Scope:
- FedRAMP-specific requirements (use compliance-fedramp-validator)
- OSCAL format conversion without validation (use compliance-oscal-validator)
- General compliance automation (use compliance-automation-engine)


## Pre-Checks

1. Time Normalization: Compute NOW_ET using NIST/time.gov semantics (America/New_York, ISO-8601).
2. Input Validation:
   - NIST baseline specified (LOW, MODERATE, HIGH, or custom)
   - System boundaries defined (scope of assessment)
   - Control families or specific controls identified
3. Documentation Availability: Verify access to system documentation, configurations, evidence artifacts.
4. OSCAL Integration: Check whether an OSCAL SSP (System Security Plan) exists for evidence mapping.
5. Assessment Type: Clarify whether this is an initial assessment, periodic re-assessment, or continuous monitoring.

Abort Conditions:
- No baseline or control scope specified → emit TODO list with required inputs.
- Zero documentation or evidence → warn that validation will be minimal.
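The abort conditions can be expressed as a small pre-check gate. The sketch below is illustrative only; the function and field names are assumptions, not part of the skill's actual interface:

```python
def precheck(request):
    """Gate an assessment request per the abort conditions above.

    request: dict of assessment inputs (key names are assumptions).
    """
    missing = [k for k in ("baseline", "scope") if not request.get(k)]
    if missing:
        # No baseline or control scope -> emit TODO list with required inputs
        return {"abort": True, "todo": [f"Provide {k}" for k in missing]}
    warnings = []
    if not request.get("evidence_sources"):
        # Zero documentation or evidence -> warn that validation will be minimal
        warnings.append("No evidence sources: validation will be minimal")
    return {"abort": False, "warnings": warnings}

result = precheck({"baseline": "MODERATE", "scope": ["AC", "SI"]})
# result["abort"] is False, with a minimal-validation warning attached
```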


## Procedure

### T1: Quick Control Gap Analysis (≤2k tokens)

Use Case: Fast path for common scenarios (80% of requests).

Steps:
1. Identify NIST Baseline: Select applicable baseline controls.

   | Baseline | Control Count | Use Case |
   |----------|---------------|----------|
   | LOW | 125 controls | Low-impact systems (public info) |
   | MODERATE | 325 controls | Moderate-impact (most federal systems) |
   | HIGH | 421 controls | High-impact (national security, critical infrastructure) |

2. Sample 5 High-Priority Controls (T1 Fast Check):

   | Control ID | Control Name | Family | Validation Check |
   |------------|--------------|--------|------------------|
   | AC-2 | Account Management | Access Control | User account policies exist, periodic review evidence |
   | SI-2 | Flaw Remediation | System Integrity | Patch management process, vulnerability scan results |
   | CM-2 | Baseline Configuration | Configuration Mgmt | Configuration baselines documented, change control |
   | IA-2 | Identification & Auth | Identification/Auth | MFA enabled, authentication logs |
   | AU-2 | Event Logging | Audit & Accountability | Audit policy configured, log retention evidence |

3. Quick Gap Score:
   - Implemented: Control fully operational with evidence (Green).
   - Partial: Control partially implemented, missing evidence or incomplete (Yellow).
   - Not Implemented: Control missing or no evidence (Red).
4. Top 3 Gaps: Identify highest-risk missing controls with immediate remediation actions.

Output: Baseline mapping, 5-control quick check, gap score, top 3 priorities.
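The quick gap score can be computed as a simple weighted tally over the sampled controls. This is a minimal sketch; the function name and status weights are assumptions for illustration:

```python
def quick_gap_score(statuses):
    """Summarize T1 quick-check results.

    statuses: dict mapping control ID -> one of
    "Implemented", "Partial", "Not Implemented".
    Returns (score 0-100, list of controls with gaps).
    """
    weights = {"Implemented": 1.0, "Partial": 0.5, "Not Implemented": 0.0}
    score = 100 * sum(weights[s] for s in statuses.values()) / len(statuses)
    gaps = [c for c, s in statuses.items() if s != "Implemented"]
    return round(score, 1), gaps

score, gaps = quick_gap_score({
    "AC-2": "Implemented",
    "SI-2": "Not Implemented",
    "CM-2": "Partial",
    "IA-2": "Implemented",
    "AU-2": "Partial",
})
# score == 60.0; gaps == ["SI-2", "CM-2", "AU-2"]
```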


### T2: Comprehensive Control Family Assessment (≤6k tokens)

Use Case: Full control family validation for production assessments.

Steps:

1. NIST SP 800-53 Rev 5.2.0 (20 Control Families)

Complete control families:

| ID | Family Name | Base Controls | Focus Area |
|----|-------------|---------------|------------|
| AC | Access Control | 25 | User access, permissions, least privilege |
| AT | Awareness and Training | 6 | Security training, role-based awareness |
| AU | Audit and Accountability | 16 | Event logging, audit review, log protection |
| CA | Assessment, Authorization, Monitoring | 9 | Security assessments, continuous monitoring, authorization |
| CM | Configuration Management | 14 | Baseline configs, change control, inventory |
| CP | Contingency Planning | 13 | Backup, disaster recovery, alternate processing |
| IA | Identification and Authentication | 12 | MFA, credential management, authenticator management |
| IR | Incident Response | 10 | Incident handling, reporting, testing |
| MA | Maintenance | 6 | System maintenance, remote maintenance, tools |
| MP | Media Protection | 8 | Media sanitization, storage, transport |
| PE | Physical and Environmental Protection | 20 | Physical access, visitor control, environmental controls |
| PL | Planning | 11 | Security planning, architecture, privacy |
| PM | Program Management | 31 | Risk management strategy, governance |
| PS | Personnel Security | 9 | Position categorization, termination, sanctions |
| PT | PII Processing and Transparency | 8 | Privacy controls, consent, data minimization |
| RA | Risk Assessment | 10 | Risk assessments, vulnerability scanning |
| SA | System and Services Acquisition | 23 | SDLC security, supply chain, developer testing |
| SC | System and Communications Protection | 51 | Network security, crypto, transmission integrity |
| SI | System and Information Integrity | 23 | Flaw remediation, malware protection, monitoring |
| SR | Supply Chain Risk Management | 12 | Supply chain risk, supplier assessments, provenance |

Total: 1,189 controls across 20 families, counting base controls and their enhancements (NIST SP 800-53 Rev 5.2.0, accessed 2025-10-26T18:38:09-0400).

2. Control Validation Methodology (NIST SP 800-53A)

Assessment Methods (per NIST SP 800-53A):

| Method | Description | Evidence Examples |
|--------|-------------|-------------------|
| Examine | Review documentation, policies, procedures | Policies, plans, procedures, configs, logs |
| Interview | Discuss with responsible personnel | Interview notes, attestations, walkthroughs |
| Test | Execute technical validation, scanning, testing | Scan results, penetration tests, automated checks |

Control Assessment Example (AC-2: Account Management):

```yaml
control_id: AC-2
control_name: Account Management
family: AC (Access Control)
baseline: MODERATE

assessment_objectives:
  - AC-2a: Organization manages system accounts (types, establish, activate, modify, review, disable, remove)
  - AC-2(1): Automated account management (enhancements)
  - AC-2(2): Automated account removal/disabling

assessment_methods:
  examine:
    - Account management policy and procedures
    - Configuration baselines for account creation
    - Account review logs (periodic review evidence)
  interview:
    - System administrators on account lifecycle
    - HR on termination processes
  test:
    - "Automated scan: verify inactive accounts disabled after 90 days"
    - "Test account creation workflow: verify approval required"
    - Verify MFA enforcement for privileged accounts

evidence_artifacts:
  - account_policy.pdf (policy document)
  - user_access_review_2025-Q3.xlsx (quarterly review)
  - qualys_scan_2025-10-15.pdf (automated compliance scan)
  - iam_config_baseline.json (baseline configuration)

validation_result: Implemented (Green)
findings: All assessment objectives met with documented evidence
compliance_score: 100%
```

3. Evidence Mapping Matrix

Evidence Collection (60-70% of compliance effort - automate for efficiency):

| Control Family | Evidence Type | Automated Collection | Manual Collection |
|----------------|---------------|----------------------|-------------------|
| AC (Access Control) | IAM configs, user lists, MFA logs | ✅ Config exports, SIEM logs | Policies, interviews |
| SI (System Integrity) | Patch status, vuln scans, AV logs | ✅ Qualys/Nessus scans, SCCM | Patch procedures |
| CM (Config Mgmt) | Config baselines, change logs | ✅ Git commits, Terraform state | Change board minutes |
| AU (Audit/Accountability) | Audit logs, SIEM retention | ✅ Splunk/ELK exports | Log retention policy |
| IA (Identification/Auth) | MFA enrollment, auth logs | ✅ Okta/AD reports | Authentication policy |

Automation Benefits:
- 60-80% effort reduction with automated evidence collection (accessed 2025-10-26T18:38:09-0400).
- Continuous monitoring vs. point-in-time snapshots.
- Real-time compliance dashboards.
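The controls → evidence matrix can be assembled from tagged artifacts. A minimal sketch, assuming artifacts arrive as (name, control IDs) pairs from a tagging convention in the evidence repository (names hypothetical):

```python
from collections import defaultdict

def build_evidence_matrix(artifacts):
    """Map control IDs to the evidence artifacts that support them.

    artifacts: iterable of (artifact_name, [control_ids]) pairs.
    """
    matrix = defaultdict(list)
    for name, controls in artifacts:
        for control in controls:
            matrix[control].append(name)
    return dict(matrix)

matrix = build_evidence_matrix([
    ("qualys_scan_2025-10-15.pdf", ["SI-2", "RA-5"]),
    ("user_access_review_2025-Q3.xlsx", ["AC-2"]),
    ("iam_config_baseline.json", ["AC-2", "CM-2"]),
])
# matrix["AC-2"] == ["user_access_review_2025-Q3.xlsx", "iam_config_baseline.json"]
```

Controls with no entry in the matrix are candidates for the "Evidence Gap" category in the next step.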

4. Gap Analysis & Remediation Planning

Gap Analysis Workflow:

1. For each control in the baseline:
   - Check implementation status: Implemented / Partial / Not Implemented.
   - Verify evidence: Complete / Incomplete / Missing.
   - Calculate control strength: Strong / Weak / None.
2. Gap Categorization:

   | Gap Type | Description | Remediation Effort |
   |----------|-------------|--------------------|
   | Implementation Gap | Control not implemented | High (design, deploy, test) |
   | Evidence Gap | Control implemented, no evidence | Low (document, collect artifacts) |
   | Partial Implementation | Control partially implemented | Medium (complete missing aspects) |
   | Enhancement Gap | Base control OK, enhancements missing | Medium (add enhancements) |

3. Prioritization (Risk-Based):
```python
# Risk score = (Control Criticality × Impact × Likelihood) / Effort
# Criticality: LOW=1, MODERATE=2, HIGH=3
# Impact: 1-5 (data breach, downtime, regulatory)
# Likelihood: 1-5 (exploitability, threat landscape)
# Effort: hours to remediate

def calculate_priority(control):
    criticality = {"LOW": 1, "MODERATE": 2, "HIGH": 3}[control.baseline]
    impact = control.impact_score  # 1-5
    likelihood = control.likelihood_score  # 1-5
    effort = max(control.remediation_hours, 1)  # guard against division by zero

    risk_score = (criticality * impact * likelihood) / effort
    return risk_score

# Sort gaps by risk_score descending → prioritized remediation list
```
4. Remediation Plan Example:

   | Priority | Control | Gap Type | Effort | Risk Score | Deadline |
   |----------|---------|----------|--------|------------|----------|
   | 1 | SI-2 (Flaw Remediation) | Implementation | 40h | 18.0 | 30 days |
   | 2 | IA-2(1) (MFA) | Partial | 20h | 15.0 | 45 days |
   | 3 | CM-2 (Baseline Config) | Evidence | 8h | 12.0 | 60 days |
   | 4 | AC-2(4) (Auto Account Mgmt) | Enhancement | 30h | 10.0 | 90 days |
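Sorting gaps by descending risk score produces the prioritized remediation list. A minimal sketch, with field names assumed for illustration:

```python
def prioritize(gaps):
    """Order remediation items by descending risk score and assign ranks.

    gaps: list of dicts with at least a "risk_score" key
    (e.g. as computed by calculate_priority above).
    """
    ranked = sorted(gaps, key=lambda g: g["risk_score"], reverse=True)
    for priority, gap in enumerate(ranked, start=1):
        gap["priority"] = priority
    return ranked

plan = prioritize([
    {"control": "CM-2", "risk_score": 12.0, "effort_hours": 8},
    {"control": "SI-2", "risk_score": 18.0, "effort_hours": 40},
    {"control": "IA-2(1)", "risk_score": 15.0, "effort_hours": 20},
])
# plan[0] is SI-2 with priority 1, then IA-2(1), then CM-2
```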

5. OSCAL Integration (Interoperability with OSCAL Validator)

OSCAL Artifacts:

  • SSP (System Security Plan): Control implementation descriptions, responsible roles.
  • SAP (Security Assessment Plan): Assessment objectives, methods, schedules.
  • SAR (Security Assessment Report): Results, findings, recommendations.
  • POA&M (Plan of Action & Milestones): Remediation plan with milestones.

Generate OSCAL SAR (Security Assessment Report):

```json
{
  "assessment-results": {
    "uuid": "550e8400-e29b-41d4-a716-446655440000",
    "metadata": {
      "title": "NIST SP 800-53 Assessment Results",
      "published": "2025-10-26T18:38:09-04:00",
      "version": "1.0",
      "oscal-version": "1.1.2"
    },
    "import-ap": {
      "href": "#assessment-plan-uuid"
    },
    "results": [
      {
        "uuid": "result-uuid-1",
        "title": "AC-2 Account Management Assessment",
        "finding": {
          "uuid": "finding-uuid-1",
          "title": "AC-2 Implemented with Evidence",
          "description": "Account management controls fully implemented.",
          "related-observations": [
            {
              "observation-uuid": "obs-uuid-1",
              "type": "examine",
              "description": "Account management policy reviewed and current."
            },
            {
              "observation-uuid": "obs-uuid-2",
              "type": "test",
              "description": "Automated test: Inactive accounts disabled after 90 days."
            }
          ]
        },
        "risk": {
          "status": "satisfied"
        }
      }
    ],
    "assessment-log": {
      "entries": [
        {
          "title": "Assessment Start",
          "start": "2025-10-15T09:00:00-04:00",
          "end": "2025-10-26T17:00:00-04:00"
        }
      ]
    }
  }
}
```
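A SAR document like the one above can be generated programmatically from validation results. The sketch below is a simplified structure for illustration, not a schema-complete OSCAL SAR; the function name and shape of `findings` are assumptions:

```python
import uuid
from datetime import datetime, timezone

def sar_skeleton(title, findings):
    """Build a minimal OSCAL-style assessment-results document.

    findings: list of (control_id, status) pairs, e.g.
    ("AC-2", "satisfied"). Simplified sketch only.
    """
    return {
        "assessment-results": {
            "uuid": str(uuid.uuid4()),
            "metadata": {
                "title": title,
                "published": datetime.now(timezone.utc).isoformat(),
                "version": "1.0",
                "oscal-version": "1.1.2",
            },
            "results": [
                {
                    "uuid": str(uuid.uuid4()),
                    "title": f"{control_id} Assessment",
                    "risk": {"status": status},
                }
                for control_id, status in findings
            ],
        }
    }

doc = sar_skeleton("NIST SP 800-53 Assessment Results",
                   [("AC-2", "satisfied"), ("SI-2", "not-satisfied")])
```

Generated documents should still be validated against the official OSCAL schema before exchange.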

OSCAL POA&M for Gaps:

```json
{
  "plan-of-action-and-milestones": {
    "uuid": "poam-uuid-1",
    "metadata": {
      "title": "NIST SP 800-53 Remediation Plan",
      "published": "2025-10-26T18:38:09-04:00"
    },
    "poam-items": [
      {
        "uuid": "poam-item-1",
        "title": "Implement SI-2 Flaw Remediation",
        "description": "Deploy automated patch management solution.",
        "related-findings": ["finding-uuid-si2"],
        "milestones": [
          {
            "uuid": "milestone-1",
            "title": "Deploy SCCM/Patch Management",
            "description": "Install and configure patch management platform.",
            "target-date": "2025-11-25"
          },
          {
            "uuid": "milestone-2",
            "title": "Test Patch Deployment",
            "description": "Pilot patch deployment to test systems.",
            "target-date": "2025-12-10"
          }
        ]
      }
    ]
  }
}
```

Output: Complete control family assessment, evidence mapping, gap analysis, remediation plan, OSCAL SAR/POA&M.


### T3: Enterprise-Wide Continuous Monitoring (≤12k tokens)

Use Case: Continuous monitoring, multi-system assessments, automated testing at scale.

Steps:

1. Continuous Monitoring Architecture

NIST SP 800-137 Continuous Monitoring Framework:

  1. Define: Metrics, assessment frequency, roles.
  2. Establish: Baseline configuration, continuous monitoring tools.
  3. Implement: Automated data collection, analysis, reporting.
  4. Analyze: Correlate findings, trend analysis, risk scoring.
  5. Respond: Remediation, escalation, risk acceptance.
  6. Review: Periodic review of monitoring effectiveness.

Automated Control Testing (60-80% effort reduction):

| Control Family | Automated Test | Tool/Method | Frequency |
|----------------|----------------|-------------|-----------|
| AC (Access Control) | Inactive account detection | SIEM query, IAM API | Daily |
| SI (System Integrity) | Vulnerability scanning | Qualys, Nessus, Tenable | Weekly |
| CM (Config Management) | Configuration drift detection | Chef InSpec, AWS Config | Continuous |
| AU (Audit/Accountability) | Log collection and retention | Splunk, ELK, SIEM | Continuous |
| IA (Identification/Auth) | MFA enrollment rate | Okta/AD reports | Daily |

Continuous Monitoring Dashboard:

```text
NIST SP 800-53 Compliance Dashboard (Real-Time)

Overall Compliance Score: 87% (283/325 controls implemented)

Control Families (MODERATE Baseline):
┌─────────────────────────────────────────────────────────────┐
│ AC (Access Control):        ████████░░ 85% (21/25)          │
│ SI (System Integrity):      ███████░░░ 78% (18/23)          │
│ CM (Configuration Mgmt):    ██████████ 92% (13/14)          │
│ AU (Audit/Accountability):  ████████░░ 81% (13/16)          │
│ IA (Identification/Auth):   ██████████ 100% (12/12)         │
└─────────────────────────────────────────────────────────────┘

Recent Findings (Last 7 Days):
❌ SI-2: 15 systems missing critical patches (detected 2025-10-24)
⚠️  AC-2(3): 3 inactive accounts not disabled (detected 2025-10-25)
✅ CM-2: Configuration drift remediated on 10 systems (2025-10-26)

Trending:
↗ Compliance improved 5% from 82% → 87% (last 30 days)
↘ Open findings decreased from 45 → 32 (last 30 days)
```
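The per-family percentages on such a dashboard can be derived directly from raw control results. A minimal sketch, assuming results arrive as a control-ID → implemented mapping (data shape is an assumption):

```python
def family_scores(results):
    """Aggregate implemented/total percentages per control family.

    results: dict mapping control ID -> bool (implemented);
    the family is inferred from the ID prefix ("AC-2" -> "AC").
    """
    totals = {}
    for control, implemented in results.items():
        family = control.split("-")[0]
        done, total = totals.get(family, (0, 0))
        totals[family] = (done + (1 if implemented else 0), total + 1)
    return {f: round(100 * d / t) for f, (d, t) in totals.items()}

scores = family_scores({"AC-1": True, "AC-2": False, "SI-2": True})
# scores == {"AC": 50, "SI": 100}
```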

2. Multi-System Assessment (Enterprise Scale)

Scenario: Assess 50 federal systems across 3 data centers.

Approach:
- Standardized Baselines: Define organization-wide baselines (LOW/MODERATE/HIGH).
- Centralized Evidence Repository: Shared evidence library (common controls).
- Automated Collection: API integrations with cloud providers, SIEM, CM tools.
- Parallel Assessments: Assess systems concurrently (reduce timeline).
- Rollup Reporting: Aggregate compliance scores, common gaps, organization-wide trends.

Common Controls (Inheritance):

| Control | Common Control Provider | Inheriting Systems |
|---------|-------------------------|--------------------|
| PE-2 (Physical Access) | Data Center Operations | All systems in DC1, DC2, DC3 |
| IR-1 (Incident Response Policy) | CISO Office | All organizational systems |
| CP-1 (Contingency Planning Policy) | Business Continuity Team | All organizational systems |

Benefit: Assess common controls once, inherit across all systems (reduce duplication).
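Inheritance resolution can be sketched as a dictionary merge where system-specific results override inherited common-control results (function and data shapes are assumptions for illustration):

```python
def resolve_controls(system_controls, common_controls, inherits):
    """Merge system-specific results with inherited common controls.

    common_controls: dict provider -> {control_id: status}.
    inherits: providers this system inherits from.
    System-specific results take precedence over inherited ones.
    """
    resolved = {}
    for provider in inherits:
        resolved.update(common_controls.get(provider, {}))
    resolved.update(system_controls)
    return resolved

common = {
    "Data Center Operations": {"PE-2": "Implemented"},
    "CISO Office": {"IR-1": "Implemented"},
}
merged = resolve_controls({"AC-2": "Partial"}, common,
                          ["Data Center Operations", "CISO Office"])
# merged == {"PE-2": "Implemented", "IR-1": "Implemented", "AC-2": "Partial"}
```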

3. Advanced Automation (Infrastructure as Code)

Policy-as-Code Validation:

```ruby
# Example: Chef InSpec test for AC-2 (Account Management).
# Sketch only: assumes Linux targets where lastlog/shadow apply.
control 'AC-2' do
  impact 1.0
  title 'Account Management'
  desc 'Verify inactive accounts are disabled after 90 days'

  # lastlog -b 90 lists accounts whose last login is older than 90 days;
  # only the header line should remain if no account is stale.
  describe command('lastlog -b 90') do
    its('stdout.lines.count') { should be <= 1 }
  end

  # Any stale account that does appear must be locked (shadow password
  # field starting with ! or *).
  stale = command("lastlog -b 90 | tail -n +2 | awk '{print $1}'").stdout.split("\n")
  stale.each do |name|
    describe shadow.users(name) do
      its('passwords') { should all(match(/^[!*]/)) }
    end
  end
end
```

Infrastructure Scanning:

```shell
# AWS Config rule for SI-2 (Flaw Remediation)
aws configservice put-config-rule --config-rule '{
  "ConfigRuleName": "ec2-managedinstance-patch-compliance-status-check",
  "Description": "Checks if patches are applied to EC2 instances",
  "Source": {
    "Owner": "AWS",
    "SourceIdentifier": "EC2_MANAGEDINSTANCE_PATCH_COMPLIANCE_STATUS_CHECK"
  },
  "Scope": {
    "ComplianceResourceTypes": ["AWS::EC2::Instance"]
  }
}'
```

4. NIST SP 800-53 Rev 5.2.0 (August 2025 Updates)

Key Updates in Release 5.2.0:
- SA-15 (Development Process, Standards, Tools): Enhanced software update security.
- SA-11 (Developer Testing): Improved testing requirements for software security.
- CM-3 (Configuration Change Control): Strengthened update deployment management.
- SI-7 (Software Integrity): Enhanced integrity verification and validation.

Executive Order 14306 Compliance:
- Focus: "Strengthening the Nation's cybersecurity."
- Emphasis: Software supply chain security, update integrity, secure development.

Output: Continuous monitoring architecture, multi-system assessment, policy-as-code automation, 5.2.0 compliance.


## Decision Rules

1. Baseline Selection:
   - If system impact level known → use corresponding baseline (LOW/MODERATE/HIGH).
   - If impact unknown → default to MODERATE (325 controls, most federal systems).
   - If custom controls → validate against the full 1,189-control catalog.
2. Evidence Sufficiency:
   - Strong Evidence: Multiple artifacts from different sources (examine + interview + test).
   - Weak Evidence: Single artifact or outdated evidence (>1 year old).
   - No Evidence: No documentation or artifacts → control marked "Not Implemented."
3. Gap Prioritization:
   - Prioritize by risk score: (Criticality × Impact × Likelihood) / Effort.
   - Critical controls (HIGH baseline) before enhancements.
   - Quick wins (high risk, low effort) prioritized for fast remediation.
4. Automation Threshold:
   - If >10 systems → implement continuous monitoring (cost-effective).
   - If evidence collection >100 hours → automate with API integrations.
   - If assessment frequency ≥ quarterly → continuous monitoring required.
5. OSCAL Integration:
   - If an OSCAL SSP exists → import and validate control descriptions.
   - Generate an OSCAL SAR for all assessments (interoperability).
   - If gaps identified → generate an OSCAL POA&M with milestones.

Uncertainty Thresholds:
- If baseline unclear → request system categorization (FIPS 199).
- If evidence sources unknown → emit TODO list with required artifacts.
- If control scope ambiguous → default to full baseline assessment.
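The automation-threshold rules lend themselves to a small decision function. Threshold values come from the rules above; the function and parameter names are assumptions:

```python
def monitoring_mode(system_count, evidence_hours, assessments_per_year):
    """Pick a monitoring approach per the automation-threshold rules.

    >10 systems or quarterly-or-more assessments -> continuous monitoring;
    >100 evidence-collection hours -> automated collection; else manual.
    """
    if system_count > 10 or assessments_per_year >= 4:
        return "continuous-monitoring"
    if evidence_hours > 100:
        return "automated-collection"
    return "manual"

mode = monitoring_mode(system_count=50, evidence_hours=30, assessments_per_year=1)
# 50 systems exceeds the >10 threshold -> "continuous-monitoring"
```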


## Output Contract

Required Fields:

```yaml
baseline:
  - level: "LOW" | "MODERATE" | "HIGH" | "CUSTOM"
    control_count: integer
    control_families: array (AC, SI, CM, etc.)

validation_results:
  - control_id: string (e.g., "AC-2")
    control_name: string
    family: string
    implementation_status: "Implemented" | "Partial" | "Not Implemented"
    evidence_artifacts: array (file names, URLs)
    compliance_score: float (0-100%)
    findings: string (summary)

evidence_mapping:
  - control_family: string
    automated_evidence: array (log files, configs, scan results)
    manual_evidence: array (policies, interviews, procedures)
    collection_method: "automated" | "manual" | "hybrid"

gap_analysis:
  - gap_type: "Implementation" | "Evidence" | "Partial" | "Enhancement"
    control_id: string
    risk_score: float
    remediation_effort: integer (hours)
    priority: integer (1=highest)
    deadline: date

remediation_plan:
  - control_id: string
    remediation_action: string
    assigned_to: string (role/team)
    effort_estimate: integer (hours)
    target_date: date
    dependencies: array (other controls)

oscal_outputs:
  - sar: object (OSCAL SAR JSON)
    poam: object (OSCAL POA&M JSON)
    format: "json" | "xml" | "yaml"

compliance_metrics:
  - overall_score: float (0-100%)
    by_family: object ({AC: 85%, SI: 78%, ...})
    trend: "improving" | "stable" | "declining"
    finding_count: integer

continuous_monitoring:  # If T3
  - enabled: boolean
    assessment_frequency: string (daily, weekly, monthly)
    automated_tests: array (test names)
    dashboard_url: string
```

Token Tier Minimums:
- T1: baseline, validation_results (5 sample controls), gap_analysis (top 3).
- T2: All of T1 + evidence_mapping, remediation_plan, oscal_outputs, compliance_metrics.
- T3: All of T2 + continuous_monitoring, multi-system assessment, advanced automation.


## Examples

AC-2 Account Management Validation:

```yaml
control_id: AC-2
control_name: Account Management
family: AC (Access Control)
implementation_status: Implemented
evidence_artifacts:
  - account_management_policy.pdf
  - quarterly_user_review_2025-Q3.xlsx
  - qualys_inactive_account_scan.pdf
compliance_score: 100%
findings: All assessment objectives met. Automated testing confirms inactive accounts disabled after 90 days.
```

See examples/nist-moderate-baseline-assessment.txt for a complete MODERATE baseline assessment.


## Quality Gates

1. Token Budgets:
   - T1 response ≤2k tokens (quick gap analysis, 5 controls).
   - T2 response ≤6k tokens (full family assessment, evidence mapping).
   - T3 response ≤12k tokens (continuous monitoring, multi-system).
2. Safety Checks:
   - No PII or classified information in evidence artifacts.
   - Evidence collection respects data privacy (NIST PT family).
   - Automated tests do not disrupt production systems.
3. Auditability:
   - All validation results include evidence artifact references.
   - Gap analysis includes risk scoring methodology.
   - Remediation plans include effort estimates and deadlines.
4. Determinism:
   - Same baseline + evidence → same validation results.
   - Gap prioritization is deterministic (risk score formula).
5. Citations:
   - NIST SP 800-53 Rev 5.2.0: 1,189 controls across 20 families (accessed 2025-10-26T18:38:09-0400, NIST CSRC).
   - Release 5.2.0 (August 27, 2025): Software security, Executive Order 14306 (accessed 2025-10-26T18:38:09-0400, NIST News).
   - Automation: 60-80% effort reduction (accessed 2025-10-26T18:38:09-0400, Satine Tech).

## Resources

Official NIST Documentation:
- NIST SP 800-53 Rev 5.2.0 (August 2025) (accessed 2025-10-26T18:38:09-0400)
- NIST SP 800-53A Rev 5.2.0 Assessment Procedures (accessed 2025-10-26T18:38:09-0400)
- NIST SP 800-137 Continuous Monitoring
- NIST SP 800-53B Control Baselines

OSCAL Integration:
- OSCAL Homepage
- OSCAL SAR Schema
- OSCAL POA&M Schema

Automation Tools:
- Chef InSpec (compliance as code)
- AWS Config (cloud resource compliance)
- Azure Policy (NIST compliance built-in)
- OpenSCAP (open-source security compliance)

# Supported AI Coding Agents

This skill is compatible with the SKILL.md standard and works with all major AI coding agents.

Learn more about the SKILL.md standard and how to use these skills with your preferred AI coding agent.