Three-Dimensional Assessment Methodology
The SecAI Framework Approach
The SecAI Framework evaluates Azure environments across three critical dimensions:
- Configuration Assessment - Technical settings and deployed resources
- Process Assessment - Operational procedures and governance
- Best Practices Assessment - Alignment with industry frameworks
This methodology was developed through real-world assessments of enterprise Azure environments and has proven effective for:
- CSP to MCA migrations
- Azure Landing Zone validations
- Compliance audit preparation
- Quarterly security assessments
Dimension 1: Configuration Assessment
Objective
Systematically collect evidence of deployed resources and security configurations.
Method
Automated PowerShell + Python scripts executing Azure CLI and Resource Graph queries.
Coverage: 12 Security Domains
- Network Security - VNets, NSGs, Firewalls, Load Balancers, Private Endpoints
- Identity & Access Management - RBAC, PIM, Azure AD
- Privileged Access - Privileged role assignments, PIM usage
- Data Protection - Encryption, Key Vault, TDE, storage security
- Asset Management - Resource inventory, tagging, classification
- Logging & Threat Detection - Log Analytics, Sentinel, diagnostics
- Incident Response - Processes, playbooks, response capabilities
- Posture & Vulnerability - Secure Score, recommendations
- Endpoint Security - VM security, antimalware, patching
- Backup & Recovery - Recovery vaults, policies, DR
- DevOps Security - CI/CD security, IaC scanning
- Governance & Strategy - Policies, compliance, standards
Execution
```shell
cd implementation/2-Scripts/Collection
pwsh ./00_login.ps1
pwsh ./01_scope_discovery.ps1
pwsh ./02_inventory.ps1
pwsh ./03_policies_and_defender.ps1
pwsh ./04_identity_and_privileged_NO_AD_BYPASS_SSL.ps1
pwsh ./05_network_security.ps1
pwsh ./06_data_protection.ps1
pwsh ./07_logging_threat_detection.ps1
pwsh ./08_backup_recovery.ps1
pwsh ./09_posture_vulnerability.ps1
python ./10_evidence_counter.py
```
Output: 800+ JSON evidence files
Time: 3-4 hours automated execution
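The final step's evidence tally can be sketched as follows. The per-domain directory layout and the function name are assumptions for illustration, not the actual internals of `10_evidence_counter.py`:

```python
from pathlib import Path

def count_evidence(out_dir: str) -> dict:
    """Tally JSON evidence files per domain subdirectory.

    Assumes (hypothetically) that each collection script writes its
    JSON output into a per-domain folder under `out_dir`.
    """
    counts = {}
    root = Path(out_dir)
    for domain_dir in sorted(p for p in root.iterdir() if p.is_dir()):
        counts[domain_dir.name] = len(list(domain_dir.glob("*.json")))
    counts["total"] = sum(v for k, v in counts.items() if k != "total")
    return counts
```

A quick per-domain count like this also doubles as a completeness check: a domain folder with zero files usually means a collection script failed silently.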
Dimension 2: Process Assessment
Objective
Evaluate operational maturity and governance effectiveness.
Method
Structured interviews with technical teams and leadership, supplemented by documentation review.
Coverage: 8 Operational Domains
- Change Management - How changes are approved and deployed
- Incident Response - Security event handling and escalation
- Access Provisioning - User onboarding/offboarding processes
- Patch Management - Update deployment and testing
- Security Monitoring - Alert triage and investigation
- Backup & Recovery - DR testing and validation
- Compliance Management - Policy enforcement and audit prep
- Vendor Management - Third-party security oversight
Maturity Model
5-level scoring: Initial → Managed → Defined → Quantitatively Managed → Optimizing
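The five levels above can be mapped to a 1-5 numeric scale for aggregate scoring. The level names come from the text; the scoring helper below is an illustrative sketch, not the framework's actual scoring code:

```python
# The five maturity levels from the model, mapped to a 1-5 scale.
MATURITY_LEVELS = {
    "Initial": 1,
    "Managed": 2,
    "Defined": 3,
    "Quantitatively Managed": 4,
    "Optimizing": 5,
}

def average_maturity(domain_ratings: dict) -> float:
    """Average numeric maturity across assessed operational domains."""
    scores = [MATURITY_LEVELS[level] for level in domain_ratings.values()]
    return round(sum(scores) / len(scores), 2)
```

For example, rating Change Management and Backup & Recovery as "Defined", Incident Response as "Managed", and Patch Management as "Initial" would average to 2.25.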
Execution
Use structured interview templates with key personnel (2-4 hours per domain).
Output:
- Interview transcripts
- Process maturity scores
- Gap analysis by domain
Time: 8-16 hours interviews
Dimension 3: Best Practices Assessment
Objective
Validate alignment with industry security frameworks and compliance standards.
Method
Automated PowerShell validation modules comparing collected evidence against control requirements.
Coverage: 5 Compliance Frameworks
- Microsoft Cloud Security Benchmark (MCSB) - Native Azure framework
- CIS Controls v8 - Critical security controls
- NIST SP 800-53 - Government/enterprise standard
- PCI-DSS v3.2.1 - Payment card security
- CSA Cloud Controls Matrix (CCM) - Cloud-specific controls
Execution
```shell
cd workspace/3-Best-Practices-Work
pwsh ./Validate-All-Frameworks.ps1 -DataPath "../../implementation/2-Scripts/out"
```
Output:
- Compliance scores by framework
- Control pass/fail details
- Gap analysis reports
- Remediation priorities
Time: 5-10 minutes automated validation
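The validation pattern, comparing collected evidence against control requirements, can be sketched like this. The control IDs, check predicates, and evidence fields are illustrative assumptions, not the framework's actual control mappings:

```python
def validate_controls(evidence: dict, controls: list) -> dict:
    """Evaluate each control's predicate against collected evidence;
    return pass/fail details plus an overall compliance percentage."""
    results = []
    for control in controls:
        passed = bool(control["check"](evidence))
        results.append({"id": control["id"], "passed": passed})
    score = round(100 * sum(r["passed"] for r in results) / len(results), 1)
    return {"score": score, "results": results}

# Hypothetical controls for illustration (not real CIS/MCSB mappings).
SAMPLE_CONTROLS = [
    {"id": "storage-https-only", "check": lambda e: e.get("https_only") is True},
    {"id": "diagnostics-enabled", "check": lambda e: e.get("diagnostics") == "enabled"},
]
```

Keeping each control as an independent predicate over the evidence is what makes the same collected data reusable across all five frameworks: only the control definitions change.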
Recommended Execution Sequence
Phase 1: Dimension 1 (Week 1)
- Run all collection scripts (3-4 hours)
- Transform to CSV (30 minutes)
- Initial data review
Phase 2: Dimension 3 (Week 1-2)
- Run multi-framework validation (5-10 minutes)
- Review compliance scores
- Identify top gaps
Phase 3: Dimension 2 (Week 2-3)
- Use D3 findings to focus interviews
- Conduct structured interviews (8-16 hours)
- Score process maturity
Phase 4: Reporting (Week 3-4)
- Generate executive summary
- Create remediation roadmap
- Present findings
Output Deliverables
Configuration Evidence
- 800+ JSON files of Azure resource data
- CSV files for analysis
- Resource inventory spreadsheet
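The CSV transform behind these deliverables could follow a pattern like the sketch below. The record fields and the union-of-keys flattening rule are assumptions about the evidence format, not the framework's actual converter:

```python
import csv
import io

def evidence_to_csv(json_records: list) -> str:
    """Flatten a list of evidence dicts into CSV text.

    Takes the union of keys across all records so rows with
    differing fields still share a single header row; missing
    values are emitted as empty cells.
    """
    fieldnames = sorted({k for rec in json_records for k in rec})
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(json_records)
    return buf.getvalue()
```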
Process Documentation
- Interview transcripts
- Process maturity scores
- Gap analysis by domain
Compliance Reports
- Framework-specific scores (e.g., “CIS: 78% compliant”)
- Control pass/fail details
- Prioritized remediation list
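A prioritized remediation list can be derived by ordering failed controls. The severity weighting and tie-break on affected resource count below are an illustrative heuristic, not the framework's actual prioritization method:

```python
# Hypothetical severity weights for ordering findings.
SEVERITY_WEIGHT = {"critical": 4, "high": 3, "medium": 2, "low": 1}

def prioritize_findings(findings: list) -> list:
    """Order failed controls by severity, then by the number of
    affected resources, most urgent first."""
    return sorted(
        findings,
        key=lambda f: (SEVERITY_WEIGHT[f["severity"]], f["affected_resources"]),
        reverse=True,
    )
```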
Executive Summary
- Overall security posture rating
- Top 10 risks
- Remediation roadmap with timelines
Real-World Validation
The SecAI Framework was validated through an assessment of a confidential insurance services customer:
Environment:
- 34+ Azure subscriptions (CSP and MCA tenants)
- 5,000+ resources across Dev, Test, UAT, PreProd, Prod
- Complex security stack: Wiz, CrowdStrike, Cribl, Chronicle, Splunk
- Multi-environment logging architecture
Results:
- 800+ evidence files collected
- Compliance scores across 5 frameworks
- 200+ findings identified
- Prioritized remediation roadmap delivered
Timeline:
- Week 1: Dimension 1 execution
- Week 2: Dimension 3 validation, interviews begin
- Week 3: Complete Dimension 2 interviews
- Week 4: Report generation and presentation
Comparison with Traditional Assessments
| Aspect | Traditional Assessment | SecAI Framework |
|---|---|---|
| Scope | Single dimension (config OR process) | Three dimensions (config + process + best practices) |
| Automation | Mostly manual | 80% automated (D1 + D3) |
| Frameworks | Single framework focus | Multi-framework (5 frameworks) |
| Evidence | Sample-based | Comprehensive (800+ files) |
| Timeline | 6-8 weeks | 2-4 weeks |
| Repeatability | Manual, varies by assessor | Automated, consistent |
| Cost | High consulting fees | Open source + internal labor |
Best Practices for Execution
Before Starting
- Obtain proper authorization from Azure tenant owner
- Document scope and objectives
- Set up data storage location (10GB+)
- Test Azure CLI connectivity
- Verify RBAC permissions
During Execution
- Run scripts during off-hours (lower Azure API load)
- Monitor script progress and logs
- Save evidence files securely
- Don’t modify scripts mid-execution
- Document any errors or issues
After Completion
- Backup all evidence files
- Review data quality and completeness
- Run validation checks
- Generate preliminary findings
- Schedule stakeholder review
Integration with Azure Landing Zones
The SecAI Framework aligns with Azure Landing Zone architecture:
Platform Subscriptions:
- Identity subscription assessment
- Management subscription assessment
- Connectivity (hub) subscription assessment
Landing Zone Subscriptions:
- Corp (internal) subscription assessment
- Online (external) subscription assessment
Validation Against ALZ Policies:
- Policy compliance checking
- Configuration drift detection
- Baseline alignment verification
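Configuration drift can be detected by diffing two evidence snapshots, e.g. the ALZ baseline against the latest collection run. The `{resource_id: config}` snapshot shape is an assumption for illustration:

```python
def detect_drift(baseline: dict, current: dict) -> dict:
    """Compare two {resource_id: config} snapshots and report
    resources that were added, removed, or changed."""
    added = sorted(set(current) - set(baseline))
    removed = sorted(set(baseline) - set(current))
    changed = sorted(
        rid for rid in set(baseline) & set(current)
        if baseline[rid] != current[rid]
    )
    return {"added": added, "removed": removed, "changed": changed}
```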
Continuous Assessment
The framework supports ongoing security monitoring:
Quarterly Assessments:
- Rerun Dimension 1 scripts quarterly
- Track changes over time
- Monitor compliance score trends
- Identify new risks
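Score trends across quarterly reruns can be tracked by comparing successive results. The `(quarter, score)` history shape is illustrative:

```python
def score_trend(history: list) -> list:
    """Given a chronological list of (quarter, score) pairs,
    report each assessment's change from the previous one."""
    deltas = []
    for (_, prev_score), (quarter, score) in zip(history, history[1:]):
        deltas.append({
            "quarter": quarter,
            "score": score,
            "change": round(score - prev_score, 1),
        })
    return deltas
```

A negative `change` value flags a quarter where compliance regressed and warrants a targeted review.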
Monthly Spot Checks:
- Run specific domain scripts (e.g., Network, Identity)
- Quick compliance validation
- Targeted reviews
Continuous Monitoring:
- Azure Policy continuous compliance
- Microsoft Defender for Cloud alerts
- Log Analytics queries for anomalies
Last Updated: October 18, 2025
Status: Production Ready
Framework Version: 2.0