What Is CRAMM? Start Here

CRAMM stands for CCTA Risk Analysis and Management Method. If that name doesn’t immediately ring a bell, you’re not alone — CRAMM occupies a particular corner of the information security world that’s less visible in mainstream US cybersecurity conversations but has been quietly central to UK government, NATO, and critical infrastructure risk management for nearly four decades.

The short version: CRAMM is a structured, tool-supported IT security risk assessment methodology developed by the UK government in 1987. It gives you a systematic way to identify and value your information assets, assess the threats and vulnerabilities that could affect them, and select proportionate countermeasures from a pre-built library of over 3,000 controls. The output is an audit-ready risk assessment and a prioritised treatment plan.

If you’ve encountered CRAMM in a job description, a government contract requirement, a NATO security accreditation context, or a comparison with ISO 27005 or NIST SP 800-30, this guide will give you everything you need to understand what it does, how it works, where it’s used today, and how it compares to the frameworks you’re probably already familiar with.

| Attribute | Detail |
| --- | --- |
| Full name | CCTA Risk Analysis and Management Method |
| Acronym origin | CCTA = Central Computer and Telecommunications Agency (UK government) |
| Developed by | CCTA (merged into the Office of Government Commerce in 2000, whose functions later moved into the Cabinet Office), 1987 |
| Current owner | Insight Consulting (licensed commercial product) |
| Current version | CRAMM version 5.1 (last major release; ongoing maintenance updates) |
| Primary purpose | Structured IT/information security risk assessment aligned to UK government and NATO requirements |
| Standards basis | Aligned to ISO/IEC 27001, ISO/IEC 27005, BS 7799; precursor to many modern ISMS frameworks |
| Typical users | UK central government, MOD, NATO member states, financial services, critical national infrastructure |
| US applicability | NIST SP 800-30 alignment; used by US federal contractors operating in UK/NATO contexts |

Table 1: CRAMM at a glance — key facts and context.

A Brief History of CRAMM: From Whitehall to the World

CRAMM was developed in 1987 by the Central Computer and Telecommunications Agency (CCTA), the UK government body responsible for information technology policy and procurement for central government. The mandate was practical: British government departments were operating increasingly complex IT systems and had no consistent framework for assessing the security risks those systems posed.

The first version of CRAMM introduced the three-stage structure that still defines the methodology today — asset identification and valuation, threat and vulnerability assessment, and countermeasure selection. Version 3, released in the mid-1990s, drove widespread adoption across UK central government departments and the Ministry of Defence. NATO adopted CRAMM as an accreditation tool for classified information systems across member states, giving it reach well beyond the UK.

Insight Consulting acquired the commercial rights and developed the CRAMM Expert software tool, which automates much of the calculation-heavy Stage 2 and Stage 3 work. Version 5.1, the current release, updated the threat library and countermeasure set to reflect modern cyber threats while maintaining backward compatibility with the audit trails organisations had built up over years of use.

Today, CRAMM operates alongside rather than instead of ISO 27001/27005 and NIST frameworks. Many organisations use it as their primary risk assessment engine within a broader ISMS built to ISO/IEC 27001:2022 requirements. The detailed audit trail CRAMM generates is one reason it retains a loyal following among organisations with formal accreditation obligations.

How CRAMM Works: The Three-Stage Methodology

CRAMM’s three stages follow a logical progression from understanding what you have, to understanding what threatens it, to deciding what to do about it. Here’s the structure in full:

| Stage | Name | Key Activities | Outputs | ISO 27005 / NIST SP 800-30 Alignment |
| --- | --- | --- | --- | --- |
| 1 | Asset Identification and Valuation | Scope the system; catalogue assets (data, applications, infrastructure); assign business impact values to data assets using CRAMM value scales (1–10) | Asset register; business impact ratings; data dependency map | ISO 27005 §8.2 Asset identification; NIST 800-30 Step 1 System characterisation |
| 2 | Threat and Vulnerability Assessment | Select applicable threats from CRAMM threat library (70+ threat types); assess countermeasure effectiveness; score vulnerability for each asset-threat pair | Threat profile; vulnerability scores per asset; risk matrix (threat × vulnerability × asset value) | ISO 27005 §8.3 Threat identification; §8.4 Vulnerability identification; NIST 800-30 Steps 2–4 |
| 3 | Countermeasure Selection and Review | Map risk scores to CRAMM countermeasure library (3,000+ controls); select, justify, and prioritise countermeasures; produce implementation schedule | Countermeasure recommendations; residual risk statement; audit-ready risk treatment plan | ISO 27005 §9 Risk treatment; ISO 27001 Annex A controls; NIST 800-30 Step 8 Control recommendations |

Table 2: CRAMM three-stage methodology with activities, outputs, and standards alignment.

Stage 1: Asset Identification and Valuation

Stage 1 is where most teams underestimate the effort required. Before you can assess risk, you need to know what you’re protecting — and that means building a genuinely complete asset inventory, not just listing your servers.

CRAMM recognises three asset types: data assets (the information itself, which is typically what you’re most concerned about protecting), application assets (the software that processes the data), and physical infrastructure (hardware, communications, facilities). For each data asset, you score the potential impact of a confidentiality breach, integrity compromise, or availability loss using CRAMM’s 1–10 scale. Those scores become the asset value that flows through into Stage 2 risk calculation.
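The valuation step can be sketched in a few lines of Python. Taking the highest of the three CIA impact scores as the asset value is a common simplification; the licensed CRAMM Expert tool applies its own internal valuation rules, so treat the `asset_value` helper below as illustrative rather than the official formula:

```python
def asset_value(confidentiality: int, integrity: int, availability: int) -> int:
    """Derive a single CRAMM-style asset value (1-10) from per-dimension
    impact scores. Taking the maximum reflects the idea that an asset
    should be protected at the level of its worst-case impact. This is a
    common convention, not the licensed tool's exact rule."""
    for score in (confidentiality, integrity, availability):
        if not 1 <= score <= 10:
            raise ValueError(f"impact scores must be 1-10, got {score}")
    return max(confidentiality, integrity, availability)


# A customer PII store: disclosure is the dominant concern.
print(asset_value(confidentiality=9, integrity=6, availability=5))  # 9
```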

The asset valuation exercise is fundamentally a business impact assessment, and the questions it forces — what would happen to this organisation if this data were disclosed to an unauthorised party? — are the same questions a sound BIA process asks. If you’ve run a BIA under your business continuity programme, Stage 1 will feel familiar.

For organisations that have already completed a Business Impact Analysis under ISO 22301, Stage 1 data is largely available — you’re mapping existing BIA outputs into CRAMM’s asset value scale. See our guide on Business Impact Analysis: A Complete Framework for how to structure that mapping.

| Score | Level | Business Impact Description | Example Data Asset | Indicative RTO/RPO Range |
| --- | --- | --- | --- | --- |
| 1–2 | Very Low | Loss causes minor inconvenience; easily recoverable from public sources | Published press releases; publicly available product specs | RTO 72 hrs+; RPO 48 hrs+ |
| 3–4 | Low | Loss causes operational disruption but limited financial or reputational harm | Internal training materials; non-sensitive staff records | RTO 48 hrs; RPO 24 hrs |
| 5–6 | Medium | Loss causes significant operational impact; moderate financial loss or reputational damage possible | Customer PII; contract terms; internal financial reports | RTO 24 hrs; RPO 8 hrs |
| 7–8 | High | Loss causes major operational disruption; significant financial loss; regulatory breach likely | Authentication credentials; payment card data; patient health records | RTO 4 hrs; RPO 1 hr |
| 9–10 | Very High | Loss threatens organisational survival; national security implications; irreversible harm | Classified intelligence; nuclear facility control data; central bank transaction logs | RTO < 1 hr; RPO near-zero |

Table 3: CRAMM asset value scale with business impact descriptions, example assets, and RTO/RPO guidance.

Stage 2: Threat and Vulnerability Assessment

This is where CRAMM’s built-in libraries do the heavy lifting. The CRAMM threat library contains over 70 categorised threat types — covering deliberate human threats (from insider threats and espionage to hacking and sabotage), accidental human threats (operator errors, procedural failures), environmental threats (fire, flood, power failure), and technical threats (hardware failure, software defects, transmission errors).

For each asset, you work through the relevant threats from the library and score two dimensions: the likelihood of the threat occurring (informed by your environment, sector, and existing controls) and the vulnerability of the asset to that threat given your current countermeasure posture. The CRAMM risk score for each asset-threat combination is derived from those inputs, weighted by the Stage 1 asset value.
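CRAMM Expert derives the combined threat score internally, and the exact weighting is not published with the methodology; a round-half-up mean of the two Stage 2 inputs is one plausible stand-in, sketched here purely for illustration:

```python
def threat_score(likelihood: int, vulnerability: int) -> int:
    """Fold the two Stage 2 dimensions into a single 1-10 threat score
    using a round-half-up mean. Illustrative only: the licensed CRAMM
    Expert tool computes this internally with its own weighting."""
    for score in (likelihood, vulnerability):
        if not 1 <= score <= 10:
            raise ValueError(f"Stage 2 scores must be 1-10, got {score}")
    return (likelihood + vulnerability + 1) // 2
```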

| Threat Category | CRAMM Threat Type | Description | Assets Most at Risk | CIA Impact | Typical Vulnerability Driver |
| --- | --- | --- | --- | --- | --- |
| Human – Deliberate | Unauthorised access | Insider or external actor accessing systems beyond authorised permissions | Data, applications, network | Confidentiality, Integrity | Weak access controls; poor IAM |
| Human – Deliberate | Malicious code (malware/ransomware) | Intentional introduction of software designed to damage, disrupt or exfiltrate | Applications, data, endpoints | Confidentiality, Integrity, Availability | Unpatched systems; no AV/EDR |
| Human – Accidental | Operator error | Unintentional misconfiguration, deletion, or data entry error by authorised user | Data, applications | Integrity, Availability | Insufficient training; no change control |
| Environmental | Power failure | Loss of electrical supply to data centre or comms facilities | Infrastructure, network | Availability | No UPS/generator; single power feed |
| Environmental | Fire | Physical fire event causing damage to hardware, media, or facilities | Physical infrastructure, media | Availability, Integrity | No suppression; poor cable management |
| Technical | Hardware failure | Failure of servers, storage, or network hardware due to age, defect, or overload | Infrastructure | Availability | No redundancy; ageing hardware |
| Supply chain | Third-party service failure | Failure of outsourced service or cloud provider to deliver contracted availability | Applications, data, network | Availability, Confidentiality | No SLA monitoring; no exit plan |

Table 4: Selected CRAMM threat library entries — categories, CIA impact, and typical vulnerability drivers.

The risk calculation is where the methodology becomes concrete. A high-value asset facing a well-established threat against which you have weak countermeasures will generate a critical risk score. The same threat against a low-value asset generates a low risk score even if your defences are poor. This weighting by asset value is what makes CRAMM proportionate rather than generic — it stops you from applying the same level of protection to your published press releases as to your payment processing data.

| Threat Score \ Asset Value | 1–2 (Very Low) | 3–4 (Low) | 5–6 (Medium) | 7–8 (High) | 9–10 (Very High) |
| --- | --- | --- | --- | --- | --- |
| Very Low (1–2) | 1 — Negligible | 2 — Negligible | 4 — Low | 6 — Low | 8 — Medium |
| Low (3–4) | 2 — Negligible | 4 — Low | 8 — Medium | 12 — High | 16 — Critical |
| Medium (5–6) | 4 — Low | 8 — Medium | 12 — High | 18 — Critical | 30 — Critical |
| High (7–8) | 6 — Low | 12 — High | 18 — Critical | 24 — Critical | 40 — Critical |
| Very High (9–10) | 8 — Medium | 16 — Critical | 30 — Critical | 40 — Critical | 50 — Critical |

Table 5: CRAMM risk scoring matrix — Asset Value × Threat Score. Green = Low/Negligible, Yellow = Medium, Orange = High, Red = Critical.
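Table 5 translates directly into a lookup. The sketch below indexes each axis into the five two-point bands and reads the score off the matrix; the label thresholds are read straight from the table's cell annotations:

```python
# Table 5 as a lookup. Row = threat-score band, column = asset-value band;
# bands 0-4 cover scores 1-2, 3-4, 5-6, 7-8, 9-10 respectively.
RISK_MATRIX = [
    [1, 2, 4, 6, 8],
    [2, 4, 8, 12, 16],
    [4, 8, 12, 18, 30],
    [6, 12, 18, 24, 40],
    [8, 16, 30, 40, 50],
]


def band(score: int) -> int:
    """Map a 1-10 score to its matrix band (0-4)."""
    if not 1 <= score <= 10:
        raise ValueError(f"score must be 1-10, got {score}")
    return (score - 1) // 2


def risk_score(asset_value: int, threat: int) -> int:
    return RISK_MATRIX[band(threat)][band(asset_value)]


def risk_level(score: int) -> str:
    """Label thresholds read off Table 5's cell annotations."""
    if score <= 2:
        return "Negligible"
    if score <= 6:
        return "Low"
    if score <= 8:
        return "Medium"
    if score <= 12:
        return "High"
    return "Critical"
```

For example, an asset value of 8 facing a threat score of 3 yields 12 (High), matching the power-supply row of the worked example later in this guide.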

Stage 3: Countermeasure Selection

Stage 3 is where CRAMM’s 3,000-plus countermeasure library earns its keep. Once your risk scores are calculated, the CRAMM Expert tool maps each risk score to a set of recommended countermeasures from the library. The recommendations are tiered — baseline countermeasures that should always be in place regardless of risk level, and additional countermeasures triggered by specific risk scores.

Each countermeasure in the library is categorised by the type of control it represents (physical, technical, procedural), the threat types it addresses, and the implementation complexity. The Stage 3 output isn’t just a list of things to do — it’s a prioritised implementation schedule based on risk scores, with the highest-score risks driving the first tranche of countermeasures.

This direct link between risk score and control recommendation is one of CRAMM’s most practically useful features. It removes the judgement call about which controls are proportionate to which risks and replaces it with a documented, auditable rationale — which is exactly what regulators and accreditors want to see.
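A miniature of that tiering logic, with a hypothetical four-entry library standing in for the real 3,000-plus control set (the IDs, names, and trigger thresholds here are invented for illustration):

```python
# Hypothetical miniature of the countermeasure library. min_score is the
# risk score at which the control is triggered; 0 marks a baseline
# control that should always be in place. IDs and thresholds are invented.
LIBRARY = [
    {"id": "CM-001", "name": "Security policy and asset ownership", "min_score": 0},
    {"id": "CM-114", "name": "Role-based access control", "min_score": 13},
    {"id": "CM-201", "name": "Immutable offline backups", "min_score": 25},
    {"id": "CM-305", "name": "24/7 SOC monitoring with SIEM", "min_score": 31},
]


def select_countermeasures(risk: int) -> list[str]:
    """Return the IDs of every control triggered at this risk score,
    baseline controls first, then by ascending trigger threshold."""
    hits = [c for c in LIBRARY if c["min_score"] <= risk]
    return [c["id"] for c in sorted(hits, key=lambda c: c["min_score"])]
```

The higher the risk score, the deeper into the library the selection reaches, which is exactly the prioritisation behaviour the tool provides at scale.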

Worked Example: CRAMM Assessment for a Government Finance Portal

Here’s how the three stages play out for a concrete system — a citizen-facing tax filing portal operated by a government agency. The system processes personal financial data for millions of citizens, which means asset values are high across confidentiality and availability dimensions. The environment includes public internet exposure, legacy infrastructure components, and a mix of internal and outsourced operational staff.

| Asset ID | Asset | Threat | Asset Value | Threat Score | Vuln Score | Risk Score | Countermeasure (Stage 3) |
| --- | --- | --- | --- | --- | --- | --- | --- |
| DA-01 | Citizen tax records (database) | Unauthorised access | 9 | 7 | 8 | 40 — Critical | Implement role-based access control; MFA for all admin accounts; quarterly access review; SOC monitoring with SIEM alerts |
| DA-01 | Citizen tax records (database) | Ransomware | 9 | 8 | 6 | 30 — Critical | Immutable offline backups (3-2-1 rule); network segmentation; endpoint EDR; tested IR playbook |
| APP-01 | Tax filing web application | Denial of service | 8 | 6 | 7 | 24 — Critical | Cloud DDoS mitigation service; rate limiting at WAF; standby failover environment; BCP runbook for degraded service |
| NET-01 | Core network switch infrastructure | Hardware failure | 7 | 5 | 6 | 18 — Critical | N+1 hardware redundancy; hot-standby configuration; 4-hour vendor hardware SLA |
| APP-02 | Internal HR portal | Operator error – data deletion | 5 | 4 | 5 | 12 — High | Soft-delete with 30-day retention; change control process; role separation between HR admin and system admin |
| FAC-01 | Data centre power supply | Power failure | 8 | 3 | 4 | 12 — High | Dual UPS with generator backup; annual load test; automatic transfer switch; DCIM monitoring |

Table 6: CRAMM risk assessment — six asset-threat combinations with risk scores and countermeasure recommendations.

A few things worth noting in this example. First, the same asset (DA-01, the tax records database) appears twice with different threats and different risk scores. CRAMM assesses each asset-threat combination independently, which is why you can have a 40 (Critical) for unauthorised access and a 30 (Critical) for ransomware against the same asset. The countermeasures for each are related but distinct.

Second, the risk scores are proportionate. FAC-01 (power supply) scores 12 — High despite being an asset value of 8, because the threat likelihood and vulnerability scores are relatively low (3 and 4). Compare that to DA-01 where high asset value combined with elevated threat and vulnerability scores pushes both combinations into Critical territory. The model rewards organisations that have strong physical security and power resilience, even for high-value assets.

Third, Stage 3 countermeasures are specific and actionable. “Implement role-based access control; MFA for all admin accounts; quarterly access review; SOC monitoring with SIEM alerts” is a workable implementation brief, not a vague recommendation to “improve access controls.”

CRAMM vs. ISO 27005 vs. NIST SP 800-30: Which Should You Use?

This is the question that comes up most often from practitioners working across UK/US or multinational contexts. The honest answer is that these aren’t mutually exclusive — they operate at different levels of prescription and serve overlapping but distinct purposes.

| Dimension | CRAMM v5.1 | ISO/IEC 27005:2022 | NIST SP 800-30 Rev.1 |
| --- | --- | --- | --- |
| Origin | UK Government (CCTA), 1987; commercial product | International standard (ISO/IEC JTC 1/SC 27); 2022 edition | US Federal Government (NIST); 2012 edition |
| Methodology type | Prescriptive toolset with structured library-driven process | Principles-based framework; methodology-agnostic | Structured process guidance; threat-source/scenario focused |
| Asset valuation | Mandatory; 1–10 scale across CIA dimensions; tool-assisted | Required; approach left to practitioner | Required; qualitative or quantitative; Very Low–Very High scale |
| Threat library | Built-in; 70+ threat types; regularly maintained | No built-in library; references external threat catalogues | Appendix D threat sources; Appendix E threat events |
| Countermeasure library | Built-in; 3,000+ controls; linked to risk scores | No built-in library; references ISO 27001 Annex A / external | No built-in library; practitioner-defined |
| Risk scoring | Quantitative: Asset Value × Threat Score × Vulnerability Score | Qualitative or quantitative; practitioner choice | Semi-quantitative: Likelihood × Impact using defined scales |
| Software tool | CRAMM Expert software (licensed); desktop or network | No dedicated tool; supports any RA tool | No dedicated tool; NIST provides supporting spreadsheets |
| US Federal use | Limited; primarily UK/NATO context; used by US contractors in UK/NATO programmes | Widely referenced; underpins many US ISMS implementations | Mandatory for US federal information systems (FISMA/RMF) |
| Audit readiness | High; tool produces audit-ready documentation trail | Depends on how framework is implemented and documented | High in US federal context; aligns with FISMA/RMF audit requirements |

Table 7: CRAMM v5.1 vs. ISO/IEC 27005:2022 vs. NIST SP 800-30 Rev.1 — methodology comparison.

The practical implications of this comparison are worth spelling out clearly. If you’re a US federal agency or contractor operating under FISMA and the NIST Risk Management Framework, NIST SP 800-30 is your primary tool and CRAMM is likely supplementary at best. If you’re operating in a UK government or NATO context, CRAMM may be a mandatory or strongly preferred methodology for system accreditation.

If you’re building an ISO 27001-aligned ISMS in a commercial context — which describes most private sector organisations in both the US and UK — ISO 27005 is your natural reference framework, and CRAMM can be used as a concrete implementation methodology within it. The key point is that CRAMM produces outputs that satisfy ISO 27001’s risk assessment requirements (Clause 6.1.2) and the full Annex A control set is represented in CRAMM’s countermeasure library. See our guide on ISO 31000 Risk Assessment Framework Explained for how CRAMM’s three stages map to the broader ISO risk management lifecycle.

CRAMM in the United States: When and Why It’s Relevant

CRAMM isn’t a mainstream tool in US domestic cybersecurity practice. NIST SP 800-30, the NIST Cybersecurity Framework (CSF), and FedRAMP dominate the federal landscape, while NIST CSF and SOC 2 dominate commercial practice. So when does CRAMM show up in a US context?

US Contractors Working on UK Government or NATO Programmes

Defence contractors, intelligence community partners, and technology suppliers working on UK MOD, GCHQ, or NATO programmes regularly encounter CRAMM as a system accreditation requirement. The UK National Cyber Security Centre (NCSC) and its predecessor CESG published guidance requiring CRAMM for certain classification levels. US firms bidding on these contracts need practitioners who understand CRAMM’s structure and can produce CRAMM-compliant documentation.

Multinational Financial Services

Banks and financial services firms with operations in both the US and UK may use CRAMM for their UK entity’s information security risk assessments, particularly where UK regulatory expectations or FCA/PRA guidance leans on it. In this context, CRAMM outputs feed into both ISO 27001 ISMS documentation and UK regulatory submissions.

Critical National Infrastructure

Energy, water, and transport operators in the UK (many of which have US parent companies or investors) use CRAMM within the UK Government’s Centre for the Protection of National Infrastructure (CPNI) framework for CNI security. US parent companies overseeing UK subsidiaries therefore need CRAMM literacy at the group risk and internal audit level.

Academic and Professional Certification

CRAMM appears in CISSP study materials, CISM exam prep, and several academic information security curricula as a case study in structured risk assessment methodology. For US-based security professionals pursuing or holding these certifications, CRAMM knowledge is part of the professional landscape even if they never use the tool operationally.

KRIs for CRAMM-Managed Systems: What to Monitor Between Assessments

One of CRAMM’s known weaknesses is that it produces a point-in-time snapshot. A full CRAMM assessment is resource-intensive enough that most organisations do it annually at best. Between formal assessments, you need KRIs to tell you whether your risk posture is holding or deteriorating.

| KRI | What It Measures | Green | Amber | Red | Escalation Action |
| --- | --- | --- | --- | --- | --- |
| Critical risk count | Open CRAMM risk scores ≥ 30 not yet treated | 0 | 1–2 | > 2 | CISO / Board escalation; emergency treatment plan |
| Countermeasure implementation rate | % of Stage 3 countermeasures implemented on schedule | ≥ 90% | 75–89% | < 75% | Resource review; reprioritise treatment backlog |
| Asset register currency | Days since last asset register review | 0–30 days | 31–60 days | > 60 days | Trigger Stage 1 re-assessment; assign asset owner |
| Threat intelligence updates actioned | % of new CRAMM threat library updates reviewed and applied | 100% | 80–99% | < 80% | Security team review; re-score affected assets |
| CRAMM reassessment cycle | Months since last full CRAMM assessment was completed | 0–12 months | 13–18 months | > 18 months | Initiate full reassessment; engage CRAMM facilitator |
| Residual risk acceptance sign-off | % of accepted residual risks with documented CISO / exec sign-off | 100% | 90–99% | < 90% | Escalate unsigned items; freeze system changes until resolved |

Table 8: KRI dashboard for CRAMM-assessed systems with green/amber/red thresholds. Review monthly; escalate on red.
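The Table 8 thresholds can be wired into a small RAG evaluator. The sketch below covers the lower-is-better KRIs only; percentage KRIs such as the countermeasure implementation rate would invert the comparison:

```python
def rag_status(value: float, green_max: float, amber_max: float) -> str:
    """RAG rating for a lower-is-better KRI: green at or below green_max,
    amber at or below amber_max, red above that."""
    if value <= green_max:
        return "green"
    if value <= amber_max:
        return "amber"
    return "red"


# (green_max, amber_max) pairs taken from Table 8.
THRESHOLDS = {
    "critical_risk_count": (0, 2),
    "asset_register_age_days": (30, 60),
    "months_since_reassessment": (12, 18),
}


def evaluate(kris: dict[str, float]) -> dict[str, str]:
    """Rate each reported KRI value against its Table 8 thresholds."""
    return {name: rag_status(value, *THRESHOLDS[name])
            for name, value in kris.items()}
```

Any red result maps to the escalation action in the final column of the table; the function names here are illustrative, not part of any CRAMM tooling.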

The CRAMM reassessment cycle KRI deserves particular attention. CRAMM’s threat library and countermeasure sets are periodically updated by Insight Consulting to reflect new threat intelligence. If you’re running a 2020 assessment against a 2024 threat environment, your risk scores are stale. Build the reassessment trigger into your ISMS management review process rather than waiting for an external audit to flag it. For a broader treatment of KRI design, see our guide on Key Risk Indicators: Design and Implementation.

CRAMM’s Strengths and Limitations: An Honest Assessment

Where CRAMM Genuinely Excels

  • Audit-ready documentation trail. The CRAMM Expert tool generates comprehensive, structured reports that satisfy accreditation requirements out of the box. For organisations facing formal security accreditation processes — particularly in UK government and NATO contexts — this is a significant practical advantage over methods that require you to build documentation from scratch.
  • Proportionality by design. Because risk scores are weighted by asset value, CRAMM naturally directs the most intensive countermeasure requirements to your highest-value assets. This prevents the common failure mode of applying generic baseline controls uniformly regardless of what you’re actually protecting.
  • Comprehensive threat and countermeasure libraries. The built-in libraries reduce reliance on practitioner judgement for what to consider and what to do about it. This is particularly valuable for less experienced teams or for assessments of system types that practitioners haven’t previously worked with.
  • Consistent results across assessors. The structured methodology and tool support mean that two experienced CRAMM practitioners assessing the same system will produce substantially similar results. This consistency is valuable for organisations conducting portfolio-level risk reporting or comparing assessments across business units.

Where CRAMM Has Real Limitations

  • It’s a commercial licensed product. Unlike ISO 27005 or NIST SP 800-30, using CRAMM properly requires a licensed copy of CRAMM Expert software. That cost and licensing dependency is a barrier for smaller organisations and creates a risk around long-term tool availability.
  • The methodology is showing its age in cloud-native environments. CRAMM was designed around on-premises IT infrastructure. Cloud services, containerised applications, serverless architectures, and SaaS supply chains don’t map cleanly to CRAMM’s asset categorisation scheme. Practitioners working in modern environments need to adapt the methodology, which partially negates the standardisation benefit.
  • Full assessments are time and resource intensive. A thorough CRAMM assessment of a complex system can take weeks of facilitated workshop time. This makes it poorly suited to fast-moving development environments or organisations that need to assess a large number of systems on an ongoing basis.
  • Limited community outside UK/NATO contexts. Finding CRAMM-trained practitioners in the US is harder than finding CISSP or CISM holders with NIST framework experience. If you’re building internal capability, the talent pool is narrower than for NIST or ISO 27005-based approaches.

Running a CRAMM Assessment: A Practical Step-by-Step

If you’re about to facilitate your first CRAMM assessment — or you’re preparing to commission one — here’s what the process looks like in practice:

Step 1: Define the scope boundary. Agree precisely which system, information flow, or organisational unit the assessment covers. CRAMM is thorough; scope creep is expensive. Document the boundary in writing before the first workshop.

Step 2: Identify and catalogue assets. Work through the CRAMM asset categories systematically. Involve data owners and system owners — they know what the data actually is and what it’s used for. Don’t rely on IT alone for this step; the business impact context comes from the business.

Step 3: Score asset values. For each data asset, conduct a structured impact assessment across confidentiality, integrity, and availability. Use the CRAMM 1–10 scale and document the rationale for each score. These scores should be reviewed and signed off by the asset owner.

Step 4: Work through the threat library. Select applicable threats from the CRAMM library for each asset type. For each applicable threat, score the threat likelihood in your environment and the vulnerability of the asset given current controls. The CRAMM Expert tool calculates the risk score from these inputs.

Step 5: Review Stage 2 outputs before proceeding. Before moving to countermeasure selection, review the risk score distribution. Are there surprises? Are asset values calibrated correctly? Stage 2 outputs that look implausible usually indicate miscalibrated asset values (Stage 1) or threat and vulnerability scores (Stage 2), not errors in the model.
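One quick sanity check at this step is a risk-band histogram: a distribution where almost everything lands in Critical, or nothing does, is usually the calibration warning described above. A sketch, with band labels taken from the Table 5 annotations:

```python
from collections import Counter


def risk_distribution(scores: list[int]) -> Counter:
    """Count Stage 2 risk scores per band so implausible clusters stand
    out before Stage 3 begins. Band thresholds follow Table 5's labels."""

    def level(score: int) -> str:
        if score <= 2:
            return "Negligible"
        if score <= 6:
            return "Low"
        if score <= 8:
            return "Medium"
        if score <= 12:
            return "High"
        return "Critical"

    return Counter(level(s) for s in scores)
```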

Step 6: Generate and review countermeasure recommendations. Run the Stage 3 countermeasure selection. Review the recommendations for feasibility and prioritise by risk score. Not every recommended countermeasure will be practically implementable — document acceptance decisions for those you don’t implement.

Step 7: Produce the risk treatment plan. Convert Stage 3 outputs into a time-bound implementation plan with owners, due dates, and review milestones. This plan is your CRAMM deliverable and the document your accreditor or auditor will scrutinise.

Step 8: Schedule the next assessment. Agree the reassessment trigger — either a fixed annual cycle or event-based (significant system change, new threat intelligence, security incident). Set the KRI thresholds in Table 8 and assign monitoring ownership.
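The Step 8 trigger logic is simple enough to automate. The sketch below fires on either a fixed cycle or a qualifying event; the event names are illustrative, not CRAMM terminology, and the month arithmetic ignores the day of the month for simplicity:

```python
from datetime import date

# Illustrative event names; not CRAMM terminology.
REASSESS_EVENTS = {"major_system_change", "new_threat_intelligence",
                   "security_incident"}


def reassessment_due(last_assessment: date, today: date,
                     events: set[str], cycle_months: int = 12) -> bool:
    """True when a full CRAMM reassessment should be initiated: the fixed
    cycle has elapsed, or a qualifying event has occurred. Whole-month
    arithmetic only (day of month is ignored)."""
    months_elapsed = ((today.year - last_assessment.year) * 12
                      + (today.month - last_assessment.month))
    return months_elapsed >= cycle_months or bool(events & REASSESS_EVENTS)
```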

Related Guides

CRAMM doesn’t operate in isolation. These guides cover the adjacent frameworks and techniques that sit alongside it in a mature information security and risk management programme:

•  ISO 31000 Risk Assessment Framework Explained — How CRAMM’s three-stage structure maps to the ISO 31000 risk management lifecycle.

•  Business Impact Analysis: A Complete Framework — Stage 1 asset valuation and BIA share the same business impact logic; here’s how to run a structured BIA.

•  Key Risk Indicators: Design and Implementation Guide — Turn your CRAMM residual risk profile into a monitored KRI dashboard.

•  Monte Carlo Simulation for Risk Analysis — Extend CRAMM’s point estimates into probabilistic ranges for board-level scenario analysis.

•  NUDD Analysis: Engineering Hazard Identification — Complement CRAMM’s information security focus with structured process hazard identification for OT/ICS environments.

•  Business Continuity Planning for Technology Teams — Translate CRAMM availability risk scores into BCP recovery objectives and tested recovery plans.

Download the Free CRAMM Risk Assessment Template

The asset value worksheet, threat assessment log, risk calculation matrix, and Stage 3 countermeasure tracking template from this article are available as a free downloadable Excel file at riskpublishing.com/cramm-risk-assessment-template. The file includes the 1–10 asset value scale with impact descriptors, a threat-vulnerability scoring worksheet with formulas for automatic risk score calculation, the risk matrix in Table 5 with conditional formatting, and the KRI dashboard from Table 8 with amber/red thresholds pre-configured.

If you’re preparing for a CRAMM-based accreditation, advising a client on methodology selection, or integrating CRAMM into an ISO 27001 ISMS, the contact page is the place to start a more detailed conversation.

Sources & Further Reading

1. Insight Consulting — CRAMM Official Product Page — Commercial CRAMM licensee and tool provider.

2. ISO/IEC 27005:2022 — Information Security Risk Management — International Organization for Standardization.

3. NIST SP 800-30 Rev.1 — Guide for Conducting Risk Assessments — National Institute of Standards and Technology.

4. ISO/IEC 27001:2022 — Information Security Management Systems — International Organization for Standardization.

5. NIST Cybersecurity Framework 2.0 — National Institute of Standards and Technology.

6. UK NCSC — Risk Management Guidance — UK National Cyber Security Centre.

7. CPNI — Security Risk Management — Centre for the Protection of National Infrastructure (UK).

8. CISA — Risk and Vulnerability Assessments — Cybersecurity and Infrastructure Security Agency (US).