| Key Takeaways |
| --- |
| A well-structured RCSA template transforms risk identification from a vague brainstorming exercise into a repeatable, auditable process that connects risks to controls and owners. |
| Facilitated workshops consistently outperform survey-only approaches because they surface hidden risks through cross-functional dialogue and real-time challenge. |
| A 5×5 scoring matrix with calibrated likelihood and impact definitions eliminates the subjectivity that undermines most self-assessments. |
| Control effectiveness scoring must assess both design adequacy and operating performance to give an accurate picture of residual risk. |
| Organizations that align their RCSA template to ISO 31000 and COSO ERM create a shared language between first-line risk owners, second-line oversight, and third-line assurance. |
| The OCC found in 2024 that over half of large US banks had weaknesses in operational risk frameworks — evidence that static, checkbox RCSAs leave material gaps. |
| Embedding trigger-based reassessment alongside periodic cycles keeps the RCSA dynamic rather than a once-a-year compliance artifact. |
A 2024 RMA and PwC survey found that 61 percent of financial institutions use a blend of workshops and offline analysis to conduct their risk and control self-assessments.
Yet, the same survey revealed that only 20 percent of respondents have adopted modern tools such as AI to support the process.
The gap between aspiration and execution remains wide. Most organizations know they need a robust RCSA framework, but too many still rely on ad hoc spreadsheets, inconsistent scoring scales, and workshops that produce more confusion than clarity.
This article delivers a practitioner-ready RCSA template you can deploy immediately. The guide walks through every phase of a facilitated workshop, provides a calibrated 5×5 scoring matrix with defined likelihood and impact scales, and shows how to assess control effectiveness using both design and operating criteria.
Each component aligns to ISO 31000 risk management principles and COSO ERM governance expectations, giving your program credibility with regulators, auditors, and the board.
Deloitte’s 2025 analysis noted that the RCSA consumes more first- and second-line resource time than any other operational risk component, yet consistently ranks lowest in perceived value.
The root cause is not the concept itself but poor template design, weak facilitation, and ambiguous scoring criteria.
The sections that follow address each of those failure points with actionable frameworks, ready-to-use tables, and a 90-day implementation roadmap.
What an RCSA Template Must Include
An effective RCSA template does more than list risks in a column. The template must create a clear chain from business objectives through risks, controls, residual ratings, and action plans.
According to the Three Lines Model, first-line managers own the risks and controls, second-line risk functions provide methodology and challenge, and third-line audit gives independent assurance. Your template should enforce these accountability boundaries from the first field onward.
Core Template Fields
| Field | Description | Owner | Standards Reference |
| --- | --- | --- | --- |
| Process / Activity | Name of the business process being assessed | 1st Line Process Owner | ISO 31000 Clause 6.3.2 |
| Business Objective | Strategic or operational objective the process supports | 1st Line Process Owner | COSO ERM Principle 6 |
| Risk ID | Unique identifier for tracking across registers | 2nd Line Risk Function | ISO 31000 Clause 6.4 |
| Risk Description | Cause-Event-Consequence statement following structured format | 1st Line with 2nd Line challenge | ISO 31000 Clause 6.4.2 |
| Risk Category | Classification aligned to enterprise risk taxonomy | 2nd Line Risk Function | COSO ERM Principle 10 |
| Inherent Likelihood (1–5) | Probability before controls, using calibrated scale | Workshop Consensus | ISO 31010 Clause 5 |
| Inherent Impact (1–5) | Severity across financial, operational, reputational, and compliance dimensions | Workshop Consensus | ISO 31010 Clause 5 |
| Inherent Risk Score | Likelihood × Impact (1–25) | Auto-calculated | ISO 31000 Clause 6.4.3 |
| Control ID | Unique identifier for each control | 2nd Line Risk Function | COSO IC Principle 10 |
| Control Description | Specific action or mechanism that mitigates the risk | 1st Line Control Owner | COSO IC Principle 10 |
| Control Type | Preventive, Detective, or Corrective | 2nd Line Risk Function | COSO IC Framework |
| Design Effectiveness (1–5) | Rating of how well the control is designed to mitigate the risk | 2nd Line Assessment | IIA Standard 2130 |
| Operating Effectiveness (1–5) | Rating of how consistently the control performs in practice | 1st Line with 2nd Line validation | IIA Standard 2130 |
| Residual Likelihood (1–5) | Probability after controls | Workshop Consensus | ISO 31000 Clause 6.4.4 |
| Residual Impact (1–5) | Severity after controls | Workshop Consensus | ISO 31000 Clause 6.4.4 |
| Residual Risk Score | Residual Likelihood × Residual Impact | Auto-calculated | ISO 31000 Clause 6.4.4 |
| Risk Appetite Comparison | Within / Approaching / Exceeding appetite | 2nd Line Risk Function | COSO ERM Principle 7 |
| Action Plan | Specific remediation steps for risks exceeding appetite | 1st Line Action Owner | ISO 31000 Clause 6.5 |
| Due Date | Target completion date for action plan | 1st Line Action Owner | COSO ERM Principle 17 |
| Status | Open / In Progress / Closed / Overdue | 1st Line with 2nd Line monitoring | IIA Standard 2500 |
This template structure ensures traceability from business objective to residual risk to action plan.
The structured risk description format (Cause → Event → Consequence) prevents the common mistake of confusing causes with risks or risks with impacts. You can find detailed guidance on building risk registers and risk taxonomies in our supporting guides.
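The auto-calculated score fields in the template are simple products of the 1–5 ratings. A minimal Python sketch of one register row (field names are illustrative, not prescribed by the template) shows the calculation and enforces the calibrated 1–5 scale:

```python
from dataclasses import dataclass

@dataclass
class RcsaRow:
    """One line of the RCSA register (illustrative field names)."""
    risk_id: str
    inherent_likelihood: int   # 1-5, calibrated likelihood scale
    inherent_impact: int       # 1-5, worst credible impact across dimensions
    residual_likelihood: int   # 1-5, after controls
    residual_impact: int       # 1-5, after controls

    def __post_init__(self):
        for v in (self.inherent_likelihood, self.inherent_impact,
                  self.residual_likelihood, self.residual_impact):
            if not 1 <= v <= 5:
                raise ValueError("ratings must be on the 1-5 scale")

    @property
    def inherent_score(self) -> int:
        """Likelihood x Impact before controls (1-25)."""
        return self.inherent_likelihood * self.inherent_impact

    @property
    def residual_score(self) -> int:
        """Likelihood x Impact after controls (1-25)."""
        return self.residual_likelihood * self.residual_impact

row = RcsaRow("OPS-014", 4, 3, 2, 3)
print(row.inherent_score, row.residual_score)  # 12 6
```

In a spreadsheet-based template the same logic is a formula cell; the point is that the score fields should never be typed by hand, so inherent and residual scores stay consistent with the underlying ratings.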
The 5×5 RCSA Scoring Matrix
Scoring consistency is the single biggest determinant of RCSA quality. Without calibrated definitions, one assessor’s “medium likelihood” is another’s “high.”
The risk assessment matrix below provides anchored definitions that reduce subjectivity and support aggregation across business units.
Likelihood Scale
| Rating | Label | Frequency Anchor | Probability Anchor |
| --- | --- | --- | --- |
| 1 | Rare | Less than once in 10 years | < 5% chance in next 12 months |
| 2 | Unlikely | Once every 5–10 years | 5–15% chance in next 12 months |
| 3 | Possible | Once every 1–5 years | 15–40% chance in next 12 months |
| 4 | Likely | Once per year or more frequently | 40–70% chance in next 12 months |
| 5 | Almost Certain | Multiple times per year | > 70% chance in next 12 months |
Impact Scale
| Rating | Financial | Operational | Reputational | Compliance |
| --- | --- | --- | --- | --- |
| 1 – Negligible | < $50K loss | Minor process delay; no service disruption | No media coverage; internal only | Administrative finding; no regulatory action |
| 2 – Minor | $50K–$250K loss | Short-term workaround required; < 4 hours downtime | Local or trade media mention | Formal regulatory inquiry; no penalty |
| 3 – Moderate | $250K–$1M loss | Significant disruption; 4–24 hours recovery | National media coverage; manageable | Regulatory penalty < $1M or MRA |
| 4 – Major | $1M–$10M loss | Critical process failure; 1–7 days recovery | Sustained negative media; customer attrition | Consent order or penalty $1M–$10M |
| 5 – Catastrophic | > $10M loss | Extended outage > 7 days; business continuity invoked | Systemic brand damage; leadership change | License revocation or penalty > $10M |
Risk Rating Matrix (Likelihood × Impact)
| Likelihood / Impact | 1 Negligible | 2 Minor | 3 Moderate | 4 Major | 5 Catastrophic |
| --- | --- | --- | --- | --- | --- |
| 5 – Almost Certain | 5 (Medium) | 10 (High) | 15 (High) | 20 (Critical) | 25 (Critical) |
| 4 – Likely | 4 (Low) | 8 (Medium) | 12 (High) | 16 (High) | 20 (Critical) |
| 3 – Possible | 3 (Low) | 6 (Medium) | 9 (Medium) | 12 (High) | 15 (High) |
| 2 – Unlikely | 2 (Low) | 4 (Low) | 6 (Medium) | 8 (Medium) | 10 (High) |
| 1 – Rare | 1 (Low) | 2 (Low) | 3 (Low) | 4 (Low) | 5 (Medium) |
Risk bands: Low (1–4) = accept and monitor. Medium (5–9) = treat within existing resources. High (10–16) = escalate to senior management with action plan. Critical (17–25) = immediate board reporting and urgent remediation.
These bands should reflect your organization’s risk appetite statement and be recalibrated annually.
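The matrix and its bands reduce to a short function, which is useful when aggregating scores across business units or driving dashboard colors. This sketch uses the band thresholds defined above (Low 1–4, Medium 5–9, High 10–16, Critical 17–25); substitute your own appetite-calibrated cut-offs:

```python
def risk_band(likelihood: int, impact: int) -> tuple[int, str]:
    """Score a risk on the 5x5 matrix and map it to a band.

    Thresholds mirror the banding above; recalibrate them to your
    organization's risk appetite statement.
    """
    if not (1 <= likelihood <= 5 and 1 <= impact <= 5):
        raise ValueError("likelihood and impact must be on the 1-5 scale")
    score = likelihood * impact
    if score >= 17:
        band = "Critical"   # immediate board reporting, urgent remediation
    elif score >= 10:
        band = "High"       # escalate to senior management with action plan
    elif score >= 5:
        band = "Medium"     # treat within existing resources
    else:
        band = "Low"        # accept and monitor
    return score, band

print(risk_band(4, 3))  # (12, 'High')
```

Because every cell in the matrix is just likelihood × impact checked against the same thresholds, the function reproduces the table exactly — a quick way to validate that a spreadsheet implementation and the published matrix agree.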
Control Effectiveness Scoring Framework
Scoring controls is where most RCSA programs stumble. A simple “effective / partially effective / ineffective” scale lacks granularity.
Splitting the assessment into design effectiveness and operating effectiveness, as recommended by the IIA Global Internal Audit Standards, gives a far richer picture of the control environment and directly feeds into the residual risk calculation.
Design Effectiveness Scale
| Rating | Label | Criteria | Example |
| --- | --- | --- | --- |
| 5 | Robust | Control specifically addresses the root cause; fully automated where feasible; clear ownership and escalation path | Automated three-way match in AP with exception-based approval workflow |
| 4 | Adequate | Control addresses the key risk drivers; mostly automated; documented procedures exist | Segregation of duties enforced through system roles with quarterly access review |
| 3 | Needs Improvement | Control partially addresses the risk; manual steps create vulnerability; documentation gaps | Manual journal entry review with checklist but no independent verification |
| 2 | Weak | Control has significant design gaps; relies on individual judgment; no escalation path | Verbal approval process with no audit trail for high-value transactions |
| 1 | Absent / Ineffective | No meaningful control exists or the design does not address the risk | No reconciliation process between sub-ledger and general ledger |
Operating Effectiveness Scale
| Rating | Label | Criteria | Example |
| --- | --- | --- | --- |
| 5 | Consistently Effective | Control operates as designed > 95% of the time; no exceptions in last 12 months | System-enforced limit checks with zero overrides in audit sample |
| 4 | Mostly Effective | Control operates as designed 85–95% of the time; minor deviations self-corrected | Monthly reconciliation completed on time 11 of 12 months |
| 3 | Partially Effective | Control operates 70–85% of the time; recurring deviations noted but not systemic | Dual authorization performed but with documented workarounds during peak periods |
| 2 | Unreliable | Control operates < 70% of the time; systemic non-compliance observed | Incident reporting process exists but staff bypass the procedure routinely |
| 1 | Non-Operating | Control is not being performed despite being documented | Data backup procedure documented but no backups have run in 6 months |
Composite Control Score: Multiply Design (1–5) by Operating (1–5) to get a composite score from 1 to 25. Scores below 12 should trigger a remediation action plan.
This composite approach, aligned to COSO Internal Control principles, gives a more defensible basis for residual risk adjustments than a single subjective rating.
| Composite Score Range | Rating | Residual Risk Adjustment | Required Action |
| --- | --- | --- | --- |
| 20–25 | Strong | Reduce inherent scores by up to 2 levels per dimension | Monitor and maintain; include in periodic testing |
| 12–19 | Satisfactory | Reduce inherent scores by 1 level per dimension | Monitor with enhanced frequency; address any gaps |
| 6–11 | Needs Improvement | Minimal reduction; residual stays close to inherent | Formal action plan with 60-day deadline |
| 1–5 | Unsatisfactory | No reduction; residual equals inherent | Immediate escalation; 30-day remediation or risk acceptance by senior management |
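The composite-to-adjustment logic in the table above can be sketched in a few lines. One assumption is made explicit here: the "minimal reduction" for the 6–11 band is treated as zero levels, since the table does not quantify it — your policy may permit a partial step:

```python
def composite_control_score(design: int, operating: int) -> int:
    """Design effectiveness (1-5) x Operating effectiveness (1-5) -> 1-25."""
    if not (1 <= design <= 5 and 1 <= operating <= 5):
        raise ValueError("ratings must be on the 1-5 scale")
    return design * operating

def residual_adjustment(composite: int) -> int:
    """Maximum levels each inherent dimension may be reduced.

    Per the table above: Strong (20-25) -> up to 2, Satisfactory (12-19) -> 1.
    'Minimal reduction' (6-11) and 'no reduction' (1-5) are both modeled as
    zero here -- a simplifying assumption, not part of the framework.
    """
    if composite >= 20:
        return 2
    if composite >= 12:
        return 1
    return 0

def apply_controls(inherent_rating: int, composite: int) -> int:
    """Adjust one inherent dimension (likelihood or impact), floored at 1."""
    return max(1, inherent_rating - residual_adjustment(composite))

# Strong controls (5 x 5 = 25) pull an inherent likelihood of 4 down to 2:
print(apply_controls(4, composite_control_score(5, 5)))  # 2
```

Encoding the adjustment rule this way makes residual scores reproducible and auditable: given the same inherent ratings and control scores, two assessors arrive at the same residual position.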
RCSA Workshop Facilitation Guide
The facilitated workshop remains the gold standard for RCSA data collection. The RMA/PwC survey confirmed that organizations using workshops alongside offline analysis produce higher-quality assessments than those relying on questionnaires alone.
Below is a step-by-step guide to planning and running an effective risk assessment workshop.
Pre-Workshop Preparation (2–3 Weeks Before)
| Task | Detail | Responsible |
| --- | --- | --- |
| Define scope and objectives | Identify which processes, business units, or risk categories the workshop will cover. Align scope to strategic objectives and recent audit findings. | 2nd Line Risk Function |
| Select participants | Include first-line process owners, subject matter experts, compliance representatives, and IT/security staff as relevant. Target 6–12 participants for productive dialogue. | Workshop Facilitator |
| Distribute pre-read materials | Share process maps, previous RCSA results, recent incident data, KRI dashboards, and audit reports. Participants should arrive informed. | 2nd Line Risk Function |
| Prepare scoring criteria handout | Print the likelihood scale, impact scale, and control effectiveness scales for each participant. Consistent reference material reduces calibration drift. | Workshop Facilitator |
| Set up risk register template | Pre-populate known risks from prior cycles. Leave space for newly identified risks. Use the template fields defined in Section 1. | 2nd Line Risk Function |
| Book venue and logistics | Reserve a room with a projector and whiteboard. Plan for 3–4 hours. Virtual workshops need a collaboration platform with real-time editing capability. | Administrative Support |
Workshop Agenda (3–4 Hours)
| Time | Activity | Method | Output |
| --- | --- | --- | --- |
| 0:00–0:15 | Opening and ground rules | Facilitator presents objectives, confidentiality, and scoring approach | Participant alignment on workshop purpose |
| 0:15–0:45 | Process walk-through | Process owner presents end-to-end workflow with dependencies | Shared understanding of process scope |
| 0:45–1:30 | Risk identification | Structured brainstorm using Cause-Event-Consequence format; group risks by taxonomy category | Draft risk register with new and existing risks |
| 1:30–1:45 | Break | — | — |
| 1:45–2:30 | Inherent risk scoring | Facilitator guides group through each risk using calibrated scales; discussion and consensus | Inherent likelihood and impact scores for each risk |
| 2:30–3:15 | Control identification and effectiveness scoring | Map existing controls to each risk; assess design and operating effectiveness using dual scale | Control inventory with composite effectiveness scores |
| 3:15–3:45 | Residual risk scoring and action planning | Calculate residual scores; compare to risk appetite; assign action owners and deadlines for risks exceeding appetite | Completed RCSA register with action plans |
| 3:45–4:00 | Wrap-up and next steps | Summarize key findings; confirm action plan owners; set follow-up review date | Workshop minutes and action tracker |
A successful facilitator maintains neutrality while actively challenging groupthink.
Use techniques such as silent brainstorming (each participant writes risks on sticky notes before group discussion), devil’s advocate questioning, and explicit calibration checks (asking participants to justify their rating against the anchor definitions).
These methods align with ISO 31010 guidance on risk assessment techniques and improve the reliability of workshop outputs.
RCSA Approaches Compared: Workshop vs. Questionnaire vs. Hybrid
Organizations typically choose among three primary approaches to conduct RCSAs. Each has strengths depending on organizational culture, scale, and risk maturity.
The comparison below helps you select the right approach or, more commonly, the right combination for your program.
| Dimension | Facilitated Workshop | Questionnaire / Survey | Hybrid Approach |
| --- | --- | --- | --- |
| Data richness | High – real-time discussion captures nuance, hidden risks, and interdependencies | Medium – structured responses but limited depth | High – combines breadth of survey with depth of workshop |
| Resource intensity | High – requires facilitator preparation, participant time, and follow-up | Low – can be distributed at scale with minimal facilitator time | Medium – surveys reduce workshop time; focused sessions address gaps |
| Scalability | Limited – practical for 6–12 participants per session | High – can reach hundreds of respondents across geographies | High – surveys cover breadth; workshops target material risks |
| Consistency | Moderate – dependent on facilitator skill and calibration materials | High – standardized questions produce comparable data | High – survey provides baseline; workshop addresses outliers |
| Engagement quality | High – encourages ownership and cross-functional dialogue | Low – survey fatigue and checkbox mentality are common | Medium-High – balances participation burden with meaningful discussion |
| Best suited for | Complex, high-risk processes; initial RCSA deployment; regulatory-sensitive areas | Mature programs needing periodic refresh; lower-risk business units | Large organizations balancing depth with coverage requirements |
Wolters Kluwer’s 2025 analysis recommends the hybrid model as best practice for most organizations.
Use questionnaires to gather baseline self-assessments across all business units, then target facilitated workshops on processes where inherent risk scores exceed appetite thresholds or where control gaps have been identified.
This approach maximizes coverage without burning out first-line participants. Organizations building an enterprise risk management framework should embed the RCSA cadence into their annual risk management lifecycle.
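The hybrid triage described above — survey everyone, then send only the risky units to a workshop — is straightforward to automate. A minimal sketch, assuming an appetite threshold of 12 and simple dict records (both illustrative, not values the article prescribes):

```python
APPETITE_THRESHOLD = 12  # illustrative appetite boundary, not a standard value

def triage(register: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split baseline survey results into workshop candidates and survey-only units.

    A unit goes to a facilitated workshop if its inherent score exceeds the
    appetite threshold or a control gap was flagged in the questionnaire.
    """
    workshop, survey_only = [], []
    for entry in register:
        if entry["inherent_score"] > APPETITE_THRESHOLD or entry["control_gap"]:
            workshop.append(entry)
        else:
            survey_only.append(entry)
    return workshop, survey_only

baseline = [
    {"unit": "Payments", "inherent_score": 16, "control_gap": False},
    {"unit": "HR",       "inherent_score": 6,  "control_gap": False},
    {"unit": "Treasury", "inherent_score": 9,  "control_gap": True},
]
workshop, survey_only = triage(baseline)
print([e["unit"] for e in workshop])  # ['Payments', 'Treasury']
```

In practice this runs against the enterprise risk register export after each survey cycle, so workshop capacity is spent only where residual exposure or control weakness justifies it.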
Integrating RCSA Outputs into the ERM Framework
The RCSA does not exist in isolation. Completed RCSA registers should feed directly into the enterprise risk register, inform KRI thresholds, and shape scenario analysis programs. Below is the integration model.
| RCSA Output | Integration Point | Value Added |
| --- | --- | --- |
| Risk register entries | Enterprise risk register aggregation | Bottom-up risk data validates top-down strategic risk assessment |
| Inherent and residual scores | Risk appetite monitoring and board reporting dashboards | Quantified risk positions enable traffic-light reporting and trend analysis |
| Control effectiveness ratings | Internal audit planning and SOX/compliance testing prioritization | Low-rated controls drive audit focus; reduces duplication of effort |
| Action plans | Issues and actions register with closure tracking | Central visibility on remediation progress; aging analysis for overdue items |
| Risk identification insights | Scenario analysis and stress testing inputs | Workshop-surfaced tail risks inform forward-looking scenario design |
| KRI recommendations | KRI dashboard configuration with thresholds and escalation rules | Operational KRIs linked to assessed risks create early warning capability |
| Process dependency maps | Business impact analysis and BCP planning | Dependencies identified in RCSA feed RTO/RPO calculations |
The GRC framework should provide the governance architecture connecting these outputs. Quarterly RCSA updates feed the risk committee reporting cycle, while dynamic trigger-based reassessments respond to material events such as regulatory changes, M&A activity, or significant incidents.
Capco’s 2025 research on operational resilience confirms that mature institutions embed RCSA results into their operational resilience programs through impact tolerance mapping and important business service identification.
RCSA Reporting and Governance
Completed RCSAs must translate into actionable intelligence for decision-makers. Board and senior management reporting should follow the “What, So What, Now What” structure: present the risk profile (what), explain the implications and trends (so what), and recommend specific decisions or actions (now what).
A risk quantification approach for boards turns heatmap colors into financial exposure ranges that executives can act on.
RCSA Reporting Frequency and Audience
| Report Type | Frequency | Audience | Content |
| --- | --- | --- | --- |
| RCSA Dashboard Summary | Monthly | Risk Committee / CRO | Heatmap of current residual risk positions; trend arrows; top 10 risks; overdue action count |
| Detailed RCSA Results | Quarterly | Business Unit Leadership / 1st Line Heads | Full risk register; control effectiveness ratings; action plan progress; new risks identified |
| Board Risk Report (RCSA Extract) | Quarterly / Semi-Annual | Board Risk Committee | Strategic risk implications; risks exceeding appetite; aggregated residual risk profile; key decisions required |
| Audit / Regulatory Pack | Annual or as triggered | Internal Audit / External Regulators | RCSA methodology; population coverage; testing results; control deficiency trends; remediation status |
Effective RCSA governance also requires clear escalation protocols. Risks scoring Critical (17–25) should trigger immediate notification to the CRO and board risk committee.
Risks scoring High (10–16) require documented treatment plans within 30 days. This cadence aligns with regulatory risk management expectations and Basel III Pillar 2 internal capital adequacy process requirements.
RCSA Implementation Roadmap
| Phase | Actions | Deliverables | Success Metrics |
| --- | --- | --- | --- |
| Days 1–30: Foundation | Map enterprise processes to risk taxonomy. Define scoring scales and control effectiveness criteria. Select pilot business unit. Train facilitators on workshop methodology. Establish RCSA governance charter. | Approved RCSA methodology document. Calibrated 5×5 scoring matrix. Trained facilitator pool. Pilot scope definition. | Methodology document approved by Risk Committee. At least 3 trained facilitators certified. Pilot unit confirmed with management buy-in. |
| Days 31–60: Pilot Execution | Conduct first facilitated workshop with pilot unit. Complete scoring of all identified risks and controls. Build RCSA dashboard prototype. Validate results with 2nd and 3rd line. Refine template based on lessons learned. | Completed pilot RCSA register. Draft dashboard with heatmap and action tracker. Facilitator debrief report with refinements. Updated template reflecting pilot feedback. | Pilot unit RCSA completed within 2 workshop sessions. 100% of identified risks scored and mapped to controls. Dashboard reviewed and accepted by CRO. |
| Days 61–90: Enterprise Rollout | Deploy refined template to remaining business units. Schedule workshops by risk priority. Integrate RCSA data into enterprise risk register. Establish quarterly refresh cadence and trigger-based reassessment protocol. | Enterprise RCSA deployment plan with timeline. Integrated risk register reflecting RCSA inputs. Board-ready reporting template. Trigger-based reassessment protocol documented. | At least 3 additional business units completed. RCSA data feeding enterprise risk register. First quarterly board report produced. Trigger criteria documented and communicated. |
Common RCSA Pitfalls and How to Avoid Them
| Pitfall | Root Cause | Remedy |
| --- | --- | --- |
| Checkbox mentality: RCSA becomes a compliance exercise with no real risk insight | Lack of senior management engagement; RCSA disconnected from business decisions | Link RCSA outputs directly to capital allocation, audit planning, and strategic risk reporting. Require management sign-off on residual risk acceptance. |
| Inconsistent scoring across business units | No calibrated scales; facilitators interpret scales differently | Deploy the anchored scales in this guide. Run calibration sessions where multiple units score the same test scenario and compare results. |
| Confusing causes with risks | Poor risk description methodology; participants describe symptoms rather than events | Enforce the Cause → Event → Consequence format in every risk description. Train participants with examples before the workshop begins. |
| Overestimating impact and underestimating likelihood | Anchoring bias; failure to consider incident management and recovery capabilities | Use frequency-based likelihood anchors. Remind assessors that impact ratings should reflect net impact after incident response, not theoretical worst case. |
| Static, annual-only assessments | No trigger-based reassessment mechanism; RCSA treated as a point-in-time snapshot | Define explicit triggers: regulatory changes, new products, significant incidents, M&A. Embed dynamic reassessment in the RCSA governance charter. |
| Control effectiveness assessed superficially | Single-dimension rating; no distinction between design and operating effectiveness | Use the dual-scale framework (design × operating) defined in this guide. Require evidence-based ratings supported by testing or audit results. |
| Workshop fatigue and low participation quality | Over-long sessions; too many participants; no pre-read materials distributed | Limit workshops to 3–4 hours with 6–12 participants. Distribute pre-reads 2 weeks in advance. Use the hybrid approach for lower-risk units. |
| No linkage to enterprise risk register | RCSA results stored in standalone spreadsheets with no upstream integration | Establish data feeds from RCSA output to the enterprise risk register and KRI dashboard. Use a GRC platform or standardized Excel model with validation rules. |
Looking Ahead: RCSA Trends for 2025–2027
The RCSA landscape is shifting rapidly. Regulatory expectations continue to rise, with the OCC, PRA, and ECB all emphasizing the need for forward-looking, data-driven operational risk assessments rather than static compliance artifacts.
Organizations that treat RCSA as a foundational capability rather than a periodic exercise will gain a competitive advantage in resilience, audit readiness, and stakeholder confidence.
AI-augmented risk identification is the most significant near-term trend. Machine learning models can analyze incident databases, loss event data, and external intelligence feeds to surface emerging risks that human assessors might miss.
The RMA/PwC survey found that only 20 percent of institutions currently use AI in their RCSA programs, but that number is expected to grow substantially as AI risk assessment frameworks mature and GRC platforms integrate natural language processing for automated risk categorization and control mapping.
Continuous risk assessment is replacing periodic cycles. Trigger-based RCSA updates, driven by real-time KRI breaches, incident alerts, and regulatory change notifications, enable organizations to maintain a current risk profile without waiting for the next scheduled review.
This shift aligns with the operational resilience agenda, where regulators expect institutions to demonstrate ongoing awareness of risks to important business services.
Convergence of risk assessments is another critical trend. Deloitte’s 2025 research noted that many organizations run a dozen or more independent risk assessments covering compliance, conduct, resilience, IT risk, cybersecurity, financial crime, and fraud.
Leading institutions are consolidating these into a unified RCSA platform with shared taxonomies and consistent scoring, reducing duplication and improving cross-functional risk visibility.
The path forward demands a well-designed RCSA template that can flex across risk domains while maintaining methodological consistency.
Ready to deploy a professional RCSA program? Visit riskpublishing.com for practitioner-developed templates, workshop facilitation guides, and consulting services to accelerate your implementation.
Explore our risk assessment library and contact us to discuss how we can support your organization’s risk management maturity journey.
References
1. ISO 31000:2018 – Risk Management Guidelines — International Organization for Standardization
2. COSO Enterprise Risk Management – Integrating with Strategy and Performance (2017) — Committee of Sponsoring Organizations
3. ISO 31010:2019 – Risk Assessment Techniques — International Organization for Standardization
4. IIA Global Internal Audit Standards (2024) — Institute of Internal Auditors
5. Basel III Pillar 2 – Supervisory Review Process — Basel Committee on Banking Supervision
6. Principles for the Sound Management of Operational Risk (2021) — Basel Committee on Banking Supervision
7. RCSA Best Practices for Effective Risk Management (2025) — Wolters Kluwer
8. The Ten Steps to RCSA Redemption (2025) — Deloitte UK
9. How Banks Are Refining RCSA Programs (2024) — Risk Management Association / PwC
10. OCC Finds Weak Risk Management at Half of Large Banks (2024) — Reuters
11. Building Operational Resilience Through RCSA (2025) — Capco
12. RCSA Practice Benchmark Study — ORX (Operational Riskdata eXchange)
13. Six Critical Factors to Modernize Your RCSA Program — MetricStream
14. RCSA Best Practices for Financial Services (2024) — Forvis Mazars

Chris Ekai is a Risk Management expert with over 10 years of experience in the field. He has a Master’s (MSc) degree in Risk Management from the University of Portsmouth and is a CPA and Finance professional. He currently works as a Content Manager at Risk Publishing, writing about Enterprise Risk Management, Business Continuity Management, and Project Management.
