Appendix E: Quick Reference Cards

Regulatory Technology (RegTech)

Condensed reference material for practitioners. Each card covers a key domain, framework, or concept from the textbook. Designed for field use — when you need the key points quickly, without re-reading a full chapter.


Card 01: AML Transaction Monitoring — Key Metrics

| Metric | Definition | Indicative Targets | Red Flag |
| --- | --- | --- | --- |
| False Positive Rate | Alerts closed as non-suspicious / total alerts | < 30% (AI-driven); < 50% (rules-based) | > 50% → recalibrate thresholds |
| True Positive Rate (Recall) | True suspicious transactions caught / all suspicious | Context-dependent; track trend | Declining trend → model drift |
| Alert-to-SAR Rate | SARs filed / total alerts | 1–5% (higher-volume systems) | < 0.5% → threshold too high |
| SARs Filed on Time | SARs filed within regulatory deadline | 100% | Any breach = regulatory event |
| Population Stability Index (PSI) | Distribution shift in model inputs | < 0.10 (stable) | > 0.25 → recalibration required |
| Alert Review SLA | % of alerts reviewed within target (e.g., 5 days) | ≥ 98% | < 90% → capacity issue |

PSI Interpretation:
- < 0.10: Stable — no action required
- 0.10–0.25: Minor shift — increase monitoring frequency
- > 0.25: Major shift — recalibrate model
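PSI can be computed directly from binned input distributions. A minimal sketch; the bin proportions below are illustrative, not taken from any real system:

```python
import math

def population_stability_index(expected, actual):
    """PSI between two binned distributions (lists of proportions summing to 1).

    PSI = sum((a - e) * ln(a / e)) over bins; proportions are floored at a
    small epsilon so empty bins do not produce log(0).
    """
    eps = 1e-6
    return sum(
        (max(a, eps) - max(e, eps)) * math.log(max(a, eps) / max(e, eps))
        for e, a in zip(expected, actual)
    )

# Baseline (model development) vs. current production distribution, 5 bins
baseline = [0.20, 0.25, 0.30, 0.15, 0.10]
current  = [0.10, 0.20, 0.30, 0.25, 0.15]
psi = population_stability_index(baseline, current)
```

For these illustrative bins the PSI lands in the 0.10–0.25 band, i.e. a minor shift that warrants closer monitoring rather than immediate recalibration.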


Card 02: DORA Major Incident Notification Timeline

| Phase | Trigger | Deadline | Key Content | Recipient |
| --- | --- | --- | --- | --- |
| Initial notification | Classification of major incident | 4 hours after classification | Incident classification; brief initial assessment; whether actual or potential impact on financial interests | Competent authority (e.g., FCA, PRA, EBA-regulated NCA) |
| Intermediate report | After initial notification | 72 hours after classification | Root cause analysis; current status; measures taken; cross-border impact | Same as initial |
| Final report | After recovery | 30 days after classification (complex incidents: up to 1 year) | Full root cause analysis; impacts; lessons learned; follow-up measures | Same as initial |

Key DORA Concepts:
- Classification trigger: Date of detection, not date of recovery
- Major incident criteria: Defined in Commission Delegated Regulation (thresholds for affected users, duration, geographic spread, economic impact)
- Management body: Must be "adequately trained" on ICT risk (Article 5)

UK equivalent (FCA/PRA): PRIN 11 notification for material incidents. No defined timeline but FCA expects "as soon as practicable." Impact tolerance breach must be documented.
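Because all three deadlines run from the classification timestamp, they can be derived mechanically. An illustrative sketch; it uses the 30-day baseline for the final report (complex incidents may extend to 1 year), and the classification time is invented:

```python
from datetime import datetime, timedelta, timezone

# Deadlines run from classification (4h / 72h / 30 days), per the timeline above
DEADLINES = {
    "initial": timedelta(hours=4),
    "intermediate": timedelta(hours=72),
    "final": timedelta(days=30),  # baseline; complex incidents may take up to 1 year
}

def notification_deadlines(classified_at: datetime) -> dict:
    """Map each notification phase to its due timestamp."""
    return {phase: classified_at + delta for phase, delta in DEADLINES.items()}

classified = datetime(2025, 3, 1, 9, 30, tzinfo=timezone.utc)
due = notification_deadlines(classified)
```

Anchoring the clock to a timezone-aware classification timestamp avoids off-by-hours errors when the incident spans jurisdictions.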


Card 03: GDPR Key Obligations for RegTech Systems

| Obligation | Article | RegTech Application |
| --- | --- | --- |
| Lawful basis | Art. 6 | Monitoring systems must have an identified legal basis (legal obligation, legitimate interest) |
| Data minimisation | Art. 5(1)(c) | Monitoring should use the minimum personal data necessary |
| Purpose limitation | Art. 5(1)(b) | Data collected for AML cannot be repurposed for marketing |
| Accuracy | Art. 5(1)(d) | Customer data in monitoring systems must be kept current |
| Storage limitation | Art. 5(1)(e) | Minimum retention periods; maximum where not required |
| Security | Art. 32 | Appropriate technical and organizational measures |
| Data breach notification | Arts. 33/34 | ICO within 72 hours of awareness; individuals where high risk |
| Data processing agreements | Art. 28 | Required for all processors (RegTech vendors) |
| Right to erasure | Art. 17 | May be overridden by legal retention requirements (Art. 17(3)(b)) |
| Automated decisions | Art. 22 | Right to human review for solely automated decisions with significant effect |

AML vs. GDPR tension: AML regulations require retention for 5 years minimum; GDPR requires no longer than necessary. Resolution: retention for the statutory minimum AML period is justified under Art. 6(1)(c) (legal obligation).


Card 04: EU AI Act — High-Risk AI Quick Reference

Eight Annex III High-Risk Categories (relevant to financial services):
1. Biometric identification and categorization (KYC facial recognition)
2. Critical infrastructure management
3. Educational access
4. Employment, worker management (compliance team screening AI)
5. Access to essential services and benefits (credit decisions)
6. Law enforcement
7. Migration and asylum
8. Administration of justice (AML case decisioning)

High-Risk AI Requirements (Arts. 9–15):

| Requirement | What It Means in Practice |
| --- | --- |
| Risk management system | Documented lifecycle risk management; residual risk acceptable |
| Data governance | Training data: relevant, representative, free from bias; testing data documented |
| Technical documentation | Model card; performance data; architecture description |
| Record-keeping | Automatic logging of system operation where technically feasible |
| Transparency | Clear disclosure that the decision is AI-assisted where relevant |
| Human oversight | Designated human(s) capable of monitoring, understanding, and overriding |
| Accuracy, robustness | Consistent performance across foreseeable operating conditions |
| Cybersecurity | Resilient to manipulation of training data and model inputs |

Conformity assessment: High-risk AI systems must undergo conformity assessment (self-assessment for most; third-party for biometrics and remote identification).


Card 05: Model Risk Management — SR 11-7 Three Pillars

| Pillar | Core Requirement | Key Activities |
| --- | --- | --- |
| Development & Implementation | Models must be built, tested, and implemented with rigor | Conceptual soundness; testing against holdout data; documentation of assumptions and limitations |
| Validation | Independent validation of all models | Performed by a function independent of development; covers conceptual soundness, data analysis, outcomes testing; initial + ongoing |
| Governance | Management oversight of model risk | Model inventory; risk tiering; policy; escalation process; board/senior management awareness |

Key SR 11-7 concepts:
- Model risk: Potential adverse consequences from decisions based on incorrect or misused models
- Model inventory: Required for all models; must include model purpose, use, owner, validation status
- Ongoing monitoring: Not just at deployment; performance must be monitored in production
- Conceptual soundness: Model methodology must be appropriate for its intended use


Card 06: KYC Customer Risk Rating — Quick Framework

Standard Risk Factors:
- Customer type (natural person / legal entity / PEP / high-risk business)
- Country of origin / country of transaction (FATF greylist / blacklist)
- Product/service used (high-value, cash-intensive, anonymous)
- Delivery channel (non-face-to-face, correspondent)
- Transaction patterns (unusual amounts, frequency, counterparties)

Risk Tiers (illustrative):

| Tier | CDD Level | Review Frequency | Typical Customers |
| --- | --- | --- | --- |
| Low | Simplified (SDD) | 5 years | Low-risk retail, standard products, domestic |
| Standard | Standard (CDD) | 3 years | Most retail and SME customers |
| High | Enhanced (EDD) | 1 year | Complex structures, high-risk jurisdictions, PEPs |
| Very High | EDD + ongoing | 6 months or event-driven | Highest-risk: correspondent banks, high-risk PEPs |

EDD Triggers (MLR 2017, Reg. 33):
- Customer or jurisdiction appears on FATF blacklist/greylist
- PEP or immediate family member
- Non-face-to-face business
- Business sectors with elevated ML risk (cash-intensive, gambling, dealers in precious metals)
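Factor-based tiering is often operationalized as a simple scoring model with EDD triggers forcing a floor tier. A heavily simplified sketch; the factor weights and tier cut-offs are invented for illustration and are not regulatory values:

```python
# Illustrative only: weights and cut-offs are invented, not calibrated values.
FACTOR_SCORES = {
    "pep": 40,
    "fatf_listed_jurisdiction": 40,
    "non_face_to_face": 15,
    "cash_intensive_business": 20,
    "correspondent_banking": 50,
}

def risk_tier(factors: set) -> str:
    """Map a set of risk-factor flags to a customer risk tier."""
    score = sum(FACTOR_SCORES.get(f, 0) for f in factors)
    # EDD triggers (e.g., PEP, FATF-listed jurisdiction) force at least High
    if "pep" in factors or "fatf_listed_jurisdiction" in factors:
        score = max(score, 60)
    if score >= 90:
        return "Very High"
    if score >= 60:
        return "High"
    if score >= 25:
        return "Standard"
    return "Low"
```

The floor-tier override matters: additive scoring alone can let a single decisive factor (a PEP match, say) be diluted by otherwise low-risk attributes.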


Card 07: Sanctions Screening — Calibration Quick Reference

Key Screening Lists:
- OFAC SDN List (US): US-mandated for US persons and dollar transactions globally
- HM Treasury Consolidated List (UK): Post-Brexit UK sanctions
- EU Consolidated Sanctions List: EU persons and entities
- UN Security Council Consolidated List: Binding on all UN members
- OFAC Sectoral Sanctions (US): Industry-wide restrictions (Russian energy)

Matching Quality Metrics:

| Metric | Definition | Typical Range |
| --- | --- | --- |
| Precision | True matches / total matches returned | 30–70% |
| False Positive Rate | False alerts / total alerts | 30–70% (depends on threshold) |
| False Negative Rate | Missed true matches / all true matches | Target: < 0.1% |

Matching Parameters to Tune:
- Exact match: Name must be identical (very low false positive, higher false negative)
- Fuzzy match: Name similarity threshold (typically 85–95%); higher = more precise, fewer false positives, but more missed matches
- Phonetic matching: Sounds-like matching for transliterated names
- Alias matching: Should include known aliases from the list
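The threshold trade-off can be explored with any string-similarity measure. This sketch uses Python's `difflib` ratio as a stand-in for a production fuzzy matcher (Jaro-Winkler or Levenshtein-based engines are more common in practice); the names, list, and threshold are illustrative:

```python
from difflib import SequenceMatcher

def name_similarity(a: str, b: str) -> float:
    """Case-insensitive similarity in [0, 1]; stand-in for a real matcher."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def screen(name: str, watchlist: list, threshold: float = 0.90) -> list:
    """Return (entry, score) pairs whose similarity meets the threshold."""
    hits = [(entry, name_similarity(name, entry)) for entry in watchlist]
    return [(entry, round(s, 2)) for entry, s in hits if s >= threshold]

watchlist = ["Aleksandr Petrov", "Mohammed Al-Rashid"]
# Transliteration variant: caught at 0.85, potentially missed at a stricter threshold
print(screen("Alexander Petrov", watchlist, threshold=0.85))
```

Raising the threshold suppresses false positives but also risks missing exactly this kind of transliteration variant, which is why the false negative target in the table above is so strict.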

Alert Disposition Categories:
1. True match: Confirmed sanctions match → block transaction; file report
2. Potential match pending: Under investigation; hold transaction
3. False positive: Investigation shows no match → close; document
4. Refer: Insufficient information; escalate to senior analyst


Card 08: Regulatory Reporting — Key Timelines

| Report | Jurisdiction | Filing Deadline | Frequency | Key Authority |
| --- | --- | --- | --- | --- |
| MiFIR transaction | EU/UK | T+1 | Per trade | ESMA/FCA |
| EMIR derivative | EU/UK | T+1 | Per trade | ESMA/FCA |
| SAR/STR | UK | No defined deadline; "as soon as practicable" | Per suspicion | NCA |
| CTR (US) | US | 15 days from transaction | Per transaction > $10,000 | FinCEN |
| CASS oversight | UK | Within 10 days of period end | Monthly | FCA |
| ICAAP | EU/UK | At least annually | Annual | EBA/PRA |
| LCR | EU/UK | Monthly | Monthly | EBA/PRA |
| COREP | EU/UK | Quarterly | Quarterly | EBA/PRA |
| GDPR breach | EU/UK | 72 hours after awareness | Per breach | DPA (ICO in UK) |
| DORA major incident (initial) | EU | 4 hours after classification | Per incident | Competent authority |
| FCA PRIN 11 | UK | "As soon as practicable" | Per material incident | FCA |

Card 09: Change Management — ADKAR Quick Diagnostic

ADKAR: The Five Stages

| Stage | Question | Intervention if Stalled |
| --- | --- | --- |
| Awareness | Does the person understand WHY the change is happening? | Communicate the regulatory/risk driver; town halls; Q&A |
| Desire | Does the person WANT to support the change? | WIIFM; acknowledge losses; involve in design |
| Knowledge | Does the person know HOW to change? | Role-based training; workflow practice; FAQ |
| Ability | Can the person perform the new behaviors RELIABLY? | Extended practice; safe environment; second training wave |
| Reinforcement | Is the change SUSTAINED over time? | Remove old system; monitor adoption; recognize adopters |

Red Flags by Stage:
- Awareness: "I don't understand why we're changing" / repeated basic questions
- Desire: "I'd rather keep the old system" / working around the new tool
- Knowledge: "I don't know how to do X in the new system"
- Ability: High error rates in production; excessive escalations; long completion times
- Reinforcement: Reversion to old methods; old system still in use


Card 10: RegTech Business Case — NPV Quick Reference

Formula: NPV = Σ (Net Cashflow_year / (1 + r)^year) for year 0 to N

Where:
- Net Cashflow = Benefits − Costs for that year
- r = discount rate (typically the firm's hurdle rate or WACC, often 8–12% for internal projects)
- Year 0 = implementation year (costs only; minimal benefits)
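The formula translates directly into code. A minimal sketch with invented cashflows and a 10% discount rate:

```python
def npv(cashflows, rate):
    """NPV of a list of net cashflows, where cashflows[0] is year 0."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cashflows))

# Year 0: implementation cost only; years 1-3: benefits net of run costs
flows = [-500_000, 150_000, 300_000, 300_000]
value = npv(flows, rate=0.10)
```

With these figures the project is NPV-positive (roughly £110k over three years), but the result is sensitive to both the benefit estimates and the discount rate, which is what the sensitivity template below is for.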

Four Value Categories:

| Category | Measurability | How to Quantify |
| --- | --- | --- |
| Cost efficiency | High | FTE time saved × loaded cost; false positive investigation cost × reduction |
| Risk reduction | Medium | Expected value: probability of fine × magnitude × reduction probability |
| Regulatory relationship | Low | Qualitative; scenario analysis for examination efficiency |
| Revenue enablement | Medium | Speed to market; onboarding time reduction × revenue per customer |

Sensitivity Analysis Template:

| Scenario | Benefit Multiplier | 3-Year NPV |
| --- | --- | --- |
| Upside | 125% | £X |
| Base | 100% | £Y |
| Base - Risk only | 75% | £Z |
| Downside | 50% | £W |

If downside NPV is still positive: strong case. If downside NPV is negative: identify which assumptions are critical and how confident you are in them.
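The sensitivity template can be produced by scaling the benefit stream before discounting. A self-contained sketch with invented figures (cost, benefits, and the 10% rate are illustrative):

```python
def npv(cashflows, rate):
    """NPV of a list of net cashflows, where cashflows[0] is year 0."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cashflows))

# Hypothetical base case: year-0 implementation cost, then years 1-3 benefits
COST_YEAR0 = -500_000
BASE_BENEFITS = [250_000, 300_000, 300_000]

def scenario_npv(multiplier, rate=0.10):
    """3-year NPV with the benefit stream scaled by the scenario multiplier."""
    flows = [COST_YEAR0] + [b * multiplier for b in BASE_BENEFITS]
    return npv(flows, rate)

for label, m in [("Upside", 1.25), ("Base", 1.00),
                 ("Risk only", 0.75), ("Downside", 0.50)]:
    print(f"{label:10s} {scenario_npv(m):>12,.0f}")
```

Note that with these invented figures the base case is positive but the 50% downside is negative, which is exactly the situation where the critical-assumption review above is warranted.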


Card 11: Integration Testing — The Four Tests

Test 1: Single Customer View
Can any analyst see all compliance-relevant data about a customer — across KYC, AML, fraud, sanctions, case history — in a single interface?
- Pass: Yes, in under 2 minutes, without switching systems
- Fail: "Mostly, but you need to check [system] separately"

Test 2: Cross-Domain Alert Correlation
When a customer has concurrent alerts in multiple domains, does anyone see the combined picture automatically?
- Pass: Unified alert registry surfaces cross-domain flags; investigation teams notified
- Fail: Each domain investigates separately; combined picture never assembled

Test 3: Audit Trail Continuity
Can a regulator's examination request — "show us everything about customer X" — be answered from a single query?
- Pass: Complete decision history available from audit log aggregator
- Fail: Requires manual reconstruction from 3+ systems

Test 4: Management Information Consistency
Does the Board report say the same thing as the operational metrics?
- Pass: Both derived from the same governed source; numbers match
- Fail: Different numbers in different reports; unexplained discrepancies


Card 12: Professional Ethics — Three Framework Quick Reference

| Framework | Central Question | Applied to Algorithms |
| --- | --- | --- |
| Consequentialism | What produces the best aggregate outcomes for all affected parties? | Evaluate: who benefits? who is harmed? how are harms distributed? is the aggregate calculation complete? |
| Deontology | What do people have a right to — regardless of outcomes? | Rights: explanation; contestation; not to be assessed through discriminatory proxies; consent |
| Virtue ethics | What does this reflect about the kind of institution we are? | Asks: "Would we be proud if this decision process were visible?" — and requires character-based judgment, not rule-following |

The Key Insight: Compliance is necessary but not sufficient for ethical behavior. Legal compliance is the floor. Ethical analysis asks what is right above that floor.

The Compliance Professional's Distinctive Role: Positioned to ask, systematically and institutionally, the questions others are not structured to ask: Who is this affecting? Is it fair? Can we explain and defend it? Can we be proud of how this decision is made?