Key Takeaways
Chapter 40: Integrating the RegTech Stack — A Full Program Review
The Essentials
- A stack of tools is not a program. Individual systems that perform well in isolation can collectively fail as a compliance program if the data flows between them are inadequate, the audit trail is fragmented, or the management information does not aggregate across domains.
- The five domains must share data, not just coexist. KYC data must flow to AML monitoring; AML alerts must carry customer context; case management must receive structured data from any monitoring system; regulatory reporting must draw from a governed, canonical data source. Integration happens through data architecture, not through proximity.
- The four integration tests are the practical diagnostic. Can you produce a single customer view? Can you surface cross-domain alert correlations? Can you reconstruct a complete audit trail from a single query? Does management information say the same thing regardless of which system it comes from? Where these tests fail, integration work is needed.
- Model risk management is continuous, not a one-time exercise. Every analytical model — transaction monitoring, fraud detection, customer risk scoring — must be subject to ongoing performance monitoring, drift detection, and documented escalation thresholds. SR 11-7 and the EU AI Act both require this. A model that is not being monitored may be performing worse than it did at deployment — without anyone knowing.
- The audit trail is the evidence base for regulatory examinations. It must be complete, accurate, timely, and retained for the full regulatory period. A fragmented audit trail — complete within each system but incoherent across them — creates examination risk even if the underlying compliance decisions were correct.
- Management information must be derived, not assembled. If your compliance report requires manual data assembly from multiple systems, it is both delayed and potentially inconsistent. Automated, consistent management information derived from operational systems is the governance layer that holds the program together.
- The compliance professional's role is integration. Technical competence (understanding what the systems do), regulatory knowledge (understanding what they must achieve), and organizational skill (making the humans use them) must come together in one person or team. No single capability is sufficient.
The Five Domains Reference
| Domain | Core Function | Data Flows To | Integration Failure Mode |
|---|---|---|---|
| Identity & KYC | Verify customer identity; classify risk | AML monitoring, sanctions screening, case management, regulatory reporting | Customer risk data not accessible downstream; monitoring lacks context |
| Financial Crime (AML/Fraud/Sanctions) | Detect suspicious patterns; investigate; file SARs | Case management, regulatory reporting, governance layer | Three separate alert queues; no cross-domain visibility; manual data assembly for investigations |
| Regulatory Reporting | Generate and submit required reports | Regulatory bodies; governance audit trail | Manual data extraction; report inconsistencies; deadline breaches |
| Market Surveillance | Detect market manipulation; monitor trading controls | Case management, regulatory reporting | Siloed from financial crime; same-customer patterns missed |
| Governance, Risk & Control | Model risk management; audit trail; management information; regulatory relationship | All domains (aggregation function) | Each domain has its own reporting; no cross-domain health view; fragmented audit trail |
The Four Integration Tests (Quick Reference)
| Test | Question | Pass | Fail |
|---|---|---|---|
| Single customer view | Can any analyst see everything about a customer from one interface? | Yes, in under 2 minutes | "Mostly, but check KYC separately" |
| Cross-domain correlation | When a customer has alerts across domains, does someone see the combined picture? | Yes, automatically surfaced | Teams investigate independently |
| Audit trail continuity | Can you reconstruct a complete decision history from a single query? | Yes, including context used at each step | Requires manual reconstruction from 3+ systems |
| Management information consistency | Does the Board report match the operational metrics? | Yes, derived from the same source | Numbers differ between reports |
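The cross-domain correlation test reduces to a simple question of data architecture: if all alerts live in one registry keyed by customer, surfacing multi-domain customers is a grouping query. The registry rows and customer IDs below are illustrative assumptions, not a real schema.

```python
from collections import defaultdict

# Hypothetical unified alert registry rows: (customer_id, domain)
alerts = [
    ("C001", "aml"),
    ("C001", "fraud"),
    ("C002", "aml"),
    ("C001", "market_surveillance"),
]

def cross_domain_customers(alerts, min_domains=2):
    """Surface customers whose open alerts span multiple monitoring domains."""
    domains = defaultdict(set)
    for customer, domain in alerts:
        domains[customer].add(domain)
    return {c: sorted(d) for c, d in domains.items() if len(d) >= min_domains}

# C001 spans three domains and is surfaced; C002 is not.
flagged = cross_domain_customers(alerts)
```

When the registry is fragmented across three alert queues, this one-liner becomes a manual reconciliation exercise — which is exactly the failure mode the test is designed to catch.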
Program Health Indicators
Monitoring Performance (by domain):
- False positive rate: target < 20%; >30% requires recalibration
- Overdue alerts (>5 days): zero tolerance; 10+ requires escalation
- SAR filing on time: 100% required; any breach is a regulatory event

KYC Program:
- KYC-current rate: target > 99%; <95% is a material gap
- High-risk customer review on schedule: 100%

Regulatory Reporting:
- On-time submission rate: target > 99%
- Rejected submissions: should be zero; investigate root cause for any occurrence

Model Governance:
- Population Stability Index (PSI): >0.25 requires recalibration assessment
- Validation: all models validated annually at minimum
- Model inventory: all production models documented with owner, purpose, validation date
Architectural Principles for Program Integration
- Customer Master Record: Single authoritative customer record, accessible to all systems, with versioned updates
- Transaction Data Lake: Normalized transaction data from all source systems; monitoring systems read from the lake, not from source systems
- Unified Alert Registry: All alerts from all monitoring domains in one registry, linked to customer record and transactions
- Tamper-Evident Audit Log: Complete record of all compliance decisions, accessible for examination and cross-domain analysis
- Aggregated Management Information: Automated derivation of cross-domain metrics, not manual assembly
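One common way to make an audit log tamper-evident is hash chaining: each entry's digest covers the previous entry's digest, so altering any historical record invalidates everything after it. The sketch below is a minimal illustration of the technique, not a description of any particular product's implementation.

```python
import hashlib
import json

def append_entry(log, entry):
    """Append a decision record whose hash covers the previous entry's hash."""
    prev = log[-1]["hash"] if log else "genesis"
    payload = json.dumps(entry, sort_keys=True)
    digest = hashlib.sha256((prev + payload).encode()).hexdigest()
    log.append({"entry": entry, "prev": prev, "hash": digest})
    return log

def verify(log):
    """Recompute every hash in order; False means some entry was altered."""
    prev = "genesis"
    for row in log:
        payload = json.dumps(row["entry"], sort_keys=True)
        expected = hashlib.sha256((prev + payload).encode()).hexdigest()
        if row["prev"] != prev or row["hash"] != expected:
            return False
        prev = row["hash"]
    return True

log = []
append_entry(log, {"customer": "C001", "action": "SAR filed", "by": "analyst_7"})
append_entry(log, {"customer": "C001", "action": "case closed", "by": "mlro"})
assert verify(log)

log[0]["entry"]["action"] = "no action"  # simulated tampering
assert not verify(log)                   # the chain detects it
```

In production this pattern is typically backed by append-only storage and periodic anchoring of the latest digest, so that the chain itself cannot be silently rewritten end to end.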
What Maya Would Say
"The Risk Committee asked me for a view of the whole compliance program — not five separate updates about five separate systems. Getting to that view required three things: shared data standards so systems could talk to each other; a governance layer that pulled metrics from across the program; and the discipline to stop adding new tools until the ones we had were genuinely working together.
The technology was the easy part. Building a program — making it coherent, making it governable, making it explicable to a board — was the work."