Chapter 40 Quiz
Integrating the RegTech Stack — A Full Program Review
12 questions. Answers follow.
1. The distinction between a "RegTech stack" and a "RegTech program" is primarily that:
A) A stack uses more advanced technology; a program uses simpler, more reliable systems
B) A stack describes individual systems performing their functions; a program describes the integrated operation of those systems including data flows, governance, and cross-domain coherence
C) A stack is used by larger institutions; a program is used by smaller ones
D) A stack is vendor-managed; a program is internally managed
2. The "single customer view" integration test asks:
A) Whether every customer has been verified through the same identity verification process
B) Whether a single analyst can see everything the institution knows about a customer — across all domains — in one interface
C) Whether customer data is stored in a single database
D) Whether customers have a single point of contact for all compliance-related queries
3. A compliance program fails the "audit trail continuity test" when:
A) The audit log does not contain the full text of every transaction
B) Audit logs are retained for fewer years than regulations require
C) The complete decision history of a specific compliance matter cannot be reconstructed from a single query — it requires manual assembly from multiple systems
D) The audit trail has not been reviewed by external auditors in the past year
4. The chapter argues that monitoring systems should read from the "transaction data lake" rather than directly from the source transaction processing systems. The primary reason is:
A) Transaction data lakes are faster than source systems
B) Reading directly from production source systems would slow them down
C) A governed, normalized, canonical data lake ensures consistency, reduces data quality issues, and simplifies monitoring system architecture
D) Regulators require all transaction data to be stored in a central repository
5. A Population Stability Index (PSI) score of 0.28 for a transaction monitoring model indicates:
A) The model is performing well — higher PSI scores reflect better detection capability
B) The model's input distribution has shifted significantly from its training distribution, triggering a recalibration assessment
C) The model has processed 280,000 transactions and is due for a performance review
D) 28% of alerts from the model are false positives
6. The "cross-domain alert correlation" integration test identifies a failure when:
A) An AML alert and a fraud alert about the same transaction are processed by the same analyst
B) Two domains generate duplicate alerts about the same transaction
C) A customer who has active alerts in multiple domains (AML, fraud, sanctions) is investigated separately by each domain team with no visibility between them
D) The total number of alerts across all domains exceeds the team's capacity
7. The governance layer of a RegTech program encompasses which of the following?
A) Only model risk management
B) Model risk management, audit trail integrity, management information architecture, ongoing change management, and regulatory relationship management
C) The technical infrastructure that connects all compliance systems
D) The compliance team's organizational structure and reporting lines
8. Management information that requires "manual assembly" from multiple systems creates which specific governance problem?
A) The cost of producing management reports is higher than with automated reporting
B) The reports may be delayed and potentially inconsistent — different numbers appearing in different reports because they come from different systems at different points in time
C) Manual assembly cannot be done by compliance professionals without technical training
D) Regulators require that all management information be produced by automated systems
9. In the integrated program model, the "Alert Registry" serves which function?
A) A list of all regulatory alerts and notifications sent by the firm to regulators
B) A unified store of all alerts from all monitoring domains, linked to customer records and transactions, enabling cross-domain analysis
C) A register of all alerts that have been escalated to senior management
D) A vendor-provided alert management system that connects to all monitoring platforms
10. SR 11-7 (US Federal Reserve) and the EU AI Act both require which of the following in relation to compliance models?
A) That all models be open-source so they can be audited externally
B) That models be replaced every three years regardless of performance
C) That models be subject to ongoing performance monitoring, drift detection, and continuous validation — not just initial deployment validation
D) That models must be approved by the regulator before deployment
11. The "management information consistency test" is failed when:
A) Management reports are produced more than once a week
B) The CCO and the Head of AML have different numbers for the same metric because they are drawing from different systems at different times
C) Management information is presented in graphical rather than tabular format
D) The same individual produces both operational and management reports
12. The chapter's conclusion that "the compliance professional's role is integration" refers to:
A) Their role in technically integrating different software systems
B) The need to integrate all dimensions — technical, regulatory, organizational, and ethical — into effective compliance programs, holding them together in practice
C) Their responsibility for ensuring data integration across the compliance technology stack
D) The compliance professional's duty to integrate new regulatory requirements into existing procedures
Answer Key
| Q | A | Explanation |
|---|---|---|
| 1 | B | A stack describes components; a program describes how they work together — including data flows, governance, and the cross-domain integration that determines whether the components function as a coherent compliance capability. Individual systems performing well does not guarantee program-level effectiveness. |
| 2 | B | The single customer view test asks whether any analyst can see, in one interface, all the compliance-relevant information about a customer across all domains. This requires architectural integration of the customer master record with all operational systems — not just storage of customer data in one database. |
| 3 | C | Audit trail continuity requires that the complete history of a compliance decision — what was known, what was reviewed, what decision was made, by whom — can be produced from a single query. Manual assembly from multiple systems creates examination risk, introduces inconsistency, and makes it impossible to provide regulators with a coherent account of a compliance matter. |
| 4 | C | The governed, normalized data lake is the integration mechanism for transaction data — it ensures that all monitoring systems see the same data, in the same format, with the same quality standards applied. Reading directly from source systems creates architectural complexity, data quality inconsistency, and load on production systems. |
| 5 | B | PSI (Population Stability Index) measures distribution shift in the model's inputs. A score >0.25 indicates major shift — the model is being applied to a population that differs significantly from its training population. This triggers a recalibration assessment, not necessarily immediate replacement. PSI does not measure detection capability or processing volume. |
| 6 | C | Cross-domain correlation failure occurs when separate domain teams investigate alerts about the same customer independently, without visibility into each other's work. The combined pattern — which might be significant — is never seen. This is a governance and architecture failure, not a volume or duplicate-detection failure. |
| 7 | B | The governance layer is comprehensive: model risk management (validating analytical models), audit trail integrity (complete, tamper-evident decision records), management information architecture (aggregated, consistent cross-domain metrics), change management (governing ongoing system and process changes), and regulatory relationship management (examination readiness and communication). |
| 8 | B | Manually assembled management information is both delayed (it takes time to gather from multiple systems) and inconsistent (different reports produced at different times may show different numbers for the same metric). Inconsistency undermines confidence in the compliance program and creates difficulty when regulators find discrepancies between reports. |
| 9 | B | The unified Alert Registry is the integration mechanism for compliance alerts across all monitoring domains. By storing all alerts in one place, linked to customer records and transactions, it enables cross-domain analysis — surfacing when one customer has concurrent alerts in multiple domains — that is not possible when each domain has its own separate alert queue. |
| 10 | C | Both SR 11-7 and the EU AI Act (for high-risk AI) require ongoing monitoring and continuous validation of models, not just initial validation at deployment. This means: ongoing performance metrics tracking; drift detection; documented escalation procedures; and a process for recalibration or replacement when performance deteriorates. |
| 11 | B | The management information consistency test fails when the same metric — say, the false positive rate for AML — shows different numbers in the CCO's report and the Head of AML's report, because they are drawing from different queries of different systems at different times. Consistency requires a single, governed source for all management information. |
| 12 | B | The compliance professional's integrative role is not technical integration — it is the ability to simultaneously apply technical, regulatory, organizational, and ethical knowledge to complex, ambiguous compliance challenges. This cross-dimensional capability is the core professional competence that the book has been developing and that no single-domain expertise can substitute for. |
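The PSI threshold logic behind question 5 can be sketched in code. This is an illustrative implementation, not from the chapter; the function names are invented, and the 0.10 / 0.25 interpretation bands are the conventional industry ones referenced in the answer key.

```python
import numpy as np

def population_stability_index(expected, actual, bins=10):
    """PSI between a model's training (expected) and current
    (actual) input distributions, using decile bins derived
    from the training data."""
    edges = np.percentile(expected, np.linspace(0, 100, bins + 1))
    edges[0], edges[-1] = -np.inf, np.inf  # capture out-of-range values

    exp_pct = np.histogram(expected, bins=edges)[0] / len(expected)
    act_pct = np.histogram(actual, bins=edges)[0] / len(actual)

    # Floor the proportions to avoid log(0) in sparse bins
    exp_pct = np.clip(exp_pct, 1e-6, None)
    act_pct = np.clip(act_pct, 1e-6, None)

    return float(np.sum((act_pct - exp_pct) * np.log(act_pct / exp_pct)))

def drift_band(psi):
    """Conventional PSI interpretation bands."""
    if psi < 0.10:
        return "stable"
    if psi <= 0.25:
        return "moderate shift - investigate"
    return "major shift - recalibration assessment"
```

Under these bands, the 0.28 score in question 5 lands in the "major shift" band, which is why it triggers a recalibration assessment rather than routine monitoring.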
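The cross-domain correlation idea behind questions 6 and 9 can also be sketched: a unified Alert Registry that all monitoring domains write into, supporting the query that separate per-domain alert queues cannot answer. This is a minimal illustrative sketch — the class and method names are invented, not the chapter's.

```python
from collections import defaultdict
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Alert:
    alert_id: str
    customer_id: str
    domain: str  # e.g. "AML", "fraud", "sanctions"
    transaction_id: Optional[str] = None

class AlertRegistry:
    """Unified store of alerts from all monitoring domains,
    keyed by customer to enable cross-domain correlation."""

    def __init__(self):
        self._by_customer = defaultdict(list)

    def register(self, alert: Alert):
        self._by_customer[alert.customer_id].append(alert)

    def cross_domain_customers(self, min_domains=2):
        """Customers with alerts in at least `min_domains`
        distinct domains — the cases the cross-domain alert
        correlation test is designed to surface."""
        return {
            cid
            for cid, alerts in self._by_customer.items()
            if len({a.domain for a in alerts}) >= min_domains
        }
```

The design point is that the correlation query is trivial once alerts share one store and one customer key; it is impossible when each domain keeps its own queue with no shared linkage.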