Case Study 38.1: Maya's Board Presentation — RegTech ROI at Verdant Bank
Overview
In October of the fourth year of her tenure as CCO at Verdant Bank, Maya Osei faced the request that many compliance leaders dread: justify the firm's RegTech investment in financial terms, to the Board, in six weeks.
The investment in question comprised three discrete technology implementations over the preceding eighteen months:
- KYC Automation Platform (vendor: a mid-tier UK RegTech specialist): £520,000 all-in, including implementation, licensing for 18 months, and integration work with Verdant's core banking system
- Transaction Monitoring Upgrade (AI-augmented overlay on the existing AML platform): £380,000, including the model training and tuning engagement
- Regulatory Reporting Platform (covering FCA GABRIEL submissions, COREP, and monthly statistical returns): £340,000, including implementation and data mapping
Total investment: £1,240,000.
The remaining £860,000 of the £2.1M the CFO had highlighted comprised vendor support contracts, additional licensing for ancillary tools, and an enterprise compliance training platform — investments that were spread across a larger portfolio and which Maya judged less amenable to rigorous individual ROI analysis. She made a tactical decision to scope the formal ROI review to the three major platform investments and note the remaining spend separately, with a brief explanation of why it was not included in the primary analysis.
This was, as Priya Nair observed when Maya explained the scope decision, exactly right. "Never let the perfect be the enemy of the useful. If you try to build an ROI case for everything, you'll either make it up for the things you can't measure, which destroys your credibility, or you'll caveat everything into mush. Pick the investments where the data supports a real analysis and do those rigorously."
Building the Numbers
KYC Automation: The Most Measurable Case
The KYC automation platform had the cleanest data because Maya had, almost by accident, documented the baseline. When she had initiated the procurement process eighteen months earlier, she had included in the RFP response evaluation a section asking each vendor to state the automation rate it could achieve against Verdant's current volume. To ground that question, Maya had pulled three months of KYC completion data from the operations team and calculated: 3,400 KYC completions per month, requiring an average of 52 minutes of analyst time each.
Post-implementation data from the platform's own dashboard showed: 3,850 completions per month (volume had grown), requiring an average of 9 minutes of analyst time per completion, with 76% of cases fully automated (no human review required) and 24% requiring a review step averaging 38 minutes.
Maya did the calculation:
- Pre-technology analyst time: 3,400 × 52 minutes = 176,800 minutes per month = 2,947 hours per month
- Post-technology analyst time: 3,850 × ((0.76 × 0 min review) + (0.24 × 38 min)) = 3,850 × 9.12 = 35,112 minutes = 585 hours per month
- Monthly time saving: 2,947 − 585 = 2,362 hours
- Annual time saving: 28,344 hours
At a fully-loaded analyst cost of £70,000 per year (£43.75 per hour at 1,600 productive hours), the annual cost efficiency saving was 28,344 hours × £43.75 ≈ £1,240,050.
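The KYC arithmetic can be reproduced in a few lines of Python (a minimal sketch using the figures quoted in the case; the variable names are ours, and the total differs from the text's figure by a few hundred pounds depending on where intermediate rounding happens):

```python
# KYC time-saving calculation, using the volumes, per-case minutes, and
# fully-loaded analyst cost quoted in the case study.

COMPLETIONS_PRE = 3_400        # KYC completions per month, pre-technology
MINUTES_PRE = 52               # analyst minutes per completion, pre-technology

COMPLETIONS_POST = 3_850       # completions per month, post-technology
AUTO_RATE = 0.76               # share of cases fully automated (no human review)
REVIEW_MINUTES = 38            # minutes per case for the 24% needing review

HOURLY_RATE = 70_000 / 1_600   # fully-loaded analyst cost: £43.75 per hour

pre_hours = COMPLETIONS_PRE * MINUTES_PRE / 60
avg_post_minutes = AUTO_RATE * 0 + (1 - AUTO_RATE) * REVIEW_MINUTES  # 9.12 min
post_hours = COMPLETIONS_POST * avg_post_minutes / 60

monthly_saving_hours = pre_hours - post_hours
annual_saving_hours = monthly_saving_hours * 12
annual_saving_gbp = annual_saving_hours * HOURLY_RATE

print(f"Monthly hours saved: {monthly_saving_hours:,.0f}")
print(f"Annual saving: £{annual_saving_gbp:,.0f}")  # ≈ £1.24M, before the volume-growth adjustment
```

Note that the raw annual figure is the one Maya subsequently adjusted, because it mixes efficiency gains with volume growth.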
She stared at this number for a while. It seemed too large. She checked it three times and had Priya check it too. It was correct, but it was partly a function of the volume growth: Verdant's customer onboarding had increased significantly over the period, meaning the comparison was between the current volume at post-technology efficiency versus the current volume at pre-technology efficiency — not pre-technology volume at pre-technology efficiency. The technology had effectively handled 13% more volume without any FTE increase. That was real value, but it needed to be labelled correctly.
Maya adjusted her framing: "2.5 FTEs avoided, plus approximately £180,000 in avoided contractor spend that would have been required to handle volume growth without the platform." This was more defensible than the raw calculation because it clearly labelled the headcount and the volume uplift separately.
Transaction Monitoring: The Harder Case
The transaction monitoring upgrade had been motivated by an alert backlog problem: in the months before the upgrade, the AML team had been running a chronic backlog of unreviewed alerts, with some alerts more than three weeks old when reviewed — a regulatory risk in itself, given the SAR filing timeline obligations.
Baseline data was messier here. Maya had the weekly alert volume reports (average 520 per week pre-upgrade), but she did not have a formal false positive rate measurement because the previous system had not tracked it systematically. She had to reconstruct the pre-technology false positive rate from the case management system's historical records: in the six months before the upgrade, the AML team had filed 34 SARs, investigated approximately 6,500 alerts (estimated from analyst time records), and closed the rest as false positives. Implied false positive rate: (6,500 − 34) / 6,500 = 99.5%.
Post-upgrade, the platform's dashboard showed: 490 alerts per week (slightly lower volume due to improved tuning), 82% false positive rate, 54 SARs filed in the first six months post-upgrade.
The calculation:
- Pre-upgrade false positives per week: 520 × 0.995 = 517
- Post-upgrade false positives per week: 490 × 0.82 = 402
- Weekly reduction: 115 false positive investigations
- Time per false positive investigation: 22 minutes (from team time study)
- Annual hours saved: 115 × 22 × 52 / 60 ≈ 2,193 hours
- Annual cost saving: ≈ £95,929
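This calculation, including the reconstructed pre-upgrade false positive rate, can be sketched as follows (a minimal illustration of the case's figures; results shift by a few hours depending on where you round):

```python
# Transaction-monitoring saving: reconstruct the pre-upgrade false positive
# rate from historical records, then compare weekly investigation loads.

SARS_PRE, ALERTS_INVESTIGATED = 34, 6_500
fp_rate_pre = (ALERTS_INVESTIGATED - SARS_PRE) / ALERTS_INVESTIGATED  # ≈ 99.5%

ALERTS_PRE_WEEK, ALERTS_POST_WEEK = 520, 490
FP_RATE_POST = 0.82
MINUTES_PER_FP = 22            # from the team time study
HOURLY_RATE = 70_000 / 1_600   # £43.75 per hour

fp_pre_week = round(ALERTS_PRE_WEEK * round(fp_rate_pre, 3))   # 517
fp_post_week = round(ALERTS_POST_WEEK * FP_RATE_POST)          # 402
weekly_reduction = fp_pre_week - fp_post_week                  # 115

annual_hours_saved = weekly_reduction * MINUTES_PER_FP * 52 / 60
annual_saving_gbp = annual_hours_saved * HOURLY_RATE

print(f"Weekly reduction: {weekly_reduction} false positive investigations")
print(f"Annual saving: £{annual_saving_gbp:,.0f}")
```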
But the bigger quantifiable benefit was the SAR filing compliance improvement. In the pre-upgrade period, the team had missed two regulatory SAR filing deadlines — situations where a SAR should have been filed within the required timeframe but was not due to the backlog. Maya had documented these internally. She did not know exactly what the regulatory consequence of discovered missed filings would have been — it would depend on the specific circumstances — but the FCA's published guidance made clear that systematic SAR filing failures were serious. Using a conservative expected value estimate: a 15% probability that discovered missed filings would result in enforcement action, with an expected action of £500,000, gave an expected cost of £75,000 per missed deadline, or £150,000 for the two incidents. The transaction monitoring upgrade, by eliminating the backlog, had eliminated this risk.
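The expected-value estimate is simple enough to show directly (using only the probability and impact assumptions stated above):

```python
# Expected-value estimate for the two missed SAR filing deadlines,
# using the case's deliberately conservative assumptions.

P_ENFORCEMENT = 0.15            # probability a discovered missed filing leads to action
EXPECTED_ACTION_GBP = 500_000   # assumed cost of an enforcement action
MISSED_DEADLINES = 2

ev_per_deadline = P_ENFORCEMENT * EXPECTED_ACTION_GBP      # £75,000
total_risk_avoided = ev_per_deadline * MISSED_DEADLINES    # £150,000

print(f"Expected cost per missed deadline: £{ev_per_deadline:,.0f}")
print(f"Total expected cost avoided: £{total_risk_avoided:,.0f}")
```

Making the probability and impact explicit, as here, is what later allowed the number to survive the CFO's scrutiny.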
Regulatory Reporting: The Qualitative Case
The regulatory reporting platform was the most straightforward operationally — it had taken the COREP and FINREP production process from a two-day, three-analyst exercise to a four-hour, one-analyst review — but the ROI case was complicated by the fact that the platform had also required significant ongoing rule maintenance as the reporting schemas changed, consuming more time than anticipated.
Net of the additional configuration time, the platform was delivering approximately 1.3 FTE of time saving annually, worth approximately £91,000. But its bigger value, which Maya described qualitatively, was eliminating the error risk. In the pre-platform period, Verdant had received two FCA queries about data quality in its regulatory submissions — not formal findings, but queries that had required senior management responses. The platform's data validation and reconciliation controls had eliminated this risk.
Maya labelled this section "Clean Regulatory Submissions: Quantitative Benefit £91,000 / year; Qualitative Benefit: Elimination of data quality risk and senior management time on regulatory queries."
The CFO Challenge
Damien Walsh reviewed Maya's analysis the week before the Board presentation. He had four challenges:
Challenge 1: "The KYC number is inflated because volume grew." Maya agreed — and showed him the adjusted calculation that separated headcount avoidance from volume handling. The adjusted number was smaller but still compelling.
Challenge 2: "The SAR deadline expected value is speculative." Maya acknowledged this and showed him the methodology — explicit probability assumptions, documented source data — and noted that she had used a deliberately conservative probability estimate. She offered to remove it from the quantified benefits and present it as a qualitative risk avoided. He said: leave it in, but label it clearly as an estimate.
Challenge 3: "What's the NPV? I want to see a three-year NPV on each platform." Maya had not initially built a forward-looking three-year NPV; she had built an 18-month lookback analysis. She rebuilt the models overnight, projecting forward to a three-year view based on conservative benefit assumptions (benefits flat in Year 2, a modest uplift in Year 3 as adoption matured). The NPVs:
- KYC platform: £340,000 over three years at 8% discount rate
- Transaction monitoring: £180,000 over three years
- Regulatory reporting: £95,000 over three years
- Combined: £615,000 three-year NPV
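A minimal NPV helper of the kind used in such models might look like this (the per-year benefit figures below are illustrative placeholders we invented for the sketch; the case gives only the 8% discount rate, not the underlying cash flows):

```python
# Generic net-present-value helper: discount a stream of year-end cash flows.

def npv(rate: float, cashflows: list[float]) -> float:
    """Present value of cashflows[t] received at the end of year t + 1."""
    return sum(cf / (1 + rate) ** (t + 1) for t, cf in enumerate(cashflows))

DISCOUNT_RATE = 0.08  # the 8% rate used in Maya's models

# Hypothetical benefit stream: flat Year 2, modest Year 3 uplift.
illustrative_benefits = [550_000, 550_000, 600_000]

print(f"PV of illustrative benefits: £{npv(DISCOUNT_RATE, illustrative_benefits):,.0f}")
```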
Against a £1.24M investment, that implied breakeven at approximately 3.2 years, longer than the three-year evaluation window but on a positive trajectory.
Challenge 4: "Why did we spend £860K on the other tools and where's the return on those?" Maya had anticipated this and had a brief summary: the remaining spend was on support contracts, training infrastructure, and ancillary tools that supported the three major platforms. They were enabling costs for the primary investments, not standalone investments. She offered to build a more detailed breakdown if Damien wanted it; he declined.
The Board Presentation and Outcome
Maya's Board presentation ran thirty-eight minutes. She opened with the three key messages: the investment had delivered a combined three-year NPV of approximately £615,000 on the three platforms analysed; compliance capability had materially improved (quantified: roughly 22% fewer false positive investigations, SAR deadline compliance now 100%, clean FCA annual review); and she was recommending a next-phase investment of £680,000 to extend the platform coverage to sanctions screening automation and model risk reporting.
The Risk Committee NED asked the most probing question: "You've got a payback period of 3.2 years. That's at the outer edge of what we'd typically accept for technology investment. What gives you confidence in the year three projections?"
Maya had prepared for exactly this question. She showed the sensitivity analysis: even at 75% of projected Year 3 benefits, the combined NPV remained positive. The payback period assumption was sensitive to the KYC volume trajectory — if customer growth continued at its current rate, Year 3 benefits would exceed projections rather than fall short. And the analysis was deliberately conservative: it had excluded the regulatory relationship value and the revenue enablement benefit of faster onboarding, neither of which she had been able to quantify precisely.
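The sensitivity check she showed can be sketched the same way (again with invented illustrative cash flows; only the 8% rate, the £1.24M investment, and the 75% Year 3 haircut come from the case):

```python
# Year 3 sensitivity check: recompute net NPV with Year 3 benefits
# scaled down to 75% of projection.

def npv(rate: float, cashflows: list[float]) -> float:
    """Present value of cashflows[t] received at the end of year t + 1."""
    return sum(cf / (1 + rate) ** (t + 1) for t, cf in enumerate(cashflows))

RATE = 0.08
INVESTMENT = 1_240_000
benefits = [550_000, 550_000, 600_000]  # hypothetical base-case benefit stream

for y3_factor in (1.0, 0.75):
    flows = benefits[:2] + [benefits[2] * y3_factor]
    net = npv(RATE, flows) - INVESTMENT
    print(f"Year 3 at {y3_factor:.0%}: net NPV £{net:,.0f}")
```

With this hypothetical stream, the net NPV stays positive under the 75% haircut, which is the shape of the argument Maya made to the Risk Committee.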
The Board approved the next-phase investment.
Lessons from the Case
Lesson 1: Baseline data is worth its weight in gold. The KYC platform produced the cleanest ROI analysis because Maya had accidentally documented baseline metrics during the procurement process. In future implementations, baseline documentation should be a mandatory pre-condition for any technology go-live.
Lesson 2: Transparency about uncertainty builds credibility. The most challenged number in the analysis — the SAR deadline expected value estimate — survived scrutiny because Maya had disclosed her methodology and assumptions completely. The CFO's challenge was not "this number is wrong" but "can you defend this?" — and she could.
Lesson 3: Scope the analysis to what you can rigorously defend. The decision to build the primary analysis around three platforms and note the remaining spend separately was tactically correct. A thinner analysis spread across all spending would have been less defensible than a rigorous analysis of the primary investments.
Lesson 4: The CFO is an ally, not an adversary. Damien Walsh's challenges made the analysis better. The overnight rebuild into a three-year NPV model produced a more useful document than the original lookback. The friction of CFO scrutiny is part of what makes the final analysis credible.
See also: Chapter 38, Section 38.6 (Communicating to the Board) and Section 38.7 (Communicating to the CFO).