Case Study 6.1: Verdant's KYC Transformation — Building the Automated Onboarding Journey

The Situation

Organization: Verdant Bank (fictional)
Challenge: Transforming KYC from a 19-day manual process to a largely automated digital journey
Timeline: Q4 2020 — Q2 2021
Maya's role: CCO leading the design and implementation


Background

By Q3 2020, Maya had closed the KYC backlog using the remediation firm. The immediate crisis was resolved. The next challenge was building a permanent KYC process that could scale with Verdant's growth without recreating the backlog problem.

The business targets: Verdant's CEO projected 100,000 new customers in the next 12 months, reaching 440,000 total. At the current manual processing rate (25 applications/day with 3 analysts), this was impossible — keeping pace with the projected volume at the current per-analyst throughput would have required hiring 15 additional KYC analysts.

The regulatory targets: the FCA's supervisory feedback had identified KYC process quality — not just throughput — as a concern. The manual process was producing errors: documents approved without adequate scrutiny, screening not systematically applied to all applications.

Maya's design principles for the new system:

1. Speed must be maintained: a customer who completes verification should have a decision within 24 hours.
2. Quality must improve: the automated process must be more consistent than the manual one.
3. Human judgment must be preserved: analysts should review cases that genuinely require judgment, not cases that could be automated.
4. The FCA must be able to audit: every decision, automated or manual, must be logged and explainable.


The Design Process

Maya spent six weeks in design before selecting a vendor. The design process included:

Weeks 1–2: Requirements mapping
Maya documented every current KYC process step, the regulatory basis for each, and the decision logic that analysts applied. This was harder than expected — much of the current process lived in analysts' heads rather than documented procedures.

Weeks 3–4: Risk framework design
Maya worked with the risk team to define Verdant's customer risk segmentation for onboarding purposes: what signals indicated low, medium, or high risk at the point of application? This framework would drive the orchestration logic.

Weeks 5–6: Vendor evaluation
Three vendors were evaluated against the requirements. Key selection criteria: integration with Verdant's core banking API, UK-specific eIDV data sources (not a generic international product), transparent audit logging, and a human review interface for analysts.


The Orchestration Logic

After vendor selection, Maya documented the orchestration decision tree with the vendor's implementation team:

Fast track (eIDV only) — approximately 60% of expected applications
Requirements: UK national, age 18+, positive eIDV match (score ≥ 0.85), no PEP or adverse media flags, applying for standard retail account.
Time to decision: < 2 minutes.

Standard track (eIDV + document verification) — approximately 25% of expected applications
Requirements: eIDV match score 0.70–0.84, or non-UK national with UK address, or applying for higher-value account tier.
Time to decision: < 24 hours (document processing).

Enhanced track (manual review required) — approximately 15% of expected applications
Requirements: eIDV match score < 0.70, PEP indicator, adverse media flag, complex application (joint account, trust, corporate), high-risk jurisdiction, or flagged by automated screening.
Time to decision: 2–5 business days.
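The decision tree above can be sketched as routing logic. This is a simplified illustration, not the vendor's implementation: the field names and the `Application` structure are hypothetical, and edge cases the case study does not specify (e.g. applicants under 18, non-UK nationals without a UK address) are not handled.

```python
from dataclasses import dataclass

# Hypothetical application profile; field names are illustrative,
# not taken from any real vendor API.
@dataclass
class Application:
    eidv_score: float          # electronic ID verification match score, 0-1
    uk_national: bool
    age: int
    pep_flag: bool
    adverse_media_flag: bool
    complex_structure: bool    # joint account, trust, corporate
    high_risk_jurisdiction: bool
    higher_tier_account: bool

def route(app: Application) -> str:
    """Return the onboarding track per the orchestration decision tree."""
    # Enhanced track: any condition requiring human judgment wins outright.
    if (app.eidv_score < 0.70 or app.pep_flag or app.adverse_media_flag
            or app.complex_structure or app.high_risk_jurisdiction):
        return "enhanced"
    # Fast track: UK national, adult, strong eIDV match, standard account.
    if (app.uk_national and app.age >= 18 and app.eidv_score >= 0.85
            and not app.higher_tier_account):
        return "fast"
    # Everything else (mid-band eIDV score, non-UK national, higher-value
    # tier) requires document verification.
    return "standard"
```

Note the ordering: the enhanced-track conditions are checked first, so a strong eIDV match cannot override a PEP or adverse media flag.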


Implementation: What Went Well and What Didn't

What went well:

The eIDV integration was smooth — the vendor had existing connections to Verdant's chosen credit bureau and the UK electoral roll database. The document verification processed images reliably. The analyst workflow interface reduced the per-review time from an average of 22 minutes (manual process) to approximately 14 minutes (tool-assisted), because analysts no longer needed to manually enter data extracted from documents.

The FCA audit log was a success. Every decision — automated or analyst — was logged with timestamp, decision, rationale, data inputs, and the analyst ID where applicable. When the FCA conducted a sample review three months post-implementation, every sampled decision could be produced with full documentation within minutes.
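An audit-log entry of this shape — timestamp, decision, rationale, data inputs, analyst ID — might be structured as below. This is a minimal sketch assuming append-only JSON lines; the field names are illustrative, and a production log would also carry a case reference and tamper-evident sequencing.

```python
import json
from datetime import datetime, timezone
from typing import Optional

def log_decision(decision: str, rationale: str, data_inputs: dict,
                 analyst_id: Optional[str] = None) -> str:
    """Serialize one KYC decision as a JSON audit-log line."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "decision": decision,        # e.g. "approve", "refer", "decline"
        "rationale": rationale,      # human-readable basis for the decision
        "data_inputs": data_inputs,  # eIDV score, flags, document references
        "analyst_id": analyst_id,    # None for fully automated decisions
        "automated": analyst_id is None,
    }
    return json.dumps(entry)
```

Recording `automated` explicitly, rather than inferring it from a missing analyst ID, makes sampling automated versus manual decisions straightforward during a supervisory review.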

What didn't go well:

The eIDV match rate for non-UK nationals was lower than projected — approximately 60% achieving the 0.85 threshold versus the vendor's projections of 75%. This meant more applications than expected were routed to the standard or enhanced track, which initially overwhelmed the analyst team.

The fix: Maya reconfigured the threshold for non-UK nationals specifically (accepting a lower eIDV score for this segment before requiring document verification), while maintaining higher standards for UK nationals where the eIDV data was more reliable.
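A segment-specific threshold scheme of this kind could be expressed as a lookup table. The non-UK cut-off values below are assumptions for illustration — the case study does not state the exact revised figures.

```python
# Per-segment eIDV thresholds after recalibration.
# UK-national values come from the original decision tree;
# the non-UK values are ASSUMED, illustrative numbers only.
EIDV_THRESHOLDS = {
    # segment: (fast_track_min, document_verification_min)
    "uk_national":     (0.85, 0.70),
    "non_uk_national": (0.78, 0.62),  # hypothetical lowered cut-offs
}

def track_for_score(segment: str, score: float) -> str:
    """Map an eIDV score to a track using segment-specific thresholds."""
    fast_min, doc_min = EIDV_THRESHOLDS[segment]
    if score >= fast_min:
        return "fast"
    if score >= doc_min:
        return "standard"
    return "enhanced"
```

Keeping the thresholds in data rather than code is what makes this kind of recalibration a configuration change rather than a redeployment.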

The adverse media screening generated a high volume of false positives for common names — "David Jones" or "Sarah Ali" matched news articles about unrelated people with the same name. This required a name disambiguation step (cross-referencing the matched article's details against the customer's profile) that the system initially lacked.
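One way to sketch such a disambiguation step: require a minimum number of profile details (age, location, occupation) to corroborate the matched article before treating the hit as genuine. The rule and field names below are hypothetical, not the system's actual logic, and a real deployment would likely route ambiguous hits to manual review rather than dismissing them.

```python
def corroborates(article: dict, customer: dict, min_signals: int = 2) -> bool:
    """Return True if enough article details corroborate the customer's
    profile to treat an adverse media name-match as genuine."""
    signals = 0
    # Age: article-stated age within a year of the customer's.
    if article.get("age") is not None and customer.get("age") is not None:
        if abs(article["age"] - customer["age"]) <= 1:
            signals += 1
        else:
            return False  # a clear age mismatch rules the hit out
    text = article.get("text", "").lower()
    # Location: the article mentions the customer's town.
    if customer.get("town") and customer["town"].lower() in text:
        signals += 1
    # Occupation: the article mentions the customer's declared occupation.
    if customer.get("occupation") and customer["occupation"].lower() in text:
        signals += 1
    return signals >= min_signals
```

Under this rule, a "David Jones" article about someone of a clearly different age is discarded immediately, while an article matching the customer's age, town, and occupation is escalated.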


Outcome: Six Months Post-Implementation

| Metric | Pre-Automation | 6 Months Post |
|---|---|---|
| Median onboarding time | 19 days | 4 days |
| Automated approvals | 0% | 73% |
| Analyst time per manual review | 22 min | 14 min |
| KYC errors (misapprovals identified post-hoc) | ~5% of sampled cases | <1% of sampled cases |
| Customer abandonment (14-day window) | 18% | 9% |
| Cost per KYC decision | £47 (estimated) | £8 (estimated) |
| FCA assessment | "Material concern" | "Material improvement" |

Discussion Questions

1. Maya spent six weeks in design before selecting a vendor. What are the risks of doing design without a vendor and then finding the chosen design is not achievable with the selected vendor? What is the alternative approach?

2. The eIDV match rate for non-UK nationals was lower than projected. This is a common issue with eIDV in diverse demographic markets. What are the compliance implications of a KYC process that is systematically less accurate for certain demographic groups?

3. The adverse media false positive problem — common names matching unrelated articles — required a disambiguation step. Design a disambiguation rule that would reduce false positives while maintaining appropriate sensitivity to genuine adverse media.

4. The estimated cost per KYC decision fell from £47 to £8. What costs go into the "£47" estimate? What might be underestimated in the "£8" estimate?

5. The FCA audit log proved immediately valuable during a supervisory review. What specific elements of the audit log would an FCA examiner be looking for in a sample KYC decision review?