Case Study 35.2 — The Governance Vacuum: Why Cornerstone's RegTech Programme Stalled
Overview
Organization: Cornerstone Financial Group (fictional composite)
Type: UK mid-tier bank with retail, SME, and institutional divisions
Investment: £2.3 million regulatory reporting platform
Problem: 18 months after go-live, only 40% of reports generated through the new system
Root cause: Governance vacuum — no designated post-production ownership
Resolution: Explicit ownership assignment, accountability framework, and incentive alignment
Background: The Successful Implementation That Wasn't
By every technical measure, Cornerstone Financial Group's new regulatory reporting platform was a success. It had been delivered on time, broadly within budget, and had passed a rigorous programme of user acceptance testing. The implementation team had done what they were paid to do: they had built and deployed a regulatory reporting system capable of producing all eighteen of Cornerstone's material regulatory reports automatically, with full data lineage and an audit trail that the FCA could examine on request.
The go-live celebration had been genuine. The programme director sent a note to the management committee describing the deployment as "a significant step change in Cornerstone's regulatory reporting capability." The CCO, who had sponsored the project from its inception, described it in the quarterly board pack as "transforming our ability to meet our regulatory obligations efficiently and accurately." The vendor presented the implementation as a case study at an industry conference.
Eighteen months later, Rafael Torres was called in to find out why only forty percent of the firm's regulatory reports were being generated through the new system while sixty percent were still being produced manually in Excel.
The Investigation: What Rafael Found
Rafael Torres had spent fifteen years as a compliance technology consultant, the last four specialising in post-acquisition integration. He had seen many technology implementations, and he was not surprised by what he found at Cornerstone: he had seen it before.
The investigation took three weeks of structured interviews, process observation, and documentation review. The findings were organised around three questions: who uses the system, why don't more people use it, and whose job is it to make them use it.
Who uses the system? The platform was being used, consistently and correctly, by two teams: the Prudential Reporting team (responsible for COREP and FINREP submissions), which had been closely involved in the implementation and had two team members who had participated in the user acceptance testing; and a portion of the Transaction Reporting team that covered MiFID II trade reports. Between them, these teams were responsible for approximately 40% of Cornerstone's material regulatory reports.
Why don't more people use it? The remaining teams — responsible for CMAR submissions, AML regulatory reporting, FCA product sales data, and several smaller regulatory returns — had continued using their existing Excel-based processes. When Rafael interviewed the leads of these teams, the responses were consistent:
"The system was built for the Prudential team. Nobody ever really explained to us what we were supposed to do with it."
"We did a two-hour training session during the rollout. That was a year ago. I don't remember how to use it."
"I asked IT about access six months ago. They said there was a queue for user provisioning. I just carried on with the spreadsheet."
"I know the system exists. But my manager never said I needed to use it. So I use what I know works."
Whose job is it to make them use it? This was the most revealing question. Rafael asked it of every person he interviewed: who is responsible for ensuring that all regulatory reports are produced through the new platform?
The answers revealed the governance vacuum precisely.
The programme director said: "The implementation was completed to specification. Our job was to deliver the platform. Change management during implementation was in scope. Ongoing adoption is an operational matter."
The CCO said: "I sponsored the project and we delivered it. I expected the business lines to adopt it. I assumed the operations function was managing that."
The COO said: "I wasn't involved in the implementation. I didn't know we had an adoption problem until this review."
The Head of Regulatory Reporting — the most senior individual with daily responsibility for producing the reports in question — said: "Nobody told me the new system was my responsibility. I manage the reporting process. I use whatever tools are available. The tools I know are the spreadsheets."
The platform had been deployed into an ownership vacuum. During the implementation project, the CCO owned the project. The programme director managed delivery. The implementation team built and tested the system. But when the project formally closed and the programme director moved to her next engagement, ownership of the live system did not transfer to anyone. There was no designated system owner. There was no adoption target. There was no accountability mechanism for whether the system was being used.
The Root Cause Analysis
Rafael's root cause analysis identified two interacting failures.
The governance vacuum: No one was assigned ownership of the system's post-production outcomes. The programme formally closed upon go-live, as programmes do. The assumption — implicit, unexamined — was that the business lines would adopt the system because it was there and because it was better than their existing tools. This assumption had no mechanism behind it: no named owner, no adoption target, no review process, no management escalation if adoption stalled.
The change management gap: The training and communication programme during implementation had been scoped for the two teams who participated in user acceptance testing — the teams who were using the system. The teams who were not involved in the implementation had received a single two-hour training session twelve months earlier, with no follow-up, no reinforcement from management, no process documentation tailored to their use cases, and no easy access to support when they encountered problems.
The combination of these two failures produced a predictable outcome: the teams that had been directly supported through implementation adopted the system; the teams that had not been supported did not. In the absence of an owner with accountability for overall adoption, the gap compounded over time. Each month that passed made the Excel-based processes more entrenched and the adoption problem harder to address.
The financial and regulatory cost was material. The manual Excel processes were producing reports with an error rate of approximately 3.2% — a rate that had been deemed acceptable before the new system existed but that was clearly inadequate against the FCA's stated expectations for reporting accuracy. Two of the five manual Excel report types had errors identified in the firm's last internal audit, one of which had required a regulatory notification. The cost of the manual processes — staff time, error remediation, audit findings — was approximately £340,000 per year above what the platform would have cost to operate.
The Resolution: Making Ownership Real
Rafael's recommendations were operational rather than technical. The platform was sound. The problem was human governance, and the solution was human governance.
Step 1: Assign explicit post-production ownership. A named owner was designated for each of the eighteen regulated reports: the individual who would be accountable for ensuring that report was produced correctly, on time, and through the new platform. For each owner, accountability was documented in their role profile and their performance objectives. Critically, the accountability was positive — not "don't produce this report wrong" but "you are accountable for the quality and timeliness of this report, and you are expected to use the platform to produce it."
The designation of named owners was not presented as a punitive action — nobody had been told that using the platform was their job, so it would have been unfair to hold them accountable for not knowing. It was presented as a clarification of how the firm intended to operate going forward.
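Ownership of this kind is easiest to hold people to when it is captured in a register that governance reviews can query directly. The sketch below is illustrative only: the ReportOwnership structure, field names, and example entries are hypothetical, not Cornerstone's actual artefact.

```python
from dataclasses import dataclass

# Hypothetical structure and entries -- not Cornerstone's actual register.
@dataclass(frozen=True)
class ReportOwnership:
    report: str         # e.g. "COREP", "FINREP", "CMAR"
    owner: str          # the named individual accountable for the report
    produced_via: str   # "platform", or "excel" during the transition
    cutover_date: str   # ISO date after which the Excel process is retired

register = [
    ReportOwnership("COREP", "owner-a", "platform", "2024-01-15"),
    ReportOwnership("CMAR", "owner-b", "excel", "2024-03-01"),
    # ... one entry per material regulatory report (eighteen in total)
]

def unowned(entries: list[ReportOwnership]) -> list[str]:
    """The governance-vacuum check: every report must name an owner."""
    return [e.report for e in entries if not e.owner.strip()]
```

An empty result from unowned is precisely the property the quarterly review in Step 4 can assert each cycle; at Cornerstone, the equivalent check would have failed on the day the programme formally closed.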
Step 2: Build the change management programme that should have been built during implementation. For each of the non-adopting teams, Rafael designed a structured onboarding programme: a user journey walkthrough of the specific reports that team was responsible for (not a generic system demo); an updated quick reference guide specific to their report types; a two-week "shadowing period" in which their existing Excel process ran in parallel with the platform process so users could see that the platform produced equivalent outputs; and a formal cutover date after which the Excel process would be retired.
The cutover dates were communicated to the team leads and to their managers. The managers were asked explicitly to confirm that they understood their teams would be transitioning to the platform and to actively reinforce the expectation with their teams.
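A shadowing period persuades users only if discrepancies between the two processes surface quickly. A minimal sketch of such a parallel-run reconciliation, assuming both outputs can be loaded as pandas DataFrames sharing a key column (the function and all names here are illustrative, not part of the actual programme):

```python
import pandas as pd

def reconcile(legacy: pd.DataFrame, platform: pd.DataFrame, key: str) -> pd.DataFrame:
    """Return the cells where the legacy Excel output and the platform output differ.

    Both frames are assumed to carry the same columns and one row per key value.
    An empty result is the evidence shown to users during the shadow run.
    """
    legacy = legacy.set_index(key).sort_index()
    platform = platform.set_index(key).sort_index()[legacy.columns]

    missing = legacy.index.symmetric_difference(platform.index)
    if not missing.empty:
        raise ValueError(f"rows present in only one output: {list(missing)}")

    # DataFrame.compare keeps only the differing cells, side by side
    return legacy.compare(platform)
```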
Step 3: Address the access backlog. The IT user provisioning queue had been a genuine blocker for several users who had attempted to access the system. Rafael escalated this to the COO, who established a service level commitment: new user access requests would be processed within five business days. The access backlog was cleared within two weeks.
Step 4: Establish ongoing governance for the live system. A new quarterly review process was established, chaired by the Head of Regulatory Reporting (now the named system owner for overall platform performance), with attendance from the CCO, the data team, and the technology team responsible for platform maintenance. The quarterly review covered: adoption rate by report type; error rate by report type; data quality issues surfaced by the platform; pending platform updates required by regulatory change; and any issues escalated by report owners.
This was not a heavy governance structure. It was the minimum viable governance for a system of material importance. Its absence for the first eighteen months of the platform's life was the cause of the problem.
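The first two items in that review pack reduce to simple counting. A sketch of how the adoption and error rates might be derived from a quarter's submission log (the record format is hypothetical; real inputs would come from the platform's audit trail and the manual process log):

```python
from collections import defaultdict

# Hypothetical log: one (report_type, channel, had_error) record per submission.
submissions = [
    ("COREP", "platform", False),
    ("CMAR", "excel", True),
    ("CMAR", "excel", False),
    # ... remaining submissions for the quarter
]

def quarterly_metrics(records):
    """Adoption rate and error rate per report type, for the review pack."""
    tallies = defaultdict(lambda: {"total": 0, "platform": 0, "errors": 0})
    for report, channel, had_error in records:
        t = tallies[report]
        t["total"] += 1
        t["platform"] += channel == "platform"
        t["errors"] += had_error
    return {
        report: {
            "adoption_rate": t["platform"] / t["total"],
            "error_rate": t["errors"] / t["total"],
        }
        for report, t in tallies.items()
    }
```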
Results
Twelve weeks after Rafael's recommendations were implemented:
- Adoption rate increased from 40% to 87% of regulated reports produced through the platform
- The two remaining non-adopting report types (legacy submissions with bespoke data requirements) were in active implementation, with cutover planned for month four
- The manual error rate on reports transitioning to the platform dropped from 3.2% to 0.4%
- The user provisioning backlog was eliminated
- The Head of Regulatory Reporting had convened the first quarterly review meeting
At eighteen months post-intervention, platform adoption stood at 100% of material reports. The annual cost saving versus the manual process — approximately £340,000 — was being realised. The FCA's follow-up review of Cornerstone's reporting processes noted improvement in accuracy and in the quality of audit trail documentation.
The total cost of the eighteen-month governance vacuum — wasted platform investment returns, manual process costs above platform costs, audit findings, and regulatory notification costs — was estimated by Cornerstone's finance team at approximately £680,000.
The cost of the governance remedy — Rafael's engagement, the change management programme, and the ongoing quarterly review — was approximately £85,000.
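The headline figures reconcile as straightforward arithmetic. The worked check below assumes the £340,000 annual excess accrued linearly over the eighteen months; the source does not break out the remaining balance across audit findings, notification costs, and foregone investment returns, so it is left as a single residual.

```python
annual_excess = 340_000      # manual process cost above platform cost, per year
vacuum_months = 18

manual_excess = annual_excess * vacuum_months / 12   # 510,000.0
total_vacuum_cost = 680_000                          # finance team's estimate
residual = total_vacuum_cost - manual_excess         # 170,000.0: audit findings,
                                                     # notifications, foregone returns
remedy_cost = 85_000
ratio = total_vacuum_cost / remedy_cost              # 8.0 -- the "roughly 8:1" in Key Lessons
```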
Key Lessons
A system that goes live without a designated owner is not live — it is waiting to fail. The Cornerstone platform was technically live but operationally inert for 60% of its intended use. Technical go-live is not programme completion. Operational adoption is programme completion.
Change management cannot be an afterthought for non-participating teams. The teams that participated in user acceptance testing adopted the system because they understood it. The teams that did not participate did not adopt it because they did not understand it and had not been required to learn. Change management must cover all intended users, not just the teams involved in the build.
Management reinforcement is not optional. Users will use the tools their managers tell them to use. If managers do not reinforce the expectation that the new system is the required working method, users will continue using the familiar old process. The absence of explicit management reinforcement at Cornerstone allowed sixty percent of users to opt out of the new system without consequences.
Governance must be scoped for post-production life, not just implementation. The implementation project had governance. The live system had none. The programme director was accountable for delivery; nobody was accountable for adoption. Post-production governance — an owner, a performance review process, an escalation path — must be designed before go-live, not after the problem becomes visible.
The cost of the governance vacuum almost always exceeds the cost of the governance remedy. At Cornerstone, the governance vacuum cost approximately £680,000. The remedy cost £85,000. This ratio — roughly 8:1 — is typical in Priya's and Rafael's combined experience. The investment required to establish and maintain post-production governance is modest relative to the cost of the failure mode it prevents.
This case study illustrates the governance vacuum failure pattern from Section 35.8.3 and the change management gap failure pattern from Section 35.8.4. It also illustrates the importance of the post-production ownership question in Priya's Five Questions framework (Question 2: "Who will own the output of this system day-to-day, and do they know that?").