Case Study 37.1: The System Nobody Used — Cornerstone's Failed Regulatory Reporting Transformation
The Situation
Organization: Cornerstone Financial Group (a fictional composite institution)
Function: Regulatory reporting
Challenge: A technically complete regulatory reporting platform, implemented at a cost of £2.3M, was eighteen months post-go-live with only 38% adoption; the majority of reports were still being produced manually
Primary characters: The reporting team lead (James), the project manager (Alicia), Cornerstone's newly appointed Head of Change Management (Natasha), and Priya Nair's advisory team, brought in to diagnose and remediate
The Investment Case
In early 2023, Cornerstone's regulatory affairs function faced a growing reporting burden. The number of regulatory submissions had increased by 40% over three years as EMIR, MiFIR, and SFTR obligations expanded. The team's manual processes (spreadsheet-based aggregation, manual data validation, emailed files) were creating operational risk. A single analyst's error in a Q3 2022 MiFIR submission had resulted in an FCA engagement requiring a corrections process.
The solution seemed straightforward: implement an integrated regulatory reporting platform. The procurement process was thorough. Cornerstone selected a tier-one vendor with a proven installation base. The implementation was project-managed. Training was conducted. Go-live occurred in September 2023, on schedule.
By February 2025, the system was technically operational. And largely unused.
What Priya Found When She Arrived
Priya Nair was engaged in February 2025 at Cornerstone's request, following a difficult quarterly business review in which the CFO asked why £2.3M had been spent on a system that 62% of its users were circumventing.
She conducted two weeks of structured interviews with the reporting team, the project team, and senior leadership. What she found was not a technology failure.
Finding 1: The ownership gap.
During the implementation project, the CCO had owned the program. There had been a project manager, a steering committee, weekly status reporting. When the project closed in October 2023 — on time, on budget, scope delivered — the project governance structures dissolved.
No one was designated to own the platform post-go-live. The CCO assumed that because the implementation was complete, the system was "running." The reporting team lead, James, had been a passive stakeholder during the implementation: he had attended training and provided feedback in one UAT session, and then the project had declared success and moved on.
James had not been asked whether he would adopt the system. He had not been asked to commit to using it. He had not been given authority over how his team used it. And when the project team left, he made a quiet organizational decision: for now, they would continue producing reports manually, supplementing with the new system where it was clearly better, until they had more confidence in it.
"More confidence" never arrived. Because there was no governance mechanism, no monitoring, and no one asking.
Finding 2: The training gap.
Training had occurred in August 2023, one month before go-live. By February 2025, most of the trained staff had turned over or been assigned to other roles. The three analysts who had received the deepest training were no longer on the reporting team. New team members had received no training at all; they had learned the manual processes from colleagues, many of whom had themselves been only partially trained on the system.
The knowledge that existed in the reporting team as of February 2025 was almost entirely knowledge of the manual process. The system was a sophisticated tool whose capabilities were almost completely unknown to its intended users.
Finding 3: The incentive misalignment.
The reporting team's performance metrics — accuracy of submissions, on-time rate, error rate — were the same after the new system as before. There was no metric that measured system usage. There was no incentive to adopt the platform, and there was a significant disincentive: the risk that a new, unfamiliar system would produce errors in regulatory submissions with potential FCA consequences.
James had made a rational decision given his incentives. The cost of system adoption (risk of errors, time to build competence, unfamiliarity) outweighed the benefit (no direct reward for using it). The manual process was slower and more resource-intensive, but it was reliable and familiar.
Finding 4: The supervisor's implicit message.
Priya asked the CCO how often he reviewed reports from the new system. He said he reviewed the regulatory submissions attached to briefing papers. He had never asked how those reports were being produced, and he had never asked James for an update on adoption.
From James's perspective, the CCO's lack of inquiry was tacit approval of continuing the manual process.
The Remediation Program
Priya's recommendation was not a re-implementation. The system was technically capable. The problem was organizational. She proposed a 90-day adoption program.
Weeks 1–2: Governance establishment. A named system owner was appointed: James, with explicit accountability for adoption. Monthly adoption reporting was added to the CCO's management information pack. An adoption target was set: 85% of reports produced through the system by the end of the 90 days (a sketch of how such a metric might be computed follows the program steps below).
Weeks 3–4: Team retraining. A complete retraining program was designed, this time led by James, who attended a train-the-trainer session with the vendor. Retraining was structured around the actual reports the team produced, not system features, and included the five most common error scenarios from the team's first six months of intermittent usage.
Month 2: Supervised adoption. For 30 days, each report was produced in the system and reviewed by James before submission. Errors were caught and corrected in the system, building both competence and confidence. James maintained an issues log that was shared with the CCO weekly.
Month 3: Full adoption and monitoring. The manual process was formally retired: the old spreadsheet templates were removed from the shared drive and archived, not deleted. System-generated reports became the only accepted format for regulatory submissions.
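To make the governance step concrete, the sketch below shows one way a monthly adoption metric of this kind might be computed for a management information pack. It is a minimal illustration, not Cornerstone's actual tooling; the ReportRecord data model, its field names, and the mi_pack_line helper are assumptions introduced here.

```python
# Minimal sketch of a monthly adoption metric, assuming a simple log of
# report records. Field names are illustrative, not a real data model.
from dataclasses import dataclass

@dataclass
class ReportRecord:
    report_id: str
    period: str               # reporting month, e.g. "2025-03"
    produced_in_system: bool  # True if generated through the platform

ADOPTION_TARGET = 0.85  # the 90-day target set in Weeks 1-2

def adoption_rate(records: list[ReportRecord], period: str) -> float | None:
    """Share of the period's reports produced through the platform."""
    in_period = [r for r in records if r.period == period]
    if not in_period:
        return None
    return sum(r.produced_in_system for r in in_period) / len(in_period)

def mi_pack_line(records: list[ReportRecord], period: str) -> str:
    """One line for the monthly management information pack."""
    rate = adoption_rate(records, period)
    if rate is None:
        return f"{period}: no reports recorded"
    status = "on target" if rate >= ADOPTION_TARGET else "below target"
    return f"{period}: adoption {rate:.0%} ({status})"
```

The point of the sketch is the governance design rather than the code: adoption becomes a number that a named owner is accountable for, reported on a fixed cycle, against an explicit target.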
At 90 days, adoption stood at 91%. At six months, the team's report production time had fallen by 34%. Two junior analysts reported that they now found the manual process intimidating; the system had become the familiar environment.
The Governance Lesson
Cornerstone's board asked Priya what the organization should have done differently. Her answer was precise:
"You needed two project managers: one who owned the technical implementation, and one who owned the change. The technical project ended in October 2023. The change project should have run for another twelve months — through supervised adoption, retraining, metric establishment, and the formal retirement of the manual process. You closed the project when the system was installed. You should have kept it open until the system was adopted."
The technical project had been well-managed. The change program had never existed.
Discussion Questions
1. The reporting team lead (James) made what Priya described as "a rational decision given his incentives." His decision was to continue manual production. What incentive structures would have made system adoption the rational choice? How should these have been designed before go-live?
2. The "ownership gap" — no named system owner after project close — allowed non-adoption to persist undetected for 18 months. Design a post-go-live governance structure for a compliance technology platform that would have surfaced this problem within 30 days.
3. The CCO's lack of inquiry about how reports were being produced was interpreted by the team lead as tacit approval of continuing the manual process. How should compliance leadership visibly reinforce technology adoption without micromanaging the operational team?
4. New team members had received no training at all — they had learned the manual process from colleagues. How should a training program be designed to ensure that team turnover does not erode system adoption over time?
5. Priya recommended "retiring" the manual process — archiving the spreadsheets rather than deleting them. Is this the right balance? When, if ever, should firms completely delete previous process tools? What are the compliance implications of keeping vs. removing old process documentation?