To set the scene, this global bank took over the back-office processing of a number of asset-management companies. Although this was effective from a business perspective, the IT department ended up with multiple systems running different technology stacks, all performing similar activities. The challenge for the bank was to deliver on the anticipated economies of scale by consolidating everything into a new processing system. However, with many tens of thousands of interfaces and reports across the different systems, this was a much more complex task than had been imagined. The analysis effort alone was a sizable and costly piece of work that affected program timelines and delayed the realization of any benefits that the new system would deliver.
Much of the analysis phase of the project was a metadata challenge: to relate data about the existing systems to the new system, it was important to understand how data was processed and how it flowed through the different systems. All of this had to be done while both the existing and new systems were continually changing.
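The kind of lineage metadata described here can be pictured as a directed graph: systems are nodes, data feeds are edges, and analysts trace how a change ripples downstream. The following is a minimal sketch of that idea; the class and the system names are illustrative assumptions, not Ab Initio's actual model.

```python
from collections import defaultdict

# Minimal sketch of a lineage graph: which system feeds which.
# All names here are hypothetical, for illustration only.
class LineageGraph:
    def __init__(self):
        self.edges = defaultdict(set)  # source system -> downstream systems

    def add_flow(self, source, target):
        self.edges[source].add(target)

    def downstream(self, system):
        """Return every system reachable from `system` (depth-first walk)."""
        seen, stack = set(), [system]
        while stack:
            node = stack.pop()
            for nxt in self.edges[node]:
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append(nxt)
        return seen

graph = LineageGraph()
graph.add_flow("legacy_trades", "settlement")
graph.add_flow("settlement", "reporting")
print(sorted(graph.downstream("legacy_trades")))  # ['reporting', 'settlement']
```

With such a graph in place, an impact-analysis question ("what breaks if we retire this feed?") becomes a reachability query rather than a spreadsheet hunt.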
Initial attempts at the analysis were carried out using spreadsheets, but it quickly became clear that the process was unmanageable. The number of spreadsheets was growing exponentially, and this limited the bank’s ability to check for completeness and accuracy. They soon realized that a sophisticated metadata-management system was their only hope.
Up to this point, no Ab Initio products were in use within that division, but after an intensive evaluation of metadata technologies, Ab Initio's Enterprise Meta>Environment (EME) technology was selected as the foundation for the analysis work.
The bank proceeded to capture metadata about all of its legacy applications. This metadata in turn drove the new application architecture, the new interfaces and reports, and a data dictionary, enabling the bank to fully support the analysts in their work.
Much of the metadata required to support this process was captured automatically from non-Ab Initio technologies, including databases, modeling tools, reporting environments and “ETL” products. Additionally, more than 1,000 SQL stored procedures were parsed to automatically extract embedded data-transformation rules and mappings to further enrich the metadata picture.
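The source does not describe how the stored procedures were parsed, but the core idea of extracting mappings from SQL can be sketched in a few lines. The toy regex below handles only the simplest case, an `INSERT ... SELECT` statement, and the procedure text is invented for illustration; a production parser would need a full SQL grammar.

```python
import re

# Illustrative sketch only: a toy scan that pulls (target, source) table
# pairs out of INSERT ... SELECT statements in stored-procedure text.
# A real parser would handle joins, subqueries, aliases, and dialects.
MAPPING = re.compile(
    r"INSERT\s+INTO\s+(\w+).*?FROM\s+(\w+)",
    re.IGNORECASE | re.DOTALL,
)

def extract_mappings(procedure_sql):
    """Return (target_table, source_table) pairs found in the SQL text."""
    return list(MAPPING.findall(procedure_sql))

# Hypothetical stored procedure, invented for this example.
proc = """
CREATE PROCEDURE load_positions AS
BEGIN
    INSERT INTO positions (acct, qty)
    SELECT account_id, quantity FROM legacy_holdings;
END
"""
print(extract_mappings(proc))  # [('positions', 'legacy_holdings')]
```

Run over a thousand procedures, even a crude scan like this yields a first-cut table-to-table mapping that analysts can refine, which is the enrichment step the passage describes.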
The system was operational within a couple of months, at which point the bank saw an immediate leap in productivity. The analysts needed much less time to understand existing functionality, and specifying new requirements was much faster. The specification-writing step was eliminated entirely from the development lifecycle, and the automatically generated specifications were consistent, complete, and generally of higher quality.
According to the bank, interface and report definition now requires 80% less effort than under the previous process. Furthermore, with monthly savings in the hundreds of thousands, the metadata solution paid for itself within three months and shortened the program's critical path by six months. The bank also now has visibility into and control over the program, together with real agility: the team can assess and activate a wide range of change projects far faster than ever before.
In the end, the academics were right: there is enormous benefit in metadata, but only if you have a sound vision and the deep technology to support that vision.