Spaghetti Junction

September 26, 2016

By Bryan Adare.

Repetition might help when it comes to learning lines for a play or practising for a concert, but it is a hindrance when it comes to financial data. A period of consolidation in the banking sector has left a handful of large players controlling many of the world’s financial transactions and, with them, the related data, be it personal details, monetary values or exchange rates, to name but a few. This means that each institution holds several sets of similar data, which are unruly and difficult to handle – effectively a pile of spaghetti.

The key is to merge this information into one system and eradicate any duplication. As well as improving regulatory compliance, reorganising the different data channels helps businesses gain a better understanding of their operations, enabling advanced analytics, better customer service and enhanced cross-selling capabilities.

“The key is to remove the spaghetti architecture that currently exists in many banks as a result of years of mergers and acquisitions,” says Brickendon Executive Director, Nathan Snyder. “Many institutions have too much duplicated information housed in too many different places and it is time to address this issue.”

So how do you go about doing this? After all, old habits are hard to break, especially when legacy systems are already in place.

The first step in the data consolidation process is to analyse the existing data types and storage methodologies. The most common are:

Centralised data stores: a one-stop shop for data capture, on-boarding, data storage and distribution for the entire firm.

Standalone databases: the data contained within these stores will generally be specific to the requirements of the respective system, i.e. a database which feeds into a single trading system and only contains information for that specific system, product or asset class.

Manual static data: data is stored in plain-text, comma-separated values (CSV) or Excel documents and is maintained and manipulated by one or many operators. This data type carries many inherent and complex risks, in particular the possibility of human error, yet many legacy systems and processes still rely on manually maintained static data.
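The risks of manually maintained static data can be made concrete with a quick audit script. The sketch below is illustrative only: the file layout, the `counterparty_id` key and the sample values are assumptions, not a real bank extract. It flags two classic manual-entry problems, duplicate identifiers and blank fields.

```python
import csv
import io
from collections import Counter

# Hypothetical extract from a manually maintained static-data file;
# the column names and values are illustrative only.
RAW = """counterparty_id,name,currency
CP001,Acme Bank,USD
CP002,Globex Ltd,GBP
CP001,Acme Bank Plc,USD
CP003,Initech,
"""

def audit_static_data(raw_csv):
    """Flag duplicate keys and rows with missing values in a manual data file."""
    rows = list(csv.DictReader(io.StringIO(raw_csv)))
    ids = Counter(r["counterparty_id"] for r in rows)
    duplicates = sorted(k for k, n in ids.items() if n > 1)
    incomplete = sorted(r["counterparty_id"] for r in rows
                        if not all(v.strip() for v in r.values()))
    return duplicates, incomplete

dups, missing = audit_static_data(RAW)
print(dups)     # duplicated identifiers
print(missing)  # rows with blank fields
```

Here the audit surfaces CP001 twice (with two different spellings of the name) and CP003 with a missing currency, exactly the kind of drift that accumulates when spreadsheets are the system of record.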

The next stage in the consolidation process is to develop a business plan to show the impacts and benefits of migration.

The benefits include:
  • Simplified architecture: a logical data model can be adapted to accept both data sets and data can be stored in a more efficient structure. Reduction of orphaned items and simplification of data architecture will enable more straightforward data extension and distribution to the systems and processes which leverage it.
  • Reduced complexity: consolidation of systems reduces the complexity of data storage, on-boarding and distribution.
  • System decommissioning: obsolete, outdated and surplus systems can be decommissioned.
  • Lowered costs: a single system requires less maintenance and has lower technical operations and service costs.
  • Streamlined distribution: easier distribution to the systems and processes which require the data and faster on-boarding of additional systems.
  • Enhanced data: data remediation efforts are often required when consolidating data and this can improve accuracy, as well as providing an opportunity to revisit data architecture to ensure the best fit.

The risks include:
  • Delivery risk: overly ambitious timelines and delivery expectations can lead to project failure or an inability to deliver the proposed scope.
  • Cost of migration: migrating data can be expensive.
  • Resources required: significant project and delivery resources will be required to complete the programme.
  • Project management capabilities: a lack of onsite capabilities to manage such a sizable and complex project can significantly hinder the ability to deliver the programme successfully.

Once the business is convinced of the merits of consolidating its data, it is necessary to consider which methodology should be used. This depends on the type, size and complexity of the data, as well as the fit within the existing architecture.

The available options include:

  • Overwrite: a complete overwrite of data can be useful when migrating from obsolete systems which were previously primary data stores.
  • Compare and combine: this is most appropriate when there are two (or more) data sources for consolidation and their data type and architecture are aligned. It enables a complete data set to be created which contains all the data from both (or all) previous sources. A comparison-engine tool can automate much of the comparison process, while human operators review and decide on any duplicates or exceptions identified.
  • Maintain separately: there will be some situations where it is beneficial to maintain data sources separately but consolidate them into one system to leverage the benefits of integration, for example where data types or architectures differ, or where regulatory or Chinese-wall considerations apply.
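The compare-and-combine option can be sketched in a few lines. The record layout, the choice of `counterparty_id` as the match key and the conflict handling below are all illustrative assumptions rather than a prescribed design: records unique to one source, or identical in both, are merged automatically, while conflicting records are routed to an operator for a decision.

```python
def compare_and_combine(source_a, source_b, key="counterparty_id"):
    """Merge two record sets keyed on `key`.

    Records present in only one source, or identical in both, are merged
    automatically; conflicting records are returned as exceptions for a
    human operator to resolve.
    """
    a = {r[key]: r for r in source_a}
    b = {r[key]: r for r in source_b}
    combined, exceptions = {}, []
    for k in sorted(a.keys() | b.keys()):
        ra, rb = a.get(k), b.get(k)
        if ra is not None and rb is not None and ra != rb:
            exceptions.append((ra, rb))   # conflict: human decides which survives
        else:
            combined[k] = ra or rb        # unique or identical: auto-merge
    return list(combined.values()), exceptions

# Illustrative feeds from two hypothetical source systems.
trading = [{"counterparty_id": "CP001", "name": "Acme Bank"},
           {"counterparty_id": "CP002", "name": "Globex Ltd"}]
risk    = [{"counterparty_id": "CP002", "name": "Globex Limited"},
           {"counterparty_id": "CP003", "name": "Initech"}]

merged, review_queue = compare_and_combine(trading, risk)
print(len(merged))        # CP001 and CP003 merged automatically
print(len(review_queue))  # CP002 conflicts, queued for an operator
```

A production comparison engine would add fuzzy matching, field-level precedence rules and an audit trail, but the split between automated merging and human exception handling is the essential shape of the approach.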

Irrespective of which option you decide to go with, it is important to ensure that the change plan is communicated well to the key people in the business. Training will need to be provided and the necessary tools made available for staff to familiarise themselves with the new system and processes. Once in place, the new data system will require ongoing support and, where necessary, upgrades.

As with anything, dealing with fewer systems should be easier; however, the importance of thorough planning, preparation and, equally importantly, communication should not be underestimated. If less is more, then preparation is paramount.
