Part 3: Off-the-shelf versus do-it-yourself

The 1970 space mission Apollo 13 famously featured one of the finest pieces of patched-together engineering ever seen on or indeed above this planet: making a square peg fit a round hole.

The Mailbox; credit: NASA

In Houston, legions of expert NASA engineers worked alongside the backup crew to improvise a way of fitting a square carbon dioxide filter into a system designed for circular cartridges, using only what the astronauts had on board: spacesuit hoses, pieces of cardboard, plastic bags and lots of silver duct tape.

Less famously, this fascinating, life-saving demonstration of ingenuity burned up in the atmosphere shortly after the crew left it behind on their safe return to Earth.

Now, retail banks might never face genuine life-and-death situations, but they are frequently challenged by problems that draw teams of engineers and operational staff into War Room settings, amid a search for a fix, workaround or hack that will save the day.

When faced with a new regulation or a new demand on the vast reams of data held by the bank, the temptation can be to follow the same pattern: assemble the consultants, engineers and specialists and try to figure out how to build a solution from the available knowledge, systems and parts.

But what if the answer to the problem could be bought instead – off-the-shelf and ready to go? What if it were a simple case of plugging in an existing, scoped and developed solution?

The pros and cons of build versus buy

It’s fair to say that any decision on building or buying a data quality solution needs to satisfactorily answer the following questions:

1. Will this do what I need, or just what I can do?

Your internal programmers and developers are well able to adapt your systems to add new data fields, system requirements or processes, but will these deliver the results the business wants to achieve, and to a known level of accuracy? For instance, if your existing data records are not cross-referenced against external data sources to aid de-duplication and augmentation, will it be cost-effective or even possible to develop that capability internally, or should a plug-and-play option be considered instead? And once the deployment is in place, can your end users adapt the solution themselves, or will every change have to go back to IT?
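To make the de-duplication point concrete, here is a minimal sketch of the kind of matching logic involved. The record fields (`name`, `postcode`) and the normalisation rules are illustrative assumptions, not a description of any particular bank's data; real matching against external reference data is considerably more involved.

```python
# A minimal, illustrative de-duplication sketch. Field names and the
# normalisation rules are hypothetical assumptions for this example.

def normalise(record):
    """Build a simple match key from name and postcode."""
    return (
        record["name"].strip().lower(),
        record["postcode"].replace(" ", "").upper(),
    )

def deduplicate(records):
    """Keep the first record seen for each match key."""
    seen = {}
    for rec in records:
        # setdefault keeps the first occurrence; later duplicates are dropped
        seen.setdefault(normalise(rec), rec)
    return list(seen.values())

customers = [
    {"name": "Jane Smith", "postcode": "AB1 2CD"},
    {"name": "jane smith ", "postcode": "ab12cd"},  # same customer, formatted differently
    {"name": "John Doe", "postcode": "EF3 4GH"},
]

print(len(deduplicate(customers)))  # 2 unique customers
```

Even this toy version hints at the real question: the matching rules themselves need designing, testing and maintaining, which is exactly the hidden cost a build decision has to account for.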

2. Will it deploy correctly?

Many IT changes are delayed because existing systems are rarely fully understood by everyone. Because a do-it-yourself build adapts core (and very frequently old) systems, it will require a significant level of scoping to comprehend how the processes currently work and to establish that no downstream impacts will occur. A bought solution designed for a specific task can sit alongside core systems and interface with them to the desired level; that way, differences in how data is used between completely different parts of the business do not have to affect customers or regulatory reporting further downstream.

3. Will it be possible to measure what is happening?

Internal teams will be able to assess the requirements and deliver a solution. After all, that’s why you continue to hire them! But can they measure the quality of the data, advise on its condition and conformity to your own or external standards, and give you guidance on how to fix anything that doesn’t fit? If you can’t view or report on whether the data is improving or deteriorating over time, the objective becomes pretty meaningless. (The first blog in this series looked at why data quality measurement cannot be ignored in retail banking.)
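As a rough illustration of what "measuring the quality of the data" can mean in practice, the sketch below computes two simple metrics, completeness and format conformity, for a snapshot of records. The field names and the UK-style postcode pattern are assumptions made for the example; a real programme would track many more dimensions and compare snapshots over time.

```python
# An illustrative data quality measurement sketch. Field names and the
# postcode pattern are hypothetical assumptions for this example.
import re

# Rough approximation of a UK postcode format, for illustration only
POSTCODE_RE = re.compile(r"^[A-Z]{1,2}\d[A-Z\d]? ?\d[A-Z]{2}$")

def quality_metrics(records):
    """Return the share of complete records and of conforming postcodes."""
    total = len(records)
    complete = sum(
        1 for r in records if all(r.get(f) for f in ("name", "postcode"))
    )
    conforming = sum(
        1 for r in records if POSTCODE_RE.match(r.get("postcode", ""))
    )
    return {
        "completeness": complete / total,
        "postcode_conformity": conforming / total,
    }

snapshot = [
    {"name": "Jane Smith", "postcode": "AB1 2CD"},
    {"name": "", "postcode": "EF3 4GH"},           # missing name
    {"name": "John Doe", "postcode": "not-a-code"},  # malformed postcode
]

print(quality_metrics(snapshot))
```

Running the same metrics against successive snapshots is what turns "data quality" from a slogan into a trend you can report on, which is the point of question 3.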

4. What training and support will be required?

Leveraging existing systems should mean little training is required, as users are familiar with the current setup. But how will the end user actually use the solution, and what interface will they have? What level of knowledge or interpretative skill will be required? What process manuals need to be written, tested and maintained? Will buying a solution prove more practical if training manuals are already part of the offering on the table? Will your existing development teams also provide support, or does the external vendor offer this as part of their deployment?

5. What budget is available?

Build-it-yourself can leverage staff already hired in key roles as well as knowledge of the operating systems. This can make the build option seem more cost-effective, but if that were truly the case it’s arguable that the fintech world wouldn’t exist. Banks would simply do all their own development instead of buying up or partnering with software specialists to deliver capability they cannot provide on their own.

The temptation can be to find one solution that solves every problem at an enterprise level, but being specific about one critical issue and running a proof of concept can lead to a better understanding of the problem, and of the relative merits and drawbacks of the built and bought options. The budget ought also to include the cost of things going wrong: how readily can the change be backed out, and what customer impact could occur?

In summary, the buy vs. build decision needs to take into account not just the time it will take to investigate and deploy the change, but also the subsequent time in maintenance and updates for development teams and downstream users.

If your internal solution delivers only what you can do, not what you need to do, then some level of manual workaround, such as interrogating data or running ad hoc queries, may well still be required. The risk is that your people end up doing what the machine should be doing, rather than being freed up for activities that grow the business.

When it came to planning Apollo 14, NASA didn’t install the duct-tape workaround; instead, they started afresh with new parts to prevent the underlying issue from recurring. Life or death may not be on the cards for a bank, but profitability, capability and compliance always are; against that backdrop, the choice to buy or build is rarely simple, but always critical.

Click here for Parts 1 and 2 of this blog, taking an in-depth look at the need for data quality metrics in retail banking.

Matt Flenley is continuing the theme of looking at how people (including astronauts) outside the silos of retail banking address critical problems.

If your data needs seem as vast as the black infinity of space and just as incalculable, we’d love to talk to you.

 

Blog Categories: DQM, Finance, Metrics, Regulation and Retail Banking.