Data quality issues in banks are becoming increasingly visible due to the required regulatory granularity
Data quality issues and inconsistencies across the various deliveries, which previously went undetected because aggregation or a lack of temporal comparability concealed them, may now surface and be exposed by supervisory authorities as a result of the extended delivery requirements.
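The mechanism is easy to illustrate: two position-level errors that offset each other are invisible in an aggregate figure, but surface immediately once deliveries are reconciled position by position. The following minimal sketch shows the principle; all delivery names, position IDs and figures are invented for illustration:

```python
# Two deliveries for the same reporting date (e.g. a risk report and a
# finance report) should agree; offsetting errors make the aggregates
# match while individual positions diverge. All names and figures below
# are invented for illustration.

risk_delivery    = {"pos_a": 100.0, "pos_b": 200.0, "pos_c": 300.0}
finance_delivery = {"pos_a": 150.0, "pos_b": 150.0, "pos_c": 300.0}

# Aggregate reconciliation: the deliveries look perfectly consistent.
print(sum(risk_delivery.values()) == sum(finance_delivery.values()))  # True

# Granular reconciliation: the offsetting position-level breaks surface.
for pos, risk_value in risk_delivery.items():
    delta = finance_delivery[pos] - risk_value
    if delta != 0:
        print(f"{pos}: inconsistency of {delta:+.1f} between deliveries")
```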
“Never change a running system” backfires
Most banks are familiar with the causes of these risks, as their IT infrastructure and reporting processes evolved over time. These banks often preferred quick solutions at minimum short-term cost, turning the resulting IT and reporting landscapes into complex and partially redundant structures. This created a vicious circle: the growing complexity made it ever easier to justify parallel solutions as a way of avoiding undesired side effects of changes, and the goal of integration gradually faded into the background.
In the meantime, most banks have realized that a “carry on as before” mentality is no longer an option. Moreover, existing data architectures are either unsuitable for meeting ever stricter time-to-market requirements or can meet them only at unjustifiable cost. This assessment is supported by the results of a zeb benchmark analysis, which found that the current environment is difficult terrain for banks that retain their existing, obsolete data architectures and accept only minimal adjustments.
Numerous banks see BCBS 239 as an opportunity to sustainably improve their data architecture, yet positive arguments for doing so are in short supply
In view of the current regulatory requirements laid down by BCBS 239 and the new MaRisk, banks are setting up corresponding implementation projects. Two approaches have proven most common: some banks gear their projects towards implementing the regulatory requirements with minimum effort, while others initiate more ambitious projects that aim at establishing a high-performance data architecture (including governance and processes) for overall bank management. The ambitious approach entails a fundamental redesign of the entire data supply and processing for bank management and addresses aspects such as an interdisciplinary glossary, a technical data model and, building on both, an integrated data warehouse for bank management (beyond regulatory reporting and risk).
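To make the glossary and data model aspects more concrete, the following sketch shows what a single glossary entry might record; the fields, the example term and all system names are assumptions chosen for illustration, not a prescribed standard:

```python
# Minimal sketch of a business glossary entry linking a business term to
# its technical representation and ownership. All fields and names are
# illustrative assumptions, not a standardized schema.
from dataclasses import dataclass

@dataclass
class GlossaryEntry:
    term: str            # business term as used in reports
    definition: str      # interdisciplinary, agreed-upon definition
    data_owner: str      # accountable function (governance aspect)
    source_system: str   # authoritative system of record
    dwh_attribute: str   # mapping into the integrated data warehouse

entry = GlossaryEntry(
    term="Non-performing exposure",
    definition="Exposure more than 90 days past due or unlikely to pay",
    data_owner="Credit Risk Controlling",
    source_system="loan_core",          # hypothetical system name
    dwh_attribute="exposure.npe_flag",  # hypothetical target attribute
)
print(entry.term, "->", entry.dwh_attribute)
```

The value of such an entry lies in linking the business definition, the accountability and the technical mapping in one place, which is precisely what an integrated data warehouse for bank management relies on.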
The second scenario in particular requires perseverance to fully achieve the defined objective. In the internal contest for resources, the project manager has to ensure that the project delivers short-term benefits through suitable release scoping. Furthermore, the project’s total cost has to be matched by tangible benefits for the bank. When focusing solely on regulation and risk management (looking back: “How can I avoid damage?”), many project managers struggle to produce a traceable cost-benefit analysis.
Let’s follow the fintechs’ example and treat data as a strategic asset
This “driven” attitude overlooks the fact that, beyond the improved capability to deliver the necessary information to regulators, these efforts also offer a medium- and long-term opportunity to turn data into a strategic asset for the bank and to generate new earnings (looking forward: “How can I generate new earnings?”). Fintech companies are leading the way for traditional institutions. Thanks to lean, data-driven processes, they find it easier to satisfy the demands of digital natives or, to put it more pointedly, to raise such expectations in the first place. Banks have to change their mentality in order not to become less attractive to customers. According to studies, a “carry on as before” mentality could cost banks up to 60% of their profits due to the direct competition and margin pressure generated by fintech companies.
Large banks have recognized the need to act and have established new CDO functions staffed with top-class IT experts (e.g. Deutsche Bank, UniCredit). Besides these internal activities, banks are also deliberately reaching out to fintech companies: they launch so-called innovation labs in innovation hubs such as London or Silicon Valley and enter into cooperations with fintechs. There are numerous examples of such cooperations:
- HVB, Comdirect, ING-DiBa with Gini: capturing bank transfer data by scanning invoices with a smartphone camera
- Deutsche Bank, HVB, Apobank, Commerzbank with Fintura: comparison of terms for SME financing via a web portal
- UBS, HVB and SumUp: mobile payments for companies with low transaction volumes or for field sales
- DKB and PayPal: integration of PayPal into DKB’s online banking, capturing additional credit card transactions from online business
- Commerzbank and IDnow: online identity verification via mobile devices within a few seconds
Way out of the motivation crisis: combining the necessary with the useful
Whether through a bank’s in-house solutions or cooperations with fintech companies: against the background of the new challenges and opportunities created by fintech activities, bank data represents a central asset. A high-performance data architecture and a high level of data quality are indispensable for the upcoming activities in competition or cooperation with fintechs. The regulation-driven efforts to optimize the IT landscape should therefore also be viewed in light of the opportunity to turn data into a strategic asset. Taking these aspects into account in the cost-benefit calculation of upcoming BCBS 239 or similarly far-reaching projects may lead to a completely different view of their benefits.