Now more than ever, financial institutions are undergoing profound change. Strict regulatory requirements, digitalization and disruptive market forces put the entire sector under persistent pressure to act and confront banks with considerable challenges in managing their data. In the context of data management, the concept of a chief data officer (CDO) as an organizational entity has been around for several years; its design in banks varies between purely technical and purely functional organizational units.
This article highlights both the economic motivation in view of current challenges and the path to shaping this role.
First challenge: a dynamic approach to market developments is already noticeable
The changed market conditions are forcing banks to rethink: a dynamic approach to market developments is noticeable and creates a need for action. In times of persistently low interest rates, traditional business models need to be put to the test and adapted accordingly. With a focus on pure maturity transformation, and with negative interest rates observable in parts of the market, it is no longer possible to generate sufficient interest income. Earnings from the classic deposit and lending business have therefore declined significantly in recent years. One consequence of this development is that German institutions are shifting into a higher proportion of risky assets while at the same time extending the average residual maturity in their proprietary business. Achieving profitability by taking on more credit and market risk, though, cannot be a sustainable solution.
The obvious step towards a more comprehensive and frequent adaptation of business models to the dynamics of market developments not only requires an adjustment of business processes and IT, but also of data management, so that data quality, architectural principles and controlled complexity do not fall victim to the dynamics.
Second challenge: the steady evolution of regulatory requirements towards greater data volumes and finer granularity will continue to accelerate in the foreseeable future
Current regulatory initiatives such as BCBS 239 and AnaCredit impose high implementation costs across the sector. The required level of detail in reporting and the complexity of the requirements have a significant impact on banks' data management. For credit institutions whose data management is not aligned with these requirements, the initiatives represent a substantial burden. Stress tests and intra-year ad-hoc inquiries from regulators aggravate the situation, as massive amounts of data have to be generated and evaluated for informative reporting. At the same time, regulation in the financial sector is further increased by initiatives beyond prudential supervision, such as the GDPR. Often, the competent regulators sketch forthcoming initiatives only superficially, but explicitly point out their data relevance and impact. As a consequence, regulatory pressure on data management will remain high in the future as well.
Third challenge: digitalization is equally an opportunity and a risk—competitors and newcomers never sleep
Moreover, the "digitalization" megatrend in the financial industry is progressing and opening up new fields of action for newcomers. Fintech companies offering innovative products and services are successfully pushing into selected areas of the traditional banking market. In contrast to established players, they are better able to anticipate customer expectations in specific areas and, through digital technologies and platforms, offer solutions and services that are quickly and precisely aligned with the needs of digitally minded customers. Fintech companies are therefore already triggering customer churn in selected services and thereby causing a decline in the business volume of traditional institutions.
Insofar as banks continue to adhere to existing processes and business models, they risk losing the interface with the customer entirely. To regain lost ground, especially in competitive factors such as technological expertise and data focus, banking processes need to be streamlined and made more flexible. Institutions are observably entering into strategic alliances and cooperations with fintech companies and technology partners in order to position themselves in key areas in a future-oriented manner. In addition, with the PSD2 directive, the EU has created a single legal framework that requires banks to grant third parties access to account and payment information. For traditional players, this presents an opportunity to open up new lines of business and to align business models with digital products and services.
This can be achieved especially in dynamic markets with new providers of service offerings, provided that customers' trust in banks can be strengthened, not only in terms of financial services but also in terms of data security and enhanced data services, and at the same time harnessed for sales potential. Data management must keep pace with this development and, despite the required dynamization, be able to design and control all data-related processes and their organizational integration in such a way that no friction, misinformation, or gaps in responsibility or information arise between data supplier and customer.
These challenges hit banks whose data management structures are poorly prepared in terms of organization, processes and technology, and place excessive stress on those institutions
Although data is a central asset of banks and a core element of any business process, a centralized view of the data and clear responsibilities for specific data (areas) are frequently lacking in both business and IT.
Often, the functional responsibility for data is either not regulated at all or only implicitly regulated within the company. Who is responsible, for example, for the completeness, correctness and up-to-dateness of customer data? The sales representative who concludes the account contract with the customer, or, where it exists, the central back-office unit that enters the data? Which department may submit requests for new data fields to IT, and who ensures that such changes are then correctly and promptly reflected in financial and risk reporting? An interdisciplinary harmonization of technical terminology and of the definitions of data objects and their attributes is a basic prerequisite for a clear assignment of responsibilities: these are the objects to which data responsibility can be assigned.
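The idea of assigning responsibility and quality rules to defined data objects can be made concrete in a minimal sketch. The following example is purely illustrative; the attribute names, owners and rules are hypothetical and not drawn from any specific bank's data model.

```python
# Illustrative sketch: attaching an accountable owner and a quality rule
# to each customer-data attribute, so responsibility is explicit.
# All names (owners, attributes, rules) are hypothetical examples.
from dataclasses import dataclass
from datetime import date
from typing import Callable, Any


@dataclass
class AttributeDefinition:
    name: str                       # harmonized business term
    owner: str                      # accountable business unit
    check: Callable[[Any], bool]    # completeness/correctness rule


def validate(record: dict, definitions: list) -> dict:
    """For each defined attribute, return its owner and whether the
    record's value passes that attribute's quality rule."""
    return {d.name: (d.owner, bool(d.check(record.get(d.name))))
            for d in definitions}


definitions = [
    AttributeDefinition("customer_name", "Sales",
                        lambda v: bool(v)),                    # must be non-empty
    AttributeDefinition("birth_date", "Back Office",
                        lambda v: isinstance(v, date) and v < date.today()),
]

record = {"customer_name": "Example GmbH", "birth_date": date(1990, 5, 1)}
result = validate(record, definitions)
# result maps each attribute to (responsible owner, check passed?)
```

The point of the sketch is organizational rather than technical: each attribute carries exactly one accountable owner alongside its rule, so a failed check immediately identifies who must act.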
Few banks have so far established a functioning and assertive architecture management. Due to long implementation times in IT, quick solutions are often implemented without regard to architectural guidelines. This leads to historically grown IT landscapes with redundant data and data flows, workarounds, and individual developments in Excel and Access that are poorly documented or not documented at all. The complexity of these solutions is difficult to master; it extends implementation times while increasing the error rate of the systems.
For historical reasons, traditional IT "thinks" in systems rather than in data domains. The various departments often have a decentralized IT application manager at their disposal. Decisions on where new requirements are implemented are therefore often made ad hoc, depending on which system owner receives the request. The proposed solutions are usually based on existing or familiar systems and thereby narrow the possible solution space in advance.
Conclusion: in 2018, facing a growing number of competitors who have recognized the value of data for securing and growing their business, as well as regulatory pressure and transparency requirements, a competitive bank must treat data as a strategic asset and set up its data management accordingly.
Economic potential from data available internally and on the market can only become a competitive advantage if it is holistically understood as an asset. This can be promoted from three perspectives:
- maintain and expand trust, for example as service offerings via fintech companies increase,
- generate sales approaches from own and third-party data without undermining that trust, and
- support the company's management, in the traditional sense, with clear, quality-assured and timely information.
Today's pluralistic challenges, and the full exploitation of the potential associated with data as an asset, call for a bundling of data competencies and data know-how: this role is referred to as the chief data officer (CDO). The term is also used below to denote the corresponding function (see the definition of terms in figure 2).
Many banks are currently establishing and bringing to life the role of a CDO. This CDO, or a CDO department located below the CFO, CRO and CIO level, can develop into a value-creating and value-protecting organizational unit. At present, there is a strong case for allocating it to one of the specialist departments. The aim is not to displace specialists such as the GDPR officer, but to create a cross-sectional advisory and gatekeeper entity that builds bridges and harmonizes views (both functional-to-technical and functional-to-functional). A CDO can already assume tasks in data-driven projects and gains in importance during subsequent operation. A centrally established data management, as incorporated in institutions in the wake of the BCBS 239 implementation, must not only be implemented but also be lived sustainably from day one after the project ends.
The consequences of lacking such a role can be serious. Large-scale projects aimed at building an integrated data warehouse often fail due to a diffusion of responsibility, with no one representing the harmonized view of information. Even after a successful project completion, there is a risk of reverting to familiar patterns and individual views in subsequent years. For investment protection reasons alone, the CDO role deserves closer attention.
The design of a CDO unit should be derived from a holistic understanding of data-relevant topics and disciplines. A good starting point is determining the status quo, which can readily be done along prioritized design aspects. Reconciling the status quo with the desired level of ambition in exploiting potential makes the need for action apparent. Planning these actions in terms of resources and time yields an implementation concept. The ten dimensions depicted in figure 3 can serve as a structuring aid for this process.
The choice of hierarchy level, a certain independence (gatekeeper function), and the range of competencies and staffing (bridge function) must be carefully considered. Only then can the unit form the necessary counterweight to technical specializations and exploit potential across the company. This approach creates sustainable added value, given that overall management is achieved both internally and towards the market, and regulatory initiatives are no longer pursued in an isolated, case-by-case manner. The CDO will thus become an essential instrument for securing competitiveness in the long term.