Enterprise Data Management Solutions for Financial Markets: Future-Proofing Your Data Architecture
In the first of a series of articles aimed at helping data professionals design and implement an Enterprise Data Management (EDM) solution, we explore the key drivers for EDM adoption and the outcomes by which you should judge the success of your project.
Why are Enterprise Data Management Solutions needed?
It has long been recognized that data management needs to be coordinated across the enterprise in order to streamline processes and the flow of information. More recently, evolving financial regulations and advanced risk management techniques have reinforced the need for transparency and a single source of the truth. In addition, the drive for operational efficiency and innovation in an ever more competitive environment demands greater agility and, at the same time, greater governance of processes, methods and models.
These challenges have driven the development of centralised enterprise data management solutions that provide controls and governance around the flow of data through the business. They should also shape the design of a sustainable EDM solution that meets both current and future needs.
Drivers for the Adoption of Enterprise Data Management
Regulation & Compliance
The financial crisis of 2008 prompted a strong response from financial regulators, and today all aspects of a financial institution’s activities are under scrutiny. The effect of these regulations on banks is often to increase capital requirements. As a result, optimizing capital has become a major focus for financial businesses.
A good example of how this affects data management is the Risk Factor Eligibility Test (RFET) – a key component of the Basel Committee’s Fundamental Review of the Trading Book (FRTB) – which is due to come into effect in 2023. The RFET requires firms to source enough real price data to demonstrate that a risk factor can be classified as modellable: at least 24 real prices per year, with at least four pricing events in each 90-day period, are required for the test to be passed. A firm that cannot satisfy this requirement will not be allowed to use its own models and must instead apply punitive capital charges based on stressed market scenarios.
The key to solving the RFET and minimizing the cost of non-modellable risk factors (NMRFs) lies in better data management. The tests encompass a wide variety of processes. Banks will need to:
- Source all relevant price data;
- Normalise and map that data to risk factors and buckets;
- Run validation rules to ensure ‘real’ price criteria are met;
- Run the RFET itself;
- Monitor test results;
- Alert when risk factors approach non-modellable status;
- Source additional committed quote data, where possible, to prevent that from happening.
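To make the core test concrete, the sketch below checks the eligibility rule described above for a single risk factor. The function name, input layout and the use of four consecutive 90-day buckets are illustrative assumptions; firms interpret the 90-day requirement in different ways, so the regulatory text should govern any real implementation.

```python
from datetime import date, timedelta

def is_modellable(observation_dates: list[date], as_of: date) -> bool:
    """Illustrative RFET check: at least 24 real price observations in
    the trailing 12 months, with at least 4 in each 90-day period.
    Here the year is treated as four consecutive 90-day buckets (a
    simplifying assumption, not the regulatory definition)."""
    window_start = as_of - timedelta(days=365)
    obs = [d for d in observation_dates if window_start < d <= as_of]

    if len(obs) < 24:
        return False

    # Require at least four observations in every 90-day bucket.
    for i in range(4):
        bucket_start = window_start + timedelta(days=90 * i)
        bucket_end = bucket_start + timedelta(days=90)
        if sum(1 for d in obs if bucket_start < d <= bucket_end) < 4:
            return False
    return True
```

In production, such a check would run per risk factor across the whole book, with results persisted so that the monitoring and alerting steps above can flag factors drifting toward non-modellable status.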
Compliance with new regulations can therefore be a key driver for reform in data management. While delivering on the latest requirements is the priority, forward-thinking institutions will look for systems that allow them to respond quickly and efficiently as regulations change. At the same time, businesses themselves must be able to adapt as new regulations affect the competitive landscape, with the ability to respond to internal change becoming just as important as meeting external obligations.
Risk Management
The risk management department has always been a major client of the data management function of a business, and changes in risk management continue to be a major driver in EDM requirements. Some of the common issues that we find feeding into these requirements include:
- Shorter timescales for delivering data into risk systems and calculations.
- Multiple risk systems present within the organisation.
- More markets and multiple trading venues.
- More data vendors required to feed into the system.
- Illiquidity in some markets, which makes it difficult to provide accurate prices and rates.
- Complex data structures such as curves, surfaces and cubes (illustrated in the sketch after this list).
- Operational risk that comes with siloed data and spreadsheets – and the governance challenges presented by such tactical solutions.
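The last item deserves illustration. Below is a minimal sketch of a curve and a surface modelled as first-class data objects; the class names and fields are assumptions made for the example, not any particular vendor’s data model.

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class CurvePoint:
    tenor: str    # e.g. "3M", "10Y"
    value: float  # e.g. a zero rate

@dataclass(frozen=True)
class Curve:
    curve_id: str  # e.g. "USD.OIS"
    as_of: date
    points: list[CurvePoint]

@dataclass(frozen=True)
class SurfacePoint:
    expiry: str    # e.g. "1Y"
    strike: float  # e.g. moneyness
    value: float   # e.g. implied volatility

@dataclass(frozen=True)
class Surface:
    surface_id: str  # e.g. "EURUSD.VOL"
    as_of: date
    points: list[SurfacePoint]
```

Validating such an object means checking the shape as a whole (missing tenors, arbitrage violations, abnormal day-on-day shifts) rather than individual numbers in isolation, which is exactly where flat, scalar-oriented data models fall short.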
Changing Business Models
The challenges of the current business environment will continue to demand greater operational efficiencies and drive the transition toward capital-light business models. Institutions need to be more agile and more innovative even when resources may be stretched. The pace of change means that responding with large-scale IT projects is no longer an option. The need for agility and innovation means that changes to business processes have to be part of “business-as-usual” and data management systems must provide the flexibility to respond. EDM systems must both anticipate and enable change if they are to be of lasting value to an organization.
Enterprise data management has grown gradually from its beginnings as an IT function. Increasingly, organizations are recognizing that data is a vital enterprise asset and needs to be managed with the same level of governance as other enterprise assets such as people, finance and intellectual property. This recognition is clearly seen as more financial institutions raise the management of data to C-level in their organizational hierarchies. In the process, data management has risen in importance from a simple IT function with limited scope to a wide-ranging enterprise function that is a vital enabler of business change and innovation.
Desired Business Outcomes from Implementing an Enterprise Data Management Solution
Given these challenges, what are the key aspects of an enterprise data management solution that a business should look to implement, and what benefits will it bring?
Consolidation: A Centralised Gold Copy Source
The obvious benefit of a centralised “gold copy” is the reduction of conflicts between data used by different departments and desks – conflicts that may result in contradictory reporting and an unreliable basis for decisions. The other major benefit of centralised gold copy data lies in eliminating duplicated effort, both in processes and in the subsequent reconciliation of diverging data. The cost savings from this can be significant – especially if the solution provides a method for rationalising and consolidating internal requests for data from vendors.
Data Quality and Consistency
The existence of a centralised data source is worthless if it cannot be trusted. Untrusted data can result either from a lack of transparency or from obsolete processes and models. Data needs not only to possess quality and consistency, but to demonstrably possess them in order to gain trust. That means being able to evidence all of the checks (such as validation rules and independent price verification processes) that the data has undergone to ensure its status as a reliable source.
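As an illustration of what evidencing those checks can mean in practice, the sketch below stores the outcome of each validation rule alongside the data point itself. The rule, field names and the 10% tolerance are assumptions chosen for the example.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ValidationResult:
    rule: str     # e.g. "tolerance_vs_previous_close"
    passed: bool
    detail: str   # human-readable evidence for auditors

@dataclass
class GoldCopyPrice:
    instrument_id: str
    price: float
    timestamp: datetime
    checks: list[ValidationResult] = field(default_factory=list)

def check_tolerance(candidate: float, previous: float,
                    max_move: float = 0.10) -> ValidationResult:
    """Flag day-on-day moves larger than max_move (10% by default)."""
    move = abs(candidate - previous) / previous
    return ValidationResult(
        rule="tolerance_vs_previous_close",
        passed=move <= max_move,
        detail=f"move of {move:.1%} against threshold {max_move:.0%}",
    )
```

Keeping the evidence next to the value means a question such as “why do we trust this price?” can be answered directly from the gold copy rather than by re-running upstream processes.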
The desire for a single source of the truth seems inarguable, but we must recognize that different users (e.g. the risk management department and the finance department) often have different data needs or different views of the “truth”. For example, one might require an end-of-day price from the market in question, while another wants a snapshot price at a certain time of day. As a result, multiple golden copies are often required. For these datasets and the relationships between them to become trusted across the enterprise, the processes and people involved in creating each view need to be transparent and readily available to all.
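One way to keep the relationship between such views transparent is to define every golden copy as a named, recorded rule over the same underlying observations. In the sketch below (the view definitions are illustrative assumptions), both the end-of-day and the snapshot golden copies are reproducible functions of the same input.

```python
from datetime import datetime, time

# Each observation is a (timestamp, price) pair for one instrument.
Tick = tuple[datetime, float]

def end_of_day_view(ticks: list[Tick]) -> float:
    """One view of the truth: the last traded price of the day."""
    return max(ticks, key=lambda t: t[0])[1]

def snapshot_view(ticks: list[Tick], cutoff: time) -> float:
    """Another view: the last price at or before a fixed snapshot time."""
    eligible = [t for t in ticks if t[0].time() <= cutoff]
    if not eligible:
        raise ValueError("no observations before the snapshot cutoff")
    return max(eligible, key=lambda t: t[0])[1]
```

Because both golden copies are defined as rules over the same observations, a difference between them can be explained by their definitions rather than discovered through after-the-fact reconciliation.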
Data Transparency and Data Lineage
The need for processes themselves to be trusted means that users must be able to trace data back to its source. This is easily achieved when all that has happened is that data has been moved from one place to another or has undergone simple processes such as scaling, mapping or interpolation. It becomes more difficult when the “gold copy” is derived from multiple sources or from complex modelling. In either case, transparency is vital.
Making data and processes transparent means that a query on how a data point was derived can be answered quickly and cost-effectively. This rapid and complete auditability in turn reinforces trust in data quality.
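A lineage record can be as simple as attaching, to every derived value, the identifiers of its inputs and the transformation applied. The sketch below (field names are illustrative) walks the resulting derivation tree back to the original sources.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LineageNode:
    value_id: str        # identifier of this data point
    transformation: str  # e.g. "vendor_feed", "median_of_sources"
    inputs: tuple["LineageNode", ...] = ()  # empty for raw source data

def trace(node: LineageNode, depth: int = 0) -> None:
    """Print the full derivation tree back to the original sources."""
    print("  " * depth + f"{node.value_id} <- {node.transformation}")
    for parent in node.inputs:
        trace(parent, depth + 1)

# Example: a gold copy FX rate derived from two vendor feeds.
vendor_a = LineageNode("EURUSD.vendorA.1700", "vendor_feed")
vendor_b = LineageNode("EURUSD.vendorB.1700", "vendor_feed")
gold = LineageNode("EURUSD.gold.eod", "median_of_sources",
                   (vendor_a, vendor_b))
trace(gold)
```

Answering a query on how a data point was derived then becomes a walk of this tree rather than a search across disconnected systems.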
Enterprise data management solutions were traditionally the domain of the back office, but a modern EDM solution needs to serve the needs of the front and middle offices too. By providing data and process transparency, business users are not only able to spot potential issues but can understand their causes. In the best-case scenario this will lead to a virtuous cycle of feedback between users and the system rather than to a demand for additional tactical solutions. Issues can be resolved at source rather than in ways that simply circumvent and hide the real problem and create inconsistencies.
Improved Data Governance
Data governance dictates that there are proper controls around all of the processes that affect data. To best achieve this, control must be moved as far as possible from IT to operations. The division of responsibility in which business users write specifications and IT handles implementation is no longer adequate. Levels of control within large financial institutions (both sell-side and buy-side) have typically been heightened. Processes such as independent price verification have been implemented deliberately to provide checks and oversight structures that validate front office marks, along with auditable processes that can be reported on.
This emphasis on ensuring the independence of control functions also heightens the need for collaborative workflows, enabling the business to continue providing expert insights into the pricing process and raising price challenges if they feel the independent control function has misinterpreted market information.
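As a sketch of what such a collaborative workflow can look like, a price challenge might move through a small, auditable state machine. The states and transitions below are an assumed example rather than any standard.

```python
from enum import Enum, auto

class ChallengeState(Enum):
    RAISED = auto()        # front office disputes an independent mark
    UNDER_REVIEW = auto()  # control function re-examines market data
    ACCEPTED = auto()      # mark amended, with the change recorded
    REJECTED = auto()      # original independent mark stands

# Allowed transitions keep the process auditable: a challenge can only
# be resolved from review, never edited in place.
TRANSITIONS = {
    ChallengeState.RAISED: {ChallengeState.UNDER_REVIEW},
    ChallengeState.UNDER_REVIEW: {ChallengeState.ACCEPTED,
                                  ChallengeState.REJECTED},
}

def advance(current: ChallengeState, new: ChallengeState) -> ChallengeState:
    """Move a challenge to its next state, rejecting illegal jumps."""
    if new not in TRANSITIONS.get(current, set()):
        raise ValueError(f"illegal transition {current.name} -> {new.name}")
    return new
```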
Reduced Operational Risk
Operational risk is significantly increased by the use of spreadsheets and tactical solutions in key processes. Therefore, the goal of reducing operational risk should include:
- Fewer business-critical spreadsheets;
- Fewer manual processes;
- Fewer tactical solutions.
All of these in turn reduce the manpower required and the cost of maintenance, in addition to lowering operational risk. However, the need for business agility that prompted these solutions in the first place should not be forgotten, and flexibility must be maintained.
Flexibility / Rapid Integration
In the current fast-changing environment, keeping integration projects short is an important objective, not only for cost control but also because the longer a project takes, the more likely it becomes that the requirements will have moved on by the time it is delivered.
In addition to any initial integration, rapid integration is also needed for subsequent changes. A more versatile solution will enable changes to be made as part of business-as-usual operations rather than requiring further IT projects.
Reduced Total Cost of Ownership
A well-designed EDM solution should be able to deliver a reduction in the total cost of the data management function, enabling, for example:
- Reduced data vendor fees as a result of data source consolidation;
- Flexibility and rapid integration reducing the need for costly-to-maintain tactical solutions;
- Greater automation leading to reduced headcount.
Summary
Enterprise Data Management (EDM) systems have been with us for a long time. Many firms have undertaken large IT projects in an attempt to achieve a truly enterprise-wide platform for trustworthy data to support operations and reporting, but, as with all large projects, issues such as scope creep, cost overruns and the dangers of a “big bang” approach have often impeded their success. The measure of success of a solution designed to be the go-to place for data across the enterprise is whether it has retained that primacy five or ten years down the line. In practice, many find that position eroded as the pace of change in financial markets exposes the inflexibility of older EDM architectures. This inflexibility often leads to tactical responses to new business problems, which then become dislocated from the core EDM architecture.
The demands on a contemporary EDM solution for capital markets continue to go beyond traditional ETL and data management practices. To remain current in today’s environment, any system must provide data centralisation and consolidation, data quality, and improved transparency and data governance. Beyond that, it should have the flexibility to react as the needs of the business adapt and change – only then will it provide the strategic solution that the business requires.