Data Management Solutions for Market Risk and Valuation Control
In this blog post we look at some of the technical challenges that commonly affect data management for Market Risk and Valuation Control departments, and discuss how improving internal processes can deliver benefits for the business.
Market Risk and Valuation Control
Market risk is the risk that an organisation’s assets will shrink or its liabilities grow due to unforeseen changes in the markets in which it operates. Market risk management is the process by which risk managers quantify the potential impact of these changes so that trading and management processes can be used to control the resulting exposure.
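As a simple illustration of how such impacts are quantified, below is a minimal historical-simulation Value-at-Risk (VaR) sketch. The P&L figures, seed and confidence level are purely illustrative assumptions, not a production methodology.

```python
import numpy as np

def historical_var(pnl_history: np.ndarray, confidence: float = 0.99) -> float:
    """One-day Value-at-Risk via historical simulation.

    pnl_history: historical daily P&L values for the portfolio.
    Returns the loss threshold exceeded on (1 - confidence) of days,
    reported as a positive number.
    """
    # VaR is the (1 - confidence) quantile of the P&L distribution
    return -np.quantile(pnl_history, 1.0 - confidence)

# Illustrative usage with simulated daily P&L figures
rng = np.random.default_rng(seed=42)
daily_pnl = rng.normal(loc=0.0, scale=250_000, size=500)
print(f"99% one-day VaR: {historical_var(daily_pnl):,.0f}")
```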
It is critical that senior management and risk managers work with accurate information; otherwise their models will not be representative of either the firm or the market, and financial reports will be peppered with inconsistencies such as mismarked positions and P&L breaks.
It is the responsibility of valuation control to ensure the accuracy of all this information. Some of the drivers and key aspects of the valuation control process are well illustrated in a diagram from a PricewaterhouseCoopers report on the subject.
The diagram defines four core areas of competence for valuation control departments:
- Independent Price Verification (IPV)
- P&L Reporting
- Capital Reserves
- Model Control
PwC also highlight the challenges posed by measuring ‘fair value in illiquid markets’ and the difficulty of validating Front Office marks when very few market prices are available, noting that ‘infrastructure has not caught up in all cases with the growth in the Front Office and the complexity of the products’.
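To make the IPV step concrete, here is a minimal sketch of the core comparison: a Front Office mark is tested against an independently sourced price and flagged when the variance exceeds a tolerance. The function name, tolerance and instrument identifier are illustrative assumptions, not taken from any particular control framework.

```python
from dataclasses import dataclass

@dataclass
class IPVResult:
    instrument_id: str
    trader_mark: float
    independent_price: float
    variance_bps: float
    breach: bool

def verify_price(instrument_id: str, trader_mark: float,
                 independent_price: float, tolerance_bps: float = 50.0) -> IPVResult:
    """Compare a Front Office mark against an independently sourced price."""
    # Variance expressed in basis points of the independent price
    variance_bps = 10_000 * (trader_mark - independent_price) / independent_price
    return IPVResult(instrument_id, trader_mark, independent_price,
                     variance_bps, abs(variance_bps) > tolerance_bps)

# Illustrative usage: flag a mark that diverges from a consensus price
result = verify_price("XS0123456789", trader_mark=101.35, independent_price=100.80)
if result.breach:
    print(f"{result.instrument_id}: {result.variance_bps:.1f} bps exceeds tolerance")
```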
If the organisation only trades vanilla products then it may be possible to outsource much of the IPV and control function to service providers, provided that senior management can demonstrate that they still have sufficient oversight, governance and control within the organisation to manage their business.
However, if the portfolio contains complex products which, by virtue of their design, embed many trading judgements, then this level of oversight becomes impossible to guarantee externally. In such situations outsourcing may not be the best solution, and an internal infrastructure must instead be created to handle the function in-house.
Existing Architecture Challenges
When reviewing in-house data management capabilities for market risk and valuation control, there are several areas that present particular challenges to the data team.
Data Silos
Data silos are the inevitable consequence of legacy systems driven by geographic and departmental factors, and they lead to a fractured data infrastructure. They are extremely difficult, and hence costly, to remove, so they often remain within organisations that are under continual pressure to reduce costs while meeting evolving regulations and business needs. It is usually much cheaper and faster to apply a tactical solution to meet a new regulation, innovation, or change in the market than to invest in a strategic solution. Unfortunately, the tactical approach can have many negative consequences, including a degradation in the accuracy and timeliness of the data available for the valuation control function, and a lack of governance and controls around the process.
Multiple Risk Systems
Many companies now use multiple risk systems, and these systems have often become responsible for calculating and managing derived data (e.g. curves, surfaces). Technically this made sense because they already possessed the necessary calculation engines; architecturally it is no longer tenable given the need for consistent data across multiple risk systems. Even if risk management is confined to a single system, the demands of other systems such as the order management system (OMS) and, increasingly, of business users doing their own analyses, mean that derived data must now sit alongside market and reference data in the enterprise data management (EDM) system so that it can be shared consistently.
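As a simple example of such derived data, consider a discount-factor curve computed from quoted zero rates. The sketch below uses made-up rates and continuous compounding purely for illustration; the point is that the derived curve is an artefact every downstream system should read from one shared source rather than recompute locally.

```python
import math

def discount_factors(zero_rates: dict[float, float]) -> dict[float, float]:
    """Derive continuously compounded discount factors from zero rates.

    zero_rates maps tenor in years -> annualised zero rate.
    The result is derived data, best published once to a central hub
    rather than recalculated independently by each risk system.
    """
    return {t: math.exp(-r * t) for t, r in zero_rates.items()}

# Illustrative usage with made-up zero rates
curve = discount_factors({0.5: 0.048, 1.0: 0.050, 2.0: 0.047})
for tenor, df in curve.items():
    print(f"{tenor:>4}y: {df:.6f}")
```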
Unmanaged Spreadsheets
Another issue affecting controls and governance around the valuations process is the use of spreadsheets as the default tool for storing data and performing calculations. Thanks to their flexibility and ease of use, spreadsheets are often the tool of choice for valuing fixed income, complex derivative and over-the-counter (OTC) instruments. However, they do not lend themselves easily to regulatory requirements such as the need for control, consistency and transparency regarding valuation model inputs and outputs. To counteract this, many financial institutions are reviewing the way they currently use and manage spreadsheets with a view to reducing operational risk, increasing automation and improving the quality of the underlying data that feeds their calculation processes.
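To make this concrete, the kind of calculation that typically lives in a spreadsheet (here, a simple fixed-coupon bond present value with illustrative inputs) can instead be expressed as a small, version-controlled function whose inputs, outputs and change history are explicit and testable:

```python
def bond_present_value(face: float, coupon_rate: float,
                       discount_rate: float, periods: int) -> float:
    """Present value of a fixed-coupon bullet bond with annual payments.

    Keeping logic like this in source control, rather than in a
    spreadsheet cell, makes the model transparent and auditable.
    """
    coupon = face * coupon_rate
    pv_coupons = sum(coupon / (1 + discount_rate) ** t for t in range(1, periods + 1))
    pv_face = face / (1 + discount_rate) ** periods
    return pv_coupons + pv_face

# Illustrative usage: a 5-year 4% annual coupon bond discounted at 3%
print(f"PV: {bond_present_value(100.0, 0.04, 0.03, 5):.2f}")
```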
Operational Risk
As we have outlined, the most obvious sign of inadequate data management is the proliferation of critical data held in spreadsheets and silos – loosely managed and controlled, and lacking any transparency or enterprise-wide visibility. This situation presents a huge operational risk to an organisation:
- Such systems are often developed and managed by a single person (in the worst cases an external contractor who has long since moved on). Little or no documentation is typical, which exacerbates the key-person risk.
- Governance is usually an afterthought, if it exists at all. Lack of adequate version control of processes and analytics casts doubt on data quality – a risk that is too often ignored.
- A lack of auditability, together with the fact that these systems are often part of a tangled web of dependencies, means that transparency and data lineage are non-existent and proper reporting almost impossible.
Many organisations rely heavily on such tactical systems, and the inevitable result is a high risk of important decisions being made on the basis of bad data. Spreadsheets in particular cause problems that are not easy to eliminate without losing vital flexibility. This risk is widely recognised and is a major driver in the implementation of enterprise data management solutions.
Requirements for a Centralised Data Hub
A data hub – whereby all systems share a centralised, validated source of information – can deliver both process and cost efficiencies, but many technical challenges need to be overcome for it to perform its function correctly. For example, it should (see the sketch after this list):
- Have a data model that can cope with the very wide range of data that needs to be centralised
- Allow the data model to be modified easily to keep it current
- Be able to store increasingly large data sets
- Accommodate derived data from any analytics and calculations applied to the data
- Allow a single version (gold copy) of the data for all systems to use
- Automate the creation of this data
- Conform to regulations such as BCBS 239
- Allow the tracking of data assets from import through to their export
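As a minimal sketch of what such a hub record might look like, the hypothetical GoldCopyRecord below combines versioning, source attribution and lineage references. The field names and structure are illustrative assumptions, not a reference to any particular EDM product.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class GoldCopyRecord:
    """A single validated, versioned data item in the hub.

    Versioning plus source attribution gives every consumer the same
    gold copy while preserving the lineage needed for audit and for
    regulations such as BCBS 239.
    """
    entity_id: str            # e.g. an instrument or curve identifier
    attribute: str            # e.g. "closing_price", "zero_curve_node"
    value: object
    version: int
    source: str               # upstream feed or calculation that produced it
    derived_from: tuple = ()  # entity ids of input records, for lineage
    as_of: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

# Illustrative usage: a derived curve node that records its input
raw = GoldCopyRecord("GBP_DEPO_3M", "rate", 0.052, version=3, source="vendor_feed")
derived = GoldCopyRecord("GBP_ZERO_CURVE", "zero_curve_node", 0.0518, version=1,
                         source="curve_builder", derived_from=(raw.entity_id,))
print(derived.entity_id, "derived from", derived.derived_from)
```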
Many data hubs cannot meet all these requirements, and hence additional components have to be introduced, such as data lakes, external calculation engines, workflow products and other data stores. The ideal would be a holistic solution that unifies the various components, provides proper controls and governance, delivers operational efficiencies, and enables the future growth needed to satisfy the business requirements of the firm.
The Benefits of a Strategic Approach
In summary, issues often found when addressing the data management needs of market risk and valuation control include:
- Siloed and fragmented system infrastructure and default use of spreadsheets
- Largely manual processes and high workload
- Lack of common data sources
- Out-of-date or incomplete inventories and mappings of products and models
To combat this, firms are looking to implement strategic solutions that provide the governance and oversight needed for risk calculations, management reporting and compliance. As part of this process, they often find great potential to rationalise and improve efficiency, and consequently to reduce their overheads in terms of data and operational costs. These benefits should not be overlooked when considering changes to how valuation controls and risk calculations are supported: improving internal processes need not be expensive for the business, as projects can often pay for themselves within relatively short timeframes.