The Data Management Challenges of FRTB
This article by Xenomorph CEO Brian Sentance is entitled Data Management Challenges of FRTB and was originally published in the October 2016 issue of PRMIA’s Intelligent Risk magazine.
The Basel Committee’s Fundamental Review of the Trading Book (FRTB) initially placed the spotlight on rising market risk capital charges and their impact on bank trading operations. With the most recent quantitative impact study predicting that market risk capital charges will rise by between 1.5x and 2.4x [1], some market participants questioned the structure (and in some cases the viability) of bank trading operations.
However, industry attention has now turned to practical implementation challenges, and in particular to the issue of data management. The data management challenges posed by FRTB are not only broad but also pressing. The sooner they can be addressed, the more time banks will have to take strategic business decisions about their trading operations, and the better informed those decisions will be.
It is difficult to do justice to the complexity of the challenges involved in an article of this length. However, some of the key issues are briefly outlined below under the following high-level categories:
Data Acquisition and Consolidation
FRTB will require firms to consolidate data from a number of sources, both internal and external. To fulfil P&L attribution and back-testing requirements, banks will need to integrate and analyse a wide range of data, including pricing and analytics from vendors and internal systems, position and transactional data from trading or back-office systems, alongside model outputs from risk engines. Ensuring all of this data can be normalised into a consistent data model and analysed over long time periods is by no means a trivial challenge.
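To make this more concrete, the sketch below shows one possible way of normalising records from two hypothetical sources (a vendor price feed and a risk engine output) into a common observation schema. The field names are illustrative only and do not correspond to any particular vendor format.

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class Observation:
    """A single normalised data point, whatever its origin."""
    risk_factor_id: str   # internal identifier for the risk factor or instrument
    as_of: date           # observation date
    value: float          # price, rate or derived analytic
    source: str           # e.g. "vendor", "trading_system", "risk_engine"
    kind: str             # e.g. "price", "position", "model_output"

def from_vendor_row(row: dict) -> Observation:
    """Map a hypothetical vendor price record onto the common schema."""
    return Observation(
        risk_factor_id=row["instrument_id"],
        as_of=date.fromisoformat(row["price_date"]),
        value=float(row["close"]),
        source="vendor",
        kind="price",
    )

def from_risk_engine_row(row: dict) -> Observation:
    """Map a hypothetical risk engine output onto the common schema."""
    return Observation(
        risk_factor_id=row["factor"],
        as_of=date.fromisoformat(row["cob_date"]),
        value=float(row["value"]),
        source="risk_engine",
        kind="model_output",
    )
```

Once every source is mapped onto the same schema, long histories from different systems can be stored and analysed side by side.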
In addition, the issue of risk factor ‘modellability’ has been flagged as a key challenge. The latest ISDA QIS suggests that non-modellable risk factors (NMRFs) could account for approximately 30% of total market risk capital charges under the internal models approach (IMA). This has prompted much debate about how to minimise the burden and ensure enough ‘real’ data is available to model risk factors. A variety of participants are already preparing to offer solutions to that challenge, including industry consortia, data vendors, interdealer brokers, and potentially even Indication of Interest (IOI) networks (provided the IOIs are actionable and represent committed quotes). While it may be too early to gauge which of these solutions will prevail, an important prerequisite will be strong data governance and validation processes to ensure the criteria for ‘real’ prices are demonstrably met.
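To illustrate the kind of check involved, the sketch below tests a single risk factor against the commonly cited criterion of at least 24 real price observations over the preceding twelve months, with no gap between consecutive observations of more than roughly one month. The thresholds are parameters rather than hard-coded facts, since the final calibration and its interpretation are matters for the rules and each firm’s compliance function.

```python
from datetime import date, timedelta
from typing import List

def is_modellable(real_price_dates: List[date],
                  as_of: date,
                  min_observations: int = 24,
                  max_gap_days: int = 31) -> bool:
    """Rough modellability test for one risk factor.

    Assumes the commonly cited criterion of `min_observations` real prices
    in the trailing twelve months, with no gap between consecutive
    observations longer than roughly one month. Both thresholds are
    illustrative defaults, not a definitive reading of the rules.
    """
    window_start = as_of - timedelta(days=365)
    dates = sorted(d for d in set(real_price_dates) if window_start <= d <= as_of)
    if len(dates) < min_observations:
        return False
    gaps = ((later - earlier).days for earlier, later in zip(dates, dates[1:]))
    return all(gap <= max_gap_days for gap in gaps)

# Example: weekly observations over a year comfortably satisfy both tests
weekly = [date(2016, 1, 4) + timedelta(weeks=i) for i in range(52)]
print(is_modellable(weekly, as_of=date(2016, 12, 30)))  # True
```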
Data Storage
With so many risk factors and derived data sets to be analysed over significant time horizons, the breadth and depth of data required for modelling firm-wide and individual desk-level risk is vast. Banks not only need a sufficiently scalable architecture in place to store validated and cleansed data; they also need a flexible data model that adequately maps the relationships between different datasets, such as pricing, derived data and analytics, liquidity horizons, risk classes and buckets.
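As a minimal sketch of what such a mapping might look like, the reference record below links a risk factor to hypothetical FRTB attributes; the values shown are illustrative examples rather than a complete or authoritative regulatory mapping.

```python
from dataclasses import dataclass
from typing import Dict

@dataclass(frozen=True)
class RiskFactorMeta:
    """Reference data tying a risk factor to its regulatory attributes."""
    risk_factor_id: str
    risk_class: str              # e.g. "FX", "Equity", "GIRR", "Commodity"
    bucket: str                  # regulatory bucket within the risk class
    liquidity_horizon_days: int  # one of the prescribed horizons, e.g. 10 to 120 days

# In practice this mapping would live in a reference data store alongside
# the price and analytics history, not in an in-memory dictionary.
RISK_FACTOR_REFERENCE: Dict[str, RiskFactorMeta] = {
    "EURUSD_SPOT": RiskFactorMeta("EURUSD_SPOT", "FX", "EUR/USD", 10),
    "XYZ_LARGE_CAP_SPOT": RiskFactorMeta("XYZ_LARGE_CAP_SPOT", "Equity", "Large cap", 10),
}
```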
Data Derivation / Calculation
Managing market risk always requires a wide variety of analytics, from basic correlations (between risk factors and across buckets) and standard deviations to much more complex metrics. In addition, the back-testing and P&L attribution requirements mean that firms opting for the IMA must continue to demonstrate the accuracy (and eligibility) of their internal models. Having the right data architecture in place means not only being able to compute test results and produce the relevant reports, but also having the right alerting mechanisms in place. For example, it will be important to flag when a risk factor is approaching non-modellable status, or when a desk is nearing the eligibility criteria for using the IMA.
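The sketch below gives the flavour of such alerts. The warning buffers are entirely illustrative, and the regulatory thresholds themselves would need to be configured to the applicable rules rather than taken from this example.

```python
from typing import Optional

def modellability_alert(real_price_count: int,
                        required: int = 24,
                        warn_margin: int = 4) -> Optional[str]:
    """Flag a risk factor that has lost, or is close to losing, modellable status.

    `required` reflects the commonly cited 24-real-price criterion;
    `warn_margin` is an arbitrary buffer chosen for illustration.
    """
    if real_price_count < required:
        return "NON_MODELLABLE"
    if real_price_count < required + warn_margin:
        return "APPROACHING_NON_MODELLABLE"
    return None

def backtesting_alert(exceptions_12m: int,
                      red_threshold: int,
                      amber_margin: int = 2) -> Optional[str]:
    """Flag a desk nearing its back-testing exception limit.

    `red_threshold` should be set to the applicable regulatory limit for the
    confidence level being tested; the amber buffer is illustrative.
    """
    if exceptions_12m >= red_threshold:
        return "IMA_ELIGIBILITY_AT_RISK"
    if exceptions_12m >= red_threshold - amber_margin:
        return "APPROACHING_LIMIT"
    return None

print(modellability_alert(25))                  # APPROACHING_NON_MODELLABLE
print(backtesting_alert(11, red_threshold=12))  # APPROACHING_LIMIT
```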
Data Validation and Cleansing
Expected Shortfall (ES) is inherently more sensitive to outliers than VaR, given that it focuses squarely on tail risk. Any incorrect data that creeps into models could therefore result in punitive capital charges, placing a more pressing obligation on firms to ensure models are fed with accurate, high-quality data. In addition, independent price verification processes will be required to ensure that illiquid assets are valued accurately.
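A short numerical example makes the point. The sketch below computes historical-simulation VaR and ES at the 97.5% level on a simulated P&L series and then injects a single spurious loss: the VaR figure barely moves, while ES jumps sharply.

```python
import numpy as np

def historical_var(pnl: np.ndarray, level: float = 0.975) -> float:
    """Historical simulation VaR: the loss at the given confidence level."""
    return -np.quantile(pnl, 1.0 - level)

def expected_shortfall(pnl: np.ndarray, level: float = 0.975) -> float:
    """Average loss in the tail beyond the VaR threshold."""
    threshold = -historical_var(pnl, level)
    tail = pnl[pnl <= threshold]
    return -tail.mean()

rng = np.random.default_rng(0)
clean = rng.normal(0.0, 1.0, 500)   # simulated daily P&L
dirty = clean.copy()
dirty[0] = -50.0                    # one erroneous data point slips through

print(historical_var(clean), expected_shortfall(clean))
print(historical_var(dirty), expected_shortfall(dirty))
# The spurious loss sits beyond the 97.5% quantile, so VaR is almost
# unchanged, but it drags the average of the tail (and hence ES, and the
# resulting capital) sharply higher.
```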
Data Visualisation
With the amount of data that needs to be analysed on an ongoing basis, data visualisation tools will be crucial to ensure each stakeholder has access to the information they need, displayed via visually intuitive dashboards that help spot patterns and anomalies. Interfacing with tools such as Microsoft Power BI and Tableau can help ensure such dashboards meet users’ requirements, are easily configurable, and remain cost-effective.
Data Lineage and Process Auditability
The revised market risk capital framework will place much greater importance on ensuring data quality, which in turn will prompt the need for firms to establish clear oversight of their data management operations. Knowing where data has come from, which validation checks were run, whether alternative sources are available, and how long it took to resolve exceptions will all become critical to business-as-usual operations.
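One simple way to picture this is a lineage record attached to every data item; the structure below is a hypothetical sketch of the kind of metadata involved, not a prescription for any particular platform.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import List, Optional

@dataclass
class LineageRecord:
    """Hypothetical audit-trail entry for a single data item."""
    data_item: str                        # e.g. a curve or risk factor identifier
    source: str                           # where the value came from
    loaded_at: datetime                   # when it entered the data store
    checks_run: List[str] = field(default_factory=list)   # validation checks applied
    exceptions: List[str] = field(default_factory=list)   # any failures raised
    resolved_at: Optional[datetime] = None                # when exceptions were cleared

    def time_to_resolve(self) -> Optional[timedelta]:
        """Elapsed time between load and exception resolution, if resolved."""
        if self.resolved_at is None:
            return None
        return self.resolved_at - self.loaded_at
```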
Conclusions
Data management has always played a key supporting role to the function of risk management. Yet recent regulatory initiatives by the Basel Committee have reinforced the importance of that role. While BCBS 239 succeeded in emphasizing best practices for data aggregation and reporting, FRTB threatens to have significantly more clout as a change driver across the industry.
FRTB has a direct impact on the business. As such, there will be more tangible benefits from getting data management right. And while the deadline for compliance may seem a long way into the future, the issue of data management should be seen as a pressing priority. Given the key strategic decisions facing many banks, it is vital that those decisions are based on accurate data.