PRMIA on Systemic Risk Part #2 – plus the OFR
Lewis Alexander (ex-US Treasury) carried on the theme of systemic risk at the PRMIA seminar “Risk, Regulation and Financial Technology & Identifying the Next Crisis”. He started by saying that whilst systemic risk is a risk to the economy and the industry as a whole, it is also relevant to the risks (such as market or credit risk) that a risk manager at an individual institution needs to assess.
Lewis said that there had really only been three systemic crises over the past century or so (1907, 1933 and 2008), with obviously many more disruptions in markets that should not be described as systemic. This points to one fundamental problem in assessing systemic risk: crises are rare events, so there is little data to analyse. He also warned that the way the system responds to small shocks should not be taken as a proxy for how it responds to large ones, that the relationship between asset prices and systemic risk is a complex one, and that reporting (mainly accounting, but also in risk) had not kept up with innovation in financial markets.
Lewis said that “stress test” methods can help to identify vulnerable institutions, but that this way of looking at systemic risk does not deal with the propagation of risk from one institution to another. He said that network analysis can help to assess propagation, but the weakness of these methods is the lack of counterparty data. Liquidity methods also suffer from a lack of data. He said that “Leading Indicators” (see past post on Bubble Indices) tell us little about what creates systemic risk.
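To make the network point concrete, below is a minimal sketch of a Furfine-style default cascade on an interbank exposure network: a bank fails once its losses on exposures to already-failed counterparties exceed its capital. This is my own toy illustration (the banks, exposures and capital figures are entirely made up), not anything presented at the seminar.

```python
# Furfine-style default cascade on a toy interbank network (illustrative only).
# A bank fails when its losses on exposures to failed counterparties exceed
# its capital; failures then propagate in rounds until no new bank fails.

exposures = {  # lender -> {borrower: exposure at default}
    "A": {"B": 50, "C": 30},
    "B": {"C": 40},
    "C": {"A": 20},
}
capital = {"A": 60, "B": 35, "C": 100}

def cascade(initial_failures):
    failed = set(initial_failures)
    while True:
        newly_failed = set()
        for bank, loans in exposures.items():
            if bank in failed:
                continue
            loss = sum(amount for borrower, amount in loans.items()
                       if borrower in failed)
            if loss > capital[bank]:
                newly_failed.add(bank)
        if not newly_failed:
            return failed
        failed |= newly_failed

# C's failure topples B (loss 40 > capital 35), and B's failure in turn
# topples A (loss 50 + 30 > capital 60) -> {'A', 'B', 'C'}
print(cascade({"C"}))
```

Lewis's complaint about missing counterparty data is visible even in this toy: without something like the exposures dictionary, the cascade simply cannot be computed.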
He mentioned three model-based measures: CoVaR (an extension of VaR that conditions on an institution being in distress), using CDS pricing to theoretically “insure” the industry against crisis, and a “Merton Model” approach to estimating potential losses due to default across a group of banks. He said that all of these models were good comparators, but not good as indicators.
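For readers unfamiliar with CoVaR: as defined by Adrian and Brunnermeier, it is the VaR of the system conditional on a given institution sitting at its own VaR level, and it is typically estimated by quantile regression. Here is a minimal sketch on synthetic data (my illustration, not a model from the talk):

```python
# CoVaR via quantile regression on synthetic returns (illustrative only).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 2000
inst = rng.standard_normal(n)                 # institution returns (synthetic)
system = 0.5 * inst + rng.standard_normal(n)  # system returns, correlated

q = 0.05  # 5% tail
res = sm.QuantReg(system, sm.add_constant(inst)).fit(q=q)

var_inst = np.quantile(inst, q)               # the institution's own VaR
covar = res.params[0] + res.params[1] * var_inst
# Delta-CoVaR: the shift in the system's tail when the institution moves
# from its median state to its VaR state.
delta_covar = res.params[1] * (var_inst - np.median(inst))
print(f"CoVaR: {covar:.3f}, dCoVaR: {delta_covar:.3f}")
```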
Given the previous talk on systemic risk, Lewis switched his focus to what can be done, with the main focus for him being data, where we need:
- Robust data on both asset and counterparty exposures
- Data on leverage through the system
- Data on the depth of liquidity to assess the vulnerability of assets to fire sales
A final few points from his talk:
- Dodd-Frank will help given its new reporting mandates, e.g. swap data repositories will be invaluable sources of data for regulators
- Could we use the payments/settlement system to provide yet more insight into what is going on, by sensibly tagging transactional flows? (DTCC take note, apparently!)
- SEC registration of new financial products could help to enforce what is reported and how, and could act as a limit on what products can be sold
- Lewis said that up to 5,000 attributes are enough to describe any financial transaction, so capturing this data comprehensively can be done
- As he became involved in the FSOC and the formation of the OFR, he initially thought that collecting all the data needed was impossible, but his view has changed given modern technology and processing power.
- The above said, he thought that until standards (such as the LEI, illustrated in the sketch after this list) were in place, it did not make sense for the OFR to start collecting data
- A member of the audience suggested that if data could be published in a standard form, it would be “Google to the rescue” in terms of doing aggregation across the industry without centralising the data in one store (maybe Google plans to usurp Microsoft Excel as the de facto trading and risk management system for the industry?)
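On the LEI point above, the standard itself is simple enough that checksum validation fits in a few lines. A minimal sketch (my illustration, not from the talk) of checking an ISO 17442 LEI, which is a 20-character alphanumeric code whose last two characters are check digits verified with ISO 7064 MOD 97-10 (the same scheme IBANs use):

```python
# Validate the checksum of an ISO 17442 Legal Entity Identifier (LEI).
# Letters map to 10..35, and the resulting number must satisfy mod 97 == 1.
def is_valid_lei(lei: str) -> bool:
    if len(lei) != 20 or not lei.isalnum():
        return False
    digits = "".join(str(int(c, 36)) for c in lei.upper())  # A=10 ... Z=35
    return int(digits) % 97 == 1

print(is_valid_lei("5493001KJTIIGC8Y1R12"))  # checksum-valid sample -> True
```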
Lewis gave a very good and interesting talk. I think some of his ideas on the OFR were good, but given the state of the data infrastructure that I have observed at many large institutions, I would worry that he is optimistic about how quickly the industry can pull all this data together, however standardised. I think the industry will get there (particularly if mandated), but given the legacy of past systems and infrastructure it will still take a good deal of time to achieve.