Data Management Panel
Thomson Reuters held a panel event on data management at their London offices on Tuesday last week, with speakers from Barcap, LCH.Clearnet, DB, Mizuho and Citi. The event was held as a follow-up to their recent report “Beyond Golden Copy”. Below are some of my notes on the summary points the panelists made:
- The Value of Data – Kris Bhattacharjee of Barcap said that there are currently two main drivers behind the perceived business value of data: i) regulators are expecting more information, adding additional requirements and conducting more ad hoc reporting requests; ii) business users/decision makers want a more granular understanding of trading and risk management data, in order to decide how best to allocate scarce capital across trading positions.
- Data Metrics – Kris said that the metrics are many, but timeliness of data is becoming a key one – over the past two years regulators have moved from allowing, say, two months as a reporting timeline down to 10 days recently. Timeliness is also vital as regulators demand ad hoc reporting in response to market events.
- Accuracy/Completeness – Again regulators are driving this, with “bad numbers in, bad numbers out” as the main motivation. Unsurprisingly, counterparty data is also being required at a new level of detail and accuracy, down to portfolio level, in light of the crisis.
- Granularity of Data – Deeper granularity of data is being driven by scarce capital and the need to understand how efficiently it is being used. Basel II has also driven greater granularity than Basel I. Reflecting what I have heard from some of our clients, Kris added that the data associated with securitised products has increased greatly, as people need to understand exposure/risk and pricing in more detail (rather than assume blanket statistical behaviour for a whole basket of assets).
- Stress Scenarios – Kris again mentioned the understanding of counterparty exposure driving the need for new data sets, as has the initiative of banks having “living wills” to allow a bank to be wound down in an orderly manner.
- Everybody has Left the Building! – Martin Taylor of LCH.Clearnet was a great speaker and said that the biggest new problem the collapse of Lehman’s created was that ordinarily there are people around to help with extracting counterparty exposures from systems. In the Lehman’s case there was nobody around to help, making the process very difficult and highlighting the need for changes to address this problem.
- Mandating Data Integrity – Martin added that data security, integrity and auditability are vital, and in particular emphasised that the people running the systems must have their own form of integrity, so that an institution knows its people can be trusted but is also capable of dealing with a situation where those people are not around to help. Martin felt that this level of data management should be mandated on the industry and that there is an awful lot finance could learn from industries such as pharmaceuticals in terms of product approval and the management/robustness of data.
- Data with No Cost or Value – Neil Fletcher of DB was another good speaker, who started his talk by saying that pre-crisis people thought of data as project based, otherwise dealt with it on an ad hoc basis, and considered data as having no cost or value. Institutions had a spaghetti approach to data, with systems/projects being process based rather than data based, i.e. systems get only the isolated data sets they need, only when they need them.
- Quality is Now the Data Driver – Neil said that, 18 months on from the crisis, whilst ROI is still important for data projects, quality of data is now the key driver.
- Sponsorship and Ownership of Data – Neil added that quality data is an asset, as are the systems that produce it, and that to ensure success data management projects need high-level business sponsorship, but also ongoing and clearly defined ownership of all data sets and their quality.
- Enterprise Data Virtualisation – Neil said that DB are embarking on a long-term project to ensure that all systems get data from the same logical place on a global basis, and that they are investing heavily in data virtualisation technology as a key means of achieving this goal. DB are starting with reference data, moving to transactional/positional data and on to other data types. For each type/category of data, ownership would be clearly defined across all systems, and the virtualisation layer would enable real-time transformation of the data into whatever format it is needed in (see the sketch below).
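To make the virtualisation idea more concrete, here is a minimal sketch of what such a layer might look like. This is purely my own illustration, not DB's actual implementation; the class, dataset name, loader and formats are all hypothetical:

```python
# A minimal sketch of a data virtualisation facade (my own illustration,
# not DB's implementation). Each logical dataset has a single owned
# source, and consumers request whatever format they need; the layer
# transforms on the fly rather than each system keeping its own copy.

import csv
import io
import json
from typing import Any, Callable, Dict, List

class DataVirtualisationLayer:
    def __init__(self) -> None:
        # One authoritative source per logical dataset: name -> loader.
        self._sources: Dict[str, Callable[[], List[dict]]] = {}
        # Output formatters, applied in real time on each request.
        self._formats: Dict[str, Callable[[List[dict]], Any]] = {
            "records": lambda rows: rows,
            "json": lambda rows: json.dumps(rows),
            "csv": self._to_csv,
        }

    def register(self, name: str, loader: Callable[[], List[dict]]) -> None:
        """Declare the single owned source for a logical dataset."""
        if name in self._sources:
            raise ValueError(f"{name} already has an owner/source")
        self._sources[name] = loader

    def get(self, name: str, fmt: str = "records") -> Any:
        """Fetch a dataset by logical name in the requested format."""
        rows = self._sources[name]()     # same logical place for everyone
        return self._formats[fmt](rows)  # transformed on the way out

    @staticmethod
    def _to_csv(rows: List[dict]) -> str:
        buf = io.StringIO()
        writer = csv.DictWriter(buf, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)
        return buf.getvalue()

# Usage: reference data is registered once, then every consuming system
# pulls it from the same logical place in whatever format it needs.
layer = DataVirtualisationLayer()
layer.register("counterparty_reference",
               lambda: [{"id": "CP001", "name": "Acme Bank", "rating": "A"}])
print(layer.get("counterparty_reference", fmt="json"))
print(layer.get("counterparty_reference", fmt="csv"))
```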
- Enterprise Data Model – Neil said that as a result of this virtualisation approach you have to invest in putting together an enterprise data model for all data used in an institution. From my point of view this could be interpreted as a move back to “big EDM” (with all the project risk that implies), but I guess it is being approached in a more staged manner.
- Lip Service to Data has Ended – Neil summarised by saying that lip service to data management ended with the onset of the crisis, and that 18 months on the enthusiasm for dealing with the data problem has not diminished.
- Publish/Validate/Subscribe – Simon Tweddle of Mizuho echoed a lot of what Neil said on the approach to global data management and ownership, but added that he believes the model of publish/subscribe needs to change to publish/validate/subscribe to ensure data quality (see the sketch below).
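As a rough illustration of Simon's point (again my own sketch, not Mizuho's implementation; the bus, topic and validator are hypothetical), a validation stage can sit between publishers and subscribers so that only data passing the registered quality checks is ever delivered:

```python
# A minimal publish/validate/subscribe sketch: the validate step sits
# between publish and subscribe, so bad data is quarantined rather than
# propagated to downstream systems.

from typing import Callable, Dict, List

Message = Dict[str, object]
Validator = Callable[[Message], bool]

class ValidatingBus:
    def __init__(self) -> None:
        self._validators: Dict[str, List[Validator]] = {}
        self._subscribers: Dict[str, List[Callable[[Message], None]]] = {}

    def add_validator(self, topic: str, check: Validator) -> None:
        self._validators.setdefault(topic, []).append(check)

    def subscribe(self, topic: str, handler: Callable[[Message], None]) -> None:
        self._subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic: str, msg: Message) -> None:
        # The validate step: quarantine bad data before delivery.
        if not all(check(msg) for check in self._validators.get(topic, [])):
            print(f"quarantined bad message on {topic}: {msg}")
            return
        for handler in self._subscribers.get(topic, []):
            handler(msg)

# Usage: a price feed where negative or missing prices never reach consumers.
bus = ValidatingBus()
bus.add_validator("prices", lambda m: isinstance(m.get("price"), (int, float))
                                      and m["price"] > 0)
bus.subscribe("prices", lambda m: print(f"received: {m}"))
bus.publish("prices", {"isin": "XS0001", "price": 101.5})  # delivered
bus.publish("prices", {"isin": "XS0002", "price": -1})     # quarantined
```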
Most of the panelists agreed that bringing in experience from other industries (pharma, oil & gas, internet search etc.) would be beneficial, since we should not assume that the financial markets have the expertise to get data management right first time. Martin of LCH.Clearnet was convinced that mandated data management would come and would be beneficial, which some of the other panelists did not agree with, suggesting instead that the industry needs to get ahead of the regulators to head this possibility off. Simon said that the focus on complex data/products was wrong given that the basics (what is our exposure to this counterparty?) were not being done (I am not sure I agree with this totally; both are needed given the losses from CDOs etc.). Overall it was a good panel with some interesting debate and speakers.