
Data Management Panel

Thomson Reuters held a panel event on data management at their London offices on Tuesday last week, with speakers from Barcap, LCH.Clearnet, DB, Mizuho and Citi. The event was held as a follow-up to their recent report “Beyond Golden Copy”. Below are some of my notes on the summary points the panelists made:

  • The Value of Data – Kris Bhattacharjee of Barcap said that there were currently two main drivers behind the perceived business value of data: i) regulators are expecting more information, adding additional requirements and conducting more ad hoc reporting requests; ii) business users/decision makers want a more granular understanding of trading and risk management data, in order to decide how best to allocate scarce capital across trading positions.
  • Data Metrics – Kris said that the metrics were many, but that timeliness of data was becoming a key one – over the past two years regulators have moved from allowing, say, two months as a reporting timeline down to ten days. Timeliness is also vital as regulators demand ad hoc reporting in response to market events.
  • Accuracy/Completeness – Again regulators are driving this, with “bad numbers in, bad numbers out” as the main motivation. Unsurprisingly, counterparty data is also being required at a new level of detail and accuracy, down to portfolio level, in light of the crisis.
  • Granularity of Data – Deeper granularity of data is being driven by scarce capital and the need to understand how efficiently it is being used. Basel II has also driven greater granularity than Basel I. Reflecting what I have heard from some of our clients, Kris added that the data associated with securitised products has increased greatly, as people need to understand exposure/risk and pricing in more detail (rather than assume blanket statistical behaviour for a whole basket of assets).
  • Stress Scenarios – Kris again mentioned the understanding of counterparty exposure driving the need for new data sets, as has the initiative of banks having “living wills” to allow a bank to be wound down in an orderly manner.
  • Everybody has Left the Building! – Martin Taylor of LCH.Clearnet was a great speaker and said that the biggest new problem created by the collapse of Lehman’s was that, ordinarily, there are people around to help with extracting from systems what the exposure is to the various counterparties. In the Lehman’s case there was nobody around to help, making the process very difficult and prompting changes to address this problem.
  • Mandating Data Integrity – Martin added that data security, integrity and auditability were vital, and in particular put emphasis on the people running the systems having their own form of integrity, so that an institution knows its people can be trusted but is also capable of dealing with a situation where those people are not around to help. Martin felt that this level of data management should be mandated on the industry, and that there was an awful lot finance could learn from industries such as pharmaceuticals in terms of product approval and the management/robustness of data.
  • Data with No Cost or Value – Neil Fletcher of DB was another good speaker who started his talk by saying that pre-crisis people thought of data as project based, otherwise dealt with it on an ad hoc basis, and considered data as having no cost or value. Institutions had a spaghetti approach to data, with systems/projects being process based rather than data based, i.e. systems get only the isolated data sets they need, and only when they need them.
  • Quality is Now the Data Driver – Neil said that 18 months on from the crisis, whilst ROI is still important for data projects, quality of data is now the key driver.
  • Sponsorship and Ownership of Data – Neil added that quality data is an asset, as are the systems that produce it, and that to ensure success, data management projects need high-level business sponsorship, but also ongoing and clearly defined ownership of all data sets and their quality.
  • Enterprise Data Virtualisation – Neil said that DB were embarking on a long term project to ensure that all systems get data from the same logical place on a global basis, and that they were investing heavily in data virtualisation technology as a key means of achieving this goal. DB are starting with reference data, moving to transactional/positional data and on to other data types. For each type/category of data ownership would be clearly defined across all systems and would enable real-time transformation of the data into whatever format it is needed in.
  • Enterprise Data Model – Neil said that as a result of this virtualisation approach you have to invest in putting together an enterprise data model for all data used in an institution. From my point of view this could be interpreted as a move back to “big EDM” (with all the project risk that implies), but I guess it is being approached in a more staged manner.
  • Lip Service to Data has Ended – Neil summarised by saying that lip service to data management has ended with the start of the crisis and that 18 months on the enthusiasm for dealing with the data problem has not diminished.
  • Publish/Validate/Subscribe – Simon Tweddle of Mizuho echoed much of what Neil said on the approach to global data management and ownership, but added that he believed the publish/subscribe model needs to change to publish/validate/subscribe to ensure data quality.
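The virtualisation idea Neil described – every system getting each data set from one logical place, with clearly defined ownership and real-time transformation into whatever format the consumer needs – can be sketched in a few lines. This is purely illustrative; the class and method names are my own, not anything from DB’s actual project:

```python
# Minimal sketch of a data virtualisation layer: one logical source per
# data set, a named owner for accountability, and on-the-fly transformation
# for each consumer. All names here are hypothetical.

class VirtualisationLayer:
    def __init__(self):
        self._sources = {}  # logical name -> callable returning raw records
        self._owners = {}   # logical name -> accountable owner

    def register(self, name, owner, fetch):
        """Declare the single logical source (and owner) for a data set."""
        self._sources[name] = fetch
        self._owners[name] = owner

    def get(self, name, transform=lambda rows: rows):
        """Every consumer goes through the same logical place, but each
        receives the data in whatever shape it asks for."""
        raw = self._sources[name]()
        return transform(raw)


layer = VirtualisationLayer()
layer.register("counterparty", owner="Reference Data Group",
               fetch=lambda: [{"id": "CP1", "rating": "AA"}])

# One consumer wants the full records, another just the ratings.
as_dicts = layer.get("counterparty")
ratings = layer.get("counterparty",
                    transform=lambda rows: [r["rating"] for r in rows])
```

The point of the sketch is that consumers never know (or care) where the data physically lives – swapping the underlying store only changes the registered fetch function.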
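Simon’s publish/validate/subscribe idea can likewise be sketched as a message bus with a validation stage between publishers and subscribers, so that bad records are quarantined rather than delivered. Again, this is a hedged illustration under my own assumptions, not anything Mizuho described in detail:

```python
# Hedged sketch of publish/validate/subscribe: validators sit between
# publishers and subscribers, so records failing any check never reach
# consumers. The validation rule below is illustrative only.

class ValidatedBus:
    def __init__(self):
        self._validators = []   # each is a callable: record -> bool
        self._subscribers = []  # each is a callable: record -> None
        self.rejected = []      # quarantined records for data stewards

    def add_validator(self, check):
        self._validators.append(check)

    def subscribe(self, handler):
        self._subscribers.append(handler)

    def publish(self, record):
        if all(check(record) for check in self._validators):
            for handler in self._subscribers:
                handler(record)
        else:
            self.rejected.append(record)


bus = ValidatedBus()
bus.add_validator(lambda r: r.get("price", 0) > 0)  # example rule: positive price

received = []
bus.subscribe(received.append)

bus.publish({"isin": "XS0001", "price": 101.5})  # passes validation
bus.publish({"isin": "XS0002", "price": -1})     # quarantined, never delivered
```

The design choice worth noting is that validation happens once, centrally, rather than in every subscriber – which is exactly the data-quality guarantee the publish/validate/subscribe model is meant to provide.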

Most of the panelists agreed that bringing in experience from other industries (pharma, oil & gas, internet search etc.) would be beneficial, since we should not assume that the financial markets have the expertise to get data management right first time. Martin of LCH.Clearnet was convinced that mandated data management would come and would be beneficial, which some of the other panelists disagreed with, suggesting instead that the industry needs to get ahead of the regulators to head this possibility off. Simon said that the focus on complex data/products was wrong given that the basics (what is our exposure to this counterparty?) were not being done (I am not sure I totally agree with this – both are needed, given the losses from CDOs etc.). Overall it was a good panel with some interesting debate and speakers.

Brian Sentance, 8 March 2010
