Xenomorph Blog

Posts categorized "Regulation"

Data Management Summit NYC from the A-Team

The A-Team put on another good event at DMS New York yesterday. Lots of good discussion, and here are a few takeaways that I remember, after a photo of Ludwig D'Angelo of JPMorgan:

WP_20141104_12_33_59_Raw

  • Data Utilities - One of the presenters said that "Data Utility" was a really overused term, second only to "Big Data". My comment would be that a lot of the managed services folks seem to want to talk about "Data Utilities", seeming to prefer that term to a plainer description of what they actually are - maybe because they perceive it as better marketing, and/or maybe because they hope to be anointed/appointed (how, I don't know) as an industry "Data Utility". Anyway, for me they fail to address the issue of client-specific data and its management very well, much to the detriment of their argument imho - although SmartStream did say that client data can be mixed into the data services they offer.
  • Andrew Gets Literaturally Physical - Andrew Delaney of the A-Team expressed a preference for "physical" books when talking about why the A-Team prints the Regulatory Data Handbook 2 as well as making it available online. I have to agree that holding a book still beats my Kindle experience, but maybe I am just getting old. Andrew should check out this YouTube video on how the book was first introduced...
  • FIBO - The Financial Industry Business Ontology (FIBO) was discussed in the context of trying to establish industry standards for data. As ever, I suspect the usage of words like "Ontology" leaves a lot of business folks looking for the nearest double shot of espresso, but that aside, it seems like the EDM Council are making some progress on developing this standard. The main point from the event was that industry adoption is key. I found some of the comments during the day a bit contradictory, in that some said that the regulators should not mandate standards (i.e. leave it to industry adoption and principles) but then in the next breath discussed the benefits (or otherwise) of the LEI (ok, not mandated but specific and coming from the regulators). Certainly the industry needs "help" (is that a strong enough word?) to get standards in place.
  • Data Quality - Lots on data quality, with assessing the business value of data quality initiatives being a key point. On the same subject, Predrag of element-22 said that the EDM Council will soon announce adoption of the Data Quality Index, which could be used to correlate data quality with operational KPIs for the business.
  • Regulation (doh!) - It wouldn't be a data management event without lots of discussion on regulation - a key point being that even those regulations that are not directly/explicitly about data still imply that data management is key (take CVA calcs for example) - and on a related note it was suggested that BCBS239 should be considered as a more general data management template for any business objective.
  • Entity Hierarchies/LEI - Ludwig D'Angelo of JPMorgan gave a great talk and said that vendors were missing a massive opportunity in delivering good hierarchy datasets to clients, and that the effort expended on this at firms was enormous. Ludwig said that the lack of hierarchies in the Legal Entity Identifier (LEI) is a gap that the private sector could and should fill. Ludwig also seemed initially to be thrown when one of the audience suggested that there were multiple "golden copies" of hierarchies needed, since definitions of ownership can differ depending on which department you are in (the old battle between risk and finance departments again). Good discussion later of how regulation was driving all systems to be much more entity-centric rather than portfolio-centric, emphasising the importance of getting entity hierarchies right.
  • DCAM - John Bottega did a great presentation on the Data Management Capability Assessment Model (DCAM). John asked Predrag of element-22 to speak about DCAM, and he said that unlike previous models (DMM) this framework would not only assess where you are in data management but also show you where you need to go. DCAM covers data management strategy / operations / quality / business case / data architecture / tech architecture / governance / program. From what I could see it looked like a great framework - it appeared like common sense and obvious, but that is in itself difficult to achieve, so good effort I think. Element-22 will offer an online service around DCAM that will also allow anonymous benchmarking of data management capabilities as more institutions get involved (update: the service is called pellustro).
  • BCBS239 - Big thanks to John M. Fleming of BNY Mellon and Srikant Ganesan of Risk Focus for taking part in the panel with me. There was less focus on spreadsheet use and abuse on this panel, unlike the London panel from last month. John had some very practical ideas, such as using wikis to publish/gather data dictionary information, and that with a large legacy infrastructure you are better off documenting differences in definitions across systems rather than trying to change the world from day one. Echoing some of the points from DMS London, it was thought that making the use of internal data standards part of project sign-off was very pragmatic data governance, but also that some systems should be marked/assessed as obsolete/declining and hence blocked from any additional usage in new project work. A bit of a plug for some of our recent work on data validation and exception management, but the panel said that BCBS239 needs to encompass audit/lineage on calculations/derived data/rules in addition to just the raw data (see the sketch just after this list).
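
As a footnote on that last point, here is a minimal sketch of what rule-based validation with exception capture can look like. This is purely illustrative and hypothetical (it is not TimeScape's or anyone else's implementation, and all instruments, feeds, rules and thresholds are invented); the point is simply that each exception records which rule fired and where the offending value came from, which is the kind of audit/lineage context the panel was asking for, extended to derived data and rules as well as raw data.

```python
from dataclasses import dataclass
from datetime import date
from typing import Callable, List, Tuple

@dataclass
class PriceRecord:
    instrument: str
    as_of: date
    price: float
    source: str                      # lineage: where the raw value came from

# A validation rule is a (name, predicate) pair; the predicate returns True when the record passes.
Rule = Tuple[str, Callable[[PriceRecord], bool]]

RULES: List[Rule] = [
    ("non_negative_price", lambda r: r.price >= 0.0),
    ("price_not_stale", lambda r: (date(2014, 11, 5) - r.as_of).days <= 5),
]

def validate(records: List[PriceRecord], rules: List[Rule]) -> List[dict]:
    """Run every rule over every record and return exceptions with enough context for audit."""
    exceptions = []
    for rec in records:
        for name, passes in rules:
            if not passes(rec):
                exceptions.append({
                    "rule": name,                    # which rule fired
                    "instrument": rec.instrument,
                    "as_of": rec.as_of.isoformat(),
                    "value": rec.price,
                    "source": rec.source,            # lineage back to the raw feed
                })
    return exceptions

if __name__ == "__main__":
    sample = [
        PriceRecord("XYZ Corp 5Y CDS", date(2014, 11, 3), -12.5, "vendor_feed_A"),
        PriceRecord("ABC Corp equity", date(2014, 10, 1), 101.2, "vendor_feed_B"),
    ]
    for exc in validate(sample, RULES):
        print(exc)
```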

You can get more on the day by taking a look at my feed via @TheLongSentance and at the wider conversation at #DMSNYC.

 

Posted by Brian Sentance | 5 November 2014 | 11:42 pm


Banking Reloaded from Capco and Zicklin

Great event by Capco and the Zicklin Business School at Baruch College in NYC yesterday. Topics ranged from high frequency trading and systemic risk through to wealth management and bitcoin. The agenda is here and you can see some of the highlights on twitter at #BankingReloaded.

Posted by Brian Sentance | 30 October 2014 | 11:40 am


TabbForum MarketTech 2014: Game of Smarts

A great afternoon event put on by TabbFORUM in New York yesterday, with a number of panels and one-on-one interviews (see agenda). You can see some of what went on at the event via the hashtag #TabbTech or via the @XenomorphNews feed.

WP_20141015_16_36_01_Raw

"Death of Legacy" Panel Discussion

Posted by Brian Sentance | 16 October 2014 | 8:52 pm


A-Team DMS London Event and BCBS239 Panel

Good day at the A-Team's DMS London event last Wednesday. The day started with Tom Dalglish doing a pretty passable impression of a stand-up comedian in the morning keynote - not exactly an easy thing to do if 1) you are asked to do it very much at the last minute and 2) this is data management, not the subject most comedians would immediately reach for. So due kudos to Tom, and some of the comments he made about technology architects and technology builders were funny and resonated with the audience, such as this quote coming from a technologist: "How can I give you the requirements, I haven't finished the code yet?" (I think we have all been there on that one a few times in our careers...).

You can find some of the main points from the various panels via @XenomorphNews or more generally at #dmslondon (you could also find out a bit via my twitter account @TheLongSentance so long as you don't mind the odd photograph and a few bits of personal baggage now and again).

BCBS239 Panel - I took part in the panel on BCBS239 on risk data aggregation and reporting, something which I have written about before, and obviously a prime example of how regulation is influencing (dictating?) financial markets institutions to take data management seriously. Dennis Slattery of EDMWorks moderated the panel, and on the panel with me were Sally Hinds of DCMS and Mikael Soboen, head of risk systems at BNP Paribas.

IMG_3401

BCBS239 Panel at DMS London

Dennis started by outlining the four pillars of BCBS239:

  • Pillar 1 “Overarching Governance and Infrastructure.”
  • Pillar 2 “Risk aggregation” capabilities.
  • Pillar 3 “Risk reporting” capabilities.
  • Pillar 4 “Supervisory review, tools and cooperation.”

Regulatory Chicken - Dennis started by asking the panel whether BCBS239 was another game of regulatory "chicken", where the approach of "principles" means that 1) the banks do the minimum and wait for the regulators to inspect and tell them what they specifically have to do, and 2) the regulators don't really want to be more specific beyond principles because they themselves are unsure of what is needed and want to learn from what different banks have done. The general consensus from the panel debate was that firms were not doing as much as they could, but that banks needed to show at least that they had a program in place and running by the January 2016 deadline or face big issues with the regulators (so the game of regulatory chicken is "on" seems to be the conclusion). Mikael Soboen added that he was unsure whether his regulator would have the time to conduct a BCBS239 review given the workload that regulators currently face.

The End of Spreadsheets? - Dennis asked whether BCBS239 and its requirements for clear data lineage sounded the bell for the end of spreadsheet usage at banks. I said not - I personally feel that a lot of folks in technology underestimate how difficult using software is for many business users, and tools like spreadsheets that make manipulating data easy will have a role for the foreseeable future. I suggested that spreadsheets are a great ad hoc reporting and analysis tool, and things mainly go wrong when they are used as a personal, "siloed" desktop database.

BCBS239 does not itself preclude the usage of spreadsheets and end user computing, but rather, like a lot of regulation, says that their usage must be taken seriously - in my view there is a tendency for some in IT to regard spreadsheets as someone else's problem, which is understandable but problematic for any CDO. Also there are approaches to spreadsheet usage that can help maintain data lineage, such as what Microsoft offers with web provision of spreadsheet dashboards using Power View and Power BI (used in our TimeScape MarketPlace offering), folks such as Cluster7 with their "closed circuit TV" for spreadsheet monitoring, and indeed Xenomorph with our SpreadSheet Inside approach of including centralised spreadsheet-like calculations as a supported data type within the audited data management process.

Data Dictionary - Mikael said that one responsibility he had was to represent the investment bank within the wider data dictionary initiatives under way at the retail bank due to BCBS239, and said that this was challenging given the different terminology sometimes used.

Is BCBS239 a Project or Data Governance? - The panel thought that the best approach was to use BCBS239 as a framework for compliance with current regulation and regulation to come, although obviously this is subject to having the budget to do so. There were some general comments on how the data management needs of the front office and risk were converging. Standards such as FIBO were also discussed, with the feedback being that they are desirable but that it is early days and their immaturity means they are currently used mainly for specific areas such as modeling counterparty data.

Overall a good panel (I hope!) with a good amount of audience questions and participation. Again, you can find some of the main points from the various panels via @XenomorphNews or more generally at #dmslondon (you could also find out a bit via my twitter account @TheLongSentance so long as you don't mind the odd photograph and a few bits of personal baggage now and again).

IMG_3352

A bit of fun - Brian looking up to Ron Wilbraham at DMS London

Posted by Brian Sentance | 13 October 2014 | 4:50 pm


A-Team DMS Awards 2014 - Xenomorph on the Cloud

A-Team’s DMS Data Management Awards close on the 26th of September so if you haven't already, please vote for Xenomorph!

Xenomorph on the Cloud - First of a few lookbacks at what we have been doing over the past year - firstly with a short animation about one of our major initiatives this year, cloud provision of data management and a new venture into cloud-based data publishing with the TimeScape MarketPlace

So it would be fantastic if you could support Xenomorph by voting here

Thank you!

Posted by Brian Sentance | 11 September 2014 | 6:21 pm


Cloud, data and analytics in London - thanks for coming along!

We had over 60 folks along to our event at the Merchant Taylors' Hall in London last week. Thanks to all who attended, all who helped with the organization of the event, and sorry to miss those of you that couldn't come along this time.

Some photos from the event are below starting with Brad Sevenko of Microsoft (Director, Capital Markets Technology Strategy) in the foreground with a few of the speakers doing some last minute adjustments at the front of the room before the guests arrived:

AzureUK-1

 

Rupesh Khendry of Microsoft (Head of World-Wide Capital Markets Solutions) started off the presentations at the event, introducing Microsoft's capital markets technology strategy to a packed audience:

AzureUK-3

 

After a presentation by Virginie O'Shea of Aite Group on Cloud adoption in capital markets, Antonio Zurlo (below) of Microsoft (Senior Program Manager) gave a quick introduction to the services available through the Microsoft Azure cloud and then moved on to more detail around Microsoft Power BI:

AzureUK-5

 

After Antonio, yours truly (Brian Sentance, CEO, Xenomorph) gave a presentation on what we have been building with Microsoft over the past 18 months, the TimeScape MarketPlace. At this point in the presentation I was giving some introductory background on the challenges of regulatory compliance and the pros and cons of point solutions versus having a more general data framework in place:

AzureUK-6

 

The event ended with some networking and further discussions. Big thanks to those who came forward to speak with me afterwards, great to get some early feedback.

AzureUK-8

 

Posted by Brian Sentance | 1 July 2014 | 10:54 am


Cloud, data and analytics in London. Tomorrow Wednesday 25th June.

One day to go until our TimeScape MarketPlace breakfast briefing "Financial Markets Data and Analytics. Everywhere You Need Them" at Merchant Taylors' Hall tomorrow, Wednesday June 25th. With over ninety people registered so far it should be a great event, so if you can make it please register and come along - it would be great to see you there.

Posted by Brian Sentance | 24 June 2014 | 10:25 am


Cloud, data and analytics in London. Next Wednesday June 25th.

Less than one week to go until our TimeScape MarketPlace breakfast briefing "Financial Markets Data and Analytics. Everywhere You Need Them" at Merchant Taylors' Hall on Wednesday June 25th. 

Come and join Xenomorph, Aite Group and Microsoft for breakfast and hear Virginie O'Shea of the analyst firm Aite Group offering some great insights from financial institutions into their adoption of cloud technology, applying it to address risk management, data management and regulatory reporting challenges.

Microsoft will be showing how their new Power BI can radically change and accelerate the integration of data for business and IT staff alike, regardless of what kind of data it is, what format it is stored in or where it is located.

And Xenomorph will be demonstrating the TimeScape MarketPlace, our new cloud-based data mashup service for publishing and consuming financial markets data and analytics. 

In the meantime, please take a look at the event and register if you can come along, it would be great to see you there.

Posted by Brian Sentance | 19 June 2014 | 9:55 am


Financial Markets Data and Analytics. Everywhere London Needs Them.

Pleased to announce that our TimeScape MarketPlace event "Financial Markets Data and Analytics. Everywhere You Need Them" is coming to London, at Merchant Taylors' Hall on Wednesday June 25th. 

Come and join Xenomorph, Aite Group and Microsoft for breakfast and hear Virginie O'Shea of the analyst firm Aite Group offering some great insights from financial institutions into their adoption of cloud technology, applying it to address risk management, data management and regulatory reporting challenges.

Microsoft will be showing how their new Power BI can radically change and accelerate the integration of data for business and IT staff alike, regardless of what kind of data it is, what format it is stored in or where it is located.

And Xenomorph will be demonstrating the TimeScape MarketPlace, our new cloud-based data mashup service for publishing and consuming financial markets data and analytics. 

In the meantime, please take a look at the event and register if you can come along, it would be great to see you there.

Posted by Brian Sentance | 11 June 2014 | 7:50 pm


Clients and Partners. Everywhere You Need Them.

Quick thank you to the clients and partners who took some time out of their working day to attend our breakfast briefing, "Financial Markets Data and Analytics. Everywhere You Need Them.", at Microsoft's Times Square offices last Friday morning. Not particularly great weather here in Manhattan, so it was great to see around 60 folks turn up...

Photo 1

 
Rupesh Khendry of Microsoft (Head of World-Wide Capital Markets Solutions) started the event and set out the agenda for the morning. Rupesh described the expense of data within financial markets, and the difficulties experienced by risk managers in pulling together all the data and analytics they need...
Photo 2
 
...and following Rupesh was Antonio Zurlo (below) of Microsoft (Senior Program Manager), who explained the fundamentals of Microsoft Azure and what services and infrastructure it offers, including public cloud, virtual private cloud and hybrid cloud architectures. Antonio also described a key usage pattern for HPC/grid on Azure, being used to "burst to the cloud" when on-premise infrastructure needs to be extended for end of day/intra-day risk calcs...
Photo 3
 
Sang Lee (below) of Aite Group (Managing Partner) then delivered his presentation, "Floating in the Capital Markets Cloud: Moving Beyond Data Storage". Sang's main findings from a survey of 20 financial institutions were that concerns about security and SLAs relating to cloud usage remain, but even those that were concerned said they were planning to start a cloud project within the next 24 months. Cloud technology seems to be becoming more acceptable of late, and Sang said this seems to be due to regulation, cost pressures and the desire to offer better services to clients. Sang confirmed that HPC/Grid with "burst to the cloud" is a common usage pattern and that "Data as a Service" is becoming more popular... 
Photo 4
 
Fred Veasley (below) of Microsoft (Tech Solutions Professional) then introduced Microsoft Power BI and Office 365. Fred explained how Power BI extends the capabilities of Excel with data search (finding and retrieving published data sources both within an organization and over the web), its integration capabilities with standard databases, NoSQL databases, data standards such as OData, and new APIs/sources of data such as Facebook. Once downloaded, the data can be shaped and merged with other datasets (for instance combining data from positions databases/systems with analytics and data from the cloud), and kept up to date automatically. In addition to Power BI, Power View enables great visualizations and interactive dashboards to be created, and once finalized these can be deployed centrally via web pages down to end users...
Photo 5
 
After Fred, Brian Sentance (below), CEO of Xenomorph, explained the origins of the TimeScape MarketPlace. Based on some discussions with Microsoft about 18 months back, the idea was effectively firstly to get TimeScape running in the Microsoft Azure cloud, secondly to turn the data management capabilities of TimeScape "upside-down" by using it as a means to upload and publish data to the cloud, and thirdly to provide one-to-many access to multiple sources of data via web interfaces and key delivery tools such as Microsoft Power BI. Put another way, without any local software or hardware infrastructure, both business users and IT staff can access multiple data sources in the same format and using the same data model wherever the data is needed. In addition to .NET and Java interfaces to the TimeScape MarketPlace via OData, web API delivery into F#, Python, R and MATLAB is also in development...
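
As a rough illustration of what that OData-style access can look like from a language such as Python, here is a minimal sketch. The endpoint URL, entity set and field names below are entirely hypothetical (this is not the actual TimeScape MarketPlace API); the point is just that OData exposes data over plain HTTP with standard query options such as $filter, $orderby and $top, so any environment with an HTTP client can consume the same feed.

```python
import requests

# Hypothetical OData endpoint and entity set - illustrative only, not the real
# TimeScape MarketPlace URL or schema.
BASE_URL = "https://example-marketplace.cloudapp.net/odata"

def get_observations(instrument_id, top=100):
    """Fetch rows from an OData entity set using standard OData query options."""
    params = {
        "$filter": "InstrumentId eq '{0}'".format(instrument_id),
        "$orderby": "Date asc",
        "$top": str(top),
        "$format": "json",
    }
    response = requests.get(BASE_URL + "/Observations", params=params, timeout=30)
    response.raise_for_status()
    payload = response.json()
    # OData JSON responses typically wrap rows in "value" (v3/v4) or "d" (v2),
    # depending on the version of the service.
    return payload.get("value", payload.get("d", payload))

if __name__ == "__main__":
    for row in get_observations("EXAMPLE.TICKER", top=5):
        print(row)
```
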
Photo 1 - Copy
 
...and in addition to downloading data via Power BI, Brian also demonstrated how you could build on the data using Power View to create powerful analytical dashboard functionality that could be built and tested in Excel, then deployed centrally within a browser for access by users outside of Excel. He added that partners were one of the key aspects of the platform, and introduced the TimeScape MarketPlace Partner Program to get data, analytics and model vendors, plus software and service vendors, involved and building on the platform. Andrew Tognela (below) of Microsoft (Worldwide Managing Director) closed the presentations...
Photo 4 - Copy

Posted by Brian Sentance | 14 May 2014 | 8:49 pm


7 days to go - Financial Markets Data and Analytics. Everywhere You Need Them.

Quick reminder that there are just 7 days left to register for Xenomorph's breakfast briefing event at Microsoft's Times Square offices on Friday May 9th, "Financial Markets Data and Analytics. Everywhere You Need Them."

With 90 registrants so far it looks to be a great event with presentations from Sang Lee of Aite Group on the adoption of cloud technology in financial markets, Microsoft showing the self-service (aka easy!) data integration capabilities of Microsoft Power BI for Excel, and introducing the TimeScape MarketPlace, Xenomorph's new cloud-based data mashup service for publishing and consuming financial markets data and analytics.

Hope to see you there and have a great weekend!

 

Posted by Brian Sentance | 2 May 2014 | 6:34 pm


Xenomorph Releases TimeScape Data Validation Dashboard

Very pleased to announce general availability of the TimeScape Data Validation Dashboard, which we announced this morning. You can find out more here. A big thank you to all the staff and clients involved, who have helped us put this together over the past year. 

Posted by Brian Sentance | 30 April 2014 | 3:30 pm


Regulatory, Compliance, and Risk Data Technology Challenges - PRMIA

The New York Chapter of PRMIA hosted "Regulatory, Compliance, and Risk Data Technology Challenges" at Credit Suisse's offices in New York last Thursday, 10th April. Abraham Thomas introduced the panelists, and Don Wesnofske started off by setting the scene for the evening's event.

Don outlined how in reaction to the 2008 Crisis the regulators now require data retention for up to 10 years or more. Don cited one particular example where data must be reconstructed within 24 to 48 hours for any date up to 7 years back, and said that this kind of "forensic" investigation capability was an important consideration for many financial institutions. He took us through a good presentation slide of his view on data management/risk architecture, and outlined how operational risk is comprised of people, process, technology and events. Don ended his presentation by taking us through Wikipedia's definition of "Big Data", and in particular talked about how data has a life cycle going through:

  • Production
  • Retention
  • Archive
  • Purge

Don then handed over to Luigi Mercone, a Director of Engineering Strategy & Architecture at Credit Suisse. Luigi started by saying that to the business at CS, he is technical support, which involves asking "What is on fire today? And what's going to be on fire tomorrow?" Luigi described how some time back CS had a regulatory enquiry around their equities business which required them to reconstruct data from 2 years back.

The project to do this took around 4-5 months of database administrators' time to reconstruct the world as at that point in time (I guess because tape storage was being used, and this needed restoring to disk/database). This was for an equity order management system that had doubled in size every year for the past 17 years, and at that point CS was only retaining data going back 2 years. Luigi said it was then thought that new regulations requiring the ability to produce forensic evidence at any point in time would potentially swamp CS's resources unless this was addressed head on and strategically. 

Luigi described the original architecture they were using as being based on an in-memory database for intraday workloads, then standard Sybase (probably ASE I guess) and then Sybase IQ for longer term archiving, taking advantage of the column-store capabilities of Sybase IQ and the resulting data compression possible. He added that the data storage requirements of the system had grown from 150TB to 1.2PB in 4 years.

Luigi then offered a comparison of this original architecture with what he found by implementing RainStor: in the original architecture the Sybase IQ database compressed data down to 160TB, whereas this was improved by a further factor of 10, down to 14TB, using RainStor. He said that RainStor was self-service, providing a standard SQL interface, eliminated the need for tape storage, reduced the system "footprint" by 90% at CS, was 1/5 of the cost and the performance was good. (I should caveat here that I know nothing of the original architecture other than the summary Luigi provided, and as such it is hard to judge whether the original architecture was optimal for the data growth experienced, and hence whether this was overall an objective comparison of Sybase IQ's capabilities with RainStor.) Luigi closed by saying that whilst RainStor was a great archive database, its origins were in in-memory databases and he would encourage RainStor to re-enter that market too, given his experience so far. 

John Bantleman, CEO of RainStor, took over and described how RainStor had been designed specifically for the needs of data archiving (I guess talking more about what it does now rather than its origins outlined by Luigi above). He said that RainStor offers a 20-40x storage footprint reduction over traditional database technology and operates efficiently even at the Petabyte (PB) scale, based around RainStor's proprietary database technology making use of columnar storage and being capable of storing data both in relational-style tabular format and also in more "document" style using XML and JSON formats with key-value access. John mentioned that in terms of storing data, not only could RainStor retrieve data as at a point in time, but it could also retrieve the schema being used at that point in time, for a more complete view of the state of the world at that point. This echoes a couple of past articles that I have penned, one for IRD and one for Wilmott Magazine, on bitemporal regulatory requirements.
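
To illustrate the bitemporal idea itself (this is just a toy in Python, not RainStor's implementation), the sketch below keeps two time axes per row: business ("valid") time for when a fact was true in the real world, and system ("recorded") time for when the database knew about it. An "as of" query then fixes a point on both axes, which is what allows you to reconstruct what was believed on a past date, corrections included.

```python
from dataclasses import dataclass
from datetime import date
from typing import List, Optional

@dataclass
class BitemporalRow:
    key: str
    value: float
    valid_from: date
    valid_to: Optional[date]        # None = still valid in the real world
    recorded_from: date
    recorded_to: Optional[date]     # None = never superseded in the database

def as_of(rows: List[BitemporalRow], key: str, valid_date: date, system_date: date):
    """Return the value for `key` as it was true on valid_date, as the system knew it on system_date."""
    def in_range(start, end, d):
        return start <= d and (end is None or d < end)
    candidates = [r for r in rows
                  if r.key == key
                  and in_range(r.valid_from, r.valid_to, valid_date)
                  and in_range(r.recorded_from, r.recorded_to, system_date)]
    return candidates[-1] if candidates else None

rows = [
    # Original booking of a trade price, later corrected on 1 Feb 2012.
    BitemporalRow("trade42.price", 100.0, date(2012, 1, 3), None, date(2012, 1, 3), date(2012, 2, 1)),
    BitemporalRow("trade42.price", 101.5, date(2012, 1, 3), None, date(2012, 2, 1), None),
]

# What did we believe on 15 Jan 2012 about the 3 Jan 2012 price? -> 100.0 (correction not yet recorded)
print(as_of(rows, "trade42.price", date(2012, 1, 3), date(2012, 1, 15)))
# What do we believe today about the same business date? -> 101.5
print(as_of(rows, "trade42.price", date(2012, 1, 3), date(2014, 4, 17)))
```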

John said that regulation was driving the need for data archiving capabilities, with 1400 regulations added since 2008 (not sure of the source, but believable) and the comment from a Chief Data Officer (CDO) at one financial markets client that if a project isn't driven by regulatory compliance then the project isn't going to get done (certainly sounds like regulatory overload). John's opening remarks were really around how regulatory cost, complexity and compliance were driving forces behind the growth of RainStor in financial services technology, and whilst regulation is the driver, firms should look at archiving of data as an opportunity too, in order to create value from corporate memory, and to be proactive in addressing future reporting and analysis needs.

John illustrated the regulatory need for data archiving through the Consolidated Audit Trail (CAT) regulation, where data retention over 7 years will generate 100PB of data. He also mentioned SEC Rule 17a-4 for broker-dealers as another example of "data retention" regulation, with particular reference to storage of records in non-rewriteable, non-erasable format. John termed this WORM storage, meaning Write Once, Read Many. John seemed to imply that both the software (RainStor) and the hardware it runs on (e.g. EMC or Teradata etc.) need to be WORM compliant. One of the audience members asked John about BCBS 239, to which John said that he didn't know that particular regulation (fair enough in my opinion: RainStor's tech is general about "data" and is applicable across many industries, whereas BCBS 239 is obviously about banks specifically and is more about data aggregation and reporting than data retention/archiving to my understanding, and this seems to be confirmed by a quick doc scan for "archive" or "retention").

To finish off the main part of the event (before the drinks and food began) there was a panel discussion. Luigi said that it was best to "prepare for all time, not just specifics" with respect to data retention, and that there were dangers in rolling up data (effectively aggregating and losing granularity to reduce storage needs). John added that his definition of "Big Data" was "All information, for ever". Luigi added that implementing RainStor had allowed CS to spend more time on interesting questions rather than on database restoration. John proposed that version 1 of Big Data involved the retention of web data, where losing a data point here and there didn't matter. Version 2 of Big Data is concerned more with enterprise data, where all data has value and needs to be retained, i.e. lots of high value data. He added that this was an opportunity for risk and compliance to become an asset. 

WP_20140410_20_27_09_Raw

Abraham (second from left), Don (center) and John (second from right)

Overall it was a good event which I found very interesting (but I have to admit to a certain geeky interest in this kind of tech). The event would maybe have benefitted from another competitive or complementary technology vendor being involved, plus perhaps an academic to give a different slant on data retention and on what the regulators hope to gain from this kind of mandated data retention. Not that the regulators have been that good at managing data themselves recently.

WP_20140410_19_52_58_Raw

 Networking afterwards courtesy of Credit Suisse and RainStor

Posted by Brian Sentance | 17 April 2014 | 2:05 pm


Financial Markets Data and Analytics. Everywhere You Need Them.

Very pleased to announce that Xenomorph will be hosting an event, "Financial Markets Data and Analytics. Everywhere You Need Them.", at Microsoft's Times Square New York offices on May 9th.

This breakfast briefing includes Sang Lee of the analyst firm Aite Group offering some great insights from financial institutions into their adoption of cloud technology, applying it to address risk management, data management and regulatory reporting challenges.

Microsoft will be showing how their new Power BI can radically change and accelerate the integration of data for business and IT staff alike, regardless of what kind of data it is, what format it is stored in or where it is located.

And Xenomorph will be introducing the TimeScape MarketPlace, our new cloud-based data mashup service for publishing and consuming financial markets data and analytics. More background and updates on MarketPlace in coming weeks.

In the meantime, please take a look at the event and register if you can come along, it would be great to see you there.

Posted by Brian Sentance | 15 April 2014 | 2:57 pm


Innovations in Liquidity Risk Management - PRMIA

PRMIA put on an event at MSCI on Wednesday, called "Innovations in Liquidity Risk Management".

 

WP_20140326_18_04_09_Raw

Melissa Sexton of Morgan Stanley introduced the agenda, saying that the evening would focus on three aspects of liquidity risk management:

  • methodology
  • industry practice
  • regulation

LiquidityMetrics by MSCI - Carlo Acerbi of MSCI then took over with his presentation on "LiquidityMetrics". Carlo said that he was pleased to be involved with MSCI (and RiskMetrics, acquired by MSCI) in that it had helped to establish and define standards for risk management that were used across the industry. He said that liquidity risk management was difficult for two reasons:

  • Clarity of Definition - Carlo suggested that if he asked the audience to define liquidity risk he would receive 70 differing definitions. Put another way, he suggested that liquidity risk was "a strange animal with many faces".
  • Data Availability - Carlo said that there were aspects of the market that were unobservable, and hence data was scarce/non-existent; as such this was a limit on the validity of the models that could be applied to liquidity risk.

Carlo went on to clarify that liquidity risk is different depending upon the organization type/context being considered, with banks obviously focusing on funding. He said that LiquidityMetrics was focused on asset liquidity risk, and as such was more applicable to the needs of asset managers and hedge funds given recent regulation such as UCITS/AIFMD/Form PF. The methodology is aimed at bringing traditional equity market impact models out from the trading floor into risk management and across other asset classes. 

Liquidity Surfaces - LiquidityMetrics measures the expected price impact for an order of a given size, and as such has dimensions in:

  • order size
  • liquidity time horizon
  • transaction costs

The representation shown by Carlo was of a "liquidity surface" with the x dimension being order size (both bid and ask around 0), the y dimension the time horizon for liquidation, and the z (vertical) dimension transaction cost. The surface shown had a U-shaped cross section around zero order size, at which point the transaction cost is half the bid-ask spread (this link illustrates my attempt at verbal visualization). The U-shaped cross section indicates "Market Impact", its shape over time "Market Elasticity", and the limits beyond which it is not observable "Market Depth". 
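
For anyone who prefers code to my verbal visualization, below is a deliberately crude sketch of such a surface in Python. To be clear, this is not MSCI's LiquidityMetrics methodology: the functional form and every parameter are invented, purely to illustrate the three dimensions, the half-spread floor at zero order size, costs falling as the liquidation horizon lengthens, and costs becoming unobservable beyond some notion of market depth.

```python
import numpy as np

def toy_liquidity_surface(order_size, horizon_days,
                          half_spread_bps=2.0,      # cost floor at zero order size
                          impact_coeff_bps=25.0,    # invented impact scale
                          daily_volume=1_000_000,   # invented tradable volume per day
                          exponent=0.6,             # invented concavity of the impact term
                          max_participation=0.25):  # beyond this, treat the market as having no depth
    """Toy transaction-cost surface in basis points; purely illustrative, not a calibrated model."""
    participation = np.abs(order_size) / (daily_volume * horizon_days)
    cost = half_spread_bps + impact_coeff_bps * participation ** exponent
    return np.where(participation > max_participation, np.nan, cost)

sizes = np.array([-500_000, -100_000, 0, 100_000, 500_000])
for days in (1, 5, 20):
    print(days, "day horizon:", np.round(toy_liquidity_surface(sizes, days), 2))
```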

Carlo then moved on to consider a portfolio of instruments, and how obligations on an investment fund (a portfolio) can be translated into the estimated transaction costs of meeting those obligations, so as to quantify the hidden costs of redemption in a fund. He mentioned that LiquidityMetrics could be used to quantify the costs of regulations such as UCITS/AIFMD/Form PF. There was some audience questioning about portfolios of foreign assets, such as holding Russian bonds (currently topical for an audience member maybe?). Carlo said that you would use the liquidity surfaces for both the bond itself and the FX transaction (and in FX, there is much data available). He was however keen to emphasize that LiquidityMetrics was not intended to be used to predict "regime change", i.e. it is concerned with transaction costs under normal market conditions. 

Model Calibration - In terms of model calibration, Carlo said that the established equity market impact models (see this link for some background for instance) have observable market data to work with. In equity markets, traditionally there was a "lit" central trading venue (i.e. an exchange) with a star network of participants fanning out from it. In OTC markets such as bonds, there is no star network but rather many-to-many linkages established between all market participants, where each participant may have a network of connections of different size. As such there has not been enough data around to calibrate traditional market impact models for OTC markets. As a result, Carlo said that MSCI had implemented some simple models with a relatively small number of parameters. 

Two characteristics of standard market impact models are:

  1. Permanent Effects - this is where the fair price is impacted by a large order and the order book is dragged along to follow this.
  2. Temporary Effects - this is where the order book is emptied but then liquidity regenerates

Carlo said that the effects were obviously related to the behavioural aspects of market participants. He said that the bright side for bonds (and OTC markets) was that, given the trades are private, there is no public information, and price movements are often constrained by theoretical pricing; therefore permanent effects can be ignored and the fair price is insensitive to trading (again under "normal" market conditions). Carlo then moved on to talk about some of the research his team is doing looking at the shape of the order book and the time needed to regenerate it. He talked of "Perfectly Elastic" markets that digest orders immediately and "Perfectly Plastic" markets that never regenerate, and how "Relaxation Time" measures in days how long the market takes to regenerate the order book. 
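
Again just to make the terminology concrete (and again this is an invented illustration rather than the form MSCI actually uses), relaxation time can be pictured as the time constant governing how quickly the temporary, order-book component of impact decays after a trade:

```python
import numpy as np

def remaining_impact(t_days, relaxation_time_days):
    """Toy decay of the temporary (order book) part of market impact after a trade.
    A tiny relaxation time behaves like a 'perfectly elastic' market (impact vanishes almost at once);
    a huge one behaves like a 'perfectly plastic' market (the order book never regenerates).
    Exponential decay is an illustrative assumption only."""
    t = np.asarray(t_days, dtype=float)
    return np.exp(-t / relaxation_time_days)

days = np.array([0, 1, 2, 5, 10])
print("elastic-ish (tau = 0.5 days):", np.round(remaining_impact(days, 0.5), 3))
print("plastic-ish (tau = 50 days): ", np.round(remaining_impact(days, 50.0), 3))
```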

WP_20140326_18_34_29_Raw

Liquidity Observatory - Carlo described how the data is gathered from market participants on a monthly basis, using a spreadsheet to categorize the bond/asset class type and again using simple parameters from active "expert" traders. Take a look at this link and sign up if this is you. (This sounded to me a lot like another "market consensus" data gathering exercise; these are proving increasingly popular, one of the first I had heard of being Totem many years back - we are not quite fully ready for "crowdsourcing" in financial markets maybe, but more people are seeing sense in sharing data.)

Panel Debate - Ron Papenek of MSCI was moderator of the panel, and asked Karen Cassidy of Morgan Stanley about her experiences in liquidity risk management.

Liquidity Risk Management at Banks - Karen started by saying that in liquidity management at Morgan Stanley they look at:

  • Funding
  • Operating Capital
  • Client Behaviour

Since 2008, Karen said that liquidity management had become a lot more rigorous and formalized, being rule based and using a categorisation of assets held from highly liquid to highly illiquid. She said that Morgan Stanley undertake stress testing by market and also by idiosyncratic risk over time frames of 1 month and 1 year. As part of this they are assessing the minimum operating liquidity needed based on working capital needs. 

Karen added that Morgan Stanley are expending a lot of effort currently on data collection and modelling, given that their data is specific to a retail broker-dealer unit, unlike many other firms. They are also looking at metrics around financial advisors, and how many clients follow the financial advisor when he or she decides to switch firms. 

Business or Regulation Driving Liquidity Risk Management - Ron asked Karen what were the drivers of their processes at Morgan Stanley. Karen said that in 2008 the focus was on fundability of assets, saying that the FED was monitoring this on a daily basis. She made the side comment that this monitoring was not unusual since "Regulators live with us anyway". Karen said that it was the responsibility of firms to come up with the controls and best practice needed to manage liquidity risk, and that is what Morgan Stanley do anyway.

Karen added that in her view the industry was over-funding and funding too long in response to regulation, and that funding would be at lower but still pragmatic levels in the absence of regulatory pressure. Like many in the industry, Karen thought the regulation had swung too far in response to the 2008 crisis and would eventually swing back to more normal levels. 

Carlo added that he had written an unintentionally prescient academic paper on liquidity management in 2008 just prior to the crisis hitting, and he thought the regulators certainly arrived "after" the crisis rather than anticipating it in any way. He thought that the banks have anticipated the regulators very well, with measures such as LCR and NSFR already in place. 

In contrast, Carlo said that the regulators were lost in dealing with liquidity risk management for asset managers and hedge funds, with regulation such as UCITS being very vague on this topic and regulators themselves seeking guidance from the industry. He recounted a meeting he had with BaFin in 2009 where he told them that certain of their regulations made no sense, and he said they acknowledged this and said the asset management industry needed to tell them what to implement (sounds like the German regulator is using the same card as the UK regulators in keeping regulations vague when they are uncertain, waiting for regulated firms to implement them to see what the regulation really becomes...). 

What Have We Learnt Since 2008 - Karen said that back in 2008 liquidity was not managed to term, funding basis was not rigorous and relied heavily on unsecured debt. She said that since then Morgan Stanley had been actively involved in shaping the requirements of better liquidity risk management with more rigorous analysis of counterparties and funding capacity. Karen said that stronger governance was a foundation for the creation of better policy and process. She said that regulators were receptive to new ideas and had been working with them closely.

What will be the effect of CCPs on OTC markets? Carlo said that when executing a large order, you have the choice between executing 1) multiple small orders with multiple counterparties or 2) a single large block order with one counterparty. In this regard, the equity and bond markets are very different. In lit equity venues, the best approach is 1), but in the bond markets approach 2) is taken since the trade information is not transparent to the market.

Obviously equity markets have become more fragmented, and this has resulted in improved market quality since it is harder to get all market information and hence the market is less resonant to big events/orders. Carlo then asked whether the increased transparency proposed for OTC markets with CCPs etc. will improve them. His answer was that this is likely to improve the counterparty risk inherent in the market, but due to increased transparency is likely to have a negative effect on transaction costs (I guess another example of the law of unintended consequences for the regulators).

Audience Questions - there then followed some audience questions:

LiquidityMetrics extrapolation - one audience member asked about transaction cost extrapolation in Carlo's modelling. Carlo said that MSCI do not extrapolate and the liquidity surface terminates where the market terminates its liquidity. There was some extrapolation used along the time dimension however, particularly in relation to the time-relaxation parameter. 

LiquidityMetrics "Cross-Impact" - looking at applying LiquidityMetrics to a portfolio, one audience member wondered if an order for one asset distorted the liquidity surface for other, potentially related assets. Carlo said this was a very interesting area with little research done so far. He said that this "cross-impact" had not been detected in equity markets, but that they were looking at it in other markets such as fixed income, where effectively two assets might be proxies for duration-related trading. Carlo put forward a simple model in which the two assets are analogous to two species of animal feeding from the same source of food.

Long and short position liquidity modelling - one audience member asked Carlo what the effects would be of being long or short, given that in a crisis you would prefer to be short (maybe obviously?) given the sell-off by those with long positions. Carlo clarified that being "short" is not merely taking the negative number on a liquidity surface for a particular asset; rather a "short" is a borrowing position with an obligation to deliver a security at some defined point, and as such is a different asset with its own liquidity surface.  

Changing markets, changing participants - the final question of the evening was from one member of the audience who asked if the general move out of fixed income trading by the banks over recent years was visible in Carlo's data. Carlo said that MSCI only have around two years of data so far and as such this was not yet visible, but his team are looking for effects like this amongst others. He added that the August 2011 weak banks/weak sovereigns episode in Europe was visible, with signals present in the data.

WP_20140326_20_10_03_Raw

Good food and good (really good I thought) wine put on by MSCI at the event reception. Great view of Manhattan from the 48th floor of World Trade Centre 7 too.

WP_20140326_19_46_09_Raw

Posted by Brian Sentance | 31 March 2014 | 10:35 am


Risk Management in Securities Financing and Money Market Funds - PRMIA

I went along to this PRMIA event on Thursday evening, hosted by Credit Suisse and sponsored by Acacia Capital. Viktoria Baklanova introduced the panel, with Joseph Tenaga as MC, and very quickly got a plug in for her soon-to-be-released book on money market funds, written with Joe. For those of us who don't know so much about money market funds, these are a form of interest-bearing fund that invests in short term debt securities. The funds attempt to maintain a stable Net Asset Value (NAV) but, to quote Wikipedia, they "are widely (though not necessarily accurately) regarded as being as safe as bank deposits yet providing a higher yield." Their role in the 2008 financial crisis echoes on strongly through to the present day, with controversy over their supposedly stable NAV (typically $1 in the US) and the associated phrase "Breaking the Buck".

Joe Tenaga started the panel with an (unnecessary in my view) justification of academia, asking the rhetorical question "What is the point of academia?", to which Joe answered that "knowledge is what makes the impossible possible", adding that knowledge drives us to make things better. Joe introduced the next panelist, Matthew Fink of Oppenheimer Mutual Funds. Matt said that he would be prepared to wager that he had worked in the money market funds area the longest of anyone in the room, having started his involvement in the industry in April of 1971. Matt gave a picture of the mutual funds industry at the time, with around $60B AUM in the US, 95% of which was invested in equities. At that time the mutual funds industry was going through a very bad time, as the economy and markets were falling and fund redemptions were rising, to such an extent that AUM had fallen to $30B over the next few years. If redemptions had continued at that rate the industry would have vanished.

WP_20140320_19_27_38_Raw

Against this background for the mutual funds industry, interest rates in the US were very high, rising from 6% in 1969 to around 12% in 1974. So many people were paying very high rates on mortgage obligations whilst being limited to receiving only 4-5% on savings due to "Regulation Q". For wealthy individuals it was possible to get around these savings limits, but only if you had $100,000 to put in a Commercial Deposit or $10,000 into a T-Bill. Ironically it was the regulation to remove one risk (it had been thought that competition on deposit rates had contributed to the bank failures of the Great Depression) that had sparked the drive to innovate to find higher returns and create the money market funds industry as a result, with the first fund being "The Reserve Fund" in 1971. (side comment - if regulation from the 1930s via the 1970s can cause problems in 2014, then I would have to defer to whichever deity you worship to advise on what the longer-term consequences will be of the current round of complexity being implemented...). 

The banks saw the money pouring into money market funds such as those from Fidelity and Dreyfus, and understandably wanted to be part of the party too. Some of the worries about money market funds were firstly what if a fund got into trouble? Secondly, the bank regulators were angry that funds were flowing into this new industry and were concerned that it would increase bank failures. 1979 saw a certain Paul Volcker (ever heard of him?) complaining that money market funds were acting like checking accounts. Matt said that he spoke with Volcker and said that this was not the case, to which Volcker replied that it was true since his wife's company was paying staff wages out on checks written against money market funds. 

Henry Shilling of Moody's took over from Matt and showed a few slides, firstly showing the number of funds with AAA ratings (AAAmmf, Aaa-mf, AAAm) from Fitch (49), Moody's (130) and S&P Ratings (156). Henry described how regulators have wanted to reduce the risk of funds by shortening the maturity of the debt held from 90 to 60 days, and by having one and seven day liquidity windows. He showed that there is a high degree of concentration risk in the industry, with the top 10 firms having 74% of AUM and the top 20 covering 94% of AUM for the industry. Similarly, looking at the assets invested in the funds, 80% are from financial institutions.

Igor Axenov of Barclays Capital then showed his slides, illustrating the composition of the funds by asset type prior to the crisis:

  • ABS related - 34%
  • Bank products - 23%
  • Repos - 15%
  • Corporate - 11%
  • Unsecured - 8%
  • Other - 6%

He said that the largest exposure then was to securitized products, with implicit indirect exposure to banks. Igor said that CDO issuance was rising at a rate of $300B per year through 2005/6/7 and that much of the structuring was done to ensure that the ABS products fitted the needs and regulations of money market funds. Detailing the ABS asset composition, Igor showed:

  • Asset Backed (AB) commercial paper - 50%
  • AB medium term notes - 24%
  • Extendible AB commercial paper - 17%
  • ABS Bonds - 5%

Igor said the asset backed commercial paper market (largely funded through money market funds) had grown to $1.2Trln by 2007, and has fallen precipitously since then, down to around $200B now.

Looking at the current money market fund portfolio, it looks like:

  • Bank products - 41%
  • Repos - 18%
  • CP - 15%
  • US Govt + Agency debt - 10%
  • Asset backed CP - 9%
  • Other corporate - 4%

Terence Ma added that the money market fund industry sat at $4Trln in 2008 and was now around $2.7Trln in 2014. Matthew Fink said that, given his involvement in regulation, he had "never met the face of the enemy before" in Igor, which was the start of some lively but well-intended banter between the ex-regulator and the structurer.  

Terence Ma of South Street Securities described his business, which exclusively involves repurchase agreements ("Repos"). Terry said that in the 1990s Citi were very disciplined on balance sheet management and, in his opinion, then ahead of the market in this regard. He said that the Repo business earns small spreads and as a result needs a big balance sheet. When John Reed took over Citi, he decided that he did not like the Repo business since its ROE could not compete with some of the products in retail and other parts of the business. So Terry and his partners wondered whether the Repo business could be managed off balance sheet, so they formed a broker-dealer business, and when Citi merged with Salomon Brothers they spun off. This was December of 2003, but by 2008 they were left "sucking wind" by the crisis.

Terry was quite explicit that his firm is not part of the "shadow banking system" but is subject to the SEC. He then described a few more things about his business, starting with his definition of a Repo as "an agreement to sell and repurchase a security at a fixed date in the future", with the objectives of providing cash inventory, leverage and short cover. All borrowings are lent out, unlike Lehman Brothers in 2008. They do not finance against structured products unless guaranteed, and only accept collateral from Fannie, Freddie and the US Government. 
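
To make the mechanics a little more concrete, the repurchase leg of a repo is just the cash lent plus repo interest. The figures below are invented and the actual/360 money-market day count is an assumption, but the arithmetic itself is the standard one:

```python
def repurchase_price(cash_lent, repo_rate, term_days, day_count=360):
    """Repurchase price = cash lent plus simple repo interest (money-market convention)."""
    return cash_lent * (1.0 + repo_rate * term_days / day_count)

# Hypothetical example: $10m of cash lent against, say, agency collateral
# at a 0.15% repo rate for a 7-day term.
print(round(repurchase_price(10_000_000, 0.0015, 7), 2))   # -> 10000291.67
```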

Joe Tenaga then opened out questions to the audience. Someone asked who the first MMF was (I think they missed the first part of the talk) and Matt said that the Keystone MMF filed first but the first was the Reserve MMF (which got into trouble in 2008). Matt said that it was interesting that the same people, like Paul Volcker, were still involved with the same concerns about the industry many years on. 

The next question was how did early MMFs keep their NAV at $1? Henry said that the "Break the Buck" definition is when there is a mark to market fall of 50bp or more. He said that historically fund sponsors had addressed any issues with breaking the buck through purchases of troubled assets from the fund at par or direct equity investment in the fund - they did this since the effect on their funds and the industry would be too great to contemplate. Hence an MMF is not a perfect product but (up until Lehmans in 2008 with a 50% NAV loss) has a near-perfect record. He added that the first funds to break the buck were from Salomon's and First Chicago.

Matt added some further history, saying the need to maintain the $1 NAV was initially due to the needs of some of the early investors in the industry, who could not invest in products unless they had a fixed NAV. He mentioned that one of the companies, Federated, had a long running battle with the SEC over money market funds, filing for exemptions to avoid some of the restrictions that the SEC was trying to impose, since the SEC regarded the MMF industry as damaging the mutual funds industry. He mentioned Rule 2a-7, which defines the accountancy procedures for keeping the NAV at $1, and some of the battles around amortization and penny-rounding policies to facilitate this. To later questions, Matt said that the SEC wants a floating NAV for institutional MMFs but currently wants to leave retail alone (a somewhat arbitrary choice it seems, i.e. let's only change what has been problematic before, ignore anything else and not contemplate what could happen if only we understood things better). He said that the SEC was weak and FSOC is driving the SEC to change (and FSOC itself is a pawn of the Federal Reserve). 
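
To illustrate the penny-rounding point with a toy example (the real Rule 2a-7 machinery involves amortized cost accounting and board procedures well beyond this sketch), the published NAV is in effect the mark-to-market "shadow" NAV rounded to the nearest cent - which is exactly why a deviation of 50bp is the point at which a fund "breaks the buck":

```python
def reported_nav(shadow_nav):
    """Penny rounding: the published NAV per share is the shadow NAV rounded to the nearest cent,
    so it stays at $1.00 while the shadow NAV is within half a cent (50bp) of $1.00."""
    return round(shadow_nav, 2)

for shadow in (1.0020, 0.9960, 0.9951, 0.9949):
    print("shadow NAV {0:.4f} -> reported {1:.2f}".format(shadow, reported_nav(shadow)))
# 0.9951 still reports as $1.00; 0.9949 rounds down to $0.99, i.e. the fund has broken the buck.
```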

Overall an interesting panel, particularly when you have characters such as Matt Fink who know the history and stories within the industry so well. 

Posted by Brian Sentance | 25 March 2014 | 10:35 am


#DMSLondon - The Hobgoblin of Little Minds: Risk and Regulation as Drivers

The second panel of the day was "Regulation and Risk as Data Management Drivers" - you can find the A-Team's write up here. Some of my thoughts/notes can be found below:

  • Ian Webster of Axioma, responding to a question about whether consistency was the Holy Grail of data management, said that there isn't a single consistent view possible for data used in risk and regulation - there are many regulations with many different requirements, and so unnecessary data consistency is "the hobgoblin of little minds", delaying progress and the achievement of goals in data management.
  • James of Lombard Risk suggested that firms should seek competitive advantage from regulatory compliance rather than just compliance alone - seeking the carrot and not just avoiding the stick.
  • Ian said he thought too many firms dealt with regulatory compliance in a tactical manner and asked whether regulation and risk were truly related. He suggested that risk levels might remain unchanged even if regulation demanded a great deal more reporting.
  • Marcelle von Wendland said she thought that regulation added cost only, and that firms must focus on risk management and margin.
  • James said that "regulatory risk" was a category of risk all in itself alongside its mainstream contemporaries.
  • Ian added that risk and finance think about risk differently and this didn't help in promoting consistency of ideas in discussions about risk management.
  • James said that the legacy of systems in financial markets was a hindrance in complying with new regulation and mentioned the example of the relatively young energy industry, where STP was much easier to implement.
  • Laurent of Bloomberg said that young, emerging markets like energy were greenfield and as such easier to implement systems but that they did not have any experience or culture around data governance.
  • Marcelle said that the G20 initiatives around trade reporting at least promoted some consistency and allowed issues to be identified at last.
  • Ian said in response that he was unconvinced about politically driven regulation, questioning its effectiveness and motivations.
  • Ian raised the issues of the assumptions behind VaR and said that the current stress tests were overdone.
  • Marcelle agreed that a single number for VaR or some other measure meant that other useful information had potentially been ignored/thrown away.
  • General consensus across the panel that fines were not enough and that restricting business activities might be a more effective stick for the regulators.
  • James referenced the risk data aggregation paper from the Basel Committee and suggested that data should be captured once, cleaned once and used many times.
  • Ian disagreed with James in that he thought clean once, capture once and use many times was not practically possible and this goal was one of the main causes of failure within the data management industry over the past 10 years. 
  • The panel ended with Ian saying that we should not just solve for the last crisis; the underlying causes of crises are similar and mostly around asset price bubbles, so in order to reduce risk in the system 1) let's make data more transparent and 2) do what we can to avoid bubbles with better indices and risk measures. 

3 Regulation panel


Posted by Brian Sentance | 24 March 2014 | 6:07 pm


S&P Capital IQ Risk Event #2 - Enterprise or Risk Data Strategy?

Christian Nilsson of S&P CIQ followed up Richard Burtsal's talk with a presentation on data management for risk, containing many interesting questions for those considering data for risk management needs. Christian started his talk by taking a time machine back to 2006, and asking what were the issues then in Enterprise Data Management:

  1. There is no current crisis - we have other priorities (we now know what happened there)
  2. The business case is still too fuzzy (regulation took care of this issue)
  3. Dealing with the politics of implementation (silos are still around, but cost and regulation are weakening politics as a defence?)
  4. Understanding data dependencies (understanding this throughout the value chain, but still not clear today?)
  5. The risk of doing it wrong (there is a risk you will do data management wrong given all the external parties and sources involved, but what is the risk of not doing it?)

Christian then moved on to say the current regulatory focus is on clearer roadmaps for financial institutions, citing Basel II/III, Dodd-Frank/Volcker Rule in the US, challenges in valuation from the IASB and IFRS, fund management challenges with UCITS, AIFMD, EMIR, MiFID and MiFIR, and Solvency II in the insurance industry. He coined the phrase "Regulation Goes Hollywood", with multiple versions of regulation like UCITS I, II, III, IV, V, VII for example having more versions than a set of Rocky movies. 

He then touched upon some of the main motivations behind the BCBS 239 document and said that regulation had three main themes at the moment:

  1. Higher Capital and Liquidity Ratios
  2. Restrictions on Trading Activities
  3. Structural Changes ("ring fence" retail, global operations move to being capitalized local subsidiaries)

Some further observations were on what the implications will be of the effective "loss" of globalization within financial markets, and also what can now be considered as risk free assets (do such things still exist?). Christian then gave some stats on risk as a driver of data and technology spend, with some $20-50B expected to be spent over the next 2-3 years (seems a wide range, nothing like a consensus from analysts I guess!). 

The talk then moved on to what role data and data management plays within regulatory compliance, with for example:

  • LEI - Legal Entity Identifiers play out throughout most regulation, as a means to enable automated processing and as a way to understand and aggregate exposures.
  • Dodd-Frank - Data management plays a role within OTC processing and STP in general.
  • Solvency II - This regulation for insurers places emphasis on data quality and data lineage within capital reserve requirements.
  • Basel III - Risk aggregation and counterparty credit risk are two areas of key focus.

Christian outlined the small budget of the regulators relative to the biggest banks (a topic discussed in previous posts, how society wants stronger, more effective regulation but then isn't prepared to pay for it directly - although I would add we all pay for it indirectly but that is another story, in part illustrated in the document this post talks about).

In addition to the well-known term "regulatory arbitrage", dealing with different regulations in different jurisdictions, Christian also mentioned the increasingly used term "substituted compliance", where a global company tries to optimise which jurisdictions it and its subsidiaries comply within, with the aim of avoiding compliance in more difficult regimes through compliance within others.

I think Christian outlined the "data management dichotomy" within financial markets very well:

  1. Regulation requires data that is complete, accurate and appropriate
  2. Industry standards of data management and data are poorly regulated, and there is weak industry leadership in this area.

(not sure if it was quite at this point, but certainly some of the audience questions were about whether the data vendors themselves should be regulated which was entertaining).

He also outlined the opportunity from regulation in that it could be used as a catalyst for efficiency, STP and cost base reduction.

Obviously "Big Data" (I keep telling myself to drop the quotes, but old habits die hard) is hard to avoid, and Christian mentioned that IBM say that 90% of the world's data has been created in the last 2 years. He described the opportunities of the "3 V's" of Volume, Variety, Velocity and "Dark Data" (exploiting underused data with new technology - "Dark" and "Deep" are getting more and more use of late). No mention directly in his presentation but throughout there was the implied extension of the "3 V's" to "5 V's" with Veracity (aka quality) and Value (aka we could do this, but is it worth it?). Related to the "Value" point Christian brought out the debate about what data do you capture, analyse, store but also what do you deliberately discard which is point worth more consideration that it gets (e.g. one major data vendor I know did not store its real-time tick data and now buys its tick data history from an institution who thought it would be a good idea to store the data long before the data vendor thought of it).

I will close this post taking a couple of summary lists directly from his presentation, the first being the top areas of focus for risk managers:

  • Counterparty Risk
  • Integrating risk into the Pre-trade process
  • Risk Aggregation across the firm
  • Risk Transparency
  • Cross Asset Risk Reporting
  • Cost Management/displacement

The second list outlines the main challenges:

  • Getting complete view of risk from multiple systems
  • Lack of front to back integration of systems
  • Data Mapping
  • Data availability of history
  • Lack of Instrument coverage
  • Inability to source from single vendor
  • Growing volumes of data

Christian's presentation then put forward a lot of practical ideas about how best to meet these challenges (I particularly liked the risk data warehouse parts, but I am unsurprisingly biased). In summary, if you get the chance then see or take a read of Christian's presentation - I thought it was a very thoughtful document with some interesting ideas and advice put forward.


Posted by Brian Sentance | 12 March 2014 | 10:34 am


S&P Capital IQ Risk Event #1 - Managed Services

Attended a good event at S&P Capital IQ's offices on Tuesday morning last week in London, built around the BCBS 239 document on risk aggregation and reporting (see earlier PRMIA event on this topic too). A partner vendor of S&P CIQ, Tech Mahindra, started the morning with Richard Burtsal's presentation on "Delivering an Enterprise Data Strategy". Tech Mahindra recently acquired a data management platform from UBS Asset Management and are offering a managed data management service based on it (see A-Team article).

Richard said that he wasn't going to "sell" in his presentation (always a worrying admission from one of us data management vendors, it usually means entirely the opposite). That small criticism aside, Richard gave a solid update on the state of the industry and obviously on what Tech Mahindra are offering, and added that:

  • For every $1 spent directly on market data, the total cost of that data goes up by a factor of 6 by the time the data is actually used 
  • 33% of rejected trades are caused by incorrect reference data
  • 60% of staff manipulate, report on or support data on a daily basis (I wonder what the other 40% actually do then? Be good to get the Tower Group report this came from to find out maybe?)
  • 25% of reference data management is wasted due to duplication and inefficiencies
  • In their work with UBS Asset Management they had jointly shown that the costs of data management were reduced by 25-30% using a managed service (sounds worth verifying what the "before" situation was I guess, but interesting/impressive).
  • Clients were pushing for much faster instrument setup and a reduction in time from the 1-2 weeks setup in some systems.

There were a few questions from the audience during Richard's talk; the first asked about the differences between data management on the buy-side and data management on the sell-side. Richard said that his experience was that the buy-side managed fewer instruments (<500,000) but with greater depth of data, and the sell-side held more instruments (10M+) but with less depth of data (not sure that completely reflects my experience, but sounds worth a survey maybe). 

The second question was why is the utility model for data management going to succeed right now, when previous attempts over the past 10 years had failed? Richard responded that he thought Tech Mahindra would succeed due to:

  • Tech Mahindra are data-vendor agnostic (I assume aimed at Markit-Cadis and Bloomberg-PolarLake)
  • Tech Mahindra own all their own IP (hmm, not really so sure this is a good reason or even a differentiator, but I guess aimed at managed services that are not run by the firm that develops the data management system?)

I think the answers to this second question need thinking through more clearly. To be fair, Richard had already stated the 25% cost reduction as one benefit, and various folks have said that the technology is ripe for these kinds of offerings now, but all the same the response needs to be more fully developed to convince many, I think (I remain undecided personally, it would be good to have some more evidence to back this up). One of the S&P CIQ team added that what he thinks clients want is "Utility of Delivery" and not "Utility of Content", which I thought was a sensible comment and one that I will be revisiting in the coming months. 

On a related note to why managed services just now, another audience member asked how client specific data was managed within a utility or managed service model, and Richard said that client specific data was often managed at the client but that they can upload and integrate client generated data into the managed service offering. I think this is a very key issue within the debate about managed services and utilities. I get the point the data utility proponents make that certain datasets are simple "facts" and as such are either right or wrong and hence commoditisable, but much of the data is subjective and all of the data needs validating together in the context of its intended use in my view. I guess I kind of lose myself in looping arguments about why data utility vendors aren't ultimately wanting to be the next Thomson Reuters or Bloomberg (not that that is not a laudable aim, but it is not going to change the world or indeed financial markets data provision very much).


Posted by Brian Sentance | 10 March 2014 | 10:41 am


See you at the A-Team Data Management Summit this week!

Xenomorph is sponsoring the networking reception at the A-Team DMS event in London this week, and if you are attending then I wanted to extend a cordial invite to you to attend the drinks and networking reception at the end of day at 5:30pm on Thursday.

In preparation for Thursday’s agenda, the blog links below are a quick reminder of some of the main highlights from last September’s DMS:

I will also be speaking on the 2pm panel “Reporting for the C-Suite: Data Management for Enterprise & Risk Analytics”. So if you like what you have heard during the day, come along to the drinks and firm up your understanding with further discussion with like-minded individuals. Alternatively, if you find your brain is so full by then of enterprise data architecture, managed services, analytics, risk and regulation that you can hardly speak, come along and allow your cerebellum to relax and make sense of it all with your favourite beverage in hand. Either way you will leave the event more informed than when you went in...well that’s my excuse and I am sticking with it!

Hope to see you there!


Posted by Brian Sentance | 3 March 2014 | 6:33 pm


Model Risk Management from PRMIA

Guest blog post by Qi Fu of PRMIA and Credit Suisse NYC with some notes on a model risk management event held earlier in September of this year. Big thank you to Qi for his notes and to all involved in organising the event:

The PRMIA event on Model Risk Management (MRM) was held in the evening of September 16th at Credit Suisse.  The discussion was sponsored by Ernst & Young, and was organized by Cynthia Williams, Regulatory Coordinator for Americas at Credit Suisse. 

As financial institutions have shifted considerable focus to model governance and independent model validation, MRM is as timely a topic as any in risk management, particularly since the Fed and OCC issued the Supervisory Guidance on Model Risk Management, also known as SR 11-7.

The event brought together a diverse range of views: the investment banks Morgan Stanley, Bank of America Merrill Lynch, and Credit Suisse were each represented; also on the panel were a consultant from E&Y and a regulator from the Federal Reserve Bank of NY.  The event was well attended with over 100 attendees.

Colin Love-Mason, Head of Market Risk Analytics at CS, moderated the panel, and led off by discussing his two functions at Credit Suisse, one being traditional model validation (MV), the other being VaR development and completing gap assessment, as well as compiling the model inventory.  Colin made an analogy between model risk management and real estate.  As in real estate, there are three golden rules in MRM, which are emphasized in SR 11-7: documentation, documentation, and documentation.  Looking into the future, the continuing goals in MRM are quantification and aggregation.

Gagan Agarwala of E&Y’s Risk Advisory Practice noted that there is nothing new about many of the ideas in MRM.  Most large institutions already have in place guidance on model validation and model risk management.  In the past validation consisted of mostly quantitative analysis, but the trend has shifted towards establishing more mature, holistic, and sustainable risk management practices. 

Karen Schneck of FRBNY’s Models and Methodology Department spoke about her role at the FRB, where she is on the model validation unit for stress testing for the Comprehensive Capital Analysis and Review (CCAR); thus part of her work was on MRM before SR 11-7 was written.  SR 11-7 is definitely a “game changer”; since its release, there is now more formalization and organization around the oversight of MRM; rather than a rigid organization chart, the reporting structure at the FRB is much more open minded.  In addition, there is an increased appreciation of the infrastructure around the models themselves and the challenges faced by practitioners, in particular the model implementation component, which is not always immediately recognized.

Craig Wotherspoon of BAML Model Risk Management remarked on his experience in risk management, and commented that a new feature in the structure of risk governance is that model validation is turning into a component of risk management.  In addition, the people involved are changing: risk professionals with the combination of a scientific mind, business sense, and writing skills will be in as high demand as ever.

Jon Hill, Head of Morgan Stanley’s Quantitative Analytics Group, discussed his experience in MRM since the 90s, when the primary tools applied were “sniff tests”.  Since then, the landscape has completely changed.  In the past, focus had been on production, while documentation of models was an afterthought; now documentation must be detailed enough for a highly qualified individual to review.  In times past the focus was only on validating methodology; nowadays it is just as important to validate the implementation.  There is an emphasis on stress testing, especially for complex models, in addition to internal threshold models and independent benchmarking.  The definition of what a model is has also expanded to anything that takes numbers in and produces numbers as output.  However, these increased demands require a substantial increase in resources; the difficulty of recruiting talent in these areas will remain a major challenge.

Colin noted a contrast in the initial comments of the panelists, on one hand some are indicating that MRM is mostly common sense; but Karen in particular emphasized the “game-changing” implications of SR 11-7, with MRM becoming more process oriented, when in the past it had been more of an intellectual exercise.  With regards to recruitment, it is difficult to find candidates with all the prerequisite skill sets, one option is to split up the workload to make it easier to hire.

Craig noted the shift in the risk governance structure, the model risk control committees are defining what models are, more formally and rigorously.  Gagan added that models have lifecycles, and there are inherent risks associated within that lifecycle.  It is important to connect the dots to make sure everything is conceptually sound, and to ascertain that other control functions understand the lifecycles.

Karen admitted that additional process requirements carry the risk of trumping value.  MRM should aim to maintain high standards while not getting overwhelmed by the process itself, to the point where some ideas become too expensive to implement.  There is also the challenge of maintaining the independence of the MV team.

Jon concurred with Karen on the importance of maintaining independence.  A common experience is that when validators find mistakes in the models, they become drawn into the development process with the modelers.  He also noted differences between the US, UK, and European MV processes, and asserted his view that the US is ahead of the curve and setting standards.

Colin noted the issue of the lack of an analogous PRA document to SR 11-7, that drills down into nuts and bolts of the challenges in MRM.  He also concurred on the difficulty of maintaining independence, particularly in areas with no established governance.  It is important to get model developers to talk to other developers about the definition and scope of the models, as well as possible expansion of scope.  There is a wide gamut of models: core, pricing, risk, vendor, sensitivity, scenarios, etc.  Who is responsible for validating which?  Who checks on the calibration, tolerance, and weights of the models?  These are important questions to address.

Craig commented further on the complexity and uncertainty of defining what a model is, and on whose job it is to determine that, amongst the different stakeholders.  It also needs to be taken into consideration that model developers may be biased towards limiting the number of models.

Gagan followed up by noting that while the generic definition of models is broad and will need to be redefined, analytics do not all need to have the same standards; the definition should leave some flexibility for context.  Also, the highest standard should be assigned to risk models.

Karen added that defining and validating models used to have a narrow focus, done in a tailor-controlled environment.  It would be better to broaden the scope, and to reexamine the question on an ongoing basis (it is however important to point out that annual review does not equal annual re-validation).  In addition to the primary models, some challenger models also need to be supported; developers should discuss why they’re happy with the primary model, how it differs from the challenger model, and how it impacts output.

Colin brought up the point of stress-testing.  Jon asserted that stress-testing is more important for stochastic models, which are more likely to break under nonsensical inputs.  Also, any model that plugs into the risk system should require judicious decision-making, as well as annual reviews to look at changes since the previous review.

Colin also brought up the topic of change management: what are the system challenges when model developers release code, which may include experimental releases?  Often discussed are the concepts of annual certification and checkpoints.  Jon commented that the focus should be on changes of 5% or more, with pricing models being less of a priority, and that firms should move towards centralized source code repositories.

Karen also added the question of what ought to be considered a material change: the more conservative answer is that any variation, even a pure code change that didn’t alter model usage or business application, may need to be communicated to upper management.

Colin noted that developers often have a tendency to encapsulate intentions, and have difficulty or reluctance in documenting changes, thus resulting in many grey areas.  Gagan added that infrastructure is crucial: especially when market conditions are rapidly changing, MRM needs to have controls in place.  Also, models in Excel make the change management process more difficult.  

The panel discussion was followed by a lively Q&A session with an engaged audience, below are some highlights.

Q:  How do you distinguish between a trader whose model actually needs change, versus a trader who is only saying so because he/she has lost money?

Colin:  Maintain independent price verification and control functions.

Craig:  Good process for model change, and identify all stakeholders.

Karen:  Focus on what model outputs are being changed, what the trader’s assumptions are, and what is driving results.

Q:  How do you make sure models are used in business in a way that makes sense?

Colin:  This can be difficult, front office builds the models, states what is it good for, there is no simple answer from the MV perspective; usage means get as many people in the governance process as possible, internal audit and setting up controls.

Gagan:  Have coordination with other functions, holistic MRM.

Karen:  Need structure, inventory a useful tool for governance function.

Q:  Comments on models used in the insurance industry?

Colin:  Very qualitative, possible to give indications, difficult to do exact quantitative analysis, estimates are based on a range of values.  Need to be careful with inputs for very complex models, which can be based on only a few trades.

Q:  What to do about big shocks in CCAR?

Jon:  MV should validate for severe shocks, and if model fails may need only simple solution.

Karen:  Validation tools, some backtesting data, need to benchmark; the quant element of stress testing needs to be substantiated and supported by qualitative assessment.

Q:  How to deal with vendor models?

Karen:  Not acceptable just to say it’s okay as long as the vendor is reputable, want to see testing done, consider usage also compare to original intent.

Craig:  New guidance makes it difficult to buy vendors models, but if vendor recognizes this, this will give them competitive advantage.

Q:  How to define independence for medium and small firms?

Colin:  Be flexible with resources, bring in different people, get feedback from senior management, and look for consistency.

Jon:  Hire E&Y?  There is never complete independence even in a big bank.

Gagan:  Key is the review process.

Karen:  Consultants could be cost effective; vendor validation may not be enough.

Q:  At firm level, do you see practice of assessing risk models?

Jon:  Large bank should appoint Model Risk Officer.

Karen:  Just slapping on additional capital is not enough.

Q:  Who actually does MV?

Colin:  First should be user, then developer, 4 eyes principle.

Q:  Additional comments on change management?

Colin:  Ban Excel for anything official; need controlled environment.


Posted by Brian Sentance | 23 October 2013 | 8:56 pm


Credit Risk: Default and Loss Given Default from PRMIA

Great event from PRMIA on Tuesday evening of last week, entitled Credit Risk: The link between Loss Given Default and Default. The event was kicked off by Melissa Sexton of PRMIA, who introduced Jon Frye of the Federal Reserve Bank of Chicago. Jon seems to be an acknowledged expert in the field of Loss Given Default (LGD) and credit risk modelling. I am sure that the slides will be up on the PRMIA event page above soon, but much of Jon's presentation seemed to be based around the following working paper. So take a look at the paper (which is good in my view), but I will stick to an overview and in particular any anecdotal comments made by Jon and the other panelists.

Jon is an excellent speaker: relaxed in manner, very knowledgeable about his subject, humorous but also sensibly reserved in coming up with immediate answers to audience questions. He started by saying that his talk was not going to be long on philosophy, but very pragmatic in nature. Before going into detail, he outlined that the area of credit risk can and will be improved, but that this improvement becomes easier as more data is collected, and inevitably this data collection process may need to run for many years and decades yet before the data becomes statistically significant. 

Which Formula is Simpler? Jon showed two formulas for estimating LGD, one a relatively complex looking formula (the Vasicek distribution mentioned in his working paper) and the other a simple linear model of the form a + b·x. Jon said that looking at the two formulas, many would hope that the second formula might work best given its simplicity, but he wanted to convince us that the first formula was in fact simpler than the second. He said that the second formula would need to be regressed on all loans to estimate its parameters, whereas the first formula depended on two parameters that most banks should have a fairly good handle on: Default Rate (DR) and Expected Loss (EL). The fact that these parameters were relatively well understood seemed to be the basis for saying the first formula was simpler, despite its relative mathematical complexity. This prompted an audience question on the difference between Probability of Default (PD) and Default Rate (DR). It turns out PD is the expected probability of default before default happens (so ex-ante) and DR is the realised rate of default (so ex-post). 
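
For the curious, below is a rough sketch in Python of my reading of the two alternatives (my own code and illustrative parameter values, not Jon's): the Vasicek-based conditional LGD function driven by PD, EL and an asset correlation, versus the simple linear model.

```python
# Sketch of the two LGD models discussed above, based on my reading of the working
# paper - treat the formula and the parameter values as illustrative, not definitive.
from scipy.stats import norm

def lgd_vasicek(dr: float, pd_: float, el: float, rho: float) -> float:
    """Conditional LGD as a function of the realised default rate DR:
    cLGD(DR) = Phi(Phi^-1(DR) - k) / DR, with k = (Phi^-1(PD) - Phi^-1(EL)) / sqrt(1 - rho)."""
    k = (norm.ppf(pd_) - norm.ppf(el)) / (1.0 - rho) ** 0.5
    return norm.cdf(norm.ppf(dr) - k) / dr

def lgd_linear(dr: float, a: float, b: float) -> float:
    """The simple alternative: a straight line in the default rate, a + b*DR."""
    return a + b * dr

# Illustrative portfolio: PD = 2%, EL = 1% (so long-run LGD of roughly 50%), rho = 10%.
for dr in (0.01, 0.02, 0.05, 0.10):
    print(f"DR={dr:.0%}  Vasicek LGD={lgd_vasicek(dr, 0.02, 0.01, 0.10):.2f}  "
          f"linear LGD={lgd_linear(dr, 0.45, 1.5):.2f}")
```

Note how the Vasicek-based LGD rises as the realised default rate rises, which is at least consistent with the Altman graph discussed next.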

Default and LGD over Time. Jon showed a graph (by an academic called Altman) of DR and LGD over time. When the DR was high (lots of companies failing, in a likely economic downturn) the LGD was also perhaps understandably high (so a high number of companies failing, against an economic background that is both part of the cause of the failures and not helping the loss recovery process). When DR is low, there is a disconnect between LGD and DR. Put another way, when the number of companies failing is low, the losses incurred by those companies that do default can be high or low - there is no discernible pattern. I am not sure whether this disconnect is due to the smaller number of companies failing meaning the sample space is much smaller and hence the outcomes are more volatile (no averaging effect), or more likely that in healthy economic times the loss given a default is much more of a random variable, dependent on the defaulting company specifics rather than on the general economic background.

Conclusions Beware: Data is Sparse. Jon emphasised from the graph that the Altman data went back 28 years, of which 23 years were periods of low default, with 5 years of high default levels but only across 3 separate recessions. Therefore from a statistical point of view this is very little data, so makes drawing any firm statistical conclusions about default and levels of loss given default very difficult and error-prone. 

The Inherent Risk of LGD. Jon here seemed to be focussed not on the probability of default, but rather on the conditional risk that once a default has occurred then how does LGD behave and what is the risk inherent from the different losses faced. He described how LGD affects i) Economic Capital - if LGD is more variable, then you need stronger capital reserves, ii) Risk and Reward - if a loan has more LGD risk, then the lender wants more reward, and iii) Pricing/Valuation - even if the expected LGD of two loans is equal, then different loans can still default under different conditions having different LGD levels.

Models of LGD

Jon showed a chart with LGD plotted against DR for 6 models (two of which I think he was involved in). All six models were dependent on three parameters, PD, EL and correlation, and all six models seemed to produce almost identical results when plotted on the chart. Jon mentioned that one of his models had been validated (successfully I think, but with a lot of noise in the data) against Moody's loan data taken over the past 14 years. He added that he was surprised that all six models produced almost the same results, implying either that all models were converging around the correct solution or, in total contrast, that all six models were potentially subject to "group think" and were systematically wrong in the way the problem should be looked at.

Jon took one of his LGD models and compared it against the simple linear model, using simulated data. He showed a graph of some data points for what he called a "lucky bank" with the two models superimposed over the top. The lucky bit came in since this bank's data points for DR against LGD showed lower DR than expected for a given LGD, and lower LGD for a given DR. In this specific case, Jon said that the simple linear model fitted better than his non-linear one, but when done over many data sets his LGD model fitted better overall since it seemed to be less affected by random data.

There were then a few audience questions as Jon closed his talk, one leading Jon to remind everyone of the scarcity of data in LGD modelling. In another Jon seemed to imply that he would favor using his model (maybe understandably) in the Dodd-Frank Annual Stress Tests for banks, emphasising that models should be kept simple unless a more complex model can be justified statistically. 

Steve Bennet and the Data Scarcity Issue 

Following Jon's talk, Steve Bennet of PECDC picked up on Jon's issue of scarce data within LGD modelling. Steve is based in the US, working for PECDC, which is a cross border initiative to collect LGD and EAD (exposure at default) data. The basic premise seems to be that in dealing with the scarce data problem, we do not have 100 years of data yet, so in the meantime let's pool data across member banks and hence build up a more statistically significant data set - put another way: let's increase the width of the dataset if we can't control the depth. 

PECDC is a consortium of around 50 organisations that pool data relating to credit events. Steve said that they capture data fields per default at four "snapshot" times: origination, 1 year prior to default, at default and at resolution. He said that every bank that had joined the organisation had managed to improve its datasets. Following an audience question, he clarified that PECDC does not predict LGD with any of its own models, but rather provides the pooled data to enable the banks to model LGD better. 

Steve said that LGD turns out to be very different for different sectors of the market, particularly between SMEs and large corporations (levels of LGD for large corporations being more stable globally and less subject to regional variations). But also there is great LGD variation across specialist sectors such as aircraft finance, shipping and project finance. 

Steve ended by saying that PECDC was originally formed in Europe, and was now attempting to get more US banks involved, with 3 US banks already involved and 7 waiting to join. There was an audience question relating to whether regulators allowed pooled data to be used under Basel IRB - apparently Nordic regulators allow this due to needing more data in a smaller market, European banks use the pooled data to validate their own data in IRB, but in the US banks must use their own data at the moment.

Til Schuermann

Following Steve, Til Schuermann added his thoughts on LGD. He said that LGD has a time variation and is not random, being worse in recession when DR is high. His stylized argument to support this was that in recession there are lots of defaults, leading to lots of distressed assets and that following the laws of supply and demand, then assets used in recovery would be subject to lower prices. Til mentioned that there was a large effect in the timing of recovery, with recovery following default between 1 and 10 quarters later. He offered words of warning that not all defaults and not all collateral are created equal, emphasising that debt structures and industry stress matter. 

Summary

The evening closed with a few audience questions and a general summation by the panelists of the main issues of their talks, primarily around models and modelling, the scarcity of data and how to be pragmatic in the application of this kind of credit analysis. 


Posted by Brian Sentance | 21 October 2013 | 10:24 am


And the winner of the Best Risk Data Management and Analytics Platform is...

...Xenomorph!!! Thanks to all who voted for us in the recent A-Team Data Management Awards, it was great to win the award for Best Risk Data Management and Analytics Platform. Great that our strength in the Data Management for Risk field is being recognised, and big thanks again to clients, partners and staff who make it all possible!

Please also find below some posts for the various panel debates at the event:

 Some photos, slides and videos from the event are now available on the A-Team site.


Posted by Brian Sentance | 9 October 2013 | 11:07 am


#DMSLondon - Managed Services and the Utility Model

Andrew Delaney introduced the final panel of the day, involving Steve Cheng of Rimes, Jonathan Clark of Tech Mahindra, Tom Dalglish of UBS and Martijn Groot of Euroclear. Main points:

  • Andrew started by asking the panel for their definitions of managed data services and data utilities
  • Martijn said that a managed data service was usually the lifting out of a data process from in a company to be run by somebody else whereas a data utility had many users.
  • Tom put it another way saying that a managed service was run for you whereas a utility was run for them. Tom suggested that there were some concerns around data utilities for the industry in terms of knowing/being transparent about data vendor affinity and any data monopoly aspects.
  • When asked why past attempts at data utilities had failed, Tom said that it must be frustrating to be right but at the wrong time; in addition to the timing being right just now (costs/regulations being drivers), the tech stack available is better and the appreciation of the importance of data usage is clearer.
  • Steve added a great point on the tech stack, in that it now made mass customisation much easier.
  • Jonathan made the point that past attempts at data utilities were built on product platforms used at clients, whereas the latest utilities were built on platforms specifically designed for use by a data utility.
  • Looking at the cost savings of using a data utility, Martijn said that the industry spends around $16-20B on data, and that with his Euroclear data utility they can serve 2000 clients with a staff level that is less than any one client employs directly.
  • Tom said that the savings from collapsing the data silos were primarily from more efficient/reduced usage of people and hardware to perform a specific function, and not data.
  • Steve suggested that some utilities offer incremental data services rather than taking on all data as in the old utility model, again coming back to his earlier point about mass customisation.
  • Tom mentioned it was a bit like cable TV, where you can subscribe to a set of services of your choice but where certain services cost more than others.
  • Martijn said that there were too many vested interests to turn data costs around quickly. He said that data utilities could go a long way however. 
  • Tom concluded by saying that it was about content not feeds, licensing was important as was how to segregate data.

Good panel - additionally one final audience question/discussion was around data utilities providing LEI data, and it was argued that LEI without the hierarchy is just another set of data to map and manage. 


Posted by Brian Sentance | 7 October 2013 | 11:28 am


#DMSLondon - The Chief Data Officer Challenge

The first panel of the afternoon touched on a hot topic at the moment, the role of the Chief Data Officer (CDO). Andrew Delaney again moderated the panel, consisting of Rupert Brown of UBS, Patrick Dewald of Diaku, Colin Hall of Credit Suisse, Nigel Matthews of Barclays and Neill Vanlint of GoldenSource. Main points:

  • Colin said that the need for the CDO role is that someone needs to sit at the top table who is both nerdy about data but also can communicate a vision for data to the CEO.
  • Rupert said that role of CDO was still a bit nebulous covering data conformance, storage management, security and data opportunity (new functionality and profit). He suggested this role used to be called "Data Stewardship" and that the CDO tag is really a rename.
  • Colin answered that the role did use to be a junior one, but regulation and the rate of industry change demands a CDO, a point contact for everyone when anything comes up that concerns data - previously nobody knew quite who to speak to on this topic.
  • Patrick suggested that a CDO needs a long-term vision for data, since the role is not just an operational one. 
  • Nigel pointed out that the CDO needs to cover all kinds of data and mentioned recent initiatives like BCBS with their risk data aggregation paper.
  • Neil said that he had seen the use of a CDO per business line at some of his clients.
  • There was some conversation around the different types of CDO and the various carrots and sticks that can be employed. Neil made the audience laugh with his quote from a client that "If the stick doesn't work, I have a five-foot carrot to hit them with!"
  • Patrick said that CDO role is about business not just data.
  • Colin picked up on what Patrick said and illustrated this with an example of legal contract data feeding directly into capital calculations.
  • Nigel said that the CDO is a facilitator with all departments. He added that the monitoring tools used for market data were also needed for reference data.

Overall good debate, and I guess if you were starting from scratch (if only we could!) you would have to think that the CDO is a key role given the finance industry is primarily built on the flow of data from one organisation to another.


Posted by Brian Sentance | 7 October 2013 | 11:26 am


#DMSLondon - Big Data, Cloud, In-Memory

Andrew Delaney introduced the second panel of the day, with the long title of "The Industry Response: High Performance Technologies for Data Management - Big Data, Cloud, In-Memory, Meta Data & Big Meta Data". The panel included Rupert Brown of UBS, John Glendenning of Datastax, Stuart Grant of SAP and Pavlo Paska of Falconsoft. Andrew started the panel by asking what technology challenges the industry faced:

  • Stuart said that risk data on-demand was a key challenge, that there was the related need to collapse the legacy silos of data.
  • Pavlo backed up Stuart by suggesting that accuracy and consistency were needed for all live data.
  • Rupert suggested that there has been a big focus on low latency and fast data, but raised a smile from the audience when he said that he was a bit frustrated by the "format fetishes" in the industry. He then brought the conversation back to some fundamentals from his viewpoint, talking about wholeness of data and namespaces/data dictionaries - Rupert said that naming data had been too stuck in the functional area and not considered more in isolation from the technology.
  • John said that he thought there were too many technologies around at the moment, particularly in the area of Not Only SQL (NoSQL) databases. John seemed keen to push NoSQL, and in particular Apache Cassandra, as post relational databases. He put forward that these technologies, developed originally by the likes of Google and Yahoo, were the way forward and that in-memory databases from traditional database vendors were "papering over the cracks" of relational database weaknesses.
  • Stuart countered John by saying that properly designed in-memory databases had their place but that some in-memory databases had indeed been designed to paper over the cracks and this was the wrong approach, sometimes exacerbating the problem.
  • Responding to Andrew's questions around whether cloud usage was more accepted by the industry than it had been, Rupert said he thought it was although concerns remain over privacy and regulatory blockers to cloud usage, plus there was a real need for effective cloud data management. Rupert also asked the audience if we knew of any good release management tools for databases (controlling/managing schema versioning etc) because he and his group were yet to find one. 
  • Rupert expressed that Hadoop 2 was of more interest to him at UBS than Hadoop, and as a side note mentioned that map reduce was becoming more prevalent across NoSQL, not just within the Hadoop domain. Maybe controversially, he said that UBS was using less data than it used to and as such it was not the "big data" organisation people might think it to be. 
  • As one example of the difficulties of dealing with silos, Stuart said that at one client it required the integration of data from 18 different systems to get an overall view of the risk exposure to one counterparty. Stuart advocated bringing the analytics closer to the data, enabling more than one job to be done on one system.
  • Rupert thought that Goldman Sachs and Morgan Stanley seem to do what is the right thing for their firm, laying out a long-term vision for data management. He said that a rethink was needed at many organisations since fundamentally a bank is a data flow.
  • Stuart picked up on this and said that there will be those organisations that view data as an asset and those that view data as an annoyance.
  • Rupert mentioned that in his view accountants and lawyers are getting in the way of better data usage in the industry.
  • Rupert added that data in Excel needed to be passed by reference and not passed by value. This "copy confluence" was wasting disk space and was a source of operational problems for many organisations (a few past posts here and here on this topic).
  • Moving on to describe some of the benefits of semantic data and triple stores, Rupert proposed that the statistical world needed to be added to the semantic world to produce "Analytical Semantics" (see past post relating to the idea of "analytics management").

Great panel, lots of great insight with particularly good contributions from Rupert Brown.

Posted by Brian Sentance | 7 October 2013 | 11:23 am


#DMSLondon - What Will Drive Data Management?

The first panel of the day opened with an introductory talk by Chris Johnson of HSBC. Chris started his talk by proudly announcing that he drives a Skoda car, something that to him would have been unthinkable 25 years ago but with investment, process and standards things can and will change. He suggested that data management needs to go through a similar transformation, but that there remained a lot to be done. 

Moving on to the current hot topics of data utilities and managed services, he said that the reduced costs of managed services only became apparent in the long term and that both types of initiative have historically faced issues with:

  • Collaboration
  • Complexity
  • Logistical Challenges and Risks

Chris made the very good point that until service providers accept liability for data quality, clients must always check the data they use. He also mentioned, in relation to Solvency II (a hot topic for Chris at HSBC Securities Services), that EIOPA had recently indicated that managed services may need to be regulated. Chris mentioned the lack of time available to respond to all the various regulatory deadlines faced (a recurring theme) and that the industry still lacked some basic fundamentals such as a standard instrument identifier.

Chris then joined the panel discussion with Andrew Delaney as moderator and with other panelists including Colin Gibson (see previous post), Matt Cox of Denver Perry, Sally Hinds of Data Management Consultancy Services and Robert Hofstetter of Bank J. Safra Sarasin. The key points I took from the panel are outlined below:

  • Sally said that many firms were around Level 3 in the Data Management Maturity Model, and that many were struggling particularly with data integration. Sally added that utilities were new, as was the CDO role, and that the implications for data management were only just playing out.
  • Matt thought that reducing cost was an obvious priority in the industry at the moment, with offshoring playing its part but progress was slow. He believed that data management remains underdeveloped with much more to be done.
  • Colin said that organisations remain daunted by their data management challenges and that there were new challenges for data management around transactional data and derived data.
  • Sally emphasised the role of the US FATCA regulation and how it touches upon so many processes and departments including KYC, AML, Legal, Tax etc.
  • Matt highlighted derivatives regulation with the current activity in central clearing, Dodd-Frank, Basel III and EMIR.
  • Chris picked up on this and added Solvency II into the mix (I think you can sense regulation was a key theme...). He expressed the need for and desirability of a Unique Product Identifier (UPI, see report) as essential for the financial markets industry, and how we should not just stand still now that the LEI is coming. He said that industry associations really needed to pick up their game to get more standards in place, but added that the IMA had been quite proactive in this regard. He expressed his frustration at current data licensing arrangements with data vendors, with the insistence on a single point of use being the main issue (a big problem if you are in security services serving your clients I guess).
  • Robert added that his main issues were data costs and data quality.
  • Andrew then brought the topic around to risk management and its impact on data management.
  • Colin suggested that more effort was needed to understand the data needs of end users within risk management. He also mentioned that products are not all standard and data complexity presents problems that need addressing in data management.
  • Chris mentioned that there are 30 data fields used in Solvency II calculations and that if any are wrong this would have a direct impact on the calculated capital charge (i.e. data is important!)
  • Colin got onto the topic of unstructured data and said how it needed to be tagged in some way to become useful. He suggested that there was an embryonic cross-over taking place between structured and unstructured data usage.
  • Sally thought that the merging of Business Intelligence into Data Management was a key development, and that if you have clean data then use it as much as you can.
  • Robert thought that increased complexity in risk management and elsewhere should drive the need for increased automation.
  • Colin thought cost pressures mean that the industry simply cannot afford the old IT infrastructure and that architecture needs to be completely rethought.
  • Chris said that we all need to get the basics right, with LEI but then on to UPI. He said to his knowledge data management will always be a cost centre and standardisation was a key element of reducing costs across the industry.
  • Sally thought that governance and ownership of data was woolly at many organisations and needed more work. She added this needed senior sponsorship and that data management was an ongoing process, not a one-off project.
  • Matt said that the "stick" was very much needed in addition to the carrot, advising that the proponents of improved data management should very much lay out the negative consequences to bring home the reality to business users who might not see the immediate benefits and costs.

Overall good panel, lots of good debate and exchanging of ideas.


Posted by Brian Sentance | 7 October 2013 | 11:17 am


#DMSLondon - Data Architecture: Sticks or Carrots?

Great day on Thursday at the A-Team Data Management Summit in London (personally not least because Xenomorph won the Best Risk Data Management/Analytics Platform Award but more of that later!). The event kicked off with a brief intro from Andrew Delaney of the A-Team talking through some of the drivers behind the current activity in data management, with Andrew saying that risk and regulation were to the fore. Andrew then introduced Colin Gibson, Head of Data Architecture, Markets Division at Royal Bank of Scotland.

Data Architecture - Sticks or Carrots? Colin began by looking at the definition of "data architecture" showing how the definition on Wikipedia (now obviously the definitive source of all knowledge...) was not particularly clear in his view. He suggested himself that data architecture is composed of two related frameworks:

  • Orderly Arrangement of Parts
  • Discipline 

He said that the orderly arrangement of parts is focussed on business needs and aims, covering how data is sourced, stored, referenced, accessed, moved and managed. On the discipline side, he said that this covered topics such as rules, governance, guides, best practice, modelling and tools.

Colin then put some numbers around the benefits of data management, saying that every dollar spent on centralising data saves 20 dollars, and mentioning a resulting 80% reduction in operational costs. Related to this he said that every dollar spent on not replicating data saved a dollar on reconciliation tools and a further dollar on the use of those reconciliation tools (not sure how the two overlap, but these are obviously some of the "carrots" from the title of the talk). 

Despite these incentives, Colin added that getting people to actually use centralised reference data remains a big problem in most organisations. He said he thought that people find it too difficult to understand and consume what is there, and faced with a choice they do their own thing as an easier alternative. Colin then talked about a program within RBS called "GoldRush" whereby there is a standard data management library available to all new projects in RBS which contains:

  • messaging standards
  • standard schema
  • update mechanisms

The benefit being that if the project conforms with the above standards then it has little work to do for managing reference data, since all the work is done once and centrally. Colin mentioned that there also needs to be feedback from the projects back to the central data management team around what is missing/needs to be improved in the library (personally I would take it one step further so that end-users and not just IT projects have easy discovery and access to centralised reference data). The lessons he took from this were that we all need to "learn to love" enterprise messaging if we are to get to the top down publish once/consume often nirvana, where consuming systems can pick up new data and functionality without significant (if any) changes (might be worth a view of this post on this topic). He also mentioned the role of metadata in automating reconciliation where that needed to occur.
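
As a purely illustrative sketch (nothing to do with RBS's actual "GoldRush" library - the message shape and field names below are invented), conforming to a centrally published schema might look as simple as this for a consuming project, which is really the point about doing the work once:

```python
# Hypothetical example of consuming a standard reference data message - the schema
# and field names are invented for illustration, not taken from any real library.
from dataclasses import dataclass

@dataclass
class InstrumentRecord:
    """A centrally published instrument reference record (illustrative fields only)."""
    identifier: str   # e.g. an ISIN
    issuer_lei: str   # Legal Entity Identifier of the issuer
    asset_class: str
    currency: str

def consume(message: dict) -> InstrumentRecord:
    """A conforming project simply maps the standard message; no local golden copy."""
    return InstrumentRecord(
        identifier=message["identifier"],
        issuer_lei=message["issuer_lei"],
        asset_class=message["asset_class"],
        currency=message["currency"],
    )
```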

Colin then mentioned that allocation of costs of reference data to consumers is still a hot topic, one where reference data lags behind the market data permissioning/metering insisted upon by exchanges. Related to this Colin thought that the role of the Chief Data Officer to enforce policies was important, and the need for the role was being driven by regulation. He said that the true costs of a tactical, non-standard approach need to be identifiable (quantifying the size of the stick I guess) but that he had found it difficult to eliminate the tactical use of pricing data sourced for the front office. He ended by mentioning that there needs to be a coming together of market data and reference data since operations staff are not doing quantitative valuations (e.g. does the theoretical price of this new bond look ok?) and this needs to be done to ensure better data quality and increased efficiency (couldn't agree more, have a look at this article and this post for a few of my thoughts on the matter). Overall very good speaker with interesting, practical examples to back up the key points he was trying to get across. 


Posted by Brian Sentance | 7 October 2013 | 11:12 am


Macro Stress Testing

Great event from PRMIA on Macro Stress Testing at Moody's last night. A few quick highlights:

  • The role of the regulators is now not only to be sure that banks have enough capital to withstand a severe downturn, but also that the banks have enough capital once the downturn has happened.
  • The Fed have a new whitepaper coming out in July on "Effective Capital Adequacy Process" that covers 7 different aspects from risk management foundations through to governance.
  • CCAR stress tests are thought by regulators to be easier to understand (e.g. if this happens we get this loss) than VaR/risk sensitivities, which do not capture tail risk.
  • Hedges that do not behave as hedges under times of stress are a major area of concern.
  • Assumptions of the stress tests, such as the second half of 2008 occurring instantaneously to the trading book, are not reasonable/representative, but it is hard to come up with credible/pragmatic alternatives.
  • One of the speakers put forward the following lists of positives about the stress tests:
    • Restoration of market/public confidence in banks
    • Determination of the appropriate levels of capital adequacy
    • Understanding of risk profile
    • Identification of tail risks
    • Curbing of risk taking
    • Incentivising behaviours
  • Whilst banks and regulators are often in conflict over capital adequacy, banks do implement their own internal stress tests and do have a commercial interest in doing this well.
  • One panelist said that "the best hedge is to sell"
  • Some banks have switched accountancy standards to game capital requirements, and there was some later debate that Risk Weighted Assets were a controversial part of the calculations when analyzed against the NYU Stern V-Lab stress testing.
  • There is a danger that CCAR and stress testing drives or becomes an industry in itself, which is not good for markets, the banking system or the economy as a whole.
  • There was some debate about liquidity risk as it relates to solvency, and that it should be much more integrated with the stress tests. The panel expressed interest at the forthcoming CLAR stress tests and how it relates to CCAR.
  • The panel thought that the Federal Reserve is effectively challenging each bank to understand its own balance sheet better than the Fed can.
  • Given the state of systems and data management at many banks, this was a big challenge.
  • The panel thought that more open access to the data regulators are collecting would be great for academics to analyze given some of the big data technologies available to analyze such large datasets.
  • One speaker put forward that only in a subsidized industry such as banking could firms afford to treat data so poorly. 
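
As promised above, here is a minimal sketch of the scenario-versus-VaR contrast. All the position values, shocks and return distribution below are invented purely for illustration (nothing from the event): a stress loss is a single explainable number tied to a named scenario, whereas VaR is a quantile of a simulated P&L distribution whose tail shape stays hidden.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical positions ($m) and an assumed stress scenario ("if this happens...").
positions = np.array([10.0, 25.0, 5.0, 60.0])
scenario_shock = np.array([-0.30, -0.15, -0.45, -0.05])

# Stress-test view: one explainable number for a named scenario.
stress_loss = -(positions * scenario_shock).sum()
print(f"Stress scenario loss: ${stress_loss:.1f}m")

# VaR view: a quantile of a simulated P&L distribution (tail shape stays hidden).
daily_returns = rng.normal(0.0, 0.02, size=(10_000, len(positions)))
pnl = daily_returns @ positions
var_99 = -np.percentile(pnl, 1)
print(f"99% one-day VaR: ${var_99:.1f}m")
```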

Great event, knowledgeable speakers with strong opinions and good wine/food afterwards (thanks Moody's!).


Posted by Brian Sentance | 26 June 2013 | 3:51 pm


Matthew Berry on the Libor/OIS curve debate

Guest post today from Matthew Berry of Bedrock Valuation Advisors, discussing Libor vs OIS based rate benchmarks. Curves and curve management are a big focus for Xenomorph's clients and partners, so great that Matthew can shed some further light on the current debate and its implications:

New Benchmark Proposal’s Significant Implications for Data Management

During the 2008 financial crisis, problems posed by discounting future cash flows using Libor rather than the overnight index swap (OIS) rate became apparent. In response, many market participants have modified systems and processes to discount cash flows using OIS, but Libor remains the benchmark rate for hundreds of trillions of dollars worth of financial contracts. More recently, regulators in the U.S. and U.K. have won enforcement actions against several contributors to Libor, alleging that these banks manipulated the benchmark by contributing rates that were not representative of the market, and which benefitted the banks’ derivative books of business.
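
As a minimal sketch of why the choice of discount curve matters (the rates, maturity and cash flow below are illustrative assumptions, not market data), the same collateralised cash flow has a different present value under Libor and OIS discounting:

```python
# Present value of the same 1,000,000 cash flow paid in 5 years, discounted
# with an assumed Libor-based zero rate versus an assumed OIS zero rate.
cashflow = 1_000_000.0
t = 5.0
libor_zero = 0.030   # assumed 5y Libor zero rate
ois_zero = 0.027     # assumed 5y OIS zero rate

pv_libor = cashflow / (1 + libor_zero) ** t
pv_ois = cashflow / (1 + ois_zero) ** t

print(f"PV with Libor discounting: {pv_libor:,.0f}")
print(f"PV with OIS discounting:   {pv_ois:,.0f}")
print(f"Difference:                {pv_ois - pv_libor:,.0f}")
```

Across a large book of long-dated trades, small differences of this kind aggregate into material valuation and collateral impacts, which is why the choice of benchmark and discount curve is so sensitive.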

In response to these allegations, the CFTC in the U.S. and the Financial Conduct Authority (FCA) in the U.K. have proposed changes to how financial contracts are benchmarked and how banks manage their submissions to benchmark fixings. These proposals have significant implications for data management.

The U.S. and U.K. responses to benchmark manipulation

In April 2013, CFTC Chairman Gary Gensler delivered a speech in London in which he suggested that Libor should be retired as a benchmark. Among the evidence he cited to justify this suggestion:

- Liquidity in the unsecured inter-dealer market has largely dried up.

- The risk implied by contributed Libor rates has historically not agreed with the risk implied by credit default swap rates. The Libor submissions were often stale and did not change, even if the entity’s CDS spread changed significantly. Gensler provided a graph to demonstrate this.

Gensler proposed to replace Libor with either the OIS rate or the rate paid on general collateral repos. These instruments are more liquid and their prices more readily-observable in the market. He proposed a period of transition during which Libor is phased out while OIS or the GC repo rate is phased in.

In the U.K., the Wheatley Report provided a broad and detailed review of practices within banks that submit rates to the Libor administrator. This report found a number of deficiencies in the benchmark submission and calculation process, including:

- The lack of an oversight structure to monitor systems and controls at contributing banks and the Libor administrator.

- Insufficient use of transacted or otherwise observable prices in the Libor submission and calculation process.

The Wheatley Report called for banks and benchmark administrators to put in place rigorous controls that scrutinize benchmark submissions both pre and post publication. The report also calls for banks to store an historical record of their benchmark submissions and for benchmarks to be calculated using a hierarchy of prices with preference given to transacted prices, then prices quoted in the market, then management’s estimates.

Implications for data management

The suggestions for improving benchmarks made by Gensler and the Wheatley Report have far-reaching implications for data management.

If Libor and its replacement are run in parallel for a time, users of these benchmark rates will need to store and properly reference two different fixings and forward curves. Without sufficiently robust technology, this transition period will create operational, financial and reputational risk given the potential for users to inadvertently reference the wrong rate. If Gensler’s call to retire Libor is successful, existing contracts may need to be repapered to reference the new benchmark. This will be a significant undertaking. Users of benchmarks who store transaction details and reference rates in electronic form and manage this data using an enterprise data management platform will mitigate risk and enjoy a lower cost to transition.

Within the submitting banks and the benchmark administrator, controls must be implemented that scrutinize benchmark submissions both pre and post publication. These controls should be exceptions-based and easily scripted so that monitoring rules and tolerances can be adapted to changing market conditions. Banks must also have in place technology that defines the submission procedure and automatically selects the optimal benchmark submission. If transacted prices are available, these should be submitted. If not, quotes from established market participants should be submitted. If these are not available, management should be alerted that it must estimate the benchmark rate, and the decision-making process around that estimate should be documented.
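
A minimal sketch of such a control is below; the field names, five basis point tolerance and overall structure are assumptions made for illustration rather than anything prescribed by the Wheatley Report, but they show the price hierarchy (transactions, then quotes, then management estimates) combined with a simple exception-based pre-publication check:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MarketInputs:
    transacted_rate: Optional[float]     # rate from actual transactions, if any
    quoted_rate: Optional[float]         # quote from established market participants
    management_estimate: Optional[float] # fallback when no observable price exists

def select_submission(inputs: MarketInputs, prior_submission: float,
                      tolerance_bp: float = 5.0) -> dict:
    """Pick the submission using the price hierarchy, then flag exceptions."""
    if inputs.transacted_rate is not None:
        rate, source = inputs.transacted_rate, "transaction"
    elif inputs.quoted_rate is not None:
        rate, source = inputs.quoted_rate, "quote"
    else:
        rate, source = inputs.management_estimate, "management estimate"

    # Exception-based pre-publication check: flag large day-on-day moves for review.
    move_bp = abs(rate - prior_submission) * 10_000
    return {"rate": rate, "source": source,
            "exception": move_bp > tolerance_bp, "move_bp": round(move_bp, 1)}

print(select_submission(MarketInputs(None, 0.0312, 0.0310), prior_submission=0.0305))
```

In practice the tolerance would itself be recalibrated as market conditions change, and any flagged exception would be routed for review and documented before publication.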

Conclusion

These improvements to the benchmark calculation process will, in Gensler’s words, “promote market integrity, as well as financial stability.” Firms that effectively utilize data management technology, such as Xenomorph's TimeScape, to implement these changes will manage the transition to a new benchmark regime at a lower cost and with a higher likelihood of success.

 



Posted by Brian Sentance | 25 June 2013 | 12:32 pm


Unifying Risk with Numerix, Tabb and Microsoft

Numerix ran a great event on Thursday morning over at Microsoft's offices here in New York. "The Road to Achieving a Unified View of Risk" was introduced by Paul Rowady of the TABB Group. As at our holiday event last December, Paul is a great speaker, and trying to get him to stop talking is the main (positive) problem of working with him (his typical ebullience was also heightened by his appearance in the Wall Street Journal on Thursday, apparently involving nothing illegal, he assured me, and about which his mother even phoned him during his presentation...). Paul started by saying that in Tabb Group's end of year review with his colleagues Larry Tabb and Adam Sussman, he had suggested that the firm needed to put more into developing its risk management thought leadership, which had led to today's introduction and the work Tabb Group has been doing with Numerix.

Having been involved in financial markets in Chicago, Paul is very bullish about the risk management capabilities of the funds and prop trading shops of the exchange-traded options markets from days of old, and said that these risk management capabilities are now needed in, and indeed coming to, the mainstream financial markets. Put another way, post crisis the need for a holistic view on risk has never been stronger. Considering bilateral OTC derivatives and the move towards central clearing, Paul said that he had been thinking that calculations such as CVA would eventually become as extinct as the dodo. However, using some data from the DTCC trade repository, he found that there is still some $65 trillion notional of uncleared bilateral trades in the market, and that these will take a further 30 years to expire. Looking at swaptions alone, the notional uncleared was $6 trillion, so his point was that bilateral OTC trades and their associated risks will be around for some time yet.

Paul put forward some slides showing back, middle and front offices along different siloed business lines, and explained that back in the day when margins were fat and times were good, each unit could be run independently, with no overall view of risk possible given the range of siloed systems and data. In passing Paul also mentioned that one bank he had spoken to had 6,000 separate systems to support on just the banking side, let alone capital markets. Obviously post crisis this has changed, with pressure to reduce operational costs being a key driver at many institutions, and currently only valuation/reference data (+2.4%) and risk management (+1.2%) having increased budget spend across the market in 2013. Given operational costs and regulation such as CVA, risk management is having to move from being an end of day, post-trade process to being pre- and post-trade at intraday frequency. Paul said that not only must consistent approaches to data and analytics be taken across back, middle and front office in each business unit, but now an integrated view of risk across business units must be taken (echoes of an earlier event with Numerix and PRMIA). Considering consistent analytics, Paul mentioned his paper "The Risk Analytics Library" but suggested that "libraries" of everything were needed, so not just analytics, but libraries of data (data management anyone?), metadata, risk models etc.

Paul asked Ricardo Martinez of Deloitte for an update on the current regulatory landscape, and Ricardo responded by focusing on the derivatives aspects of Dodd-Frank. He first pointed out that even after a number of years the regulation around collateral and clearing was not yet finalized. A good point he made was that whilst the focus in the market at the moment is on compliance, he feels that the consequences of the regulation will ripple on over the next 5 years in terms of margining and analytics.

Some panel members disagreed with Paul over the premise that bilateral exotic trades will eventually disappear. Their point was that the needs of pension funds and other clients are very specific and there will always be a need for structured products, despite the capital cost incentives to move everything onto exchanges/clearing. Paul countered by saying that he didn't disagree with this, but that the reason for suggesting the exotics industry may die is the difficulty of finding institutions willing and able to warehouse the risk of such trades. 

Satyam Kancharla of Numerix spoke next. Satyam said that two main changes struck him in the market at the moment. One was the adjustment to a mandated market structure, with clearing, liquidity and capital changes coming through from the regulators. The other was increased operating efficiency at investment banks. Whilst it is probable that no investment bank would ever get to the operational efficiency of a retail business like Walmart, this was nonetheless the direction of travel, with banks looking at how to optimize collateral, optimize trading venues etc.

Satyam put forward that computing power is still adhering to Moore's law, and that as a result some things are possible now that were not before, and that a centralized architecture built on this compute power is needed - but just because it is centralized does not mean that it has to be too inflexible to deal with each business unit's needs. Coming back to earlier comments made by the panel, he put forward that a lot of quants are involved in simply re-inventing the wheel, to which Paul added that quants were very experienced in using words like "orthogonal" to confuse mere mortals like him and justify the repetition of business functionality available already (from Numerix obviously, but more of that later). Satyam said that some areas of model development were more mature than others, and that quants should not engage in innovation for innovation's sake. Satyam also made a passing reference to the continuing use of Excel and VBA as the main tools of choice in the front office, suggesting that we still have some way to go in terms of IT maturity (hobby-horse topic of mine, for example see this post). 

Prompted by an audience question around data and analytics, Ricardo said that the major challenge to sharing data was not technical but cultural. Against a background where maybe 50% of investment in technology was regulation-related, he said that there was no shortage of business ideas for P&L in the emerging "mandated" markets of the future, but many of these ideas required wholesale shifts in attitudes at the banks in terms of co-operation across departments and from front to back office. 

Satyam said that he thought of data and analytics as two sides of the same coin (could not agree more, but then again I would say that) in that analytics generate derived data which needs just as much management as the raw data. He said that it should be possible to have systems and architectures that manage the duality of data and analytics well, and these architectures did not have to imply rigidity and inflexibility in meeting individual business needs. 

There was then some debate about trade repositories for derivatives, where the panel discussed the potential conflict between the US regulators wanting competition in this area and, as Paul suggested, the fragmentation that competition between DTCC, ICE, Bloomberg, LCH.Clearnet etc also brings. As such, Paul put it that the regulators would need to "boil the ocean" to understand the exposures in the market. Ricardo also mentioned some of the current controversy over who owns the data in the trade repository. One of the panelists suggested that we should also keep an eye on China and not necessarily get totally tied up in what is happening in "our" markets. The main point was that a huge economy such as China's could not survive without a sophisticated capital market to support it, and that China was not asleep in this regard.

A good audience question came from Don Wesnofske, who asked how best to cope with the situation where an institution is selling derivatives based on one set of models and the client is using another set of models to value the same trade. The selling institution then decides to buy/build a model similar to the client's too, and Don wondered how a single analytics library practically helps this situation where you could price on one model and report P&L using another. One panelist responded that it was mostly the assumptions behind each model that determined differences in price, and that heterogeneous models and hence prices were needed for a market to function correctly. Another concurred and suggested there needed to be an "officially blessed" model within an institution against which valuations are compared. Amusingly for the audience, Steve O'Hanlon (CEO of Numerix) piped up that the problem was easy to resolve in that everyone should use Numerix's models. 

Mike Opal of Microsoft closed the event with his presentation on data, analytics and cloud computing. Mike started by illustrating that the number of internet-enabled devices passed the human population of the world in 2008, and that by 2020 the number of devices would be 50 billion. He showed that the amount of data in the world was 0.8ZB (zettabytes) in 2009, and is projected to reach 8ZB by 2015 and 35ZB by 2020, driven primarily by the growth in internet-enabled devices. Mike also said that the Prism project so much in the news of late involved the construction of a server farm near Salt Lake City of 5ZB in size, so what the industry (in this case the NSA) is trying to do is unimaginable if we were to go back only a few years. He said that Microsoft itself was utterly committed to cloud computing, with 8 datacenters globally and 20 more under construction, at a cost of $500 million per center (I recently saw a datacentre in Redmond, totally unlike what I expected, with racks pre-housed in lorry containers and the containers just unloaded within a gigantic hangar and plugged in - the person showing me around asked me who the busiest person at a Microsoft data center was, and the answer was the truck drivers...)

Talking of "Big Data", he first gave the now-standard disclaimer (as I acknowledge I have too) that he disliked the phrase. I thought he made a good point that Big Data is really about "Small Data", in that a lot of it is about having the capacity to analyze at a tiny, granular level within huge datasets (maybe journalists will rename it? No, don't think so). He gave a couple of good client case studies, one for Westpac and one for Phoenix, on uses of HPC and cloud computing in financial services. He also mentioned the Target retailing story about Big Data, which if you haven't caught it is worth a read. One audience question asked him again how committed Microsoft was to cloud computing given competition from Amazon, Apple and Google. Mike responded that he had only joined Microsoft a year or two back, and in part this was because he believed Microsoft had to succeed and "win" the cloud computing market, given that cloud was not the only way to go for these competitors whereas Microsoft (being a software company) had to succeed at cloud (so far Microsoft have been very helpful to us in relation to Azure, but I guess Amazon and others have other plans.)

In summary a great event from Numerix with good discussions and audience interaction - helped for me by the fact that much of what was said (centralization with flexibility, duality of data and analytics, libraries of everything etc) fits with what Xenomorph and partners like Numerix are delivering for clients. 


Posted by Brian Sentance | 17 June 2013 | 7:23 pm


The Anthropology, Sociology, and Epistemology of Risk

Background - I went along to my first PRMIA event in Stamford, CT last night, with the rather grandiose title of "The Anthropology, Sociology, and Epistemology of Risk". Stamford is about 30 miles north of Manhattan and is the home to major offices of a number of financial markets companies such as Thomson Reuters, RBS and UBS (who apparently have the largest column-less trading floor in the world at their Stamford headquarters - particularly useful piece of trivia for you there...). It also happens to be about 5 minutes drive/train journey away from where I now live, so easy for me to get to (thanks for another useful piece of information I hear you say...). Enough background, more on the event which was a good one with five risk managers involved in an interesting and sometimes philosophical discussion on fundamentally what "risk management" is all about.

Introduction - Marc Groz, who heads the Stamford Chapter of PRMIA, introduced the evening and started by thanking Barry Schwimmer for allowing PRMIA to use the Stamford Innovation Centre (the Old Town Hall) for the meeting. Henrik Neuhaus moderated the panel, and started by outlining the main elements of the event title as a framework for the discussion:

  • Anthropology - risk management is to what purpose?
  • Sociology - how does risk management work?
  • Epistemology - what knowledge is really contained within risk management?

Henrik started by taking a passage about anthropology and replacing human "development" with "risk management" which seemed to fit ok, although the angle I was expecting was much more about human behaviour in risk management than where Henrik started. Henrik asked the panel what results they had seen from risk management and what did that imply about risk management? The panelists seemed a little confused or daunted by the question prompting one of them to ask "Is that the question?".

Business Model and Risk Culture - Elliot Noma dived in by responding that the purpose of risk management obviously depends very much on the institutional goals of the organization. He said that it was as much about what you are forced to do as what you try to do in risk management. Elliot said that the sell-side view of risk management was very regulatory and capital focused, whereas mutual funds look more at risk relative to benchmarks and performance attribution. He added that in the alternatives (hedge-fund) space there were no benchmarks and the focus was more about liquidity and event risk.

Steve Greiner said that it was down to the investment philosophy and how risk is defined and measured. He praised some asset managers where the risk managers sit across from the portfolio managers and are very much involved in the decision making process.

Henrik asked the panel whether any of them had ever defined a “mission statement” for risk management. Marc Groz chipped in that he remembered once defining one, and that it was very different from what others in the institution were expecting and indeed very different from the risk management that he and his department subsequently undertook.

Mark Szycher (of GM Pension Fund) said that risk management split into two areas for him, the first being the symmetrical risks where you need to work out the range of scenarios for a particular trade or decision being taken. The second was the more asymmetrical risks (i.e. downside only) such as those found in operational risk where you are focused on how best to avoid them happening.

Micro Risk Done Well - Santa Federico said that he had experience of some of the major problems experienced at institutions such as Merrill Lynch, Salomon Brothers and MF Global, and that he thought risk management was much more of a cultural problem than a technical one. Santa said he thought that the industry was actually quite good at micro (trade, portfolio) level risk management, but obviously less effective at the large systemic/economic level. Mark asked Santa what the nature of the failures he had experienced was. Santa said that the risks were well modeled, but the assumptions around macro variables such as the housing market proved to be extremely poor.

Keep Dancing? - Henrik asked the panel what might be done better. Elliot made the point that some risks are just in the nature of the business. If a risk manager did not like placing a complex illiquid trade and the institution was based around trading in illiquid markets, then what is a risk manager to do? He quoted the Citi executive who said “whilst the music is still playing we have to dance”. Again he came back to the point that the business model of the institution drives its culture and the emphasis of its risk management (I guess I see what Elliot was saying, but taken one way it implies that regardless of what is going on risk management needs to fit in with it, whereas I am sure he meant that risk managers must fit in with the business model mandated to shareholders).

Risk Attitudes in the USA - Mark said that risk managers need to recognize that the improbable is maybe not so improbable, and should be prepared for the worst rather than just managing risk under “normal” market and institutional behavior. Steven thought that a cultural shift was happening, where not losing money was becoming as important to an organization as making money. He said that in his view Europe and Asia had a stronger risk culture than the United States, with much more consensus, involvement and even control over the trading decisions taken. Put another way, the USA has more of a culture of risk taking than Europe. (I have my own theories on this. Firstly, I think that people are generally much bigger risk takers in the USA than in the UK/Europe, possibly influenced in part by the relative lack of an underlying social safety net – whilst this is not for everyone, I think it produces a very dynamic economy as a result. Secondly, I do not think that the cultural desire in the USA for the much admired “presidential” leader is necessarily the best environment for sound, consensus-based risk management. I would also acknowledge that neither of my two points above seems to have protected Europe much from the worst of the financial crisis, so it is obviously a complex issue!)

Slaves to Data? - Henrik asked whether the panel thought that risk managers were slaves to data? He expanded upon this by asking what kinds of firms encourage qualitative risk management and not just risk management based on Excel spreadsheets? Santa said that this kind of qualitative risk management occurred at a business level and less so at a firm wide level. In particular he thought this kind of culture was in place at many hedge funds, and less so at banks. He cited one example from his banking career in the 1980's, where his immediate boss was shouted off the trading floor by the head of desk, saying that he should never enter the trading floor again (oh those were the days...). 

Sociology and Credibility - Henrik took a passage on the historic development of women's rights and replaced the word "women" with "risk management" to illustrate the challenges risk management faces in trying to get more say and involvement at financial institutions. He asked who the CRO should report to - a CEO? A CIO? Or a board member? Elliot responded by saying this was really an issue of credibility with the business for risk managers and risk management in general. He made the point that often Excel and numbers were used to establish credibility with the business. Elliot added that risk managers with trading experience obviously had more credibility, and to some extent where the CRO reported to was dependent upon the credibility of risk management with the business. 

Trading and Risk Management Mindsets - Elliot expanded on his previous point by saying that the risk management mindset thinks more in terms of unconditional distributions and tries to learn from history. He contrasted this with the "conditional mindset" of a trader, where the time horizon forwards (and backwards) is rarely longer than a few days and where the belief that a trade will work today because it worked yesterday is strong. Elliot added that in assisting the trader, the biggest contribution risk managers can make is to be challenging/helpful on the qualitative side rather than just the quantitative.

Compensation and Transactions - Most of the panel seemed to agree that compensation package structure was a huge influence on the risk culture of an organisation. Mark touched upon a pet topic of mine, which is that it is very hard for a risk manager to gain credibility (and compensation) when risk management is about what could happen as opposed to what did happen. A risk manager blocking a trade due to some potentially very damaging outcomes will not gain any credibility with the business if the trading outcome for the suggested trade just happened to come out positive. There seemed to be consensus that some of the traditional compensation models based on short-term transactional frequency and size were ill-formed (given the limited downside for the individual), and that whilst the panel reserved judgement on the effectiveness of recent regulation, moves towards longer-term compensation were to be welcomed from a risk perspective.

MF Global and Business Models - Santa described some of his experiences at MF Global, where Corzine moved what was essentially a broker into taking positions in European sovereign bonds. Santa said that the risk management culture and capabilities were not in place to be robust against senior management over such a business model move. Elliot mentioned that he had been courted for trades by MF Global and had been concerned that they did not offer electronic execution and told him that doing trades through a human was always best. Mark said that in the area of pension fund management there was much greater fiduciary responsibility (i.e. behave badly and you will go to jail) and maybe that kind of responsibility had more of a place in financial markets too. Coming back to the question of who a CRO should report to, Mark also said that questions should be asked to seek out those who are 1) less likely to suffer from the "agency" problem of conflicts of interest and, on a related note, those who are 2) less likely to have personal biases towards particular behaviours or decisions.

Santa said that in his opinion hedge funds in general had a better culture where risk management opinions were heard and advice taken. Mark said that risk managers who could get the business to accept moral persuasion were in a much stronger position to add value to the business rather than simply being able to "block" particular trades. Elliot cited one experience he had where the traders under his watch noticed that a particular type of trade (basis trades) did not increase their reported risk levels, and so became more focussed on gaming the risk controls to achieve high returns without (reported) risk. The panel seemed to be in general agreement that risk managers with trading experience were more credible with the business but also more aware of the trader mindset and behaviors. 

Do we know what we know? - Henrik moved to his third and final subsection of the evening, asking the panel whether risk managers really know what they think they know. Elliot said that traders and risk managers speak a different language, with traders living in the now, thinking only of the implications of possible events such as those we have seen with Cyprus or the fiscal cliff, where the risk management view was much less conditioned and more historical. Steven re-emphasised the earlier point that risk management at this micro trading level was fine but this was not what caused events such as the collapse of MF Global.

Rational argument isn't communication - Santa said that most risk managers come from a quant (physics, maths, engineering) background and like structured arguments based upon well understood rational foundations. He said that this way of thinking was alien to many traders, and as such it was a communication challenge for risk managers to explain things in a way that traders would actually spend some time considering. On the modelling side of things, Santa said that sometimes traders dismissed models as being "too quant" and sometimes traders followed models all too blindly without questioning or understanding the simplifying assumptions they are based on. Santa summarised by saying that risk management needs to be intuitive for traders and not just academically based. Mark added that a quantitative focus can sometimes become too narrow (modeler's manifesto anyone?) and made the very profound point that unfortunately precision often wins over relevance in the creation and use of many models. Steven added that traders often deal with absolutes, such as knowing the spread between two bonds to the nearest basis point, whereas a VaR number from a risk manager is an estimate that should really be thought of as lying within a range of values. This is alien to the way traders think and hence harder to explain.

Unanticipated Risk - An audience member asked whether risk management should focus mainly on unanticipated risks rather than "normal" risks. Elliot said that in his trading he was always thinking and checking whether the markets were changing or continuing with their recent near-term behaviour patterns. Steven said that history was useful to risk management when markets were "normal", but in times of regime shift this was not the case, and cited the example of the change in markets when Mario Draghi announced that the ECB would stand behind the Euro and its member nations. 

Risky Achievements - Henrik closed the panel by asking each member what they thought was their own greatest achievement in risk management. Elliot cited a time when he identified that a particular hedge fund had a relatively inconspicuous position/trade that he flagged as potentially extremely dangerous, and he was proved correct when the fund closed down because of it. Steven said he was proud of some good work he and his team did on stress testing involving Greek bonds and the Eurozone. Santa said that some of the work he had done on portfolio "risk overlays" was good. Mark ended the panel by saying that he thought his biggest achievement was when the traders and portfolio managers started to come to the risk management department to ask opinions before placing key trades. Henrik and the audience thanked the panel for their input and time.

An Insured View - After the panel closed I spoke with an actuary who said that he had greatly enjoyed the panel discussions but was surprised that, when talking of how best to support the risk management function in being independent and giving "bad" news to the business, the role of auditors was not mentioned. He said he felt that auditors were a key support to insurers in ensuring any issues were allowed to come to light. So food for thought there as to whether financial markets can learn from other industry sectors.

Summary - great evening of discussion, only downside being the absence of wine once the panel had closed!

 


Posted by Brian Sentance | 25 April 2013 | 8:27 pm


PRMIA on ETFs #4 and #5 - ETF regulation and risk management

Katherine Moriarty was a very interesting speaker at the ETF event, and she talked us through some of the regulatory issues around ETFs, particularly non-transparent ETFs. Katherine provided some history on the regulation of the fund industry in the US, particularly the Investment Company Act of 1940, which was enacted to restore public confidence in the fund management industry following the troubled times of the late 1920's and the 1930's.

The fundamental concern for the SEC (the regulatory body here) is to ensure that the provider of the fund products cannot game investors by providing false or incorrect valuations to maximize profits. Based on the "'40 Act" as she termed it, the SEC has granted exemptions to allow various index and fund products: for smart indices full disclosure of the rules involved is required, and with active indices the constituents are published. However with active ETFs, retail investors are at a disadvantage to authorized participants (APs, the ETF providers) since there is no transparency around the constituents.

Obviously fund managers want to manage portfolios without disclosure (to maintain the "secrets" of their success, to keep trading costs low etc), but no solution has yet been found for ETFs that satisfies the SEC that the small guy is not at risk from this lack of transparency. Katherine said that participants were still trying to come up with solutions to this problem and that the SEC remains open to an exemption for anything that, in its view, "works" (sounds like someone will make a lot of money when/if a solution is found). Solutions tried so far include using blind trusts and proxy or shadow portfolios. Someone from the audience asked about the relative merits of active ETFs when compared to active mutual funds - Katherine answered that the APs wanted an exchange traded product as a new distribution channel (and I guess us "Joe Soaps" want lower fees for active management...)

Vikas Kalra of MSCI had the unenviable position of giving the last presentation of the evening, and he said he would keep his talk short since he was aware he was standing between us and the cocktail reception to follow. Vikas described the problem that many risk managers face, which is that doing risk management for a portfolio containing ETFs is fine when the ETF is of a "look through" type (i.e. constituents available), but when the ETF is opaque (no/little/uncertain constituent data) then the choices are usually 1) remove the ETF from the risk calculation or 2) substitute some proxy instrument.

Vikas said the Barra part of MSCI had come up with the solution to analyse ETF "styles". From what I could tell, this looked like some sophisticated form of 2) above, where Barra had done the analysis to enable an opaque ETF to be replaced by some more transparent proxy which allowed constituents to be analysed within the risk process and correlations etc recognised. Vikas said that 400 ETFs and ETNs were now covered in their product offering.
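
For what it's worth, a minimal sketch of approach 2) is below. The proxy weights, factor volatilities and correlations are invented purely for illustration and have nothing to do with Barra's actual methodology, but it shows the basic idea of mapping an opaque ETF onto transparent exposures so it can be included in a portfolio risk calculation:

```python
import numpy as np

# Portfolio holds an opaque ETF alongside direct positions ($m).
holdings = {"OPAQUE_ETF": 20.0, "STOCK_A": 10.0, "BOND_B": 15.0}

# Assumed "style" proxy for the opaque ETF: weights onto transparent factors.
etf_proxy = {"US_LARGE_CAP": 0.7, "US_CREDIT": 0.3}

# Map everything onto the transparent factor set.
factor_exposure = {"US_LARGE_CAP": holdings["STOCK_A"], "US_CREDIT": holdings["BOND_B"]}
for factor, w in etf_proxy.items():
    factor_exposure[factor] += holdings["OPAQUE_ETF"] * w

# Illustrative factor volatilities and correlation, then portfolio volatility.
vols = np.array([0.18, 0.06])
corr = np.array([[1.0, 0.3], [0.3, 1.0]])
cov = np.outer(vols, vols) * corr
x = np.array([factor_exposure["US_LARGE_CAP"], factor_exposure["US_CREDIT"]])
port_vol = float(np.sqrt(x @ cov @ x))
print(f"Proxy-based portfolio volatility: ${port_vol:.1f}m p.a.")
```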

Conclusion - Overall a very interesting event that improved my knowledge of ETFs and had some great speakers.

Posted by Brian Sentance | 23 April 2013 | 10:26 pm


PRMIA on ETFs #3 - Tradable Volatility Exposure in ETP Packaging

Joanne Hill of ProShares presented next at the event. Joanne started her talk by showing volatility levels from 1900 to the present day, and how historic volatility over the past 10 years seems to be at pre-1950's levels. Joanne had a lot of slides that she took us through (to be available on the event link above) which would be challenging to write up every one of (or at least that is my excuse and I am sticking to it...).

Joanne said that the VIX trades about 4% above realised volatility, which she described as being due to expectations that "something" might happen (so financial markets can be cautious it seems!). Joanne seemed almost disappointed that we seem now to have entered a period of relatively boring (?!) market activity following the end of the crisis given that the VIX is now trading at pre-2007 lows. In answer to audience questions she said that inverse volatility indices were growing as were products dependent on dynamic index strategies.

 

Posted by Brian Sentance | 23 April 2013 | 9:12 pm


PRMIA on ETFs #2 - How the ETF Market Works: Quant for the Traders

Next up in the event was Phil Mackintosh of Credit Suisse who gave his presentation on trading ETFs, starting with some scene-setting for the market. Phil said that the ETP market had expanded enormously since its start in 1993, currently with over $2 trillion of assets ($1.3 trillion in the US). He mentioned that $1 in $4 of flow in the US was ETF related, and that the US ETF market was larger than the whole of the Asian equity market, but, again emphasizing relative size, the US ETF market was much smaller than the US equities and futures markets. 

He said that counter to the impression some have, the market is 52% institutional and only 48% retail. He mentioned that some macro hedge fund managers he speaks to manage all their business through ETPs. ETFs are available across all asset classes from alternatives, currencies, commodities, fixed income, international and domestic equities. Looking at fees, these tend to reside in the 0.1% to 1% bracket, with larger fees charged only for products that have specific characteristics and/or that are difficult to replicate.

Phil illustrated how funds have consistently flowed into ETFs over recent years, in contrast with the mutual funds industry, with around 25% in international equity and around 30% in fixed income. He said that corporate fixed income, low volatility equity indices and real estate ETFs were all on the up in terms of funds flow. 

He said that ETF values were calculated every 15 seconds and oscillated around their NAV, with arbitrage activity keeping ETF prices in line with underlying prices. Phil said that spreads in ETFs could be tighter than in their underlyings and that ETF spreads tightened for ETFs over $200m. 
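
As a minimal sketch of that arbitrage mechanism (the prices and the 20 basis point trigger below are invented for illustration), a participant compares the quoted ETF price with the indicative NAV and creates or redeems units when the premium or discount gets large enough:

```python
def premium_to_inav(etf_price: float, inav: float) -> float:
    """Premium (+) or discount (-) of the quoted ETF price to its indicative NAV."""
    return (etf_price - inav) / inav

inav = 100.25     # indicative NAV published every ~15 seconds (assumed level)
price = 100.60    # traded ETF price (assumed level)
prem = premium_to_inav(price, inav)
print(f"Premium to iNAV: {prem:+.2%}")

# A simple arbitrage trigger (the 20bp threshold is an assumption).
if abs(prem) > 0.0020:
    action = "create units (sell ETF, buy basket)" if prem > 0 else "redeem units (buy ETF, sell basket)"
    print(f"Arbitrage signal: {action}")
```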

Phil warned of a few traps in trading ETFs. He illustrated the trading volumes of ETFs during an average day, which showed that they tended to be traded in volume in the morning but not in the (late) afternoon (need enlightening as to why..). He added that they were specifically not a trade for the market open or close. He said that large ETF trades sometimes caused NAV disconnects, and mentioned deviations around NAV due to underlying liquidity levels. He also said that contango can become a problem for VIX futures related products.

There were a few audience questions. One concerned how fixed income ETFs became the price discovery mechanism for some assets during the crisis, given the liquidity and timeliness of the ETF relative to its underlyings. Another question concerned why the US ETF market is larger and more homogenous than in Europe. Phil said that Europe is not dominated by 3 providers as in the US, plus each nationality in Europe tends to have preferences for ETF products produced by its own country. There was also further discussion on shorting fixed income ETFs since they were more liquid than the primary market. (Note to self: need to find out more about the details of the ETF redemption and creation process).

Overall a great talk by a very "sharp" presenter (like a lot of good traders Phil seemed to understand the relationships in the market without needing to think about them too heavily). 

 

Posted by Brian Sentance | 23 April 2013 | 8:52 pm


PRMIA on ETFs #1 - Index-Based Approaches for Risk Management in Wealth Management

It seems to be ETF week for events in New York this week, one of which was hosted by PRMIA, Credit Suisse and MSCI last night called "Risk Management of and with ETFs/Indices". The event was chaired by Seddik Meziani of Montclair State University, who opened with thanks for the sponsors and the speakers for coming along, and described the great variety of asset exposures now available in Exchange Traded Products (ETPs) and the growth in ETF assets since their formation in 1993. He also mentioned that this was the first PRMIA event in NYC specifically on ETFs. 

Index-Based Approaches for Risk Management in Wealth Management - Shaun Weuzbach of S&P Dow Jones Indices started with his presentation. Shaun's initial point was to consider whether "Buy & Hold" works, given the bad press it received over the crisis. Shaun said that the peak to trough US equity loss during the recent crisis was 57%, but when he hears of investors that made losses of this order he thinks that this was more down to a lack of diversification and poor risk management than to inherent failures in buy and hold. To justify this, he cited a simple example portfolio constructed of 60% equity and 40% fixed income, which only lost 13% peak to trough during the crisis. He also illustrated that equity market losses of 5% or more were far more frequent during the period 1945-2012 than many people imagine, and that investors should be aware of this in portfolio construction.

Shaun suggested that we are in the third innings of indexing:

  1. Broad-based benchmark indices
  2. Precise sector-and thematic-based indices
  3. Factor-based indices (involving active strategies)

Where the factor-based indices might include ETF strategies based on/correlated with things such as dividend payments, equity weightings, fundamentals, revenues, GDP weights and volatility. 

He then described how a simple strategy index based around lowering volatility could work. Shaun suggested that low volatility was easier to explain to retail investors than minimizing variance. The process for his example low volatility index was to take the 100 lowest volatility stocks out of the S&P 500 and weight them by the inverse of volatility, rebalancing every quarter.
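
A minimal sketch of that construction is below (the one-year lookback window, the annualisation and the random placeholder data are my assumptions for illustration, not S&P's published methodology):

```python
import numpy as np
import pandas as pd

def low_vol_index_weights(prices: pd.DataFrame, n: int = 100,
                          lookback: int = 252) -> pd.Series:
    """Pick the n lowest-volatility stocks and weight them by inverse volatility.

    prices: daily close prices, one column per index constituent.
    Quarterly rebalancing would simply re-run this selection each quarter.
    """
    returns = prices.tail(lookback).pct_change().dropna()
    vol = returns.std() * np.sqrt(252)          # annualised volatility per stock
    selected = vol.nsmallest(n)                 # the n lowest-volatility names
    weights = (1.0 / selected) / (1.0 / selected).sum()
    return weights

# Toy example with random data standing in for real constituent prices.
rng = np.random.default_rng(0)
tickers = [f"STOCK_{i}" for i in range(500)]
prices = pd.DataFrame(
    100 * np.exp(np.cumsum(rng.normal(0, 0.01, size=(300, 500)), axis=0)),
    columns=tickers)
print(low_vol_index_weights(prices, n=100).head())
```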

He illustrated how this index exhibited lower volatility with higher returns over the past 13 years or so (this looked like a practical example illustrating some of the advantages of having a less volatile geometric mean of returns from what I could see). He also said that this index had worked across both developed and emerging markets.

Apparently this index has been available for only 2 years, so 11 years of the performance figures were generated from back-testing (the figures looked good, but a strategy theoretically backtested over historic markets when the strategy was not used and did not exist should always be examined sceptically).

Looking at the sector composition of this low volatility index, one of the very interesting points that Shaun made was that the index got out of the financials sector some two quarters before Lehman went down (maybe the index was less influenced by groupthink or the fear of realising losses?).

Shaun then progressed to take a short look at VIX-based strategies, describing the VIX as the "investor fear gauge". In particular he considered the S&P VIX Short-Term Futures Index, which he said exhibits a high negative correlation with the S&P 500 (around -0.8) and a high positive correlation with the VIX spot (approx +0.8). He said that explaining these products as portfolio insurance was sometimes hard for financial advisors to do, and features such as the "roll cost" (moving from one set of futures contracts to others as some expire) were also harder to explain to non-institutional investors.
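
As a minimal illustration of that roll cost (the futures levels below are invented): when the VIX futures curve is in contango, rolling from the expiring contract into the richer next-month contract creates a drag on the index if spot volatility does not move:

```python
# Illustrative VIX futures roll in contango (all levels are assumptions).
front_future = 15.0   # expiring front-month contract
next_future = 16.5    # next-month contract being rolled into

# Rolling means selling the front at 15.0 and buying the next month at 16.5.
# If spot VIX is unchanged a month later, the new contract converges back
# down towards spot, and that difference shows up as a drag on the index.
roll_drag = (next_future - front_future) / front_future
print(f"Approximate roll drag for this month: {roll_drag:.1%}")
```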

A few audience questions followed, one concerned with whether one could capture principal retention in fixed income ETFs. Shaun briefly mentioned that the audience member should look at "maturity series" products in the ETP market. One audience member had concerns over the liquidity of ETF underlyings, to which Shaun said that S&P have very strict criteria for their indices, ensuring that the free float of underlyings is high and that the ETF does not dominate liquidity in the underlying market. 

Overall a very good presentation from a knowledgeable speaker.

 

 

Posted by Brian Sentance | 23 April 2013 | 6:30 pm


Spreadsheet control and contagion

Just saw a reference on LinkedIn to this FT article "Finance groups lack spreadsheet controls". I started to write a quick response and, given it is one of my major hobby-horses, ended up doing a bit of an essay, so I decided to post it here too:

"As many people have pointed out elsewhere, much of the problem with spreadsheet usage is that they are not treated as a corporate and IT asset, and as such things like testing, peer review and general QA are not applied (mind you, maybe more of that should still be applied to many mainstream software systems in financial markets...). 

Ralph and the guys at Cluster Seven do a great job in helping institutions to manage and monitor spreadsheet usage (I like Ralph's "we are CCTV for spreadsheets" analogy), but I think a fundamental (and often overlooked) consideration is to ask yourself why the business users involved decided that they needed spreadsheets to manage trading and risk in the first place. It is a bit like trying to address the symptoms of an illness without ever considering how we got the illness in the first place. 

Excel is a great tool, but to quote Spider-Man "with great power comes great responsibility" and I guess we can all see the consequences of not taking the usage of spreadsheets seriously and responsibly. So next time the trader or risk manager says "we've just built this really great model in Excel" ask them why they built it in Excel, and why they didn't build upon the existing corporate IT solutions and tools. In these cost- and risk- conscious times, I think the answers would be interesting..."

 

Posted by Brian Sentance | 27 March 2013 | 11:09 am


Data Management for Risk at Mediobanca

Very pleased to announce today that Mediobanca, the leading investment bank in Italy, has decided to select TimeScape as its data management system. You can see the press release here.

Posted by Brian Sentance | 25 March 2013 | 12:58 pm


Regulating the same

Thanks to one of my PRMIA colleagues for pointing out this article in the WSJ, talking about how regulatory driven stress testing in the US is promoting conformity and reducing innovation in approaches to risk management. Echoes some posts from last year on regulation increasing risk and diversity of regulation.

Posted by Brian Sentance | 20 March 2013 | 3:18 pm


Asset Management CRO Views on Risk

Notes I took from a recent Oliver Wyman sponsored PRMIA event in New York, which brought together a panel of senior managers and CROs from leading asset management organizations to discuss the role of risk management for asset managers, specifically the types of governance and controls necessary to safeguard clients' assets in the current macro environment. You can access the notes here on the PRMIA site.

Posted by Brian Sentance | 14 March 2013 | 11:23 am


Analytics Strategy from Numerix

Good post from Jim Jockle over at Numerix - main theme is around having an "analytics" strategy in place in addition to (and probably as part of) a "Big Data" strategy. Fits strongly around Xenomorph's ideas on having both data management and analytics management in place (a few posts on this in the past, try this one from a few years back) - analytics generate the most valuable data of all, yet the data generated by analytics and the input data that supports analytics is largely ignored as being too business focussed for many data management vendors to deal with, and too low level for many of the risk management system vendors to deal with. Into this gap in functionality falls the risk manager (supported by many spreadsheets!), who has to spend too much time organizing and validating data, and too little time on risk management itself.

Within risk management, I think it comes down to having the appropriate technical layers in place of data management, analytics/pricing management and risk model management. Ok it is a greatly simplified representation of the architecture needed (apologies to any techies reading this), but the majority of financial institutions do not have these distinct layers in place, with each of these layers providing easy "business user" access to allow risk managers to get to the "detail" of the data when regulators, auditors and clients demand it. Regulators are finally waking up to the data issue (see Basel on data aggregation for instance) but more work is needed to pull analytics into the technical architecture/strategy conversation, and not just confine regulatory discussions of pricing analytics to model risk. 
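
To make the "analytics generate data" point concrete, here is a minimal sketch (the field names and the model identifier are purely illustrative assumptions) of treating an analytic output as first-class data with lineage back to its inputs, so that a risk manager can drill into the detail when regulators, auditors or clients ask:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DerivedDatum:
    """Analytic output stored as first-class data, with lineage back to its inputs."""
    name: str         # e.g. "bond_theoretical_price"
    value: float
    model: str        # which analytic/pricing model produced it
    inputs: dict      # the raw market data the calculation consumed
    asof: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

# The analytics layer writes results into the data management layer rather than
# leaving them trapped in a spreadsheet.
result = DerivedDatum(
    name="bond_theoretical_price",
    value=101.37,
    model="discounted_cashflow_v2",          # hypothetical model identifier
    inputs={"yield_curve": "USD_OIS_2013-02-14", "spread_bp": 85},
)
print(result)
```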

Posted by Brian Sentance | 14 February 2013 | 2:50 pm


Big Data – What is its Value to Risk Management?

A little late on these notes from this PRMIA Event on Big Data in Risk Management that I helped to organize last month at the Harmonie Club in New York. Big thank you to my PRMIA colleagues for taking the notes and for helping me pull this write-up together, plus thanks to Microsoft and all who helped out on the night.

Introduction: Navin Sharma (of Western Asset Management and Co-Regional Director of PRMIA NYC) introduced the event and began by thanking Microsoft for its support in sponsoring the evening. Navin outlined how he thought the advent of “Big Data” technologies was very exciting for risk management, opening up opportunities to address risk and regulatory problems that previously might have been considered out of reach.

Navin defined Big Data as structured or unstructured data received at high volumes and requiring very large data storage. Its characteristics include a high velocity of record creation, extreme volumes, a wide variety of data formats, variable latencies, and complexity of data types. Additionally, he noted that relative to other industries, financial services has in the past created perhaps the largest historical sets of data, and it continually creates enormous amounts of data on a daily or moment-by-moment basis. Examples include options data, high frequency trading, and unstructured data such as social media. Its usage provides potential competitive advantages in trading and investment management. Also, by using Big Data it is possible to have faster and more accurate recognition of potential risks via seemingly disparate data - leading to timelier and more complete risk management of investments and firms’ assets. Finally, the use of Big Data technologies is in part being driven by regulatory pressures from Dodd-Frank, Basel III, Solvency II, the Markets in Financial Instruments Directives (1 & 2) as well as the Markets in Financial Instruments Regulation.

Navin also noted that we will seek to answer questions such as:

  • What is the impact of big data on asset management?
  • How can Big Data’s impact enhance risk management?
  • How is big data used to enhance operational risk?

Presentation 1: Big Data: What Is It and Where Did It Come From?: The first presentation was given by Michael Di Stefano (of Blinksis Technologies), and was titled “Big Data. What is it and where did it come from?”. You can find a copy of Michael’s presentation here. In summary, Michael started by saying that there are many definitions of Big Data, mainly defined as technology that deals with data problems that are either too large, too fast or too complex for conventional database technology. Michael briefly touched upon the many different technologies within Big Data such as Hadoop, MapReduce and databases such as Cassandra and MongoDB. He described some of the origins of Big Data technology in internet search, social networks and other fields. Michael described the “4 V’s” of Big Data - Volume, Velocity, Variety and Value - with a key point from Michael being the “time to Value” of whatever you are using Big Data for. Michael concluded his talk with some business examples around the use of sentiment analysis in financial markets and the application of Big Data to real-time trading surveillance.
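
For readers new to the MapReduce model mentioned above, a toy sketch is below (pure Python with invented data; a real Hadoop job would distribute the map and reduce phases across a cluster rather than running them in one process):

```python
from collections import defaultdict

# Toy input: (ticker, count) records, standing in for a large message stream.
messages = [("AAPL", 1), ("MSFT", 1), ("AAPL", 1), ("IBM", 1), ("AAPL", 1)]

def map_phase(records):
    # In Hadoop, map tasks like this run in parallel across the cluster.
    for key, value in records:
        yield key, value

def reduce_phase(mapped):
    # The framework shuffles/groups by key before reducers aggregate each group.
    grouped = defaultdict(int)
    for key, value in mapped:
        grouped[key] += value
    return dict(grouped)

print(reduce_phase(map_phase(messages)))   # {'AAPL': 3, 'MSFT': 1, 'IBM': 1}
```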

Presentation 2: Big Data Strategies for Risk Management: The second presentation “Big Data Strategies for Risk Management” was introduced by Colleen Healy of Microsoft (presentation here). Colleen started by saying expectations of risk management are rising, and that prior to 2008 not many institutions had a good handle on the risks they were taking. Risk analysis needs to be done across multiple asset types, more frequently and at ever greater granularity. Pressure is coming from everywhere including company boards, regulators, shareholders, customers, counterparties and society in general. Colleen used to head investor relations at Microsoft and put forward a number of points:

  • A long line of sight of one risk factor does not mean that we have a line of sight on other risks around.
  • Good risk management should be based on simple questions.
  • Reliance on 3rd parties for understanding risk should be minimized.
  • Understand not just the asset, but also at the correlated asset level.
  • The world is full of fast markets driving even more need for risk control
  • Intraday and real-time risk now becoming necessary for line of sight and dealing with the regulators
  • Now need to look at risk management at a most granular level.

Colleen explained some of the reasons why good risk management remains a work in progress, and that data is a key foundation for better risk management. However data has been hard to access, analyze, visualize and understand, and used this to link to the next part of the presentation by Denny Yu of Numerix.

Denny explained that new regulations involving measures such as Potential Future Exposure (PFE) and Credit Value Adjustment (CVA) were moving the number of calculations needed in risk management to a level well above that required by methodologies such as Value at Risk (VaR). Denny illustrated how a typical VaR calculation on a reasonably sized portfolio might need 2,500,000 instrument valuations and how PFE might require as many as 2,000,000,000. He then explained more of the architecture he would see as optimal for such a process and illustrated some of the analysis he had done using Excel spreadsheets linked to Microsoft’s high performance computing technology.
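
As a back-of-the-envelope illustration of where numbers of that order come from (the portfolio size, scenario count, path count and time steps below are my assumptions, chosen only to reproduce the orders of magnitude quoted, not figures from Denny's talk):

```python
# Rough valuation counts for VaR versus PFE (all sizing assumptions are illustrative).
instruments = 5_000

# VaR: revalue each instrument under each historical/Monte Carlo scenario.
var_scenarios = 500
var_valuations = instruments * var_scenarios                  # 2,500,000

# PFE: revalue each instrument on each simulated path at each future time step.
pfe_paths = 2_000
pfe_time_steps = 200
pfe_valuations = instruments * pfe_paths * pfe_time_steps     # 2,000,000,000

print(f"VaR valuations: {var_valuations:,}")
print(f"PFE valuations: {pfe_valuations:,}")
```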

Presentation 3: Big Data in Practice: Unintentional Portfolio Risk: Kevin Chen of Opera Solutions gave the third presentation, titled “Unintentional Risk via Large-Scale Risk Clustering”. You can find a copy of the presentation here. In summary, the presentation was quite visual, illustrating how large-scale empirical analysis of portfolio data can produce some interesting insights into portfolio risk and how risks become “clustered”. In many ways the analysis was reminiscent of an empirical form of principal component analysis, i.e. where you can see and understand more about your portfolio’s risk without actually being able to relate the main factors directly to any traditional factor analysis. 

Panel Discussion: Brian Sentance of Xenomorph and the PRMIA NYC Steering Committee then moderated a panel discussion. The first question was directed at Michael: “Is the relational database dead?” Michael replied that in his view relational databases were not dead, and indeed for dealing with problems well suited to relational representation they were still, and would continue to be, very good. Michael said that NoSQL/Big Data technologies were complementary to relational databases, dealing with new types of data and new sizes of problem that relational databases are not well designed for. Brian asked Michael whether the advent of these new database technologies would drive the relational database vendors to extend the capabilities and performance of their offerings. Michael replied that he thought this was highly likely, but only time would tell whether this approach will be successful given the innovation in the market at the moment. Colleen Healy added that the advent of Big Data did not mean throwing out established technology, but rather integrating established technology with the new, such as Microsoft SQL Server working with the Hadoop framework.

Brian asked the panel whether they thought visualization would make a big impact within Big Data? Ken Akoundi said that the front end applications used to make the data/analysis more useful will evolve very quickly. Brian asked whether this would be reminiscent of the days when VaR first appeared, when a single number arguably became a false proxy for risk measurement and management? Ken replied that the size of the data problem had increased massively from when VaR was first used in 1994, and that visualization and other automated techniques were very much needed if the headache of capturing, cleansing and understanding data was to be addressed.

Brian asked whether Big Data would address the data integration issue of siloed trading systems? Colleen replied that Big Data needs to work across all the silos found in many financial organizations, or it isn’t “Big Data”. There was general consensus from the panel that legacy systems and people politics were also behind some of the issues found in addressing the data silo issue.

Brian asked if the panel thought the skills needed in risk management would change due to Big Data? Colleen replied that effective Big Data solutions require all kinds of people, with skills across a broad range of specific disciplines such as visualization. Generally the panel thought that data and data analysis would play an increasingly important part in risk management. Ken put forward his view that all Big Data problems should start with a business problem, not just a technology focus. For example, are there any better ways to predict stock market movements based on the consumption of larger and more diverse sources of information? In terms of risk management skills, Denny said that risk management of 15 years ago was based on relatively simple econometrics. Fast forward to today, and risk calculations such as CVA are statistically and computationally very heavy, and trading is increasingly automated across all asset classes. As a result, Denny suggested that even the PRMIA PRM syllabus should change to focus more on data and data technology given the importance of data to risk management.

Asked how best Big Data should be applied, Denny echoed Ken in saying that understanding the business problem first was vital, but that obviously Big Data opened up the capability to aggregate and work with larger datasets than ever before. Brian then asked what advice the panel would give to risk managers faced with an IT department about to embark upon using Big Data technologies? Assuming that the business problem is well understood, Michael said that the business needed some familiarity with the broad concepts of Big Data, what it can and cannot do and how it fits with more mainstream technologies. Colleen said that there are some problems that only Big Data can solve, so understanding the technical need is a first checkpoint. Obviously IT people like working with new technologies and this needs to be monitored, but so long as the business problem is defined and valid for Big Data, people should be encouraged to learn new technologies and new skills. Kevin also took a very positive view that IT departments should be encouraged to experiment with these new technologies and understand what is possible, but that projects should have well-defined assessment/cut-off points, as with any good project management, to decide if the project is progressing well. Ken put forward that many IT staff were new to the scale of the problems being addressed with Big Data, and that his own company Opera Solutions had an advantage in its deep expertise of large-scale data integration to deliver quicker on project timelines.

Audience Questions: There then followed a number of audience questions. The first few related to other ideas/kinds of problems that could be analyzed using the kind of modeling that Opera had demonstrated. Ken said that there were obvious extensions that Opera had not got around to doing just yet. One audience member asked how well all the Big Data analysis could be aggregated/presented to make it understandable and usable to humans? Denny suggested that it was vital that such analysis was made accessible to the user, and there was general consensus across the panel that man vs. machine was an interesting issue to develop in considering what is possible with Big Data. The next audience question was around whether all of this data analysis was affordable from a practical point of view. Brian pointed out that there was a lot of waste in current practices in the industry, with wasteful duplication of ticker plants and other data types across many financial institutions, large and small. This duplication is driven primarily by the perceived need to implement each institution’s proprietary analysis techniques; this kind of customization is not yet available from the major data vendors, but will become more possible as cloud technology such as Microsoft’s Azure develops further. There was a lot of audience interest in whether Big Data could lead to better understanding of causal relationships in markets rather than simply correlations. The panel responded that causal relationships were harder to understand, particularly in a dynamic market with dynamic relationships, but that insight into correlation was at the very least useful and could lead to better understanding of the drivers as more datasets are analyzed.

 

Posted by Brian Sentance | 8 February 2013 | 3:14 pm


Rutgers Quantitative Finance Summit

I got my first tour around the NYSE trading floor on Wednesday night, courtesy of an event by Rutgers University on risk. Good event, mainly built around a panel discussion moderated by Nicholas Dunbar (Editor of the Bloomberg Risk newsletter), and involving David Belmont (Commonfund CRO), Adam Litke (Chief Risk Strategist for Bloomberg), Hilmar Schaumann (Fortress Investment CRO) and Sanjay Sharma (CRO of Global Arbitrage and Trading at RBC).

Nick first asked the panel how do you define and measure risk? Hilmar responded that risk measurement is based around two main activities: 1) understanding how a book/portfolio is positioned (the static view) and 2) understanding sensitivities to risks that impact P&L (the dynamic view). Hilmar mentioned the use of historical data as a guide to current risks that are difficult to measure, but emphasised the need for a qualitative approach when looking at the risks being taken.

David said that he looks at both risk and uncertainty - with risk being defined as those impacts you can measure/estimate. He said that historical analysis was useful but limited, given it is based only on what has happened. He thought that scenario analysis was a stronger tool. (I guess with historical analysis you at least get some idea of the impact of things that could not be predicted, even if it is based on one "simulation" path i.e. reality, whereas you have more flexibility with scenario management to cover all bases - but limited, I guess, to those bases you can imagine). David said that path-dependent risks such as those in the credit markets in the last crisis were some of the most difficult to deal with.

Adam said that you need to understand why you are measuring risk and understand what risks you are prepared to take. He said that at Wachovia they knew prior to the 2008 crisis that a 25% house price fall in California would be a near death experience for the bank, and in the event the losses were much greater than 25%. His point was really that you must decide what risks you want to survive and at what level. He said that sound common-sense judgement is needed to decide whether a scenario is realistic or not.

Sanjay said that risk managers need to maintain a lot of humility and not to over-trust risk measurements. He described a little of the risk approach used at RBC, where he said they use over 80 different models and employ them as layers/different views on risk to be brought together. He said they start with VaR as a base analysis, but build on this with scenarios, greeks and then on to other more specific reports and analysis. He emphasised that communication is a vital skill for risk managers to get their views and ideas across.

Nicholas then moved on to ask how risk managers should take or reduce risks - getting away from risk measurement and on to risk management. Adam said that risks should be delegated out to those that manage them, but this needs to be combined with responsibility for the risks too. Keep people and departments within the bounds of their remit. Be prepared to talk a different business language to different stakeholders dependent upon their understanding and their motivations. David gave some examples of this in his case, where endowment funds want risk premiums over many years and risks are translated/quantified into practical terms, for example a new college building not going ahead.

Hilmar said that hedge funds are supposed to take risks, and that the key was not necessarily to avoid losses (although avoid them if you can) but rather to avoid surprises. Like the other speakers, Hilmar emphasised that communication of risks to key stakeholders was vital. He also added the key point that if you don't like a risk you have identified, then try first to take it off rather than hedging it, since hedging could potentially add basis risk and simply more complication.

Nicholas then asked Sanjay how risk managers should deal with bringing difficult news to the business? Sanjay suggested that any bad news should be approached in the form of "actionable transparency", i.e. not only do you communicate how bad the risk is to all stakeholders, but you come along with actionable approaches to dealing with the risk. In all of his experience and despite the crisis, Sanjay's view is that traders do not want to lose money, and if you come with solutions they will listen. He concluded by saying that qualitative analysis should also be used, citing the hypothetical example that you should take notice of dogs (yes, the animal!) buying mortgages, whether or not the mortgages are AAA rated.

Nicholas asked the panel members in turn what risks they are concerned about currently? David said he believed that many risks were not priced into the market currently. He was concerned about policy impacts of action by the ECB and the Fed, and thought the current and forward levels of volatility were low. In fixed income markets he thought that Dodd-Frank may have detrimental effects, particularly with the current lack of clarity about what is proprietary trading and what is market-making. Should policies and interest rates change, he thought that risk managers should look carefully at what will happen as funds flow out of fixed income and into equities.

Hilmar talked about the postponement of the US debt ceiling limits and said that US Government policy battles continue to be an obvious source of risk. In Europe, many countries have elections this year which would be interesting, and the problems in the Euro-zone are less than they were, but problems in Cyprus could fan the flames of more problems and anxiety. Hilmar said that Japan's new policy of targeting 2% inflation may have effects on the willingness of domestic investors to buy JGBs.

Sanjay said he was worried. In the "Greenspan Years" prior to 2008 a quasi government guarantee on the banks was effectively put in place and that we continue to live with cheap money. When policy eventually changes and interest rates rise, Sanjay wondered whether the world was ready for the wholesale asset revaluation that would then be required.

Adam's concerns were mainly around identifying what will be the cause of the next panic in the market. Whilst he said he is in favour of central clearing for OTC derivatives, he thought that the changing market structure combined with implementing central clearing had not been fully thought through and this was a worry to him.

Nicholas asked what the panelists thought of the regulation being implemented? David said that regulators face the same difficulty that risk managers face, in that nobody notices when you took sensible action to protect against a risk that didn't occur. He thinks that regulation of the markets is justified and necessary.

Sanjay said that in the airline and pharmaceutical industries regulatory approval was on the whole very robust, but that those regulators were dealing with approving designs (aeroplanes and drugs) that are reproduced once approved. He said that such levels of regulation in financial services were not yet possible due to the constant innovation found in the markets, and he wanted regulation to be more dynamic and responsive to market developments. Sanjay also joined those in the industry that are critical of the sheer size of Dodd-Frank.

Nicholas said that Adam was obviously keen on operational issues and wondered what plumbing in the industry he would change? Adam said that he is a big fan of automation but that operational risks are real and large. He thought that there were too many rules and regulations being applied, and that the regulators were not paying attention to the type of markets they want in the future, nor to the effects of current regulation and how people were moving from one part of the industry to another. Adam said that in relation to Knight Capital he was still a strong advocate of standing by the wall socket, ready to pull the plug on the computer. Adam suggested that regulators should look at regulating/approving software releases (I assume here he means for key tasks such as automated trading or risk reporting, not all software).

Given the large number of students present, Nicholas closed the panel by asking what career advice the panelists had for future risk managers? Adam emphasised flexibility in role, taking us through his career background as an equity derivatives and then fixed income trader before coming into risk management. Adam said it was highly unlikely over your career that you would stay with one role or area of expertise. 

Hilmar said that having risk managers independent of trading was vitally important for the industry. He thought there were many areas to work in, with operational risk being potentially the largest, but still with plenty more to do in market risk, compliance and risk modelling. He added that understanding the interdependencies between risks was key and an area for further development.

When asked by Nicholas, David said that risk managers should have a career path right through to CEO of an institution. He wanted to encourage risk management as a necessary level above risk measurement and control. He was excited about the potential of Big Data technologies to help in risk management. David gave some interesting background on his own career, initially as an emerging markets debt trader. He said that it is important to know yourself, and that he regarded himself as a sceptic, needing all the information available before making a decision. As such his performance as a trader was consistent but not as high as some, and this became one of the reasons he moved into risk management.

Sanjay said many of the systems used in finance are 20 years old, in complete contrast with the advances in mobile and internet technologies. As such he thought this was a great opportunity to be involved in the replacement and upgrading of this older infrastructure. Apparently one analyst had estimated that $65B will be spent on risk management over the next 4-5 years.

Adam thought that there was a need for code of ethics for quants (see old post for some ideas). Sanjay added that the industry needed to move away from being involved primarily in attempting to optimise activity around gaming regulation. When asked by Nicholas about Basel III, Adam thought that improved regulation was necessary but Basel III was not the right way to go about it and was way too complex.


Posted by Brian Sentance | 1 February 2013 | 2:41 pm


Chartis Research - Data Management for Risk White Paper

New white paper on data management for risk from the analyst firm Chartis Research, including a section on how Xenomorph's TimeScape solution addresses some of the key issues identified.

Posted by Brian Sentance | 22 January 2013 | 3:14 pm


Alpha, Regulation and Party

Quick thank you to all those who came along to Xenomorph's New York Holiday Party at the Classic Car Club. Below is an extract from the talk given by Paul Rowady of Tabb Group at the event, followed by my effort and some photographs from the evening.

 ************************************************************

There Is No Such Thing as Alpha Generation

The change in perspective caused by a subtle change in language can galvanize your approach to data, the tools you select, and even the organizational culture. That said, ‘alpha generation’ is a myth; there is only alpha discovery and capture.

By E. Paul Rowady, Jr.

We live in an age of superlatives: unprecedented market complexity and uncertainty caused, in part, by an unprecedented regulatory onslaught and unprecedented economic extremes. As a result, there is an unprecedented focus on risk analysis – and an unprecedented (and anxious) search for new sources of performance from all market demographics.

The big data era is here and will only become the bigger data era. What we need is a new perspective. But fostering such a new perspective may be as subtle as performing a little linguistic jujitsu.

Our business – trading and investment in capital and commodity markets around the globe – has a history of being cavalier or too casual about language; particularly how certain labels, terms or vernacular are used to describe the business and the markets. Some of this language is intentional – the use of certain terminology creates mystique, fosters mythology, manufactures a sense of complexity that only a select group of savants can tame -- particularly when it comes to activities around quantitative methods. And some of it is just plain laziness, stretching the use of labels far beyond their original meaning on the idea that these terms are close enough.

I have become increasingly sensitive to this phenomenon over the years. Call it an insatiable need to simplify complexity, bring order to chaos, to enhance a level of accuracy and precision in how we describe what we do and how we do it. I find that precision of language does impact how complex technical topics are communicated, understood and absorbed. It turns out, language impacts perspective – and perspective impacts strategy and tactics.

So let’s gain a little perspective on alpha generation and alpha creation...(full extract can be found on the TabbFORUM)

PaulSpeechBehind

Paul in full speech mode at the Classic Car Club

  ***********************************************************

Big thanks to Paul for the above talk. Here is my follow-up:

  ************************************************************

Thanks Paul for a great talk, certainly I agree that people, process, technology and data are key to the future success of financial markets. In particular, I think attitudes towards data must change if we are to meet the coming challenges over the next few years. For example, in my view data in financial markets is analogous to water:

  • Everyone needs it
  • Everyone knows where to get it
  • Nobody likes to share it
  • Nobody is 100% sure where it was really sourced from
  • Nobody is quite sure where it goes to
  • Nobody knows its true cost
  • Nobody knows how much is wasted
  • Everyone assumes it is of high quality
  • And you only ever know it has gone bad after you have drunk it.
  • (I should add, that if you own water you are also very wealthy, so wealthy your neighbor might even consider robbing you)

The problem of siloed data and data integration remains, but this is as much a political problem as a purely technical one. People need to share data more, and I wonder (I hope) whether, as the “social network” generation comes through, attitudes will improve, though I guess this will also add different pressures to data aggregators as people become less hung up about sharing information. The focus needs to be on the data that business folks need, and should be less about the type of the data or the technical means by which it is captured, stored and distributed – for sure these are important aspects, but we need to involve more people in realizing this cult of data.

And just as Paul has issues with the over-use of “Alpha”, I promise this will be the only time this evening I will mention “Big Data” but today I heard the best description so far of what big data is all about, which is  “Big data is like watching the planet develop a nervous system”. Data is fundamental to all of our lives and we are living through some very interesting times in terms of how much data is becoming available and how we make sense of it.

So, a change of tack. When moving to the New York area a few years back, one of my fellow Brits said that you will find the Americans a lot friendlier than the English, but don’t talk to them about politics or religion. So rules are meant to be broken, and religion aside I thought I would briefly have to mention the recent election as one of the big differences between the UK and the USA.

Firstly, wow you guys know how to have long elections. I think the French get theirs done in two weeks but even the Brits do it in a month. A few things struck me from the election: I don’t know whether the Democratic Party is generally supportive of legalizing drugs, but I think we can be certain that President Obama spent some time in the states of Colorado and Washington prior to the first debate. 

And I hear from the New Yorker that the Republicans are trying a radical new approach to broaden the demographic of their supporter base, apparently to make it inclusive of people who are strong believers in “maths and science”.

Moving on from a light-hearted look at elections but sticking with the government theme, regulation is obviously very high profile at the moment. To some degree this is understandable, as financial markets have been doing a great job of keeping a low profile with:

  • JPMorgan $7B London Whale
  • Barclays and the Libor rigging
  • Standard Chartered and Iranian money laundering
  • Knight Capital with the biggest advertisement in history for automated trading
  • ING feeling it was missing out on things with Cuba and Iranian money
  • HSBC helping Mexican drug lords to move the money around
  • Capital One deceiving its customers
  • Peregrine Financial Group deceiving the regulators (generating alpha?)

All these occurred in 2012, when it seems that the dust had barely settled over MF Global and UBS. So it is possible to understand the reaction of people and politicians to what has gone on and the need for more stable capital markets, but my biggest concern is that there is simply too much regulation, and complex systems with complex rules are a great breeding ground for the law of unintended consequences. To illustrate how over time we humans, and in particular governments, seem to be regressing in terms of using more words to describe ever more complex behaviours, I found the following list online:

  • Pythagoras' Theorem 24 words
  • Lord's Prayer 66 words
  • Archimedes' Principle 66 words
  • Ten Commandments 179 words
  • Gettysburg Address 286 words
  • Declaration of Independence 1300 words
  • US Govt sale of cabbage 26,991 words

Dodd-Frank is about 2,300 pages, which apparently is going to spawn some 30,000 pages of rules – that is enormous. Listening to a regulator speak last week, he said the regulators had about 10,000 pages done, 10,000 in progress and 10,000 not even started yet. Worse than this, he added that regulators were not trying to shape the financial markets of the future but rather dealing only with the current issues. Regulators should take their lead from quantum physics in my view, as soon as you observe something it is changed. Financial markets are complex, and making them even more complex through overlaying complex rules is not going to result in the stability that we all desire. 

Anyway, thanks for coming along this evening and I hope you have a great time. Quick thank you to our clients and partners without whom we would not exist. Thanks to the hard work our staff put in over the year, but in particular thanks to Naj and Xenomorph's NYC team for organizing this evening's event.

   ************************************************************

 Some photographs from the event below. Big thanks to NandoVision for some of the images:

Xenomorph 12-5-2012 8-36-055

Clients, partners and staff catch up over a drink or three

 

Xenomorph 12-5-2012 6-35-007

Ted Pendleton of Numerix and Paul Rowady of Tabb Group earlier on in the evening

 

WaiterSurprise

This waiter had a pleasant interruption in service prior to the fashion show by Hillary Flowers

 

Xenomorph 12-5-2012 8-28-044

Jim Beck talks with PRMIA NYC members: Qi Fu, Sol Steinberg and Don Wesnofske

 

MeGirlOther

Cass Almendral, Hillary Flowers and Brian later at the bar

 

BalletRedWhiteCar

Not sure how this ballet-themed dress works in a convertible?

 

PaulRussMark

Russ Glisker and Mark O'Donnell talk cars with Paul

 

GirlBlackPorsche

A far more practical outfit for this Porsche

 

PaulGirls

Some of the fashion models rush to discuss the finer points of Alpha Harvesting with Paul...

  ***********************************************************

Thanks again to all involved in putting the party together and for everyone who came along on the night. If I don't get round to another post over the Holiday Season, then best wishes for a fantastic break and a great start to 2013.

Posted by Brian Sentance | 19 December 2012 | 12:48 am


Big Data - Integration still a key (external) issue

Good breakfast event from SAP and the A-Team last Thursday morning. SAP have been getting (and I guess paying for) a lot of good air-time for their SAP Hana in-memory database technology of late. Domenic Iannaccone of SAP started the briefing with an introduction to big data in finance and how their SAP/Sybase offerings knitted together. He started his presentation with a few quotes, one being "Intellectual property is the oil of the 21st century" by Mark Getty (he of Getty Images, but also of the Getty oil family) and "Data is the new oil" by both Clive Humby and Gerd Leonhard (not sure why two people were quoted saying the same thing, but anyway).

For those of you with some familiarity with the Sybase IQ architecture of a year or two back, in this architecture SAP Hana seems to have replaced the in-memory ASE database that worked in tandem with Sybase IQ for historical storage (I am yet to confirm this, but hope to find out more in the new year). When challenged on how Hana differs from other in-memory database products, Domenic seemed keen to emphasise its analytical capabilities and not just the database aspects. I guess the big data angle of bringing the "data closer to the calculations" was his main differentiator here, but with more time a little more explanation would have been good.

Pete Harris of the A-Team walked us through some of the key findings of what I think is the best survey I have read so far on the usage of big data in financial markets (free sign-up needed I think, but you can get a copy of the report here). Some key findings from a survey of staff at ten major financial institutions included:

  • Searching for meaning in unstructured data was a leading use case that came to mind when thinking of big data (Twitter trading etc)
  • Risk management was seen as a key beneficiary of what the technologies can offer
  • Aggregation of data for risk was seen as a key application area concerning structured data.
  • Both news feeds and (surprisingly?) text documents were key unstructured data sources being processed using big data.
  • In trading, news sentiment and time series analysis were key areas for big data.
  • Creation of a system wide trade database for surveillance and compliance was seen as a key area for enhancement by big data.
  • Data security remains a big concern with technologists over the use of big data.

There were a few audience questions - Pete clarified that there was a more varied application of big data amongst sell-side firms, and that on the buy-side it was being applied more to KYC and related areas. One of the audience made the point that he thought a real challenge, beyond the insight gained from big data analysis, was how to translate it into value from an operational point of view. There seemed to be a fair amount of recognition that regulators and auditors want a full audit trail of what has gone on across the whole firm, so audit was seen as a key area for big data. Another audience member suggested that the lack of a rigid data model in some big data technologies enabled greater flexibility in the scope of questions/analysis that could be undertaken.

Coming back to the key findings of the survey, one question I asked Pete was whether or not big data is a silver bullet for data integration. My motivation was that the survey and much of the press you read talk about how big data can pull all the systems, data and calculations together for better risk management, but while I can understand how massively scalable data and calculation capabilities are extremely useful, I wondered how exactly all the data gets pulled together from the current range of siloed systems and databases where it currently resides. Pete suggested that this was still a problematic area where Enterprise Application Integration (EAI) tools were needed. Another audience member added that politics within different departments was not making data integration any easier, regardless of the technologies used.

Overall a good event, with audience interaction unsurprisingly being the most interesting and useful part.

Posted by Brian Sentance | 3 December 2012 | 2:12 pm


PRMIA on Basel III, Volcker and the Fed

Just wanted to start this post with a quick best wishes to all affected by Hurricane Sandy in the New York area. Nature is an awesomely powerful thing and amply demonstrated that it is always to be respected as a "risk".

Good event on regulatory progress organised by PRMIA and hosted by Credit Suisse last night. Dan Rodriguez introduced the speakers and Michael Gibson of the Fed began with his assessment of what he thinks regulators have learned from the crisis. Mike said that regulators had not paid enough attention to the following factors:

  • Capital
  • Liquidity
  • Resolvability (managing the failure of a financial institution without triggering systemic risk) 

Capital - Mike said that regulators had addressed the quality and quantity of capital held by banks. With respect to Basel III, Mike said that the Fed had received around 2,500 comments that they were currently reviewing. In relation to supervision, he suggested that stress testing by the banks, the requirement for capital planning from banks and the independent stress tests undertaken by the regulators had turned the capital process into much more of a forward-looking exercise than it had been pre-crisis. The ability of regulators to limit dividend payments and request capital changes had added some "teeth" to this forward looking approach. Mike said that the regulators are getting more information which is allowing them to look more horizontally across different financial institutions to compare and contrast business practices, risks and capital adequacy. He thought that disclosure to the public of stress testing results and other findings was also a healthy thing for the industry, prompting wider debate and discussion.

Liquidity - Mike said that liquidity stress testing was an improvement over what had gone before (which was not much). He added that the Basel Committee was working on a quantitative liquidity ratio and that in general regulators were receiving and understanding much more data from the banks around liquidity.

Resolvability - Mike said in addition to resolution plans (aka "living wills") being required by Dodd-Frank in the US, the Fed was working with other regulators internationally on resolvability.

There then followed a Q&A session involving the panelists and the audience:

Basel III Implementation Timeline - Dan asked Mike about the 2,500 comments the Fed had received on Basel III and when the Fed expected to have dealt with them, particularly given that compliance with Basel III for US banks had been delayed beyond Jan 1 2013. Dan additionally asked Mike whether implementing Basel III now was a competitive advantage or disadvantage for a bank.

Mike responded that the Fed had extended its review period from 90 days to 135 days, which was an unusual occurrence. He said that as yet the Fed had no new target date for implementation.

Brian of AIG on Basel III and Regulation -  Dan asked Brian Peters of AIG what his thoughts were on Basel III. Brian was an entertaining speaker and responded firstly that AIG was not a bank, it was an insurer and that regulators need to recognise this. He said regulators need to think of the whole financial markets and how they want them to look in the future. Put another way, he implied that looking at capital, liquidity and resolvability in isolation was fine at one level, but these things had much wider implications and without taking that view then there would be problems. 

Brian said he thinks of Basel III as a hammer, and that when people use a hammer everything starts to look like a "nail". He said that insurers write 50 year-long liabilities, and as a result he needs long term investments to cover these obligations. He added that the liquidity profile of insurers was different to banks, with life policies having exposures to interest rates more like bank deposits. He said that AIG was mostly dealing with publicly traded securities (I guess now AIG FP is no longer dominant?). Resolvability was a different process for insurers, with regulators forcing troubled insurers to limit dividends and build up cash reserves.

Brian's big concern for the regulators was that in his view they need to look at the whole financial system and what future they want for it, rather than dealing with one set of players and regulations in isolation. Seems Brian shares some similar concerns to Pierre Guilleman on applying banking regulation to the insurance industry, combined with the unintended consequences of current regulation on the future of the whole of financial markets (maybe the talk on diversity of approach is a good read on this, or maybe more recently "Regulation Increases Risk" for a more quantitative approach).

Steve of Credit Suisse on Basel III - Dan asked Steven Haratunian whether implementing Basel III was a competitive advantage or disadvantage for Credit Suisse. Steve said that regardless of competitive advantage, as a Swiss bank Credit Suisse had no choice in complying with Basel III by Jan 1 2013; Credit Suisse had started its preparations in 2011 and had been Basel 2.5 compliant since Jan 1 2012. He said that Basel III compliance had effectively doubled their capital requirements, and had prompted a strategic review of all business activities within the investment banking arm.

This review had caused a reassessment of the company's involvement in areas such as fixed income, and risk weighted assets had been reduced by over $100 billion. Steven explained how they had looked at each business activity and assessed whether it could achieve a 15% return on equity over a business cycle, plus be able to withstand CCAR stress testing during this time. He said that Credit Suisse had felt lonely in the US markets, in that there were many occasions where deals were lost due directly to consideration of Basel III capital requirements. Credit Suisse felt less lonely now given how regulation is affecting other banks, and for certain markets (notably mortgages and credit) the effects of Basel III were very harsh.

Volcker Rule and Dodd-Frank - Dan asked Mike where did the Volcker Rule fit within Dodd-Frank, and does it make us safer? Mike didn't have a great deal to say on this, other than he thought it was all part and parcel of Congress's attempts to make the financial markets safer, that its implementation was being managed/discussed across an inter-agency group including the Fed, SEC and CFTC. Brian said that Dodd-Frank did not have a great deal of impact for insurers, the only real effects being some on swap providers to insurers. 

Steve said that many of the aspects or the "spirit" of Volcker and Dodd-Frank had been internalised by the banks and were progressing despite Dodd-Frank not being finalised. He said that in particular the lack of certainty around extraterritoriality and margining in derivatives was not helpful. Mike added that in terms of progressing through Dodd-Frank, his estimate was that the Fed had one third of it finished, one third of the rules proposed, and one third not started or in very early stages. So still some work to be done.

Living Wills - Brian at this point referred to a recent speech by William C. Dudley of the Fed with the title "Solving the Too Big to Fail Problem" (haven't looked at this yet, but will). Mike said that the Fed was still learning in relation to "Living Wills" and eventually it will get down to a level of being very company specific. Brian asked whether this meant that "Living Wills" would be very specific to each company and not a general rule to be applied to all. Mike said it was too early to tell.

Extraterritoriality - On extraterritoriality Steve said that Credit Suisse was having to look at its subsidiaries globally more as standalone companies when dealing with regulators and capital requirements, which will greatly increase capital requirements if the portfolio effect of being a global company is not considered by regulators. Dan mentioned a forthcoming speech to be made by Dan Tarullo of the Fed, and mentioned how the Fed was looking at treating foreign subsidiaries operating in the US as bank holding companies, not global subsidiaries, hence again causing problems by ignoring the portfolio effect. Mike said that the regulators were working on this issue, and that unsurprisingly he couldn't comment on the speech Dan Tarullo had yet to make.

The Future Shape of the Markets - Brian brought up an interesting question for Mike in asking how the regulators wanted to see financial markets develop and operate in the future? Brian thought that current regulation was being implemented as almost the "last war" against financial markets without a forward looking view. He said that historically he could see Basel 1 being prompted by addressing some of the issues caused by Japanese banks, he saw Basel II addressing credit risk but what will the effects of Basel III ultimately be? 

This prompted an interesting response from Mike, in that he said that the Fed is not shaping markets and is dealing only with current rules and risks. He added that private enterprise would shape future markets. (difficult to see how that argument stacks up, regulation implemented now is surely not independent of private sector reaction/exploitation of it) Steve added that Basel III had already had effects, with Credit Suisse already reducing its activity in mortgage and fixed income markets. Steve said that non-banking organisations were now involved in these markets and that regulators have to be aware of these changes or face further problems. 

Did Regulators Fail to Enforce Existing US Regulation - one audience participant was strongly of the opinion that Basel III is not needed, that there was enough regulation in place to limit the crisis and that the main failing of the regulators was that they did not implement what was already there to be used. Mike said he thought that the regulators did have lessons to learn and that some of the regulation then in place needed reviewing.

Keep it Simple - another audience member asked about the benefits of simple regulation of simpler markets and mentioned an article by Andrew Haldane of the Bank of England on "The Dog and the Frisbee". Mike didn't have much to add on this other than saying it was a work in progress. 

Brian thought that the central failure behind the crisis was the mis-rating of credit instruments, with AAA products attracting a 4bp capital charge instead of a more realistic 3%.

Regulation's Effects on Market Pricing - Steve was the first to respond on this, pointing to areas such as CMBS and credit markets as being the best performing areas that also have lower capital risk weights. Dan said he felt that equity markets had not fully adjusted yet, and ironically that financial equities had the highest risk weights. Combined with anticipated rises in tax, high risk weightings were taking capital out of the risk bearing/wealth generating parts of the economy and into low weighted instruments like US Treasuries. Dan wondered whether regulation was one of the key dampening factors behind why the current record stimulus was not accelerating the US economy more quickly.

Derivatives Clearers and Clearing - this audience question asked how the regulators were dealing with the desire to encourage clearing of derivative trades whilst at the same time not incentivising the banks to set themselves up as clearers. Mike said that there was an international effort to look at this.

What Happens When the Stimulus Goes - an audience member asked what the panel thought would happen once the stimulus was removed from the markets. The panelists thought this was more an economics question. However Dan said that the regulators were now more sensitive to the markets and market participants when considering new stimulus measures, and cited problems in the fall of 2011 caused by Fed actions in the market crushing mortgage spreads. Brian said insurers need yield so the stimulus was obviously having an impact. Dan mentioned that given the low risk weighting of US Treasuries then everyone was holding them, and so the impact of a jump in rates would hurt many if it came without preparation.

Wine Shortage and Summary - Just had to mention that there was no wine made available at the networking session afterwards. A sign of austere times or simply that it was too early in the week? Anyway it was a great discussion and raised some good points. In summary, all I hear still supports the premise that the "Law of Unintended Consequences" is ever-present, ever-powerful and looming over the next few years. Hearing regulators say that they are dealing with current risks only and are not shaping the future of financial markets smacks of either delusion or obfuscation to me. 


Posted by Brian Sentance | 28 November 2012 | 6:22 pm


Apex for Interactive (Reference) Data

Launch event for Interactive Data's new reference data service Apex on Wednesday night, hosted at Nasdaq in Times Square and introduced by Mark Hepsworth. Apex looks like a good offering, combining multi-asset data access, batch file and on-demand API requests from the same data store, plus hosted data management services, and a flexible licensing/distribution/re-distribution model.

Some good speakers at the event. Larry Tabb ran through his opinions on the current market, starting with regulation. He painted a mixed picture, starting with the continuing exit by investors from equity mutual funds, offset to some degree by rapid growth in ETF assets (54% growth over the past 3 years to $1,200 billion). Obviously events such as the Flash Crash, Libor, the London Whale and Knight Capital have not increased investors' confidence in markets either.

On regulation he first cited the sheer amount of regulation being attempted at the moment going through systemic risk/too big to fail, Dodd-Frank, Volcker, derivatives regulation, Basel III etc. Of particular note he mentioned some concerns over whether there is simply enough collateral around in the market given increased capital requirements and derivative regulation (a thought currently shared by the FT apparently in this article).

Given the focus of the event, Larry unsurprisingly mentioned the foundational role of data in meeting the new regulatory requirements, which for the next few years he believes will be focussed on audit and the ability to explain and justify past decisions to regulators. Also given the focus of the event, Larry did not mention his recent article on the Tabb Forum on federated data management strategies which I would have been interested to hear Interactive's comments on, particularly given their new hosted data management offerings. (You can find some of our past thoughts here on the option of using federated data.)

Mike Atkin of the EDM Council was next up and described a framework for what he thought was going on in the market. In summary, he split the drivers for change into business and regulatory, and categorised the changes into:

  • Transparency
  • Systemic Risk
  • Capital and Liquidity
  • Clearing and Settlement
  • Control and Enforcement

He then said that the fundamental challenge with data was to go through the chain of identifying things, describing them, classifying/aggregating them and then finally establishing linkages. He ended this part of his presentation with the three aspects he thought necessary to sort this out: industry data standards, methods of best practice and having infrastructure in place to enable these changes.

Mike then went on to recount a conversation he had had with a hedge fund manager, who had defined the interesting concept of a "Data Risk Equation":

N x CC x S / (Q x V)

where:

  • N: the number of variables
  • CC: a measure of calculation complexity
  • S: the number of data sources needed
  • Q: a measure of quality
  • V: a measure of verifiability

I think the angle was that the hedge fund manager was simply using a form of the above to categorise and compare the complexity of some of the data issues his firm was dealing with - a rough sketch of what such a comparison might look like is below.
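
For what it's worth, here is a minimal sketch of how such a score might be computed and compared across calculations. The function name, scoring scales and both example data sets are entirely hypothetical - purely to illustrate the shape of the equation, not anything the hedge fund manager actually runs:

```python
# Hypothetical "Data Risk Equation" score: N x CC x S / (Q x V).
# Scales (1-5 for complexity, 0-1 for quality/verifiability) are my own assumptions.

def data_risk_score(n_variables, calc_complexity, n_sources, quality, verifiability):
    """Higher score = more data risk in producing the number."""
    return (n_variables * calc_complexity * n_sources) / (quality * verifiability)

# Two illustrative calculations to compare
cva_desk = data_risk_score(n_variables=500, calc_complexity=5, n_sources=8,
                           quality=0.6, verifiability=0.4)
equity_var = data_risk_score(n_variables=200, calc_complexity=2, n_sources=3,
                             quality=0.9, verifiability=0.8)

print(f"CVA desk score:   {cva_desk:,.0f}")    # ~83,333
print(f"Equity VaR score: {equity_var:,.0f}")  # ~1,667
```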

Aram Flores of Deutsche Bank then talked briefly. Of note was his point that the new regulation was forcing DB to use more external rather than internal data, since regulation now restricted the use of internal data within regulatory reporting. Sounds like good news for Interactive and some of its competitors. Eric Reichenberg of SS&C GlobeOp then gave a quick talk on the importance of accurate data to his derivative valuation services. The talks ended with a well-prepped conversation between Marty Williams and one of their new Apex clients, who jokingly referred to one of the other well-known data vendors as the Evil Empire, which raised a few smiles - fortunately the speaker didn't start to choke at this point, so obviously Darth Vader wasn't spying on the proceedings...

So overall a good event, new product offering looks interesting, speakers were entertaining and the drinks/food/location were great. 

Posted by Brian Sentance | 26 October 2012 | 2:22 pm


The Missing Data Gap

Getting to the heart of "Data Management for Risk", PRMIA held an event entitled "Missing Data for Risk Management Stress Testing" at Bloomberg's New York HQ last night. For those of you who are unfamiliar with the topic of "Data Management for Risk", then the following diagram may help to further explain how the topic is to do with all the data sets feeding the VaR and scenario engines.

Data-Flow-for-Risk-Engines
I have a vested interest in saying this (and please forgive the product placement in the diagram above, but hey this is what we do...), but the topic of data management for risk seems to fall into a functionality gap between: i) the risk system vendors, who typically seem to assume that the world of data is perfect and that the topic is too low level to concern them, and ii) the traditional data management vendors, who seem to regard things like correlations, curves, spreads, implied volatilities and model parameters as too business domain focussed (see previous post on this topic). As a result, the risk manager is typically left with ad-hoc tools like spreadsheets and other analytical packages to perform data validation and filling of any missing data found. These ad-hoc tools are fine until the data universe grows larger, leading to the regulators becoming concerned about just how much data is being managed "out of system" (see past post for some previous thoughts on spreadsheets). A sketch of the kind of check that typically ends up in those spreadsheets follows below.
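
As an aside, much of the validation being pushed into those spreadsheets is conceptually simple. The sketch below is a minimal, hypothetical example (plain Python, with made-up field layout and thresholds) of the sort of gap and staleness checks a risk team might run on a price history before it feeds a VaR engine - not a description of any particular vendor's tooling:

```python
# Minimal sketch of the sort of data validation often done "out of system".
# The input shape (list of (day_number, price) pairs) and thresholds are
# illustrative assumptions only.

def validate_price_history(history, max_gap_days=3, max_stale_run=5):
    """history: list of (day_number, price) sorted by date; returns a list of issues."""
    issues = []
    stale_run = 0
    for (d_prev, p_prev), (d_curr, p_curr) in zip(history, history[1:]):
        if d_curr - d_prev > max_gap_days:
            issues.append(f"gap of {d_curr - d_prev} days before day {d_curr}")
        if p_curr is None or p_curr <= 0:
            issues.append(f"missing or invalid price on day {d_curr}")
        elif p_curr == p_prev:
            stale_run += 1
            if stale_run == max_stale_run:
                issues.append(f"price unchanged for {stale_run + 1} observations ending day {d_curr}")
        else:
            stale_run = 0
    return issues

# Example: a 10-day hole and a long flat run both get flagged
series = [(1, 100.0), (2, 100.5), (3, 100.5), (13, 100.5), (14, 100.5),
          (15, 100.5), (16, 100.5), (17, 100.5), (18, 101.0)]
for issue in validate_price_history(series):
    print(issue)
```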

The Crisis and Data Issues. Anyway, enough background and on to some of the issues raised at the event. Navin Sharma of Western Asset Management started the evening by saying that pre-crisis people had a false sense of security around Value at Risk, and that the crisis showed that data is not reliably smooth in nature. Post-crisis, questions obviously arise around how much data to use, how far back to go and whether you include or exclude extreme periods like the crisis. Navin also suggested that the boards of many financial institutions were now much more open to reviewing scenarios put forward by the risk management function, whereas pre-crisis their attention span was much more limited.

Presentation. Don Wesnofske did a great presentation on the main issues around data and data governance in risk (which I am hoping to link to here shortly...)

Issues with Sourcing Data for Risk and Regulation. Adam Litke of Bloomberg asked the panel what new data sourcing challenges were resulting from the current raft of regulation being implemented. Barry Schachter cited a number of Basel-related examples. He said that the costs of rolling up loss data across all operations were prohibitive, and hence there were data truncation issues to be faced when assessing operational risk. Barry mentioned that liquidity calculations were new and presenting data challenges. Non centrally cleared OTC derivatives also presented data challenges, with initial margin calculations based on stressed VaR. Whilst on the subject of stressed VaR, Barry said that there were a number of missing data challenges, including the challenge of obtaining past histories and of modelling current instruments that did not exist in past stress periods. He said that it was telling on this subject that the Fed had decided to exclude tier 2 banks from stressed VaR calculations, on the basis that they did not think these institutions were in a position to be able to calculate these numbers given the data and systems that they had in place.

Barry also mentioned the challenges of Solvency II for insurers (and their asset managers) and said that this was a huge exercise in data collection. He said that there were obvious difficulties in modelling hedge fund and private equity investments, and that the regulation penalised the use of proxy instruments where there was limited "see-through" to the underlying investments. Moving on to UCITS IV, Barry said that the regulation required VaR calculations to be regularly reviewed on an ongoing basis, and he pointed out one issue with much of the current regulation in that it uses ambiguous terms such as models of "high accuracy" (I guess the point being that accuracy is always arguable/subjective for an illiquid security).

Sandhya Persad of Bloomberg said that there were many practical issues to consider, such as exchanges that close at different times and the resultant misalignment of closing data, problems dealing with holiday data across different exchanges and countries, and the sourcing of factor data for risk models from analysts. Navin expanded more on his theme of which periods of data to use. Don took a different tack, emphasising the importance of getting the fundamental data of client-contract-product in place, and suggested that this was still a big challenge at many institutions. Adam closed the question by pointing out the data issues in everyday mortgage insurance as an example of how prevalent data problems are.

What Missing Data Techniques Are There? Sandhya explained a few of the issues she and her team face at Bloomberg in making decisions about what data to fill. She mentioned the obvious issue of distance between missing data points and the preceding data used to fill them. Sandhya mentioned that one approach to missing data is to reduce factor weights down to zero for factors without data, but this gives rise to a data truncation issue. She said that there were a variety of statistical techniques that could be used; she mentioned adaptive learning techniques and then described some of the work that one of her colleagues had been doing on maximum-likelihood estimation, whereby in addition to achieving consistency with the covariance matrix of "near" neighbours, the estimation also had greater consistency with the historical behaviour of the factor or instrument over time.
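
To make the flavour of this concrete, below is a deliberately simplified stand-in for the kind of "near neighbour" fill being described - a plain regression of the gappy series on a correlated, complete series, with entirely made-up data. It is not the maximum-likelihood approach described above, just an illustration of using a neighbour's behaviour to propose values for the holes:

```python
# Simplified "near neighbour" fill: regress the gappy series on a correlated,
# complete series and use the fitted relationship to propose missing values.
# Illustrative stand-in only - not the MLE technique described above.

def fill_from_neighbour(target, neighbour):
    """target: list with None for missing points; neighbour: complete list of same length."""
    pairs = [(x, y) for x, y in zip(neighbour, target) if y is not None]
    xs = [x for x, _ in pairs]
    ys = [y for _, y in pairs]
    mean_x = sum(xs) / len(xs)
    mean_y = sum(ys) / len(ys)
    beta = (sum((x - mean_x) * (y - mean_y) for x, y in pairs)
            / sum((x - mean_x) ** 2 for x in xs))
    alpha = mean_y - beta * mean_x
    return [y if y is not None else alpha + beta * x
            for x, y in zip(neighbour, target)]

# Example: fill two missing returns in a thinly traded bond from a correlated index
index_returns = [0.2, -0.1, 0.4, 0.3, -0.2]
bond_returns  = [0.1, -0.05, None, 0.15, None]
print(fill_from_neighbour(bond_returns, index_returns))
# -> roughly [0.1, -0.05, 0.2, 0.15, -0.1] for this made-up data
```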

Navin commented that fixed income markets were not as easy to deal with as equity markets in terms of data, and that at sub-investment grade there is very little data available. He said that heuristic models were often needed, and suggested that there was a need for "best practice" to be established for fixed income, particularly in light of guidelines from regulators that are at best ambiguous.

I think Barry then made some great comments about data and data quality in saying that risk managers need to understand more about the effects (or lack of them) that input data has on the headline reports produced. The reason I say great is that I think there is often a disconnect or lack of knowledge around the effects that input data quality can have on the output numbers produced. Whilst regulators increasingly want data "drill-down" and justification of any data used to calculate risk, it is still worth understanding more about whether output results are greatly sensitive to the input numbers, or whether related aspects such as data consistency ought to have more emphasis than, say, absolute price accuracy. For example, data quality was being discussed at a recent market data conference I attended and only about 25% of the audience said that they had ever investigated the quality of the data they use. Barry also suggested that you need to understand for what purpose the numbers are being used and what effect the numbers have on the decisions you take. I think here the distinction was around usage in risk, where changes/deltas might be more important, whereas in calculating valuations or returns price accuracy might receive more emphasis.

How Extensive is the Problem? General consensus from the panel was that the issue's importance needed to be understood more (I guess my experience is that the regulators can make data quality important for a bank if they say that input data issues are the main reason for blocking approval of an internal model for regulatory capital calculations). Don said that any risk manager needed to be able to justify why particular data points were used, and there was further criticism from the panel of regulators asking for high quality without specifying what this means or what needs to be done.

Summary - My main conclusions:

  • Risk managers should know more of how and in what ways input data quality affects output reports
  • Be aware of how your approach to data can affect the decisions you take
  • Be aware of the context of how the data is used
  • Regulators set the "high quality" agenda for data but don't specify what "high quality" actually is
  • Risk managers should not simply accept regulatory definitions of data quality and should join in the debate

Great drinks and food afterwards (thanks Bloomberg!) and a good evening was had by all, with a topic that needs further discussion and development.


Posted by Brian Sentance | 16 October 2012 | 2:21 pm


Bankenes Sikringsfond Selects Xenomorph's TimeScape for Faster Data Analysis and High-Quality Decision Support

Just a quick note to say that we have signed a new client, Bankenes Sikringsfond, the Norwegian Banks’ Guarantee Fund. They will be using TimeScape to fulfill requirements for a centralised analytics and data management platform. The press release is available here for those of you who are interested.

Posted by Sara Verri | 11 October 2012 | 9:50 am


The Financial Regulatory Tide: In or Out?

If you have ever wandered around the financial district in New York, then you may not have noticed the Museum of American Finance on the corner of Wall and William St. I tend to find there are lots of things I don't notice in New York, probably due to the fact that I am still doing a passable impression of a tourist and find myself looking ever upwards at the skyscrapers rather than at anything at ground level. Anyway MoAF is worth a look-in and, having recently become a member (thanks Cognito Media!), I went along to one of their events last night on regulation. Richard Sylla was the moderator for the evening, with support from Hugh Rockoff, Eugene N. White and Charles Geisst.

Richard Sylla on Fractional Reserve Banking and Regulation

Richard started the evening by explaining some basics of bank balance sheets as a means for explaining why he feels banking needs regulation. He showed a simplified and conservative balance sheet for an example bank:

Liabilities

  • Deposits 85% (from the likes of you or I)
  • Capital 15% (shareholders including surpluses)

Assets

  • Earning Assets 80% (loans and investments)
  • Reserves 20% (cash and deposits at other banks/central banks)

Richard explained that the main point to note from the balance sheet was that the reserves did not match the deposits, and hence there is not enough money to repay all the depositors if they asked for their money back all at once. Richard's example was a form of Fractional Reserve Banking and he explained that there were two main reasons why banking needs regulation. The first was the incentive for banks to reduce their reserves to increase profits (increasing the risk to depositors) and the second was the incentive to keep capital levels low in order to increase earnings per share.

He then went on to illustrate how at the time of the last crisis Fannie Mae and Freddie Mac had earning assets of 100%, reserves 0%, deposits of 96% and capital of 4%. Lehman and Bear Stearns both had zero reserves and capital of only 3%. He then went on to list a large number of well known financial institutions and showed how the equity of many was simply wiped out given falls in asset valuations, the lack of reserves and the very small levels of equity maintained.
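To make the arithmetic concrete, here is a minimal sketch of the point (my own illustration, simply applying the percentages quoted above to a notional $100 balance sheet): with thin capital and few reserves, even a modest fall in earning-asset values wipes out the shareholders' equity.

```python
# Toy balance-sheet arithmetic: assets fall, deposits/debt stay fixed,
# and whatever is left over belongs to the shareholders.
def equity_after_loss(earning_assets, reserves, deposits, asset_fall):
    """Return remaining equity after earning assets fall by asset_fall."""
    assets = earning_assets * (1 - asset_fall) + reserves
    return assets - deposits

# Richard's "conservative" example bank (per $100 of assets)
print(equity_after_loss(80, 20, 85, 0.10))   # 10% asset fall: equity 15 -> 7
# Lehman/Bear-style: ~zero reserves, ~3% capital, ~97% debt-like funding
print(equity_after_loss(100, 0, 97, 0.05))   # 5% asset fall: equity already negative
```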

Hugh Rockoff on Adam Smith and Banking Regulation

Hugh is apparently a big fan of free market economics and of Adam Smith in particular. Much as Smith is for the "Invisible Hand" of the free market and against regulation, Hugh was at pains to point out that even Smith thought of banking being a special case in need of regulation and referred to banking operations as "a sort of waggon-way through the air".

Apparently Smith lived through a banking crisis in 1772 involving the Ayr Bank - I think Hugh had misspelt this as "Air", which I am not sure was deliberate but it made for some reasonable humour about the value of the notes issued by the bank. Apparently this was an international crisis involving many of the then major powers, was based on stock market and property speculation and indirectly led to the Boston Tea Party, so I guess many Americans should pay their respects to this failed bank that became a catalyst to the formation of their country. A key point to note was that the shareholders of the Ayr Bank were subject to unlimited liability and had to pay all obligations owing...not sure how that would go down today in our more enlightened (?) times but more of that later.

Hugh described how Smith thought there were many things that banks should not be allowed to do, including investing in real estate (!) and prohibitions on the "option" to repay monetary notes. Smith also suggested that the Government should set maximum interest rates. So for a free market thinker, Smith had some surprising ideas when it came to banking. Hugh also pointed out that another great free-marketeer, Milton Friedman, was also in favour of banking regulation and favoured both deposit insurance and 100% reserve banking.

Eugene White on Regulatory History

At a guess I would say that Eugene is a big fan of the quote from Mark Twain that "History does not repeat itself, but it does rhyme". Eugene took us briefly through major financial regulations in American history such as the National Banking Act of 1864, the Federal Reserve Act of 1913, the "New Deal" of 1932 and others. He notably had a question mark around whether Dodd-Frank was going to be a major milestone in regulatory history, as in his opinion Dodd-Frank treats the symptoms and not the causes of the last financial crisis. Eugene spent some time explaining the cycle of regulation where governments go through stages of:

-> Regulation ->
-> Problems caused by Regulation->
-> De-regulation ->
-> Financial Crisis ->
-> back to Regulation ->

Charles Geisst on Dodd-Frank and the Volcker Rule

Charles started by saying that he thought Dodd-Frank, and in particular the Volcker Rule, might well still be being debated three years hence. As others have done, he contrasted the 2,300 pages of Dodd-Frank with the simplicity of the 72 pages of the Glass-Steagall Act. He believes that the Volcker Rule is Glass-Steagall by another name, and that Wall St has only recently realised this is the case and has begun the big push back against it.

He left the audience with the sobering thought that he thinks another financial crisis is needed in order to cut down Dodd-Frank from 2300 pages of instructions for regulators to put regulations in place to around 150 pages of meaningful descriptions of the kinds of things that banks can and cannot do. 

Audience Questions

Rules vs. Principles - One audience member wondered if the panel thought it better to regulate in terms of the fiduciary duties of the participants rather than in detailed rules that can be "worked around". Charles responded that he thought fiduciary duties were better, and contrasted the strictness with which banking fraud has been treated in the USA with the relative lack of punishment and sentencing in the securities industry. Eugene added that the "New Deal" of 1932 took away the unlimited liability of bank shareholders, and with it the incentive for shareholders to monitor the risks being taken by the banks they own.

Basel Regulations - Another audience member wanted panel feedback on Basel. In summary the panel said that the Basel Committee got it wrong in thinking it knew for certain how risky certain asset classes were, for example thinking that a corporate bond from IBM was more risky than say an MBS or government debt.

Do Regulators deal with the Real Issues? - Charles again brought this question back to the desire for simplicity and clarity, something that is not found in Dodd-Frank in his view. Hugh mentioned that the USA has specific problems with simply the number of regulatory bodies, and contrasted this with the single regulator in Canada. He said he thought competition was good for businesses but bad for regulators.

Eugene and Charles put an interesting historical perspective on this question, in that it is more often the case that government and the finance industry work together in composing legislation and regulation. Eugene gave the example that in the financial crisis of the early 30s, banks that had combined both retail and investment banking operations had fared quite well. So why did Glass-Steagall come about? Apparently Senator Steagall wanted deposit insurance to help the myriad small banks back home, and Senator Glass simply wanted investment banks and retail banks to be separated, so a deal was done. I found this surprising (maybe I shouldn't be) but Glass-Steagall is put forward as good regulation, yet it seems it was not actually treating the observed symptoms of the crisis it was supposed to be dealing with.

How are the regulators dealing with Money Market Funds? - Here the panel said this was a classic example of the industry fighting the SEC because the proposed regulation would reduce the return on their operations. Eugene explained how MMFs resulted from the savings and loans industry complaining about depositors investing in T-Bills. So the government response was to increase the T-Bill denomination from $1,000 to $10,000 to limit who could invest, but this was then circumvented by the idea of setting up funds to invest in these larger denomination assets. Charles added that he thought the next crisis would come from the Shadow Banking system and that a more balanced approach needed to be taken to regulate across both systems. Hugh added that Dodd-Frank thinks it can identify systemically important institutions, and it would be his bet that the next crisis starts with an organisation that is below the radar and not on this list. The panel concluded with a brief discussion of pay and remuneration and said that this was a major problem that needed better solutions.

 

 



Posted by Brian Sentance | 3 October 2012 | 3:20 pm


We Can’t Upgrade, the Data Model’s Changed!

New article with some of my thoughts on data models, interfaces and software upgrades has just gone up on the Waters Inside Reference Data site.

 

Posted by Brian Sentance | 11 September 2012 | 3:50 pm


OIS Discounting and Curve Management - Presentation Materials

Just a quick note to say that the video, presentations and supporting documents have now gone up for our recent Wilmott event with Numerix on OIS Curves and Libor in New York. Somewhat topical at the moment given the current bad press for Barclays.

Posted by Brian Sentance | 29 June 2012 | 1:20 pm


Front to back office data management

Some recent thoughts in Advanced Trading on turning data management on its head, and how to extend data management initiatives from the back office into both risk management and the front office.

Posted by Brian Sentance | 22 June 2012 | 1:17 pm


Paris Financial Information Summit 2012

I attended the Financial Information Summit event on Tuesday, organized in Paris by Inside Market Data and Inside Reference Data.

Unsurprisingly, most of the topics discussed during the panels focused on reducing data costs, managing the vendor relationship strategically, LEI and building sound data management strategies.

Here is a (very) brief summary of the key points touched which generated a good debate from both panellists and audience:

Lowering data costs and cost containment panels

  • Make end-users aware of how much they pay for that data so that they will have a different perspective when deciding if the data is really needed or a "nice to have"
  • Build a strong relationship with the data vendor: you work for the same aim and share the same industry issues
  • Evaluate niche data providers who are often more flexible and willing to assist while still providing high quality data
  • Strategic vendor management is needed within financial institutions: this should be an on-going process aimed at improving contract management for data licenses
  • A centralized data management strategy and consolidation of processes and data feeds allow cost containment (something that Xenomorph have long been advocating)
  • Accuracy and timeliness of data is essential: make sure your vendor understands your needs
  • Negotiate redistribution costs to downstream systems

One good point was made by David Berry, IPUG-Cossiom, on the acquisition of data management software vendors by the same data providers (referring to the Markit-Cadis and PolarLake-Bloomberg deals) and stating that it will be tricky to see how the two business units will be managed "separately" (if kept separated...I know what you are thinking!).

There were also interesting case studies and examples supporting the points above. Many panellists pointed out how difficult it can be to obtain high quality data from vendors and that only regulation can actually improve the standards. Despite the concerns, I must recognize that many firms are now pro-actively approaching the issue and trying to deal with the problem in a strategic manner. For example, Hand Henrik Hovmand, Market Data Manager, Danske Bank, explained how Danske Bank are in the process of adopting a strategic vendor system made of 4 steps: assessing the vendor, classifying the vendor, deciding what to do with the vendor and creating a business plan. Vendors are classified as strategic, tactical, legacy or emerging. Based on this classification, the "bad" vendors are then evaluated to verify whether they are enhancing data quality. This vendor landscape is used both internally and externally during negotiation, and Hovmand was confident it will help Danske Bank to contain costs and get more for the same price.

I also enjoyed the panel on building a sound data management strategy where Alain Robert-Dauton of Sycomore Asset Management was speaking. He highlighted how asset managers, in particular smaller firms, are now feeling the pressure of regulators but at the same time are less prepared to deal with compliance than larger investment banks. He recognized that asset managers need to invest in a sound risk data management strategy and supporting technology, with regulators demanding more details, reports and high quality data.

As for what was said on the LEI, it seems most financial institutions are still unprepared for how it should be implemented, due to the uncertainty around it, but I refer you to an article from Nicholas Hamilton in Inside Reference Data for a clear picture of what was discussed during the panel.

Looking forward, the panellists agreed that the main challenge is and will be managing the increasing volume of data. Though, as Tom Dalglish affirmed, the market is still not ready for the cloud, given that not much has been done in terms of legislation. Watch out!

The full agenda of the event is available here.

Posted by Sara Verri | 14 June 2012 | 4:54 pm


Federal Reserve beats the market (at ping pong...)

Thanks to all those who came along and supported "Ping Pong 4 Public Schools" at the AYTTO fund raiser event at SPiN on Wednesday evening. Great evening with participants in the team competition from the TabbGroup, Jeffries Investment Bank, Toro Trading, MissionBig, PolarLake, AIG, Mediacs, Xenomorph and others. In fact the others included the Federal Reserve, who got ahead of the market and won the team competition...something which has to change next year! Additional thanks to SPiN NYC for hosting the event, and to Bonhams for conducting the reverse auction.

Some photographs from the event below:

Photo

Ben Nisbet of AYTTO trying to make order out of chaos at the start of the team competition...

 

Photo

One of the AYTTO students - glad none of us had to play her, we would have got whupped...

 

Photo

The TabbGroup strike a pose and look optimistic at the start of the evening...

 

Photo

Sidney, one of the AYTTO coaches, helping us all to keep track of the score...

 

Photo

This team got a lot of support from the audience, no idea why...

 

Posted by Brian Sentance | 8 June 2012 | 8:19 pm


OIS Discounting and Curve Management - Wilmott Event - Thursday 31st May

Quick plug for Xenomorph's Wilmott Forum Event on OIS curves tomorrow in downtown Manhattan. The event is done in partnership with Numerix, and will be looking at the issue of OIS vs. Libor discounting from the point of view of a practitioner, a financial engineer and a systems developer. You can register for the event here, and we hope to see you at 6pm for some great talks and some drinks/socialising afterwards.

Posted by Brian Sentance | 30 May 2012 | 1:07 pm


TabbFORUM Video: Data Is Revenue

Video interview with Paul Rowady of the Tabb Group, primarily about how data management can break out from being just a back office function and become a source of competitive advantage in both the front office and in risk management.

For those of you with a curious mind, the perseverance to watch the video until the end and possibly not such advanced years as me and Paul, the lead singer of Midnight Oil that he refers to at the close of the video is Peter Garrett, who looks like this:

Peter garrett

Whereas I look like this:

G7Q40383 square

See, completely different. Obviously Peter has a great choice in hairstyle though...

Posted by Brian Sentance | 30 May 2012 | 12:22 pm


The hedge fund fraud is in the residuals

Good Quafafew event in NYC this week, with Michael Markov of MPI on "Hedge Fund Replication: Methods, Challenges and Benefits for Investors". To cut a relatively long but enjoyable presentation short, Michael presented some interesting empirical evidence about hedge fund performance.

Firstly, he showed how many (most) hedge fund styles were able to deliver performance that had a better risk/return profile than many mainstream investment portfolios, obviously including the ubiquitous 60% in equity / 40% in bonds strategy. Given this relative outperformance in terms of risk and return for many hedge fund styles, Michael put forward the idea that asset managers seeking to invest in hedge funds should take more interest in indices of hedge funds than is currently the case. 

For a particular hedge fund style, an index delivering performance better than 50% of the managers is actually quite good, particularly when he showed that the index's risk level was better than approximately 75% of the hedge funds within each class. Also, when you look at the performance over longer time periods (rolling 3 years say), an index outperformed many more of the funds in a particular investment style (sounds like a bit of the advantage of geometric vs. arithmetic averaging at work somewhere in this to me).

As an aside, he said that most hedge fund replication products do not mention tracking error and often instead talk about near perfect correlation with the hedge fund index being replicated. He was at pains to point out that it is possible to construct portfolios with near perfect correlation that have massive tracking errors, and so investors in these products should be aware of this marketing tactic (or failing, depending on your viewpoint).
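This point is easy to demonstrate with a toy example (my own illustration, not Michael's): a leveraged version of an index is almost perfectly correlated with it yet tracks it very badly. The return series and leverage factor below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
index_returns = rng.normal(0.0005, 0.01, 1000)   # hypothetical daily index returns
replica_returns = 2.0 * index_returns + 0.0002   # a 2x leveraged "replica"

correlation = np.corrcoef(index_returns, replica_returns)[0, 1]
tracking_error = np.std(replica_returns - index_returns) * np.sqrt(252)

print(f"correlation:               {correlation:.4f}")   # essentially 1.00
print(f"annualised tracking error: {tracking_error:.2%}")  # very large
```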

Michael showed some good examples of how his system had replicated the performance of a particular hedge fund style index, and how this broadly uncovered what kinds of investments were being made by the hedge fund industry during each time period under consideration. He is already doing some work with some regulators on this, but most interestingly he showed how he took a few hedge funds that were later found to be involved in fraudulent activity, and worked backwards to find out what his system thought were the investments being made.

He then showed how, by taking the performance of the replicated fund away from the actual hedge fund results posted, the residual performance for these fraudulent funds was very large, and he implored investors in "stellar" performing hedge funds to do this analysis and really quiz the hedge fund manager on where this massive residual performance actually comes from before deciding to invest. In summary a good talk by an interesting speaker, which surprisingly for a New York Quafafew event was not interrupted too many times by questions from the hosts.



Posted by Brian Sentance | 10 May 2012 | 6:44 pm


Dragon Kings, Black Swans and Bubbles

"Dragon Kings" is a new term to me, and the subject on Monday evening of a presentation by Prof. Didier Sornette at an event given by PRMIA. Didier has been working on the diagnosis on financial markets bubbles, something that has been of interest to a lot of people over the past few years (see earlier post on bubble indices from RiskMinds and a follow up here).

Didier started his presentation by talking about extreme events and how many have defined different epochs in human history. He placed a worrying question mark over the European Sovereign Debt Crisis as to its place in history, and showed a pair of particularly alarming graphs of the "Perpetual Money Machine" of financial markets. One chart was a plot of savings and rate of profit for US, EU and Japan with profit rising, savings falling from about 1980 onwards, and a similar diverging one of consumption rising and wages falling in the US since 1980. Didier puts this down to finance allowing this increasing debt to occur and to perpetuate the "virtual" growth of wealth.

Corn, Obesity and Antibiotics - He put up one fascinating slide relating to positive feedback in complex systems and effectively the law of unintended consequences. After World War II, the US Government wanted to ensure the US food supply and subsidized the production of corn. This resulted in oversupply for humans -> so the excess corn was fed to cattle -> who can't digest starch easily -> who developed e-coli infections -> which prompted the use of antibiotics in cattle -> which prompted antibiotics as growth promoters for food animals -> which resulted in cheap meat -> leading to non-sustainable meat protein consumption and under-consumption of vegetable protein. Whilst that is a lot of things to pull together, ultimately Didier suggested that the simple decision to subsidise corn had led to the current epidemic in obesity and the losing battle against bacterial infections.

Power Laws - He then touched briefly upon Power Law Distributions, which are observed in many natural phenomena (city size, earthquakes etc) and seem to explain the peaked mean and long tails of distributions in finance far better than the traditional Lognormal distribution of traditional economic theory. (I need to catch up on some Mandelbrot I think). He explained that whilst many observations (city size for instance) fit a power law, there were observations that did not fit this distribution at all (in the cities example, many capital cities are much, much larger than a power law predicts). Didier then moved on to describe Black Swans, characterised as unknown unknowable events, occurring exogenously ("wrath of god" type events) and with one unique investment strategy of going long put options.
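For anyone who wants a feel for why the distinction matters, here is a minimal sketch (my own, with illustrative parameters rather than anything from the talk) comparing the chance of a very large event under a power law versus a lognormal with the same median - the power law produces far more extreme observations.

```python
import numpy as np

rng = np.random.default_rng(1)
alpha = 1.5                                        # power law (Pareto) tail exponent
pareto_sample = 1 + rng.pareto(alpha, 1_000_000)   # classical Pareto, support starts at 1

# Lognormal matched (roughly) to the same median
lognormal_sample = rng.lognormal(mean=np.log(np.median(pareto_sample)),
                                 sigma=1.0, size=1_000_000)

# Probability of an observation 100x the "typical" (median) size
threshold = 100 * np.median(pareto_sample)
print("P(X > 100x median), power law:", np.mean(pareto_sample > threshold))
print("P(X > 100x median), lognormal:", np.mean(lognormal_sample > threshold))
```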

Didier said that Dragon-Kings were not Black Swans, but the major crises we have observed are "endogenous" (i.e. come from inside the system), do not conform to a power law distribution and:

  • can be diagnosed in advance
  • can be quantified
  • have (some) predictability

Diagnosing Bubbles - In terms of diagnosing Dragon Kings, Didier listed the following criteria that we should be aware of (later confirmed as a very useful and practical list by one of the risk managers in the panel):

  • Slower recovery from perturbations
  • Increasing (or decreasing) autocorrelation
  • Increasing (or decreasing) cross-correlation with external driving
  • Increasing variance
  • Flickering and stochastic resonance
  • Increased spatial coherence
  • Degree of endogeneity/reflexivity
  • Finite-time singularities

Didier finished his talk by describing the current work that he and ETH are doing with real and ever-larger datasets to test whether bubbles can be detected before they end, and whether the prediction of the timing of their end can be improved. So in summary, Didier's work on Dragon Kings involves the behaviour of complex systems, how the major events in these systems come from inside (e.g. the flash crash), and how positive feedback and system self-configuration/organisation can produce statistical behaviour well beyond that predicted by power law distributions, and certainly beyond that predicted by traditional equilibrium-based economic theory. Didier mentioned how the search for returns was producing more leverage and an ever more connected economy and financial markets system, and how this interconnectedness was unhealthy from a systemic risk point of view, particularly if overlaid by homogenous regulation forcing everyone towards the same investment and risk management approaches (see the Riskminds post for some early concerns on this and more recent ideas from Baruch College).
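As a purely illustrative aside, two of the diagnostic criteria listed above - rising autocorrelation and rising variance - are straightforward to monitor on a rolling window of returns. The sketch below is my own simplification with hypothetical data, not the ETH methodology.

```python
import numpy as np

def rolling_warning_signs(returns, window=250):
    """Rolling lag-1 autocorrelation and variance of a return series."""
    autocorr, variance = [], []
    for end in range(window, len(returns) + 1):
        w = returns[end - window:end]
        autocorr.append(np.corrcoef(w[:-1], w[1:])[0, 1])  # lag-1 autocorrelation
        variance.append(np.var(w))
    return np.array(autocorr), np.array(variance)

rng = np.random.default_rng(7)
returns = rng.normal(0, 0.01, 2000)          # hypothetical daily returns
ac, var = rolling_warning_signs(returns)
print("latest lag-1 autocorrelation:", round(ac[-1], 3))
print("change in variance over the last year:", round(var[-1] - var[-252], 8))
```

A sustained upward drift in both measures across many markets at once would be the kind of early warning signal the panel described, rather than any single reading.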

Panel-Debate - The panel debate following was interesting. As mentioned, one of the risk managers confirmed the above statistical behaviours as useful in predicting that the markets were unstable, and that to detect such behaviours across many markets and asset classes was an early warning sign of potential crisis that could be acted upon. I thought a good point was made about the market post crash, in that the market's behaviour has changed now that many big risk takers were eliminated in the recent crash (backtesters beware!). It seems Bloomberg are also looking at some regime switching models in this area, so worth looking out for what they are up to. Another panelist was talking about the need to link the investigations across asset class and markets, and emphasised the role of leverage in crisis events. One of the quants on the panel put forward a good analogy for "endogenous" vs. "exogenous" impacts on systems (comparing Dragon King events to Black Swans), and I paraphrase this somewhat to add some drama to the end of this post, but here goes: "when a man is pushed off a cliff then how far he falls is not determined by the size of the push, it is determined by the size of the cliff he is standing on". 

 

 

Posted by Brian Sentance | 25 April 2012 | 3:10 pm


CVA - a business driver for breaking down asset silos

Xenomorph's analytics partner Numerix sponsored a PRMIA event at New York's Harvard Club this week on Credit Valuation Adjustment (CVA). The event also involved Microsoft, with a surprisingly relevant contribution to the evening on CVA and "Big Data" (I still don't feel comfortable losing the quotes yet, maybe soon...). Credit Valuation Adjustment seems to be the hot topic in risk management and pricing at the moment, with Numerix's competitor Quantifi having held another PRMIA event on CVA only a few months back. 

The event started with an introduction to CVA from Aletta Ely of JP Morgan Chase. Aletta started by defining CVA as the market value of counterparty credit risk. I am new to CVA as a topic, and my own experience of any kind of adjustment in valuation for an instrument was back at JP Morgan in the mid-90s (those of you under 30 are allowed to start yawning at this point...). We used to maintain separate risk-free curves (what are they now?) and counterparty spread curves, which would be combined to discount the cashflows in the model.

Whilst such an adjustment could be calibrated to come up with an adjusted valuation that would be better than having no counterparty risk modelled at all, it seems one of the key aspects of how CVA differs is that a credit valuation adjustment needs to be done in the context of the whole portfolio of exposures to the counterparty, and not in isolation instrument by instrument. The fact that a trader in equity derivatives is long exposure to a counterparty cannot be looked at in isolation from a short exposure to a portfolio of swaps with the same counterparty on the fixed income desk.

Put another way, CVA only has context if we stand to lose money if our counterparty defaults, and so an aggregated approach is needed to calculate the size of the positive exposures to the counterparty over the lifetime of the portfolio. Also, given this one-sided payoff aspect of the CVA calculation, instrument types such as vanilla interest rate swaps suddenly move from being relatively simple instruments that can be priced off a single curve to instruments that need optionality to be modelled for the purposes of CVA.
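To make that aggregation idea concrete, here is a minimal sketch of a CVA-style calculation (my own toy model, not anything presented on the night): simulate the netted portfolio value against a counterparty, keep only the positive part since we only lose when they owe us, and weight the resulting expected positive exposure by default probability and loss given default. The random-walk exposure model, the flat hazard rate and all the numbers are illustrative assumptions, and discounting is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(3)
n_paths, n_steps, dt = 10_000, 20, 0.25     # 5 years of quarterly steps
lgd, hazard = 0.6, 0.02                     # 60% loss given default, 2% p.a. hazard rate

# Toy model: netted portfolio value to the counterparty follows a random walk from zero MTM
shocks = rng.normal(0, 1.0, (n_paths, n_steps)) * np.sqrt(dt)
portfolio_value = np.cumsum(shocks, axis=1)

# Expected positive exposure per time step: average of max(value, 0) across paths
epe = np.maximum(portfolio_value, 0).mean(axis=0)

# Probability of default falling in each time step under a flat hazard rate
times = dt * np.arange(1, n_steps + 1)
default_prob = np.exp(-hazard * (times - dt)) - np.exp(-hazard * times)

cva = lgd * np.sum(epe * default_prob)
print(f"toy CVA charge: {cva:.4f} (in the same units as portfolio value)")
```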

So why has CVA become such a hot topic at the banks? Prior to the 2008/2009 crisis CVA was already around (credit risk has existed for a long time I guess, regardless of whether you regulate for it or report on it), but given that bank credit spreads were at that time consistently low and stable, CVA had minimal effects on valuations and P&L. Obviously with the advent of Lehmans this changed, and CVA has been pushed into prominence since it has directly affected P&L in a significant manner for many institutions (for example see these FT articles on Citi and JPMorgan).

A key and I think positive point for the whole industry is that CVA requires a completely multi-asset view, and given the regulatory focus on CVA and capital adequacy, as a result it will drive banks away from a siloed approach to data and valuation management. If capital is scarcer and more costly, then banks will invest in understanding both their aggregate CVA and the incremental contribution to CVA of a new trade in the context of all exposures to the counterparty. Looking at incremental CVA, you can also see that this drives investment into real or near-realtime CVA calculation, which brings me on to the next talks of the evening by Numerix on CVA calculation methods and a surprisingly good presentation on CVA and "Big Data" from David Cox of Microsoft.

Denny Yu of Numerix did a good job of explaining some of the methods of calculating CVA. In addition to being cross-asset, with all the implications that has for needing the ability to price anything, CVA is both data- and computationally-expensive. It requires both simulation of the scenarios for the default of counterparties through time, and the valuation of cross-asset portfolios at different points in time. Denny mentioned techniques such as American Monte Carlo to reduce the computation needed through using the same simulation paths for both default scenarios and valuation.

So on to Microsoft. I have seen some appalling presentations on "Big Data" recently, mainly from the larger software and hardware companies trying to jump on the marketing bandwagon (main marketing premise: the data problems you have are "Big"...enough said I hope). Surprisingly, David Cox of Microsoft gave a very good presentation around the computational challenges of CVA, and how technologies such as Hadoop take the computational power closer to the data that needs acting on, bringing the analytics and data together. (As an aside, his presentation was notably "Metro" GUI in style, something that seems to work well for PowerPoint where the slide is very visual and it puts more emphasis on the speaker to overlay the information). David was obviously keen to talk up some of the cloud technology that Microsoft is currently pushing, but he knew the CVA business topic well and did a good job of telling a good story around CVA, "Big Data" and Cloud technologies. Fundamentally, his pitch was for banks and other institutions to become "Analytic Enterprises" with a common, scalable and flexible infrastructure for data management and analysis. 

In summary it was a great event - the Harvard Club is always worth a visit (bars and grandiose portraits as expected but also barber shop in the basement and squash courts in the loft!), the wine afterwards was tolerably good and the speakers were informative without over-selling their products or company. Quick thank you to Henry Hu of IBM for transportation on the night, and thanks also to Henry for sending through this link to a great introductory paper on CVA and credit risk from King's College London. Whilst the title of the King's paper is a bit long and scary, it takes the form of dialogue between a new employee and a CVA expert, and as such is very readable with lots of background links.

 

 

 

Posted by Brian Sentance | 13 April 2012 | 1:56 pm


NoSQL - the benefit of being specific

NoSQL is an unfortunate name in my view for the loose family of non-relational database technologies associated with "Big Data". NotRelational might be a better description (catchy eh? thought not...), but either way I don't like the negatives in both of these titles, due to aesthetics and in this case because it could be taken to imply that these technologies are critical of SQL and the relational technology that we have all been using for years. For those of you who are relatively new to NoSQL (which is most of us), then this link contains a great introduction. Also, if you can put up with a slightly annoying reporter, then the CloudEra CEO is worth a listen to on YouTube.

In my view NoSQL databases are complementary to relational technology, and as many have said relational tech and tabular data are not going away any time soon. Ironically, some of the NoSQL technologies need more standardised query languages to gain wider acceptance, and there will be no guessing which existing query language will be used for ideas in putting these new languages together (at this point as an example I will now say SPARQL, not that should be taken to mean that I know a lot about this, but that has never stopped me before...)

Going back into the distant history of Xenomorph and our XDB database technology, then when we started in 1995 the fact that we then used a proprietary database technology was sometimes a mixed blessing on sales. The XDB database technology we had at the time was based around answering a specific question, which was "give me all of the history for this attribute of this instrument as quickly as possible".

The risk managers and traders loved the performance aspects of our object/time series database - I remember one client with a historical VaR calc that we got running in around 30 minutes on a laptop PC that was taking 12 hours in an RDBMS on a (then quite meaty) Sun Sparc box. It was a great example of how specific database technology designed for specific problems could offer performance that was not possible from more generic relational technology. The use of our database for these problems was never intended as a replacement for relational databases dealing with relational-type "set-based" problems though; it was complementary technology designed for very specific problem sets.

The technologists were much more reserved; some were more accepting and knew of products such as FAME around then, but some were sceptical over the use of non-standard DBMS tech. Looking back, I think this attitude was partly due to a desire to build their own vector/time series store, but also because, understandably (but incorrectly), they were concerned that our proprietary database would require specialist database admin skills. Not that the mainstream RDBMS systems were cheap or free of specialist maintenance then (Oracle DBA anyone?), but many proprietary database systems with proprietary languages can require expensive and on-going specialist consultant support even today.

The feedback from our clients and sales prospects - that our database performance was liked, but the proprietary database admin aspects were sometimes a sales objection - caused us to take a look at hosting some of our vector database structures in Microsoft SQL Server. A long time back we had already implemented a layer within our analytics and data management system where we could replace our XDB database with other databases, most notably FAME. You can see a simple overview of the architecture in the diagram below, where other non-XDB databases (and datafeeds) can be "plugged in" to our TimeScape system without affecting the APIs or indeed the object data model being used by the client:

TimeScape-DUL

Data Unification Layer
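To illustrate the idea (the class and method names below are hypothetical, not the actual TimeScape API), the pattern is essentially a single time-series interface that client code programs against, with each underlying store - the proprietary vector database, FAME, SQL Server - plugged in behind it.

```python
from abc import ABC, abstractmethod
from typing import List, Tuple

class TimeSeriesStore(ABC):
    """Interface every plugged-in database or datafeed must implement."""
    @abstractmethod
    def get_history(self, instrument: str, attribute: str) -> List[Tuple[str, float]]:
        ...

class XdbStore(TimeSeriesStore):
    def get_history(self, instrument, attribute):
        # would call the proprietary vector database here
        return [("2012-04-03", 101.2), ("2012-04-04", 100.7)]

class SqlServerStore(TimeSeriesStore):
    def get_history(self, instrument, attribute):
        # would query the SQL Server hosted vector structures here
        return [("2012-04-03", 101.2), ("2012-04-04", 100.7)]

def last_close(store: TimeSeriesStore, instrument: str) -> float:
    """Client code only ever sees the interface, never the store behind it."""
    return store.get_history(instrument, "Close")[-1][1]

print(last_close(XdbStore(), "VOD.L"))        # same client code,
print(last_close(SqlServerStore(), "VOD.L"))  # different database underneath
```

The point of the design is that the caller never changes when the store underneath does, which is what made the database swap described next possible without disturbing the APIs.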

Using this layer, we then worked with the Microsoft UK SQL team to implement/host some of our vector database structures inside of Microsoft SQL Server. As a result, we ended up with a database engine that maintained the performance aspects of our proprietary database, but offered clients a standards-based DBMS for maintaining and managing the database. This is going back a few years, but we tested this database at Microsoft with a 12TB database (since this was then the largest disk they had available), but still this contained 500 billion tick data records which even today could be considered "Big" (if indeed I fully understand "Big" these days?). So you can see some of the technical effort we put into getting non-mainstream database technology to be more acceptable to an audience adopting a "SQL is everything" mantra.

Fast forward to 2012, and the explosion of interest in "Big Data" (I guess I should drop the quotes soon?) and in NoSQL databases. It finally seems that, due to the usage of these technologies on internet data problems that no relational database could address, the technology community has much more willingness to accept non-RDBMS technology where the problem being addressed warrants it. I guess for me and Xenomorph it has been a long (and mostly enjoyable) journey from 1995 to 2012, and it is great to see a more open-minded approach being taken towards database technology and the recognition of the benefits of specific databases for (some) specific problems. Hopefully some good news on TimeScape and NoSQL technologies to follow in coming months - this is an exciting time to be involved in analytics and data management in financial markets and this tech couldn't come a moment too soon given the new reporting requirements being requested by regulators.

 

 

 

Posted by Brian Sentance | 4 April 2012 | 3:54 pm


The Semantics are not yet clear.

I went along to "Demystifying Financial Services Semantics" on Tuesday, a one day conference put together by the EDMCouncil and the Object Management Group. Firstly, what are semantics? Good question, to which the general answer is that semantics are the "study of meaning". Secondly, were semantics demystified during the day? - sadly for me I would say that they weren't, but ironically I would put that down mainly to poor presentations rather than a lack of substance, but more of that later.

Quoting from Euzenat (no expert me, just search for Semantics in Wikipedia), semantics "provides the rules for interpreting the syntax which do not provide the meaning directly but constrains the possible interpretations of what is declared." John Bottega (now of BofA) gave an illustration of this in his welcoming speech at the conference by introducing himself and the day in PigLatin, where all of the information he wanted to convey was contained in what he said, but only a small minority of the audience who knew the rules of Pig Latin understood what he was saying. The rest of us were "upidstay"...

Putting this more in the context of financial markets technology and data management, the main use of semantics and semantic data models seems to be as a conceptual data modelling technique that abstracts away from any particular data model or database implementation. To humour the many disciples of the "Church of Semantics", such a conceptual data model would also be self-describing in nature, such that you would not need a separate meta data model to understand it. For example take a look at, say, the equity example from what Mike Aitkin and the EDM Council have put together so far with their "Semantics Repository".
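For anyone who has not seen a semantic representation before, a minimal sketch of the "self-describing" point is below, using the Python rdflib library with made-up namespaces and property names (nothing here is taken from the actual Semantics Repository): the data and the statements about what the data means travel together as triples.

```python
from rdflib import Graph, Namespace, Literal, URIRef
from rdflib.namespace import RDF

# Hypothetical namespace standing in for an industry ontology
EX = Namespace("http://example.com/instrument-ontology#")

g = Graph()
g.bind("ex", EX)

# Each (subject, predicate, object) triple is a statement of meaning in itself
ibm_equity = URIRef("http://example.com/instruments/IBM-common-stock")
g.add((ibm_equity, RDF.type, EX.CommonShare))
g.add((ibm_equity, EX.isIssuedBy, URIRef("http://example.com/entities/IBM")))
g.add((ibm_equity, EX.hasTicker, Literal("IBM")))

print(g.serialize(format="turtle"))
```

There is no separate table schema to consult: the graph itself says what kind of thing the instrument is and how it relates to its issuer, which is the self-describing quality the semantics crowd are selling.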

Abstraction and self-description are not new techniques (OO/SOA design anyone?) but I guess even the semantic experts are not claiming that all is new with semantics. So what are they saying? The main themes from the day seem to be that Semantics:

  • can bridge the gaps between business understanding and technology understanding
  • can reduce the innumerable transformations of data that go on within large organisations
  • is scaleable and adaptable to change and new business requirements
  • facilitates greater and more granular analysis of data
  • reduces the cost of data management
  • enables more efficient business processes

Certainly the issue of business and technology not understanding each other (enough) has been a constant theme of most of my time working in financial services (and indeed is one of the gaps we bridge here at Xenomorph). For example, one project I heard of a few years back was where an IT department had just delivered a tick database project, only for the business users to find that it did not cope with stock splits and for their purposes was unusable for data analysis. The business people had assumed that IT would know about the need for stock split adjustments, and as such had never felt the need to explicitly specify the requirement. The IT people obviously did not know the business domain well enough to catch this lack of specification.

I think there is a need to involve business people in the design of systems, particularly at the data level (whilst not quite a "semantic" data model, the data model in TimeScape presents business objects and business data types to the end user, so both business people and technologists can use it without showing any detail of an underlying table or physical data structure). You can see a lot of this around with the likes of CADIS pushing its "you don't need a fixed data model" ETL/no-datawarehouse type approach against the more rigid (and to some, more complete) data models/datawarehouses of the likes of Asset Control and GoldenSource. You also get the likes of Polarlake pushing its own semantic web and big data approach to data management as a next stage on from relational data models (however I get a bit worried when "semantic web" and "big data" are used together, sounds like we are heading into marketing hype overdrive, warp factor 11...)

So if Semantics is to become prevalent and deliver some of these benefits in bringing greater understanding between business staff and technologists, the first thing that has to be addressed is that Semantics is a techy topic at the moment, one that would cause drooping eyelids on even the most technically enthused members of the business. Ontology, OWL, RDF, CLIF are all great if you are already in the know, but guaranteed to turn a non-technical audience off if you are trying to understand (demystify?) Semantics in financial markets technology.

Looking at the business benefits, many of the presenters (particularly vendors) put forward slides where "BAM! Look at what semantics delivered here!" was the mantra, whereas I was left with a huge gap in seeing how what they had explained actually translated into the benefits they were shouting about. There needed to be a much more practical focus to these presentations, rather than semantic "magic" delivering a 50% reduction in cost with no supporting detail of just how this was achieved. Some of the "magic" seemed to be that there was no unravelling of any relational data model to effect new attributes and meanings in the semantic model, but abstracting away from relational representation has always been a good thing if you want to avoid collapsing under the weight of database upgrades, so nothing too new there I would suggest, but maybe a new approach for some.

So in summary I was a little disappointed by the day, especially given the "Demystifying" title, although there were a few highlights, with Mike Bennett's talk on FIBO (Financial Instruments Business Ontology) being interesting (sorry to use the "O" word). The discussion of the XBRL success story was also good, especially how regulators mandating this standard had enforced its adoption, and how from that adoption many end consumers were then doing more with the data, enhancing its adoption further. In fact the XBRL story seemed to be a model for how regulators could improve the world of data in financial markets, through the provision and enforcement of the data semantics to be used with each new reporting requirement as it is mandated. In summary, a mixed day and one in which I learned that the technical fog that surrounds semantics in financial markets technology is only just beginning to clear.

 

Posted by Brian Sentance | 15 March 2012 | 2:58 pm


Risk models and tools at Baruch College

Emanuel Derman gave the last presentation of the day on mathematical models and their role in financial markets. His presentation seemed to build on some of his earlier ideas with Paul Wilmott on the "Modeller's Manifesto".

Emanuel said that the idea that there was a "scandal based on models" is wrong; models did (and do) have their faults but they were not a root cause of the crisis. He started his presentation (somewhat "tongue in cheek") by putting forward a "Theory of Deliciousness" to see how one might arrive at the value of something being more or less delicious. This involved discussion of "realised deliciousness" and "expected or implied deliciousness", plus definitions around equally (relatively) delicious things and absolute deliciousness. See this post on FT Alphaville for more background, but fundamentally by analogy Emanuel was putting across that there is no "fundamental theory of finance" and that finance is not physics.

He said that economists do not know the difference between theorems and laws. He seemed to be critical of some recent work from Andrew Lo (see recent post) on putting together a "Complete Theory of Human Behaviour" for once again attempting to codify something that is uncodifiable.

Emanuel described how economists should be more aware of what is and isn't a:

  • Metaphor - using something physical/tangible to represent a less tangible concept or idea. See this link for his interesting example on sleep/life and debt interest
  • Model - extending the behaviour of one thing to another. A model aircraft is a very useful model of a full-size aircraft, with known inputs and useful outputs of interest. We can try to model the weather, where the inputs are known (temperature, wind etc) but the model is hard to define. In finance it is hard to really see what the inputs are, and what the outputs are too.
  • Theory - the ultimate non-metaphor. Here he gave the example of Moses asking the burning bush who shall I say sent me to which God replies "I am what I am". Put another way, you can't ask why on a theory, it just is.
  • Intuition - a premise put forward based neither on logical progression nor on experimentation.

Emanuel said that in Finance there is no absolute value theory, and the majority of models are relative value in nature. From a common sense point of view, the world is not a model. Things change dynamically and in this way effectively all models are wrong to some degree. In summary all financial models are short volatility.

He ended his presentation by saying that nature cares more about principles than regulations (prescriptive regulators beware I guess). His parting quote was by Edward Lucas who said "If you believe that capitalism is a system in which money matters more than freedom, you are doomed when people who don’t believe in freedom attack using money."

Panel Debate

Some highlights:

  • Bruno Dupire of Bloomberg said that it was important that a financial product was aligned with the needs of the customer, and cited certain complex products (with triggers) as being more in the interests of the vendor not the customer.
  • Bruno also said that the hedgeability of a product was also key to a more stable financial system (presumably pointing at products like CDO^3 etc). He said that residual risk (that left after hedging with simpler products) should be measured and costed for. Bruno also mentioned the problems with assessing long term volatility, where traders will try to set this input to whatever best suits their own P&L.
  • Leo Tilman said that risk management needs to be a decision-support discipline and not a policing function. He later suggested that risk managers should have to work as consultants for a while to understand that they get paid for serving the needs of the customer, not just stopping all activity/risks (in fairness to risk managers, I guess they might ask who is my customer? the trader? the CEO? the firm?).
  • Dilip Madan added to the models debate by saying "what is not in the assumptions will not show up in the conclusions".
  • Emanuel likes the old GS partner model for banking, and mentioned the example of Brazilian banks where banks/banking staff(?) did not enjoy limited liability. Dilip said he understood the advantage of this but no limited liability would stifle entrepreneurship.
  • Leon Tatevossian said that post-crisis the relationship between risk managers and traders is better than before, and that there was also greater co-operation between empiricists and modelers. Leo added that risk managers and traders need to speak the same language and understand what each other means by "risk".
  • Bruno said that models were much less of a problem than leverage.
  • All seemed to agree that the tools were not invalidated by the crisis, but the framework in which they are used was the important thing.

 

 

 

Posted by Brian Sentance | 11 February 2012 | 8:09 pm


Regulatory Dis-Harmony at Baruch College

Roberta Romano gave her presentation in the second session of the morning, putting forward her ideas that what was needed was greater regulatory dis-harmony rather than world-wide harmonisation. Fundamentally she argued that this diversity of approaches in different regulatory regimes would minimise the impact of regulatory error (since it would confine the error to less of the system) and it would provide a test bed for ideas so that it could be seen what regulations work and what do not.

Certainly there is some basis for this idea from others in the industry (see post on Pierre Guilleman's concerns on the impact of Solvency II) and I first heard the idea of diversity in financial services put forward by Avinish Persaud at Riskminds a few years back (see post).

Roberta spent a good amount of the presentation putting forward how the process of putting this diverse regulation in place would work, with individual regimes applying to the Basel Committee, putting forward why they wanted to deviate from Basel III and justifying how such a desired deviation would not increase systemic risk. The Basel Committee would then have a short time frame for approval (say 3 months) and the burden of proof would be placed on the Committee to show that the deviation was a detrimental one. She also described how some of the home-host regulatory conflicts would be dealt with under her proposed process.

I thought that the overall aims of her proposal were sound (diversity leading to a more robust financial system) but the process would, I would suggest, be difficult to implement and very open to regulatory arbitrage (both by banks and by countries seeking to boost their own economies). Roberta did touch on this, but my biggest criticism was that if one of the benefits was that for a while such a diverse system would demonstrate which regulations work and which do not, then logically everyone would eventually converge on the regulations that work, re-harmonising regulations and reducing diversity. This convergence would then introduce its own (potentially new?) risks and you would be back to where you started.

Panel Debate

A few points from the panel debate following the presentation:

  • There was more criticism of how Basel regulations were gamed by the banks, particularly in relation to optimising Risk Weighted Assets
  • One member of the panel pointed out that non-Basel US banks faired better in the crisis than those subject to Basel
  • Rodgin Cohen suggested that RWA should receive more focus rather than the level of the capital charge (echoing the previous panel session).
  • Rodgin was highly critical of the cutbacks in funding for regulators in the US
  • Rodgin also said that London owed its standing as the leading world financial centre to the US Congress (referring to the Eurobond market and Sarbanes-Oxley)
  • Regulators should never forget that the "Law of Unintended Consequences" rules

 

 

Posted by Brian Sentance | 11 February 2012 | 6:00 pm


Systemic Risk at Baruch College

Baruch College hosted the Capco-sponsored "Institute Paper Series in Applied Finance" on Thursday. I assume this is a further follow-up event to the one they did at NYU Poly last year (see some notes here). I have put some notes together below; my apologies in advance to the speakers for any inaccuracies or omissions in putting my thoughts together:

Systemic Risk Presentation

First part of the day started with a presentation by Viral V. Acharya of Stern on systemic risk. I have always found systemic risk an interesting topic, given the puzzle of how you dis-incentivise an organisation from increasing risks in the wider financial system when the organisation itself will not directly (or wholly) face the consequences of this "external" risk increase.

Viral started his presentation with some great jokey graphics, one of the HQ of a bank going up in flames with firemen hosing the flames with banknotes rather than water. He mentioned the definition of systemic risk given by Daniel Tarullo, Governor of the Federal Reserve (I couldn't find the definition, but there is a primer paper here). He asked how Lehman was allowed to fail when the likes of Fannie Mae, Freddie Mac, AIG, Merrills, CitiGroup, Morgan Stanley, Goldman Sachs, Washington Mutual and Wachovia were not, and were offered assistance in one way or another. He said there was not enough capital in the system to stop Lehman's failure, but that he saw Lehmans as the catalyst for the recapitalisation of the American banking system, not the cause. He later implied that Europe had so far lacked such a catalyst for action in the European banking system.

Viral said that he wanted to put forward an ex-ante regulation that would force a bank to retain additional capital to account for the systemic risk it produced. He said that the banking system was obviously much safer than it had been a few years back, but suggested that whilst the system could now withstand say the failure of a large organisation such as Citigroup, in his opinion it would struggle to survive the failure of Citigroup and a Euro default happening at the same time. Viral said that the current Dodd-Frank regulation on systemic risk was not a healthy one, in that if a large institution fails, banks with capitalisation of over $50B are jointly taxed to assist with the consequences of the failure. Viral viewed this as a big dis-incentive for a healthy bank (say a JPM) to step in to purchase the failing institution before the failure, as JPM would know that it would be taxed anyway on the bailout. 

In Viral's model, he defined a crisis as a 40% market correction, and assumed that non-equity liabilities are repaid at face value in such a crisis. Given there is not much real data around for a 40% correction, he used data obtained from observed 2% correction events, then extrapolated from the 2% to the 40% level. He said that the question that needed to be asked was whether in such a crisis scenario a bank like JPM would retain 8% capital. He emphasised that the level of capital chosen was somewhat arbitrary; rather more important were the assumptions in the model of the crisis, since the capital models used in regulation today are based on average losses, not crisis-level losses. Using this and related models, Viral showed that the banks exhibiting the most systemic risk were Bank of America, JPM and Citigroup (for more background and a complete list see Stern's V-Lab).
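A minimal sketch of the general idea (my own toy version, not Viral's actual model): take a bank's observed sensitivity to small market moves, extrapolate it linearly to a 40% crisis-level correction, and check whether equity stays above the 8% hurdle. All the numbers and the linear scaling are illustrative assumptions.

```python
def stressed_capital_ratio(assets, equity, sensitivity, market_fall):
    """Capital/assets ratio after equity absorbs the extrapolated crisis loss."""
    # Linear extrapolation: loss = sensitivity to market moves x size of correction x assets
    equity_loss = sensitivity * market_fall * assets
    stressed_equity = equity - equity_loss
    stressed_assets = assets - equity_loss
    return stressed_equity / stressed_assets

# Hypothetical bank: $2,000bn assets, $150bn equity, asset sensitivity of 0.10 to the market
ratio = stressed_capital_ratio(assets=2000, equity=150,
                               sensitivity=0.10, market_fall=0.40)
print(f"capital ratio in a 40% correction: {ratio:.1%}")  # well below an 8% hurdle
```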

Viral said the restructuring of Dexia (exposed heavily to peripheral sovereign debt) was the "Bear Stearns of Europe" (exposed heavily to peripheral MBS), but that its restructuring was not large enough to cause a more widespread re-capitalisation of the European banking system. Dexia was ranked as one of the safest banks in the Europe-wide stress tests of 2011, given that the Basel risk weightings did not apply any haircut to European sovereign debt. This was another criticism that Viral levelled at Basel, in that the risk weightings are static and do not reflect changes in market conditions.

Panel Debate

Viral then joined a panel debate on systemic risk chaired by Linda Allen of Baruch, joined by Jan Cave of the FDIC, Sean Culbert of Capco, Gary Gluck of Credit Suisse and Craig Lewis of the SEC. I have tried to bring out some of the main themes/points of the discussions below:

- The Balance Between Risk to the System and Risk to the Economy

There was a lot of debate on the secondary effects of regulating systemic risk and increasing capital charges on banks, and its wider effect on the general economy. Craig put forward the argument that too high capital requirements would stifle lending and in turn stifle the wider economy (arguably the "bigger" systemic risk maybe?). He argued for a balance to be found and that the aim should not be to eliminate risk in the system completely. I guess Craig was taking the banker's view, but the rest of the panel seemed to agree that the point was a valid one.

- Basel III

All agreed that Basel III was an improvement but there was still much more to be done. Gary was critical of the Basel III calculation remaining too static, but Jason described how Basel III had removed many debt-like assets from the capital calculation, which was a good thing. Jason also described how Basel I had been a simple framework (and good for that) but was then tinkered with, with VaR encouraging assets to be moved to the trading book to reduce capital charges. Basel II then introduced the Internal Model approach, and over ten years capital requirements continued to be lowered, with CDOs attracting a capital charge of 56bp during this time, down from 8%. Enforcement of Basel III on both liquidity risk and capital was considered key for coming years.

- Liquidity Risk

There was general consensus that pre-2007 liquidity risk was not talked about enough and there were no standard ways of calculating its level. Jason said that pre-2007 the regulators had not modelled what happens when the counterparties start running. Gary said that he questioned whether some of the current calibrations of liquidity risk were correct.

- Volcker

Sean raised the point that Volcker was likely to impact market-makers and hence impact liquidity (see earlier post on this).

- Rehypothecation

Sean also mentioned that Rehypothecation of Assets has not been debated enough and had only received scant attention in Dodd-Frank (maybe see recent article on Thomson-Reuters on MF Global)

- Europe (and more Basel)

General consensus that Basel III capital requirements will constrain GDP growth in Europe. Viral seemed to have the strongest views here, saying that Europe needed a bank recapitalisation program just as the US had gone through, and that such a program would be a big boost to economic confidence. Viral remains deeply sceptical on the success of Basel III - for example all of the 2007 failures were supposedly well capitalised institutions under Basel I and II. Viral says that the problem is not the level of capital (8% or 12% etc) but the method of modelling the shock. A good point from Gary, I thought, was his premise that politics in relation to sovereign debt was playing its part in undermining the calculations and approach of Basel III.

- Too Big to Fail?

One audience question was "is too big to fail simply too big?" and should the largest organisations be broken up into more manageable parts. Viral answered that he was not in favour of a size constraint and cited that some large institutions, notably JPM, Rabobank and HSBC, had been relatively robust and successful during the recent crisis. He did however qualify this response by saying that he was in favour of a size constraint if the large size reached was due to implicit banking guarantees from the government, and that he would like failing large banks to be broken up into smaller pieces.

 

Posted by Brian Sentance | 11 February 2012 | 5:18 pm

