Financial Markets Industry
Posts categorized "Regulation"
Numerix ran a great event on Thursday morning over at Microsoft's offices here in New York. "The Road to Achieving a Unified View of Risk" was introduced by Paul Rowady of the TABB Group. As at our holiday event last December, Paul was a great speaker, and trying to get him to stop talking is the main (positive) problem of working with him (his typical ebullience was also heightened by his appearance in the Wall Street Journal on Thursday, apparently involving nothing illegal, he assured me, and about which his mother even phoned him during his presentation...). Paul started by saying that in the end-of-year review with his colleagues Larry Tabb and Adam Sussman, he suggested that Tabb Group needed to put more into developing its risk management thought leadership, which had led to today's introduction and the work Tabb Group have been doing with Numerix.
Having been involved in financial markets in Chicago, Paul is very bullish about the risk management capabilities of the funds and prop trading shops of the exchange-traded options markets of old, and said that these risk management capabilities are now needed and indeed coming to the mainstream financial markets. Put another way, post crisis the need for a holistic view of risk has never been stronger. Considering bilateral OTC derivatives and the move towards central clearing, Paul said that he had been thinking that calculations such as CVA would eventually become as extinct as the dodo. However, on looking at some data from the DTCC trade repository, he found that there is still some $65 trillion notional of uncleared bilateral trades in the market, and that these will take a further 30 years to expire. Looking at swaptions alone, the notional uncleared was $6 trillion, so his point was that bilateral OTC trades and their associated risks will be around for some time yet.
Paul put forward some slides showing back, middle and front offices along different siloed business lines, and explained that back in the day when margins were fat and times were good, each unit could be run independently, with no overall view of risk possible given the range of siloed systems and data. In passing Paul also mentioned that one bank he had spoken to had 6,000 separate systems to support on just the banking side, let alone capital markets. Obviously post crisis this has changed, with pressure to reduce operational costs being a key driver at many institutions, and with only valuation/reference data (+2.4%) and risk management (+1.2%) seeing increased budget spend across the market in 2013. Given operational costs and regulation such as CVA, risk management is having to move from being an end-of-day, post-trade process to being pre- and post-trade at intraday frequency. Paul said that not only must consistent approaches to data and analytics be taken across back, middle and front office in each business unit, but now an integrated view of risk across business units must be taken (echoes of an earlier event with Numerix and PRMIA). Considering consistent analytics, Paul mentioned his paper "The Risk Analytics Library" but suggested that "libraries" of everything were needed, so not just analytics, but libraries of data (data management anyone?), metadata, risk models etc.
Paul asked Ricardo Martinez of Deloitte for an update on the current regulatory landscape, and Ricardo responded by focusing on the derivatives aspects of Dodd-Frank. He first pointed out that even after a number of years the regulation was not yet finalized around collateral and clearing. A good point he made was that whilst the focus in the market at the moment is on compliance, he feels that the consequences of the regulation will ripple on over the next 5 years in terms of margining and analytics.
Some panel members disagreed with Paul over the premise that bilateral exotic trades will eventually disappear. Their point was that the needs of pension funds and other clients are very specific and there will always be a need for structured products, despite the capital cost incentives to move everything onto exchanges/clearing. Paul countered by saying that he didn't disagree with this, but that the reason for suggesting the exotics industry may die is the difficulty of finding institutions that can warehouse the risk of such trades.
Satyam Kancharla of Numerix spoke next. Satyam said that two main changes struck him in the market at the moment. One was the adjustment to a mandated market structure, with clearing, liquidity and capital changes coming through from the regulators. The other was increased operating efficiency at investment banks. Whilst it is probable that no investment bank would ever get to the operational efficiency of a retail business like Walmart, this was nevertheless the direction of travel, with banks looking at how to optimize collateral, optimize trading venues etc.
Satyam put forward that computing power is still adhering to Moore's law, and that as a result some things are possible now that were not before, and that a centralized architecture built on this compute power is needed, but just because it is centralized does not mean that it is too inflexible to deal with each business unit's needs. Coming back to earlier comments made by the panel, he put forward that a lot of quants are involved in simply re-inventing the wheel, to which Paul added that quants were very experienced in using words like "orthogonal" to confuse mere mortals like him and justify the repetition of business functionality already available (from Numerix obviously, but more of that later). Satyam said that some areas of model development were more mature than others, and that quants should not engage in innovation for innovation's sake. Satyam also made a passing reference to the continuing use of Excel and VBA as the main tools of choice in the front office, suggesting that we still have some way to go in terms of IT maturity (hobby-horse topic of mine, for example see post).
Prompted by an audience question around data and analytics, Ricardo said that the major challenge to sharing data was not technical but cultural. Against a background where maybe 50% of investment in technology was regulation-related, he said that there was no shortage of business ideas for P&L in the emerging "mandated" markets of the future, but many of these ideas required wholesale shifts in attitudes at the banks in terms of co-operation across departments and from front to back office.
Satyam said that he thought of data and analytics as two sides of the same coin (could not agree more, but then again I would say that) in that analytics generate derived data which needs just as much management as the raw data. He said that it should be possible to have systems and architectures that manage the duality of data and analytics well, and these architectures did not have to imply rigidity and inflexibility in meeting individual business needs.
There was then some debate about trade repositories for derivatives, where the panel discussed the potential conflict between the US regulators wanting competition in this area and, as Paul suggested, the fragmentation that competition between DTCC, ICE, Bloomberg, LCH.Clearnet etc also brings. As such, Paul put it that the regulators would need to "boil the ocean" to understand the exposures in the market. Ricardo also mentioned some of the current controversy over who owns the data in the trade repository. One of the panelists suggested that we should also keep an eye on China and not necessarily get totally tied up in what is happening in "our" markets. The main point was that a huge economy such as China's could not survive without a sophisticated capital market to support it, and that China was not asleep in this regard.
A good audience question came from Don Wesnofske, who asked how best to cope with the situation where an institution is selling derivatives based on one set of models and the client is using another set of models to value the same trade. The selling institution then decides to buy/build a similar model to the client's too, and Don wondered how a single analytics library practically helped this situation where an institution could price on one model and report its P&L using another. One panelist responded that it was mostly the assumptions behind each model that determined differences in price, and that heterogeneous models and hence prices were needed for a market to function correctly. Another concurred and suggested there needed to be an "officially blessed" model within an institution against which valuations are compared. Amusingly for the audience, Steve O'Hanlon (CEO of Numerix) piped up that the problem was easy to resolve in that everyone should use Numerix's models.
Mike Opal of Microsoft closed the event with his presentation on data, analytics and cloud computing. Mike started by illustrating that the number of internet-enabled devices passed the human population of the world in 2008 and that by 2020 the number of devices would be 50 billion. He showed that the amount of data in the world was 0.8ZB (zettabytes) in 2009, and is projected to reach 8ZB by 2015 and 35ZB by 2020, driven primarily by the growth in internet-enabled devices. Mike also said that the Prism project so much in the news of late involves the construction of a server farm near Salt Lake City of 5ZB in size, so what the industry (in this case the NSA) is trying to do is unimaginable if we were to go back only a few years. He said that Microsoft itself was utterly committed to cloud computing, with 8 datacenters globally and 20 more under construction, at a cost of $500 million per center (I recently saw a datacenter in Redmond, totally unlike what I expected, with racks pre-housed in lorry containers, and the containers just unloaded within a gigantic hangar and plugged in - the person showing me around asked me who the busiest person was at a Microsoft data center, and the answer was the truck drivers...)
Talking of "Big Data", he first gave the now-standard disclaimer (as I have I acknowledge) that he disliked the phrase. I thought he made a good point in the Big Data is really about "Small Data", in that a lot of it is about having the capacity to analyze at tiny granular level within huge datasets (maybe journalists will rename it? No, don't think so). He gave a couple of good client case studies, one for Westpac and one for Phoenix on uses of HPC and cloud computing in financial services. He also mentioned the Target retailing story about Big Data, which if you haven't caught it is worth a read. One audience question asked him again how committed Microsoft was to cloud computing given competition from Amazon, Apple and Google. Mike responded that he had only joined Microsoft a year or two back, and in part this was because he believed Microsoft had to succeed and "win" the cloud computing market given that cloud was not the only way to go for these competitors, whereas Microsoft (being a software company) had to succeed at cloud (so far Microsoft have been very helpful to us in relation to Azure, but I guess Amazon and others have other plans.)
In summary a great event from Numerix with good discussions and audience interaction - helped for me by the fact that much of what was said (centralization with flexibility, duality of data and analytics, libraries of everything etc) fits with what Xenomorph and partners like Numerix are delivering for clients.
Posted by Brian Sentance | 17 June 2013 | 8:23 pm
Background - I went along to my first PRMIA event in Stamford, CT last night, with the rather grandiose title of "The Anthropology, Sociology, and Epistemology of Risk". Stamford is about 30 miles north of Manhattan and is home to major offices of a number of financial markets companies such as Thomson Reuters, RBS and UBS (who apparently have the largest column-less trading floor in the world at their Stamford headquarters - particularly useful piece of trivia for you there...). It also happens to be about a 5-minute drive/train journey away from where I now live, so easy for me to get to (thanks for another useful piece of information, I hear you say...). Enough background, more on the event, which was a good one with five risk managers involved in an interesting and sometimes philosophical discussion on what "risk management" is fundamentally all about.
Introduction - Marc Groz who heads the Stamford Chapter of PRMIA introduced the evening and started by thanking Barry Schwimmer for allowing PRMIA to use the Stamford Innovation Centre (the Old Town Hall) for the meeting. Henrik Neuhaus moderated the panel, and started by outlining the main elements of the event title as a framework for the discussion:
- Anthropology - risk management is to what purpose?
- Sociology - how does risk management work?
- Epistemology - what knowledge is really contained within risk management?
Henrik started by taking a passage about anthropology and replacing human "development" with "risk management" which seemed to fit ok, although the angle I was expecting was much more about human behaviour in risk management than where Henrik started. Henrik asked the panel what results they had seen from risk management and what did that imply about risk management? The panelists seemed a little confused or daunted by the question prompting one of them to ask "Is that the question?".
Business Model and Risk Culture - Elliot Noma dived in by responding that the purpose of risk management obviously depended very much on what are the institutional goals of the organization. He said that it was as much about what you are forced to do and what you try to do in risk management. Elliot said that the sell-side view of risk management was very regulatory and capital focused, whereas mutual funds are looking more at risk relative to benchmarks and performance attribution. He added that in the alternatives (hedge-fund) space then there were no benchmarks and the focus was more about liquidity and event risk.
Steve Greiner said that it was down to the investment philosophy and how risk is defined and measured. He praised some asset managers where the risk managers sit across from the portfolio managers and are very much involved in the decision making process.
Henrik asked the panel whether any of them had ever defined a "mission statement" for risk management. Marc Groz chipped in that he remembered he had once defined one, and that it was very different from what others in the institution were expecting and indeed very different from the risk management that he and his department subsequently undertook.
Mark Szycher (of GM Pension Fund) said that risk management split into two areas for him, the first being the symmetrical risks where you need to work out the range of scenarios for a particular trade or decision being taken. The second was the more asymmetrical risks (i.e. downside only) such as those found in operational risk where you are focused on how best to avoid them happening.
Micro Risk Done Well - Santa Federico said that he had experience of some of the major problems at institutions such as Merrill Lynch, Salomon Brothers and MF Global, and that he thought risk management was much more of a cultural problem than a technical one. Santa said he thought that the industry was actually quite good at the micro (trade, portfolio) level of risk management, but obviously less effective at the large systemic/economic level. Mark asked Santa what was the nature of the failures he had experienced. Santa said that the risks were well modeled, but that maybe the assumptions around macro variables such as the housing market proved to be extremely poor.
Keep Dancing? - Henrik asked the panel what might be done better. Elliot made the point that some risks are just in the nature of the business. If a risk manager did not like placing a complex illiquid trade and the institution was based around trading in illiquid markets, then what is a risk manager to do? He quoted the Citi executive who said "whilst the music is still playing we have to dance". Again he came back to the point that the business model of the institution drives its culture and the emphasis of its risk management (I guess I see what Elliot was saying, but taken one way it implied that regardless of what is going on, risk management needs to fit in with it, whereas I am sure he meant that risk managers must fit in with the business model mandated to shareholders).
Risk Attitudes in the USA - Mark said that risk managers need to recognize that the improbable is maybe not so improbable, and should be more prepared for the worst rather than doing risk management under "normal" market and institutional behavior. Steven thought that a cultural shift was happening, where not losing money was becoming as important to an organization as making money. He said that in his view, Europe and Asia had a stronger risk culture than the United States, with much more consensus, involvement and even control over the trading decisions taken. Put another way, the USA has more of a culture of risk taking than Europe. (I have my own theories on this. Firstly, I think that people are generally much bigger risk takers in the USA than in the UK/Europe, possibly influenced in part by the relative lack of an underlying social safety net - whilst this is not for everyone, I think it produces a very dynamic economy as a result. Secondly, I do not think that the cultural desire in the USA for the much admired "presidential" leader is necessarily the best environment for sound, consensus-based risk management. I would also like to acknowledge that neither of my two points above seems to have protected Europe much from the worst of the financial crisis, so it is obviously a complex issue!)
Slaves to Data? - Henrik asked whether the panel thought that risk managers were slaves to data? He expanded upon this by asking what kinds of firms encourage qualitative risk management and not just risk management based on Excel spreadsheets? Santa said that this kind of qualitative risk management occurred at a business level and less so at a firm wide level. In particular he thought this kind of culture was in place at many hedge funds, and less so at banks. He cited one example from his banking career in the 1980's, where his immediate boss was shouted off the trading floor by the head of desk, saying that he should never enter the trading floor again (oh those were the days...).
Sociology and Credibility - Henrik took a passage on the historic development of women's rights and replaced the word "women" with "risk management" to illustrate the challenges risk management faces in trying to get more say and involvement at financial institutions. He asked who should the CRO report to? A CEO? A CIO? Or a board member? Elliot responded by saying this was really an issue of credibility with the business, for risk managers and risk management in general. He made the point that often Excel and numbers were used to establish credibility with the business. Elliot added that risk managers with trading experience obviously had more credibility, and to some extent where the CRO reported to was dependent upon the credibility of risk management with the business.
Trading and Risk Management Mindsets - Elliot expanded on his previous point by saying that the risk management mindset thinks more in terms of unconditional distributions and tries to learn from history. He contrasted this with the "conditional mindset" of a trader, where the time horizon forwards (and backwards) is rarely longer than a few days and the belief that a trade will work today because it worked yesterday is strong. Elliot added that in assisting the trader, the biggest contribution risk managers can make is to be challenging/helpful on the qualitative side rather than just the quantitative.
Compensation and Transactions - Most of the panel seemed to agree that compensation package structure was a huge influence on the risk culture of an organisation. Mark touched upon a pet topic of mine, which is that it is very hard for a risk manager to gain credibility (and compensation) when risk management is about what could happen as opposed to what did happen. A risk manager blocking a trade due to some potentially very damaging outcomes will not gain any credibility with the business if the trading outcome for the suggested trade just happened to come out positive. There seemed to be consensus here that some of the traditional compensation models based on short-term transactional frequency and size were ill-formed (given the limited downside for the individual), and whilst the panel reserved judgement on the effectiveness of recent regulation, moves towards longer-term compensation were to be welcomed from a risk perspective.
MF Global and Business Models - Santa described some of his experiences at MF Global, where Corzine moved what was essentially a broker into taking positions in European sovereign bonds. Santa said that the risk management culture and capabilities were not strong enough to be robust against senior management over such a business model move. Elliot mentioned that he had been courted for trades by MF Global and had been concerned that they did not offer electronic execution, telling him that doing trades through a human was always best. Mark said that in the area of pension fund management there was much greater fiduciary responsibility (i.e. behave badly and you will go to jail) and maybe that kind of responsibility had more of a place in financial markets too. Coming back to the question of who a CRO should report to, Mark also said that questions should be asked to seek out those who are 1) less likely to suffer from the "agency" problem of conflicts of interest and, on a related note, those who are 2) less likely to have personal biases towards particular behaviours or decisions.
Santa said that in his opinion hedge funds in general had a better culture where risk management opinions were heard and advice taken. Mark said that risk managers who could get the business to accept moral persuasion were in a much stronger position to add value to the business rather than simply being able to "block" particular trades. Elliot cited one experience he had where the traders under his watch noticed that a particular type of trade (basis trades) did not increase their reported risk levels, and so became more focussed on gaming the risk controls to achieve high returns without (reported) risk. The panel seemed to be in general agreement that risk managers with trading experience were more credible with the business but also more aware of the trader mindset and behaviors.
Do we know what we know? - Henrik moved to his third and final subsection of the evening, asking the panel whether risk managers really know what they think they know. Elliot said that traders and risk managers speak a different language, with traders living in the now, thinking only of the implications of possible events such as those we have seen with Cyprus or the fiscal cliff, where the risk management view was much less conditioned and more historical. Steven re-emphasised the earlier point that risk management at this micro trading level was fine but this was not what caused events such as the collapse of MF Global.
Rational argument isn't communication - Santa said that most risk managers come from a quant (physics, maths, engineering) background and like structured arguments based upon well understood rational foundations. He said that this way of thinking was alien to many traders, and as such it was a communication challenge for risk managers to explain things in a way that traders would actually put some time into considering. On the modelling side of things, Santa said that sometimes traders dismissed models as being "too quant" and sometimes traders followed models all too blindly without questioning or understanding the simplifying assumptions they are based on. Santa summarised by saying that risk management needs to be intuitive for traders and not just academically based. Mark added that a quantitative focus can sometimes become too narrow (modeler's manifesto anyone?) and made the very profound point that unfortunately precision often wins over relevance in the creation and use of many models. Steven added that traders often deal with absolutes, such as knowing the spread between two bonds to the nearest basis point, whereas a risk manager approaching them with a VaR number is really presenting an estimate that should be thought of as lying within a range of values. This is alien to the way traders think and hence harder to explain.
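To make Steven's point concrete, here is a minimal sketch (my own, not from the panel) of how a historical-simulation VaR point estimate can be bootstrapped to show the range of values it really sits within - the P&L history and all the numbers are made up purely for illustration:

```python
import numpy as np

# Hypothetical daily P&L history for a portfolio (made-up numbers, for illustration only)
rng = np.random.default_rng(42)
pnl = rng.normal(loc=0.0, scale=1_000_000, size=500)   # 500 days of daily P&L in USD

def hist_var(sample, level=0.99):
    """Historical-simulation VaR: the loss not exceeded on `level` of days."""
    return -np.percentile(sample, (1 - level) * 100)

point_estimate = hist_var(pnl)

# Bootstrap the same calculation to show the spread of estimates around that single number
boot = [hist_var(rng.choice(pnl, size=len(pnl), replace=True)) for _ in range(2000)]
low, high = np.percentile(boot, [5, 95])

print(f"99% 1-day VaR point estimate: {point_estimate:,.0f}")
print(f"90% bootstrap range:          {low:,.0f} to {high:,.0f}")
```

The point of the exercise is simply that the single VaR number a trader gets quoted is the first line of output, whereas the second line is closer to what the risk manager actually knows.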
Unanticipated Risk - An audience member asked whether risk management should focus mainly on unanticipated risks rather than "normal" risks. Elliot said that in his trading he was always thinking and checking whether the markets were changing or continuing with their recent near-term behaviour patterns. Steven said that history was useful to risk management when markets were "normal", but in times of regime shifts this was not the case, and cited the example of the change in markets when Mario Draghi announced that the ECB would stand behind the Euro and its member nations.
Risky Achievements - Henrik closed the panel by asking each member what they thought was their own greatest achievement in risk management. Elliot cited a time when he spotted that a particular hedge fund had a relatively inconspicuous position/trade that he identified as potentially extremely dangerous, and he was proved correct when the fund closed down because of it. Steven said he was proud of some good work he and his team did on stress testing involving Greek bonds and the Eurozone. Santa said that some of the work he had done on portfolio "risk overlays" was good. Mark ended the panel by saying that he thought his biggest achievement was when the traders and portfolio managers started to come to the risk management department to ask opinions before placing key trades. Henrik and the audience thanked the panel for their input and time.
An Insured View - After the panel closed I spoke with an actuary who said that he had greatly enjoyed the panel discussions but was surprised that, when talking of how best to support the risk management function in being independent and giving "bad" news to the business, the role of auditors was not mentioned. He said he felt that auditors were a key support to insurers in ensuring any issues were allowed to come to light. So food for thought there as to whether financial markets can learn from other industry sectors.
Summary - great evening of discussion, only downside being the absence of wine once the panel had closed!
Posted by Brian Sentance | 25 April 2013 | 9:27 pm
Katherine Moriarty was a very interesting speaker at the ETF event, and she talked us through some of the regulatory issues in relation to ETFs, particularly in relation to non-transparent ETFs. Katherine provided some history of the regulation of the fund industry in the US, particularly the Investment Company Act of 1940, which was enacted to restore public confidence in the fund management industry following the troubled times of the late 1920s and the 1930s.
The fundamental concern for the SEC (the regulatory body for this) is that the provider of the fund products cannot game investors, providing false or incorrect valuations to maximize profits. Based on the "'40 Act" as she termed it, the SEC has allowed exemptions for various index and fund products: for smart indices you need full disclosure of the rules involved, and with active indices the constituents are published. However with active ETFs, retail investors are at a disadvantage to authorized participants (APs, the ETF providers) since there is no transparency around the constituents.
Obviously fund managers want to manage portfolios without disclosure (to maintain the "secrets" of their success, to keep trading costs low etc), but no solution has yet been found to allow this for ETFs that satisfies the SEC that the small guy is not at risk from this lack of transparency. Katherine said that participants were still trying to come up with solutions to this problem, and the SEC is still open to an exemption for anything that, in their view, "works" (sounds like someone will make a lot of money when/if a solution is found). Solutions tried so far include using blind trusts and proxy or shadow portfolios. Someone from the audience asked about the relative merits of active ETFs when compared to active mutual funds - Katherine answered that the APs wanted an exchange-traded product as a new distribution channel (and I guess us "Joe Soaps" want lower fees for active management...)
Vikas Kalra of MSCI had the unenviable position of giving the last presentation of the evening, and he said he would keep his talk short since he was aware he was standing between us and the cocktail reception to follow. Vikas described the problem that many risk managers faced, which was that doing risk management for a portfolio containing ETFs was fine when the ETF was of a "look through" type (i.e. constituents available), but when the ETF is opaque (no/little/uncertain constituent data) then the choices were usually 1) remove the ETF from the risk calculation or 2) substitute some proxy instrument.
Vikas said the Barra part of MSCI had come up with the solution to analyse ETF "styles". From what I could tell, this looked like some sophisticated form of 2) above, where Barra had done the analysis to enable an opaque ETF to be replaced by some more transparent proxy which allowed constituents to be analysed within the risk process and correlations etc recognised. Vikas said that 400 ETFs and ETNs were now covered in their product offering.
Conclusion - Overall a very interesting event that improved my knowledge of ETFs and had some great speakers.
Posted by Brian Sentance | 23 April 2013 | 11:26 pm
Joanne Hill of ProShares presented next at the event. Joanne started her talk by showing volatility levels from 1900 to the present day, and how historic volatility over the past 10 years seems to be at pre-1950s levels. Joanne had a lot of slides that she took us through (to be available on the event link above) which would be challenging to write up in full (or at least that is my excuse and I am sticking to it...).
Joanne said that the VIX trades about 4% above realised volatility, which she described as being due to expectations that "something" might happen (so financial markets can be cautious it seems!). Joanne seemed almost disappointed that we seem now to have entered a period of relatively boring (?!) market activity following the end of the crisis given that the VIX is now trading at pre-2007 lows. In answer to audience questions she said that inverse volatility indices were growing as were products dependent on dynamic index strategies.
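As a rough illustration of how that kind of comparison is typically made (the figures below are simulated, not Joanne's data), realized volatility is annualized from daily returns and set against the VIX level:

```python
import numpy as np

# Illustrative inputs only - not Joanne's data: a VIX level and a year of daily returns
vix_level = 14.0                                        # VIX quoted as annualized implied vol, in %
rng = np.random.default_rng(0)
daily_returns = rng.normal(0.0003, 0.0065, size=252)    # simulated daily S&P 500 returns

# Annualized realized volatility over the trailing window, expressed in percent
realized_vol = np.std(daily_returns, ddof=1) * np.sqrt(252) * 100

premium = vix_level - realized_vol
print(f"Realized vol {realized_vol:.1f}%, VIX {vix_level:.1f}%, premium {premium:.1f} vol points")
```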
Posted by Brian Sentance | 23 April 2013 | 10:12 pm
Next up at the event was Phil Mackintosh of Credit Suisse, who gave his presentation on trading ETFs, starting with some scene-setting for the market. Phil said that the ETP market had expanded enormously since its start in 1993, currently with over $2 trillion of assets ($1.3 trillion in the US). He mentioned that $1 in every $4 of flow in the US was ETF-related, and that the US ETF market was larger than the whole of the Asian equity market, but, again to emphasize relative size, the US ETF market was much smaller than the US equities and futures markets.
He said that counter to the impression some have, the market is 52% institutional and only 48% retail. He mentioned that some macro hedge fund managers he speaks to manage all their business through ETPs. ETFs are available across all asset classes from alternatives, currencies, commodities, fixed income, international and domestic equities. Looking at fees, these tend to reside in the 0.1% to 1% bracket, with larger fees charged only for products that have specific characteristics and/or that are difficult to replicate.
Phil illustrated how funds have consistently flowed into ETFs over recent years, in contrast with the mutual funds industry, with around 25% in international equity and around 30% in fixed income. He said that corporate fixed income, low volatility equity indices and real estate ETFs were all on the up in terms of funds flow.
He said that ETF values were calculated every 15 seconds and oscillated around their NAV, with arbitrage activity keeping ETF prices in line with underlying prices. Phil said that spreads in ETFs could be tighter than in their underlyings and that ETF spreads tightened for ETFs over $200m.
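As a minimal sketch of the mechanics Phil described (all tickers, weights and prices below are hypothetical, made up for illustration), the intraday indicative value is just the creation basket value per ETF share, and the premium or discount to it is what the arbitrage acts on:

```python
# Hypothetical creation basket and ETF quote - names, weights and prices are all made up
constituents = {            # ticker: (price per share, shares per creation unit)
    "AAA": (50.00, 1000),
    "BBB": (25.50, 800),
    "CCC": (10.25, 2000),
}
cash_component = 1_500.00        # estimated cash per creation unit (hypothetical)
etf_shares_per_unit = 500        # ETF shares in one creation unit (hypothetical)

basket_value = sum(price * shares for price, shares in constituents.values())
inav = (basket_value + cash_component) / etf_shares_per_unit   # indicative NAV per ETF share

etf_price = 185.20               # current exchange price of the ETF (hypothetical)
premium = etf_price / inav - 1
print(f"iNAV {inav:.2f}, ETF price {etf_price:.2f}, premium/discount {premium:+.2%}")

# A persistent premium invites authorized participants to create shares (buy the basket,
# sell the ETF); a discount invites redemption - the arbitrage that keeps price near NAV.
```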
Phil warned of a few traps in trading ETFs. He illustrated the trading volumes of ETFs during an average day, which showed that they tended to be traded in volume in the morning but not in the (late) afternoon (need enlightening as to why..). He added that they were more specifically not a trade for the market open or close. He said that large ETF trades sometimes caused NAV disconnects, and mentioned deviations around NAV due to underlying liquidity levels. He also said that contango can become a problem for VIX futures related products.
There were a few audience questions. One concerned how fixed income ETFs became the price discovery mechanism for some assets during the crisis, given the liquidity and timeliness of the ETF relative to its underlyings. Another question concerned why the US ETF market was larger and more homogenous than in Europe. Phil said that Europe was not dominated by 3 providers as in the US, plus each nationality in Europe tended to have preferences for ETF products produced by its own country. There was also further discussion on shorting fixed income ETFs since they were more liquid than the primary market. (Note to self: need to find out more about the details of the ETF redemption and creation process.)
Overall a great talk by a very "sharp" presenter (like a lot of good traders Phil seemed to understand the relationships in the market without needing to think about them too heavily).
Posted by Brian Sentance | 23 April 2013 | 9:52 pm
It seems to be ETF week for events in New York this week, one of which was hosted by PRMIA, Credit Suisse and MSCI last night called "Risk Management of and with ETFs/Indices". The event was chaired by Seddik Meziani of Montclair State University, who opened with thanks for the sponsors and the speakers for coming along, and described the great variety of asset exposures now available in Exchange Traded Products (ETPs) and the growth in ETF assets since their formation in 1993. He also mentioned that this was the first PRMIA event in NYC specifically on ETFs.
Index-Based Approaches for Risk Management in Wealth Management - Shaun Wurzbach of S&P Dow Jones Indices started with his presentation. Shaun's initial point was to consider whether "Buy & Hold" works, given the bad press it received over the crisis. Shaun said that the peak-to-trough US equity loss during the recent crisis was 57%, but when he hears of investors that made losses of this order he thinks that this was more down to a lack of diversification and poor risk management than to inherent failures in buy and hold. To justify this, he cited an example of a simple portfolio constructed of 60% equity and 40% fixed income, which only lost 13% peak to trough during the crisis. He also illustrated that equity market losses of 5% or more were far more frequent during the period 1945-2012 than many people imagine, and that investors should be aware of this in portfolio construction.
Shaun suggested that we are in the third innings of indexing:
- Broad-based benchmark indices
- Precise sector-and thematic-based indices
- Factor-based indices (involving active strategies)
Where the factor-based indices might include ETF strategies based on/correlated with things such as dividend payments, equity weightings, fundamentals, revenues, GDP weights and volatility.
He then described how a simple strategy index based around lowering volatility could work. Shaun suggested that low volatility was easier to explain to retail investors than minimizing variance. The process for his example low volatility index was to take the 100 lowest-volatility stocks out of the S&P 500 and weight them by the inverse of their volatility, with rebalancing every quarter.
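As a minimal sketch of the construction process Shaun described (the data layout, lookback window and function below are my own assumptions for illustration, not S&P's methodology), a single quarterly rebalance might look something like this:

```python
import numpy as np
import pandas as pd

def low_vol_index_weights(prices: pd.DataFrame, n_stocks: int = 100,
                          lookback_days: int = 252) -> pd.Series:
    """Weights for one rebalance date: pick the n lowest-volatility stocks
    and weight them by the inverse of their volatility."""
    daily_returns = prices.tail(lookback_days).pct_change().dropna()
    vol = daily_returns.std() * np.sqrt(252)      # annualized volatility per stock
    selected = vol.nsmallest(n_stocks)            # the n least volatile names
    inv_vol = 1.0 / selected
    return inv_vol / inv_vol.sum()                # weights summing to 1

# Usage sketch: `sp500_prices` is assumed to be a DataFrame of daily closes, one column
# per constituent; recompute the weights at each quarter-end rebalance date, e.g.
#   weights = low_vol_index_weights(sp500_prices.loc[:rebalance_date])
```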
He illustrated how this index exhibited lower volatility with higher returns over the past 13 years or so (this looked like a practical example illustrating some of the advantages of having a less volatile geometric mean of returns from what I could see). He also said that this index had worked across both developed and emerging markets.
Apparently this index has been available for only 2 years, so 11 years of the performance figures were generated from back-testing (the figures looked good, but a strategy theoretically backtested over historic markets when the strategy was not used and did not exist should always be examined sceptically).
Looking at the sector composition of this low volatility index, one of the very interesting points that Shaun made was that the index got out of the financials sector some two quarters before Lehman's went down (maybe the index was less influenced by groupthink or the fear of realising losses?)
Shaun then took a short look at VIX-based strategies, describing the VIX as the "investor fear gauge". In particular he considered the S&P VIX Short-Term Futures Index, which he said exhibits a high negative correlation with the S&P 500 (around -0.8) and a high positive correlation with the VIX spot (approx +0.8). He said that explaining these products as portfolio insurance products was sometimes hard for financial advisors to do, and features such as the "roll cost" (moving from one set of futures contracts to others as some expire) were also harder to explain to non-institutional investors.
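As a rough, first-order illustration of the roll cost Shaun mentioned (the futures levels and the simplification below are mine, not his):

```python
# Hypothetical VIX futures curve in contango - levels are made up for illustration
front_month = 15.0
second_month = 16.5

# A constant-maturity short-term futures index rolls from the front to the second
# contract as expiry approaches, so in contango it repeatedly buys dear and sells cheap.
approx_monthly_roll_cost = (second_month - front_month) / front_month
print(f"Rough monthly roll cost implied by this curve: {approx_monthly_roll_cost:.1%}")
```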
A few audience questions followed, one concerned with whether one could capture principal retention in fixed income ETFs. Shaun briefly mentioned that the audience member should look at "maturity series" products in the ETP market. One audience member had concerns over the liquidity of ETF underlyings, to which Shaun said that S&P have very strict criteria for their indices, ensuring that the free float of underlyings is high and that the ETF does not dominate liquidity in the underlying market.
Overall a very good presentation from a knowledgeable speaker.
Posted by Brian Sentance | 23 April 2013 | 7:30 pm
Just saw a reference on LinkedIn to this FT article "Finance groups lack spreadsheet controls". I started to write a quick response and, given it is one of my major hobby-horses, I ended up doing a bit of an essay, so I decided to post it here too:
"As many people have pointed out elsewhere, much of the problem with spreadsheet usage is that they are not treated as a corporate and IT asset, and as such things like testing, peer review and general QA are not applied (mind you, maybe more of that should still be applied to many mainstream software systems in financial markets...).
Ralph and the guys at Cluster Seven do a great job in helping institutions to manage and monitor spreadsheet usage (I like Ralph's "we are CCTV for spreadsheets" analogy), but I think a fundamental (and often overlooked) consideration is to ask yourself why the business users involved decided that they needed spreadsheets to manage trading and risk in the first place? It is a bit like trying to address the symptoms of an illness without ever considering how we got the illness in the first place.
Excel is a great tool, but to quote Spider-Man "with great power comes great responsibility" and I guess we can all see the consequences of not taking the usage of spreadsheets seriously and responsibly. So next time the trader or risk manager says "we've just built this really great model in Excel" ask them why they built it in Excel, and why they didn't build upon the existing corporate IT solutions and tools. In these cost- and risk- conscious times, I think the answers would be interesting..."
Posted by Brian Sentance | 27 March 2013 | 11:09 am
Very pleased to announce today that Mediobanca, the leading investment bank in Italy, has decided to select TimeScape as its data management system. You can see the press release here.
Posted by Brian Sentance | 25 March 2013 | 12:58 pm
Thanks to one of my PRMIA colleagues for pointing out this article in the WSJ, talking about how regulatory driven stress testing in the US is promoting conformity and reducing innovation in approaches to risk management. Echoes some posts from last year on regulation increasing risk and diversity of regulation.
Posted by Brian Sentance | 20 March 2013 | 3:18 pm
Notes I took from a recent Oliver Wyman sponsored PRMIA event in New York, which brought together a panel of senior managers and CROs from leading asset management organizations to discuss the role of risk management for asset managers, specifically the types of governance and controls necessary to safeguard clients' assets in the current macro environment. You can access the notes here on the PRMIA site.
Posted by Brian Sentance | 14 March 2013 | 11:23 am
Good post from Jim Jockle over at Numerix - main theme is around having an "analytics" strategy in place in addition to (and probably as part of) a "Big Data" strategy. Fits strongly around Xenomorph's ideas on having both data management and analytics management in place (a few posts on this in the past, try this one from a few years back) - analytics generate the most valuable data of all, yet the data generated by analytics and the input data that supports analytics is largely ignored as being too business focussed for many data management vendors to deal with, and too low level for many of the risk management system vendors to deal with. Into this gap in functionality falls the risk manager (supported by many spreadsheets!), who has to spend too much time organizing and validating data, and too little time on risk management itself.
Within risk management, I think it comes down to having the appropriate technical layers in place of data management, analytics/pricing management and risk model management. Ok it is a greatly simplified representation of the architecture needed (apologies to any techies reading this), but the majority of financial institutions do not have these distinct layers in place, with each of these layers providing easy "business user" access to allow risk managers to get to the "detail" of the data when regulators, auditors and clients demand it. Regulators are finally waking up to the data issue (see Basel on data aggregation for instance) but more work is needed to pull analytics into the technical architecture/strategy conversation, and not just confine regulatory discussions of pricing analytics to model risk.
Posted by Brian Sentance | 14 February 2013 | 2:50 pm
A little late on these notes from this PRMIA Event on Big Data in Risk Management that I helped to organize last month at the Harmonie Club in New York. Big thank you to my PRMIA colleagues for taking the notes and for helping me pull this write-up together, plus thanks to Microsoft and all who helped out on the night.
Introduction: Navin Sharma (of Western Asset Management and Co-Regional Director of PRMIA NYC) introduced the event and began by thanking Microsoft for its support in sponsoring the evening. Navin outlined how he thought the advent of “Big Data” technologies was very exciting for risk management, opening up opportunities to address risk and regulatory problems that previously might have been considered out of reach.
Navin defined Big Data as structured or unstructured data received at high volumes and requiring very large data storage. Its characteristics include a high velocity of record creation, extreme volumes, a wide variety of data formats, variable latencies, and complexity of data types. Additionally, he noted that relative to other industries, financial services has in the past created perhaps the largest historical sets of data and continually creates enormous amounts of data on a daily or moment-by-moment basis. Examples include options data, high frequency trading, and unstructured data such as that from social media. Its usage provides potential competitive advantages in trading and investment management. Also, by using Big Data it is possible to have faster and more accurate recognition of potential risks via seemingly disparate data - leading to timelier and more complete risk management of investments and firms' assets. Finally, the use of Big Data technologies is in part being driven by regulatory pressures from Dodd-Frank, Basel III, Solvency II, the Markets in Financial Instruments Directives (1 & 2) as well as the Markets in Financial Instruments Regulation.
Navin also noted that we will seek to answer questions such as:
- What is the impact of big data on asset management?
- How can Big Data’s impact enhance risk management?
- How is big data used to enhance operational risk?
Presentation 1: Big Data: What Is It and Where Did It Come From?: The first presentation was given by Michael Di Stefano (of Blinksis Technologies), and was titled "Big Data. What is it and where did it come from?". You can find a copy of Michael's presentation here. In summary, Michael started by saying that there are many definitions of Big Data, mainly centred on technology that deals with data problems that are either too large, too fast or too complex for conventional database technology. Michael briefly touched upon the many different technologies within Big Data such as Hadoop, MapReduce and databases such as Cassandra and MongoDB etc. He described some of the origins of Big Data technology in internet search, social networks and other fields. Michael described the "4 V's" of Big Data - Volume, Velocity, Variety and Value - with a key point from Michael being "time to Value" in terms of what you are using Big Data for. Michael concluded his talk with some business examples around the use of sentiment analysis in financial markets and the application of Big Data to real-time trading surveillance.
Presentation 2: Big Data Strategies for Risk Management: The second presentation “Big Data Strategies for Risk Management” was introduced by Colleen Healy of Microsoft (presentation here). Colleen started by saying expectations of risk management are rising, and that prior to 2008 not many institutions had a good handle on the risks they were taking. Risk analysis needs to be done across multiple asset types, more frequently and at ever greater granularity. Pressure is coming from everywhere including company boards, regulators, shareholders, customers, counterparties and society in general. Colleen used to head investor relations at Microsoft and put forward a number of points:
- A long line of sight of one risk factor does not mean that we have a line of sight on other risks around.
- Good risk management should be based on simple questions.
- Reliance on 3rd parties for understanding risk should be minimized.
- Understand not just the asset, but also at the correlated asset level.
- The world is full of fast markets driving even more need for risk control
- Intraday and real-time risk now becoming necessary for line of sight and dealing with the regulators
- Now need to look at risk management at a most granular level.
Colleen explained some of the reasons why good risk management remains a work in progress, and that data is a key foundation for better risk management. However data has been hard to access, analyze, visualize and understand, and used this to link to the next part of the presentation by Denny Yu of Numerix.
Denny explained that new regulations involving measures such as Potential Future Exposure (PFE) and Credit Value Adjustment (CVA) were moving the number of calculations needed in risk management to a level well above that required by methodologies such as Value at Risk (VaR). Denny illustrated how a typical VaR calculation on a reasonably sized portfolio might need 2,500,000 instrument valuations and how PFE might require as many as 2,000,000,000. He then explained more of the architecture he would see as optimal for such a process and illustrated some of the analysis he had done using Excel spreadsheets linked to Microsoft's high performance computing technology.
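As a back-of-envelope illustration of why the calculation count explodes (the breakdown below is my own assumption for illustration, not Denny's actual figures), VaR scales with instruments times scenarios, whereas PFE/CVA scales with instruments times Monte Carlo paths times future time steps:

```python
# Illustrative decomposition (my assumption, not Denny's slides) of how the valuation
# counts scale from a VaR calculation to a Monte Carlo PFE/CVA calculation.

instruments = 10_000          # positions in a hypothetical portfolio

# Historical-simulation VaR: one revaluation per instrument per scenario
var_scenarios = 250
var_valuations = instruments * var_scenarios
print(f"VaR valuations: {var_valuations:,}")           # 2,500,000

# PFE/CVA: revalue every instrument on every simulated path at every future date
mc_paths = 2_000
time_steps = 100              # e.g. exposure dates out to portfolio maturity
pfe_valuations = instruments * mc_paths * time_steps
print(f"PFE valuations: {pfe_valuations:,}")           # 2,000,000,000
```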
Presentation 3: Big Data in Practice: Unintentional Portfolio Risk: Kevin Chen of Opera Solutions gave the third presentation, titled "Unintentional Risk via Large-Scale Risk Clustering". You can find a copy of the presentation here. In summary, the presentation was quite visual, illustrating how large-scale empirical analysis of portfolio data could produce some interesting insights into portfolio risk and how risks become "clustered". In many ways the analysis was reminiscent of an empirical form of principal component analysis, i.e. where you can see and understand more about your portfolio's risk without actually being able to relate the main factors directly to any traditional factor analysis.
Panel Discussion: Brian Sentance of Xenomorph and the PRMIA NYC Steering Committee then moderated a panel discussion. The first question was directed at Michael: "Is the relational database dead?" - Michael replied that in his view relational databases were not dead, and indeed for dealing with problems well-suited to relational representation they were still and would continue to be very good. Michael said that NoSQL/Big Data technologies were complementary to relational databases, dealing with new types of data and new sizes of problem that relational databases are not well designed for. Brian asked Michael whether the advent of these new database technologies would drive the relational database vendors to extend the capabilities and performance of their offerings. Michael replied that he thought this was highly likely, but only time would tell whether this approach will be successful given the innovation in the market at the moment. Colleen Healy added that the advent of Big Data did not mean the throwing out of established technology, but rather an integration of established technology with the new, such as Microsoft SQL Server working with the Hadoop framework.
Brian asked the panel whether they thought visualization would make a big impact within Big Data? Ken Akoundi said that the front end applications used to make the data/analysis more useful will evolve very quickly. Brian asked whether this would be reminiscent of the days when VaR first appeared, when a single number arguably became a false proxy for risk measurement and management? Ken replied that the size of the data problem had increased massively from when VaR was first used in 1994, and that visualization and other automated techniques were very much needed if the headache of capturing, cleansing and understanding data was to be addressed.
Brian asked whether Big Data would address the data integration issue of siloed trading systems? Colleen replied that Big Data needs to work across all the silos found in many financial organizations, or it isn’t “Big Data”. There was general consensus from the panel that legacy systems and people politics were also behind some of the issues found in addressing the data silo issue.
Brian asked if the panel thought the skills needed in risk management would change due to Big Data. Colleen replied that effective Big Data solutions require all kinds of people, with skills across a broad range of specific disciplines such as visualization. Generally the panel thought that data and data analysis would play an increasingly important part in risk management. Ken put forward his view that all Big Data problems should start with a business problem, not just a technology focus - for example, are there any better ways to predict stock market movements based on the consumption of larger and more diverse sources of information? In terms of risk management skills, Denny said that risk management of 15 years ago was based on relatively simple econometrics. Fast forward to today, and risk calculations such as CVA are statistically and computationally very heavy, and trading is increasingly automated across all asset classes. As a result, Denny suggested that even the PRMIA PRM syllabus should change to focus more on data and data technology given the importance of data to risk management.
Asked how best Big Data should be applied, Denny echoed Ken in saying that understanding the business problem first was vital, but that obviously Big Data opened up the capability to aggregate and work with larger datasets than ever before. Brian then asked what advice the panel would give to risk managers faced with an IT department about to embark upon using Big Data technologies. Assuming that the business problem is well understood, Michael said that the business needed some familiarity with the broad concepts of Big Data, what it can and cannot do and how it fits with more mainstream technologies. Colleen said that there are some problems that only Big Data can solve, so understanding the technical need is a first checkpoint. Obviously IT people like working with new technologies and this needs to be monitored, but so long as the business problem is defined and valid for Big Data, people should be encouraged to learn new technologies and new skills. Kevin also took a very positive view that IT departments should be encouraged to experiment with these new technologies and understand what is possible, but that projects should have well-defined assessment/cut-off points, as with any good project management, to decide if the project is progressing well. Ken put forward that many IT staff were new to the scale of the problems being addressed with Big Data, and that his own company Opera Solutions had an advantage in its deep expertise of large-scale data integration to deliver quicker on project timelines.
Audience Questions: There then followed a number of audience questions. The first few related to other ideas/kinds of problems that could be analyzed using the kind of modeling that Opera had demonstrated. Ken said that there were obvious extensions that Opera had not got around to doing just yet. One audience member asked how well all the Big Data analysis could be aggregated/presented to make it understandable and usable to humans. Denny suggested that it was vital that such analysis was made accessible to the user, and there was general consensus across the panel that man vs. machine was an interesting issue to develop in considering what is possible with Big Data. The next audience question was around whether all of this data analysis was affordable from a practical point of view. Brian pointed out that there was a lot of waste in current practices in the industry, with wasteful duplication of ticker plants and other data types across many financial institutions, large and small. This duplication is driven primarily by the perceived need to implement each institution's proprietary analysis techniques, and this kind of customization is not yet available from the major data vendors, but will become more possible as cloud technology such as Microsoft's Azure develops further. There was a lot of audience interest in whether Big Data could lead to better understanding of causal relationships in markets rather than simply correlations. The panel responded that causal relationships were harder to understand, particularly in a dynamic market with dynamic relationships, but that insight into correlation was at the very least useful and could lead to better understanding of the drivers as more datasets are analyzed.
Posted by Brian Sentance | 8 February 2013 | 3:14 pm
I got my first tour around the NYSE trading floor on Wednesday night, courtesy of an event by Rutgers University on risk. Good event, mainly around a panel discussion moderated by Nicholas Dunbar (Editor of the Bloomberg Risk newsletter), and involving David Belmont (Commonfund CRO), Adam Litke (Chief Risk Strategist for Bloomberg), Hilmar Schaumann (Fortress Investment CRO) and Sanjay Sharma (CRO of Global Arbitrage and Trading at RBC).
Nick first asked the panel how do you define and measure risk? Hilmar responded that risk measurement is based around two main activities: 1) understanding how a book/portfolio is positioned (the static view) and 2) understanding sensitivities to risks that impact P&L (the dynamic view). Hilmar mentioned the use of historical data as a guide to current risks that are difficult to measure, but emphasised the need for a qualitative approach when looking at the risks being taken.
David said that he looks at both risk and uncertainty - with risk being defined as those impacts you can measure/estimate. He said that historical analysis was useful but limited given it is based only on what has happened. He thought that scenario analysis was a stronger tool. (I guess with historical analysis you at least get some idea of the impact of things that could not be predicted, even if it is based on one "simulation" path i.e. reality, whereas you have more flexibility with scenario management to cover all bases, but I guess limited to those bases you can imagine.) David said that path-dependent risks such as those in the credit markets in the last crisis were some of the most difficult to deal with.
Adam said that you need to understand why you are measuring risk and understand what risks you are prepared to take. He said that at Wachovia they knew that a 25% house price fall in California would be a near death experience for the bank prior to the 2008 crisis, and in the event the losses were much greater than 25%. His point was really that you must decide what risks you want to survive and at what level. He said that sound common-sense judgement is needed to decide whether a scenario is realistic or not.
Sanjay said that risk managers need to maintain a lot of humility and not over-trust risk measurements. He described a little of the risk approach used at RBC where he said they use over 80 different models and employ them as layers/different views on risk to be brought together. He said they start with VaR as a base analysis, but build on this with scenarios, greeks and then on to other more specific reports and analysis. He emphasised that communication is a vital skill for risk managers to get their views and ideas across.
Nicholas then moved on from risk measurement to risk management, asking how risk managers should take or reduce risks. Adam said that risks should be delegated out to those that manage them, but this needs to be combined with responsibility for the risks too. Keep people and departments within the bounds of their remit. Be prepared to talk a different business language to different stakeholders depending upon their understanding and their motivations. David gave some examples of this in his case, where endowment funds want risk premiums over many years and risks are translated/quantified into practical terms, for example a new college building not going ahead.
Hilmar said that hedge funds are supposed to take risks, and that the key was not necessarily to avoid losses (although avoid them if you can) but rather to avoid surprises. Like the other speakers, Hilmar emphasised that communication of risks to key stakeholders was vital. He also added the key point that if you don't like a risk you have identified, then try first to take it off rather than hedging it, since hedging could potentially add basis risk and simply more complication.
Nicholas then asked Sanjay how risk managers should deal with bringing difficult news to the business. Sanjay suggested that any bad news should be approached in the form of "actionable transparency", i.e. not only do you communicate how bad the risk is to all stakeholders, but you come along with actionable approaches to dealing with the risk. In all of his experience and despite the crisis, Sanjay's view is that traders do not want to lose money, and if you come with solutions they will listen. He concluded by saying that qualitative analysis should also be used, citing the hypothetical example that you should take notice of dogs (yes, the animal!) buying mortgages, whether or not the mortgages are AAA rated.
Nicholas asked the panel members in turn what risks they are concerned about currently. David said he believed that many risks were not priced into the market currently. He was concerned about the policy impacts of action by the ECB and the Fed, and thought the current and forward levels of volatility were low. In fixed income markets he thought that Dodd-Frank may have detrimental effects, particularly with the current lack of clarity about what is proprietary trading and what is market-making. Should policies and interest rates change, he thought that risk managers should look carefully at what will happen as funds flow out of fixed income and into equities.
Hilmar talked about the postponement of the US debt ceiling limits and said that US Government policy battles continue to be an obvious source of risk. In Europe, many countries have elections this year which will be interesting, and the problems in the Euro-zone are less than they were, but problems in Cyprus could fan the flames of more problems and anxiety. Hilmar said that Japan's new policy of targeting 2% inflation may have effects on the willingness of domestic investors to buy JGBs.
Sanjay said he was worried. In the "Greenspan Years" prior to 2008 a quasi-government guarantee on the banks was effectively put in place, and we continue to live with cheap money. When policy eventually changes and interest rates rise, Sanjay wondered whether the world was ready for the wholesale asset revaluation that would then be required.
Adam's concerns were mainly around identifying what will be the cause of the next panic in the market. Whilst he said he is in favour of central clearing for OTC derivatives, he thought that the changing market structure combined with implementing central clearing had not been fully thought through, and this was a worry to him.
Nicholas asked what the panelists think of the regulation being implemented. David said that regulators face the same difficulty that risk managers face, in that nobody notices when you take sensible action to protect against a risk that didn't occur. He thinks that regulation of the markets is justified and necessary.
Sanjay said that in the airline and pharmaceutical industries regulatory approval was on the whole very robust, but that they were dealing with approving designs (aeroplanes and drugs) that are reproduced once approved. He said that such levels of regulation in financial services were not yet possible due to the constant innovation found in the markets, and he wanted regulation to be more dynamic and responsive to market developments. Sanjay also joined those in the industry that are critical of the sheer size of Dodd-Frank.
Nicholas said that Adam was obviously keen on operational issues and wondered what plumbing in the industry he would change. Adam said that he is a big fan of automation but operational risks are real and large. He thought that there were too many rules and regulations being applied, and that the regulators were not paying attention to the type of markets they want in the future, nor to the effects of current regulation and how people were moving from one part of the industry to another. Adam said that in relation to Knight Capital he was still a strong advocate of standing by the wall socket, ready to pull the plug on the computer. Adam suggested that regulators should look at regulating/approving software releases (I assume here he means for key tasks such as automated trading or risk reporting, not all software).
Given the large number of students present, Nicholas closed the panel by asking what career advice the panelists had for future risk managers? Adam emphasised flexibility in role, taking us through his career background as an equity derivatives and then fixed income trader before coming into risk management. Adam said it was highly unlikely over your career that you would stay with one role or area of expertise.
Hilmar said that having risk managers independent of trading was vitally important for the industry. He thought there were many areas to work in, with operational risk being potentially the largest, but still with plenty more to do in market risk, compliance and risk modelling. He added that understanding the interdependencies between risks was key and an area for further development.
When asked by Nicholas, David said that risk managers should have a career path right through to CEO of an institution. He wanted to encourage risk management as a necessary level above risk measurement and control. He was excited about the potential of Big Data technologies to help in risk management. David gave some interesting background on his own career, initially as an emerging markets debt trader. He said that it is important to know yourself, and that he regarded himself as a sceptic, needing all the information available before making a decision. As such his performance as a trader was consistent but not as high as some, and this became one of the reasons he moved into risk management.
Sanjay said many of the systems used in finance are 20 years old, in complete contrast with the advances in mobile and internet technologies. As such he thought this was a great opportunity to be involved in the replacement and upgrading of this older infrastructure. Apparently one analyst had estimated that $65B will be spent on risk management over the next 4-5 years.
Adam thought that there was a need for a code of ethics for quants (see old post for some ideas). Sanjay added that the industry needed to move away from being involved primarily in attempting to optimise activity around gaming regulation. When asked by Nicholas about Basel III, Adam thought that improved regulation was necessary but Basel III was not the right way to go about it and was way too complex.
Posted by Brian Sentance | 1 February 2013 | 2:41 pm
Quick thank you to all those who came along to Xenomorph's New York Holiday Party at the Classic Car Club. Below is an extract from the talk given by Paul Rowady of the Tabb Group at the event, followed by my own effort and some photographs from the event.
There Is No Such Thing as Alpha Generation
The change in perspective caused by a subtle change in language can galvanize your approach to data, the tools you select, and even the organizational culture. That said, ‘alpha generation’ is a myth; there is only alpha discovery and capture.
By E. Paul Rowady, Jr.
We live in an age of superlatives: unprecedented market complexity and uncertainty caused, in part, by an unprecedented regulatory onslaught and unprecedented economic extremes. As a result, there is an unprecedented focus on risk analysis – and an unprecedented (and anxious) search for new sources of performance from all market demographics.
The big data era is here and will only become the bigger data era. What we need is a new perspective. But fostering such a new perspective may be as subtle as performing a little linguistic jujitsu.
Our business – trading and investment in capital and commodity markets around the globe – has a history of being cavalier or too casual about language; particularly how certain labels, terms or vernacular are used to describe the business and the markets. Some of this language is intentional – the use of certain terminology creates mystique, fosters mythology, manufactures a sense of complexity that only a select group of savants can tame -- particularly when it comes to activities around quantitative methods. And some of it is just plain laziness, stretching the use of labels far beyond their original meaning on the idea that these terms are close enough.
I have become increasingly sensitive to this phenomenon over the years. Call it an insatiable need to simplify complexity, bring order to chaos, to enhance a level of accuracy and precision in how we describe what we do and how we do it. I find that precision of language does impact how complex technical topics are communicated, understood and absorbed. It turns out, language impacts perspective – and perspective impacts strategy and tactics.
So let’s gain a little perspective on alpha generation and alpha creation...(full extract can be found on the TabbFORUM)
Paul in full speech mode at the Classic Car Club
Big thanks to Paul for the above talk. Here is my follow-up:
Thanks Paul for a great talk, certainly I agree that people, process, technology and data are key to the future success of financial markets. In particular, I think attitudes towards data must change if we are to meet the coming challenges over the next few years. For example, in my view data in financial markets is analogous to water:
- Everyone needs it
- Everyone knows where to get it
- Nobody likes to share it
- Nobody is 100% sure where it was really sourced from
- Nobody is quite sure where it goes to
- Nobody knows its true cost
- Nobody knows how much is wasted
- Everyone assumes it is of high quality
- And you only ever know it has gone bad after you have drunk it.
- (I should add, that if you own water you are also very wealthy, so wealthy your neighbor might even consider robbing you)
The problem of siloed data and data integration remains, but this is as much a political problem as a purely technical one. People need to share data more, and I wonder (and hope) that as the "social network" generation comes through attitudes will improve, although I guess this will also add different pressures to data aggregators as people become less hung up about sharing information. The focus needs to be on the data that business folks need, and should be less about the type of the data or the technical means by which it is captured, stored and distributed – for sure these are important aspects, but we need to involve more people in realizing this cult of data.
And just as Paul has issues with the over-use of “Alpha”, I promise this will be the only time this evening I will mention “Big Data” but today I heard the best description so far of what big data is all about, which is “Big data is like watching the planet develop a nervous system”. Data is fundamental to all of our lives and we are living through some very interesting times in terms of how much data is becoming available and how we make sense of it.
So, a change of tack. When moving to the New York area a few years back, one of my fellow Brits said that you will find the Americans a lot friendlier than the English, but don't talk to them about politics or religion. Rules are meant to be broken, and religion aside I thought I would briefly mention the recent election as one of the big differences between the UK and the USA.
Firstly, wow you guys know how to have long elections. I think the French get theirs done in two weeks but even the Brits do it in a month. A few things struck me from the election: I don’t know whether the Democratic Party is generally supportive of legalizing drugs, but I think we can be certain that President Obama spent some time in the states of Colorado and Washington prior to the first debate.
And I hear from the New Yorker that the Republicans are trying a radical new approach to broaden the demographics of their supporter base, apparently to make it inclusive of people who are strong believers in "maths and science".
Moving on from a light-hearted look at elections but sticking with the government theme, regulation is obviously very high profile at the moment. To some degree this is understandable, as financial markets have been doing a great job of keeping a low profile with:
- JPMorgan $7B London Whale
- Barclays and the Libor rigging
- Standard Chartered and Iranian money laundering
- Knight Capital with the biggest advertisement in history for automated trading
- ING feeling it was missing out on things with Cuba and Iranian money
- HSBC helping Mexican drug lords to move the money around
- Capital One deceiving its customers
- Peregrine Financial Group deceiving the regulators (generating alpha?)
All these occurred in 2012, when it seemed that the dust had barely settled on MF Global and UBS. So it is possible to understand the reaction of people and politicians to what has gone on and the need for more stable capital markets, but my biggest concern is that there is simply too much regulation, and complex systems with complex rules are a great breeding ground for the law of unintended consequences. To illustrate how over time we humans, and in particular governments, seem to be regressing in terms of using ever more words to describe ever more complex behaviours, I found the following list online:
- Pythagoras' Theorem: 24 words
- The Lord's Prayer: 66 words
- Archimedes' Principle: 66 words
- The Ten Commandments: 179 words
- The Gettysburg Address: 286 words
- The Declaration of Independence: 1,300 words
- US Government regulations on the sale of cabbage: 26,991 words
Dodd-Frank is about 2,300 pages, which apparently is going to spawn some 30,000 pages of rules – that is enormous. Listening to a regulator speak last week, he said the regulators had about 10,000 pages done, 10,000 in progress and 10,000 not even started yet. Worse than this, he added that regulators were not trying to shape the financial markets of the future but rather dealing only with the current issues. Regulators should take their lead from quantum physics in my view, as soon as you observe something it is changed. Financial markets are complex, and making them even more complex through overlaying complex rules is not going to result in the stability that we all desire.
Anyway, thanks for coming along this evening and I hope you have a great time. Quick thank you to our clients and partners, without whom we would not exist. Thanks to the hard work our staff put in over the year, and in particular thanks to Naj and Xenomorph's NYC team for organizing this evening's event.
Some photographs from the event below. Big thanks to NandoVision for some of the images:
Clients, partners and staff catch up over a drink or three
This waiter had a pleasant interruption in service prior to the fashion show by Hillary Flowers
Jim Beck talks with PRMIA NYC members: Qi Fu, Sol Steinberg and Don Wesnofske
Cass Almendral, Hillary Flowers and Brian later at the bar
Not sure how this ballet-themed dress works in a convertible?
Russ Glisker and Mark O'Donnell talk cars with Paul
A far more practical outfit for this Porsche
Some of the fashion models rush to discuss the finer points of Alpha Harvesting with Paul...
Thanks again to all involved in putting the party together and for everyone who came along on the night. If I don't get round to another post over the Holiday Season, then best wishes for a fantastic break and a great start to 2013.
Posted by Brian Sentance | 19 December 2012 | 12:48 am
Good breakfast event from SAP and the A-Team last Thursday morning. SAP have been getting (and I guess paying for) a lot of good air-time for their SAP Hana in-memory database technology of late. Domenic Iannaccone of SAP started the briefing with an introduction to big data in finance and how their SAP/Sybase offerings knit together. He started his presentation with a few quotes, one being "Intellectual property is the oil of the 21st century" by Mark Getty (he of Getty Images, but also of the Getty oil family) and "Data is the new oil" attributed to both Clive Humby and Gerd Leonhard (not sure why two people are quoted as saying the same thing, but anyway).
For those of you with some familiarity with the Sybase IQ architecture of a year or two back, in this architecture SAP Hana seems to have replaced the in-memory ASE database that worked in tandem with Sybase IQ for historical storage (I have yet to confirm this, but hope to find out more in the new year). When challenged on how Hana differs from other in-memory database products, Domenic seemed keen to emphasise its analytical capabilities and not just the database aspects. I guess the big data angle of bringing the "data closer to the calculations" was his main differentiator on this, but with more time I think a little more explanation would have been good.
Pete Harris of the A-Team walked us through some of the key findings of what I think is the best survey I have read so far on the usage of big data in financial markets (free sign-up needed I think, but you can get a copy of the report here). Some key findings from a survey of staff at ten major financial institutions included:
- Searching for meaning in unstructured data was a leading use-case that came to mind when thinking of big data (Twitter trading etc.)
- Risk management was seen as a key beneficiary of what the technologies can offer
- Aggregation of data for risk was seen as a key application area concerning structured data.
- Both news feeds and (surprisingly?) text documents were key unstructured data sources being processed using big data.
- In trading, news sentiment and time series analysis were key areas for big data.
- Creation of a system wide trade database for surveillance and compliance was seen as a key area for enhancement by big data.
- Data security remains a big concern with technologists over the use of big data.
There were a few audience questions - Pete clarified that there was a more varied application of big data amongst sell-side firms, and that on the buy-side it was being applied more to KYC and related areas. One audience member made the point that he thought a real challenge beyond the insight gained from big data analysis was how to translate it into value from an operational point of view. There seemed to be a fair amount of recognition that regulators and auditors want a full audit trail of what has gone on across the whole firm, so audit was seen as a key area for big data. Another audience member suggested that the lack of a rigid data model in some big data technologies enabled greater flexibility in the scope of questions/analysis that could be undertaken.
Coming back to the key findings of the survey, one question I asked Pete was whether or not big data is a silver bullet for data integration. My motivation was that the survey and much of the press you read talk about how big data can pull all the systems, data and calculations together for better risk management, but while I can understand how massively scalable data and calculation capabilities are extremely useful, I wondered how exactly all the data is pulled together from the current range of siloed systems and databases where it currently resides. Pete suggested that this was still a problematic area where Enterprise Application Integration (EAI) tools were needed. Another audience member added that politics within different departments was not making data integration any easier, regardless of the technologies used.
Overall a good event, with audience interaction unsurprisingly being the most interesting and useful part.
Posted by Brian Sentance | 3 December 2012 | 2:12 pm
Just wanted to start this post with a quick best wishes to all affected by Hurricane Sandy in the New York area. Nature is an awesomely powerful thing and has amply demonstrated that it is always to be respected as a "risk".
Good event on regulatory progress organised by PRMIA and hosted by Credit Suisse last night. Dan Rodriguez introduced the speakers and Michael Gibson of the Fed began with his assessment of what he thinks regulators have learned from the crisis. Mike said that regulators had not paid enough attention to the following factors:
- Capital
- Liquidity
- Resolvability (managing the failure of a financial institution without triggering systemic risk)
Capital - Mike said that regulators had addressed the quality and quantity of capital held by banks. With respect to Basel III, Mike said that the Fed had received around 2,500 comments that they were currently reviewing. In relation to supervision, he suggested that stress testing by the banks, the requirement for capital planning from banks and the independent stress tests undertaken by the regulators had turned the capital process into much more of a forward-looking exercise than it had been pre-crisis. The ability of regulators to limit dividend payments and request capital changes had added some "teeth" to this forward looking approach. Mike said that the regulators are getting more information which is allowing them to look more horizontally across different financial institutions to compare and contrast business practices, risks and capital adequacy. He thought that disclosure to the public of stress testing results and other findings was also a healthy thing for the industry, prompting wider debate and discussion.
Liquidity - Mike said that liquidity stress testing was an improvement over what had gone before (which was not much). He added that the Basel Committee was working on a quantitative liquidity ratio and that in general regulators were receiving and understanding much more data from the banks around liquidity.
Resolvability - Mike said in addition to resolution plans (aka "living wills") being required by Dodd-Frank in the US, the Fed was working with other regulators internationally on resolvability.
There then followed a Q&A session involving the panelists and the audience:
Basel III Implementation Timeline - Dan asked Mike about the 2,500 comments the Fed had received on Basel III and when the Fed would have dealt with these comments, particularly in the context of compliance with Basel III for US banks having been delayed beyond Jan 1 2013. Dan additionally asked Mike whether he thought implementing Basel III now was a competitive advantage or disadvantage for a bank.
Mike responded that the Fed had extended its review period from 90 days to 135 days, which was an unusual occurrence. He said that as yet the Fed had no new target date for implementation.
Brian of AIG on Basel III and Regulation - Dan asked Brian Peters of AIG what his thoughts were on Basel III. Brian was an entertaining speaker and responded firstly that AIG was not a bank, it was an insurer and that regulators need to recognise this. He said regulators need to think of the whole financial markets and how they want them to look in the future. Put another way, he implied that looking at capital, liquidity and resolvability in isolation was fine at one level, but these things had much wider implications and without taking that view then there would be problems.
Brian said he thinks of Basel III as a hammer, and that when people use a hammer everything starts to look like a "nail". He said that insurers write 50 year-long liabilities, and as a result he needs long term investments to cover these obligations. He added that the liquidity profile of insurers was different to banks, with life policies having exposures to interest rates more like bank deposits. He said that AIG was mostly dealing with publicly traded securities (I guess now AIG FP is no longer dominant?). Resolvability was a different process for insurers, with regulators forcing troubled insurers to limit dividends and build up cash reserves.
Brian's big concern for the regulators was that in his view they need to look at the whole financial system and what future they want for it, rather than dealing with one set of players and regulations in isolation. It seems Brian shares some similar concerns to Pierre Guilleman on applying banking regulation to the insurance industry, combined with the unintended consequences of current regulation on the future of the whole of financial markets (maybe the talk on diversity of approach is a good read on this, or more recently "Regulation Increases Risk" for a more quantitative approach).
Steve of Credit Suisse on Basel III - Dan asked Steven Haratunian whether implementing Basel III was a competitive advantage or disadvantage for Credit Suisse. Steve said that regardless of competitive advantage, as a Swiss bank Credit Suisse had no choice in complying with Basel III by Jan 1 2013, that Credit Suisse had started its preparations in 2011 and had been Basel 2.5 compliant since Jan 1 2012. He said that Basel III compliance had effectively doubled their capital requirements, and had prompted a strategic review of all business activities within the investment banking arm.
This review had caused a reassessment of the company's involvement in areas such as fixed income, and risk weighted assets had been reduced by over $100 billion. Steven explained how they had looked at each business activity and assessed whether it could achieve a 15% return on equity over a business cycle, plus be able to withstand CCAR stress testing during this time. He said that Credit Suisse had felt lonely in the US markets in that there were many occasions where deals were lost due directly to consideration of Basel III capital requirements. Credit Suisse felt less lonely now given how regulation is affecting other banks, and that for certain markets (notably mortgages and credit) the effects of Basel III were very harsh.
Volcker Rule and Dodd-Frank - Dan asked Mike where did the Volcker Rule fit within Dodd-Frank, and does it make us safer? Mike didn't have a great deal to say on this, other than he thought it was all part and parcel of Congress's attempts to make the financial markets safer, that its implementation was being managed/discussed across an inter-agency group including the Fed, SEC and CFTC. Brian said that Dodd-Frank did not have a great deal of impact for insurers, the only real effects being some on swap providers to insurers.
Steve said that many aspects or the "spirit" of Volcker and Dodd-Frank had been internalised by the banks and were progressing despite Dodd-Frank not being finalised. He said that in particular the lack of certainty around extraterritoriality and margining in derivatives was not helpful. Mike added that in terms of progressing through Dodd-Frank, his estimate was that the Fed had one third of it finished, one third of the rules proposed, and one third not started or in very early stages. So still some work to be done.
Living Wills - Brian at this point referred to a recent speech by William C. Dudley of the Fed with the title "Solving the Too Big to Fail Problem" (haven't looked at this yet, but will). Mike said that the Fed was still learning in relation to "Living Wills" and eventually it will get down to a level of being very company specific. Brian asked whether this meant that "Living Wills" would be very specific to each company and not a general rule to be applied to all. Mike said it was too early to tell.
Extraterritoriality - On extraterritoriality, Steve said that Credit Suisse was having to look at its subsidiaries globally more as standalone companies when dealing with regulators and capital requirements, which will greatly increase capital requirements if the portfolio effect of being a global company is not considered by regulators. Dan mentioned a forthcoming speech to be made by Dan Tarullo of the Fed, and mentioned how the Fed was looking at treating foreign subsidiaries operating in the US as bank holding companies rather than global subsidiaries, hence again causing problems by ignoring the portfolio effect. Mike said that the regulators were working on this issue, and that unsurprisingly he couldn't comment on the speech Dan Tarullo had yet to make.
The Future Shape of the Markets - Brian brought up an interesting question for Mike in asking how the regulators wanted to see financial markets develop and operate in the future? Brian thought that current regulation was being implemented as almost the "last war" against financial markets without a forward looking view. He said that historically he could see Basel 1 being prompted by addressing some of the issues caused by Japanese banks, he saw Basel II addressing credit risk but what will the effects of Basel III ultimately be?
This prompted an interesting response from Mike, in that he said that the Fed is not shaping markets and is dealing only with current rules and risks. He added that private enterprise would shape future markets. (Difficult to see how that argument stacks up, since regulation implemented now is surely not independent of private sector reaction to and exploitation of it.) Steve added that Basel III had already had effects, with Credit Suisse already reducing its activity in mortgage and fixed income markets. Steve said that non-banking organisations were now involved in these markets and that regulators have to be aware of these changes or face further problems.
Did Regulators Fail to Enforce Existing US Regulation - one audience participant was strongly of the opinion that Basel III is not needed, that there was enough regulation in place to limit the crisis and that the main failing of the regulators was that they did not implement what was already there to be used. Mike said he thought that the regulators did have lessons to learn and that some of the regulation then in place needed reviewing.
Keep it Simple - another audience member asked about the benefits of simple regulation of simpler markets and mentioned an article by Andrew Haldane of the Bank of England on "The Dog and the Frisbee". Mike didn't have much to add on this other than saying it was a work in progress.
Brian thought that the central failure behind the crisis was the mis-rating of credit instruments, with AAA products attracting a 4bp capital charge instead of a more realistic 3%.
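To put Brian's figures in some context, here is a minimal arithmetic sketch of the gap between a 4bp and a 3% capital charge. The $1bn position size is purely hypothetical and not something quoted at the event:

# Illustrative only: the 4bp vs 3% capital charge figures quoted above,
# applied to a hypothetical $1bn AAA-rated position.
notional = 1_000_000_000

charge_4bp = notional * 0.0004   # 4 basis points of capital
charge_3pct = notional * 0.03    # 3 percent of capital

print(f"Capital at 4bp: ${charge_4bp:,.0f}")    # $400,000
print(f"Capital at 3%:  ${charge_3pct:,.0f}")   # $30,000,000
print(f"Ratio: {charge_3pct / charge_4bp:.0f}x more capital")  # 75x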
Regulation's Effects on Market Pricing - Steve was the first to respond on this, pointing to areas such as CMBS and credit markets as being the best performing areas that also have the lower capital risk weights. Dan said he felt that equity markets had not fully adjusted yet, and ironically that financial equities had the highest risk weights. Combined with anticipated rises in tax, high risk weightings were taking capital out of the risk-bearing/wealth-generating parts of the economy and into low-weighted instruments like US Treasuries. Dan wondered whether regulation was one of the key dampening factors behind why the current record stimulus was not accelerating the US economy more quickly.
Derivatives Clearers and Clearing - this audience question asked how the regulators were dealing with the desire to encourage clearing of derivative trades whilst at the same time not incentivising the banks to set themselves up as clearers. Mike said that there was an international effort to look at this.
What Happens When the Stimulus Goes - an audience member asked what the panel thought would happen once the stimulus was removed from the markets. The panelists thought this was more an economics question. However Dan said that the regulators were now more sensitive to the markets and market participants when considering new stimulus measures, and cited problems in the fall of 2011 caused by Fed actions in the market crushing mortgage spreads. Brian said insurers need yield, so the stimulus was obviously having an impact. Dan mentioned that given the low risk weighting of US Treasuries everyone was holding them, and so the impact of a jump in rates would hurt many if it came without preparation.
Wine Shortage and Summary - Just had to mention that there was no wine made available at the networking session afterwards. A sign of austere times or simply that it was too early in the week? Anyway it was a great discussion and raised some good points. In summary, all I hear still supports the premise that the "Law of Unintended Consequences" is ever-present, ever-powerful and looming over the next few years. Hearing regulators say that they are dealing with current risks only and are not shaping the future of financial markets smacks of either delusion or obfuscation to me.
Posted by Brian Sentance | 28 November 2012 | 6:22 pm
Launch event for Interactive Data's new reference data service Apex on Wednesday night, hosted at Nasdaq Time Square and introduced by Mark Hepsworth. Apex looks like a good offering, combining multi-asset data access, batch file and on-demand API requests from the same data store, plus hosted data management services, and a flexible licensing/distribution/re-distribution model.
Some good speakers at the event. Larry Tabb ran through his opinions on the current market and regulation. He painted a mixed picture of the market, starting with the continuing exit by investors from the equity mutual funds market, offset to some degree by rapid growth in ETF assets (54% growth over the past 3 years to $1,200 billion). Obviously events such as the Flash Crash, Libor, the London Whale and Knight Capital have not increased investor confidence in markets either.
On regulation he first cited the sheer amount of regulation being attempted at the moment, spanning systemic risk/too big to fail, Dodd-Frank, Volcker, derivatives regulation, Basel III etc. Of particular note he mentioned some concerns over whether there is simply enough collateral around in the market given increased capital requirements and derivatives regulation (a thought currently shared by the FT apparently, in this article).
Given the focus of the event, Larry unsurprisingly mentioned the foundational role of data in meeting the new regulatory requirements, which for the next few years he believes will be focussed on audit and the ability to explain and justify past decisions to regulators. Also given the focus of the event, Larry did not mention his recent article on the Tabb Forum on federated data management strategies which I would have been interested to hear Interactive's comments on, particularly given their new hosted data management offerings. (You can find some of our past thoughts here on the option of using federated data.)
Mike Atkin of the EDM Council was next up and described a framework for what he thought was going on in the market. In summary, he split the drivers for change into business and regulatory, and categorised the changes into:
- Systemic Risk
- Capital and Liquidity
- Clearing and Settlement
- Control and Enforcement
He then said that the fundamental challenge with data was to go through the chain of identifying things, describing them, classifying/aggregating them and then finally establishing linkages. He ended this part of his presentation with the three aspects he thought necessary to sort this out: industry data standards, methods of best practice, and infrastructure in place to enable these changes.
Mike then went on to recount a conversation he had had with a hedge fund manager, who had defined the interesting concept of a "Data Risk Equation":
N x CC x S / (Q x V)
N: is the number of variables
CC: is a measure of calculation complexity
S: is the number of data sources needed
Q: is a measure of quality
V: is a measure of verifiability
I think the angle was that the hedge fund manager was simply using a form of the above to categorise and compare the complexity of some of the data issues his firm was dealing with.
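As a minimal sketch of how such a score might be used (all the example inputs below are hypothetical, not figures from the event), the equation is just a simple scoring function for comparing data problems:

# A minimal sketch of the "Data Risk Equation" quoted above:
#   risk score = (N * CC * S) / (Q * V)
# All inputs and the example values are hypothetical illustrations.

def data_risk_score(n_variables, calc_complexity, n_sources, quality, verifiability):
    """Higher scores indicate a harder/riskier data problem."""
    return (n_variables * calc_complexity * n_sources) / (quality * verifiability)

# e.g. comparing a simple equity report with an OTC derivatives valuation feed
equity_report = data_risk_score(n_variables=10, calc_complexity=1, n_sources=2,
                                quality=0.9, verifiability=0.9)
otc_valuations = data_risk_score(n_variables=200, calc_complexity=5, n_sources=6,
                                 quality=0.6, verifiability=0.4)

print(f"Equity report score:  {equity_report:,.1f}")   # ~24.7
print(f"OTC valuations score: {otc_valuations:,.1f}")  # ~25,000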
Aram Flores of Deutsche Bank then talked briefly. Of note was his point that new regulation was forcing DB to use more external rather than internal data, since regulation now restricted the use of internal data within regulatory reporting. Sounds like good news for Interactive Data and some of its competitors. Eric Reichenberg of SS&C GlobeOp then gave a quick talk on the importance of accurate data to his derivative valuation services. The talks ended with a well-prepped conversation between Marty Williams and one of their new Apex clients, who jokingly referred to one of the other well-known data vendors as the Evil Empire, which raised a few smiles - fortunately the speaker didn't start to choke at this point, so obviously Darth Vader wasn't spying on the proceedings...
So overall a good event, new product offering looks interesting, speakers were entertaining and the drinks/food/location were great.
Posted by Brian Sentance | 26 October 2012 | 3:22 pm
Getting to the heart of "Data Management for Risk", PRMIA held an event entitled "Missing Data for Risk Management Stress Testing" at Bloomberg's New York HQ last night. For those of you who are unfamiliar with the topic of "Data Management for Risk", then the following diagram may help to further explain how the topic is to do with all the data sets feeding the VaR and scenario engines.
I have a vested interest in saying this (and please forgive the product placement in the diagram above, but hey this is what we do...), but the topic of data management for risk seems to fall into a functionality gap between: i) the risk system vendors, who typically seem to assume that the world of data is perfect and that the topic is too low level to concern them, and ii) the traditional data management vendors, who seem to regard things like correlations, curves, spreads, implied volatilities and model parameters as too business domain focussed (see previous post on this topic). As a result, the risk manager is typically left with ad-hoc tools like spreadsheets and other analytical packages to perform data validation and filling of any missing data found. These ad-hoc tools are fine until the data universe grows larger, leading to the regulators becoming concerned about just how much data is being managed "out of system" (see past post for some previous thoughts on spreadsheets).
The Crisis and Data Issues. Anyway, enough background; on to some of the issues raised at the event. Navin Sharma of Western Asset Management started the evening by saying that pre-crisis people had a false sense of security around Value at Risk, and that the crisis showed that data is not reliably smooth in nature. Post-crisis, questions obviously arise around how much data to use, how far back to go and whether you include or exclude extreme periods like the crisis. Navin also suggested that the boards of many financial institutions were now much more open to reviewing scenarios put forward by the risk management function, whereas pre-crisis their attention span was much more limited.
Presentation. Don Wesnofske did a great presentation on the main issues around data and data governance in risk (which I am hoping to link to here shortly...)
Issues with Sourcing Data for Risk and Regulation. Adam Litke of Bloomberg asked the panel what new data sourcing challenges were resulting from the current raft of regulation being implemented. Barry Schachter cited a number of Basel-related examples. He said that the costs of rolling up loss data across all operations were prohibitive, and hence there were data truncation issues to be faced when assessing operational risk. Barry mentioned that liquidity calculations were new and presenting data challenges. Non-centrally cleared OTC derivatives also presented data challenges, with initial margin calculations based on stressed VaR. Whilst on the subject of stressed VaR, Barry said that there were a number of missing data challenges, including the challenge of obtaining past histories and of modelling current instruments that did not exist in past stress periods. He said that it was telling on this subject that the Fed had decided to exclude tier 2 banks from stressed VaR calculations, on the basis that they did not think these institutions were in a position to be able to calculate these numbers given the data and systems that they had in place.
Barry also mentioned the challenges of Solvency II for insurers (and their asset managers) and said that this was a huge exercise in data collection. He said that there were obvious difficulties in modelling hedge fund and private equity investments, and that the regulation penalised the use of proxy instruments where there was limited "see-through" to the underlying investments. Moving on to UCITS IV, Barry said that the regulation required VaR calculations to be regularly reviewed on an ongoing basis, and he pointed out one issue with much of the current regulation in that it uses ambiguous terms such as models of "high accuracy" (I guess the point being that accuracy is always arguable/subjective for an illiquid security).
Sandhya Persad of Bloomberg said that there were many practical issues to consider, such as exchanges that close at different times and the resultant misalignment of closing data, problems dealing with holiday data across different exchanges and countries, and sourcing of factor data for risk models from analysts. Navin expanded more on his theme of which periods of data to use. Don took a different tack, and emphasised the importance of getting the fundamental data of client-contract-product in place, and suggested that this was still a big challenge at many institutions. Adam closed the question by pointing out the data issues in everyday mortgage insurance as an example of how prevalent data problems are.
What Missing Data Techniques Are There? Sandhya explained a few of the issues she and her team face at Bloomberg in making decisions about what data to fill. She mentioned the obvious issue of the distance between missing data points and the preceding data used to fill them. Sandhya mentioned that one approach to missing data is to reduce factor weights down to zero for factors without data, but this gives rise to a data truncation issue. She said that there were a variety of statistical techniques that could be used; she mentioned adaptive learning techniques and then described some of the work that one of her colleagues had been doing on maximum-likelihood estimation, whereby in addition to achieving consistency with the covariance matrix of "near" neighbours, the estimation also had greater consistency with the historical behaviour of the factor or instrument over time.
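For illustration only, and not the maximum-likelihood approach Sandhya described, here is a minimal sketch of the simpler idea of filling gaps in a factor series from a correlated "neighbour" series, using entirely synthetic data:

import numpy as np

# A minimal sketch (not the method described by the panel) of filling gaps in
# a factor series using a correlated "neighbour" series: fit a simple linear
# relationship on the overlapping observations, then use it to estimate the gaps.
rng = np.random.default_rng(0)
neighbour = rng.normal(0.0, 0.01, 250)                  # complete return series
factor = 0.8 * neighbour + rng.normal(0.0, 0.003, 250)  # related series...
factor[100:110] = np.nan                                # ...with a gap

mask = ~np.isnan(factor)
beta, alpha = np.polyfit(neighbour[mask], factor[mask], 1)  # OLS fit on overlap

filled = factor.copy()
filled[~mask] = alpha + beta * neighbour[~mask]          # estimate the gap

print(f"Estimated beta vs neighbour: {beta:.2f}")
print(f"Filled {np.count_nonzero(~mask)} missing observations")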
Navin commented that fixed income markets were not as easy to deal with as equity markets in terms of data, and that at sub-investment grade there is very little data available. He said that heuristic models were often needed, and suggested that there was a need for "best practice" to be established for fixed income, particularly in light of guidelines from regulators that are at best ambiguous.
I think Barry then made some great comments about data and data quality in saying that risk managers need to understand more about the effects (or lack of them) that input data has on the headline reports produced. The reason I say great is that I think there is often a disconnect or lack of knowledge around the effects that input data quality can have on the output numbers produced. Whilst regulators increasingly want data "drill-down" and justification of any data used to calculate risk, it is still worth understanding more about whether output results are greatly sensitive to the input numbers, or whether maybe related aspects such as data consistency ought to have more emphasis than, say, absolute price accuracy. For example, data quality was being discussed at a recent market data conference I attended and only about 25% of the audience said that they had ever investigated the quality of the data they use. Barry also suggested that you need to understand for what purpose the numbers are being used and what effect the numbers have on the decisions you take. I think here the distinction was around usage in risk, where changes/deltas might be more important, whereas in calculating valuations or returns price accuracy might receive more emphasis.
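As a rough sketch of the kind of sensitivity check being suggested here (synthetic data, simple 99% one-day historical VaR), you can recompute the headline number with and without a single bad input point and see how much it actually moves:

import numpy as np

# A rough sketch of checking output sensitivity to input data quality:
# compute a simple 99% one-day historical VaR from clean returns, then from
# the same returns with one corrupted data point. All data is synthetic.
rng = np.random.default_rng(1)
returns = rng.normal(0.0, 0.01, 500)          # daily portfolio returns

def hist_var(rets, confidence=0.99):
    """Historical VaR as the loss at the given confidence level."""
    return -np.percentile(rets, (1 - confidence) * 100)

dirty = returns.copy()
dirty[42] = -0.25                              # a single bad price/return

print(f"VaR (clean data):    {hist_var(returns):.4f}")
print(f"VaR (one bad point): {hist_var(dirty):.4f}")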
How Extensive is the Problem? General consensus from the panel was that the issue's importance needed to be understood more (I guess my experience is that the regulators can make data quality important for a bank if they say that input data issues are the main reason for blocking approval of an internal model for regulatory capital calculations). Don said that any risk manager needed to be able to justify why particular data points were used, and there was further criticism from the panel around regulators asking for high quality without specifying what this means or what needs to be done.
Summary - My main conclusions:
- Risk managers should know more of how and in what ways input data quality affects output reports
- Be aware of how your approach to data can affect the decisions you take
- Be aware of the context of how the data is used
- Regulators set the "high quality" agenda for data but don't specify what "high quality" actually is
- Risk managers should not simply accept regulatory definitions of data quality and should join in the debate
Great drinks and food afterwards (thanks Bloomberg!) and a good evening was had by all, with a topic that needs further discussion and development.
Posted by Brian Sentance | 16 October 2012 | 3:21 pm
Bankenes Sikringsfond Selects Xenomorph's TimeScape for Faster Data Analysis and High-Quality Decision Support
Just a quick note to say that we have signed a new client, Bankenes Sikringsfond, the Norwegian Banks’ Guarantee Fund. They will be using TimeScape to fulfill requirements for a centralised analytics and data management platform. The press release is available here for those of you who are interested.
Posted by Sara Verri | 11 October 2012 | 10:50 am
If you have ever wandered around the financial district in New York, then you may not have noticed the Museum of American Finance on the corner of Wall and William St. I tend to find there are lots of things I don't notice in New York, probably due to the fact that I am still doing a passable impression of a tourist and find myself looking ever upwards at the skyscrapers rather than at anything at ground level. Anyway, MoAF is worth a look-in and having recently become a member (thanks Cognito Media!) I went along to one of their events last night on regulation. Richard Sylla was the moderator for the evening, with support from Hugh Rockoff, Eugene N. White and Charles Geisst.
Richard Sylla on Fractional Reserve Banking and Regulation
Richard started the evening by explaining some basics of bank balance sheets as a means for explaining why he feels banking needs regulation. He showed a simplified and conservative balance sheet for an example bank:
- Deposits 85% (from the likes of you or I)
- Capital 15% (shareholders including surpluses)
- Earning Assets 80% (loans and investments)
- Reserves 20% (cash and deposits at other banks/central banks)
Richard explained that the main point to note from the balance sheet was that the reserves did not match the depositors and hence there is not enough money to repay all the depositors if they asked for their money back all at once. Richard's example was a form of Fractional Reserve Banking and he explained that there were two main reasons why banking needs regulation. The first was the incentive for banks to reduce their reserves to increase profits (increasing risk re: depositors) and the second was to keep capital levels low in order to increase earnings per share.
He then went on to illustrate how at the time of the last crisis Fannie Mae and Freddie Mac had earning assets of 100%, reserves of 0%, deposits of 96% and capital of 4%. Lehman and Bear Stearns both had zero reserves and capital of only 3%. He went on to list a large number of well known financial institutions and showed how the equity of many was simply wiped out given falls in asset valuations, the lack of reserves and the very small levels of equity maintained.
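Using the balance sheet percentages quoted above, a quick and purely illustrative calculation shows how small a fall in asset values is needed to wipe out equity at those capital levels:

# Purely illustrative arithmetic using the balance sheet figures quoted above:
# how big a fall in earning asset values wipes out a bank's capital?

def asset_fall_to_wipe_out_equity(earning_assets_pct, capital_pct):
    """Fall in earning asset values (as a fraction) that exhausts capital."""
    return capital_pct / earning_assets_pct

conservative_bank = asset_fall_to_wipe_out_equity(0.80, 0.15)  # Richard's example bank
gse_precrisis = asset_fall_to_wipe_out_equity(1.00, 0.04)      # Fannie/Freddie figures
broker_dealer = asset_fall_to_wipe_out_equity(1.00, 0.03)      # Lehman/Bear figures

print(f"Conservative bank: {conservative_bank:.1%} asset fall wipes out equity")  # 18.8%
print(f"GSE pre-crisis:    {gse_precrisis:.1%}")                                  # 4.0%
print(f"Broker-dealer:     {broker_dealer:.1%}")                                  # 3.0%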
Hugh Rockoff on Adam Smith and Banking Regulation
Hugh is apparently a big fan of free market economics and of Adam Smith in particular. Much as Smith is for the "Invisible Hand" of the free market and against regulation, Hugh was at pains to point out that even Smith thought of banking being a special case in need of regulation and referred to banking operations as "a sort of waggon-way through the air".
Apparently Smith lived through a banking crisis in 1772 involving the Ayr Bank - Hugh had spelt this as "Air", which I am not sure was deliberate but it made for some reasonable humour about the value of the notes issued by the bank. Apparently this was an international crisis involving many of the then major powers, was based on stock market and property speculation, and indirectly led to the Boston Tea Party, so I guess many Americans should pay their respects to this failed bank that became a catalyst to the formation of their country. A key point to note was that the shareholders of the Ayr Bank were subject to unlimited liability and had to pay all obligations owing...not sure how that would go down today in our more enlightened (?) times, but more of that later.
Hugh described how Smith thought there were many things that banks should not be allowed to do, including investing in real-estate (!) and prohibitions on the "option" to repay monetary notes. Smith also suggested that the Government should set maximum interest rates. So for a free market thinker, Smith had some surprising ideas when it came to banking. Hugh also pointed out that another great free-marketeer, Milton Friedman, was also in favour of banking regulation and favoured both deposit insurance and 100% reserve banking.
Eugene White on Regulatory History
At a guess I would say that Eugene is a big fan of the quote from Mark Twain that "History does not repeat itself, but it does rhyme". Eugene took us briefly through major financial regulations in American history such as the National Banking Act of 1864, the Federal Reserve Act of 1913, the "New Deal" of 1932 and others. He notably had a question mark around whether Dodd-Frank was going to be a major milestone in regulatory history, as in his opinion Dodd-Frank treats the symptoms and not the causes of the last financial crisis. Eugene spent some time explaining the cycle of regulation where governments go through stages of:
-> Regulation ->
-> Problems caused by Regulation->
-> De-regulation ->
-> Financial Crisis ->
-> back to Regulation ->
Charles Geisst on Dodd-Frank and the Volcker Rule
Charles started by saying that he thought Dodd-Frank, and in particular the Volcker Rule, might well still be being debated three years hence. As others have done, he contrasted the 2,300 pages of Dodd-Frank with the simplicity of the 72 pages of the Glass-Steagall Act. He believes that the Volcker Rule is Glass-Steagall by another name, and believes that Wall St has only recently realised this is the case and has begun the big push back against it.
He left the audience with the sobering thought that he thinks another financial crisis is needed in order to cut down Dodd-Frank from 2300 pages of instructions for regulators to put regulations in place to around 150 pages of meaningful descriptions of the kinds of things that banks can and cannot do.
Rules vs. Principles - One audience member wondered if the panel thought it better to regulate in terms of the fiduciary duties of the participants rather than in detailed rules that can be "worked around". Charles responded that he thought fiduciary duties were better, and contrasted the strictness with which banking fraud has been treated in the USA with the relative lack of punishment and sentencing in the securities industry. Eugene added that the "New Deal" of 1932 took away limited liability for shareholders of banks, and with it the incentives for shareholders to monitor the risks being taken by the banks they own.
Basel Regulations - Another audience member wanted panel feedback on Basel. In summary the panel said that the Basel Committee got it wrong in thinking it knew for certain how risky certain asset classes were, for example thinking that a corporate bond from IBM was more risky than say an MBS or government debt.
Do Regulators deal with the Real Issues? - Charles again brought this question back to the desire for simplicity and clarity, something that is not found in Dodd-Frank in his view. Hugh mentioned that the USA has specific problems with simply the number of regulatory bodies, and contrasted this with the single regulator in Canada. He said he thought competition was good for businesses but bad for regulators.
Eugene and Charles put an interesting historical perspective on this question, in that it is more often the case that government and the finance industry work together in composing legislation and regulation. Eugene gave the example that in the financial crisis of the early 30s, banks that had combined both retail and investment banking operations had fared quite well. So why did Glass-Steagall come about? Apparently Congressman Steagall wanted deposit insurance to help the myriad of small banks back home, and Senator Glass simply wanted investment banks and retail banks to be separated, so a deal was done. I found this surprising (maybe I shouldn't) but Glass-Steagall is put forward as good regulation, yet it seems it was not actually addressing the observed symptoms of the crisis it was meant to deal with.
How are the regulators dealing with Money Market Funds? - Here the panel said this was a classic example of the industry fighting the SEC because the proposed regulation would reduce the return on their operations. Eugene explained how MMFs resulted from the savings and loans industry complaining about depositors investing in T-Bills. The government response was to increase the T-Bill denomination from $1,000 to $10,000 to limit who could invest, but this was then circumvented by the idea of setting up funds to invest in these larger denomination assets. Charles added that he thought the next crisis would come from the shadow banking system and that a more balanced approach needed to be taken to regulate across both systems. Hugh added that Dodd-Frank thinks it can identify systemically important institutions, and it would be his bet that the next crisis starts with an organisation that is below the radar and not on this list. The panel concluded with a brief discussion of pay and remuneration and said that this was a major problem that needed better solutions.
Posted by Brian Sentance | 3 October 2012 | 4:20 pm
New article with some of my thoughts on data models, interfaces and software upgrades has just gone up on the Waters Inside Reference Data site.
Posted by Brian Sentance | 11 September 2012 | 4:50 pm
Just a quick note to say that the video, presentations and supporting documents have now gone up for our recent Wilmott event with Numerix on OIS Curves and Libor in New York. Somewhat topical at the moment given the current bad press for Barclays.
Posted by Brian Sentance | 29 June 2012 | 2:20 pm
Some recent thoughts in Advanced Trading on turning data management on its head, and how to extend data management initiatives from the back office into both risk management and the front office.
Posted by Brian Sentance | 22 June 2012 | 2:17 pm
I attended the Financial Information Summit event on Tuesday, organized in Paris by Inside Market Data and Inside Reference Data.
Unsurprisingly, most of the topics discussed during the panels focused on reducing data costs, managing the vendor relationship strategically, LEI and building sound data management strategies.
Here is a (very) brief summary of the key points touched upon, which generated good debate from both panellists and audience:
Lowering data costs and cost containment panels
- Make end-users aware of how much they pay for that data so that they will have a different perspective when deciding if the data is really needed or a "nice to have"
- Build a strong relationship with the data vendor: you work for the same aim and share the same industry issues
- Evaluate niche data providers who are often more flexible and willing to assist while still providing high quality data
- Strategic vendor management is needed within financial institutions: this should be an on-going process aimed at improving contract management for data licenses
- A centralized data management strategy and consolidation of processes and data feeds allow cost containment (something that Xenomorph have long been advocating)
- Accuracy and timeliness of data is essential: make sure your vendor understands your needs
- Negotiate redistribution costs to downstream systems
One good point was made by David Berry, IPUG-Cossiom, on the acquisition of data management software vendors by the same data providers (referring to the Markit-Cadis and PolarLake-Bloomberg deals), stating that it will be tricky to see how the two business units will be managed "separately" (if kept separated...I know what you are thinking!).
There were also interesting case studies and examples supporting the points above. Many panellists pointed out how difficult it can be to obtain high quality data from vendors and that only regulation can actually improve the standards. Despite the concerns, I must recognize that many firms are now pro-actively approaching the issue and trying to deal with the problem in a strategic manner. For example, Hand Henrik Hovmand, Market Data Manager, Danske Bank, explained how Danske Bank are in the process of adopting a strategic vendor system made up of 4 steps: assessing the vendor, classifying the vendor, deciding what to do with the vendor and creating a business plan. Vendors are classified as strategic, tactical, legacy or emerging. Based on this classification, the "bad" vendors are then evaluated to verify whether they are enhancing data quality. This vendor landscape is used both internally and externally during negotiation, and Hovmand was confident it will help Danske Bank to contain costs and get more for the same price.
I also enjoyed the panel on Building a sound management strategy where Alain Robert-Dauton, Sycomore Asset Management, was speaking. He highlighted how asset managers, in particular smaller firms, are now feeling the pressure of regulators but at the same time are less prepared to deal with compliance than larger investment banks. He recognized that asset managers need to invest in a sound risk data management strategy and supporting technology, with regulators demanding more details, reports and high quality data.
As for what was said on LEI, it seems most financial institutions are still unprepared for how it should be implemented, due to the uncertainty around it, but I refer you to an article from Nicholas Hamilton in Inside Reference Data for a clear picture of what was discussed during the panel.
Looking forward, the panellists agreed that the main challenge is, and will be, managing the increasing volume of data. Though, as Tom Dalglish affirmed, the market is still not ready for the cloud, given that not much has been done in terms of legislation. Watch out!
The full agenda of the event is available here.
Posted by Sara Verri | 14 June 2012 | 5:54 pm
Thanks to all those who came along and supported "Ping Pong 4 Public Schools" at the AYTTO fundraiser event at SPiN on Wednesday evening. Great evening with participants in the team competition from the TabbGroup, Jeffries Investment Bank, Toro Trading, MissionBig, PolarLake, AIG, Mediacs, Xenomorph and others. In fact the others included the Federal Reserve, who got ahead of the market and won the team competition...something which has to change next year! Additional thanks to SPiN NYC for hosting the event, and to Bonhams for conducting the reverse auction.
Some photographs from the event below:
Ben Nisbet of AYTTO trying to make order out of chaos at the start of the team competition...
One of the AYTTO students, glad none of us had to play her, we would have got wupped...
The TabbGroup strike a pose and look optimistic at the start of the evening...
Sidney, one of the AYTTO coaches, helping us all to keep track of the score...
This team got a lot of support from the audience, no idea why...
Posted by Brian Sentance | 8 June 2012 | 9:19 pm
Quick plug for Xenomorph's Wilmott Forum Event on OIS curves tomorrow in downtown Manhattan. The event is done in partnership with Numerix, and will be looking at the issue of OIS vs. Libor discounting from the point of view of a practitioner, a financial engineer and a systems developer. You can register for the event here, and we hope to see you at 6pm for some great talks and some drinks/socialising afterwards.
Posted by Brian Sentance | 30 May 2012 | 2:07 pm
Video interview with Paul Rowady of the Tabb Group, primarily about how data management can break out from being just a back office function and become a source of competitive advantage in both the front office and in risk management.
For those of you with a curious mind, the perseverance to watch the video until the end and possibly not such advanced years as me and Paul, the lead singer of Midnight Oil that he refers to at the close of the video is Peter Garrett, who looks like this:
Whereas I look like this:
See, completely different. Obviously Peter has a great choice in hairstyle though...
Posted by Brian Sentance | 30 May 2012 | 1:22 pm
Good Quafafew event in NYC this week, with Michael Markov of MPI on "Hedge Fund Replication: Methods, Challenges and Benefits for Investors". To cut a relatively long but enjoyable presentation short, Michael presented some interesting empirical evidence about hedge fund performance.
Firstly, he showed how many (most) hedge fund styles were able to deliver performance that had a better risk/return profile than many mainstream investment portfolios, obviously including the ubiquitous 60% in equity 40% in bonds strategy. Given this relative outperformance in terms of risk and return for many hedge fund styles, Michael put forward the idea that asset managers seeking to invest in hedge funds should take more interest in indices of hedge funds than is currently the case.
For a particular hedge fund style, an index obtaining a performance level better than 50% of the managers was actually quite good, particularly when he showed that the index's risk level was better than approximately 75% of the hedge funds within each class. Also, when you look at performance over longer time periods (rolling 3 years, say) an index outperformed many more of the funds in a particular investment style (sounds like a bit of the advantage of geometric vs. arithmetic averaging at work somewhere in this to me).
As an aside, he said that most hedge fund replication products do not mention tracking error and often instead talk about near perfect correlation with the hedge fund index being replicated. He was at pains to point out that it is possible to construct portfolios with near perfect correlation that have massive tracking errors, and so investors in these products should be aware of this marketing tactic (or failing, depending on your viewpoint).
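To make the correlation vs. tracking error point concrete, here is a minimal sketch (with hypothetical numbers, and nothing to do with MPI's actual methodology): a portfolio that is simply a leveraged multiple of an index has a correlation of one with that index, yet a huge tracking error.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical monthly returns for a hedge fund index (illustrative only)
index_returns = rng.normal(0.005, 0.02, 120)

# A "replicating" portfolio that is just 3x the index: correlation is exactly 1,
# but it tracks the index very poorly
replica_returns = 3.0 * index_returns

correlation = np.corrcoef(index_returns, replica_returns)[0, 1]
tracking_error = np.std(replica_returns - index_returns, ddof=1) * np.sqrt(12)  # annualised

print(f"correlation:    {correlation:.3f}")     # ~1.000
print(f"tracking error: {tracking_error:.1%}")  # large, despite the perfect correlation
```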
Michael showed some good examples of how his system had replicated the performance of a particular hedge fund style index, and how this broadly uncovered what kinds of investments were being made by the hedge fund industry during each time period under consideration. He is already doing some work with some regulators on this, but most interestingly he showed how he took a few hedge funds that were later found to be involved in fraudulent activity, and worked backwards to find out what his system thought were the investments being made.
He then showed how, by taking the performance of the replicated fund away from the actual hedge fund results posted, the residual performance for these fraudulent funds was very large, and he implored investors in "stellar" performing hedge funds to do this analysis and really quiz the hedge fund manager on where this massive residual performance actually comes from before deciding to invest. In summary a good talk by an interesting speaker, which surprisingly for a New York Quafafew event was not interrupted too many times by questions from the hosts.
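For what it's worth, here is a minimal sketch of the residual analysis being suggested, under the simplifying assumption that the replication is just a return series you subtract from the reported returns (hypothetical figures, not MPI's actual model):

```python
import numpy as np

def residual_performance(fund_returns, replicated_returns):
    """Cumulative return left unexplained after subtracting the replicated
    (factor-based) return series from the fund's reported returns - the residual
    an investor should ask the manager to explain."""
    residual = np.asarray(fund_returns) - np.asarray(replicated_returns)
    return np.prod(1.0 + residual) - 1.0   # compound the residual series

# Hypothetical monthly figures (illustrative only)
reported   = [0.03, 0.025, 0.04, 0.035]      # suspiciously smooth "stellar" returns
replicated = [0.01, -0.005, 0.012, 0.008]    # what the replication explains
print(f"unexplained cumulative return: {residual_performance(reported, replicated):.1%}")
```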
Posted by Brian Sentance | 10 May 2012 | 7:44 pm
"Dragon Kings" is a new term to me, and the subject on Monday evening of a presentation by Prof. Didier Sornette at an event given by PRMIA. Didier has been working on the diagnosis on financial markets bubbles, something that has been of interest to a lot of people over the past few years (see earlier post on bubble indices from RiskMinds and a follow up here).
Didier started his presentation by talking about extreme events and how many have defined different epochs in human history. He placed a worrying question mark over the European Sovereign Debt Crisis as to its place in history, and showed a pair of particularly alarming graphs of the "Perpetual Money Machine" of financial markets. One chart was a plot of savings and rate of profit for the US, EU and Japan, with profit rising and savings falling from about 1980 onwards, and a similar diverging one of consumption rising and wages falling in the US since 1980. Didier puts this down to finance allowing this increasing debt to occur and perpetuating the "virtual" growth of wealth.
Corn, Obesity and Antibiotics - He put up one fascinating slide relating to positive feedback in complex systems and effectively the law of unintended consequences. After World War II, the US Government wanted to ensure the US food supply and subsidized the production of corn. This resulted in oversupply for humans -> so the excess corn was fed to cattle -> who can't digest starch easily -> who developed e-coli infections -> which prompted the use of antibiotics in cattle -> which prompted antibiotics as growth promoters for food animals -> which resulted in cheap meat -> leading to non-sustainable meat protein consumption and under-consumption of vegetable protein. Whilst that is a lot of things to pull together, ultimately Didier suggested that the simple decision to subsidise corn had led to the current epidemic in obesity and the losing battle against bacterial infections.
Power Laws - He then touched briefly upon Power Law Distributions, which are observed in many natural phenomena (city size, earthquakes etc) and seem to explain the peaked mean and long tails of the distributions seen in finance far better than the traditional lognormal distribution of traditional economic theory. (I need to catch up on some Mandelbrot I think). He explained that whilst many observations (city size for instance) fitted a power law, there were observations that did not fit this distribution at all (in the cities example, many capital cities are much, much larger than a power law predicts). Didier then moved on to describe Black Swans, characterised as unknown, unknowable events, occurring exogenously ("wrath of god" type events) and with one unique investment strategy of going long put options.
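As a rough illustration of why the distinction matters (distribution parameters here are arbitrary, chosen only to contrast tail behaviour), a power-law tail decays polynomially while a lognormal tail decays much faster, so far out in the tail the power law assigns orders of magnitude more probability to extreme events:

```python
from scipy.stats import lognorm, pareto

ln = lognorm(s=1.0, scale=1.0)   # lognormal with sigma = 1 (illustrative)
pl = pareto(b=2.0)               # power law with tail exponent 2 (illustrative)

for x in [10, 100, 1000]:
    print(f"x={x:>5}:  lognormal tail={ln.sf(x):.2e}   power-law tail={pl.sf(x):.2e}")
# By x=1000 the power-law tail probability is several orders of magnitude larger.
```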
Didier said that Dragon-Kings were not Black Swans, but the major crises we have observed are "endogenous" (i.e. come from inside the system), do not conform to a power law distribution and:
- can be diagnosed in advance
- can be quantified
- have (some) predictability
Diagnosing Bubbles - In terms of diagnosing Dragon Kings, Didier listed the following criteria that we should be aware of (later confirmed as a very useful and practical list by one of the risk managers on the panel); a minimal sketch of two of these indicators follows the list:
- Slower recovery from perturbations
- Increasing (or decreasing) autocorrelation
- Increasing (or decreasing) cross-correlation with external driving
- Increasing variance
- Flickering and stochastic resonance
- Increased spatial coherence
- Degree of endogeneity/reflexivity
- Finite-time singularities
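Here is a minimal sketch of how two of the simpler diagnostics above (rising variance and rising autocorrelation of returns) might be tracked on a rolling window; this is only an illustration of the idea, not Sornette's actual methodology:

```python
import pandas as pd

def early_warning_indicators(prices, window=250):
    """Rolling variance and lag-1 autocorrelation of returns; rising values of
    both are (per the talk) candidate signs of an unstable, bubble-prone regime."""
    returns = pd.Series(prices).pct_change().dropna()
    rolling_var = returns.rolling(window).var()
    rolling_ac1 = returns.rolling(window).apply(lambda r: r.autocorr(lag=1), raw=False)
    return pd.DataFrame({"variance": rolling_var, "autocorr_lag1": rolling_ac1})
```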
Didier finished his talk by describing the current work that he and ETH are doing with real and ever-larger datasets to test whether bubbles can be detected before they end, and whether the prediction of the timing of their end can be improved. So in summary, Didier's work on Dragon Kings involves the behaviour of complex systems, how the major events in these systems come from inside (e.g. the flash crash), and how positive feedback and system self-configuration/organisation can produce statistical behaviour well beyond that predicted by power law distributions and certainly beyond that predicted by traditional equilibrium-based economic theory. Didier mentioned how the search for returns was producing more leverage and an ever more connected economy and financial markets system, and how this interconnectedness was unhealthy from a systemic risk point of view, particularly if overlaid by homogeneous regulation forcing everyone towards the same investment and risk management approaches (see Riskminds post for some early concerns on this and more recent ideas from Baruch College).
Panel-Debate - The panel debate following was interesting. As mentioned, one of the risk managers confirmed the above statistical behaviours as useful in predicting that the markets were unstable, and that to detect such behaviours across many markets and asset classes was an early warning sign of potential crisis that could be acted upon. I thought a good point was made about the market post crash, in that the market's behaviour has changed now that many big risk takers were eliminated in the recent crash (backtesters beware!). It seems Bloomberg are also looking at some regime switching models in this area, so worth looking out for what they are up to. Another panelist was talking about the need to link the investigations across asset class and markets, and emphasised the role of leverage in crisis events. One of the quants on the panel put forward a good analogy for "endogenous" vs. "exogenous" impacts on systems (comparing Dragon King events to Black Swans), and I paraphrase this somewhat to add some drama to the end of this post, but here goes: "when a man is pushed off a cliff then how far he falls is not determined by the size of the push, it is determined by the size of the cliff he is standing on".
Posted by Brian Sentance | 25 April 2012 | 4:10 pm
Xenomorph's analytics partner Numerix sponsored a PRMIA event at New York's Harvard Club this week on Credit Valuation Adjustment (CVA). The event also involved Microsoft, with a surprisingly relevant contribution to the evening on CVA and "Big Data" (I still don't feel comfortable losing the quotes yet, maybe soon...). Credit Valuation Adjustment seems to be the hot topic in risk management and pricing at the moment, with Numerix's competitor Quantifi having held another PRMIA event on CVA only a few months back.
The event started with an introduction to CVA from Aletta Ely of JP Morgan Chase. Aletta started by defining CVA as the market value of counterparty credit risk. I am new to CVA as a topic, and my own experience of any kind of adjustment in valuation for an instrument was back at JP Morgan in the mid-90s (those of you under 30 are allowed to start yawning at this point...). We used to maintain separate risk-free curves (what are they now?) and counterparty spread curves, which would be combined to discount the cashflows in the model.
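For anyone curious, a minimal sketch of that mid-90s style of adjustment (flat continuously-compounded rates are an assumption purely for brevity): discount the same cashflows once on the "risk-free" curve and once with a counterparty spread added, the difference being a crude instrument-by-instrument credit adjustment.

```python
import math

def pv(cashflows, rate):
    # Present value of (time in years, amount) pairs at a flat continuously-compounded rate
    return sum(cf * math.exp(-rate * t) for t, cf in cashflows)

cashflows = [(1.0, 50.0), (2.0, 50.0), (3.0, 1050.0)]  # a simple bond-like profile
risk_free, spread = 0.03, 0.015                         # illustrative levels only

adjustment = pv(cashflows, risk_free) - pv(cashflows, risk_free + spread)
print(f"credit adjustment on this instrument: {adjustment:.2f}")
```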
Whilst such an adjustment could be calibrated to come up with an adjusted valuation that would be better than having no counterparty risk modelled at all, it seems one of the key aspects of how CVA differs is that a credit valuation adjustment needs to be done in the context of the whole portfolio of exposures to the counterparty, and not in isolation instrument by instrument. The fact that a trader in equity derivatives is long exposure to a counterparty cannot be looked at in isolation from a short exposure to a portfolio of swaps with the same counterparty on the fixed income desk.
Put another way, CVA only has context if we stand to lose money if our counterparty defaults, and so an aggregated approach is needed to calculate the size of the positive exposures to the counterparty over the lifetime of the portfolio. Also, given this one-sided payoff aspect of the CVA calculation, instrument types such as vanilla interest rate swaps suddenly move from being relatively simple instruments that can be priced off a single curve to instruments that need optionality to be modelled for the purposes of CVA.
So why has CVA become such a hot topic at the banks? Prior to the 2008/2009 crisis CVA was already around (credit risk has existed for a long time I guess, regardless of whether you regulate or report on it), but given that bank credit spreads were at that time consistently low and stable, CVA had minimal effects on valuations and P&L. Obviously with the demise of Lehmans this changed, and CVA has been pushed into prominence since it has directly affected P&L in a significant manner for many institutions (for example see these FT articles on Citi and JPMorgan).
A key and I think positive point for the whole industry is that CVA requires a completely multi-asset view, and given the regulatory focus on CVA and capital adequacy, it will as a result drive banks away from a siloed approach to data and valuation management. If capital is scarcer and more costly, then banks will invest in understanding both their aggregate CVA and the incremental contribution to CVA of a new trade in the context of all exposures to the counterparty. Looking at incremental CVA, you can also see that this drives investment into real or near-realtime CVA calculation, which brings me on to the next talks of the evening, by Numerix on CVA calculation methods and a surprisingly good presentation on CVA and "Big Data" from David Cox of Microsoft.
Denny Yu of Numerix did a good job of explaining some of the methods of calculating CVA, and in addition to being cross asset and all the implications that has for having the ability to price anything, CVA is both data and computationally expensive. It requires both simulation of the scenarios for the default of counterparties through time, and the valuation of cross-asset portfolios at different points in time. Denny mentioned techniques such as American Monte Carlo to reduce the computation needed, through using the same simulation paths for both default scenarios and valuation.
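As a rough sketch of the mechanics Denny described (and emphatically not Numerix's implementation), unilateral CVA can be approximated by averaging the positive part of simulated portfolio values into an expected exposure profile, weighting by the probability of the counterparty defaulting in each interval, discounting and summing; flat hazard and discount rates are assumptions for brevity:

```python
import numpy as np

def cva_from_simulation(exposure_paths, times, hazard_rate, discount_rate, recovery=0.4):
    """exposure_paths: array of shape (n_paths, n_times) of simulated netted portfolio values."""
    expected_exposure = np.mean(np.maximum(exposure_paths, 0.0), axis=0)
    survival = np.exp(-hazard_rate * np.asarray(times))
    default_prob = np.concatenate(([1.0 - survival[0]], survival[:-1] - survival[1:]))
    discount = np.exp(-discount_rate * np.asarray(times))
    return (1.0 - recovery) * np.sum(expected_exposure * default_prob * discount)

# Toy example: a random walk stands in for full portfolio revaluation along each path
rng = np.random.default_rng(1)
times = np.linspace(0.25, 5.0, 20)
paths = np.cumsum(rng.normal(0.0, 1.0, size=(10000, 20)), axis=1)
print(f"CVA: {cva_from_simulation(paths, times, hazard_rate=0.02, discount_rate=0.03):.3f}")
```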
So on to Microsoft. I have seen some appalling presentations on "Big Data" recently, mainly from the larger software and hardware companies trying to jump on the marketing bandwagon (main marketing premise: the data problems you have are "Big"...enough said I hope). Surprisingly, David Cox of Microsoft gave a very good presentation around the computation challenges of CVA, and how technologies such as Hadoop take the computational power closer to the data that needs acting on, bringing the analytics and data together. (As an aside, his presentation was notably "Metro" GUI in style, something that seems to work well for PowerPoint where the slide is very visual and it puts more emphasis on the speaker to overlay the information). David was obviously keen to talk up some of the cloud technology that Microsoft is currently pushing, but he knew the CVA business topic well and did a good job of telling a good story around CVA, "Big Data" and Cloud technologies. Fundamentally, his pitch was for banks and other institutions to become "Analytic Enterprises" with a common, scalable and flexible infrastructure for data management and analysis.
In summary it was a great event - the Harvard Club is always worth a visit (bars and grandiose portraits as expected but also barber shop in the basement and squash courts in the loft!), the wine afterwards was tolerably good and the speakers were informative without over-selling their products or company. Quick thank you to Henry Hu of IBM for transportation on the night, and thanks also to Henry for sending through this link to a great introductory paper on CVA and credit risk from King's College London. Whilst the title of the King's paper is a bit long and scary, it takes the form of dialogue between a new employee and a CVA expert, and as such is very readable with lots of background links.
Posted by Brian Sentance | 13 April 2012 | 2:56 pm
NoSQL is an unfortunate name in my view for the loose family of non-relational database technologies associated with "Big Data". NotRelational might be a better description (catchy eh? thought not...), but either way I don't like the negatives in both of these titles, due to aesthetics and in this case because it could be taken to imply that these technologies are critical of the SQL and relational technology that we have all been using for years. For those of you who are relatively new to NoSQL (which is most of us), this link contains a great introduction. Also, if you can put up with a slightly annoying reporter, the CloudEra CEO is worth a listen to on YouTube.
In my view NoSQL databases are complementary to relational technology, and as many have said relational tech and tabular data are not going away any time soon. Ironically, some of the NoSQL technologies need more standardised query languages to gain wider acceptance, and there will be no guessing which existing query language will be used for ideas in putting these new languages together (at this point as an example I will now say SPARQL, not that should be taken to mean that I know a lot about this, but that has never stopped me before...)
Going back into the distant history of Xenomorph and our XDB database technology, when we started in 1995 the fact that we used a proprietary database technology was sometimes a mixed blessing on sales. The XDB database technology we had at the time was based around answering a specific question, which was "give me all of the history for this attribute of this instrument as quickly as possible".
The risk managers and traders loved the performance aspects of our object/time series database - I remember one client with a historical VaR calc that we got running in around 30 minutes on a laptop PC that was taking 12 hours in an RDBMS on a (then quite meaty) Sun Sparc box. It was a great example of how specific database technology designed for specific problems could offer performance that was not possible from more generic relational technology. The use of the database for these problems was never intended as a replacement for relational databases dealing with relational-type "set-based" problems though; it was complementary technology designed for very specific problem sets.
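For readers newer to this, a minimal sketch of a historical VaR calculation of the type referred to above (a single return series here, whereas the real client calculation was across a full portfolio):

```python
import numpy as np

def historical_var(returns, confidence=0.99, portfolio_value=1_000_000):
    # Loss at the chosen percentile of the empirical return distribution
    loss_percentile = np.percentile(returns, 100 * (1 - confidence))
    return -loss_percentile * portfolio_value

rng = np.random.default_rng(2)
daily_returns = rng.normal(0.0, 0.015, 2500)   # stand-in for ~10 years of daily history
print(f"99% 1-day VaR: {historical_var(daily_returns):,.0f}")
```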
The technologists were much more reserved; some were more accepting and knew of products such as FAME around then, but some were sceptical over the use of non-standard DBMS tech. Looking back, I think this attitude was in part due to a desire to build their own vector/time series store, but also because understandably (but incorrectly) they were concerned that our proprietary database would require specialist database admin skills. Not that the mainstream RDBMS systems were expensive or specialist to maintain then (Oracle DBA anyone?), but many proprietary database systems with proprietary languages can require expensive and on-going specialist consultant support even today.
The feedback from our clients and sales prospects - that our database performance was liked, but that the proprietary database admin aspects were sometimes a sales objection - caused us to take a look at hosting some of our vector database structures in Microsoft SQL Server. A long time back we had already implemented a layer within our analytics and data management system where we could replace our XDB database with other databases, most notably FAME. You can see a simple overview of the architecture in the diagram below, where other non-XDB databases (and datafeeds) can be "plugged in" to our TimeScape system without affecting the APIs or indeed the object data model being used by the client:
Data Unification Layer
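To illustrate the plug-in idea (and only to illustrate it - none of these class or method names are TimeScape's actual API), a hypothetical sketch of a stable interface with interchangeable stores behind it might look like this:

```python
from abc import ABC, abstractmethod

class TimeSeriesStore(ABC):
    @abstractmethod
    def history(self, instrument: str, attribute: str) -> list[tuple[str, float]]:
        """Return (date, value) pairs for one attribute of one instrument."""

class XdbStore(TimeSeriesStore):
    def history(self, instrument, attribute):
        ...  # proprietary vector/time-series engine

class SqlServerStore(TimeSeriesStore):
    def history(self, instrument, attribute):
        ...  # the same vector structures hosted inside a relational DBMS

def close_prices(store: TimeSeriesStore, instrument: str):
    # Client code only sees the abstract interface, so the backing store can be
    # swapped without affecting the object data model or the calling application.
    return store.history(instrument, "Close")
```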
Using this layer, we then worked with the Microsoft UK SQL team to implement/host some of our vector database structures inside of Microsoft SQL Server. As a result, we ended up with a database engine that maintained the performance aspects of our proprietary database, but offered clients a standards-based DBMS for maintaining and managing the database. This is going back a few years, but we tested this database at Microsoft with a 12TB database (since this was then the largest disk they had available), but still this contained 500 billion tick data records which even today could be considered "Big" (if indeed I fully understand "Big" these days?). So you can see some of the technical effort we put into getting non-mainstream database technology to be more acceptable to an audience adopting a "SQL is everything" mantra.
Fast forward to 2012, and the explosion of interest in "Big Data" (I guess I should drop the quotes soon?) and in NoSQL databases. It finally seems that due to the usage of these technologies on internet data problems that no relational database could address, the technology community is much more willing to accept non-RDBMS technology where the problem being addressed warrants it - I guess for me and Xenomorph it has been a long (and mostly enjoyable) journey from 1995 to 2012 and it is great to see a more open-minded approach being taken towards database technology and the recognition of the benefits of specific databases for (some) specific problems. Hopefully some good news on TimeScape and NoSQL technologies to follow in coming months - this is an exciting time to be involved in analytics and data management in financial markets and this tech couldn't come a moment too soon given the new reporting requirements being requested by regulators.
Posted by Brian Sentance | 4 April 2012 | 4:54 pm
I went along to "Demystifying Financial Services Semantics" on Tuesday, a one day conference put together by the EDMCouncil and the Object Management Group. Firstly, what are semantics? Good question, to which the general answer is that semantics are the "study of meaning". Secondly, were semantics demystified during the day? - sadly for me I would say that they weren't, but ironically I would put that down mainly to poor presentations rather than a lack of substance, but more of that later.
Quoting from Euzenat (no expert me, just search for Semantics in Wikipedia), semantics "provides the rules for interpreting the syntax which do not provide the meaning directly but constrains the possible interpretations of what is declared." John Bottega (now of BofA) gave an illustration of this in his welcoming speech at the conference by introducing himself and the day in PigLatin, where all of the information he wanted to convey was contained in what he said, but only a small minority of the audience who knew the rules of Pig Latin understood what he was saying. The rest of us were "upidstay"...
Putting this more in the context of financial markets technology and data management, the main use of semantics and semantic data models seems to be as a conceptual data modelling technique that abstracts away from any particular data model or database implementation. To humour the many disciples of the "Church of Semantics", such a conceptual data model would also be self-describing in nature, such that you would not need a separate meta data model to understand it. For example take a look at say the equity example from what Mike Aitkin and the EDM Council have put together so far with their "Semantics Repository".
Abstraction and self-description are not new techniques (OO/SOA design anyone?) but I guess even the semantic experts are not claiming that all is new with semantics. So what are they saying? The main themes from the day seem to be that Semantics:
- can bridge the gaps between business understanding and technology understanding
- can reduce the innumerable transformations of data that go on within large organisations
- is scaleable and adaptable to change and new business requirements
- facilitates greater and more granular analysis of data
- reduces the cost of data management
- enables more efficient business processes
Certainly the issue of business and technology not understanding each other (enough) has been a constant theme of most of my time working in financial services (and indeed is one of the gaps we bridge here at Xenomorph). For example, one project I heard of a few years back was where an IT department had just delivered a tick database project, only for the business users to find that it did not cope with stock splits and for their purposes was unusable for data analysis. The business people had assumed that IT would know about the need for stock split adjustments, and as such had never felt the need to explicitly specify the requirement. The IT people obviously did not know the business domain well enough to catch this lack of specification.
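For anyone wondering what the missing requirement amounts to, a minimal sketch of a back-adjustment for stock splits (a hypothetical helper, nothing to do with the project mentioned):

```python
import pandas as pd

def split_adjust(prices: pd.Series, splits: dict) -> pd.Series:
    """Back-adjust a price history so pre-split and post-split prices are comparable.
    splits maps ex-date -> ratio (e.g. 2.0 for a 2-for-1 split)."""
    adjusted = prices.astype(float).copy()
    for ex_date, ratio in splits.items():
        adjusted.loc[adjusted.index < ex_date] /= ratio
    return adjusted

prices = pd.Series([100.0, 102.0, 51.5, 52.0],
                   index=pd.to_datetime(["2012-01-03", "2012-01-04", "2012-01-05", "2012-01-06"]))
print(split_adjust(prices, {pd.Timestamp("2012-01-05"): 2.0}))
```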
I think there is a need to involve business people in the design of systems, particularly at the data level (whilst not quite a "semantic" data model, the data model in TimeScape presents business objects and business data types to the end user, so both business people and technologists can use it without being shown any detail of an underlying table or physical data structure). You can see a lot of this around with the likes of CADIS pushing its "you don't need a fixed data model" ETL/no datawarehouse type approach against the more rigid (and to some, more complete) data models/datawarehouses of the likes of Asset Control and GoldenSource. You also get the likes of Polarlake pushing its own semantic web and big data approach to data management as a next stage on from relational data models (however I get a bit worried when "semantic web" and "big data" are used together, sounds like we are heading into marketing hype overdrive, warp factor 11...)
So if Semantics is to become prevalent and deliver some of these benefits in bringing greater understanding between business staff and technologists, the first thing that has to be addressed is that Semantics is a techy topic at the moment, one which would cause drooping eyelids on even the most technically enthused members of the business. Ontology, OWL, RDF, CLIF are all great if you are already in the know, but guaranteed to turn a non-technical audience off if trying to understand (demystify?) Semantics in financial markets technology.
Looking at the business benefits, many of the presenters (particularly vendors) put forward slides where "BAM! Look at what semantics delivered here!" was the mantra, whereas I was left with a huge gap in seeing how what they had explained had actually translated into the benefits they were shouting about. There needed to be a much more practical focus to these presentations, rather than semantic "magic" delivering a 50% reduction in cost with no supporting detail of just how this was achieved. Some of the "magic" seemed to be that there was no unravelling of any relational data model to effect new attributes and meanings in the semantic model, but I would suggest that abstracting away from relational representation has always been a good thing if you want to avoid collapsing under the weight of database upgrades, so nothing too new there I would suggest but maybe a new approach for some.
So in summary I was a little disappointed by the day, especially given the "Demystifying" title, although there were a few highlights with Mike Bennett's talk on FIBO (Financial Instruments Business Ontology) being interesting (sorry to use the "O" word). The discussion of the XBRL success story was also good, especially how regulators mandating this standard had enforced its adoption, but from its adoption many end consumers were now doing more with the data, enhancing its adoption further. In fact the XBRL story seemed to be model for regulators could improve the world of data in financial markets, through the provision and enforcement of the data semantics to be used with each new reporting requirement as they are mandated. In summary, a mixed day and one in which I learned that the technical fog that surrounds semantics in financial markets technology is only just beginning to clear.
Posted by Brian Sentance | 15 March 2012 | 2:58 pm
Emanuel Derman gave the last presentation of the day on mathematical models and their role in financial markets. His presentation seemed to build on some of his earlier ideas with Paul Wilmott on the "Modeller's Manifesto".
Emanuel said that the idea that there was a "scandal based on models" is wrong; models did (and do) have their faults but they were not a root cause of the crisis. He started his presentation (somewhat "tongue in cheek") by putting forward a "Theory of Deliciousness" to see how one might arrive at the value of something being more or less delicious. This involved discussion of "realised deliciousness" and "expected or implied deliciousness", plus definitions around equally (relatively) delicious things and absolute deliciousness. See the post on FT Alphaville for more background, but fundamentally by analogy Emanuel was putting across that there is no "fundamental theory of finance" and that finance is not physics.
He said that economists do not know the difference between theorems and laws. He seemed to be critical of some recent work from Andrew Lo (see recent post) on putting together a "Complete Theory of Human Behaviour" for once again attempting to codify something that is uncodifiable.
Emanuel described how economists should be more aware of what is and isn't a:
- Metaphor - using something physical/tangible to represent a less tangible concept or idea. See this link for his interesting example on sleep/life and debt interest
- Model - extending the behaviour of one thing to another. A model aircraft is a very useful model of a full-size aircraft with known inputs and useful outputs of interest. We can try to model the weather, where the inputs are known (temperature, wind etc) but the model is hard to define. In finance it is hard to really see both what the inputs are and what the outputs are.
- Theory - the ultimate non-metaphor. Here he gave the example of Moses asking the burning bush who shall I say sent me to which God replies "I am what I am". Put another way, you can't ask why on a theory, it just is.
- Intuition - a premise put forward based neither on logical progression nor on experimentation.
Emanuel said that in Finance there is no absolute value theory, and the majority of models are relative value in nature. From a common sense point of view, the world is not a model. Things change dynamically and in this way effectively all models are wrong to some degree. In summary all financial models are short volatility.
He ended his presentation by saying that nature cares more about principles than regulations (prescriptive regulators beware I guess). His parting quote was by Edward Lucas who said "If you believe that capitalism is a system in which money matters more than freedom, you are doomed when people who don’t believe in freedom attack using money."
- Bruno Dupire of Bloomberg said that it was important that a financial product was aligned with the needs of the customer, and cited certain complex products (with triggers) as being more in the interests of the vendor not the customer.
- Bruno also said that the hedgeability of a product was also key to a more stable financial system (presumably pointing at products like CDO^3 etc). He said that residual risk (that left after hedging with simpler products) should be measured and costed for. Bruno also mentioned the problems with assessing long-term volatility, where traders will try to set this input to whatever best suits their own P&L.
- Leo Tilman said that risk management needs to be a decision-support discipline and not a policing function. He later suggested that risk managers should have to work as consultants for a while to understand that they get paid for serving the needs of the customer, not just stopping all activity/risks (in fairness to risk managers, I guess they might ask who is my customer? the trader? the CEO? the firm?).
- Dilip Madan added to the models debate by saying "what is not in the assumptions will not show up in the conclusions".
- Emanuel likes the old GS partner model for banking, and mentioned the example of Brazilian banks where banks/banking staff(?) did not enjoy limited liability. Dilip said he understood the advantage of this but no limited liability would stifle entrepreneurship.
- Leon Tatevossian said that post-crisis the relationship between risk managers and traders is better than before, and that there was also greater co-operation between empiricists and modelers. Leo added that risk managers and traders need to speak the same language and understand what each other means by "risk".
- Bruno said that models were much less of a problem than leverage.
- All seemed to agree that the tools were not invalidated by the crisis, but the framework in which they are used was the important thing.
Posted by Brian Sentance | 11 February 2012 | 8:09 pm
Roberta Romano gave her presentation in the second session of the morning, putting forward her ideas that what was needed was greater regulatory dis-harmony rather than world-wide harmonisation. Fundamentally she argued that this diversity of approaches in different regulatory regimes would minimise the impact of regulatory error (since it would confine the error to less of the system) and it would provide a test bed for ideas so that it could be seen what regulations work and what do not.
Certainly there is some basis for this idea from others in the industry (see post on Pierre Guilleman's concerns on the impact of Solvency II) and I first heard the idea of diversity in financial services put forward by Avinish Persaud at Riskminds a few years back (see post).
Roberta spent a good amount of the presentation putting forward how the process of putting this diverse regulation in place would work, with individual regimes applying to the Basel Committee putting forward why they wanted to deviate from Basel III and justify how such a desired deviation would not increase systemic risk. The Basel Committee would then have a short time frame for approval (say 3 months) and the burden of proof would be placed on the Committee to show that the deviation was a detrimental one. She also described how some of the home-host regulatory conflicts would be dealt with under her proposed process.
I thought that the overall aims of her proposal were sound (diversity leading to a more robust financial system) but the implementation process would be difficult I would suggest, and very open to regulatory arbitrage (both by banks and by countries seeking to boost their own economies). Roberta did touch on this, but my biggest criticism was that if one of the benefits was that for a while such a diverse system would demonstrate which regulations work and which do not, then logically everyone would eventually converge on the regulations that work, re-harmonising regulations and reducing diversity. This convergence would then introduce its own (potentially new?) risks and you would be back to where you started.
A few points from the panel debate following the presentation:
- There was more criticism of how Basel regulations were gamed by the banks, particularly in relation to optimising Risk Weighted Assets
- One member of the panel pointed out that non-Basel US banks fared better in the crisis than those subject to Basel
- Rodgin Cohen suggested that RWA should receive more focus rather than the level of the capital charge (echoing the previous panel session).
- Rodgin was highly critical of the cutbacks in funding for regulators in the US
- Rodgin also said that London owes its standing as the leading world financial centre to the US Congress (referring to the Eurobond market and Sarbanes-Oxley)
- Regulators should never forget that the "Law of Unintended Consequences" rules
Posted by Brian Sentance | 11 February 2012 | 6:00 pm
Baruch College hosted the Capco-sponsored "Institute Paper Series in Applied Finance" on Thursday. I assume this is a further follow-up event to the one they did at NYU Poly last year (see some notes here). I have put some notes together below, my apologies in advance to the speakers for any inaccuracies or omissions in putting my thoughts together:
Systemic Risk Presentation
First part of the day started with a presentation by Viral V. Acharya of Stern on systemic risk. I have always found systemic risk an interesting topic, given the puzzle of how you dis-incentivise an organisation from increasing risks in the wider financial system when the organisation itself will not directly (or wholly) face the consequences of this "external" risk increase.
Viral started his presentation with some great jokey graphics, one of the HQ of a bank going up in flames with firemen hosing the flames with banknotes not water. He mentioned the definition of systemic risk given by Daniel Tarullo, Governor of the Federal Reserve (I couldn't find the definition, but primer paper here). He asked how Lehman was allowed to fail when the likes of Fannie Mae, Freddie Mac, AIG, Merrills, CitiGroup, Morgan Stanley, Goldman Sachs, Washington Mutual and Wachovia were not, and were offered assistance in one way or another. He said there was not enough capital in the system to stop Lehmans' failure but that he saw Lehmans as the catalyst for the recapitalisation of the American banking system, not the cause. He later implied that Europe had so far lacked such a catalyst for action in the European banking system.
Viral said that he wanted to put forward an ex-ante regulation that would force a bank to retain additional capital to account for the systemic risk it produced. He said that the banking system was obviously much safer than it had been a few years back, but suggested that whilst the system could now withstand say the failure of a large organisation such as Citigroup, in his opinion it would struggle to survive the failure of Citigroup and a Euro default happening at the same time. Viral said that the current Dodd-Frank regulation on systemic risk was not a healthy one, in that if a large institution fails, banks with capitalisation of over $50B are jointly taxed to assist in dealing with the consequences of the failure. Viral viewed this as a big dis-incentive for a healthy bank (say a JPM) to step in and purchase the failing institution before the failure, as JPM would know that it would be taxed anyway on the bailout.
In Viral's model, he defined a crisis as a 40% market correction, and assumed that non-equity liabilities are repaid at face value in such a crisis. Given there is not much real data around for a 40% correction, he used data obtained from the 2% correction events observed, then extrapolated from the 2% to the 40% level. He said that the question that needed to be asked was whether in such a crisis scenario a bank like JPM would retain 8% capital. He emphasised that the level of capital chosen was somewhat arbitrary; rather more important were the assumptions in the model of crisis, since the capital models used in regulation today are based on average losses not crisis-level losses. Using this and related models, Viral showed that the banks exhibiting the most systemic risk were Bank of America, JPM and Citigroup (for more background and a complete list see Stern's V-Lab).
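For a flavour of the kind of calculation behind such rankings, here is a very rough sketch loosely in the spirit of the capital shortfall measures behind Stern's V-Lab (illustrative only, not the published formula or anyone's actual data): the capital required against crisis-shrunk assets minus the equity expected to survive a crisis-level drawdown.

```python
def expected_capital_shortfall(equity, debt, lrmes, k=0.08):
    """equity, debt: current market equity and book debt; lrmes: assumed fractional
    equity loss in a crisis (e.g. 0.6 for a 60% fall); k: prudential capital ratio."""
    crisis_equity = (1.0 - lrmes) * equity
    return k * (debt + crisis_equity) - crisis_equity

# Hypothetical balance sheet figures in $bn (not data for any actual bank)
print(f"expected shortfall in a crisis: {expected_capital_shortfall(150, 2000, 0.6):.1f} bn")
```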
Viral said the restructuring of Dexia (exposed heavily to peripheral sovereign debt) was the "Bear Stearns of Europe" (Bear Stearns having been exposed heavily to MBS), but that its restructuring was not large enough to cause a more widespread re-capitalisation of the European banking system. Dexia was ranked as one of the safest banks in the Europe-wide stress tests of 2011, given that the Basel risk weightings did not apply any haircut to European sovereign debt. This was another criticism that Viral levelled at Basel, in that the risk weightings are static and do not reflect changes in market conditions.
Viral then joined a panel debate on systemic risk chaired by Linda Allen of Baruch, joined by Jan Cave of FDIC, Sean Culbert of Capco, Gary Gluck of Credit Suisse and Craig Lewis of the SEC. I have tried to bring out some of the main themes/points of the discussions below:
- The Balance Between Risk to the System and Risk to the Economy
There was a lot of debate on the secondary effects of regulating systemic risk and increasing capital charges on banks, and its wider effect on the general economy. Craig put forward the argument that too high capital requirements would stifle lending and in turn stifle the wider economy (arguably the "bigger" systemic risk maybe?). He argued for a balance to be found and that the aim should not be to eliminate risk in the system completely. I guess Craig was taking the banker's view, but the rest of the panel seemed to agree that the point was a valid one.
- Basel III
All agreed that Basel III was an improvement but there was still much more to be done. Gary was critical of the Basel III calculation remaining too static, but Jason described how Basel III had removed many debt-like assets from the capital calculation, which was good. Jason also described how Basel I had been a simple framework (and good for that) but was tinkered with, with VaR encouraging assets to be moved to the trading book to reduce capital charges. Basel II then introduced the Internal Model approach and over ten years capital requirements continued to be lowered, with CDOs attracting a 56bp capital charge during this time, down from 8%. Enforcement of Basel III on both liquidity risk and capital was considered key for coming years.
- Liquidity Risk
There was general consensus that pre-2007 liquidity risk was not talked about enough and there were no standard ways of calculating its level. Jason said that pre-2007 the regulators had not modelled what happens when the counterparties start running. Gary said that he questioned whether some of the current calibrations of liquidity risk were correct.
Sean raised the point that Volcker was likely to impact market-makers and hence impact liquidity (see earlier post on this).
Sean also mentioned that Rehypothecation of Assets has not been debated enough and had only received scant attention in Dodd-Frank (maybe see recent article on Thomson-Reuters on MF Global)
- Europe (and more Basel)
General consensus that Basel III capital requirements will constrain GDP growth in Europe. Viral seemed to have the strongest views here, saying that Europe needed a bank recapitalisation program just as the US had gone through, and that such a program would be a big boost to economic confidence. Viral remains deeply sceptical on the success of Basel III - for example all of the 2007 failures were supposedly from well capitalised institutions under Basel I and II. Viral says that the problem is not the level of capital (8% or 12% etc) but the method of modelling the shock. A good point from Gary I thought was his premise that politics in relation to sovereign debt was playing its part in undermining the calculations and approach of Basel III.
- Too Big to Fail?
One audience question was "is too big to fail simply too big?" and should the largest organisations be broken up into more manageable parts. Viral answered that he was not in favour of a size constraint and cited that some large institutions, notably JPM, Rabobank and HSBC, had been relatively robust and successful during the recent crisis. He did however qualify this response by saying that he was in favour of a size constraint if the large size reached was due to implicit banking guarantees from the government, and that he would like failing large banks to be broken up into smaller pieces.
Posted by Brian Sentance | 11 February 2012 | 5:18 pm
I attended the Challenges and Innovations in Operational Risk Management event last night, which was surprisingly interesting. I say surprising since I must admit to some prejudice against learning about operational risk, which has for me the unfortunate historical reputation of being on the dull side.
Definition of Operational Risk
Michael Duffy (IBM GRC Strategy Leader, Ex-CEO of OpenPages) was asked by the moderator to define Operational Risk. Michael answered that he assumed that most folks attending already knew the definition (fair comment, the auditorium was full of risk managers...), but he sees it in practice as the definition of policy, the controls to enforce the policies and ongoing monitoring of the performance of the controls. Michael suggested that many were looking to move the scope and remit of Operational Risk into business performance improvement, but clients are not there yet on this more advanced aspect.
Vick Panwar (Financial Services Industry Lead, SAS) added that Operational Risk was there to mitigate the risks for those unexpected future events (getting into the territory of Dick Cheney's Unknown Unknowns which I never tire of, particularly after a glass of wine).
Rajeev Lakra (Director Operational Risk Management, GE Treasury) took his definition from Basel II of Operational Risk as risk of loss resulting from inadequate or failed internal processes, people and systems, or from external events. Coming from GE, he said that he thought of best practice Operational Risk as similar to another GE initiative in the use of Six Sigma for improving process management. Raj said that his operational risks were mainly concerned with trade execution so covering data quality/errors, human error and settlement errors.
Beyond Box Ticking for Operational Risk
Raj said that Operational Risk is treated seriously at GE with the Head of Operational Risk reporting into the CRO and leaders of Operational Risk in each business division.
Michael suggested that the "regulators force us to do it" motive for Operational Risk had reduced given some of the operational failures during the financial crisis and recent "rogue trader" events, with the majority of institutions post-2008 having created risk committees at the "C" level and being much more aware of tail events and the reputational damage that can destroy shareholder value.
Vik said that Operational Risk is concerned primarily with "tail events" which by definition are not limited in size and therefore should be treated seriously. Pragmatically, he suggested that "the regulators need it" should be used as an excuse if there was no other way to get people to pay attention, but getting them to understand the importance of it was far more powerful.
The "What's in it for you" Approach to Operational Risk
Raj emphasised that it was possible to show the benefits of operational risk to people in their everyday jobs, explaining to operators/managers that if they get frustrated with failures/problems in the working day, then wouldn't it be great if these problems/losses were recorded so that they could justify a process change to senior management. He emphasised that this was a big cultural challenge at GE.
Michael suggested that his clients in financial markets had gone through risk assessment, controls and recording of losses, but had not yet progressed to the use of Operational Risk to improve business performance.
Duplication of Effort
A key thing that all the panelists discussed was the overlap at many organisations between Operational Risk, Audit and Compliance. They said that the testing of the controls used for each had much overlap, but was not based on a common nomenclature nor on common systems. For instance Vik pointed out that many of the tests on controls in Sarbanes-Oxley compliance were re-usable in an Operational Risk context, but that this was not yet happening. Vik said that this pointed to the need for a comprehensive GRC platform rather than many siloed platforms.
Michael said that regulators want an integrated view, but no institution has an integrated nomenclature as yet. He recounted that one client sent 12 different control tests to branches that needed to be filled in for head office, which was a waste of resources and confusing/demotivating for staff. Raj said that the integration of Audit and Operational Risk at GE had proved to be a very difficult process. All agreed that senior management need to get involved and that a 5 year vision of how things should be incrementally integrated needs to be put in place.
Is business process risk different to business product risk? Michael said that Operational Risk certainly does and should cover both internal process and also the risks produced by the introduction of a new financial product for instance (is it well understood for instance, do clients understand what they are being sold?). He added that Operational Risk encompassed both the quantitative (statistical number of failures for instance) and the qualitative for which statistics were either not available (or not relevant to the risk).
Are there any surrogate measures for Operational Risk? Here a member of the audience was relaying senior management comments and frustration over the stereotyped red/amber/green traffic lights approach to reporting on operational risk. Michael mentioned the Operational Riskdata eXchange Association (ORX) where a number of financial institutions anonymously share operational risk loss data with a view to using this data to build better models and measures of operational risk. Apparently this has been going on since 2003 and the participants already have a shared taxonomy for Operational Risk. (My only comment on having a single measure for "operational riskiness" is: do you really want a "single number" approach to make things simple for C-level managers to understand, or should the C-levels be willing to understand more of the detail behind the number?)
Is "Rogue Trading" Operational Risk? Michael said that it definitely was, and that obviously each institution must control and monitor its trading policies to ensure they were being followed. The panel proposed that Operational Risk applied to trading activity could be a good application of "Big Data" (much hyped by industry journalists lately) to understand typical trading patterns and understand unusual trading patterns and behaviours. (Outside of bulk tick-data analysis this is one of the first sensible applications of Big Data so far that I have heard suggested so far given how much journalists seem to be in love with the "bigness" of it all without any business context to why you actually would invest in it...sorry, mini-rant there for a moment...)
Good event with an interesting panel; the GE speaker had lots of practical insight and the vendor speakers were knowledgeable without toeing the marketing line too much. Operational Risk seems to be growing up in its linkage into and across market, credit and liquidity risk. The panel agreed however that it was very early days for the discipline and a lot more needs to be done.
Given the role of human behaviour in all aspects of the recent financial crisis, in my view Operational Risk has a lot to offer but also a lot to learn, not least in that I think it should market itself more aggressively along the lines of being the field of risk management that encompasses the study and understanding of human behaviour. Maybe there is a new career path looming for anthropologists in financial risk management...
Posted by Brian Sentance | 27 January 2012 | 11:30 pm
One of the PRMIA folks in New York kindly recommended this paper on the Volcker Rule, in which Darrell Duffie criticises this proposed new US regulation designed to drastically reduce proprietary ("own account") trading at banks.
As with all complex systems like financial markets, the more prescriptive the regulations become the harder it is to "lock down" the principles that were originally intended. In this case the rules (due July 2012) make an exception to the proprietary trading ban where the bank is involved in "market-making", but Darrell suggests that the basis for deciding which trades are "market-making" and which are pure "proprietary trading" is problematic, as there will always be trades that are part of the "market-making" process (i.e. providing immediacy of execution to customers) but are not directly and immediately associated with actual customer trading requests.
He suggests that the consequences of the Volcker Rule as it is currently drafted will be higher bid-offer spreads, higher financing costs and reduced liquidity in the short-term, and a movement of liquidity to unregulated entities in the medium term possibly further increasing systemic risk rather than reducing it. Seems like another example of "one man's trade is another man's hedge" combined with "the law of unintended consequences". The latter law doesn't give me a lot of confidence about the Dodd-Frank regulations (of which the Volcker Rule forms part), 2319 pages of regulation probably have a lot more unintended consequences to come.
Posted by Brian Sentance | 20 January 2012 | 3:47 pm
I spotted this in the FT recently - for those of you diligent enough to want to read more about the possible causes and possible solutions to the (ongoing) financial crisis, then Andrew Lo may have saved us all a lot of time in his 21-book review of the financial crisis. Andrew reviews 10 books by academics, 10 by journalists and one by former Treasury Secretary Henry Paulson.
Andrew finds a wide range of opinions on the causes of and solutions to the crisis, which I guess in part reflects that regardless of the economic/technical causes, human nature is both at the heart of the crisis and evidently also at the heart of its analysis. He regards the differences in opinion as quite healthy, in that they will be a catalyst for more research and investigation. I also like the way Andrew starts his review with a description of how people's views of the same events they have lived through can be entirely different, something that I have always found interesting (and difficult!).
A quote from Napoleon (that I am in danger of over-using) seems appropriate to Andrew's review: "History is the version of past events that people have decided to agree upon", but maybe Churchill wins in this context with: "History will be kind to me for I intend to write it." Maybe we should all get writing now before it is too late...
Posted by Brian Sentance | 18 January 2012 | 11:17 pm
For someone who has been criticised a lot over recent years, Vikram Pandit, CEO of Citigroup, seems to have come up with an interesting risk management idea in his latest article in the FT. Vikram proposes that regulators put together a standard, multi-asset "benchmark" portfolio on which all financial institutions would have to provide risk numbers, enabling regulators to understand more about the risk management capabilities of each institution whilst avoiding any detailed disclosure of the portfolio actually held by each firm.
I guess a key thing would be that such numbers would have to be disclosed to the regulator away from public view, since we all know that otherwise the numbers would converge and all the banks would be doing the same thing (or at least copying each other's numbers?). Reminds me of a great talk at the RiskMinds event a few years back, praising diversity of approach and criticising regulators for effectively forcing everyone to do the same thing.
Posted by Brian Sentance | 12 January 2012 | 2:34 pm
I attended the PRMIA event last night "Risk Year in Review" at Moody's New York offices. It was a good event, but by far the most interesting topic of the evening for me was from Samuel Won, who gave a talk about some of the best and most innovative risk management techniques being used in the market today. Sam said that he was inspired to do this after reading the book "The Information" by James Gleick about the history of information and its current exponential growth. Below are some of the notes I took on Sam's talk; please accept my apologies in advance for any errors but hopefully the main themes are accurate.
Early '80s ALM - Sam gave some context to risk management as a profession through his own personal experiences. He started work in the early 80's at a supra-regional bank, managing interest rate risk on a long portfolio of mortgages. These were the days before the role of "risk manager" was formally defined, and really revolved around Asset and Liability Management (ALM).
Savings and Loans Crisis - Sam then changed roles and had some first hand experience in sorting out the Savings and Loans crisis of the mid '80s. In this role he became more experienced with products such as mortgage backed securities, and more familiar with some of the more data intensive processes needed to manage such products in order to account for factors such as prepayment risk, convexity and cashflow mapping.
The Front Office of the '90s - In the '90s he worked in the front office at a couple of tier one investment banks, where the role was more of optimal allocation of available balance sheet rather than "risk management" in the traditional sense. In order to do this better, Sam approached the head of trading for budget to improve and systemise this balance sheet allocation but was questioned as to why he needed budget when the central Risk Control department had a large staff and large budget already.
Eventually, he successfully argued the case that Risk Control were involved in risk measurement and control, whereas what he wanted to implement was active decision support to improve P&L and reduce risk. He was given a total budget of just $5M (small for a big bank) and told to get on with it. These two themes - implementing active decision support (not just risk measurement) and having a profit motive drive better risk management - ran through the rest of his talk.
A Datawarehouse for End-Users Too - With a small team and a small budget, Sam made use of postgraduate students to leverage what his team could develop. They had seen that (at the time) getting systems talking to each other was costly and unproductive, and decided as a result to implement a datawarehouse for the front office, implementing data normalisation and data scrubbing, with a data dashboard over the top that was easy enough for business users to do their own data mining. Sam made the point that usability was key in allowing the business people to extract full value from the solution.
Sam said that the techniques used by his team and the developers were not necessarily that new; things like regression and correlation analysis were used at first. These were used to establish the key variables/factors, with a view to establishing key risk and investment triggers in as near to real-time as possible. The expense of all of this development work was justified through its effects on P&L, which given its success resulted in more funding from the business.
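For what it is worth, a minimal sketch of that style of analysis (made-up factors and P&L, not Sam's actual implementation) might look something like this:

```python
import numpy as np

# Hypothetical desk P&L regressed on candidate factors to find the key drivers,
# with a rolling correlation then watched against a trigger threshold.
rng = np.random.default_rng(1)
factors = rng.normal(size=(500, 3))                     # e.g. rates, credit spreads, FX moves
pnl = factors @ np.array([0.8, -0.3, 0.05]) + rng.normal(scale=0.2, size=500)

betas, *_ = np.linalg.lstsq(factors, pnl, rcond=None)   # which factors actually drive P&L?
print("estimated factor sensitivities:", np.round(betas, 2))

window, trigger = 60, 0.6                               # near real-time monitoring of factor 1
rolling_corr = np.corrcoef(factors[-window:, 0], pnl[-window:])[0, 1]
if abs(rolling_corr) > trigger:
    print(f"risk trigger hit: 60-day correlation to factor 1 = {rolling_corr:.2f}")
```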
Poor Sell-Side Risk Innovation - Sam has seen the most innovative risk techniques being used on the buy-side and was disappointed by the lack of innovation in risk management at the banks. He listed the following sell-side problems for risk innovation:
- politically driven requirements, not economically driven
- arbitrary increases in capital levels required is not a rigorous approach
- no need for decision analysis with risk processes
- just passing a test mentality
- just do the marginal work needed to meet the new rules
- no P&L justification driving risk management
Features of Innovative Approaches - Sam said that he had noted a few key features of some of the initiatives he admired at some of the asset managers:
- Based on a sophisticated data warehouse (not usually Oracle or Sybase, but Microsoft and other databases - maybe driven by ease of use or cost?)
- Traders/Portfolio Managers are the people using the system and implementing it, not the technical staff.
- Dedicated teams within the trading division to support this, so not relying on central data team.
A Forward-Looking Risk Model Example - The typical output from such decision analysis systems he found was in the form of scenarios for users to consider. A specific example was a portfolio manager involved in event-driven long-short equity strategies around mergers and acquisitions. The manager is interested in the risk that a particular deal breaks, and in this case techniques such as Value at Risk (VaR) do not work, since the arbitrage usually requires going long the company being acquired and short the acquirer (VaR would indicate little risk in this long-short case). The manager implemented a forward looking model that was based on information relevant to the deal in question plus information from similar historic deals. The probabilities used in the model were gathered from a range of sources, and techniques such as triangulation were used to verify them. Sam views forward-looking models that assist in decision support as real risk management, as opposed to the backward-looking risk measurement models implemented at banks to support regulatory reporting.
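To make the contrast concrete, here is a toy sketch (my own illustrative numbers, not the manager's model) of why a deal-break scenario says more than VaR for a hedged merger-arbitrage position:

```python
# Toy merger-arbitrage position, per $1 of exposure (all numbers hypothetical).
spread = 0.04          # remaining deal spread captured if the deal completes
loss_if_break = 0.25   # assumed mark-down on the position if the deal breaks
p_break = 0.08         # break probability, estimated from this deal plus similar historic deals

expected_pnl = (1 - p_break) * spread - p_break * loss_if_break

daily_vol = 0.003      # hedged long-short book: tiny day-to-day volatility
normal_var_99 = 2.33 * daily_vol   # parametric 99% one-day VaR suggests "little risk"

print(f"expected P&L per $1:   {expected_pnl:+.3f}")
print(f"99% one-day VaR:       {normal_var_99:.2%}")    # under 1%
print(f"deal-break scenario:  -{loss_if_break:.0%}")    # the loss that actually matters
```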
Summary - Sam was a great speaker, and for a change it was refreshing not to have presentation slides backing up what the speaker was saying. His thoughts on forward looking models being true risk management, moving away from pure risk measurement, seem to echo those of Riccardo Rebonato a few years back at RiskMinds (see post). I think his view that a P&L motivation is the only way risk management advances is correct, although I think there is a lot of risk innovation at the banks, just at the trading desk level rather than at the firm-wide level which is caught up in regulation - the trading desks know that capital is scarce and want to use it better. I think this siloed risk management flies in the face of much of the firm-wide risk management and indeed firm-wide data management talked about in the industry, and potentially shows that we still have a long way to go in getting innovation and forward looking risk management at a firm level, particularly when that level is dominated by regulatory requirements. However, having a truly integrated risk data platform is something of a hobby-horse for me; I think it is the foundation for answering all of the regulatory and risk requirements to come, whatever their form.
Finally, I could not agree more that easy analysis for end-users is a vital part of data management for risk, allowing business users to do risk management better. Too many times IT is focussed on systems that require more IT involvement, when the IT investment and focus should be on systems that enable business users (trading, risk, compliance) to do more for themselves. Data management for risk is a key area for improvement in the industry, where many risk management system vendors assume that the world of data they require is perfect. Ask any risk manager - the world of data is not perfect, and manual data validation continues to be a task that takes time away from actually doing risk management.
Posted by Brian Sentance | 14 December 2011 | 11:29 pm
My colleagues Joanna Tydeman and Matthew Skinner attended the A-Team Group's Data Management for Risk, Analytics and Valuations event today in London. Here are some of Joanna's notes from the day:
Andrew Delaney, Amir Halton (Oracle)
Drivers of the data management problem – regulation and performance.
Key challenges that are faced – the complexity of the instruments is growing, managing data across different geographies, increase in M&As because of volatile market, broader distribution of data and analytics required etc. It’s a work in progress but there is appetite for change. A lot of emphasis is now on OTC derivatives (this was echoed at a CityIQ event earlier this month as well).
Having an LEI is becoming standard, but has its problems (e.g. China has already said it wants its own LEI which defeats the object). This was picked up as one of the main topics by a number of people in discussions after the event, seeming to justify some of the journalistic over-exposure to LEI as the "silver bullet" to solve everyone's counterparty risk problems.
Expressed the need for real time data warehousing and integrated analytics (a familiar topic for Xenomorph!) – analytics now need to reflect reality and to be updated as the data is running - coined as ‘analytics at the speed of thought’ by Amir. Hadoop was mentioned quite a lot during the conference, also NoSQL which is unsurprising from Oracle given their recent move into this tech (see post - a very interesting move given Oracle's relational foundations and history)
Impact of regulations on Enterprise Data Management requirements
Virginie O’Shea, Selwyn Blair-Ford (FRS Global), Matthew Cox (BNY Mellon), Irving Henry (BBA), Chris Johnson (HSBC SS)
Discussed the new regulations, how there is now a need to change practice as regulators want to see your positions immediately. Pricing accuracy was mentioned as very important so that valuations are accurate.
Again, said how important it is to establish which areas need to be worked on and to make the changes. Firms are still working on a micro level, but need a macro level view. It was discussed that good reasons are required to persuade management to allocate a budget for infrastructure change. This takes preparation and involving the right people.
Items that panellists considered should be on the priority list for next year were:
· Reporting – needs to be reliable and meaningful
· Long term forecasts – organisations should look ahead and anticipate where future problems could crop up.
· Engage more closely with Europe (I guess we all want the sovereign crisis behind us!)
· Commitment of firm to put enough resource into data access and reporting including on an ad hoc basis (the need for ad hoc was mentioned in another session as well).
Technology challenges of building an enterprise management infrastructure
Virginie O’Shea, Colin Gibson (RBS), Sally Hinds (Reuters), Chris Thompson (Mizuho), Victoria Stahley (RBC)
Coverage and reporting were mentioned as the biggest challenges.
Front office used to be more real time, back office used to handle the reference data, now the two must meet. There is a real requirement for consistency, front office and risk need the same data so that they arrive to the same conclusions.
Money needs to be spent in the right way and firms need to build for the future. There is real pressure for cost efficiency and for doing more for less. Discussed that timelines should perhaps be longer so that a good job can be done, but there should be shorter milestones to keep the business happy.
Panellists described the next pain points/challenges that firms are likely to face as:
· Consistency of data including transaction data.
· Data coverage.
· Bringing together data silos, knowing where data is from and how to fix it.
· Getting someone to manage the project and uncover problems (which may be a bit scary, but problems are required in order to get funding).
· Don’t underestimate the challenges of using new systems.
Better business agility through data-driven analytics
Stuart Grant, Sybase
Discussed Event Stream Processing: analytics now need to be carried out whilst the data is moving, not when it is standing still. This was also mentioned during other sessions, so it seems to be a hot topic.
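A trivial sketch of what that means in practice (illustrative only): a risk number such as an exponentially-weighted volatility estimate updated as each tick arrives, rather than recalculated at end of day.

```python
import math

# Illustrative streaming analytic: an EWMA volatility estimate that moves with
# the data as each price tick arrives, rather than waiting for a batch recalc.
class StreamingVol:
    def __init__(self, lam=0.94):
        self.lam, self.var, self.last = lam, 0.0, None

    def on_tick(self, price):
        if self.last is not None:
            r = math.log(price / self.last)              # log return since the last tick
            self.var = self.lam * self.var + (1 - self.lam) * r * r
        self.last = price
        return math.sqrt(self.var)

vol = StreamingVol()
for p in [100.0, 100.2, 99.8, 101.0, 100.5]:             # stand-in for a live tick stream
    print(f"price {p:7.2f} -> running vol estimate {vol.on_tick(p):.5f}")
```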
Mentioned that the buy side’s challenge is that their core competency is not IT. Now with cloud computing they are more easily able to outsource. He mentioned that buy side shouldn’t necessarily build in order to come up with a different, original solution.
Data collection, normalisation and orchestration for risk management
Andrew Delaney, Valerie Bannert-Thurner (FTEN), Michael Coleman (Hyper Rig), David Priestley (CubeLogic), Simon Tweddle (Mizuho)
Complexity of the problem is the main hindrance. When problems are small, it is hard for them to get budget so they have to wait for problems to get big – which is obviously not the best place to start from.
There is now a change in behaviour of senior front office management – now they want reports, they want a global view. Front office do in fact care about risk because they don’t want to lose money. Now we need an open dialogue between front office and risk as to what is required.
Integrating data for high compute enterprise analytics
Andrew Delaney, Stuart Grant (Sybase), Paul Johnstone (independent), Colin Rickard (DataFlux)
The need for granularity and transparency are only just being recognised by regulators. The amount of data is an overwhelming problem for regulators, not just financial institutions.
Discussed how OTCs should be treated more like exchange-traded instruments – need to look at them as structured data.
Posted by Brian Sentance | 18 October 2011 | 12:44 am
Achieving regulatory approval can be challenging if we consider that regulators are concerned about both the risk calculation methodology in place but also the quality, consistency and auditability of the data feeding the risk systems used for regulatory reporting.
The data management project at LBBW (Landesbank Baden-Württemberg), for example, was initiated to support LBBW’s internal model for market risk calculations, combined with the additional aim of enabling risk, back office and accountancy departments to have transparent access to high quality and consistent data.
This required a consolidated approach to the management of data in order to support future business plans and successful growth and we worked with LBBW to provide a centralised analytics and data management platform which could enhance risk management, deliver validated market data based upon consistent validation processes and ensure regulatory compliance.
More information on the joint project at LBBW can be found in the case study, available on our website. Any questions, drop us a line!
Posted by Sara Verri | 22 September 2011 | 7:21 pm
Sitting by the sea, you have just finished your MATLAB reading and now are wondering what to read next?
We have just published our "TimeScape Data Unification" white paper. Not a pocket edition I am afraid, but some of you may find it interesting.
It describes how - post-crisis - a key business and technical challenge for many large financial institutions is to knit together their many disparate data sources, databases and systems into one consistent framework that can meet the ongoing demands of the business, its clients and regulators. It then analyses the approaches that financial institutions have adopted to respond to this issue, such as implementing an ETL-type infrastructure or a traditional golden copy data management solution.
Following on from their effectiveness and constraints, it then shows how companies looking to satisfy the need for business-user access to data across multiple systems should consider a "distributed golden copy" approach. This federated approach deals with disparate and distributed sources of data and should also provide easy end-user interactivity whilst maintaining data quality and auditability.
The white paper is available here if you want to take a look and if you have any feedback or questions, drop us a line!
Posted by Sara Verri | 27 July 2011 | 4:19 pm
Final presentation at the PRMIA event yesterday was by Clifford Rossi and was entitled "The Brave New World of Data & Analytics Following the Crisis: A Risk Manager's Perspective".
Clifford got his presentation going with a humorous and self-deprecating start by suggesting that his past employment history could in fact be the missing "leading indicator" for predicting organisations in crisis, having worked at CitiGroup, WaMu, Countrywide, Freddie Mac and Fannie Mae. One of the other professors present hoped that he wouldn't do the same to academia (University of Maryland beware maybe!).
Clifford said that the crisis had laid bare the inadequacy and underinvestment in data and risk technology in the financial services sector. He suggested that the OFR had the potential to be a game changer in correcting this issue and in helping the role of CRO to gain in stature.
He gave an example of a project at one of the GSEs he had worked at called "Project Enterprise", which was to replace 40 year old mainframe based systems (systems that for instance only had 3 digits to identify a transaction). He noted that this project had recently been killed, having cost around $500M. With history like this, it is not surprising that enterprise risk data warehousing capabilities were viewed as black holes without much payoff prior to the crisis. In fact it was only due to Basel that data management projects in risk received any attention from senior management in his view.
During the recent stress test process (SCAP) the regulators found just how woeful these systems were as the banks struggled to produce the scenario results in a timely manner. Clifford said that many banks struggled to produce a consistent view of risk even for one asset type, and that in many cases corporate acquisitions had exacerbated this lack of consistency in obtaining accurate, timely exposure data. He said that the mortgage processing fiasco showed the inadequacy of these types of systems (echoing something I heard at another event about mortgage tagging information being completely "free-format", without even designated fields for "City" and "State" for instance).
Data integrity was another key issue that Clifford discussed, here talking about the lack of historical performance data leading to myopia in dealing with new products, and poor definitions of product leading to risk assessments based on the originator rather than on the characteristics of the product. (Side note: I remember prior to the crisis the credit derivatives department at one UK bank requisitioning all new server hardware to price new CDO squared deals given it was supposedly so profitable; it was at that point that maybe I should have known something was brewing...) Clifford also outlined some further data challenges, such as the changing statistical relationship between Debt to Income ratio and mortgage defaults once incomes were self-declared on mortgages.
Moving on to consider analytics and models, Clifford outlined a lot of the concerns covered by the Modeller's Manifesto, such as the lack of qualitative judgement and over-reliance on the quantitative, efficiency and automation superseding risk management, limited capability to stress test on a regular basis, regime change, poor model validation, and cognitive biases reinforced by backward-looking statistical analysis. He made the additional point that in relation to the OFR, they should concentrate on getting good data in place before spending resource on building models.
In terms of focus going forward, Clifford said that liquidity, counterparty and credit risk management were not well understood. Possibly echoing Riccardo Rebonato's ideas, he suggested that leading indicators need to be integrated into risk modelling to provide the early warning systems we need. He advocated that there was more to do on integrating risk views across lines of business, across counterparties and between the banking and trading books.
Whilst being a proponent of the OFR's potential to mandate better analytics and data management, he warned (sensibly in my view) that we should not think that the solution to future crises is simply to set up a massive data collection and modelling entity (see earlier post on the proposed ECB data utility).
Clifford thinks that Dodd-Frank has the potential to do for the CRO role what Sarbanes-Oxley did in elevating the CFO role. He wants risk managers to take the opportunity presented in this post-crisis period to lead the way in promoting good judgement based on sound management of data and analytics. He warned that senior management buy-in to risk management was essential and could be forced through by regulatory edict.
This last and closing point is where I think the role of risk management (as opposed to risk reporting) faces its biggest challenge: how can a risk manager be supported in preventing a senior business manager from pursuing an overly risky new business opportunity based on what "might" happen in the future? We human beings do not think about uncertainty very clearly, and the lack of a resulting negative outcome will be seen by many to invalidate the concerns put forward before the decision was made. Risk management will become known as the "business prevention" department and not be regarded as the key role it should be.
Posted by Brian Sentance | 24 June 2011 | 4:26 pm
Lewis Alexander (ex-US Treasury) carried on the theme of systemic risk at the PRMIA seminar "Risk, Regulation and Financial Technology & Identifying the Next Crisis". He started by saying that whilst systemic risk was a risk to the economy and industry as a whole, systemic risk was also relevant to the risks (such as market or credit) that a risk manager at an individual institution needs to assess.
Lewis said that there had really only been three systemic crises over the past century or so (1907, 1933 and 2008), with obviously many more disruptions in markets that should not be described as systemic. This is one problem of assessing systemic risk: crises are rare events, so there is little data to analyse. He also warned that the way the system responds to small shocks should not be taken as a proxy for how it responds to large ones, that the relationship between asset prices and systemic risk is a complex one, and that reporting (mainly accounting but also in risk) had not kept up with financial markets innovation.
Lewis said that "stress test" methods can help to identify vunerable institutions but that this method of looking at systemic risk does not deal with the propogation of risk from one institution to another. He said that network analysis can help to assess propogation but the weakness with these methods was the lack of counterparty data. Liquidity methods also suffer from a lack of data. He said that "Leading Indicators" (see past post on Bubble Indices) tell us little of what creates systemic risk.
He mentioned the use of CoVaR (based on VaR) for systemic risk, using CDS pricing to theoretically "insure" the industry against crisis and a "Merton Model" approach to estimate potential losses due to default for a group of banks. He said that all of these models were good comparators, but not good as indicators.
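As a rough illustration of the CoVaR idea (a simplified empirical version of my own, not anything Lewis presented), one can compare the tail of system returns on days when a given firm is itself in its tail against the same tail on calmer days:

```python
import numpy as np

# Simplified empirical "delta CoVaR": the system's tail quantile conditional on
# the firm being at or below its own VaR, minus the same quantile on calmer days.
def delta_covar(system_ret, firm_ret, q=0.05):
    firm_var = np.quantile(firm_ret, q)
    stressed = system_ret[firm_ret <= firm_var]          # days the firm is in distress
    calm = system_ret[firm_ret > np.median(firm_ret)]    # crude proxy for its "normal" state
    return np.quantile(stressed, q) - np.quantile(calm, q)

# Hypothetical fat-tailed daily returns, purely to show the call.
rng = np.random.default_rng(0)
firm = 0.02 * rng.standard_t(4, size=2500)
system = 0.6 * firm + 0.01 * rng.standard_t(4, size=2500)
print(f"delta CoVaR at the 5% level: {delta_covar(system, firm):.4f}")
```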
Given the previous talk on systemic risk, Lewis switched his focus to what can be done, with the main focus for him being data, where we need:
- Robust data on both asset and counterparty exposures
- Data on leverage through the system
- Data on the depth of liquidity to assess the vulnerability of assets to fire sales
A final few points from his talk:
- Dodd-Frank will help given new reporting mandates e.g. swap data repositories being invaluable sources of data for regulators
- Could we use the payments/settlement system to provide yet more insight into what is going on by sensibly tagging transactional flows (DTCC take note apparently!)
- SEC registration of a new financial product could help to enforce what is reported, how and to act as a limit on what products can be sold
- Lewis said that up to 5,000 attributes are needed to describe any financial transaction so it can be done
- As he became involved in the FSOC and the formation of the OFR he thought initially that collecting all the data needed was impossible, but his view has changed on this with modern technology and processing power.
- The above said, he thought that until standards were in place (such as LEI) then it did not make sense for the OFR to start collecting data
- A member of the audience suggested that if data could be published in a standard form, it would be "Google to the rescue" in terms of doing aggregation across the industry without centralising the data in one store. (Maybe Google plans to usurp Microsoft Excel as the de facto trading and risk management system for the industry?)
Lewis gave a very good and interesting talk. I think some of his ideas on the OFR were good, but given the state of the data infrastructure that I have observed at many large institutions I would be worried that he is being optimistic on how quickly the industry is able to pull all the data together, however standardised. I think the industry will get there (particularly if mandated), but given the legacy of past systems and infrastructure it will take some good time to achieve yet.
Posted by Brian Sentance | 22 June 2011 | 9:59 pm
I attended a PRMIA seminar this morning at the offices of Ernst & Young with the rather long title of "Risk, Regulation and Financial Technology & Identifying the Next Crisis".
First up was Matthew Richardson of NYU Stern with a presentation entitled "Identifying the Next Crisis". The focus of his presentation was on systemic risk, which he defined as the risk that financial institutions lose the ability to intermediate (i.e. continue to provide services) due to an aggregate capital shortfall. He presented a precise definition of the systemic risk of a firm as:
Expected real social costs in a crisis per dollar of capital shortage
x Expected capital shortfall of the firm in a crisis
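A toy worked example of that definition, with entirely illustrative numbers of my own:

```python
# Systemic risk of a firm = expected real social cost per dollar of capital
# shortage x expected capital shortfall of the firm in a crisis (toy numbers).
social_cost_per_dollar = 1.5       # $ of real economic damage per $ of shortfall
expected_shortfall = 20e9          # firm's expected capital shortfall in a crisis, $
print(f"systemic risk contribution: ${social_cost_per_dollar * expected_shortfall:,.0f}")
```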
Matthew explained that there are three approaches to estimating systemic risk contribution:
- Statistical approach based on public data
- Stress tests
- Market approach based on insurance against capital losses in a crisis
He explained that the methods his team have used have had some statistical success against data from the past crisis in identifying those organisations in crisis early. I found his presentation reasonably dry (more regression analysis etc.) but I thought the following were worth a mention:
- Crisis Insurance - Approach 3 on getting firms to insure themselves against capital shortfalls in a crisis sounded interesting but ended up with the insurer being the regulator (not enough capital to insure privately) and the beneficiary being the regulator. So effectively this was a tax on the systemically significant institutions, where the involvement of the private insurers was mainly to do with price discovery (i.e. setting the right level of premium (i.e. tax) for each institution)
- Short-term Indicators - Many of the approaches we have currently (VaR etc) are short term indicators and so in good times do not inhibit market behaviour as would be desired by the regulators. A good illustration was given of how short term volatility was very much lower than long term prior to the crisis and how these merged to similar levels once the crisis hit.
- Regulatory Loopholes - He put forward that this crisis was a result not of monetary policy but of large complex financial institutions exploiting loopholes in regulation. The AIG Quarterly Filings of Feb 2008 showed that $379Billion of the $527Billion of CDS were with clients that were explicitly seeking regulatory capital relief (i.e. get the CDS in place and your capital requirement dropped to zero). He also explained how Fannie Mae and Freddie Mac were used by banks to simply "rubber stamp" mortgage pools and magically reduce the capital required from 4% down to 1.6% (a toy illustration of this capital arithmetic follows after this list).
- Where to look - He said that "like water flows downhill, capital flows to its most levered point". He said to look for which parts of the financial sector are treated differently under Dodd-Frank, Basel III etc. and that the key candidates were 1) shadow banking and 2) government guarantees. You should also look for those asset classes that get preferred risk weights for a given level of risk.
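As flagged above, here is a toy illustration of that capital-relief arithmetic (my own hypothetical pool size, using the risk-weight effects mentioned in the talk):

```python
# Hypothetical $10bn mortgage pool: regulatory capital required when held
# outright (4%), when "rubber stamped" via a GSE wrap (1.6%), and when hedged
# with a CDS written by a counterparty such as AIG (effectively 0%).
pool = 10_000_000_000
for label, charge in [("held outright", 0.04), ("GSE-wrapped", 0.016), ("CDS-hedged", 0.0)]:
    print(f"{label:>13}: capital required = ${pool * charge:,.0f}")
```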
As often seems to be the case, I found the side comments more interesting than the main body of the presentation, but Matthew's presentation showed that a lot of work is being done on systemic risk identification and measurement in academia.
Posted by Brian Sentance | 22 June 2011 | 9:53 pm
I enjoyed myself at the drinks reception after the NYU-Poly event. Nothing new in that I guess for those of you that know me well and like me find it difficult to resist a glass or two of red wine. Whilst attempting to circulate (I am almost 2 metres tall, so rather than "circulate" I think a more appropriate word might unfortunately be "intimidate"), I struck up a conversation with an interesting gentleman by the name of Per Kurowski.
Per is a former director of the World Bank and has some contrary and interesting ideas on the financial crisis and our current methods of regulation. His first point is that financial crises rarely start with assets that are perceived as "risky", which I think is a pretty self-evident point but not one that I had previously registered. His second line of argument is that our current regulation biases our banks away from "riskier" assets and hence away from just the kinds of organisations that are a) needed for employment creation and b) do not cause crises.
Per argues that many of the big institutions are near triple-A rated and hence benefit from being able to leverage up cheaply (at low-interest rates, since they are triple-A) and are then biased by lower capital requirements to use this leveraged funding to invest in yet more triple-A assets (SPVs/other institutions such as themselves). Hence you get the double-whammy of cheap funding and biased capital requirements which naturally leads to potential distortions in anything perceived as triple-A, and a bias away from riskier assets and the risk-takers that the world economy needs.
Posted by Brian Sentance | 20 June 2011 | 11:13 pm
A few choice quotes from the rest of the day at NYU-Poly:
- "The difference between economists and meteorologists is that meteorologists can at least agree on what happended yesterday"
- "A bubble can only be identified from a trend when the bubble bursts"
- "Capital flows from strange places to strange destinations in today's financial markets"
- "In a Basel III world, the stock price of Morgan Stanley would rise if its investment banking division were sold off"
- "Basel III is a good attempt at managing systemic risk"
- "Hedge Funds are the risk takers of the future"
- "Hedge Funds have the partnership mentality that the commercial banks have lost and should regain"
- "CCPs should not compete on risk management"
- "Economists are trained to predict everything except the future"
- "Dodd Frank was a missed opportunity to consolidate the many regulators in the United States"
- "Washing D.C. is all about turf and theatre"
- "Insolvency and liquidity risk are not clearly separable"
- "Beware the Golden Rule. He who makes the Gold makes the Rule"
- "Systemic risk is not the sum of individual institutional risk"
- "As Chuck Prince said "As long as the music is playing, you’ve got to get up and dance""
- "Systemic risk management only works when we all stop dancing"
- "Regulation should remove the punchbowl just when the party is getting started"
Posted by Brian Sentance | 20 June 2011 | 9:43 pm
The first panel session at NYU-Poly after Nassim Taleb concerned itself with the increasing competition between banks and insurers, which I didn't think reached any great conclusions as to where things are heading but did give background for why banks and insurers are increasingly offering the same services (disintermediation, regulation and industry structural changes being the main reasons). One of the presenters also said that actuarial methods may provide a useful framework for the unhedgeable risks taken by banks. I must acknowledge that my attention span was also challenged during this session by a very early start (up pre-6am) and a distinct lack of caffeine (later rectified many times over).
Second panel session up was entitled "The Future of Financial Regulation" and proved a lot more interesting to me given that I think I learned a few new things. The main presenter was Allen Ferrell of Harvard Law School. The main point I took away from this presentation was that regulation should focus more on the resolution of financial distress after it has occurred (ex-post) at an institution rather than on rules and regulations to prevent it before it happens.
I found this argument quite appealing since to a large degree it avoids provisioning for the "unknown unknowns" through more and more rules and increases in capital. The reduction in prior (ex-ante) rules would also reduce the gaming of the rules that would inevitably occur, and shareholders, knowing that they would be penalised and penalised quickly following financial distress, would be encouraged to become more interested in the levels of risk being taken on their behalf. I guess one of the main issues with the above is how such a level of financial distress would be defined and enforced in order to act as a trigger for, say, automatic conversion of debt to equity. Anyway, on with what Allen Ferrell had to say:
Allen said that if a financial institution had had the foresight to see the financial crisis coming, then looking across the industry there would have been a great variation in the amount of capital needed to survive the crisis. I guess the implication here was that higher levels of capital across the industry will help, but they are unlikely to be enough for some organisations in the crisis to come.
After the crisis had hit, he said that financing from the repo market dried up as repo haircuts exploded, and he said that this was like the modern day equivalent of a bank run (where a solvent bank faced difficulty due to having to sell good assets cheaply to satisfy demands for returning of cash deposits).
Allen said that leverage and "debt overhang" made it much less likely that a financial institution would get in more equity capital following the crisis since it implied a transfer of wealth from the stockholders to bondholders. More of this important point later.
He put forward that it was not yet clear whether the 2007-8 crisis was mainly due to insolvency or due to a bank run. He argued that it was some combination of both, and referred back to the recent re-assessment of the Great Depression being caused not by a run on (solvent) banks but rather by flight of retail investors away from insolvent banks.
He concluded that much of the action for any future crisis will have to take place after any new crisis hits (ex-post), partly due to his assessment of the disconnect between equity capital needed (the current focus of things like Basel III) prior to a crisis and an institution's financial health following a crisis.
Allen suggested that contingent capital, i.e. debt capital that automatically converts into equity based on some market trigger, might be very helpful in dealing with a financial crisis. Such a conversion would happen earlier than if an institution had to agree to it at the time, and would automatically dilute existing stockholders. Overall this was a thought provoking talk and the panel discussion afterwards was interesting too. One of the panelists commented that he looked for high leverage and high ratios of CEO to CRO compensation as his measure of where to look for the next set of risky institutions. The panel also seemed to agree that, with the benefit of hindsight, allowing Lehmans to fail and the resultant drying up of the money markets was a mistake, and that more consistency is needed in bankruptcy and distress resolution.
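A minimal sketch of the contingent capital mechanism (hypothetical trigger and terms, purely to illustrate the automatic conversion and dilution):

```python
# Contingent capital ("CoCo") in caricature: debt that converts to equity
# automatically when a market-based trigger is breached, diluting existing
# stockholders without any negotiation at the point of distress.
def maybe_convert(share_price, trigger_price, coco_face, conversion_price):
    if share_price <= trigger_price:                  # market trigger breached
        new_shares = coco_face / conversion_price     # debt extinguished, shares issued
        return 0.0, new_shares
    return coco_face, 0.0                             # nothing happens in normal times

debt_left, shares_issued = maybe_convert(share_price=4.80, trigger_price=5.00,
                                         coco_face=1_000_000_000, conversion_price=5.00)
print(f"remaining CoCo debt: ${debt_left:,.0f}, new shares issued: {shares_issued:,.0f}")
```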
Posted by Brian Sentance | 18 June 2011 | 5:23 pm
I went along to spend a day in Brooklyn yesterday at NYU-Poly, now the engineering school of NYU containing the Department of Finance and Risk Engineering. The event, called "The Post Crisis World of Finance", was sponsored by Capco.
First up was Nassim Taleb (he of Black Swan fame). His presentation was entitled "A Simple Heuristic to Assess Tail Exposure and Model Error". First time I had seen Nassim talk and like many of us he was an interesting mix of seeming nervousness and confidence whilst presenting. He started by saying that given the success and apparent accessibility to the public of his Black Swan book, he had a deficit to make up in unreadability in this presentation and his future books.
Nassim recommenced his on-going battle with proponents of Value at Risk (see earlier posts on VaR) and economists in general. He said that economics continues to be marred by the lack of any stochastic component within the models that most economists use and develop. He restated his view that economists change the world to fit their choice of model, rather than the other way round. He mentioned "The Bed of Procrustes" from Greek mythology, in which a man made his visitors fit his bed to perfection by either stretching them or cutting off their limbs (a good analogy, but also a good plug for his latest book too I guess).
He categorized the most common errors in economic models as follows:
- Linear risks/errors - these were rare but show themselves early in testing
- Missing variables - rare and usually gave rise to small effects (as an aside he mentioned that good models should not have too many variables)
- Missing 2nd order effects - very common, harder to detect and potentially very harmful
He gave a few real-life examples of the third type of error above, such as a 10% increase in traffic on the roads resulting in a doubling of journey times whilst a 10% reduction delivers very little benefit. He targeted Heathrow airport in London, saying that landing there was an exercise in understanding a convex function, in which you never arrive 2 hours early but arriving 2 hours later than scheduled is relatively common.
He described the effects of convexity firstly in "English" (his words):
"Don't try to cross a river that is on average 4ft deep"
and secondly in "French" (again his words - maybe a dig at Anglo-Saxon mathematical comprehension or in praise of French mathematics/mathematicians? Probably both?):
"A convex function of an average is not the average of a convex function"
Nassim then progressed to show the fragility of VaR models and their sensitivity to estimates of volatility. He showed that a 10% estimate error in volatility could produce a massive error in VaR level calculated. His arguments here on model fragility reflected a lot of what he had proposed a while back on the conversion of debt to equity in order to reduce the fragility of the world's economy (see post).
His heuristic measure mentioned in the title was then described, which is to perturb some input variable such as volatility by say 15%, 20% and 25%. If the average of the 15% and 25% results is much worse than the 20% result then your losses are convex in that input: you have a fragile system and should be very wary of the results and conclusions you draw from your model. He acknowledged that this was only a heuristic, but said that with complex systems/models a simple heuristic like this was both pragmatic and insightful. Overall he gave a very entertaining talk with something of practical value at the end.
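Here is a minimal sketch of that heuristic as I read it (toy exposures, nothing from Nassim's slides). The gap between the average of the two outer bumps and the middle bump is roughly zero for a model that is linear in the input, and opens up (by Jensen's inequality) when losses are convex in it, i.e. when the model is fragile to mis-estimating that input:

```python
def fragility_gap(loss_model, sigma, bumps=(0.15, 0.20, 0.25)):
    """Evaluate the loss model at three volatility bumps; a material positive gap
    (average of the outer bumps minus the middle one) means losses are convex in
    volatility, i.e. the model is fragile to mis-estimating it."""
    lo, mid, hi = (loss_model(sigma * (1 + b)) for b in bumps)
    return 0.5 * (lo + hi) - mid

# Toy exposures: a parametric 99% VaR that is linear in vol, versus a position
# whose losses grow with vol squared (a variance-style exposure).
linear_var = lambda s: 2.33 * s * 1_000_000
convex_loss = lambda s: 2.33 * (s ** 2) * 10_000_000

print(f"linear exposure gap: {fragility_gap(linear_var, 0.20):,.0f}")    # ~0: robust
print(f"convex exposure gap: {fragility_gap(convex_loss, 0.20):,.0f}")   # > 0: fragile
```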
Posted by Brian Sentance | 17 June 2011 | 7:14 pm
Risk management and data control remain at the top of the agenda at many financial institutions. Many have said that the recent crisis highlighted the need for more consistent, transparent, high quality data management, which I totally agree with (but working for Xenomorph, I would I guess!). Although the crisis started in 2007, it would seem that many organizations still do not have the data management infrastructure in place to achieve better risk management.
I moved apartment last week and had to face the terrifying prospect of visiting IKEA to buy some new furniture. On walking through the endless corridors of furniture ideas I wondered whether the people at major financial institutions feel as I did: I knew I needed two wardrobes, I knew the dimensions of the rooms, I knew how many drawers I wanted. Then I got to the wardrobes showroom, sat in front of the “Create your own wardrobe” IKEA software and the nightmare started. How many solutions are there to solve your problems? And how many solutions, once you get to know of their existence, make you aware of a problem you didn’t know you had? That’s how I spent 2 days at IKEA choosing my furniture and still I wonder whether in the end I got the right solution for my needs.
Coming back to risk management, I imagine the same dilemma may be faced by financial institutions looking to implement a data management solution. How many software providers are out there? What data model do they use? Are they flexible enough to satisfy evolving requirements? How can we achieve an integrated data management approach? Will they support all kind of asset classes, even the most complex?
In these times of new regulations where time goes fast and budget is tight, selection processes have become more scrupulous.
As often happens in life, when we need a plumber for example, or a new dentist, we look for positive recommendations, people willing to endorse the efficiency and reliability of the service. So, with this in mind, please take a look at the case study we put together with Rabobank International, who have been using our TimeScape analytics and data management system at their risk department since 2002 for consolidated data management. More client stories are also available on our website here: www.xenomorph.com/casestudies.
I hope that many of you will benefit from reading the case study and for any questions (on IKEA wardrobes too!), please get in touch...
Posted by Sara Verri | 8 June 2011 | 10:07 am
Xenomorph has today released its white paper “Instrument Valuation Management: management of derivative and fixed income valuations in a multi-asset, multi-model, multi-datasource and multi-timeframe environment”.
The white paper expands on the “Rates, Curves and Surfaces – Golden Copy Management of Complex Datasets” white paper Xenomorph published recently (see earlier post) and describes how, despite the increasing importance of instrument valuation to investment, trading and risk management decisions, valuation management is not yet formally and fully addressed within data management strategies and remains a big concern for financial institutions.
Too often, says Xenomorph, valuations (and the analytics used to process input and calculate output data) fall between traditional data management providers and pricing model vendors. This leads to the over-use of tactical desktop spreadsheets where data "escapes" the control of the data management system, leading to increased operational risk.
Whilst instrument valuation is certainly not the primary cause of the recent financial crisis, the lack of high quality, transparent valuations of many complex securities resulted in market uncertainty and in the failure of many risk models fed by untrustworthy valuations.
“A deeper understanding of financial products reduces operational risk and promotes quality, consistency and auditability, ensuring regulatory compliance”, says Brian Sentance, CEO Xenomorph. “Clients’ requirements have evolved and portfolio managers, traders and risk managers recognize that it is no longer sufficient to treat valuation as an external, black-box process offered by pricing service providers”, he adds.
Nowadays, regulators, auditors, clients and investors demand even more drill-down to the underlying details of an instrument’s valuation. It is therefore important to implement an integrated, consistent analytics and data management strategy which cuts across different departments and glues together reference and market data, pricing and analytics models, for transparent, high quality, independent valuation management.
“Our TimeScape solution provides a valuation environment which offers rapid and timely support for even the most complex instruments, allowing our clients to check easily the external valuation numbers, based on their choice of model and data providers”, says Sentance. “Otherwise, what is the point of good data management if the valuations and the analytics used are not based on the same data management infrastructure principles?”
For those who are interested, the white paper is available here.
Posted by Sara Verri | 4 May 2011 | 1:41 pm
Xenomorph has released its white paper 'Rates, Curves and Surfaces – Golden Copy Management of Complex Datasets'. The white paper describes how, despite the increasing interest in risk management and tighter regulations following the crisis, the management of complex datasets – such as prices, rates, curves and surfaces - remains an underrated issue in the industry. One that can undermine the effectiveness of an enterprise-wide data management strategy.
In the wake of the crisis, siloed data management, poor data quality, lack of audit trail and transparency have become some of the most talked about topics in financial markets. People have started looking at new approaches to tackle the data quality issue that found many companies unprepared after Lehman Brothers' collapse. Regulators – both nationally and internationally – strive hard to dictate parameters and guidelines.
In light of this, there seems to be a general consensus on the need for financial institutions to implement data management projects that are able to integrate both market and reference data. However, whilst having a good data management strategy in place is vital, the industry also needs to recognize the importance of model and derived data management.
Rates, curves and derived data management is too often a neglected function within financial institutions. What is the point of having an excellent data management infrastructure for reference and market data if ultimately instrument valuations and risk reports are run off spreadsheets using ad-hoc sources of data?
In this evolving environment, financial institutions are becoming aware of the implications of a poor risk management strategy but are still finding it difficult to overcome the political resistance across departments to implementing centralised standard datasets for valuations and risk.
The principles of data quality, consistency and auditability found in traditional data management functions need to be applied to the management of model and derived data too. If financial institutions do not address this issue, how will they be able to deal with the ever-increasing requests from regulators, auditors and clients to explain how a value or risk report was arrived at?
For those who are interested, the white paper is available here.
Posted by Sara Verri | 24 February 2011 | 5:45 pm
I went along to a PRMIA event last night, "2010 - Risk Year in Review". The event started with a somewhat overwhelming brain dump of economic and credit statistics from John Lonski, Chief Capital Markets Economist at Moody's Analytics. In summary he seems very bullish about corporate credit spreads tightening given the way in which corporate profit growth is surging ahead of debt growth. His main concern for the economy was perhaps unsurprisingly the US housing market and whether this will bottom out and start to rise in 2011. Given fiscal imbalances and competition from emerging markets he did not think that inflation was a big risk despite activity such as QE2.
Robert Iommazzo of search firm Seba International did a fairly dry presentation on industry compensation for risk managers. Seba seem to be getting around, having had a big presence at RiskMinds in Geneva last week. This section only livened up when the questions started after the presentation, and it is probably worth noting that the UK FSA is being perceived as a "Big Brother" with its involvement in setting compensation policies in financial markets. Obviously the FSA is not heading back to the heady days of the 1970's where central government set industry pay rises (journalists please note this meant you back then!), but it is also obvious that such control over an individual's remuneration is something that goes totally contrary to an American way of thinking. The UK Government needs to be mindful of this perception, particularly if it leaves itself open to arbitrage on compensation policy from other financial centres.
Panel debate followed, involving Ashish Das of Moody's, Yury Dubrovsky of Lazard Asset Management, Jan H. Voigts of the NY Fed and Christopher Whalen of Institutional Risk Analytics. Main points:
- Chris said that he was one of those predicting a further fall in the housing market next year, and he asked the audience: looking at economic statistics, credit spreads, the VIX and bond spreads, does anyone get the feeling that things are "normal" yet? Plugging these numbers into a model, does anyone believe the results are stable and can be relied upon? The audience fundamentally seemed to agree with these "warning" questions.
- Jan asked the audience to consider how believable their data is, to try to understand what data is critical for their business, and said that it is imperative to create tools to manage this data appropriately. Jan said that the biggest challenge for financial institutions going forward is how to calibrate what rate/volume/type of business they can transact safely, and that this needs a lot more consideration.
- Yury said that he finds that the risks present in 2008 are still around in 2010, but now with the addition of European sovereign credit problems and the raft of regulation heading towards the industry. To add to this pessimistic note, he also said that some of the interest in "hot" emerging markets such as the BRICs was resulting in investments in lower quality IPOs relative to previous years.
- Ashish thought that systemic risk was going to become more important for the industry. With the setting up of the Office of Financial Research (OFR) next year, he suggested that the industry needed to take much more of a lead in sorting out its own house in advance of letting the regulators do so. On the subject of models, he said that models should supplement human judgement but not replace it, and mentioned the quote by George E. P. Box that "all models are wrong, but some are useful".
- Chris suggested that the role of risk managers will become more like that of a credit collector, with more involvement in actually seeing what can be recovered once a default has occurred. He also suggested that the industry should create its own consensus-based ratings (supplemented by the existing CRAs) to get a more reliable view of credit.
- Ashish echoed some of the speakers last week at Riskminds in saying that regulatory compliance is not risk management, and that practitioners should do more to guide the regulators.
- On the subject of risk culture, Yury asked how many risk managers knew data, quant, markets and how to deal with the egos of traders and senior management. This last point seemed to be conceded by the audience as a major weakness of the risk management profession and goes back to whether a risk manager is willing to put his career on the line to go against accepted business strategy.
- Chris added that having worked at several investment banks he had not yet experienced a risk manager attending a senior committee, let alone a risk manager speaking up against a senior trader. He talked of two business models "Paranoid and Nimble" and "Well Documented and Pedantic" with the second one being the only one possible in his view once a business gets to a certain size.
- On the subject of Government Sponsored Enterprises (GSEs like Fannie Mae and Freddie Mac) Chris said that the role of these will be up for review by the end of 2011. He thinks that the banks will head back towards actually holding mortgages and loans and the GSEs will become more conduits rather than direct sources of finance. This was news to me, given that so far the GSEs have been notably left out of recent reviews of what went wrong with the recent crisis.
The panel was very good, all speakers very knowledgeable. "Regulation is not risk", "models are not perfect", "risk governance" and "take control of your data" were all themes that echoed last week's RiskMinds event, albeit with more of an American rather than international viewpoint on the economy, regulation and markets.
Posted by Brian Sentance | 15 December 2010 | 5:16 pm
Panel moderated by Riccardo Rebonato of RBS on "Determining The New Blueprint For Financial Engineering". It seems like Riccardo has been busy following up on his talk from last year (see post) with the release of his book on scenarios (no, I am not on commission for this, but thought it may be interesting to take a look at!).
Summary of main points from the panel debate:
- Regulators would like simpler models, but simpler does not necessarily mean better, and more complex does not necessarily mean worse.
- It is the thoughtful application of a model that is important, not the level of complexity in itself.
- Given the more complex world we live in, more complexity in modelling is both needed and desirable if risk management is to improve.
- Some members of the panel thought that regulation had stifled innovation in risk models (as opposed to valuation models) through insisting on conformity of reporting. Innovation is limited because the regulators simply set the rules and then the game of banks optimising against those rules begins.
- Evan Picoult of Citi disagreed with this, saying that his own group now look at historical events going back over 100 years for possible scenarios as opposed to the last few years (comment: interesting to see someone using more history as a complement to more forward-looking risk modelling)
- Riccardo asked whether there is a conflict between what a regulator wants (lack of risk) and what a rational CEO wants: should a CEO, for example, accept a 1-in-50 chance of disaster for the bank? Evan argued that banks should be more transparent, allowing investors in bank stock and bonds to decide on and price in the policies implemented by bank management.
- Only around 15% of the audience thought that greater pricing/valuation model validation would have changed the 2007-2009 crisis. John Hull said that he had received many emails trying to apportion blame in this way, which he rejected. The consensus seemed to be that the root cause was a lack of common sense over the mortgage market.
A good debate, with Riccardo doing more than just moderating, but not a great deal that was new relative to recent years. In summary, my feeling for RiskMinds 2010 was high quality speakers but a little subdued after the embarrassment of 2008 and the anger against the regulators in 2009. Maybe we should all want more subdued risk management conferences, but it will be interesting to see what 2011 brings and whether energy levels are up.
Posted by Brian Sentance | 9 December 2010 | 3:19 pm
Hugo Banziger of Deutsche Bank gave a presentation entitled "Reshaping The New Agenda for Risk Management".
Hugo started by outlining the ways in which regulation is changing the markets. Whilst positive overall on the benefits of regulation, he expressed surprise at the regulation of OTC derivatives, which he saw as helpful in managing risk and not the cause of the crisis.
He emphasised that whilst regulation is important, it should not be a substitute for risk management, and said that regulation in particular does not address:
- The quality of assets held
- The quality of management
- The quality of infrastructure
He additionally mentioned that, whilst he welcomed Basel III, its economic effect will be to make the supply of credit more expensive, to the detriment of growth in the real economy.
Given the quality issues he identified above, he then moved on to show what he had done about them in terms of:
- People - what quality of people do you have?
- Processes - where are the holes that things will fall through?
- Systems - can you get a complete picture of risk?
- Portfolio Level Risk - across all asset classes and business units
On people, he advocated a "home grown" risk management team, with people rotated across different roles within risk. He takes the fact that other institutions hire his staff as a frustrating compliment to Deutsche Bank risk management. He has implemented a "passport" for his staff which shows what they are trained/competent in and how they are annually tested against this, across both technical and softer management skills. He was funny and quite dismissive in saying that "anyone who does not know what an option is and how it works is out!", applying this to lawyers as well as risk managers.
On processes, he has set up a new "risk operations" centre of competence to centralise form filling for risk managers, enabling them to spend more time on risk and less on admin. He stated flatly that just because you find risk managers spending 50% of their time on admin does not mean you have to accept it; you can do something about it. He also said that he is moving the jobs to where the people are, rather than asking people to move (e.g. a risk centre in Berlin to catch Berlin maths graduates).
On systems he has spent EUR30 million in 20 months on sorting out the consistency of data and models within market risk. During the crisis it took DB 48 hours to pull together their mortgage portfolio exposure which was too long. He says that initiatives like this are part of a 10 year investment in systems, data and analytics. His ultimate aim is to have an interactive real-time control centre for all risks in the bank and to move away from paper-based daily reporting. He also mentioned that he had grown his market risk team from 70 to 200 post-crisis.
On Portfolio Risk he says that more time needs to be spent on knowing risk appetite and knowing how this fits against the risk capacity of the bank. He emphasised that risk managers are there to defend P&L and not capital. He said that portfolio/business model risks were his biggest source of risk.
Inspiring speaker, very confident, open about past losses and mistakes made. Biggest difference to many speakers here was that he put forward tangible actions to address things such as risk culture rather than just talking around them.
Posted by Brian Sentance | 8 December 2010 | 10:00 pm
Paul first pointed out that if ever there was a bubble, it was a "bubble" of books on the crisis and its causes. He listed a number of reasons for the crisis, the most interesting/new of which was, in addition to "too big to fail", the notion of "too big to save" as a new risk given the size of some financial institutions relative to the economies they operate in. Paul has an extensive academic background in both financial risk management and actuarial studies, and I guess it was with his actuary hat on that he said the three main problems for financial engineers going forward were "social insurance, social insurance and social insurance".
By social insurance Paul was referring to medical, life and health insurance, and he saw this as his big concern for the future. He illustrated this by showing the age distribution of the Japanese population in 1950, 2007 and 2050. Basically the 1950 distribution was like a pyramid, with a very young population underneath the middle-aged and old. This shape had become fatter in middle age by 2007, and is predicted to be inverted, with more older people than younger, by 2050. Given that this will result in a reduction of more than 10% in the working population supporting an increasingly older population, it was not difficult to see what he meant.
Paul spent some time going through older papers from himself and others (particularly Joseph Stiglitz on securitisation in 1992) warning of the 2008 crisis - I do not know his work well enough to know how much this was "wise after the fact", but what he mentioned on securitisation and correlation made sense given what has since happened. Worth taking a look at his website for some of his papers on Basel and risk management I guess.
One key point he made was simply about volume. Working admittedly on a notional basis, he said that having an OTC derivatives market with notional value of $583 trillion is interesting in the context of a world economy with GDP of only $58 trillion. Even netting notional down you get around $30 trillion of OTC derivatives which still deserves our attention and our efforts to make things better in risk management. I guess his simple message here was "pay attention to volume".
He said the use and abuse of the Repo 105 rule was worth looking at (so I will, anyone with knowledge please let me know), and also questioned the societal benefit of high frequency trading (HFT). Looking back at the Flash Crash Paul said that this was a new kind of risk for risk managers and he had no idea how to hedge it.
Paul defended mathematics as the solution to some of the problems of the crisis and not as the cause - fundamentally he thinks the press have given maths bad PR. He said that we should all watch out for the word "new" being used, as this indicates the start of a bubble with phrases such as the "new economy". Overall a great speaker very comfortable with his subject matter.
Posted by Brian Sentance | 8 December 2010 | 9:28 pm
I am over in Geneva at the moment (taking a break from the harsh English winter?..) for the RiskMinds 2010 event. Despite its slightly pretentious title (I leave it to you to assess how appropriate the name seemed in 2008...) it is one of the best attended risk management events where risk managers discuss what is going on and what is new to risk management. You can find some posts from the 2009 event here, and the 2008 event here - both make interesting reading given that we are out of the crisis now (aren't we?).
I arrived late for the first day, just in time to catch panelist Pippa Malmgren of the Canonbury Group saying that during the crisis everyone knew they were long highly leveraged, very risky assets that were potentially in a pricing "bubble", but when asked about this bubble most gave one of three responses:
- Asset managers said it didn't matter so long as all of their peers went down too...
- Hedge fund managers said that their business was to surf market waves and they could restart the fund afterwards anyway...
- I know it's a bubble but I will be able to get out before it bursts...
Pippa added that the last was the most worrying response, although I guess all are still relevant negative insights into the attitudes of some financial market participants.
The next panel was on Risk Culture & Ethics with Richard Evans of Citi first up presenting on issues resulting from the crisis. Richard suggested that the following key issues were missed during the crisis:
- Silo Mentality - Risk reports were not comprehensive enough to cover all assets, regions and business units in one; risk management focussed too much on validating individual deal flow (transactions) rather than the portfolio; there were no incentives for business managers to share resources and information.
- Short Term Revenue Focus - The focus was on short-term bonuses that were not related to profitability after costs and the cost of risk capital were taken into account.
- Backward-Looking Models - Models looked backwards (historic VAR for instance) rather than being forward-looking and scenario-based. Richard said that Citi now combine backward-looking VAR and multiple (severe) scenarios on an approximately 50-50 basis when assessing overall risk (see the sketch after this list).
- Poor Teamwork - Trading and risk management staff did not work together effectively during the crisis. Richard now suggests this must be addressed through greater involvement of the business in risk management, the introduction of the risk committee and fighting against risk management "ivory towers".
- Board Weakness - Richard said that boards and senior management committees were not set up to react to "alarm bells" such as triggers resulting from limit breaches; Also many boards were simply very weak in their basic understanding of the risks being taken by the business.
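Purely as an illustration of this kind of 50-50 blend (this is not Citi's methodology; the P&L history, scenario losses and weights below are all made-up numbers), one way of combining a backward-looking historical VaR with forward-looking severe scenarios might look like this:

```python
import numpy as np

def historical_var(pnl, confidence=0.99):
    # One-day historical VaR: the loss exceeded only (1 - confidence) of the time.
    return -np.percentile(pnl, (1.0 - confidence) * 100.0)

def blended_risk(pnl, scenario_losses, weight_var=0.5):
    # Blend the backward-looking VaR with the worst forward-looking scenario loss.
    return weight_var * historical_var(pnl) + (1.0 - weight_var) * max(scenario_losses)

rng = np.random.default_rng(0)
daily_pnl = rng.normal(0.0, 1_000_000, 500)    # made-up daily P&L history
scenarios = [2_500_000, 4_000_000, 6_500_000]  # made-up losses under severe hypothetical scenarios
print(f"blended risk estimate: {blended_risk(daily_pnl, scenarios):,.0f}")
```

The appeal of a blend like this is simply that a quiet historical window can no longer drive the risk number down on its own, since the severe scenarios always contribute their share.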
I don't think the above will come as any surprise to anyone who has followed the crisis, but Richard is a good speaker and so his presentation was entertaining. He later went on to criticise regulators for asking him to replace staff members who had 20 years of experience with others of equivalent experience - he said that he had not yet found a way to cram 10 years of experience into 2 years, although maybe recent times have come close to this aim! He also said that firms where the "mood" of the CRO affected which approval decisions were made obviously did not have strong enough governance in place. Richard wants risk and trading staff to work more closely together, although he admits that two years on it is difficult to offer business-level compensation to get traders to work in risk - in this regard he also mentioned his days at JPMorgan, when trading and risk staff spent time seconded to the regulators.
There were a variety of other speakers during the day, all dealing with risk governance and culture. Whilst vital to the changes that must be made in the culture of the majority of institutions, I think it is a difficult topic to talk about, since it is hard to express just what needs to be "done" in some pragmatic way. Put another way, the conversations on this topic tend to focus on the need for a risk management culture and become very woolly when discussing how one is implemented. A presentation by Alden Toevs of the Commonwealth Bank of Australia attracted discussion among some of the attendees over coffee. The presentation was about how to formalise/make a process of the discussion and agreement of risk appetite (a Risk Appetite Statement) between board, risk management and the business. Alden suggested that the use of anonymous "voting" technology at board level encouraged more openness and discussion, and that getting the business involved in this process was a great way to encourage the involvement of trading in risk management. A good presentation in both content and effects (an iMac user I think!) and an amusing speaker, who pointed out that visiting the Australian parliament is interesting for a risk manager given that you are surrounded by genuine Black Swans in the lake outside...
Posted by Brian Sentance | 8 December 2010 | 8:36 pm
Last Thursday, I went along to an event organized by the Club Finance Innovation on the topic of “Independent valuations for the buy-side: expectations, challenges and solutions”.
The event was held at the Palais Brongniart in Paris, which, for those who don't know (like me till Thursday), was built in the years 1807-1826 by the architect Brongniart by order of Napoleon Bonaparte, who wanted the building to permanently host the Paris stock exchange.
Speakers at the roundtable were:
- Eric Benhamou, CEO Pricing Partners
- Francis Cornut, President DeriveXperts
- Jean-Marc Eber, President LexiFi
- Patrick Hénaff, Associate Professor at the University of Bretagne (see model validation paper for additional background)
- Claude Martini, CEO Zeliade Systems
The event focussed on the role of the buy-side in financial markets, looking in particular at the concept of independent valuations and how this has taken on an important role after the financial downturn. However, all the speakers agreed that there remains a large gap between the sell-side and the buy-side in terms of competences and expertise in the field of independent valuations. The buy-side lacks the systems for a better understanding of financial products and should align itself with the best practices of the sell-side and the bigger hedge funds.
The roundtable was started by Francis Cornut of DeriveXperts, who gave the audience a definition of independent valuation. Whilst valuation could be defined as the “set of data and models used to explain the result of a valuation”, Cornut highlighted how the difficulty is in saying what independent means; there is in fact a general confusion on what this concept represents: internal confusion, for example between the front office and risk control department of an institution, but also external confusion, when valuations are done by third-parties.
Cornut provided three criteria that an independent valuation should respect:
- Autonomy, which should be both technical and financial;
- Credibility and transparency;
- Ethics, i.e. being able to resist market/commercial pressure and deliver a valuation which is free from external influences/opinions.
Independent valuations are the way forward for a better understanding of complex, structured financial products. Cornut advocated the need for financial parties (clients, regulators, users and providers) to invest more and understand the importance of independent valuations, which will ultimately improve risk management.
Jean-Marc Eber, President of LexiFi, agreed that the ultimate objective of independent valuations is to allow financial institutions to better understand the market. To accomplish this, Eber pointed out that when we speak about services to clients, we should first think about what their real needs are. The bigger umbrella of the "buy-side" in fact implies different needs, and there is often a contradiction in what regulators want: on one side, having independent valuations provided by independent third parties; on the other, independent valuation really means that internal users/staff understand what underlies the products that a company holds. In the same way, we don't just need to value products but also to measure their risk and periodically re-value them. It is important, in fact, to have the whole picture of the product being valued in order to make the buy-side more competitive.
Another point on which the speakers agreed is traceability: as Eber said, financial products don't just exist as they are, but undergo transformation and change several times. Therefore, the market needs to follow a product across its life cycle to maturity, and this poses a technology challenge in providing scenario analysis for compliance and keeping track of the audit trail.
To the question 'what has the crisis changed?', the panellists answered:
Eber: the crisis showed the need to be more competent and technical to avoid risk. He highlighted the need to understand the product and its underlying. Many speak of having a central repository for OTCs, obligations, etc but this needs more thinking from the regulators and the financial markets. Moreover, the markets should focus more on quality data and transparency.
Eric Benhamou, CEO of Pricing Partners, sees an evolution of the market, as the crisis revealed underestimated risks which are now being taken into consideration.
Claude Martini, CEO of Zeliade, advocated the need for financial markets to implement best practices for product valuations: the buy-side should apply the same practices already adopted by the sell-side and verify the hypotheses, price and risk related to a financial product.
Cornut admitted things have changed since 2005, when they launched DeriveXperts and nobody seemed to be interested in independent valuations. People would ask what value they would get from an investment in independent valuations: yes, the regulators are happy, but what's the benefit for me?
This is changing now that financial institutions know that a deeper understanding of financial products increases their ability to push the products to their clients. The speech I enjoyed the most was from Patrick Hénaff, associate professor at the University of Bretagne and formerly Global Head of Quantitative Analysis - Commodities at Merrill Lynch / Bank of America.
He took a more academic approach and contested the assumption that having two prices to compare necessarily reduces the uncertainty around a product, highlighting that this is not always the case. I found interesting his idea of giving a product price with a confidence interval, or a 'toxic index', which would represent the uncertainty about the product and capture the model risk that may originate from it.
We speak too often about the risk associated with complex products, but Hénaff explained how risk exists even for simpler products, for example the calculation of VAR on a given stock position. A stock can be extremely volatile and we can't know its trend; providing a confidence interval is therefore crucial. What is new, instead, is the interest many are showing in assigning a price to a given risk, whereas before model risk was considered a mere operational risk arising from the calculation process. Today, a good valuation of the risk associated with a product can result in less regulatory capital being used to cover that risk, and as such it is gaining much more interest from the market.
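As a purely illustrative sketch of this idea (this is not Hénaff's own method; the Black-Scholes call pricer, the spot/strike/rate and the candidate volatilities below are all my own assumptions), one could value the same instrument under several candidate models and quote the mid price together with its dispersion as a crude model-risk interval:

```python
import math
from statistics import mean, pstdev

def bs_call(spot, strike, rate, vol, t):
    # Standard Black-Scholes price of a European call option.
    d1 = (math.log(spot / strike) + (rate + 0.5 * vol * vol) * t) / (vol * math.sqrt(t))
    d2 = d1 - vol * math.sqrt(t)
    norm_cdf = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return spot * norm_cdf(d1) - strike * math.exp(-rate * t) * norm_cdf(d2)

# "Candidate models": here simply different volatility assumptions standing in
# for genuinely different model choices (all numbers are illustrative).
candidate_vols = [0.18, 0.22, 0.25, 0.30]
prices = [bs_call(spot=100.0, strike=100.0, rate=0.02, vol=v, t=1.0) for v in candidate_vols]

print(f"price = {mean(prices):.2f} +/- {pstdev(prices):.2f} "
      f"(range {min(prices):.2f} to {max(prices):.2f})")
```

A spread that is wide relative to the mid price could then feed something like the 'toxic index' he described.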
Hénaff described two approaches currently being taken in academic research on valuations:
1) Adoption of statistical simulation in order to identify the risk deriving from an incorrect calibration of the model. This consists of taking historical data and testing the model, through simulations and scenarios, in order to measure the risk associated with choosing one model instead of another (a rough sketch of this idea follows this list);
2) Having more quality data. A lack of quality data implies that the models chosen are inaccurate, as it is difficult to identify exactly which model should be used to price a product.
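A minimal sketch of the first approach only, under entirely made-up assumptions (a simulated return history, a "calibration" that is just a rolling volatility estimate, and a stand-in valuation proportional to that volatility): re-calibrate on successive historical windows and use the dispersion of the resulting valuations as a rough measure of calibration risk.

```python
import numpy as np

def rolling_vol_estimates(returns, window=250, step=20):
    # Annualised volatility re-estimated ("re-calibrated") on successive
    # rolling windows of the daily return history.
    return [float(np.std(returns[i:i + window]) * np.sqrt(252))
            for i in range(0, len(returns) - window, step)]

rng = np.random.default_rng(1)
daily_returns = rng.normal(0.0, 0.012, 1500)  # made-up daily return history

vols = rolling_vol_estimates(daily_returns)
# Stand-in "valuation" that is simply proportional to the calibrated volatility,
# so the dispersion of the values reflects calibration uncertainty only.
values = [100.0 * v for v in vols]

print(f"{len(values)} calibrations, value range {min(values):.2f} to {max(values):.2f}, "
      f"spread {max(values) - min(values):.2f}")
```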
Model risk, which as noted above was previously considered an operational risk, now becomes extremely important as it can free up capital. Hénaff suggested that it is key to find, for model risk, the equivalent of VAR for market risk: a normalised measure. He also spoke about the concept of a "model validation protocol", giving the example of what happens in the pharmaceutical and biological sectors: before a new pill is launched onto the market, it is tested several times.
Whilst in finance products are just given with their final valuation, the pharmaceutical sector provides a "protocol" which describes the calculations, analysis and processes used to get to the final value, and their systems are organised to provide a report showing all the deeper detail. To reduce risk, valuation should be a pre-trade process, not a post-trade one.
This week, the A-Team Group published a valuations benchmarking study which shows how buy-side institutions are turning more and more often to third-party valuations, driven mainly by risk management, regulation and client needs. Many of the institutions interviewed also admitted that they will increase their spending on technology to automate and improve the pricing process, as well as data source integration and workflow.
This is in line with what was said at the event I attended, and was confirmed by the technology representatives speaking at the roundtable.
I would like to end with what Hénaff said: there can’t be a truly independent valuation without transparency of the protocols used to get to that value.
Well, Rome wasn’t built in a day (and as it is my city we’re speaking about, I can say there is still much to build, but let’s not get into this!) but there is a great debate going on, meaning that financial institutions are aware of the necessity to take a step forward. Much is being said about the need for more transparency and a better understanding of complex, structured financial products and still there is a lot to debate. Easier said than done I guess but, as Napoleon would say, victory belongs to the most persevering!
Posted by Sara Verri | 28 October 2010 | 5:50 pm
I went along to a Six Telekurs event, "Securities Valuations: Is the Price Right?", last week - a good event with some interesting speakers, most notably Paul Atkins of Patomak Partners talking about the Dodd-Frank Wall Street Reform and Consumer Protection Act 2010. Paul is based out of Washington and was not very complimentary about what has been going on.
He started by saying that the Act was very large, with its 2,319 pages (compared to SarbOx with only 60), and given this size he suggested that you could guess how many in Congress had actually read it. The background to the Act included:
- "Political Tailwinds" such as:
- New Democrat Government with tenuous majority
- Ambitious legislative plans
- Bleak economic back-drop
- An angry populace:
- TARP bailouts/Wall St bonuses
- Recession and high unemployment
- Perception that Govt. contributed to crisis
- Aggressive case for new regulation based on:
- Lack of confidence in current systems and regulation
- "Too big to fail" demonstrating that regulators lack the toolsets necessary to deal with such events
- High leverage across the financial system and the economy
- Poor risk management by existing participants
- Opaque shadow banking system and opaque derivatives markets
He summarised that Housing and the Credit Rating Agencies were the key fundamentals behind the financial crisis.
Paul said that the new regulation had the following features:
- The Act is a sweeping revision of financial regulation in the US
- Few dodged the regulatory changes (notably, insurance managed to do so)
- The Federal Reserve has emerged pre-eminent amongst all regulatory bodies in the US.
- Significant discretion has been yielded to regulators to work out specifics
- Sheer size and ambiguous wording of the Act exacerbates the uncertainty in the market and economy and will require further fixes over coming years
- The Act does not reform Government Sponsored Enterprises (Fannie Mae, Freddie Mac)
- Far from reducing/simplifying the number of agencies involved in regulation the Act eliminated 1 agency and created 13 more
- Paul asked the question whether spreads and volatility will rise in the market due to new regulation (such as the Volcker rule) and whether ultimately this will trickle down to hinder or benefit SMEs.
- The Act will likely result in regulatory arbitrage opportunities and Paul said this was not a good thing for the United States
Paul said that in his view Congress learned the wrong lessons from the crisis:
- No reform of Fannie Mae and Freddie Mac
- Government Housing Policy left unaddressed
- Transparency still lacking despite efforts from FASB on fair value
- International Policy Co-ordination is still an open question as to its extent
- No reform of existing regulator structures
- The crisis has resulted in payoffs to favoured groups (Unions, Trial Lawyers etc)
Paul talked about how hedge funds and private equity funds are going to experience increased regulation, with them having to register if they have over $100M of assets under management, and with future implications from systemic risk provisions. He mentioned that venture capital investments had escaped being required to register if the lock-up period was over 2 years.
He briefly discussed the coming changes for OTC derivatives on centralised clearing, post-trade reporting and new liability provisions. Paul was also concerned about certain SEC-related issues such as the "whistleblower" provisions, which contain a bounty programme of about 10-30% of any fine subsequently awarded against a financial institution. He re-iterated that it was not yet clear what all of the bodies involved in regulation would be doing, and that at the same time these very same bodies were being given very strong powers, such as that of legal subpoena.
Paul was a very knowledgeable speaker and had some good points to make. Listening to him speak, it would seem from my perspective that the Act is a prime example of "being seen to be doing something" to address the crisis rather than something better structured, with all of the "law of unintended consequences" risks that such an initiative entails.
Posted by Brian Sentance | 14 October 2010 | 8:32 pm