Time for System Regulation?
October 8, 2008
In these difficult times, many questions are being asked about corporate systems infrastructure and why it failed to deliver on the needs of the business in times of crisis, just when it was needed most. Despite vast sums of money being poured into those systems, they ultimately failed to do the job most needed of them – protecting the business from risk.
However, despite the inevitable finger pointing, it’s possible to see how this may have come about. Too many companies have been obsessed with striving for the next best thing to increase short-term returns. System Architects have been tasked with delivering the "Holy Grail of Trading Technology" (which they never reach). This pressure to deliver the ultimate solution has inevitably left gaps in the existing systems (maybe because they were too technically boring or, more likely, misunderstood?) that have then been exploited and allowed "toxic" risks to pile up.
What the industry is crying out for in this area are standards around risk modelling, performance, quality, accuracy and interoperability. Standards that all external (and internal?) vendors adhere to and can be publicly measured by. Standards that allow the consumer to make choices based on clear, like-for-like facts, as opposed to forcing people to wade through the marketing hype to get to the fundamental issues and detail that matter. Vendors need to come together and cooperate around those standards to deliver the solutions the business community needs, without having to reinvent the wheel another 20 times over.
STAC (with whom Xenomorph has been working in the area of market data analysis benchmarking) are making clear strides in this direction, as indicated by their latest press release around trading technology.
However, just because STAC have started to define standards doesn’t mean that vendors will necessarily join the party. Just look at Microsoft and the "standard" database benchmarking statistics posted for SQL Server. Microsoft has been posting performance benchmarks for years, but none of the other database vendors appear to be joining in (see SQL Server Magazine article). Why is this? Is it because these benchmarks are deemed meaningless to the consumer, or because the other vendors do not want to be seen to perform badly against them and hence avoid them? My guess is that it is probably some combination of both.
Action is clearly needed.
There are of course already many independent standards initiatives underway within financial services – the market is by no means devoid of them! Most of these have focused on the essential areas of interoperability and messaging (FIX, FpML, MDDL, SWIFT to name but a few). Many of these standards bodies have evolved because the market (as opposed to the regulators) has recognised the need to increase efficiency within the industry as a whole. As a result, they have taken a notoriously long time to evolve as they struggle to sanitise the myriad of exotic financial products that now pervade financial services across countless organisations. For example, just look how long FpML has taken – it started life in 1997 as a research project within JP Morgan and has taken over 10 years to reach its current state. It is still struggling to keep pace with the market. It’s a race that it has failed to win in the past but maybe, just maybe, because of recent events and the inevitable caution that will follow around the more exotic products, it will now have a chance of catching up!
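For readers less familiar with this "wiring", it is worth seeing how simple the underlying mechanics of a messaging standard like FIX actually are – and, by implication, how much of the hard work lies in agreeing the semantics rather than the syntax. The sketch below builds and parses a FIX-style tag=value message in Python; the tag numbers used (8=BeginString, 35=MsgType, 55=Symbol, 10=CheckSum) are standard FIX tags, but the message is deliberately stripped down for illustration (for instance, a real FIX message would also carry a BodyLength field, tag 9), so treat it as a toy rather than a conformant implementation.

```python
# A FIX message on the wire is a flat list of tag=value pairs separated
# by the SOH (0x01) control character, ending with a CheckSum field.
# Simplified sketch: BodyLength (tag 9) and sequencing fields omitted.

SOH = "\x01"

def build_fix_message(fields):
    """Assemble (tag, value) pairs into a FIX-style string with a checksum."""
    body = SOH.join(f"{tag}={value}" for tag, value in fields) + SOH
    # CheckSum (tag 10) is the byte sum of the preceding message
    # modulo 256, rendered as exactly three digits.
    checksum = sum(body.encode("ascii")) % 256
    return body + f"10={checksum:03d}" + SOH

def parse_fix_message(raw):
    """Split a FIX-style string back into a tag -> value dictionary."""
    pairs = (field.split("=", 1) for field in raw.strip(SOH).split(SOH))
    return {tag: value for tag, value in pairs}

# A minimal new-order-style message: FIX 4.2, MsgType D, symbol IBM.
msg = build_fix_message([("8", "FIX.4.2"), ("35", "D"), ("55", "IBM")])
print(parse_fix_message(msg)["55"])  # -> IBM
```

The syntax, as you can see, is trivial; the decade of effort behind standards like FIX and FpML has gone into agreeing what the tags mean, which combinations are valid, and how every exotic product should be represented.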
Despite the considerable effort these standards bodies have put in, we are still, as an industry, at the beginning of that process and dealing with the "wiring" as opposed to the overall system response and measurement that the consumer (and regulator) will now demand to be monitored.
The recent market turmoil is forcing the regulators to start to talk about the need for OTC clearing and better margining so that exposures are crystallised and more transparent to all participants. However, the financial industry and its consumers can ill-afford to wait for standards to evolve slowly and, as the mist begins to clear from the nuclear fallout of recent weeks, the regulators will now be forced (politically) to come down strongly on the industry as a whole and instil a quality of practice that has been sadly missing – a quality of practice that has been present in many other industries (such as manufacturing, pharmaceuticals and the academic sciences) for years.
Despite the complexity of the financial markets, there is no doubt that the vendors and practitioners within them are perfectly capable of moving mountains when they are appropriately incentivised. A quick look at the actions of the Swedish government, which forced its banks to get their act together and reduce clearing times in 2002, proves this to be the case (see BBC article). In that instance, the banks were given an ultimatum to clean up their act quickly or face the music – unsurprisingly, they did.
The same types of ultimatum will now be applied on a global basis. They have to be. Meaningful standards will need to be enforced for financial services systems and practices. Those standards will be forced to ripple down the food chain from the regulator, through the banks and on to the vendor supply chain. These standards will need to be independently defined and verified. With budgets coming under increasing scrutiny, organisations will be forced to look outside their walls and, in doing so, the vendors will need to be ready with products and services that meet the needs of the regulators.
I think STAC is heading the right way in starting to set standards that look at overall systems response and beyond.
What’s needed now is sufficient pressure by the consumer organisations and regulators to level the playing field, impose standard reference architectures and practices, and force vendors to commit to them and be measured by them…