Apex for Interactive (Reference) Data
Launch event for Interactive Data’s new reference data service Apex on Wednesday night, hosted at Nasdaq Times Square and introduced by Mark Hepsworth. Apex looks like a good offering, combining multi-asset data access, batch file and on-demand API requests served from the same data store, plus hosted data management services and a flexible licensing/distribution/redistribution model.
Some good speakers at the event. Larry Tabb ran through his opinions on the current market and the regulatory landscape. He painted a mixed picture, starting with the continuing exit by investors from equity mutual funds, offset to some degree by rapid growth in ETF assets (54% growth over the past 3 years to $1,200 billion). Obviously events such as the Flash Crash, Libor, the London Whale and Knight Capital have not increased investors’ confidence in markets either.
On regulation he first cited the sheer amount of regulation being attempted at the moment, running through systemic risk/too-big-to-fail measures, Dodd-Frank, Volcker, derivatives regulation, Basel III etc. Of particular note, he mentioned some concerns over whether there is simply enough collateral in the market given increased capital requirements and derivatives regulation (a thought apparently shared by the FT in this article).
Given the focus of the event, Larry unsurprisingly mentioned the foundational role of data in meeting the new regulatory requirements, which for the next few years he believes will be focussed on audit and the ability to explain and justify past decisions to regulators. Also given the focus of the event, Larry did not mention his recent article on the Tabb Forum on federated data management strategies which I would have been interested to hear Interactive’s comments on, particularly given their new hosted data management offerings.
Mike Atkin of the EDM Council was next up and described a framework for what he thought was going on in the market. In summary, he split the drivers for change into business and regulatory, and categorised the changes into:
- Transparency
- Systemic Risk
- Capital and Liquidity
- Clearing and Settlement
- Control and Enforcement
He then said that the fundamental challenge with data was to work through the chain of identifying things, describing them, classifying/aggregating them and finally establishing linkages between them. He ended this part of his presentation with the three aspects he thought necessary to sort this out: industry data standards, best-practice methods, and the infrastructure to enable these changes.
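As a toy illustration of that identify/describe/classify/link chain (every identifier, field and classification below is a made-up example of my own, not anything from Mike’s talk), the four steps for a single instrument might look like this:

```python
# Toy illustration of the identify -> describe -> classify -> link chain.
# All identifiers, fields and classifications are hypothetical examples.

instrument = {
    # 1. Identify: give the thing an unambiguous identifier
    "id": "US912828XYZ11",  # hypothetical ISIN
    # 2. Describe: attach the attributes that define it
    "description": {"issuer": "US Treasury", "coupon": 2.5,
                    "maturity": "2023-05-15"},
    # 3. Classify/aggregate: place it in a hierarchy for aggregation
    "classification": ["Fixed Income", "Government", "US Treasury Note"],
    # 4. Link: establish relationships to other entities and datasets
    "links": {"issuer_lei": "HYPOTHETICAL0LEI0001",
              "benchmark": "UST 10Y"},
}
```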
Mike then went on to recount a conversation he had had with a hedge fund manager, who had defined the interesting concept of a “Data Risk Equation”:
Data Risk = (N × CC × S) / (Q × V)
where:
- N: the number of variables
- CC: a measure of calculation complexity
- S: the number of data sources needed
- Q: a measure of data quality
- V: a measure of verifiability
I think the angle was that the hedge fund manager was simply using a form of the above to categorise and compare the complexity of some of the data issues his firm was dealing with.
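Purely as an illustration of how such a scoring might be used, here is a minimal sketch in Python; the function name, the 1-10 scales and the example values are all my own assumptions, not anything presented at the event:

```python
# Minimal sketch of the "Data Risk Equation" above.
# All names, scales and example values are illustrative assumptions.

def data_risk(n_vars, calc_complexity, n_sources, quality, verifiability):
    """Score a data issue: risk rises with the number of variables (N),
    calculation complexity (CC) and data sources (S), and falls as
    quality (Q) and verifiability (V) improve."""
    return (n_vars * calc_complexity * n_sources) / (quality * verifiability)

# Hypothetical comparison of two data issues, scoring each factor 1-10:
vanilla_equity_feed = data_risk(n_vars=5, calc_complexity=2, n_sources=1,
                                quality=8, verifiability=9)   # ~0.14
exotic_derivative = data_risk(n_vars=40, calc_complexity=9, n_sources=6,
                              quality=4, verifiability=3)     # 180.0

print(f"vanilla equity feed: {vanilla_equity_feed:.2f}")
print(f"exotic derivative:   {exotic_derivative:.1f}")
```

On this kind of scale the exotic derivative scores orders of magnitude riskier than the vanilla feed, which seems to be the sort of comparison the hedge fund manager had in mind.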
Aram Flores of Deutsche Bank then talked briefly. Of note was his point that the new regulation was forcing DB to use more external rather than internal data, since regulation now restricts the use of internal data within regulatory reporting. Sounds like good news for Interactive and some of its competitors. Eric Reichenberg of SS&C GlobeOp then gave a quick talk on the importance of accurate data to his derivative valuation services. The talks ended with a well-prepped conversation between Marty Williams and one of their new Apex clients, who jokingly referred to one of the other well-known data vendors as the “Evil Empire”, which raised a few smiles – fortunately the speaker didn’t start to choke at this point, so obviously Darth Vader wasn’t spying on the proceedings…
So overall a good event: the new product offering looks interesting, the speakers were entertaining and the drinks/food/location were great.