Market Data Management and Reference Data Management
In the first blog in this series, we explained the difference between market data and reference data. In this follow-up article, we provide an overview of the various disciplines involved in managing market data and reference data for a large organisation, a set of activities often grouped under the catch-all title of enterprise data management.
Managing market data and reference data is a highly complex task, and therefore necessitates a multi-disciplinary approach. Financial institutions typically employ teams with varied skill sets to manage data. In this article, we look at some of the discrete challenges faced by those teams – covering the technical, commercial, administrative and legal aspects of market data management.
Technical challenges of Market Data Management
Managing data poses significant technical challenges for organisations, and those challenges differ depending on the dataset in question, given that market data and reference data are often provided using distinct technologies.
Market data technology
Most real-time market data originates from market operators – such as exchanges and alternative trading venues. In over-the-counter (OTC) markets, price feeds will also be available from interdealer brokers or be distributed directly from dealing banks to their clients.
The technical mechanisms for distributing market data feeds tend to vary. Some use multicast (one-to-many) protocols such as UDP/IP, while others use unicast (one-to-one) protocols such as TCP/IP. Although some standards, such as Simple Binary Encoding from FIX Trading Community and Nasdaq’s ITCH, have been adopted by multiple trading venues, there tends to be a prevalence of proprietary protocols for market data distribution (unlike order routing, where FIX is well established). Connectivity to market data feeds is typically supported via direct circuits or VPN access. However, as market operators prepare to move parts of their infrastructure to the cloud, this could change, and internet-style technologies (such as REST APIs) could become more prevalent for data access.
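As a toy illustration of why a feed handler is needed for each proprietary protocol, the sketch below decodes a hypothetical fixed-width binary quote message into a normalised record. The message layout, field names and price scaling are all invented for this example and do not correspond to any real venue's protocol:

```python
import struct

# Hypothetical fixed-width binary quote message, loosely in the spirit of
# binary exchange feeds (the field layout here is invented):
#   8-byte symbol (ASCII, space-padded) | uint64 price (scaled by 1e4)
#   | uint32 size | 1-byte side (b'B' bid or b'S' offer)
QUOTE_FORMAT = ">8sQIc"  # big-endian, as binary feeds commonly are

def decode_quote(payload: bytes) -> dict:
    """Decode one raw quote message into a normalised dictionary."""
    symbol, scaled_price, size, side = struct.unpack(QUOTE_FORMAT, payload)
    return {
        "symbol": symbol.decode("ascii").strip(),
        "price": scaled_price / 10_000,  # undo the fixed-point scaling
        "size": size,
        "side": "bid" if side == b"B" else "ask",
    }

raw = struct.pack(QUOTE_FORMAT, b"VOD.L   ", 1_023_500, 2_000, b"B")
print(decode_quote(raw))
# {'symbol': 'VOD.L', 'price': 102.35, 'size': 2000, 'side': 'bid'}
```

Every proprietary feed requires a decoder of this kind, maintained in step with the venue's change notifications – which is precisely the burden that consolidated feed vendors take on for their customers.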
Such diversity makes life more difficult for financial institutions. To simplify matters, some data technology vendors serve as aggregators, consolidating direct exchange feeds from multiple markets and normalising them into a common feed format and protocol. However, many organisations will still require direct feeds for more latency-sensitive functions (such as market making and statistical arbitrage), and will also typically source multiple consolidated feeds, contributing to ongoing complexity.
The variety of feeds is further complicated by the fact that each feed is regularly changing. Venues and data aggregators typically publish a regular stream of change notifications detailing updates that impact data consumers, such as the addition or removal of securities (e.g. following IPOs, new issues or de-listings) or alterations to the data model.
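Processing those notifications is typically automated on the consumer side. The sketch below applies a list of hypothetical add/remove notices to an instrument universe; the notice schema is an assumption for illustration only:

```python
# Hypothetical change notifications, as a venue or aggregator might publish
# them (the schema is invented for illustration):
notices = [
    {"action": "ADD",    "symbol": "NEWCO", "reason": "IPO"},
    {"action": "REMOVE", "symbol": "OLDCO", "reason": "de-listing"},
]

universe = {"OLDCO", "ACME", "GLOBEX"}

def apply_notices(universe: set, notices: list) -> set:
    """Return the instrument universe after applying venue change notices."""
    updated = set(universe)
    for n in notices:
        if n["action"] == "ADD":
            updated.add(n["symbol"])
        elif n["action"] == "REMOVE":
            updated.discard(n["symbol"])
    return updated

print(sorted(apply_notices(universe, notices)))
# ['ACME', 'GLOBEX', 'NEWCO']
```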
Reference data technology
Some might assume the technical challenges of managing reference data are less significant, given that the data is less latency-sensitive and often characterised as ‘static’. Unfortunately, this is a simplistic interpretation. A diverse ecosystem also exists among reference data vendors, each with its own distribution technologies (including end-of-day file updates in formats such as XML, JSON and CSV; APIs in a variety of languages; or even direct database access), as well as its own distinct data models and instrument symbologies. This makes the landscape difficult to navigate.

Most organisations will seek to create a securities master, or instrument master, which requires mapping and merging reference data from competing vendors into a single master copy. Equally, organisations will typically require an instrument pricing master, as well as a risk factor master (curves and surfaces), to feed middle- and back-office processes such as risk modelling, margin calculations and portfolio valuations. This entails running validation rules to detect stale prices, calculating updated prices derived from proxy instruments, investigating pricing anomalies and generating a gold copy time series for use in enterprise applications.
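At its core, building a securities master involves keying vendor records on a common identifier and merging them field by field. The sketch below is a deliberately minimal illustration of that idea, keyed on ISIN; the vendor records, field names and "first vendor wins" precedence rule are all assumptions made for this example:

```python
# Sample reference data from two hypothetical vendors, keyed on ISIN.
vendor_a = {
    "GB00BH4HKS39": {"name": "Vodafone Group", "currency": "GBP"},
    "US0378331005": {"name": "Apple Inc", "currency": "USD"},
}
vendor_b = {
    "GB00BH4HKS39": {"name": "Vodafone Group Plc", "sector": "Telecoms"},
    "US5949181045": {"name": "Microsoft Corp", "currency": "USD"},
}

def build_master(*sources: dict) -> dict:
    """Merge vendor records per ISIN. Earlier sources take precedence
    field by field; later sources only fill in the gaps."""
    master: dict = {}
    for source in sources:
        for isin, record in source.items():
            merged = master.setdefault(isin, {})
            for field, value in record.items():
                merged.setdefault(field, value)  # keep first vendor's value
    return master

master = build_master(vendor_a, vendor_b)
print(master["GB00BH4HKS39"])
# {'name': 'Vodafone Group', 'currency': 'GBP', 'sector': 'Telecoms'}
```

In practice the precedence rules are usually far richer (per-field, per-asset-class, with exception queues for conflicting values), but the mapping-and-merging pattern is the same.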
Commercial challenges of Market Data Management
In addition to the technical complexity, the commercial aspects of managing market and reference data also represent a multifaceted challenge. Data is often cited as the third largest cost for financial institutions, behind only people and facilities. Given this significance as a P&L line item, it is unsurprising that most large organisations have teams dedicated to sourcing, integrating and optimising the use of data. From a commercial perspective, key challenges include:
Market data vendor management: Each vendor (of market data or reference data) offers its own unique strengths in terms of content coverage, data quality, technical capabilities, and commercial models. Services also evolve and commercial models are not set in stone. Being able to maximise value from vendor relationships is a fine art and requires detailed knowledge of competing products and services, and ideally a collaborative approach to negotiate favourable outcomes for the data consumer.
Optimising market data usage: The size and complexity of most financial organisations means that a wide variety of data sets, products and services are consumed by multiple teams spread across regions, departments, and business lines. Simply ascertaining what is being consumed can be a monumental challenge. Only once usage is monitored accurately does it become possible to manage and optimise consumption – potentially identifying redundant sourcing, ensuring services are not procured unnecessarily (and cancelling services when no longer needed), and finding lower-cost substitutes where available.
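As a minimal illustration of the redundant-sourcing check, the sketch below scans a subscription inventory for datasets sourced by more than one department, flagging candidates for consolidation. The inventory structure and all dataset and department names are invented for this example:

```python
from collections import defaultdict

# Hypothetical inventory of (dataset, consuming department) pairs, as might
# be extracted from an inventory or expense management system.
subscriptions = [
    ("Equities L1 - LSE", "Front Office"),
    ("Equities L1 - LSE", "Risk"),
    ("FX Spot Rates", "Treasury"),
    ("End-of-day Bond Prices", "Risk"),
]

# Group consuming departments by dataset.
consumers = defaultdict(set)
for dataset, department in subscriptions:
    consumers[dataset].add(department)

# Datasets sourced by more than one department are consolidation candidates.
redundant = {d: teams for d, teams in consumers.items() if len(teams) > 1}
print({d: sorted(t) for d, t in redundant.items()})
# {'Equities L1 - LSE': ['Front Office', 'Risk']}
```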
Administrative and legal challenges of Market Data Management
Financial information tends to be licensed under strict terms and conditions, which means financial institutions have complex administrative and legal challenges to manage in relation to data consumption. Simply knowing the content of every vendor’s licensing policies and their commercial implications is a challenge in itself.
Beyond knowledge and interpretation of policies, organisations need to demonstrate compliance with those policies. To do so, they will typically need technical controls in place with regards to access control and monitoring consumption. Equally, expense management systems can help to reconcile what is actually consumed against what is billed, spotting any anomalies or opportunities to cancel unused services.
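In its simplest form, such a reconciliation is a comparison of three lists: what is billed, what is entitled, and what is actually used. The sketch below illustrates the idea; all service names are invented for this example:

```python
# Hypothetical reconciliation inputs, as might come from a vendor invoice,
# an entitlements system and usage monitoring respectively.
billed = {"Level 2 Order Book", "News Feed", "Corporate Actions"}
entitled = {"Level 2 Order Book", "Corporate Actions", "Tick History"}
actually_used = {"Level 2 Order Book"}

# Billed but never provisioned: a possible billing error to query.
billed_not_entitled = billed - entitled
# Provisioned but unused: candidates for cancellation.
entitled_not_used = entitled - actually_used

print(sorted(billed_not_entitled))  # ['News Feed']
print(sorted(entitled_not_used))    # ['Corporate Actions', 'Tick History']
```

Real expense management systems add unit pricing, declaration schedules and per-user netting on top, but the underlying logic is this kind of set comparison.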
On occasion, disagreements will also arise between data vendors and their customers. Vendors periodically audit their customers to ensure they are complying with usage policies, and any instance of non-compliance is likely to trigger a fine or legal dispute.
Finding the right solutions
To address the technical, commercial, administrative and legal challenges highlighted in this article, a variety of software solutions are available to support financial institutions in managing market data and reference data. Xenomorph provides trusted data management solutions to many of the world’s leading financial institutions, helping to solve not only technical challenges but also commercial and administrative ones. Our market data management solutions are designed to address many of the specific challenges highlighted in this article – from market data vendor management and optimising market data usage, to providing a data catalogue to map data relationships and create a centralised master copy. For more information on our market data management solutions, get in contact, and we would be happy to arrange a demo.