Market Data Quality in Financial Services

This article forms part of a series of educational resources designed to help anyone starting out in the financial information industry. Earlier posts defined market and reference data and explained the principles of market data management. This article describes the key attributes of market data quality, explains how data quality requirements vary according to use case, and outlines the key processes used to enhance and assure market data quality.

Within the financial industry, market data quality can sometimes be taken for granted. Given the high cost associated with market data products and services, consumers may simply assume that data quality issues have been resolved by the time they reach financial professionals. Yet that is an over-simplification.

Market data needs to be fit-for-purpose: the purpose for which data is being used determines its quality requirements. This blog defines the key characteristics of market data quality, explains why different market data consumers may define quality differently, and describes some of the processes used to assure market data quality within an enterprise.

Defining market data quality

Many factors go into defining market data quality. Market data is used to support investment decisions (we defined market data, reference data and the difference between them in an earlier blog), and not all investment decisions are the same. Different data consumers will therefore have different data quality requirements depending on how they use the data. Below, we outline five key attributes of market data quality and explore how they differ by use case:

Timeliness

This can be a crucial quality attribute for market data, and different data consumers have different requirements when it comes to timeliness (otherwise known as latency tolerance). Some strategies (such as high-frequency market making or statistical arbitrage) require the lowest-latency data. This leads to data-consuming applications being hosted as close as possible to exchanges’ matching engines, with every aspect of the data processing infrastructure optimised for speed – including the highest-bandwidth, lowest-latency network connections and the fastest processing capabilities (often using hardware acceleration techniques). Data consumed by humans via market data terminals typically has a higher tolerance for latency, with data often being conflated (updates consolidated so that prices on screen change only a few times per second, corresponding to human reaction times). Market data consumers outside the front office (e.g. risk or valuation services) may only require a single closing price for each day, or a snapshot taken at a specific time of day.
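The conflation described above can be sketched in a few lines. The snippet below is an illustrative sketch only (the class name, interface, and publish interval are assumptions, not any particular vendor's implementation): between publishes, only the most recent quote per instrument is kept, so a screen updates at most a few times per second however fast the feed ticks.

```python
import time

class Conflater:
    """Illustrative terminal-style conflation: keep only the latest
    quote per instrument between periodic snapshot publishes."""

    def __init__(self, interval=0.25):  # publish at most ~4 times/sec
        self.interval = interval
        self.latest = {}          # instrument -> most recent price
        self.last_publish = 0.0

    def on_tick(self, instrument, price):
        # Overwrite earlier updates; only the newest quote survives.
        self.latest[instrument] = price

    def maybe_publish(self, now=None):
        """Return a conflated snapshot if the interval has elapsed,
        otherwise None."""
        now = time.monotonic() if now is None else now
        if now - self.last_publish >= self.interval and self.latest:
            snapshot, self.latest = self.latest, {}
            self.last_publish = now
            return snapshot
        return None
```

A low-latency consumer would bypass this layer entirely and take every tick; the conflater exists only for consumers whose reaction time makes the intermediate updates worthless.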

Accuracy

Accuracy of market data is often taken for granted, particularly in highly liquid markets. In such markets, timeliness can be a contributing factor to accuracy: if hundreds of orders are being processed per second, any significant delay in sourcing and processing data can mean prices are no longer valid by the time they reach consumers.

For illiquid OTC markets (instruments that do not trade frequently), accuracy is more of a challenge simply because up-to-date market prices are unavailable. In such cases, data consumers may be forced to rely either on internal resources (e.g. front office marks and/or model-derived prices) or on evaluated pricing services from vendors.

Completeness

Completeness is another attribute of market data quality that can differ depending on the consumer. For example, an algorithmic trading application may require every tick across the full order book, a professional investor terminal may display the best bid on a conflated basis (with a limited amount of market depth), while a retail portal may be satisfied simply with the last traded price. Completeness can also be a key factor when considering market data for illiquid instruments. When an instrument has not traded it is important that stale prices are not used (as this can distort volatility calculations and perceptions of risk). Rather, an alternative model-derived price ought to be calculated (typically using a more liquid proxy) or an evaluated price sourced from a vendor.
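The stale-price handling described above can be illustrated with a minimal sketch. The function name, the one-day staleness threshold, and the proxy-return mechanics are all assumptions for illustration; real evaluated-pricing models are far richer.

```python
from datetime import date

def mark_price(last_trade_day, today, prev_price, proxy_return,
               max_age_days=1):
    """Return a marked price and its provenance. If the instrument has
    not traded recently, roll the last observed price forward using the
    return of a more liquid proxy rather than reusing a stale quote."""
    age = (today - last_trade_day).days
    if age <= max_age_days:
        return prev_price, "traded"
    # Proxy-derived mark: distorts volatility less than a flat stale price.
    return prev_price * (1.0 + proxy_return), "proxy-derived"
```

Tagging the provenance ("traded" vs "proxy-derived") matters downstream: risk and valuation consumers can then decide how much weight to give each mark.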

Consistency

The consistency of market data is particularly important in an enterprise context, where data from the same markets may be sourced from different vendors for different purposes. In such cases, it is important to ensure that instruments can be uniquely identified, that the creation of duplicate records is avoided, and that data fields are defined and mapped consistently.
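One common way to achieve this is to map each vendor-specific identifier onto a single canonical internal identifier, so the same instrument arriving from two feeds yields one master record rather than two. The sketch below assumes a simple static mapping table; the vendor names and identifiers are invented for illustration.

```python
# Hypothetical mapping from (vendor, vendor identifier) to a canonical
# internal ID (here an ISIN-style code, for illustration only).
ID_MAP = {
    ("vendorA", "VOD LN"): "GB00BH4HKS39",
    ("vendorB", "VOD.L"):  "GB00BH4HKS39",
}

def canonical_id(vendor, vendor_id):
    """Resolve a vendor identifier to the internal master ID, raising
    on unmapped identifiers so they land in an exception queue rather
    than silently creating a duplicate record."""
    key = (vendor, vendor_id)
    if key not in ID_MAP:
        raise KeyError(f"unmapped identifier {key}")
    return ID_MAP[key]
```

The important design choice is failing loudly on unmapped identifiers: a silent pass-through is exactly how duplicate records enter a master database.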

Availability

The ‘availability’ of market data is not related to frequency of price updates; it is more of a technical definition – how easily data can be ingested and integrated by consuming systems. In that sense, it is important for any enterprise data management system to be able to cater to a range of requirements – making data available in a variety of files and formats to suit each application.
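Serving the same validated records in whichever format each consuming system can ingest might be sketched as follows (the format list and record shape are assumptions; real systems support many more delivery mechanisms):

```python
import csv
import io
import json

def export(records, fmt):
    """Serialise the same gold-copy records in the format a consuming
    application expects (illustrative: JSON and CSV only)."""
    if fmt == "json":
        return json.dumps(records)
    if fmt == "csv":
        buf = io.StringIO()
        writer = csv.DictWriter(buf, fieldnames=records[0].keys())
        writer.writeheader()
        writer.writerows(records)
        return buf.getvalue()
    raise ValueError(f"unsupported format: {fmt}")
```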

Improving market data quality through enterprise data management, price mastering, and risk factor mastering

Enterprise data management systems are primarily concerned with improving data quality – in particular accuracy, completeness, consistency and availability. Timeliness is not necessarily an attribute that can be improved, although enterprise data management systems do need to cater for different requirements from consuming applications (including real-time or delayed snapshots, end-of-day and historical data).

For the purposes of this article, we will talk about three key processes relevant to improving market data quality – notably data validation, price mastering and risk factor mastering.

Data validation

To enhance and assure market data quality, it is important that data is properly checked and validated. This involves running data through a set of validation rules to detect anomalies that require investigation (and possible correction). To do so, teams need the right tools to narrow down the focus of their exception handling efforts. This can include using adaptive data validation rules (for example, looking for price spikes relative to an index rather than on an absolute basis), or applying other rules to flag high priority items.
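The adaptive rule mentioned above – checking price moves relative to an index rather than on an absolute basis – can be sketched in one function. The 5% threshold is an assumed example value, not a recommendation.

```python
def relative_spike(instrument_return, benchmark_return, threshold=0.05):
    """Adaptive spike check: flag a move only when it diverges from the
    benchmark. A 4% fall on a day the whole index fell 4% is normal;
    the same fall on a flat day is an exception to investigate."""
    return abs(instrument_return - benchmark_return) > threshold
```

Relative checks like this shrink the exception queue on volatile days, when absolute thresholds would flag almost everything and bury genuine anomalies.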

The output of data validation is a master, or ‘gold copy’, of critical data elements. This can include securities and derivatives prices, or other elements used to calculate those prices, such as curves (for rates, credit, FX instruments, etc.) and volatility surfaces.

Securities and derivatives price master

Maintaining a price master for securities and derivatives is crucial for a range of functions. Accurate pricing data helps to support investment decisions, portfolio valuations, margin and risk calculations, among many others. By properly vetting price data, organisations can avoid the use of stale or erroneous prices and therefore ensure the integrity and accuracy of key investment processes.

Risk factor mastering

When measuring risk, organisations are exposed not only to instrument prices: other factors can be equally important. It is therefore vital to maintain centralised and validated master records for key risk factors, including curves, surfaces, and cubes, and enterprise data management systems need to be adept at managing these objects.
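A curve object in a risk factor master is more than a list of numbers: it carries validated tenor/rate pairs plus the ability to produce a rate at any maturity. The minimal sketch below (class name, validation, and flat extrapolation at the ends are assumptions) uses linear interpolation; production systems typically offer several interpolation schemes.

```python
import bisect

class Curve:
    """Minimal master record for a rates curve: validated tenor/rate
    pairs with linear interpolation for intermediate maturities."""

    def __init__(self, tenors, rates):
        if len(tenors) != len(rates) or sorted(tenors) != list(tenors):
            raise ValueError("tenors must be sorted and match rates")
        self.tenors, self.rates = list(tenors), list(rates)

    def rate(self, t):
        """Rate at maturity t (years), flat-extrapolated at the ends."""
        i = bisect.bisect_left(self.tenors, t)
        if i == 0:
            return self.rates[0]
        if i == len(self.tenors):
            return self.rates[-1]
        t0, t1 = self.tenors[i - 1], self.tenors[i]
        r0, r1 = self.rates[i - 1], self.rates[i]
        return r0 + (r1 - r0) * (t - t0) / (t1 - t0)
```

Validating the object on construction (sorted tenors, matching lengths) is the mastering step: downstream valuation and risk code can then trust every curve it receives.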

Conclusion

Market data quality is in the eye of the beholder: data needs to be fit-for-purpose. Although all applications benefit from accurate and consistent data, different consumers will have different requirements for other aspects of market data quality (timeliness, completeness, and availability). There are also regulatory considerations to take into account when ensuring data is fit-for-purpose, with a range of regulations and standards having implications that vary according to the type of market participant.

 

 

Matt Pick, 26 October 2022
