Matthew Berry on the Libor/OIS curve debate
June 25, 2013
Guest post today from Matthew Berry of Bedrock Valuation Advisors, discussing Libor vs OIS based rate benchmarks. Curves and curve management are a big focus for Xenomorph's clients and partners, so great that Matthew can shed some further light on the current debate and its implications:
Proposal’s Significant Implications for Data Management
During the 2008 financial crisis, problems posed by
discounting future cash flows using Libor rather than the overnight index swap
(OIS) rate became apparent. In response, many market participants have modified
systems and processes to discount cash flows using OIS, but Libor remains the
benchmark rate for hundreds of trillions of dollars' worth of financial
contracts. More recently, regulators in the U.S. and U.K. have won enforcement
actions against several contributors to Libor, alleging that these banks
manipulated the benchmark by contributing rates that were not representative of
the market, and which benefitted the banks’ derivative books of business.
In response to these allegations, the CFTC in the U.S. and
the Financial Conduct Authority (FCA) in the U.K. have proposed changes to how
financial contracts are benchmarked and how banks manage their submissions to
benchmark fixings. These proposals have significant implications for data
management.

The U.S. and U.K. responses to benchmark manipulation
In April 2013, CFTC Chairman Gary Gensler delivered a speech
in London in which he suggested that Libor should be retired as a benchmark.
Among the evidence he cited to justify this suggestion:

- Liquidity in the unsecured inter-dealer market has largely dried up.
- The risk implied by contributed Libor rates has often diverged from the
risk implied by credit default swap rates: Libor submissions were
frequently stale and unchanged even when the submitting entity's CDS
spread moved significantly. Gensler provided a graph to demonstrate this.
Gensler proposed to replace Libor with either the OIS rate
or the rate paid on general collateral (GC) repos. These instruments are
more liquid and their prices more readily observable in the market. He
proposed a transition period during which Libor is phased out while OIS
or the GC repo rate is phased in.
In the U.K., the Wheatley
Report provided a broad and detailed review of practices within banks that
submit rates to the Libor administrator. This report found a number of
deficiencies in the benchmark submission and calculation process, including:

- The lack of an oversight structure to monitor systems and controls at
contributing banks and the Libor administrator.
- Insufficient use of transacted or otherwise observable prices in the
Libor submission and calculation process.
The Wheatley Report called for banks and benchmark
administrators to put in place rigorous controls that scrutinize benchmark
submissions both pre and post publication. The report also called for banks
to store a historical record of their benchmark submissions and for
benchmarks to be calculated using a hierarchy of prices, with preference
given to transacted prices, then prices quoted in the market, then
management's estimates.
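The pre- and post-publication scrutiny the Wheatley Report calls for lends itself to simple, scriptable exception rules. As an illustrative sketch (the rule choices and the three-sigma tolerance are my own assumptions, not from the report), a monitor might flag submissions that are either stale relative to their own history or that deviate sharply from it:

```python
from statistics import mean, stdev

def flag_exceptions(submissions, history, tolerance_sigmas=3.0):
    """Flag benchmark submissions that warrant review.

    submissions: dict mapping bank name -> today's submitted rate (%)
    history: dict mapping bank name -> list of recent submitted rates (%)
    Returns a list of (bank, rate, reason) tuples for a supervisor.
    """
    exceptions = []
    for bank, rate in submissions.items():
        past = history.get(bank, [])
        if len(past) < 2:
            exceptions.append((bank, rate, "insufficient history"))
            continue
        # A submission identical to every recent submission is suspect
        # (the "stale rate" pattern Gensler highlighted).
        if all(r == rate for r in past):
            exceptions.append((bank, rate, "stale: unchanged vs. history"))
            continue
        mu, sigma = mean(past), stdev(past)
        # Flag large deviations from the bank's own recent behaviour.
        if sigma > 0 and abs(rate - mu) > tolerance_sigmas * sigma:
            exceptions.append((bank, rate, "outlier vs. recent history"))
    return exceptions
```

Because the rules and tolerances are plain parameters, they can be adapted to changing market conditions without rebuilding the monitoring system.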
Implications for data management
The suggestions for improving benchmarks made by Gensler and
the Wheatley Report have far-reaching implications for data management.
If Libor and its replacement are run in parallel for a time,
users of these benchmark rates will need to store and properly reference two
different fixings and forward curves. Without sufficiently robust technology,
this transition period will create operational, financial and reputational risk
given the potential for users to inadvertently reference the wrong rate. If
Gensler’s call to retire Libor is successful, existing contracts may need to be
repapered to reference the new benchmark. This will be a significant
undertaking. Users of benchmarks who store transaction details and reference
rates in electronic form and manage this data using an enterprise data
management platform will mitigate risk and enjoy a lower cost to transition.
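One way to reduce the risk of referencing the wrong rate during a parallel run is to key every stored fixing explicitly by benchmark, tenor and date, and to fail loudly when a lookup misses rather than fall back silently to another curve. A minimal sketch (the class and naming are hypothetical, not a description of any particular platform):

```python
class RateStore:
    """Store fixings keyed by (benchmark, tenor, date) so contracts
    referencing Libor and its successor cannot pick up the wrong rate."""

    def __init__(self):
        self._fixings = {}

    def add_fixing(self, benchmark, tenor, date, rate):
        self._fixings[(benchmark, tenor, date)] = rate

    def get_fixing(self, benchmark, tenor, date):
        key = (benchmark, tenor, date)
        if key not in self._fixings:
            # Fail loudly: silently substituting another benchmark's
            # rate is exactly the operational risk described above.
            raise KeyError(f"No {tenor} {benchmark} fixing for {date}")
        return self._fixings[key]
```

With both benchmarks stored side by side, every lookup names the curve it wants, so a contract repapered to the new benchmark simply changes its key.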
Within the submitting banks and the benchmark administrator,
controls must be implemented that scrutinize benchmark submissions both pre and
post publication. These controls should be exceptions-based and easily scripted
so that monitoring rules and tolerances can be adapted to changing market
conditions. Banks must also have in place technology that defines the
submission procedure and automatically selects the optimal benchmark submission.
If transacted prices are available, these should be submitted. If not, quotes
from established market participants should be submitted. If these are not
available, management should be alerted that it must estimate the benchmark
rate, and the decision-making process around that estimate should be
documented.
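The submission procedure just described is a simple waterfall, and could be sketched as follows (an illustration of the hierarchy only; the averaging of observed prices and the callback for management's estimate are my assumptions):

```python
def select_submission(transacted, quotes, estimate_fn):
    """Pick a benchmark submission using the price hierarchy:
    transacted prices first, then market quotes, then an alerted
    management estimate as a last resort.

    transacted: rates from actual transactions (may be empty)
    quotes: rates quoted by established market participants
    estimate_fn: callback that alerts management and returns its estimate
    Returns (rate, source_label) so every choice is auditable.
    """
    if transacted:
        return sum(transacted) / len(transacted), "transacted"
    if quotes:
        return sum(quotes) / len(quotes), "quoted"
    # No observable prices: alert management and record its estimate.
    return estimate_fn(), "management estimate"
```

Returning the source label alongside the rate gives the historical record the Wheatley Report asks for: each stored submission carries the evidence of where in the hierarchy it came from.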
These improvements to the benchmark calculation process
will, in Gensler’s words, “promote market integrity, as well as financial
stability.” Firms that effectively utilize data management technology, such as Xenomorph's TimeScape, to implement these changes will manage the transition to a new
benchmark regime at a lower cost and with a higher likelihood of success.