Trading : Data standardisation : Heather McKenzie

SOLVING THE DATA CONUNDRUM.

Heather McKenzie looks at the challenges of data standardisation and the initiatives aiming to overcome them.

Although data standardisation is high on the financial services agenda, there are still several unresolved legacy issues with risk management implications. Market participants have been calling on the industry to join forces and develop effective solutions, but progress has been slow.

In fact, in May 2018, JP Morgan Chase issued a call to action, urging the financial services industry, global regulators and other stakeholders to build collaboratively on their progress toward a data standardisation framework that addresses current deficiencies and allows for innovative new technologies. “Establishing and implementing a common global language for financial instruments and transactions will create efficiency, reduce costs and result in the improved usability of financial data to create valuable information and manage systemic risk,” according to the US-based bank.

It added that establishing and implementing a common global language for financial instruments and transactions, one that is universal across all institutions, would result in “unambiguous meaning, consistent formats, and improved usability of the data to create valuable information”. Moreover, consistent use of such standards in regulatory reporting requirements across the globe would significantly improve the ability of the public sector to understand and identify the build-up of risk across multiple jurisdictions and across complex global financial processes.

The bank also noted that “global data standards also lead to efficiency, saving time and reducing costs that firms and regulators would otherwise expend manually collecting, reconciling, and consolidating data, and will lay the groundwork for the future use of evolving technologies and innovative approaches to data management.”

The 2008 data crisis

JP Morgan’s call to action came a decade after the financial crash, an event which kick-started a number of standardisation initiatives, including the development of the legal entity identifier (LEI). The fallout from 2008 made regulators realise that they could not easily identify the parties to financial transactions across markets, products and regions. Lehman Brothers’ insolvency, for example, resulted in more than 75 separate bankruptcy proceedings. When it collapsed, the group was party to more than 900,000 derivatives contracts.

The Financial Stability Board (FSB) and G20 finance ministers and central bank governors advocated the development of a universal LEI which would be applicable to any legal entity that engaged in financial transactions.

The LEI is a 20-character, alpha-numeric code that contains information about an entity’s ownership structure and thus answers the questions of ‘who is who’ and ‘who owns whom’. The publicly available LEI data pool can be regarded as a global directory, which greatly enhances transparency in the global marketplace. In June 2014, the Global Legal Entity Identifier Foundation (GLEIF), a not-for-profit organisation, was created to support the implementation and use of the LEI.
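For readers who work with the codes directly, the structure of the identifier is straightforward to check programmatically. The short Python sketch below illustrates one way to validate an LEI’s format, assuming the ISO 17442 layout in which the final two of the 20 characters are check digits calculated under the ISO/IEC 7064 (MOD 97-10) scheme; the function name and the commented example are purely illustrative.

import re

def lei_format_is_valid(lei: str) -> bool:
    """Sketch: validate an LEI's structure and its MOD 97-10 check digits."""
    lei = lei.strip().upper()
    if not re.fullmatch(r"[A-Z0-9]{20}", lei):
        return False
    # Letters map to 10-35 (A=10 ... Z=35); digits map to themselves.
    numeric = "".join(str(int(ch, 36)) for ch in lei)
    # A correct check-digit pair leaves a remainder of 1 modulo 97.
    return int(numeric) % 97 == 1

# Example call (the code shown here is hypothetical, not a real LEI):
# lei_format_is_valid("5299000XYZ1234567842")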

Speaking at a financial industry event hosted by the GLEIF in Amsterdam in February 2020, Klaas Knot, president of De Nederlandsche Bank, said that since the introduction of LEIs, more than 1.5 million entities in over 200 countries have registered for an identifier. “The LEI has seen widespread adoption in several financial markets, most notably in the over-the-counter derivatives markets. It is also used increasingly in the issuance of debt and equity securities in some jurisdictions. In other areas, the uptake of the LEI is less widespread, and the Financial Stability Board continues to monitor this progress.”

Knot said the LEI had led to improvements in the quality of data analysis, opening up possibilities for research and data aggregation. The identifier had also improved the accuracy of data reporting and was now being used in international stress tests. These benefits are available not only to regulators, but also to the wider financial industry, he added.

“[The LEI] is invaluable in helping to understand interconnectedness. It has improved our understanding of the build-up of risk across multiple jurisdictions,” he said.

Knot referred to a peer review conducted by the FSB in 2019 to assess the adoption and use of the LEI. It came up with four sets of recommendations to support broader use of the identifier. These are:

  • Mandatory use of LEIs for the identification of legal entities in data reported to trade repositories (this is already being pursued in Europe for derivatives data and from April will be required for securities financing transactions).
  • The FSB to explore the potential role of LEIs in other financial activities.
  • Standard-setting bodies and international organisations to review ways to further embed or improve references to the LEI in their work.
  • The LEI Regulatory Oversight Committee and the Global LEI Foundation to improve the LEI business model to lower the cost and administrative burden for entities.

Knot concluded: “I am convinced the use of the LEI will expand in the coming years. Beyond securities and derivatives trade reporting, and into other sectors. Because in a globally operating financial world it is clear we need global standards.”

Expanding the LEI’s reach

The requirement for LEIs as a means of identifying counterparties that are legal entities in MiFID II transaction reporting ensured uptake of the standard. The upcoming Securities Financing Transactions Regulation (SFTR) also requires the use of LEIs.

In its February 2020 Global LEI Data Quality Report, the GLEIF stated that the number of LEI issuers achieving ‘expected’ or ‘excellent’ data quality was the highest since November 2017. The LEI data quality score assesses 11 criteria: accessibility, accuracy, completeness, comprehensiveness, consistency, currency, integrity, provenance, representation, uniqueness and validity.

While the number of LEI records with parent relationships increased, the number of failing checks related to relationship data remained stable. The ‘business card information’ available with the LEI reference data – the name of a legal entity and its registered address – is referred to as level 1 data. It answers the question ‘who is who?’. In addition, the LEI data pool includes level 2 data – ‘who owns whom’? Legal entities that have or acquire an LEI report their ‘direct accounting consolidating parent’ as well as their ‘ultimate accounting consolidating parent’.
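To make the distinction concrete, a single record in the LEI data pool can be pictured roughly as in the Python sketch below; the field names are illustrative and do not reproduce GLEIF’s published schema.

from dataclasses import dataclass
from typing import Optional

@dataclass
class LEIRecord:
    lei: str                                   # the 20-character identifier
    # Level 1 data - the 'business card': who is who?
    legal_name: str
    registered_address: str
    # Level 2 data - who owns whom? Populated where a parent is reported.
    direct_parent_lei: Optional[str] = None    # direct accounting consolidating parent
    ultimate_parent_lei: Optional[str] = None  # ultimate accounting consolidating parent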

“LEIs have worked well, despite some parts of the industry being sceptical at the start,” says Harry Chopra, chief client officer of US-based risk analytics and data company AxiomSL. “The only remaining challenge is that when an entity goes out of business, its LEI is not switched off. There’s a bit of work to do there.”


Creating standards is one complex and time-consuming task; ensuring they are adopted and complied with in a coherent manner is another. Once data is consistent, says Val Wotton, managing director of product development and strategy, derivatives, at DTCC, firms can leverage it beyond trade reporting.

“We have seen this in credit derivatives with the development of the Trade Information Warehouse. The industry drove standardisation of the underlying product, which is important for timely and accurate confirmation, settlement and reporting.”


Some standards initiatives have faltered because the industry has “bitten off more than it could chew,” according to Linda Coffman, executive vice-president of the Reference Data Utility (RDU) at SmartStream. The adoption and use of standards, particularly for firms with legacy systems, are costly. She believes that while regulation is a major driver of standards initiatives, regulators around the world should collaborate on standards to ensure consistency for global firms.


Alexander Dorfmann, senior product manager, Financial Information, SIX, says the data universe is “highly fragmented”, which means data in the market is not readily available for end users or systems. He points out that even if data is made available at no cost, the preparation and use of data is costly and operationally intensive. “Sometimes the conversation is about making data available free of charge. But that doesn’t do the trick – we need to have standards, both functional and technical, for the data to lower the costs associated with it.”

Organisations such as SIX, DTCC and SmartStream, via the RDU, focus on shielding clients from the lack of data standards. SIX, for example, has built a platform that takes in the data in all of its different formats, normalises it and makes it available to users in the technical format of their choice.

“Our aim is to offer our clients a platform that enables them to decide what type of data they want for their best execution reports,” says Dorfmann.
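The general pattern behind such platforms can be sketched simply: map each incoming feed onto a canonical record, then export the result in whatever technical format the client prefers. The Python sketch below is illustrative only; the feed names, field names and output formats are invented and do not describe any vendor’s actual schema.

import csv
import io
import json

def normalise(record: dict, source: str) -> dict:
    """Map a source-specific record onto one canonical shape (illustrative only)."""
    if source == "feed_a":  # hypothetical upstream format
        return {"isin": record["ISIN"], "venue": record["Exchange"], "price": float(record["Px"])}
    if source == "feed_b":  # another hypothetical upstream format
        return {"isin": record["id"], "venue": record["mic"], "price": float(record["last"])}
    raise ValueError(f"unknown source: {source}")

def export(records: list, fmt: str) -> str:
    """Deliver the normalised records in the technical format the user chooses."""
    if fmt == "json":
        return json.dumps(records, indent=2)
    if fmt == "csv":
        buf = io.StringIO()
        writer = csv.DictWriter(buf, fieldnames=["isin", "venue", "price"])
        writer.writeheader()
        writer.writerows(records)
        return buf.getvalue()
    raise ValueError(f"unsupported format: {fmt}")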


©BestExecution 2020
