RAISING THE DATA BAR.
Heather McKenzie explains why data management is more than just a technology issue.
For decades, trading firms have heard the data mantra: “rubbish in, rubbish out”. If poor quality data is fed into trading, risk and reporting systems, poor quality decisions result.
For years, data was not considered an asset, let alone a ‘strategic’ one. Data was necessary to clear trades and feed risk management systems, but it had little or no bearing on regulatory compliance. This has changed with the growing volume of reference, client and market data that has flooded into financial institutions – often in duplicate – to comply with the new legislative environment.
It has also spawned a large industry of data cleansing and scrubbing firms, born of financial institutions’ inability to manage their data well and ensure its accuracy. The old solution of throwing technology and bodies at the data management problem is becoming increasingly expensive, and it is also coming under greater regulatory scrutiny. Financial regulators are implementing new requirements, particularly around trade reporting, across a broad range of market activities, asset classes and countries. These are proving to be a catalyst for firms to review their data management architectures.
One of the challenges is that firms have disparate data feeds and databases acquired either through mergers or restructurings. “Regulation is a catalyst for thinking about data more strategically,” says Bradley Wood, a partner at London-based GreySpark Consulting. “Many dinosaur City [of London] managers thought data was an IT problem. Now they are realising that data properly managed and exploited can yield returns.”
GreySpark has issued a three-part data strategy report that addresses the issue of data as a strategic asset. “A strategic approach should seek to exploit data so that collecting and managing it effectively benefits the business rather than hinders it,” says the report. “Doing so means considering the data the business generates as an asset in its own right rather than as a regulatory liability or IT overhead.”
Viewing data as a regulatory liability is not uncommon, given the raft of rules being imposed on trading firms. The impact is particularly significant in the derivatives space, with the European Market Infrastructure Regulation (EMIR) and the US Dodd-Frank Act placing emphasis on trade reporting.
“Participants in the derivatives industry are required to focus on client and reference data for trade reporting,” says Virginie O’Shea, a senior analyst at research firm Aite Group. “The first round of reporting didn’t go well. Regulators could not understand data that was in different formats and were concerned there was double reporting in some cases. Regulators are now revising how they want firms to report to trade repositories.”
Another regulatory issue that is testing the data management strategies of firms is the Basel Committee on Banking Supervision’s principles for effective risk data aggregation and risk reporting, known as BCBS 239. Like many post-2008 regulatory initiatives, the principles were developed because of the inability of many banks to determine their exposure to Lehman Brothers. The recommendations “poke holes” in the way firms manage risk data, says O’Shea. That data – often client or securities identifiers – is frequently of poor quality. As a result, decisions on how to manage risk are based on unreliable information.
To meet BCBS 239 requirements, firms need to adopt consistent data reporting, says Chris Probert, a managing principal at global financial services industry consulting firm Capco. “Firms will be required to get trades into their trading and risk systems in a consistent and accurate way,” he says. “This comes down to data lineage – understanding where data comes into the trade book and how it is used, enriched, augmented and stored.”
Very often, financial institutions address regulation on a first come, first served basis, tackling the most pressing requirements first. This is changing, according to John Randles, chief executive at enterprise data management firm Bloomberg PolarLake. “The requirement to report to trade repositories has had a knock-on effect on other data sets such as reference, pricing, holding and positional data. New securities master file projects are being driven by the need for better securities master data, linked to EMIR reporting.”
While in the past the front office was not particularly concerned about securities master data, EMIR and BCBS 239 mean that firms are more focused on ensuring the inputs into their risk management systems are robust and the governance around them is strong. “We see more firms looking to put together a data fabric and governance that will meet many regulatory requirements,” says Randles. “Firms no longer have the resources to treat each regulation as a separate project.”
Regulators want to know at any point in time how a firm priced a stock, who touched the trade and how traceable the transaction is. “Regulators are looking for confidence in the trading process. In order to meet this requirement, firms are focused on developing data governance strategies that will stand the test of time,” he adds.
Regulations are raising the bar on data governance practices, says Paula Arthus, president and chief executive of Omgeo, a subsidiary of the US Depository Trust and Clearing Corp. “Many firms have a fragmented view of their client data, which has evolved over time to support their business and products. There is now a lot of activity around data discipline and rigour to develop a strategy that will provide a holistic view of data.” Not only will this help firms to comply with regulations, it will also improve analytics and deliver more accurate information into risk systems.
While data management is supported by technology, Arthus stresses that the business side of a firm also must be involved in any data management strategy. Firms are beginning to appoint chief data officers in recognition of the importance of data within a firm.
“The data management now required is beyond technology – firms cannot just throw technology at it,” says O’Shea. “A data management strategy needs to be bottom up and top down. The people who input data are not necessarily data management staff – they are most likely to be from the business side. It has to be explained to them that inaccurate data entry is a problem for everyone throughout the organisation.”
Firms that regard data management as a technology problem are “making a big mistake”, says Wood. “There needs to be a change in behaviour in financial firms. Much of the data is poor and data cleansing and scrubbing is a significant drain on resources; it should not need to be done.” Bad practices and erroneous decisions have led to “armies” of offshore resources being deployed to massage data. He adds there has been an “accumulation of bad decisions made in data management over the decades”.
With the ‘data deluge’ set to continue, according to GreySpark’s report, it is “critically important” that all firms have a strategic response. “For a bank, having a strategy to manage exponential increases in the size of the data created means having the ability to leverage reliable, timely data in order to increase revenues, facilitate strategic objectives and cut costs.”
Implementing an effective data strategy will require both investment and cultural change, some of which can be painful, but as GreySpark points out, strategic data management is an imperative that, if ignored, will threaten a bank’s survival. Strategic data management leads to a better understanding of the client base. The banks that can use their data to predict and service client needs will emerge as the dominant market players.