Data management: Outsourcing | Heather McKenzie

MANY HANDS MAKE LIGHT WORK.

While data is becoming increasingly crucial for financial firms, it remains a rather messy business: buyside and sellside firms invest heavily in cleaning, enriching and aggregating data. Despite this expenditure, data management continues to be a thorn in the side of many firms. Heather McKenzie investigates what can be done about it.

Financial regulators and clients are data hungry – regulators want to avoid a repeat of 2007/8, while clients want more focused analysis and data to help them make better investment decisions or to report to their underlying clients. Reporting is an important element in today’s financial world and data drives this function. However, disparate data feeds, a lack of standardisation and cost pressures combine to make data management a significant challenge for many firms.

As regulators require more data and a growing number of investment firms expand into multi-asset or multi-geography strategies, the traditional ways of dealing with data management are no longer sufficient, according to James (JR) Lowry, head of State Street Global Exchange, EMEA. Historically, most investment management firms did data management themselves, either building their own data warehouses or buying a packaged solution that was installed in a data centre.

This has changed with an increasing number of firms hosting data in the cloud or outsourcing to specialist firms, fund administrators or custodians such as State Street. This enables data aggregation and management to be carried out on a firm’s behalf while it focuses on its core fund management business.

Variations on an outsourcing theme

The solutions that are emerging to deal with data all feature an element of outsourcing, although it is often called something else. Data utilities, for example, are a form of outsourcing whereby a group of companies or financial institutions combine their efforts to run certain aspects of data management.

An example is SmartStream’s Reference Data Utility (RDU), which was set up in late 2015. Goldman Sachs, JP Morgan and Morgan Stanley have joined with the technology firm to develop services for instrument reference data normalisation and validation across all asset classes. The founding banks will be clients of the utility.


Peter Moss, chief executive of the RDU, says if reference data is not correct from the outset, significant breaks can occur in automation. “The initial response to this was that firms created data management teams that were focused on pulling in market data from various data vendors and pooling it. But they found there were gaps and inconsistencies and it was difficult to cross-reference different data sources,” he says.
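The kind of gaps and inconsistencies Moss describes surface as soon as two vendor feeds are compared. The sketch below is purely illustrative (Python; the feed contents, field names and ISIN keys are all invented for the example) and shows the sort of cross-referencing an in-house data management team performs:

```python
# Illustrative sketch: cross-referencing instrument reference data from two
# hypothetical vendor feeds keyed by ISIN. All records here are invented.

FEED_A = {
    "US4592001014": {"name": "IBM", "currency": "USD", "exchange": "NYSE"},
    "GB0002634946": {"name": "BAE Systems", "currency": "GBP", "exchange": "LSE"},
}

FEED_B = {
    "US4592001014": {"name": "Intl Business Machines", "currency": "USD", "exchange": "NYSE"},
    # GB0002634946 is absent here: a coverage gap between vendors.
}

def cross_reference(feed_a, feed_b, fields=("currency", "exchange")):
    """Report gaps and field-level inconsistencies between two feeds."""
    issues = []
    for isin, rec_a in feed_a.items():
        rec_b = feed_b.get(isin)
        if rec_b is None:
            issues.append((isin, "missing from feed B"))
            continue
        for field in fields:
            if rec_a.get(field) != rec_b.get(field):
                issues.append(
                    (isin, f"{field} mismatch: {rec_a.get(field)!r} vs {rec_b.get(field)!r}")
                )
    return issues

for isin, problem in cross_reference(FEED_A, FEED_B):
    print(isin, "->", problem)
```

Even this toy version shows why the approach scales poorly in-house: every vendor pair needs its own mapping and exception-handling, which is exactly the duplicated effort a utility aims to eliminate.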

The utility model has gained momentum because, since the financial crisis, revenues have not increased and firms are looking for greater cost efficiencies. “The banks realised they were all doing the same thing with data and it didn’t make sense to duplicate these efforts,” says Moss. The benefit of a utility is that it delivers more consistent data across the industry, and trade processing generally works more effectively. A utility model won’t be appropriate for all types of data, however. Moss says there are some classes of data that are proprietary, and a financial institution will want to remain the owner of such data. The key for reference data, however, is consistency, and this is best achieved if it is done “once, and by the industry for the industry”.


For many buyside firms, there is little benefit in doing data management themselves, says Chris Pickles, an independent consultant to the securities industry. “Sellside firms have a different requirement – they will need to be involved in data management because they are creating trading instruments. But unless you are a large buyside firm that can get economies of scale, outsourcing data management is the better option.”

The amount of data, the number of instruments and the reporting demands placed on that data mean that any inefficiencies in a firm’s data management will be amplified. “Any area from which a firm doesn’t derive a unique benefit could be outsourced,” says Pickles.

Keeping an open mind

For the data management that stays in-house, increasing effort is being made to rationalise data via open standards. To date, most firms have created their own identifiers for instruments, but this is becoming untenable. The Open Symbology project is an attempt to rationalise reference data by creating a unique identifier that is free of charge and can be managed centrally. An open data standard defined by the financial industry, it provides a way to consistently identify financial instruments, regardless of asset class or the function being performed.

As part of the project, FIGI (Financial Instrument Global Identifier®) is the first and only open data standard for identification of financial instruments. Under the auspices of the Object Management Group, FIGI is provided free of charge.
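As a hedged illustration of how such an open identifier can be consumed programmatically, the sketch below resolves an ISIN to a FIGI via the public OpenFIGI mapping service. The endpoint, payload shape and response fields follow the OpenFIGI documentation as we understand it; treat the specifics as assumptions to verify against the current API:

```python
# Minimal sketch: mapping an ISIN to a FIGI using the public OpenFIGI
# mapping service (see openfigi.com for the authoritative API reference).
# Without an API key, requests are heavily rate-limited; a key can be
# supplied via the X-OPENFIGI-APIKEY header.
import json
import urllib.request

def isin_to_figi(isin):
    """Return the first FIGI mapped to the given ISIN, or None if not found."""
    payload = json.dumps([{"idType": "ID_ISIN", "idValue": isin}]).encode()
    req = urllib.request.Request(
        "https://api.openfigi.com/v3/mapping",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        results = json.load(resp)
    data = results[0].get("data", [])
    return data[0]["figi"] if data else None

print(isin_to_figi("US4592001014"))  # an IBM ISIN, for example
```

Because the identifier is free and the mapping service is open, every firm resolving the same ISIN gets the same FIGI back, which is precisely the consistency that proprietary, per-firm identifiers cannot deliver.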

Lowry cautions that standardisation efforts in data management will help, but the journey will be long. “Data is messy because different firms and different parts of the same firm use different data sources. There are differences between the trading and settlement books of record. There are also differences in accounting rules in different countries. This requires some harmonisation if firms are to be able to look at data in a holistic way.”


Frederic Ponzo, a managing partner at London-based consultancy GreySpark Partners, says there is a great deal of appetite among large buyside firms to outsource instrument data management, “but without standards you cannot do it”. All such a firm can do is insource, using a third party to do the manual work. Smaller firms, however, can use hosted solutions that will do everything and produce good, clean golden copies of data.
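What a “golden copy” amounts to is easier to see in code. The sketch below is a minimal illustration (Python; the sources, fields and precedence rules are all invented) of one common master-data pattern: consolidating the same instrument record from several feeds using per-field survivorship rules. It is not a description of any particular vendor’s implementation:

```python
# Illustrative sketch: building a "golden copy" of one instrument record by
# surviving the first non-empty value per field, in a per-field precedence
# order. Sources, fields and rules here are invented for the example.

RECORDS = {
    "vendor_x":  {"name": "BAE Systems plc", "sector": None,        "country": "GB"},
    "vendor_y":  {"name": "BAE SYSTEMS",     "sector": "Defence",   "country": "GB"},
    "custodian": {"name": None,              "sector": "Aerospace", "country": "GB"},
}

# Which source wins per field when values conflict or are missing.
PRECEDENCE = {
    "name":    ["vendor_x", "vendor_y", "custodian"],
    "sector":  ["custodian", "vendor_y", "vendor_x"],
    "country": ["vendor_x", "custodian", "vendor_y"],
}

def golden_copy(records, precedence):
    """Survive the first non-empty value for each field, in precedence order."""
    golden = {}
    for field, sources in precedence.items():
        golden[field] = next(
            (records[s][field] for s in sources if records[s].get(field) is not None),
            None,
        )
    return golden

print(golden_copy(RECORDS, PRECEDENCE))
# {'name': 'BAE Systems plc', 'sector': 'Aerospace', 'country': 'GB'}
```

The value of a hosted solution lies less in this consolidation logic, which is simple, than in maintaining the precedence rules, exception queues and vendor mappings behind it at scale.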

Outsourcing key data management functions is not difficult where it is feasible, but Ponzo says that when it comes to firms’ ability, or willingness, to outsource certain elements of data management, the picture is very mixed.

There are ways to tackle the data challenge, says Ponzo. Firms can outsource everything and “make your problem someone else’s problem”, he says. Firms such as State Street and BlackRock provide such services and can achieve economies of scale that other companies cannot.

Lowry says that as a custodian, State Street holds a great deal of data related to its customers, and clients can trust the firm to be a steward of their data assets. This trust can be extended to other data assets as well, such as market data and third-party data. “We can aggregate and transform data in a way our clients ask us to and get that back to them in a cleansed fashion. It can be a very holistic model, based on the intent of taking the hassle of data management off the hands of our clients so they can focus on other things.”

Alternatively, firms can look to use utilities, says Ponzo, which he describes as “the purest form of outsourcing, in theory”. This will also deliver economies of scale, but can be difficult to put into practice because of the nature of consortia (particularly if there are too many partners). He is also wary that in some cases participants in a utility will end up feeding data into it and then paying for access to their own data.

If a utility is run for profit, or there is conflict or misalignment among the members of the consortium, a firm may be better off dealing with a sole vendor, which can generally be faster and nimbler in responding to requirements.

©BestExecution 2017


