KNOWLEDGE IS POWER.
Fads run rampant in technology, and data management has attracted more than its fair share. During the past decade financial services firms have been urged to adopt, or at least pay attention to, a seemingly endless stream of data-related issues including data warehousing, enterprise-wide data management, data governance, big data and, more recently, ‘smart’ data. Data management may be the buzzword, but the industry needs to get smarter in how it uses information. Heather McKenzie reports.
At a SWIFT business forum in New York in March, State Street’s chief scientist, David Saul, championed the idea of smart data. Rather than talking about the volume of data (as in big data), he argued, firms should be pursuing data governance structures that attach underlying meaning to the information (smart data). He was reported as saying: “You can have the technology and have the standards, but within your organisation, if you don’t know who owns the data, who’s responsible for the data, then you don’t have good control.”
Smart data means pinpointing information that can help a firm make informed business decisions. Kerry White, managing director of global product management at BNY Mellon Asset Servicing, agrees with State Street’s Saul that smart data is more useful than big data. “When we talk about smart data, we mean data that is actionable. Buyside firms have to sift through mountains of data that come in from different locations. Growing volumes of data and regulation have made it increasingly important to ensure the data is actionable,” she says.
White explains the challenges some of BNY Mellon’s clients face: “A huge university foundation client of ours has very diverse assets and actively chases alpha. It has typical holdings such as listed equities and fixed income, but more than 60% of its assets are in non-traditional areas such as infrastructure projects, private equity and venture capital firms. Such assets tend to be difficult to track, model, map and analyse because they are not priced in the same way as blue chip equities or government bonds.”
John Avery, a partner at SunGard Consulting Services, says big data continues to dominate the “technology hype cycle” in capital markets. Although budget allocations for big data projects are poised for further growth in 2014, returns on these investments remain elusive. “Initially big data was used to solve problems in retail financial services,” he says. “This is an area that generates a great deal of data that needs to be analysed. Until recently, big data had less traction in the capital markets and wealth management areas. But retail and wholesale financial services have similar data sets and a similar need for greater insight into that data.”
The smart data “movement”, as Avery describes it, acknowledges that capital markets data integration is expensive, effort intensive and error prone. This is because of limitations in data modelling, data governance, legacy technology and data standards. The promise of smart data is that big data technology can help solve data modelling and data integration challenges that were unachievable with legacy data management technology.
Having a plan
Again it seems that firms have to rethink their data management strategies and technologies. Technology advisory firm Sapient Global Markets has released the Sentiment-Based Data Management Maturity Assessment, which is aimed at analysing how data programmes are working at firms. Says Sapient: “Firms that take the survey will gain insights to move data management strategy forward, better shape budgets and predict effects of initiatives on data maturity.”
Gavin Kaimowitz, a director with Sapient Global Markets, says a desire for better business intelligence is driving developments in smart data. “Correlation of data is a key challenge for financial services firms. They want to gather more information to make more informed business decisions.” Like Avery, Kaimowitz thinks firms have yet to capitalise on big data, despite lengthy marketing campaigns from data companies.
Smart data, he says, is about identifying customers, hierarchies and changes across myriad customers and data vendors. Firms want to understand who is who in the data supply chain. This challenge is being met via utility services from companies including the Depository Trust & Clearing Corporation (DTCC), and Euroclear in alliance with technology vendor SmartStream.
Euroclear’s utility is based on the concept of shared, standardised delivery of commoditised functions of back office processing and a centralised repository of standardised data. It says such an approach can address many of the issues connected to bad reference data. Without an industry utility, organisations are left to solve data issues for themselves when many of the problems and associated costs occur once data crosses organisational or geographical boundaries.
“We are looking for ways to simplify the supply chain of data, getting higher quality data from as close to the source as possible and tuning it to users’ requirements,” says Martijn Groot, director, Central Data Utility product management at Euroclear. “Firms are spending billions of euros on data each year and the cost of integrating that data into their systems is even higher.”
Previous approaches to this problem, he says, tried to solve “everything for everyone” with a single master copy. A utility approach seeks out common ground among clients – such as those that trade the same instruments or with the same counterparties. Euroclear recognises, however, that each of its clients has its own application landscape. The utility centralises the common quality management processes but allows for differentiation between clients.
“Our platform gives us sufficient common ground to centralise operations. We act as an agent for our clients; they can channel different data feeds into the utility and we can profile, map and scrub that data. Ambiguity in data is a big problem and if it is not addressed at the beginning of the transaction lifecycle, costs and risks will be higher downstream,” he says.
In September, the DTCC signed a memorandum of understanding with a group of large global banks, including Barclays, Credit Suisse, Goldman Sachs and JP Morgan, to jointly develop a client entity reference data service. Over time, the service will address the client reference data requirements of banks and broker-dealers, as well as asset managers and hedge funds, covering, among other items, legal entity hierarchies, standing settlement instructions, regulatory compliance data, client on-boarding/KYC and tax requirements.
DTCC’s stated aim is to build a comprehensive, centralised platform to effectively manage virtually all client reference data. The platform will be based on Avox, DTCC’s data validation operation that maintains reference data, and Omgeo Alert, a standing settlement instruction database.
Mark Davies, general manager at Avox, says smart data means “smart decisions”. If firms get data management right, it means accurate decisions are being made. “It doesn’t matter how good transactions are – if the reference data says a company is in the shipping industry and it is actually a property developer, the transaction will go into the wrong pot when the trade is rolled up. A lot of time is wasted at financial services firms fixing failed transactions and incorrect reports.”
The challenge for Avox and for its clients is maintaining and using data. “Data management is becoming unwieldy for sellside and buyside firms. The largest firms might have half a million separate entities in which they are interested – customers, issuers of securities, agents, broker dealers and investment managers. Within that half a million you should assume that about 20% will undergo a significant change each year. That is 100,000 records that have to be monitored and updated. Firms also have to ensure that the source information is deployed through all systems internally.”
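The monitoring problem Davies describes – half a million entity records, roughly 20% changing each year – boils down to comparing successive snapshots of a reference database and flagging records that need review. The sketch below illustrates the idea in Python; the field names and LEI-style keys are illustrative assumptions, not Avox’s actual schema or process.

```python
def changed_entities(previous, current):
    """Compare two snapshots of an entity reference database and
    classify each record as added, removed or modified."""
    changes = {}
    for key in previous.keys() | current.keys():  # union of all entity keys
        if key not in current:
            changes[key] = "removed"
        elif key not in previous:
            changes[key] = "added"
        elif previous[key] != current[key]:
            changes[key] = "modified"
    return changes

# Hypothetical quarterly snapshots (keys and fields are made up).
snapshot_q1 = {
    "LEI-001": {"name": "Acme Shipping Ltd", "industry": "shipping"},
    "LEI-002": {"name": "Beta Property plc", "industry": "shipping"},  # misclassified
}
snapshot_q2 = {
    "LEI-001": {"name": "Acme Shipping Ltd", "industry": "shipping"},
    "LEI-002": {"name": "Beta Property plc", "industry": "property"},  # corrected
    "LEI-003": {"name": "Gamma Capital LLP", "industry": "asset management"},
}

print(changed_entities(snapshot_q1, snapshot_q2))
# Flags LEI-002 as modified and LEI-003 as added; LEI-001 is unchanged.
```

At the scale Davies cites, the real work is not the comparison itself but deciding which of the flagged changes are significant and propagating the corrected records through every downstream system.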
Collaboration between vendors and their clients seems to be a growing trend. BNY Mellon works with its clients, says White, to ensure information is accurate. “We are positioned between our customer and all of the parties with whom they do business. We gather the information from those parties, validate it, back-test it and bring it into our native environment so we can deliver it to our clients in the way they like.”