Connectivity 2016: Philippe Chambadal & Joseph Turso

A QUIET REVOLUTION.


By Philippe Chambadal, President, SmartStream & Joseph Turso, Product Manager, The SmartStream Reference Data Utility

Debates about connectivity often focus on the front office and the IT systems used by financial firms to trade across multiple asset classes. In the world of the back office, a quiet revolution is underway as sophisticated software enables banks to break down traditional silos and carry out post-trade processing across asset classes, helping firms to reduce the huge back office costs which currently burden the industry.

Advanced technology is also being used to power utilities capable of providing shared services to multiple financial institutions. With banks increasingly putting competitive rivalries aside and turning to these mutualised services in a bid to cut overheads, technology appears to be driving not merely the erosion of walls between asset classes but also the removal of some of the barriers between financial institutions themselves.

Trading instruments which span asset classifications, such as hybrid products, are becoming increasingly popular amongst financial institutions. Accessing the information required to trade these products creates many challenges for banks, as the data is very often stored in a number of discrete silos. The existence of these silos does not simply create headaches for firms looking to trade products which cross traditional asset class boundaries but raises some even more fundamental questions for the financial industry, especially in relation to the huge post-trade costs that this silo-based infrastructure engenders.

While financial markets were buoyant, maintaining a siloed infrastructure manned by large numbers of administrative staff was perfectly feasible. If a processing issue cropped up, banks simply employed more people to solve it. In the wake of the financial crisis of 2008, however, banks’ revenues have been hit hard, dwindling considerably – according to the Boston Consulting Group’s 2016 Global Capital Markets report, global investment banking revenues declined to $228 billion in 2015, down 5% from $239 billion in 2014 and 16% from $271 billion in 2010, with total revenue lower in 2015 than at any point since 2009.

Costs, on the other hand, remain persistently high, with the financial industry spending – according to one DTCC report – as much as $100bn per year on post-trade processing. Investment banks, more than other financial institutions, are feeling the pressure of high post-trade overheads. While many buy-side organisations have already hived off back office activities to custodians and other service providers, investment banks are still burdened by bloated, unwieldy back offices which are extremely costly to maintain. The toxic combination of shrinking revenues and burgeoning post-trade costs is now leading many banks to review the way in which they manage their post-trade activities.

New regulation, from the European Market Infrastructure Regulation (EMIR) through to Basel III and the Dodd-Frank Act, is exerting further pressure on the industry to review post-trade practices. Financial authorities have introduced stringent reporting requirements and are demanding far greater levels of transparency than ever before. Increasingly, there is a desire on the part of regulators for an intra-day, real-time view into banks’ operations and the flow of payments and trades. For some banks, providing this level of up-to-the-minute clarity into their operations could currently prove extremely challenging.

Automating processes and bridging asset classes in the post-trade world

One of the root causes of high post-trade costs is the fragmented and inadequate nature of banks’ processing systems. Many banks still have in place a large number of legacy IT applications, each of which performs a small part of a bank’s overall post-trade processing activities. Worryingly for financial institutions, these systems are far from joined up, which slows processing. The lack of interconnection between IT systems also makes errors far more likely – all too often data must be re-entered, creating the danger that information is keyed in incorrectly. The fractured state of back office processing systems, and the lack of STP this engenders, results in many broken trades, transactions and corporate actions. Putting these breaks right is a hugely costly business for financial institutions and – according to SmartStream’s estimates – banks may spend as much as $50 to $65 billion a year tackling the issues which arise as a result of weaknesses in post-trade systems.

Creating post-trade processing efficiencies is challenging, not only because there is a lack of communication between many of the back office IT systems financial institutions use, but also because the banks themselves still inhabit a very silo-based world. The lack of connectivity between asset silos – and the negative impact it has on banks looking to improve the efficiency of their post-trade processing activities – can clearly be seen in relation to reconciliations management.

Traditionally, banks have carried out reconciliations by asset class – for example, cash, exchange-traded derivatives and OTC derivatives – in separate accounts. This has entailed using a different IT system, and sometimes Excel spreadsheets, for each asset class, or deploying a single application in which each asset class is nevertheless processed in separate accounts. There are a number of drawbacks to this approach, but two stand out in particular. Firstly, it is very difficult for a bank to get a clear picture of where it stands financially, as it lacks a consolidated, cross-asset view of its reconciliations. Secondly, determining that a reconciliation has been completed involves drawing information out of many separate systems, which is a complex, time-consuming and expensive business.
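
To make the contrast concrete, the short Python sketch below runs a single, cross-asset matching pass over hypothetical internal and external records – the account names, instruments and quantities are invented for illustration and do not come from any actual system. In a siloed set-up, the same result would require one reconciliation, and one report, per asset class.

```python
from dataclasses import dataclass

@dataclass
class Position:
    account: str
    asset_class: str   # e.g. "cash", "etd", "otc"
    instrument: str
    quantity: float

def reconcile(internal: list, external: list) -> list:
    """Match internal records against external statements across ALL asset
    classes in a single pass, returning one consolidated list of breaks."""
    ext = {(p.account, p.asset_class, p.instrument): p.quantity for p in external}
    breaks = []
    for p in internal:
        key = (p.account, p.asset_class, p.instrument)
        if ext.get(key) != p.quantity:
            breaks.append(f"BREAK {p.asset_class}/{p.account}/{p.instrument}: "
                          f"internal={p.quantity}, external={ext.get(key)}")
    return breaks

# Hypothetical records spanning three asset classes -- a siloed set-up would
# need three separate reconciliations (and three reports) for the same answer.
internal = [Position("ACC-1", "cash", "USD", 1_000_000.0),
            Position("ACC-1", "etd", "ES-DEC16", 25.0),
            Position("ACC-2", "otc", "IRS-5Y", 2.0)]
external = [Position("ACC-1", "cash", "USD", 1_000_000.0),
            Position("ACC-1", "etd", "ES-DEC16", 20.0),
            Position("ACC-2", "otc", "IRS-5Y", 2.0)]

for line in reconcile(internal, external):
    print(line)   # -> BREAK etd/ACC-1/ES-DEC16: internal=25.0, external=20.0
```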

Reconciliations processing is not the only area in which the silo-based nature of banks’ infrastructure obstructs the achievement of greater efficiency. Cash management is another sphere in which financial institutions encounter difficulties. To date, many large financial institutions have managed cash with end-of-day, manual processes and huge numbers of back office staff who gather data, in varied formats and from different systems, from individual business lines or regional accounts. This approach is, first of all, highly labour-intensive and costly. Secondly, there is no single, global view of activity across all currencies and accounts. Banks therefore lack an accurate, up-to-date picture of balances and, as a consequence, a clear view of the risks they run. In addition, the absence of a real-time view of balances means that companies may be unaware of the availability of funds which could be used for other activities, for example trading. As a result, a firm may run the danger of underutilising precious financial resources.
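
As a rough illustration of the single, global view described above, the following sketch rolls hypothetical intraday balances across accounts and currencies up into one base-currency figure; the entities, accounts and FX rates are invented for the purpose of the example.

```python
# Hypothetical intraday balances per (entity, account, currency).
balances = [
    ("London",   "NOSTRO-GBP-01", "GBP",   4_200_000.0),
    ("New York", "NOSTRO-USD-01", "USD",  11_500_000.0),
    ("Tokyo",    "NOSTRO-JPY-01", "JPY", 900_000_000.0),
]

# Illustrative FX rates to USD -- in practice these would come from a live feed.
fx_to_usd = {"GBP": 1.30, "USD": 1.00, "JPY": 0.0095}

# One consolidated, base-currency view of available funds. Without it, each
# regional team sees only its own accounts and surplus cash can go unnoticed.
for entity, account, ccy, amount in balances:
    print(f"{entity:9} {account:14} {ccy} {amount:>15,.0f} "
          f"(~USD {amount * fx_to_usd[ccy]:>13,.0f})")

total_usd = sum(amount * fx_to_usd[ccy] for _, _, ccy, amount in balances)
print(f"Global position: ~USD {total_usd:,.0f}")
```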

Clearly, rethinking inefficient, duplicative post-trade processes is a priority for financial institutions. So how can technology help them?

Banks have spent many millions investing in the large number of IT systems that currently power their post-trade processes. Yet these systems are often fragmented, resulting in inefficiency and high costs. Financial institutions require a means of bridging traditional asset silos but must be able to do so in a way which enables them to continue to make use of legacy IT systems and derive value from their existing investment in technology. Companies also need technology which allows them to break down the barriers between asset classes in order to create a real-time, centralised view of processing activities and outstanding positions. Achieving a high degree of clarity is vital if financial institutions are to be able to manage risk effectively and make full use of available funds.

To this end, SmartStream has developed its Transaction Lifecycle Management (TLM®) technology, which automates a wide range of post-trade processes. Developed as a single architecture, TLM takes a unique approach: it breaks down silos – tapping into legacy IT systems and exploiting existing technology – and creates one transaction lifecycle, from inception to settlement, handling post-trade processing on a truly cross-asset basis. The results of processing activities are presented in a single, consolidated view, providing companies with a clear picture of their financial standing, as well as a real-time view of the risks they run. The technology improves STP levels significantly and takes a highly pre-emptive approach to dealing with exceptions, enabling financial institutions to reduce trade failures and their associated costs.
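
The pre-emptive handling of exceptions can be pictured roughly as follows. This is a minimal sketch of the general idea – flagging trades that have made too little progress through their lifecycle as settlement approaches – with invented states, trades and thresholds; it is not a representation of TLM’s actual logic.

```python
from datetime import date

# Simplified lifecycle states a trade passes through, inception to settlement.
LIFECYCLE = ["captured", "enriched", "matched", "confirmed", "settled"]

def at_risk(state: str, settle: date, today: date) -> bool:
    """Flag a trade BEFORE it fails: too little lifecycle progress with
    settlement imminent means it should be routed to the exceptions desk."""
    days_left = (settle - today).days
    progress = LIFECYCLE.index(state)
    return progress < LIFECYCLE.index("confirmed") and days_left <= 2

# Hypothetical in-flight trades: (trade id, current state, settlement date).
trades = [
    ("T-001", "confirmed", date(2016, 9, 20)),   # on track
    ("T-002", "captured",  date(2016, 9, 19)),   # barely progressed, settles soon
    ("T-003", "matched",   date(2016, 10, 3)),   # behind, but time remains
]

today = date(2016, 9, 18)
for trade_id, state, settle in trades:
    if at_risk(state, settle, today):
        print(f"ALERT {trade_id}: state '{state}', settles {settle} -- "
              f"intervene before the trade breaks")
```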

Financial utilities: the future of post-trade processing?

At a time when financial institutions, and especially investment banks, face burdensome post-trade processing costs, the industry – SmartStream believes – would benefit not simply from using advanced technology to break down the boundaries between asset classes but also from setting aside some of the barriers between individual institutions in order to participate in financial utilities.

The use of utilities in the financial industry goes back to 2005, when organisations first began to build internal utilities to lower costs. Banks started to gravitate away from the traditional model – where each line of business performed its own reconciliations and so on – creating, instead, a service able to perform processing activities on behalf of all a company’s business divisions.

During this early phase, SmartStream worked with a number of banks on the creation of internal utilities, but the company also went on to carry out several projects to construct externalised utilities. These latter undertakings highlighted external utilities’ ability to generate a sizeable return on investment: gains of 30% to 50% proved possible – significantly higher than the 10% to 20% typically achievable through the use of an internal utility.

The unique structure of the externalised utility, SmartStream discovered, is particularly well suited to helping users drive down costs. Instead of making use of a number of different systems – as a traditional BPO service provider would typically do – a utility employs a single platform architecture. This approach has several benefits: for example, it solves the difficulty of inter-system messaging and makes it far easier to link new clients up to the service. In addition, the use of a common architecture facilitates the introduction of new business processes and makes carrying out adjustments necessitated by regulatory change far more straightforward. Any alterations are made by the utility on behalf of users, removing the need for clients to rebuild their own processes.

Financial institutions are turning to externalised utilities on an increasingly frequent basis. Reference data management is one area in which banks have begun to adopt a mutualised service approach.

The industry-wide burden of managing reference data

The financial industry currently spends a vast amount of money on managing reference data. A Tier 1 bank may spend as much as $50–$100 million a year on processing reference data, employing armies of back office staff to collect, validate, cleanse and reformat it before distributing it to front line systems. A large bank is likely to take in reference data from 100 to 120 sources on a daily basis. Each of these information feeds requires a data loader, and operating and maintaining the necessary technology can be a complex, costly business.
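
In code, the per-feed burden looks roughly like this: each vendor delivers the same instrument in a different layout, so every feed needs its own loader before the data can even be compared. The vendor names and formats below are invented purely for illustration.

```python
# Each vendor delivers the same instrument in a different layout, so every
# feed needs its own loader -- multiplied across 100-120 daily sources.

def load_vendor_a(line: str) -> dict:
    """Hypothetical pipe-delimited format: ISIN|name|currency."""
    isin, name, ccy = line.split("|")
    return {"isin": isin, "name": name, "currency": ccy}

def load_vendor_b(line: str) -> dict:
    """Hypothetical comma-delimited format: currency,ISIN,name."""
    ccy, isin, name = line.split(",")
    return {"isin": isin, "name": name, "currency": ccy}

LOADERS = {"vendor_a": load_vendor_a, "vendor_b": load_vendor_b}
# ...one entry per feed in reality; every vendor format change means
# another loader for the bank to build, test and maintain.

feeds = {"vendor_a": "US0378331005|Apple Inc|USD",
         "vendor_b": "USD,US0378331005,Apple Inc"}

records = [LOADERS[source](raw) for source, raw in feeds.items()]
assert records[0] == records[1]   # same instrument, two loaders to get there
print(records[0])
```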

In spite of the effort and investment expended, the industry is plagued by the ill effects of bad data, and financial institutions spend a colossal amount on repairing broken trades caused by poor quality reference data. SmartStream research indicates that, post-trade, as much as $50 to $65 billion may be wasted annually through the impact of broken trades. Of that, some 30%–40% – $15bn or more – stems from bad reference data.

In the past, financial institutions have taken steps to get to grips with the massive amount of waste caused by poor reference data but these efforts have met with limited success. The lack of success is attributable to banks’ approach to achieving improvements in this area: traditionally, firms have viewed such projects as a unilateral activity, ignoring the fact that while an individual institution may spend a great deal on upgrading its own reference data, if a counterparty has poor data, a trade will break anyway. Forward-thinking institutions have started to realise the importance of moving away from this one-sided approach and are beginning to focus, instead, on ensuring that not just they, but their counterparties, too, use the same high quality information.

Managing reference data in-house, for a company’s sole use, is highly inefficient and costly. The utility approach to managing reference data – such as that pioneered by SmartStream through its Reference Data Utility (RDU) – can confer considerable advantages. Such a utility is able to act, in effect, as a processing agent for clients’ reference data, receiving information directly from clients’ data suppliers, which it then cleans, cross-references and enriches. Any improvement – for example, a correction to an error in the information received from a data vendor – benefits all clients of the utility. Once cleaned, data can be delivered to clients’ infrastructure. In the case of the RDU, it is delivered in the form of a single schema per asset class, which may also be customised, allowing users to consume whichever aspects of it they wish.
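
The flow a utility of this kind performs can be sketched as a shared pipeline – validate, correct, enrich, then publish one consistent schema – applied once on behalf of all clients. The field names and rules below are hypothetical, not the RDU’s actual schema.

```python
def validate(record: dict) -> dict:
    """Reject obviously bad vendor data before it reaches any client."""
    if len(record.get("isin", "")) != 12:
        raise ValueError(f"bad ISIN in {record}")
    return record

def correct(record: dict) -> dict:
    """A fix applied once here benefits every client of the utility."""
    record["currency"] = record["currency"].strip().upper()
    return record

def enrich(record: dict) -> dict:
    """Add fields clients would otherwise each derive individually
    (hardcoded here; a real utility cross-references many sources)."""
    record.setdefault("asset_class", "equity")
    return record

def publish(record: dict) -> dict:
    """Emit one consistent schema per asset class for all subscribers."""
    return {k: record[k] for k in ("isin", "name", "currency", "asset_class")}

raw = {"isin": "US0378331005", "name": "Apple Inc", "currency": " usd "}
print(publish(enrich(correct(validate(raw)))))
```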

The RDU, which currently serves three Tier 1 banks, is able to save 30%–40% on the direct operational costs of data management systems. Because it improves the quality of reference data upstream, far fewer broken transactions result downstream, creating further cost savings. Technological complexity can be reduced, too: instead of tens of data feeds, users simply need one feed per asset class. The utility’s mechanism for delivering cleaned data also supports institutions engaged in trading hybrid products. Its distribution model is highly integrated across asset classes, enabling data to be delivered in a way which shows the links between asset classes. This, in turn, simplifies activities such as assessing credit risk in relation to a particular counterparty, removing the need to search for information across numerous separate sources.

Importantly for the industry, the data cross-referencing carried out by the RDU allows the internal systems of participating firms to use a common data model and so “speak the same language”. As more banks and their clients link up with the utility, this should allow the establishment of a de facto common reference data standard, helping to make the trade breaks which currently cause such high industry costs – and provoke so much aggravation between banks and their buy-side clients – a thing of the past.
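
The cross-referencing itself can be pictured as a mapping from every identifier a participant might use – ISIN, CUSIP, vendor symbology – to one canonical record, so that two counterparties’ systems resolve the same instrument to the same key. The identifiers and record layout below are invented for illustration.

```python
# A canonical security master record, keyed by an internal utility ID.
SECURITY_MASTER = {
    "RDU-000001": {"isin": "US0378331005", "cusip": "037833100",
                   "name": "Apple Inc", "currency": "USD"},
}

# Cross-reference: every identifier a participant might use maps to one key.
XREF = {
    ("isin",  "US0378331005"): "RDU-000001",
    ("cusip", "037833100"):    "RDU-000001",
    ("vendor_a_id", "AAPL.O"): "RDU-000001",   # hypothetical vendor symbology
}

def resolve(id_type: str, value: str) -> dict:
    """Counterparties using different identifiers land on the same record,
    so their trade details agree and the trade is less likely to break."""
    return SECURITY_MASTER[XREF[(id_type, value)]]

# Two firms, two identifier schemes, one shared "language".
assert resolve("isin", "US0378331005") is resolve("cusip", "037833100")
print(resolve("vendor_a_id", "AAPL.O")["name"])   # -> Apple Inc
```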

Looking to the future

The utility model offers banks a vital lifeline at a time when the sector faces a global rise in trade volumes, increased regulatory oversight and stubbornly high post-trade costs. And it is not just reference data management which can benefit from the utility approach. The bulk of post-trade services – most of which contribute nothing to a financial institution’s bottom line but which are carried out individually, and at great cost, by every company across the industry – are ripe for the same treatment and could benefit from financial utilities serving the sector as a whole.

In the future, increasing numbers of banks are likely to embrace this new approach. Lower revenues, coupled with the unsustainable expense of maintaining huge numbers of back office staff, will push financial institutions into accepting such a change. Technology may well, therefore, begin to break down barriers between financial institutions as organisations realise the value of the utility model and its ability not only to alleviate heavy post-trade costs but also to create a leaner, more agile and more efficient banking industry – one far better suited to surviving the rigours of the future.

©BestExecution 2016

