
MULTI-ASSET TCA: FASTER, BROADER, DEEPER.

By Kevin O’Connor, Head of Workflow Technology and Analytics & Michael Sparkes, European Analytics Business Development at Virtu

The scope and application of transaction cost analysis (TCA) and its various close relations has evolved dramatically over recent years, driven and enabled by changes in technology, regulation, market structure and client demand. Launched initially as an equity-focused compliance-driven process, the breadth and depth of analysis has expanded to cover almost every major asset class, incorporating not just implicit and explicit costs but also factors such as liquidity profiling, algo analytics and venue analysis.


The expanded use of TCA across asset classes has introduced a variety of challenges due to differences in market structure, data availability, cost, and the relevance of differing metrics – particularly for less liquid instruments. In addition to asset class expansion, the intended audience has also evolved. Its use is no longer limited to traders or compliance departments: it typically comprises a series of elements and actionable outputs that are relevant throughout the investment process, targeting portfolio managers, risk managers and CIOs.

A major driver for many of these developments has been a raft of new regulatory requirements, not the least of which were introduced by MiFID II in Europe. The requirement to demonstrate a process for monitoring best execution across asset classes, using data from actual results as an input to future decisions, has galvanised the buy-side community to reassess its approach. While it is not mandatory under MiFID II to use an external TCA service, a systematic method of capturing and reviewing trade data must be in place. Approximately 95% of firms reported using TCA across asset classes at the start of 2019, up from 75% in 2017, according to Greenwich Associates*.


In addition to these regulatory best execution requirements, the global trend towards commission unbundling has led to a greater focus on the quality of order execution to ensure trading and routing decisions are driven by objective and quantifiable results, separated from any need to pay for research or other bundled services which used to be funded by commissions.

Many firms had only just begun to address the new European regulations when we reviewed the state of preparedness of the buy-side in our 2015 best execution article, MiFID II and Best Execution Across Asset Classes. Our recent update of this survey captures how the industry has responded and indicates the likely path forward. While the median firm has raised its game considerably in terms of TCA data utilisation, leading firms have, if anything, pulled further ahead of the pack.

Top firms’ in-house data science teams are working directly with external TCA vendors, consuming normalised, benchmarked data and loading it into their proprietary databases for further analysis. The TCA vendor provides guidance not only as these teams seek to adjust or redirect trading allocations according to which brokers and counterparties are consistently outperforming, but also as they undertake more forensic testing and experimentation with different strategies to minimise costs and preserve alpha in line with their investment objectives.

Equity analysis – more granular and holistic

In 2015 it was typical for firms to monitor allocation-level costs in equities based on data from an order management system (OMS), while only a minority of firms also analysed fill-level data. Fast forward to today and we see more firms monitoring algos and venues in detail, alongside the increased level of control and choice available. Analysis at both levels of granularity is now typical, with firms adding fill-level analysis that often incorporates data drawn from an additional source such as an execution management system (EMS).

Additionally, changes in European market structure have accelerated the evolution of TCA. Double volume caps on dark trading, the growth of systematic internalisers, the increasing role of electronic liquidity providers and the introduction of periodic auctions are all examples of changes which buy-side firms need to monitor and consider when deciding on optimal trading strategies.

The proliferation of EMSs has also led to the increased use of real-time analytics to monitor trades in-flight and make real-time adjustments in response to pre-trade TCA alerts and changing market conditions. EMS functionality continues to expand, providing greater execution strategy options, increased order control and more opportunity to analyse the post-event outcomes. Many EMS providers are now applying the same optionality used in equities to other asset classes.

The trend for more frequent post-trade analysis remains strong. While many firms are beginning to leverage pre-trade and real-time TCA to support intra-day optimisations, most review their post-trade results daily for significant outliers in equity trading. This process usually starts with the trading desk and is then checked by the compliance function. There is still an important place for monthly and quarterly processes that look at larger volumes of trades to determine any recurring trends or biases.

The definition of outliers has also evolved: firms frequently use multiple filters – for instance, a basis-point threshold coupled with a minimum trade value – to highlight true deviations in performance. Similarly, multi-level analysis may incorporate information relating to portfolio manager instructions to assess trading desk performance, and trader instructions to accurately evaluate broker performance.
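As an illustration of such a dual-filter screen – with purely hypothetical field names and thresholds – a Python sketch might look like this:

```python
# Minimal sketch of a dual-filter outlier screen (hypothetical thresholds and fields).
from dataclasses import dataclass

@dataclass
class Execution:
    order_id: str
    notional_usd: float
    slippage_bps: float   # signed cost vs. the chosen benchmark (e.g. arrival price)

def flag_outliers(executions, bps_threshold=25.0, min_notional=250_000.0):
    """Flag only trades that are both costly in basis points AND large enough to matter."""
    return [
        e for e in executions
        if abs(e.slippage_bps) >= bps_threshold and e.notional_usd >= min_notional
    ]

trades = [
    Execution("A1", 1_000_000, -32.0),   # large and costly -> flagged
    Execution("A2", 50_000, -80.0),      # costly, but too small to be meaningful
    Execution("A3", 2_000_000, -5.0),    # large, but within tolerance
]
print([e.order_id for e in flag_outliers(trades)])   # ['A1']
```

Combining the two conditions avoids flooding the review process with small trades whose costs look extreme in basis points but are immaterial in money terms.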

Another hot topic for leading firms is alpha profiling. Greenwich Associates reports that more than a third of those surveyed now conduct analysis of this type, linking the execution strategy and its outcomes back to the investment decision and portfolio construction process.

Previously used exclusively by leading-edge firms, algo wheel technology is becoming mainstream among buy-side clients. Virtu’s Algo Wheel, originally developed as a best execution order routing solution for equities, has now expanded into FX and futures. Both automated routing and algo wheels are increasingly being used to remove much of the process noise and the (often unintended) biases introduced by humans. The data, once normalised, can provide an objective and fair comparison of counterparties and strategies.
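Conceptually, an algo wheel allocates orders across a pre-defined set of strategies at random, by weight, so that subsequent performance comparisons are free of selection bias. The sketch below uses hypothetical broker names and weights and is not a description of Virtu’s implementation:

```python
# Minimal sketch of the algo-wheel idea: weighted random assignment of orders to
# strategies, removing trader selection bias from subsequent performance comparisons.
import random

WHEEL = {"BrokerA_VWAP": 0.4, "BrokerB_VWAP": 0.4, "BrokerC_VWAP": 0.2}  # illustrative weights

def assign_strategy(order_id, wheel=WHEEL):
    # Seed the RNG with the order id so the assignment is reproducible for audit.
    rng = random.Random(order_id)
    strategies, weights = zip(*wheel.items())
    return rng.choices(strategies, weights=weights, k=1)[0]

for oid in ("ORD-1", "ORD-2", "ORD-3"):
    print(oid, "->", assign_strategy(oid))
```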

FX TCA – transparency

Foreign exchange market structure has made tremendous advances in the last five years, supporting the evolution of analysis in the FX market. Many buy-side firms are bringing FX trading back in-house, and in some cases this has led to increased algo usage and request for quote (RFQ) platform adoption. Trading around the WM Fix has also come under intense scrutiny following allegations that market participants manipulated these time-specific price benchmarks. Lastly, improvements in technology now make capturing accurate timestamps achievable, giving the trader more accurate data with which to perform better analysis. The combination of better data and new tools has significantly enhanced the quality and scope of analysis available in the fragmented global FX market.

The implementation of FX TCA by buy-side desks conducting their own trades, rather than outsourcing to the banks, has been widespread. Using data from an EMS and/or FX trading platform allows firms to examine the data meaningfully and calibrate strategies accordingly. This includes the detailed review of algo behaviour and performance, as well as strategy options such as netting. Benchmarking for FX trading has also progressed to reflect the increased data reliability. The array of metrics now includes simple mid-price and bid/ask metrics as well as more advanced calculations such as size-adjusted spread and cost impact models.
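For illustration, the two simplest of these metrics – slippage versus the prevailing mid and the fraction of the half-spread paid – can be computed as follows (the quote and fill values are made up):

```python
# Illustrative slippage calculations against mid and spread (hypothetical inputs).
def slippage_vs_mid_bps(exec_price, bid, ask, side):
    """Signed cost in basis points relative to the prevailing mid quote."""
    mid = (bid + ask) / 2.0
    signed = (exec_price - mid) if side == "buy" else (mid - exec_price)
    return signed / mid * 1e4

def spread_paid_fraction(exec_price, bid, ask, side):
    """Fraction of the half-spread paid: 1.0 = executed at the far touch, 0.0 = at mid."""
    mid = (bid + ask) / 2.0
    half_spread = (ask - bid) / 2.0
    signed = (exec_price - mid) if side == "buy" else (mid - exec_price)
    return signed / half_spread

# Buying EURUSD at 1.10052 while the market quotes 1.10045 / 1.10055
print(round(slippage_vs_mid_bps(1.10052, 1.10045, 1.10055, "buy"), 3))    # ~0.182 bps vs. mid
print(round(spread_paid_fraction(1.10052, 1.10045, 1.10055, "buy"), 2))   # 0.4 of the half-spread
```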

Reliance on pre-trade tools, such as Virtu’s FX ACE model, has become more widespread in FX, as currency pairs exhibit patterns in available liquidity at different times of the day. This information can be used in decision support prior to trading and in post-trade review to help maximise liquidity, minimise spreads and assist in comparing different strategies.

Additionally, leading firms are now leveraging peer-based comparisons, allowing them to experiment beyond standard benchmarking. The use of peer data must be handled with care to ensure true apples-to-apples comparison, with standardisation and curation to ensure the data is clean and relevant. Meaningful peer-based comparisons also require sufficient breadth and depth of data to have statistical significance. But such data can be invaluable in helping identify areas which need further scrutiny in what is a very fragmented market.

Analysis of FX trading has been used by leading-edge firms to look at the total cost of trades in other instruments. In other words, if switching from one asset class to another involves a currency exchange (for instance, from bonds to equities, or from European to US equities), it is now possible to factor the cost of implementing the currency transaction into the cost of the underlying trades themselves. Often the cost of delaying the currency leg, relative to trading it immediately at the point of decision, can outweigh the gains made on the related trades. It is highly likely that this kind of multi-asset analysis will become mainstream in the years to come.
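A toy calculation, with entirely made-up numbers, shows how a delayed FX leg can dominate the total cost of such a switch:

```python
# Toy example (made-up numbers): total cost, in basis points, of a bond-to-US-equity
# switch for a EUR-based fund, including the cost of delaying the EUR->USD leg
# versus executing it at the decision time.
bond_sale_cost_bps   = 3.0     # implementation cost on the bond sale
equity_buy_cost_bps  = 5.0     # implementation cost on the equity purchase
fx_rate_at_decision  = 1.1000  # EURUSD when the switch was decided
fx_rate_when_traded  = 1.0978  # EURUSD when the currency was finally bought (USD strengthened)

fx_delay_cost_bps = (fx_rate_at_decision - fx_rate_when_traded) / fx_rate_at_decision * 1e4
total_cost_bps = bond_sale_cost_bps + equity_buy_cost_bps + fx_delay_cost_bps
print(round(fx_delay_cost_bps, 1), round(total_cost_bps, 1))   # 20.0 bps FX delay, 28.0 bps total
```

In this example the 20 basis points lost by waiting to buy the dollars is more than double the combined cost of the two underlying trades.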

Fixed income TCA – time to play catch-up

Compared to equities or even to FX, fixed income is a late entrant to the world of analytics. Driven by many of the same factors as other asset classes, FI TCA is finally starting to catch up – mainly due to the regulatory requirements of MiFID II and the growing availability of reference data. The increased use of electronic platforms in fixed income helps firms capture data and assess outcomes against a variety of benchmarks and metrics. In some cases the analysis is primarily for regulatory and compliance purposes, but, as in other asset classes, leading firms are investing considerable effort and using sophisticated data to enhance their performance.

However, not all executions have moved onto such systems with a considerable volume of trading still being conducted by voice, especially in less liquid instruments. The fragmented nature of the market, as it relates to trading venues and the number and complexity of instruments, has been a challenge which has slowed the transition.

The fixed income market incorporates a variety of asset types – ranging from the more liquid government and corporate bonds to the less frequently traded instruments such as Munis, mortgage-backed, CDS, IRS and other related categories. Each class of instrument has its own unique set of trading characteristics. These require the appropriate application of relevant market data and suitable metrics designed to provide actionable analysis.

As with all trading analysis, a key ingredient is good market data against which the trade is compared. While reliable sources are available for liquid instruments, it is considerably harder to find meaningful reference data for the more esoteric instruments. In many cases it requires the aggregation and cleaning of data from multiple sources in order to produce a realistic and fair comparison. As with all market data, the cost to obtain data has increased, in some cases very steeply. There is renewed talk from ESMA of a mandatory consolidated tape being introduced for analysing bond trading, although this is unlikely to occur quickly. In Europe there has been an ongoing discussion of a consolidated tape for equities since MiFID in 2007 and the regulators are still in the process of finding a way to implement it.

The multiplicity of venues and trading methods means that fixed income data must be sourced wherever it is available – indicative and firm quotes, historical prices, evaluated prices, dealer-to-client platforms, multi-dealer platforms and so on. One hope has been the Approved Publication Arrangement (APA) platforms, although so far these have proved less than ideal in terms of consistency and accuracy. To incorporate and standardise a range of data sources into a coherent and consistent repository requires significant scale and is not something any individual buy-side firm can easily do. Hence the need for an external analytics provider who can step in as the aggregator, cleanser and curator of vast amounts of data from these various inputs.
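The sketch below gives a flavour of that aggregation step – a weighted composite reference price that drops stale quotes and weights firm quotes above indicative ones. The weights, staleness rule and field names are illustrative assumptions, not any vendor’s methodology:

```python
# Illustrative-only sketch: deriving a composite reference price for a bond from
# multiple quote sources, dropping stale quotes and weighting firmer quotes more heavily.
from datetime import datetime, timedelta

def composite_price(quotes, now, max_age=timedelta(minutes=30)):
    """quotes: list of dicts with 'price', 'weight' (e.g. firm > indicative) and 'ts'."""
    usable = [q for q in quotes if now - q["ts"] <= max_age]
    if not usable:
        return None
    total_weight = sum(q["weight"] for q in usable)
    return sum(q["price"] * q["weight"] for q in usable) / total_weight

now = datetime(2020, 3, 2, 15, 0)
quotes = [
    {"price": 101.20, "weight": 2.0, "ts": now - timedelta(minutes=5)},    # firm dealer quote
    {"price": 101.35, "weight": 1.0, "ts": now - timedelta(minutes=20)},   # indicative quote
    {"price": 100.90, "weight": 1.0, "ts": now - timedelta(hours=3)},      # stale, dropped
]
print(round(composite_price(quotes, now), 3))   # 101.25
```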

Interestingly this aggregation process is also an important ingredient as part of the workflow that an EMS provides. Just as with equities and FX, it is likely that greater data aggregation and decision support will become a vital ingredient in the EMS with the subsequent execution data feeding back into the post trade analysis. We expect to see leading firms implement deeper integration of pre and post-trade analytics for fixed income as trading technologies develop further.

Derivatives TCA – a mixed bag

Derivatives TCA is hard to summarise in a simple table. The most liquid instruments, such as listed index futures, have characteristics similar to global equities. OTC instruments are much harder to measure, and the availability of data and the level of analytical sophistication are akin to what is found in illiquid fixed income. As median and leading-edge firms expand their analysis and monitoring into derivatives, they will require the same tools available for other asset classes – including cost models and peer group analysis. Although data is available, the analysis and monitoring most firms perform for derivatives lags behind other asset classes with similar liquidity characteristics.

Conclusion – how far we have come

The world of analytics continues to move forward rapidly, driven by regulatory priorities and spurred forward by leading firms and their quest to add alpha. Over time, the continued development and refinement of new technologies should deliver even more precise data, better transparency, oversight and control of trading events in multiple venues and their impact on all asset classes.

Given the increased complexity of market structure and the ever-changing patterns of trading, the key to improving execution remains a moving target – one that will continue to evolve, in some ways radically, for the foreseeable future. Firms will need to remain nimble and apply lessons learned from post-trade analysis to ensure they adapt and thrive in the changing landscape.

*Sources: “MiFID II Fuels Investor Demand for TCA” and “The State of Transaction Cost Analysis – 2019”, Greenwich Associates; available at https://www.greenwich.com/node/109726 and https://www.greenwich.com/market-structure-technology/state-of-transaction-cost-analysis-2019

©BestExecution 2020