Best Execution 10th Anniversary: Michael Sparkes on TCA

Michael Sparkes, Director, ITG Analytics, looks at how the TCA industry has developed and what the future might hold.

The analysis of equity trading has been around for many years in the form of Transaction Cost Analysis (TCA), although the nature of the analysis and its implementation has changed dramatically over the last ten years. Significant developments have been made to help measure and manage increasingly complex trading and execution processes, while the scope has broadened from equities to other major asset classes. The result is that the benefits are now appreciated by a far wider audience.

In the intervening years Europe has undergone profound changes in regulation, market structure, technology and even the behaviour of market participants. In a number of cases these ingredients have interacted with each other, often with unintended or unexpected consequences.

Financial markets have mirrored numerous other industries in the increased use of electronic systems and the production of vast amounts of data with considerably better granularity and definition. This has sharply improved the accuracy of trading analysis and fostered the development of a broad range of sophisticated decision-support tools for traders.

Past performance

Ten years ago some asset managers only reviewed their trading data once a quarter, sending files to a TCA provider for processing, sometimes several weeks after the end of the period. The TCA vendor would often process the data and return the results via courier in the form of one or more bound volumes, similar to old-fashioned telephone directories (another relic of a bygone era!). This meant that the client reviewed trading results four or five months after the fact, and only in hard copy.

In addition to the huge lags in the monitoring process, the data analysed was far less granular than today. There was typically no record of the venues on which trades occurred, for instance, or any details of the algo strategies employed. In fact algos were used by only a relatively small number of clients, and details were lacking, with strategies simply labelled as 'algos'.

While there were pioneers who engaged positively with TCA to understand the drivers of their costs, the metrics most commonly used to demonstrate whether or not a trade achieved a simple definition of best execution were the interval VWAP or the day's open, close, high and low. Implementation shortfall benchmarks were used by some, but by no means all, clients, while cost models did not always take into account market conditions during the trade, a fundamental element in assessing the outcome. Similarly, peer-based comparisons were usually relatively high level, which made it difficult to compare apples to apples.

TCA gains traction

The take-up of analytics by the industry over the last decade has been widespread and surprisingly positive. There has been a clear move towards improved measurement and management of trading processes, which has led to better cost control and investment performance for the end investor. This has not just been the case in equity trading: thanks to pressure from regulators and clients, it extends across other asset classes such as FX and fixed income.

Benchmarking has also become much more sophisticated and is typically centred on Implementation Shortfall (essentially the movement in price between the start and end of the order, also known as slippage), often in conjunction with a model of expected costs for a given order and strategy combination. The use of algos is now prevalent in equity markets and is growing in FX trading as well. The precise measurement of the performance of these strategies is a core element in contemporary TCA.
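
To make the benchmark concrete, here is a minimal sketch of an implementation shortfall calculation for a single order, set against a pre-trade cost estimate. The field names and the model figure are illustrative assumptions, not any standard schema.

```python
# A minimal sketch of implementation shortfall (slippage), assuming we know
# the arrival (decision) price and the order's average execution price.

def implementation_shortfall_bps(arrival_price, avg_exec_price, side):
    """Slippage between arrival price and average execution price,
    in basis points; positive values represent a cost."""
    sign = 1.0 if side.lower() == "buy" else -1.0
    return sign * (avg_exec_price - arrival_price) / arrival_price * 10_000

# Example: a buy order arriving at 100.00 and filling at an average of 100.08
cost_bps = implementation_shortfall_bps(100.00, 100.08, "buy")
print(f"IS: {cost_bps:.1f} bps")            # IS: 8.0 bps

# In practice this is compared with a pre-trade model's expected cost for the
# same order size, strategy and market conditions (figure below is invented).
expected_bps = 6.5
print(f"vs model: {cost_bps - expected_bps:+.1f} bps")
```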

These days, most firms use electronic Order Management Systems or Execution Management Systems (OMSs and EMSs) to facilitate their trading. Together they provide the data that is accelerating the proliferation of analysis, including a range of FIX tags that offer greater colour and insight than ever before. Which venue? Which algo? Which strategy? All can now be monitored closely and fine-tuned to obtain better results. Timestamps are often calibrated down to the millisecond, and with each fill of an algo being recorded there can be hundreds if not thousands of separate records for a single order. That can mean millions of rows of data per annum to be processed and analysed for one trading desk.
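
As an illustration of what that processing entails, the sketch below rolls fill-level records up to an order-level, quantity-weighted average execution price. The record layout is a simplified assumption; real feeds carry far richer FIX detail.

```python
# A sketch of aggregating fill-level records to order level, assuming each
# fill carries an order id, quantity and price (illustrative field names).
from collections import defaultdict

fills = [
    {"order_id": "A1", "qty": 300, "price": 100.02},
    {"order_id": "A1", "qty": 500, "price": 100.05},
    {"order_id": "A1", "qty": 200, "price": 100.11},
]

totals = defaultdict(lambda: {"qty": 0, "notional": 0.0})
for f in fills:
    t = totals[f["order_id"]]
    t["qty"] += f["qty"]
    t["notional"] += f["qty"] * f["price"]

for oid, t in totals.items():
    avg_price = t["notional"] / t["qty"]    # quantity-weighted average
    print(f"{oid}: {t['qty']} shares @ {avg_price:.4f}")
```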

The concept of best execution has moved on from simply observing whether the price was the best available at a point in time to a more holistic view of the investment process. In many cases it starts upstream, in portfolio construction, stock selection and manager timing, flowing through into the strategy selected for the execution process. Once the order has been completed, many clients also look to see whether there is any reversion in the price, potentially implying excessive impact, either at the granular fill level or at the level of the overall trade. The way an order is traded should reflect other aspects of the investment process, such as market conditions, in order to capture as much alpha as possible.
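
A simple version of such a reversion check might look like the sketch below; the observation interval and alert threshold are purely illustrative choices.

```python
# A sketch of a post-trade reversion check, assuming we have the order's
# average execution price and the mid price some interval after completion.

def reversion_bps(avg_exec_price, post_trade_mid, side):
    """Price movement after completion, signed so that a positive value
    means the price moved back in the order's favour, suggesting impact."""
    sign = 1.0 if side.lower() == "buy" else -1.0
    return sign * (avg_exec_price - post_trade_mid) / avg_exec_price * 10_000

# A buy filled at an average of 100.08; 30 minutes later the mid is 100.01
rev = reversion_bps(100.08, 100.01, "buy")
if rev > 5.0:                               # hypothetical alert threshold
    print(f"Possible excessive impact: {rev:.1f} bps reversion")
```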

Other asset classes

While equity TCA and its usage have advanced rapidly over the years, the analysis of FX and fixed income is still catching up. The well-publicised lawsuits against custodian banks and the scandals surrounding trading irregularities in the WM/R Fix and LIBOR have sharply focused minds. The challenges for TCA, though, are far greater in these markets, which have traditionally traded over the counter. Obtaining comprehensive market data has historically been virtually impossible, but a combination of increased electronic trading and enhanced post-trade reporting requirements is changing the dynamics.

The increased use of Electronic Communication Networks (ECNs) in FX trading has helped considerably as far as clean and accurate data is concerned. However, the variety of trade types and strategies, including forward trades, swaps, non-deliverable forwards, fixing trades, hedging strategies and so on, means that the inputs have to be varied and applied with care to assess a client's activities correctly. Despite these challenges, TCA in the FX market is now well established, and a range of important benchmarks is available, going well beyond a simple mid-price or bid-ask spread at a given point in time.

Fixed income is experiencing the same trends, although the challenges are greater. Not only is there voice trading, often coupled with highly illiquid, infrequently traded instruments, but there are many sub-classes of instrument, each with its own peculiarities. Electronic trading has increased in some parts of the fixed income market, and regulators are requiring more post-trade transparency. This is making better data available, but it has been a slow process.

Looking ahead

The question, of course, is what the next ten years will hold for TCA. Some trends will undoubtedly continue: better data, more FIX tags, greater granularity, more sophisticated models and more in-depth insight into the FX and fixed income markets.

There is likely to be far more joined-up thinking between pre-trade, real-time and post-trade analysis. The current norm of end-of-day processing coupled with T+1 and quarterly reporting is rapidly becoming outdated. Embedded analytics are already seen in most OMS and EMS platforms, and these will grow further, almost certainly enhanced by aspects of artificial intelligence to identify and analyse patterns as they develop live in the market. The volumes of data are simply too great to be handled effectively by a human brain.

New technologies are being developed, such as the algo wheel, which aims to reduce the unintended biases created by human intervention in the selection of counterparties or algo strategies. This in turn enhances the reliability of the data set, whether looking at one client's results or across a broader peer group.
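
At its core an algo wheel is a weighted random allocation of orders across strategies, which can be sketched very simply; the broker and strategy names below are invented for illustration.

```python
# A minimal sketch of an algo wheel: orders are allocated at random across a
# set of broker algos so performance comparisons are unbiased by trader
# preference. Names and weights here are illustrative assumptions.
import random

wheel = {"BrokerA_VWAP": 0.25, "BrokerB_IS": 0.25,
         "BrokerC_POV": 0.25, "BrokerD_Dark": 0.25}

def pick_algo(wheel):
    """Weighted random selection; over time the weights can be tilted
    towards the strategies that measure best, while keeping enough
    randomisation that every strategy continues to generate data."""
    algos, weights = zip(*wheel.items())
    return random.choices(algos, weights=weights, k=1)[0]

print(pick_algo(wheel))
```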

Even quarterly peer analysis may eventually be replaced by something closer to real time, such as a rolling three-month window updated daily, retaining data sets large enough for statistical significance while introducing much more recent data against which to compare performance. Additionally, the ability to analyse one's own history, and that of a broader peer group, for similar orders in similar market conditions is becoming a key requirement. Traders increasingly want to know which algo to use in current market conditions to minimise cost, or risk, or both, and how much confidence they can have in that choice.
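
A rough sketch of such a 'similar orders' query appears below; the order characteristics, similarity bands and cost figures are all invented for illustration.

```python
# A sketch of a 'similar orders' lookup: filter a history of orders to those
# with comparable size (%ADV) and volatility, then rank algos by average
# cost. Fields and bands are illustrative assumptions, not a standard schema.
from statistics import mean

history = [
    {"algo": "BrokerA_VWAP", "pct_adv": 4.0, "vol": 22.0, "cost_bps": 7.1},
    {"algo": "BrokerB_IS",   "pct_adv": 5.5, "vol": 25.0, "cost_bps": 5.9},
    {"algo": "BrokerA_VWAP", "pct_adv": 6.0, "vol": 24.0, "cost_bps": 6.4},
    {"algo": "BrokerB_IS",   "pct_adv": 1.0, "vol": 10.0, "cost_bps": 2.2},
]

def similar(orders, pct_adv, vol, band=0.5):
    """Keep orders whose size and volatility are within +/-50 per cent
    of the current order's characteristics."""
    return [o for o in orders
            if abs(o["pct_adv"] - pct_adv) <= band * pct_adv
            and abs(o["vol"] - vol) <= band * vol]

by_algo = {}
for o in similar(history, pct_adv=5.0, vol=23.0):
    by_algo.setdefault(o["algo"], []).append(o["cost_bps"])

# Rank candidate algos by average cost over comparable historical orders
for algo, costs in sorted(by_algo.items(), key=lambda kv: mean(kv[1])):
    print(f"{algo}: {mean(costs):.1f} bps over {len(costs)} similar orders")
```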

Over time, the idea of having separate analysis for each asset class is likely to be replaced by true multi-asset analysis. For example, this could mean weighing the cost of delaying an FX trade by 24 hours or more against the related trade in an equity or a bond. Currently, great effort may be put into shaving a few basis points off the implicit cost of trading an equity position, only for ten times that saving to be given away through an inefficient process for handling the resulting currency trade.

The concept of measuring and managing a trading process through the collection of highly detailed data is here to stay. The same applies to the use of data from previous experience to inform and support decisions about future trades. The enormous volumes of data being processed, and the rise of the data scientist on the trading desk, will also inevitably become embedded features. The ways in which an asset manager can enhance performance and mitigate risk will vary, but the ability to use TCA and a broad range of analytics to help achieve improved investment performance has never been better. That trend is only likely to accelerate.

©BestExecution 2018
