TCA: RISING TO THE TASK

Louis Lovas, Director of Solutions at OneMarketData, LLC, explains that the rising uptake of transaction cost analysis has created a technology challenge that has to be met.

Technology is driving a sweeping transformation in trading styles as the accelerating use of algorithms creates a more competitive environment. Market participants are witnessing a new normal defined by thinning margins, diminishing volumes and uncertain regulatory policy. Paradoxically, this translates into the increased use of algorithms as firms look to squeeze alpha out of a diminishing pot. Regulators are also watchful as technological change blankets the industry. They wrestle with fears of systemic risk as technology fallout manifests itself in HFT-induced crashes, IPO mishaps and rogue-code debacles (e.g. Knight Capital). However, this does not signal an end to profitability and the discovery of alpha for the institutional firm, but rather a changing attitude. Adapting to this new normal has driven firms to think outside the box in order to penetrate the fog of market structure, seek asset-class diversification and explore far-off geographies.

While high-speed algorithmic trading has been grabbing the headlines, another technology evolution has been occurring, one that leverages the same state-of-the-art, high-speed computer and software technologies. As a corollary to high-frequency traders’ low-latency objectives, asset managers and institutional investors are focused on overall trade performance. This has pushed best execution beyond price to an overall understanding and management of trade performance and opportunity costs, thereby creating the incentive to invest in technology for Transaction Cost Analysis (TCA), simply because it can generate alpha by exposing, and ideally lowering, the cost at which firms buy and sell.

Widespread liquidity across lit and dark pools has pushed firms to expand their hunt for alpha across brokers and borders. The disparity across markets is a natural barrier to efficient execution. Once alpha is discovered, the goal of stealth execution algorithms is to protect it by taking prices within range and working to blend in with other participants’ activity. While every trade has a lasting effect, the goal is to minimise market-moving impact. Measuring the change in a benchmark (e.g. VWAP) before and after order completion can indicate an algo’s stealth effectiveness. If the market reverts to previous levels, it is an indication of just how much an order may have influenced the market. This type of collateral market statistic, combined with performance analytics, offers a profile of trade executions that adds up to an overall view of execution quality.
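
As a rough illustration of this reversion check, the sketch below compares an interval VWAP immediately before and after order completion. It is a minimal example under stated assumptions: the trade record, its field names and the five-minute window are hypothetical, not a prescribed methodology.

```python
# Sketch: measure the change in an interval VWAP before vs. after an
# order completes, as a rough indicator of reversion / market impact.
# Trade fields and the 5-minute window are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Trade:
    ts: datetime      # trade timestamp
    price: float
    size: float

def interval_vwap(trades: list[Trade], start: datetime, end: datetime) -> float:
    """Volume-weighted average price of market trades within [start, end)."""
    window = [t for t in trades if start <= t.ts < end]
    volume = sum(t.size for t in window)
    return sum(t.price * t.size for t in window) / volume if volume else float("nan")

def benchmark_change(trades: list[Trade], completion: datetime,
                     window: timedelta = timedelta(minutes=5)) -> float:
    """Relative change in VWAP across order completion. For a buy order,
    a markedly negative value (the price falling back after the last fill)
    suggests the order itself pushed the market up while it was working."""
    pre = interval_vwap(trades, completion - window, completion)
    post = interval_vwap(trades, completion, completion + window)
    return (post - pre) / pre
```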

TCA is intended to serve a number of quantifiable objectives. The first is to track and compare broker performance – examining intra-day efficiency by measuring and monitoring executions against benchmarks, including arrival price and market price. The second is to identify order exceptions that have become problems – to find the outliers and highlight the impact of implicit costs, or slippage, measured as implementation shortfall, and opportunity costs such as crossing the spread.
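
These slippage and shortfall measures reduce to a comparison of average fill price against a benchmark. The sketch below is a minimal illustration, assuming hypothetical fill records; it expresses implementation shortfall against the arrival price in basis points.

```python
# Sketch: per-order slippage against an arrival-price benchmark, in
# basis points. The Fill record and side convention are illustrative.
from dataclasses import dataclass

@dataclass
class Fill:
    price: float
    qty: float

def avg_fill_price(fills: list[Fill]) -> float:
    qty = sum(f.qty for f in fills)
    return sum(f.price * f.qty for f in fills) / qty

def shortfall_bps(fills: list[Fill], arrival_price: float, side: str) -> float:
    """Implementation shortfall vs. arrival price, in basis points.
    Positive means the order paid up relative to the price prevailing
    when it arrived (a cost); negative means price improvement."""
    signed = 1.0 if side == "buy" else -1.0
    return signed * (avg_fill_price(fills) - arrival_price) / arrival_price * 1e4

# Example: a buy order filled above its arrival price shows positive slippage.
print(shortfall_bps([Fill(100.06, 400), Fill(100.10, 600)], 100.00, "buy"))
```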

To achieve this cross-broker and cross-asset visibility, institutional investors are demanding custom TCA solutions. The same technologies that enable algo trading are being leveraged for their analytical and data-processing abilities. These low-latency engines encompass the core set of capabilities necessary for measuring both real-time and historical trade performance. TCA relies on three fundamental components: data management, analytics and visualisation. These are needed for both traditional post-trade TCA and intra-day, real-time cost analysis.

A time-series tick database combined with Complex Event Processing (CEP) is the ideal technology mix for the organisation and understanding of data. The ability to consolidate, filter and enrich raw market data is a hallmark of CEP. Trading diversity creates challenges for price transparency and for measuring execution quality against benchmarks. Achieving this requires highly customisable analytical tooling, technology that customers can easily pilot. And that forces vendors to pay close attention to tooling design to ensure their technology is easy to use, robust and scalable. It therefore makes sense to leverage the same algo-trading technology to build customised systems for cost management.

The role of data management

Data management for TCA is about bringing together disparate data types. It starts with consuming market data – trades and quotes, which come in many shapes, sizes and encodings. Moreover, tick data is usually derived from many sources; there are 13 major exchanges in the US alone and 20 across Europe and Asia. The determinants of price discovery, volume and trading patterns define a structure unique to each market, asset class and geography, shaped by its participants and current regulation.

Measuring trade performance demands confidence in the accuracy and quality of pricing data. Tick data management has to deal with cancellations and corrections, consolidating order books across exchanges and applying corporate-action price adjustments and symbol name changes. The creation of accurate and reliable price benchmarks for measuring trade performance is only possible with clean, consistent data. By the same token, capturing all order activity is the cornerstone of understanding trade performance.
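
As a small illustration of the price-adjustment step, the sketch below back-adjusts historical prices with corporate-action factors. The split date, factor and field names are invented for the example; they are not real reference data.

```python
# Sketch: back-adjust historical prices for corporate actions so that
# benchmarks computed from the history line up with today's quotes.
# The (date, factor) pairs below are illustrative assumptions.
from datetime import date

# Hypothetical adjustment: a 2-for-1 split effective 2013-03-15 halves
# all prices observed before that date.
ADJUSTMENTS = [(date(2013, 3, 15), 0.5)]

def adjust_price(trade_date: date, raw_price: float) -> float:
    factor = 1.0
    for effective, f in ADJUSTMENTS:
        if trade_date < effective:
            factor *= f
    return raw_price * factor

print(adjust_price(date(2013, 3, 1), 84.20))   # pre-split tick, adjusted
print(adjust_price(date(2013, 4, 1), 42.35))   # post-split tick, unchanged
```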

Data management for TCA demands access to a broad view of market data. Whether for traditional end-of-day analysis or real-time monitoring, historical content along with real-time intra-day price action plays a vital role in establishing benchmarks. It starts with consuming market data, often in differing formats and protocols, from liquidity suppliers.

Tick data has to be consolidated, coalesced and price-adjusted across providers for true price transparency. The creation of accurate and reliable order-book analytics for meaningful benchmarks is only possible with this scrubbing. This is especially true in markets that do not provide a national best bid-offer (NBBO), such as foreign exchange. An accurate arrival price is derived from this broader, consolidated view of liquidity.
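
The consolidation step can be illustrated with a minimal sketch: assuming hypothetical venue quotes, it derives a best bid/offer across providers and takes the mid as a simple arrival-price benchmark.

```python
# Sketch: derive a consolidated best bid/offer across venues where no
# official NBBO exists (e.g. FX), then take the mid as a simple arrival
# price. Venue names and quote fields are illustrative.
from dataclasses import dataclass

@dataclass
class Quote:
    venue: str
    bid: float
    ask: float

def consolidated_bbo(quotes: list[Quote]) -> tuple[float, float]:
    """Best bid is the highest bid across venues; best offer the lowest ask."""
    best_bid = max(q.bid for q in quotes)
    best_ask = min(q.ask for q in quotes)
    return best_bid, best_ask

def arrival_mid(quotes: list[Quote]) -> float:
    bid, ask = consolidated_bbo(quotes)
    return (bid + ask) / 2.0

book = [Quote("VenueA", 1.3101, 1.3104),
        Quote("VenueB", 1.3102, 1.3105),
        Quote("VenueC", 1.3100, 1.3103)]
print(consolidated_bbo(book), arrival_mid(book))
```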

Capturing and time-stamping individual orders and their corresponding fills also plays a vital role. The accuracy of measured execution quality against benchmark prices depends on the technology behind managing the data. This is especially true for real-time analysis, where benchmark prices and the determinants of notional prices include a historical context.

The analytical advantage

Analytics is central to TCA’s value. Increasing competition and thinning margins have heightened sensitivity to trade costs and brought into sharper focus the need for versatile tools for analytical TCA. Customisable analysis offers the flexibility to show order and fill performance against a variety of price benchmarks by venue, industry, algorithm or sector. The analysis can then provide insight into the best and worst execution performance.
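
A minimal sketch of this kind of slicing, assuming an illustrative fill-record layout and invented sample values, groups fill-level slippage by any chosen dimension such as venue or algorithm.

```python
# Sketch: slice fill-level slippage by an arbitrary dimension (venue,
# algorithm, sector, ...) to surface the best and worst performers.
# The record layout and sample numbers are illustrative assumptions.
from collections import defaultdict
from statistics import mean

fills = [
    {"venue": "XNAS", "algo": "VWAP", "slippage_bps":  2.1},
    {"venue": "BATS", "algo": "VWAP", "slippage_bps": -0.4},
    {"venue": "XNAS", "algo": "POV",  "slippage_bps":  5.7},
    {"venue": "DARK", "algo": "POV",  "slippage_bps":  1.2},
]

def slippage_by(dimension: str, records: list[dict]) -> dict[str, float]:
    buckets: dict[str, list[float]] = defaultdict(list)
    for r in records:
        buckets[r[dimension]].append(r["slippage_bps"])
    return {k: mean(v) for k, v in buckets.items()}

print(slippage_by("venue", fills))  # average cost per venue
print(slippage_by("algo", fills))   # average cost per algorithm
```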

Yet analysis of trade executions is necessarily complex, involving comparison of execution prices with collateral market statistics or benchmarks. These comparisons underpin market participation analysis and implementation shortfall analysis. The following chart (Figure 1. Execution performance by algorithm) shows the results of a historic volume-profile algorithm, where an order is carved up into varying clip sizes (Period order quantity) to correspond with a historic volume pattern of past trade activity (Period market volume). This stealth algorithm determines participation rates from a previous timeframe (i.e. previous day, week or month) to best mimic normal market activity.
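
The clip-sizing logic of such a volume-profile algorithm can be sketched in a few lines; the intraday profile values below are illustrative, not real market data.

```python
# Sketch: carve a parent order into per-period clip sizes that track a
# historical intraday volume profile, in the spirit of the volume-profile
# algorithm described above. Profile values are illustrative.
def clip_sizes(order_qty: int, volume_profile: list[float]) -> list[int]:
    """volume_profile holds historical volume per intraday period
    (e.g. from the prior day/week/month); clips are proportional."""
    total = sum(volume_profile)
    clips = [round(order_qty * v / total) for v in volume_profile]
    clips[-1] += order_qty - sum(clips)  # absorb rounding in the last period
    return clips

# U-shaped profile: heavier at the open and close, lighter midday.
profile = [120_000, 80_000, 50_000, 40_000, 45_000, 70_000, 110_000]
print(clip_sizes(100_000, profile))
```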

The result shows execution quality and market impact. As an algorithm works an order, it inevitably moves the market price in the direction traded. As this chart (Fig. 1) shows, the impact is felt in the rapid rise in fill price (Fill average) against the comparative benchmark (Period market VWAP) up to a mid-point in the day where resistance levels peak. Deeper analysis can also show individual fill performance against a variety of price benchmarks (Arrival, TWAP, etc.).

[Figure 1. Execution performance by algorithm]

Execution performance analysis as depicted in this chart (Fig. 1) can represent end-of-day execution quality for completed orders or point-in-time (intra-day) quality as an order is worked through the day. Benchmark prices should maintain consistent (time) periodicity relative to fill activity. Real-time TCA provides traders with information on costs, quality and potential market impact as it happens, where analytics become actionable information at the point of trade. Determining these metrics on an intervalised basis offers the ability to adjust an algorithm’s behaviour in real time. Execution strategies and routing logic can be adjusted intelligently in response to outlier conditions, either aggressively or passively, in reaction to market conditions or broker behaviour.
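
A minimal sketch of such an intervalised adjustment, with an invented slippage threshold and a simple passive/aggressive switch, might look like this.

```python
# Sketch: an intervalised check that nudges an algorithm toward more
# passive behaviour when interval slippage breaches a threshold. The
# 5 bps limit and the two-state posture are illustrative assumptions.
def next_posture(interval_slippage_bps: float, limit_bps: float = 5.0) -> str:
    """Return the posture for the next interval: go passive when the
    last interval cost more than the limit, otherwise stay aggressive."""
    return "passive" if interval_slippage_bps > limit_bps else "aggressive"

for slippage in (1.8, 7.3, 4.9):
    print(slippage, "->", next_posture(slippage))
```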

The value of visualisation

The third component, visualisation, fashions the analytical metrics into a human-readable format. As Figure 1 shows, visual representations that plot execution fill rates and participation rates against price-focused benchmarks offer a perspective that is easy to interpret.

The purpose of data visualisation is to simplify comprehension of data and promote understanding and insight. The terabyte volumes of market data and order activity can be easily consumed and processed by high-speed technology. However, the single easiest way for our brains to interpret large amounts of information, communicate trends and identify anomalies is to create visualisations that distil, filter and smooth the content.

A visual tier that sits over TCA analytics lets users view the results of complex algorithmic processing. Not only can they gain insight into what has happened (e.g. spotting outliers), it is also possible to forecast what might happen. Rich graphics – scatter plots, line graphs and heat maps – are more concise and offer the means to quickly derive business actions. Figure 2 depicts execution quality as a ratio of order dollar value to an order’s duration in the market. Outliers (in red) are indicated by their distance from the norm. The transition from spreadsheets to charts visually registers comparative values, trends and outliers because they are seen as a whole. A task that would be impossible poring over rows and columns of results is made manageable.
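
The outlier detection behind such a chart can be sketched simply; the z-score cutoff and the sample costs below are illustrative assumptions rather than a stated methodology.

```python
# Sketch: flag execution-quality outliers by distance from the norm,
# the same idea the Figure 2 scatter conveys visually. The cutoff of
# 2 standard deviations and the sample data are illustrative.
from statistics import mean, pstdev

def flag_outliers(costs_bps: list[float], cutoff: float = 2.0) -> list[bool]:
    mu, sigma = mean(costs_bps), pstdev(costs_bps)
    if sigma == 0:
        return [False] * len(costs_bps)
    return [abs(c - mu) / sigma > cutoff for c in costs_bps]

orders = [1.2, 0.8, 1.5, 9.6, 1.1, 0.9, -0.3, 1.4]
print(flag_outliers(orders))  # only the 9.6 bps order stands out
```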

[Figure 2. Execution quality by order dollar value and duration in the market, with outliers shown in red]

Visual representations that plot order activity, performance metrics and participation analysis against benchmark prices will therefore pinpoint outliers and can vastly improve an order’s final quality.

Trading in transition

Technology is reshaping the trading landscape as algorithms and low-latency analytics continue to dominate. But this is not the sole domain of high-speed traders. As the buyside becomes more discerning, demanding improved quality of execution, the sellside is forced to up the ante, offering expanded services to better understand and manage the trade lifecycle. Complex event processing (CEP) and tick data management are the consummate tools that can easily be recast and moulded to unearth trade performance, a goal that is central to the investment process as liquidity continues to be fragmented and fleeting. Uncovering the performance of trading behaviour through customised, personalised transaction cost analysis is now a critical component of any investor’s profitability. ■

Best Execution 2013