Article

The Cost of Algorithmic Trading: A First Look at Comparative Performance

Authors: Domowitz and Yegerman

Abstract

The authors examine transaction costs associated with algorithmic trading, based on a sample of 2.5 million orders, of which one million are executed via algorithmic means. The data permit a comparison of algorithmic executions with a broader universe of trades, as well as across multiple providers of model-based trading services. Algorithmic trading is found to be a cost-effective technique, based on a measure of implementation shortfall. The superiority of algorithm performance applies only for order sizes up to 10% of average daily volume, however. Algorithmic trading performance relative to a commonly used volume participation benchmark also is quite good, although certainty of outcome declines sharply with the size of the order. A clear link between performance and variability in performance relative to both benchmarks appears to be lacking. Although rough equality across providers is observed on average, this equality of performance breaks down quickly as order size grows.
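The abstract evaluates algorithmic executions against two yardsticks: implementation shortfall and a volume-participation (VWAP-style) benchmark. As a rough illustration of how such cost measures are typically computed, here is a minimal Python sketch; the order, fill, and market-trade figures are hypothetical, and the paper's own methodology may differ in detail.

```python
# Minimal sketch of two common execution-cost benchmarks for a hypothetical
# buy order. All numbers are illustrative, not from the paper.

def implementation_shortfall_bps(decision_price, fills, side=+1):
    """Cost versus the decision (arrival) price, in basis points.
    fills: list of (price, shares); side: +1 buy, -1 sell."""
    shares = sum(q for _, q in fills)
    avg_exec = sum(p * q for p, q in fills) / shares
    return side * (avg_exec - decision_price) / decision_price * 1e4

def vwap_slippage_bps(market_trades, fills, side=+1):
    """Cost versus the interval VWAP of all market trades, in basis points."""
    mkt_vol = sum(q for _, q in market_trades)
    vwap = sum(p * q for p, q in market_trades) / mkt_vol
    shares = sum(q for _, q in fills)
    avg_exec = sum(p * q for p, q in fills) / shares
    return side * (avg_exec - vwap) / vwap * 1e4

# Hypothetical data: a buy order filled in three slices during the interval.
fills = [(50.02, 3000), (50.05, 4000), (50.10, 3000)]
market_trades = [(50.00, 20000), (50.04, 30000), (50.08, 25000)]

print(round(implementation_shortfall_bps(50.00, fills), 1), "bps vs. arrival")
print(round(vwap_slippage_bps(market_trades, fills), 1), "bps vs. VWAP")
```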


... • Order-splitting benchmarks like TWAP, Almgren-Chriss targets, and VWAP. (1) Large asset managers conduct dynamic trading strategies using in-house trading desks and also via principal and agency trading with external brokers. (2) Madhavan (2002) discusses price improvement on order execution relative to VWAP. Domowitz and Yegerman (2005) estimate empirical order-execution costs benchmarked relative to VWAP. (3) Hagströmer and Nordén (2013) and Menkveld (2013) show that high-frequency (HFT) market makers are an important source of intraday liquidity. A common feature of HFT market makers is that they have "very short time-frames for establishing and liquidating positions" (SEC 2010), which is consistent with a zero target inventory level. ...
... The terminal restriction (1.8) gives boundary conditions for these time-varying coefficient functions. This allows us to derive endogenous intraday price and investor holding processes that are ... (13) If the terminal restriction (1.8) is eliminated, our model becomes simpler because the stock volatility becomes a free parameter and can, for example, be set to be one. The fact that competitive Radner equilibrium models without dividends have free volatilities is well-known; see, e.g., Theorem 4.6.3 in Karatzas and Shreve (1998). ...
Preprint
This paper presents a continuous-time model of intraday trading, pricing, and liquidity with dynamic TWAP and VWAP benchmarks. The model is solved in closed-form for the competitive equilibrium and also for non-price-taking equilibria. The intraday trajectories of TWAP trading targets cause predictable intraday patterns of price pressure, and randomness in VWAP target trajectories induces additional randomness in intraday price-pressure patterns. TWAP and VWAP trading both reduce market liquidity and increase price volatility relative to just terminal trading targets alone. The model is computationally tractable, which lets us provide a number of numerical illustrations.
... This includes Bertsimas and Lo [6], Almgren and Chriss [1,2], Gatheral and Schied [22], Engle et al. [17], Predoiu et al. [37], Boulatov et al. [8] and other research surveyed in Gatheral and Schied. (1) Large asset managers conduct dynamic trading strategies using in-house trading desks and also via principal and agency trading with external brokers. (2) Madhavan [32] discusses price improvement on order execution relative to VWAP. Domowitz and Yegerman [13] estimate empirical order-execution costs benchmarked relative to VWAP. (3) Hagströmer and Nordén [26] and Menkveld [34] show that high-frequency (HFT) market makers are an important source of intraday liquidity. A common feature of HFT market makers is that they have "very short time-frames for establishing and liquidating positions" (SEC [41]), which is consistent with a zero target inventory level. ...
... (1.9) (12) It is possible to extend our model to include noise-trader orders such that the floating stock supply becomes an exogenous stochastic process a(t) + b(t)Z_t + c(t)B_t, where a, b, and c are deterministic functions of time t ∈ [0, 1], B_t is the risk-factor Brownian motion in (1.4), and Z_t is a Brownian motion independent of all other random variables. (13) If the terminal restriction (1.8) is eliminated, our model becomes simpler because the stock volatility becomes a free parameter and can, for example, be set to be one. The fact that competitive Radner equilibrium models without dividends have free volatilities is well-known; see, e.g., Theorem 4.6.3 in Karatzas and Shreve [28]. ...
Article
Full-text available
This paper presents a continuous-time model of intraday trading, pricing, and liquidity with dynamic TWAP and VWAP benchmarks. The model is solved in closed-form for the competitive equilibrium and also for non-price-taking equilibria. The intraday trajectories of TWAP trading targets cause predictable intraday patterns of price pressure, and randomness in VWAP target trajectories induces additional randomness in intraday price-pressure patterns. TWAP and VWAP trading both reduce market liquidity and increase price volatility relative to just terminal trading targets alone. The model is computationally tractable, which lets us provide a number of numerical illustrations.
... Algorithmic Trading systems typically aim at achieving or beating a specified benchmark with their executions and may be distinguished by their underlying benchmark, their aggressiveness or trading style as well as their adaptation behaviour [15]. The volume-weighted average price (VWAP), which is calculated as the ratio of the value traded and the volume traded within a specified time horizon, commonly serves as a benchmark for (automated) trading [10]. Research on aggressiveness of orders is e.g. ...
... Empirical research found the execution quality of algorithms to be inferior to executions handled by a broker. Nevertheless, this underperformance can be more than offset by the fact that algorithms can be offered at lower fees than human order handling [10]. Algorithms can be offered to customers at lower fees, as no (expensive) human traders are involved. ...
Article
As successful algorithmic trading systems constitute a priceless value to their operators, their procedures of trading are kept secret and only little is known about their adaptation behavior to current market developments. Based on a unique dataset provided by Deutsche Boerse AG the activity of computerized traders is analyzed. As the dataset provides high-precision timestamps a thorough analysis of submission, deletion and execution activities is enabled. Being able to distinguish algorithmic and non-algorithmic orders, empirical evidence on the different structures of algorithmic and non-algorithmic order flow is presented.
... Truly, over the last several years, speed of internet and news dissemination have improved. As in Domowitz and Yegerman (2005), the advent of algorithmic trading has provided institutional traders with the option of efficient execution. Anand et al. (2012) find broker selection is a fundamental dimension of institutional returns and persistence in execution performance is important enough to explain a significant portion of institutional investor performance. ...
Article
Full-text available
This study examines the performance of foreign investors’ (FI) portfolio alphas in the Indian stock market. We adopt calendar-time portfolio approaches on a dataset spanning the period 2003-2019. Holdings-based portfolios do not generate abnormal performance, while transactions-based portfolios show buys of foreign investors underperform sells. We investigate whether Direct Market Access (DMA) can help decrease the underperformance of FI. Though DMA reduces the underperformance of FI portfolios on shorter holding periods, trade informativeness remains negative across all trading holding periods, from one day to one year. The trading behavior continues to be positive feedback, indicating informational disadvantage to FI.
... They report a beneficial effect of AT on liquidity, noting that AT increases the realized spreads for market makers, indicating that market makers employ algorithms to trade both sides of the book. An earlier study, Domowitz and Yegerman [2005], studies a cross-section of such algorithms from different providers. Adverse selection for market makers, when using AT, can quickly build up costs when liquidity vanishes or suddenly decreases on one side of the limit order book. ...
Preprint
Full-text available
Market information events are generated intermittently and disseminated at high speeds in real-time. Market participants consume this high-frequency data to build limit order books, representing the current bids and offers for a given asset. The arrival processes, or the order flow of bid and offer events, are asymmetric and possibly dependent on each other. The quantum and direction of this asymmetry are often associated with the direction of the traded price movement. The Order Flow Imbalance (OFI) is an indicator commonly used to estimate this asymmetry. This paper uses Hawkes processes to estimate the OFI while accounting for the lagged dependence in the order flow between bids and offers. Secondly, we develop a method to forecast the near-term distribution of the OFI, which can then be used to compare models for forecasting OFI. Thirdly, we propose a method to compare the forecasts of OFI for an arbitrarily large number of models. We apply the approach developed to tick data from the National Stock Exchange and observe that the Hawkes process modeled with a Sum of Exponential's kernel gives the best forecast among all competing models.
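As a rough illustration of the Order Flow Imbalance indicator discussed in this abstract, the sketch below computes a simple OFI from successive best-quote snapshots, in the spirit of standard OFI constructions; it does not reproduce the paper's Hawkes-process model, and the quote data are hypothetical.

```python
# A minimal Order Flow Imbalance (OFI) sketch over best-quote updates.
# The snapshots below are hypothetical (bid_px, bid_qty, ask_px, ask_qty).

def ofi(quotes):
    """Sum the bid-side minus ask-side order-flow contributions across updates."""
    total = 0.0
    for prev, curr in zip(quotes, quotes[1:]):
        pb, pbq, pa, paq = prev
        cb, cbq, ca, caq = curr
        # Bid side: +qty when the bid improves, delta qty when unchanged,
        # -previous qty when the bid retreats.
        if cb > pb:
            e_bid = cbq
        elif cb == pb:
            e_bid = cbq - pbq
        else:
            e_bid = -pbq
        # Ask side is the mirror image.
        if ca < pa:
            e_ask = caq
        elif ca == pa:
            e_ask = caq - paq
        else:
            e_ask = -paq
        total += e_bid - e_ask
    return total

snapshots = [
    (99.98, 500, 100.02, 400),
    (99.99, 300, 100.02, 350),   # bid improves, ask thins
    (99.99, 600, 100.03, 200),   # bid deepens, ask retreats
]
print("OFI over window:", ofi(snapshots))
```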
... Unsurprisingly, numerous research efforts have been dedicated to understanding how this intense automation of trading impacts market dynamics (Chaboud et al., 2014). Previous studies have found that algorithmic trading improves market liquidity (Hendershott et al., 2011) and facilitates price discovery (Carrion, 2013; Brogaard et al., 2014; Hirschey, 2021), also contributing to decreasing trading costs (Domowitz and Yegerman, 2005; Kim, 2010). Nonetheless, it is important to underline that these positive externalities have been validated during "normal" market evolution (SEC, 2020), whereas algorithmic trading can diminish liquidity and exacerbate volatility during distressed markets, with dire economic consequences (Treleaven et al., 2013). ...
Article
Full-text available
Generating reliable trading signals is a challenging task for financial market professionals. This research designs a novel decision-support system (DSS) for algorithmic trading and applies it empirically to two main crude oil markets. The novel DSS enables investors to interactively build algorithmic trading strategies by fine-tuning various predefined integral elements. The main novelty of this study is the forecasting procedure encompassed in the DSS, and the flexibility of the system, which allows users to adjust the parameters of the embedded predictive model and the length of the recursive window based on individual preferences and the trade-off between prediction accuracy (increased computing intensity) and computing efficiency. The DSS also introduces two new steps into a standard fixed-length recursive-window out-of-sample forecasting technique. It first estimates a universe of candidate models on each rolling window and then applies a fitness function to optimize model fit and produce more reliable one-step predictions from each recursive forecasting origin. Point forecasts are subsequently fitted into algorithmic trading strategies, whose absolute and risk-adjusted performance is finally evaluated by the DSS. In implementing the DSS-based algorithmic trading strategies, the system performs 60,760 estimations and 1,736 optimizations for each market. In robustness checks, an additional eight DSSs are designed and evaluated. The results confirm the superiority of DSS-based algorithmic trading strategies in terms of predictive ability and investment performance for both markets. Hence, owing to its performance, flexibility and generalizability, the DSS is an important tool for prediction, decision-making, and algorithmic trading in the financial markets.
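To make the fixed-length recursive-window procedure concrete, here is a minimal sketch of a rolling-window loop that fits a small universe of candidate models on each window, selects one with a fitness function, and issues a one-step-ahead forecast. The AR candidates and the in-window sum-of-squared-errors fitness rule are simplified stand-ins, not the paper's DSS.

```python
# Rolling-window, candidate-model, one-step-ahead forecasting sketch.
import numpy as np

def fit_ar(y, lags):
    """Least-squares AR(lags) fit; returns coefficients (intercept first)."""
    X = np.column_stack([np.ones(len(y) - lags)] +
                        [y[lags - k - 1:len(y) - k - 1] for k in range(lags)])
    beta, *_ = np.linalg.lstsq(X, y[lags:], rcond=None)
    return beta

def forecast_ar(y, beta, lags):
    x = np.concatenate(([1.0], y[-1:-lags - 1:-1]))
    return float(x @ beta)

def rolling_forecasts(y, window=100, candidate_lags=(1, 2, 3)):
    preds = []
    for origin in range(window, len(y)):
        train = y[origin - window:origin]
        best = None
        for lags in candidate_lags:                 # candidate-model universe
            beta = fit_ar(train, lags)
            fitted = [forecast_ar(train[:t], beta, lags)
                      for t in range(lags + 1, len(train))]
            sse = np.sum((np.array(fitted) - train[lags + 1:]) ** 2)
            if best is None or sse < best[0]:       # fitness = in-window SSE
                best = (sse, beta, lags)
        preds.append(forecast_ar(train, best[1], best[2]))
    return np.array(preds)

rng = np.random.default_rng(0)
prices = np.cumsum(rng.normal(0, 1, 400)) + 60.0    # hypothetical price path
print(rolling_forecasts(prices)[:5])
```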
... For order sizes up to 10% of daily average trading volume, algorithmic trading has been found to be a cost-effective method (Domowitz and Yegerman, 2005). The auto quote power of algorithms as a competitive advantage disappears when new entrants come with better models. ...
Article
Algorithmic trading has brought about a paradigm shift in the Indian stock market. The popularity of algorithmic trading is gaining momentum among Indian traders and investors due to technological advancement. The objective of this paper is to apply Interpretive Structural Modeling (ISM) to develop a hierarchical structure among the key barriers to algorithmic trading in India. Eleven barriers were identified through the literature review and then validated for significance by experts using a structured questionnaire. The ISM approach has been utilized to rank the barriers and analyze their mutual interactions. Subsequently, MICMAC analysis was conducted to reveal the dependence and driving power of these barriers. MICMAC analysis also elicits the relative importance of and interdependence between these barriers in the Indian context. A list of relevant barriers significantly helps practitioners make the right decisions when adopting algorithmic trading. The study is important in the Indian context due to the scarcity of research in this area.
... The widespread reliance on such solutions engendered new empirical patterns observable across major financial markets [24]. Dealing with pre-specified tasks at hand, automated trading systems have been leveraging execution speed while simultaneously attempting to optimize objective function be it the minimization of price volatility or transaction costs [25]. ...
Article
Full-text available
The paper postulates that enhanced informational efficiency and signal processing capacity, which have characterized the evolution of commodity markets’ architecture during the last two decades, have rendered commodity prices more robust with respect to external shocks. Our econometric analysis of times series over 2001–2015 revealed a persistent decline in the responsiveness of crude oil prices to inflows of information concerning potentially supply-disruptive events. International news on terrorist attacks involving damage to oil infrastructure including those occurring in proximity to oil extraction sites, political unrest, and conflicts of rivaling factions are all documented to exercise a decreasing impact on oil price volatility both over short and medium observation spans. The previously observed spikes in oil prices accompanying similar disruptive events in OPEC countries are also shown to flatten over time as price sensitivity to information shocks declines. The discovered weakening of market response becomes more pronounced from the mid-2000s, which corresponds to the period of rapid algorithmization of commodity trading.
... For order sizes up to 10% of daily average trading volume, algorithmic trading has been found to be a cost-effective method (Domowitz and Yegerman, 2005). The auto quote power of algorithms as a competitive advantage disappears when new entrants come with better models. ...
Conference Paper
Algorithmic trading has brought about a paradigm shift in the Indian stock market. The popularity of algorithmic trading is gaining momentum among Indian traders and investors due to technological advancement. The purpose of this paper is to apply Interpretive Structural Modeling to develop a hierarchical structure among the key barriers to algorithmic trading in India. Eleven barriers were identified through the literature review and then validated for significance by domain experts using a structured questionnaire. The interpretive structural modeling (ISM) approach has been utilized to rank the barriers and analyse their mutual interactions. Subsequently, MICMAC analysis was conducted to elucidate the dependence and driving power of these barriers. MICMAC analysis also elicits the relative importance of and interdependence between these barriers in the Indian context. For practitioners, the list of relevant barriers provides guidance for deciding whether to adopt algorithmic trading. The study is important in the Indian context due to the scarcity of research in this area. For researchers, this methodology facilitates further exploratory studies by identifying the factors and focusing on their interactions through hierarchical structures. The proposed model, developed through the qualitative ISM modeling technique, has been accomplished from the perspectives of capital market experts and brokers in algorithmic trading in India. Because of the novelty of the research, we presume it may contribute significantly to the literature.
... A number of studies have suggested that AT and HFT could follow VWAP strategies to optimize the timing of their trades (e.g. Domowitz and Yegerman, 2005;Hendershott et al., 2011;Easley, Lopez de Prado, and O'Hara, 2012). Carrion (2013) uses end-of-day VWAP metrics to show that, ex post, HFT times the market successfully. ...
Article
Does Algorithmic Trading (AT) exacerbate price swings in turbulent markets? We find that stocks with high AT experience less price drops (surges) on days when the market declines (increases) for more than 2%. This result is consistent with the view that AT minimizes price pressures and mitigates transitory pricing errors. Further analyses show that the net imbalances of AT liquidity demand and supply orders have smaller price impacts compared to non-AT net order imbalances and algorithmic traders reduce their price pressure by executing their trades based on the prevailing volume-weighted average prices.
... However, AT is insufficient to make a profit on every decision at every trading moment because the financial market is highly complicated (Domowitz & Yegerman, 2006; Hu et al., 2015; Yadav, 2015). In their review of its limitations, Hu et al. (2015) find that AT poses a challenge in predicting future market trends. ...
Article
We study trading systems using reinforcement learning with three newly proposed methods to maximize total profits and reflect real financial market situations while overcoming the limitations of financial data. First, we propose a trading system that can predict the number of shares to trade. Specifically, we design an automated system that predicts the number of shares by adding a deep neural network (DNN) regressor to a deep Q-network, thereby combining reinforcement learning and a DNN. Second, we study various action strategies that use Q-values to analyze which action strategies are beneficial for profits in a confused market. Finally, we propose transfer learning approaches to prevent overfitting from insufficient financial data. We use four different stock indices—the S&P500, KOSPI, HSI, and EuroStoxx50—to experimentally verify our proposed methods and then conduct extensive research. The proposed automated trading system, which enables us to predict the number of shares with the DNN regressor, increases total profits by four times in S&P500, five times in KOSPI, 12 times in HSI, and six times in EuroStoxx50 compared with the fixed-number trading system. When the market situation is confused, delaying the decision to buy or sell increases total profits by 18% in S&P500, 24% in KOSPI, and 49% in EuroStoxx50. Further, transfer learning increases total profits by twofold in S&P500, 3 times in KOSPI, twofold in HSI, and 2.5 times in EuroStoxx50. The trading system with all three proposed methods increases total profits by 13 times in S&P500, 24 times in KOSPI, 30 times in HSI, and 18 times in EuroStoxx50, outperforming the market and the reinforcement learning model.
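Two of the ideas described above lend themselves to a compact illustration: delaying the buy/sell decision when the Q-values are too close to call ("confused market"), and sizing the trade with a separate regressor. The sketch below is a hypothetical stand-in; the q_values and predict_shares functions are placeholders, not the paper's trained networks.

```python
# Toy decision rule: act only when the best Q-value clearly beats the runner-up,
# and let a separate regressor decide the number of shares. Illustrative only.
import numpy as np

ACTIONS = ("buy", "hold", "sell")

def q_values(state):                       # placeholder for a trained DQN
    rng = np.random.default_rng(int(state.sum() * 1e3) % 2**32)
    return rng.normal(size=len(ACTIONS))

def predict_shares(state):                 # placeholder for the share regressor
    return int(50 + 100 * abs(np.tanh(state.mean())))

def decide(state, margin=0.2):
    q_sorted = np.sort(q_values(state))
    best, runner_up = q_sorted[-1], q_sorted[-2]
    if best - runner_up < margin:          # Q-values too close: delay decision
        return "hold", 0
    action = ACTIONS[int(np.argmax(q_values(state)))]
    shares = predict_shares(state) if action != "hold" else 0
    return action, shares

state = np.array([0.01, -0.02, 0.005])     # hypothetical feature vector
print(decide(state))
```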
... Our main objective is to introduce and empirically examine a new measure of realised volatility that includes the volume associated with the price of each trade, namely the volume weighted volatility (VWV or σ̂_VWV), or alternatively demand-based volatility. The related volume weighted average price (VWAP) has been popular with institutional traders for a number of years as a benchmark for trading success over the day, with the objective of generating an average buying price for the daily trading below the VWAP, or an average selling price above the VWAP (see Madhavan, 2002; Bessembinder, 2003; Kissell and Malamut, 2005; Hobson, 2006; Domowitz and Yegerman, 2006; Sofianos, 2006; Hu, 2007). Moreover, Ting (2006) shows that the VWAP is less noisy than using the closing price, thereby yielding a better approximation of the unobservable efficient price. ...
Article
We introduce a new conceptually superior realised volatility estimator, volume weighted volatility (VWV), which effectively measures demand-based volatility, rather than only measuring the variability of a price series. We compare the VWV to other return-And range-based measures using the stock index futures, with our results supporting the empirical uniqueness of VWV. First, regressions show that the VWV provides unique information. Second, VWV is (only) weakly associated with other volatility measures for the smallest four volatility quintiles. Third, correlograms illustrate that the VWV is less persistent than the other measures, leading to more unique volatility values. Finally, the VWV most closely approximates the normal distribution.
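One plausible way to construct a volume-weighted volatility alongside a plain realised volatility is sketched below: it weights squared trade-to-trade returns by the volume behind each trade. Both the construction and the tick data are illustrative assumptions; the paper's exact estimator may differ.

```python
# Illustrative volume-weighted volatility vs. plain realised volatility.
import math

trades = [  # hypothetical (price, volume) tick data
    (100.00, 500), (100.05, 2000), (99.98, 300),
    (100.10, 4000), (100.07, 800), (100.15, 2500),
]

rets = [math.log(p1 / p0) for (p0, _), (p1, _) in zip(trades, trades[1:])]
vols = [v1 for (_, _), (_, v1) in zip(trades, trades[1:])]

realised = math.sqrt(sum(r * r for r in rets))
vwv = math.sqrt(sum(v * r * r for v, r in zip(vols, rets)) / sum(vols))

print(f"realised volatility: {realised:.5f}")
print(f"volume-weighted volatility: {vwv:.5f}")
```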
... Engle, Russell, and Ferstenberg (2012) use execution data from Morgan Stanley to study the trade-offs between algorithm aggressiveness and the mean and dispersion of execution costs. Domowitz and Yegerman (2006) ... Chaboud, Chiquoine, Hjalmarsson, and Vega (2014) study ATs in the foreign exchange markets on the electronic broking system (EBS) in 3 major currency pairs: euro-dollar, dollar-yen, and euro-yen. ...
... A final additional cost that may be unappreciated is transaction costs, which include commissions, bid-ask spread, and trade impact. There is a tradeoff whereby small traders with low trade impact generally have high commissions, cross the bid-ask spread to transact, and do not have sufficient experience or scale to trade in an optimal manner (algorithmic trading optimizes the size, frequency and limit-order specifications, currently done through computer-based execution of equity orders via direct market-access channels, Domowitz and Yegerman (2005)), while large institutions generate small commissions and trade based on strategies optimized through extensive trial-and-error, but in turn have a larger trade impact due to their larger size. As brokers optimize over these three costs in a relationship, they must be considered simultaneously to get at the true trading costs. ...
Article
This paper presents a utility function refinement that explains the empirical irrelevance of risk to returns. The key is that in an environment where people care about relative wealth, risk is a deviation from what everyone else is doing, and therefore becomes like diversifiable risk in the CAPM, avoidable. Using an equilibrium or an arbitrage argument, a relative status utility function creates a zero risk-return correlation via a market model that implies a zero risk premium. This approach is described as being theoretically consistent, intuitive and a better description of the data.
... Engle, Russell, and Ferstenberg (2007) use execution data from Morgan Stanley algorithms to study the tradeoffs between algorithm aggressiveness and the mean and dispersion of execution cost. Domowitz and Yegerman (2005) study execution costs of ITG buy-side clients, comparing results from different algorithm providers. ...
Article
Full-text available
A seismic shift is taking place in the United States securities markets. The fault lines have been present for quite some time; however, it is only now, in the last few years that the ramifications of these displacements have been felt. The traditional approach to investing has gone from a focus on investing – namely examining companies to determine whether they will be a good long-term investment – to examining the markets as a whole. Nowhere is this shift more apparent than in the rise and increasing prevalence of quantitative trading models. As a result, there is now a disconnect between the markets themselves and the companies that are traded on the markets. Oftentimes, what a company does or does not do matters very little to whether that company’s stock should be bought or sold. Instead, whether that company’s stock is a good “buy” amounts more to how that stock is doing and how the market is behaving. This shift has broad implications for retail and institutional investor behavior, regulatory structures and the role of government in oversight and, if unchecked, the global economy at large. The ever-changing advances in computer technology have fostered a new breed of trading that is much more reliant on quantitative mathematics than on corporate analysis. This article explores algorithmic trading and assesses the impact of its dominance on regulation of the securities markets and their stability in the global economy.
... Their findings indicate that AT improves liquidity and enhances the informativeness of quotes". Similarly, [12] found AT to be a cost-effective technique for large orders. ...
Conference Paper
Algorithmic trading (AT) strategies aim at executing large orders discretely, in order to minimize the order's impact, whilst also hiding the traders' intentions. Most AT evaluation methods range from running the AT strategies against historical data (back testing) to evaluating them on simulated markets. The contribution of the work presented in this paper is twofold. First, we investigated different types of agent-based market simulations and suggested how to identify the most suitable market simulation type, based on the specific market model to be investigated. Then we proposed an extended model of the Bayesian execution strategy. We implemented and assessed this model using our tool AlTraSimBa (Algorithmic Trading Simulation Back testing) against the standard Bayesian execution strategy and naive execution strategies, for momentum markets and random markets. The results revealed useful insights into the trade-off between the frequency of decision making and more complex decision criteria, on one side, and the losses agents incur when they do not participate actively in the market during some of the execution steps, on the other.
... either best bid or ask price - are taken to be the decision price (also frequently referred to as the "arrival price"). For further details see Johnson (2010) and Domowitz and Yegerman (2005). ...
Article
Full-text available
We develop a sequential trade model of Iceberg order execution in a limit order book. The Iceberg trader has the freedom to expose his trading intentions or (partially) shield the true order size from other market participants. Order exposure can cause drastic market reactions ("market impact"), ultimately leading to higher transaction costs. On the other hand, the Iceberg trader faces a loss in priority when he hides his intentions, as most electronic limit order books penalize the usage of hidden liquidity. Thus the Iceberg trader is faced with the problem of finding the right trade-off. Our model provides optimal exposure strategies for Iceberg traders in limit order book markets. In particular, we provide a range of analytical statements that are in line with recent empirical findings on the determinants of traders' exposure strategies. In this framework, we also study the market impact of limit orders. We provide optimal exposure profiles for a range of high-tech stocks from the US S&P 500 and show how they scale with the state of the book. We finally test the Iceberg order's performance against plain limit orders and find that Iceberg orders can significantly enhance trade performance by up to 60%.
... The trader is then called the price setter, as it can manipulate the trading price through its trading behavior in the market. For example, institutional traders that heavily rely on algorithmic trading or automatic trading strategies most likely belong to this type of trader (see Domowitz & Yegerman, 2005). As algorithmic trading starts to prevail on global electronic trading platforms, it is thus reasonable to include the formulation of the price setter when developing the financial market model for the electronic trading platform. ...
Article
Against the background of the electronic securities trading platform Xetra operated by the Frankfurt Stock Exchange, we consider the Xetra auction market system (XAMS) from the 'bottom up', in which the interaction among heterogeneous traders and the Xetra auction market mechanism generates non-equilibrium price dynamics. First, we develop an integrative framework that serves as general guidance for analyzing the economic system from the 'bottom up' and for seamlessly transferring the economic system into the corresponding agent-based model. Then we apply this integrative framework to construct the agent-based model of XAMS. By conducting market experiments with the computer implementation of the agent-based model of XAMS, we investigate the role of the price setter, which assumes its trading behavior can manipulate the market price. The main finding is that the introduction of the price setter in the setting of XAMS improves market efficiency while not significantly influencing the price volatility of the market.
... ppon (2011) for models where investors compete on their trading algorithm's speed. Monitoring also has important cross-market competition implications, as in Foucault and Menkveld (2008) and others. Engle, Russell, and Ferstenberg (2007) use execution data from Morgan Stanley algorithms to study the tradeoffs between algorithm aggressiveness and the mean and dispersion of execution cost. Domowitz and Yegerman (2005) study execution costs of ITG buy-side clients, comparing results from different algorithm providers. Several recent studies use comprehensive data on AT. Chaboud, Chiquoine, Hjalmarsson, and Vega (2009) study the development of AT in the foreign exchange market on the electronic broking system (EBS) in three currency pairs euro-dollar, ...
Article
We examine the role of algorithmic traders (AT) in liquidity supply and demand in the 30 DAX stocks on the Deutsche Boerse in January 2008. AT represent 52% of market order volume and 64% of nonmarketable limit order volume. AT more actively monitor market liquidity than human traders. AT consume liquidity when it is cheap, i.e., when the bid-ask quotes are narrow, and supply liquidity when it is expensive. When spreads are narrow AT are less likely to submit new orders, less likely to cancel their orders, and more likely to initiate trades. AT react more quickly to events and even more so when spreads are wide.
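The liquidity-timing pattern documented here, taking liquidity when spreads are narrow and supplying it when they are wide, can be illustrated with a toy decision rule; the tick size, threshold, and quotes below are hypothetical.

```python
# Toy rule: consume liquidity when the spread is narrow (cheap),
# supply liquidity when it is wide (expensive). Illustrative only.

def choose_order_type(bid, ask, tick=0.01, narrow_ticks=1):
    spread_ticks = round((ask - bid) / tick)
    if spread_ticks <= narrow_ticks:
        return "marketable order (consume liquidity)"
    return "passive limit order inside the spread (supply liquidity)"

print(choose_order_type(100.00, 100.01))   # 1-tick spread -> take
print(choose_order_type(100.00, 100.05))   # 5-tick spread -> make
```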
Article
Full-text available
Introduction: Algorithmic trading is a process that involves executing a large number of orders using electronically automated and pre-programmed trading instructions. These instructions account for various factors such as price, timing, and volume. The study aims to explore the impact and preferences for algorithmic trading among investors.
Objectives: To understand the operational processes and mechanics of algorithmic trading. To assess the reasons for the preference for algorithmic trading among investors. To analyse the impact of demographic variables on the frequency of trading and awareness of algorithmic trading.
Research Methodology: The research utilized a descriptive research design. Data collection was conducted through questionnaires and interviews. A non-probability sampling method was employed, and responses were gathered from 124 participants in Ahmedabad.
Findings: Algorithmic trading allows investors to use predefined strategies or create their own, which is a key factor in its preference. The study found that algorithmic trading is preferred due to its reduced human error and perceived safety. It was revealed that 55% of individuals are aware of algorithmic trading. Demographic variables were found to have a 34% impact on the frequency of trades conducted using algorithmic trading and a 29.2% impact on the awareness of algorithmic trading. The research indicates that investors are likely to prefer algorithmic trading in the future due to its safety, security, and the availability of various trading strategies.
Chapter
This strategy focuses on backtesting and algorithmic trading by applying the hill-climbing method to find liquid price levels at support and resistance. The strategy entails evaluating price movements and setting the threshold value for trading at these key levels. By closely studying the market, the technique seeks to find optimal entry and exit points based on recognized support and resistance levels. This method allows for a systematic approach to trading decisions using the concepts of liquidity and market dynamics. Implementing this technique entails testing it on historical data to determine its effectiveness and tweaking the threshold values for higher accuracy. By introducing the hill-climbing method into the backtesting process, traders may make informed decisions and perhaps increase their algorithmic trading performance.
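As an illustration of the hill-climbing idea, the sketch below tunes a single threshold parameter by local search against a backtest score; the score function is a hypothetical stand-in for the chapter's support/resistance backtest.

```python
# Minimal hill-climbing search over one threshold parameter. Illustrative only:
# backtest_score is a placeholder objective, not the chapter's backtest.

def backtest_score(threshold):
    return -(threshold - 0.35) ** 2        # peaks at an assumed optimum of 0.35

def hill_climb(x0=0.10, step=0.05, min_step=0.001):
    x, best = x0, backtest_score(x0)
    while step >= min_step:
        improved = False
        for candidate in (x - step, x + step):
            score = backtest_score(candidate)
            if score > best:
                x, best, improved = candidate, score, True
        if not improved:
            step /= 2                      # shrink the step when stuck
    return x, best

threshold, score = hill_climb()
print(f"selected threshold: {threshold:.3f}, score: {score:.6f}")
```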
Article
Using transactions-based calendar time (TBCT) portfolio analysis, we investigate the informativeness of trades of investor categories, namely institutions, proprietary traders, and retail clients. We find that trade informativeness is positive for institutional investors and negative for retail clients. The informativeness of liquidity-demanding trades is less than the informativeness of liquidity-supplying trades for all trading groups, over both long and short horizons. We also find that institutions benefit from algorithmic executions compared to manual executions, and this benefit is elevated on days of high volume and volatility. Proprietary algorithmic traders (high-frequency traders) generate positive alpha only from their liquidity-supplying trades.
Chapter
Trading decisions in financial markets can be supported by the use of trading algorithms. To evaluate trading algorithms and to generate orders to be executed on the stock exchange, trading systems are used. In this chapter, we define individual investors' requirements for a trading system, and analyze 17 trading systems from an individual investor's point of view. The results of our study point out that the best alternative for an individual investor is not one single trading system, but a combination of two different classes of trading systems.
Article
Full-text available
Based on a multi-agent model, an artificial stock market with four types of traders is constructed. On this basis, this paper focuses on comparing the effects of liquidation behavior on market liquidity, volatility, price discovery efficiency, and the long memory of absolute returns when the institutional trader adopts an equal-order strategy, a Volume Weighted Average Price (VWAP) strategy, or an Implementation Shortfall (IS) strategy. The results show the following: (1) the artificial stock market based on the multi-agent model can reproduce the stylized facts of the real stock market well; (2) among these three algorithmic trading strategies, the IS strategy causes the longest liquidation time and the lowest liquidation cost; (3) the liquidation behavior of the institutional trader significantly reduces market liquidity, price discovery efficiency, and the long memory of absolute returns, and increases market volatility; (4) in comparison, the IS strategy has the least impact on market liquidity, volatility, and price discovery efficiency, while the VWAP strategy has the least impact on the long memory of absolute returns.
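For concreteness, the sketch below shows how the equal-order and VWAP liquidation schedules compared in this study slice a parent order across intraday buckets; the volume profile is hypothetical, and the IS strategy (which balances impact against price risk) is not reproduced here.

```python
# Slicing a parent order across intraday buckets: equal-order vs. VWAP schedule.
# The U-shaped volume profile is a hypothetical assumption.

def equal_order_schedule(total_shares, n_buckets):
    base = total_shares // n_buckets
    extra = total_shares - base * n_buckets
    return [base + (1 if i < extra else 0) for i in range(n_buckets)]

def vwap_schedule(total_shares, volume_profile):
    total_vol = sum(volume_profile)
    sched = [round(total_shares * v / total_vol) for v in volume_profile]
    sched[-1] += total_shares - sum(sched)       # fix rounding drift
    return sched

profile = [18, 12, 9, 8, 7, 8, 10, 14, 14]       # hypothetical intraday volume (%)
print(equal_order_schedule(90_000, len(profile)))
print(vwap_schedule(90_000, profile))
```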
Article
We investigate the relative roles of limit orders from proprietary algorithmic traders (PAT), agency algorithmic traders (AAT), and non-algorithmic traders (NAT) in the discovery of security prices on the National Stock Exchange (NSE) of India. Our results suggest that PAT's limit orders are the most informative; however, AAT and NAT still contribute substantially to price discovery. Contrary to the popular belief that algorithmic traders are only interested in large stocks, we find that the two algorithmic trading groups together contribute nearly 30%–40% of the price discovery in both small and medium capitalization stocks, whereas their combined share of trading volume only ranges between 10%–15% in these stocks. We see that the price discovery contribution of PAT's limit orders increases when we conduct our analysis at longer time gaps. This finding is evidence against the popular notion that HFTs only make prices informative in the very short run. We also find that the LOB imbalance of PAT is the most informative among the three groups of traders and find no evidence to support the popular notion that fast traders often use limit orders to "spoof" market participants about future price movements. However, much of the informativeness of PAT's LOB imbalance withers away when PAT places orders opposite to the rest of the market, suggesting that, rather than generating information, PAT possibly uses information produced by others.
Chapter
Although both media and the public seem to discuss the perceived dangers and threats of electronic trading only since the US flash crash in 2010, in reality, the shift towards electronic trading has been a long-lasting evolution. Often, the starting point of electronic trading is said to be the year 1971, when the National Association of Securities Dealers Automated Quotation (Nasdaq) became the first electronic stock market displaying quotes for 2500 over-the-counter securities. A significant migration process from over-the-counter and traditional floor trading to fully electronic markets took place on both sides of the Atlantic between the late 1970s and the mid-1990s. Starting from the electronification of major international exchanges, significant technological innovations emerged that successively walked up the value chain and led to a far-reaching automation of trading processes; first at Sell Side institutions and in a next step by their customers, i.e., Buy Side firms.
Article
Prediction of stock prices using various computer programs is on the rise. Popularly known in the field of finance as algorithmic trading, this has brought a radical transformation to decision making in stock markets through automated decision-making agents. Machine learning techniques can be applied to predicting stock prices. This paper attempts to study the various stock market forecasting processes available in the forecasting plugin of the WEKA tool. Twenty experiments have been conducted on twenty different stocks to analyse the prediction capacity of the tool.
Article
Within the securities trading industry, recent technological innovations enable institutional investors to trade on a self-directed basis and thus lead to a reassessment of their intermediation relationships. This may lead to an in-sourcing of trading activities by buy-side organizations. The scientific literature outlines advantages and disadvantages of some of these innovations, but no empirical investigations of the drivers for their adoption or refusal have been reported. Against the background of the increasing market share of technologies such as Algorithmic Trading, this conceptual paper introduces a model that aims at closing this gap by identifying the drivers and inhibitors for the adoption of new technology-based execution opportunities. To account for the organizational context of the survey and the meta-character of the innovation, the model incorporates the following modifications of TAM: first, a generalization towards TRA and TPB in order to account for competitive pressure and inhibitors; second, the integration of TTF, as it is said to exhibit better results for work-related tasks and thus enables the model to account for the fit between the technology and the given task requirements; finally, a perceived risk construct is added, as in an organizational context the adoption of innovations is associated with risks.
Article
A sampling frequency was designed according to the analysis of the pressure signal for the purpose of leak detection in fluid pipelines. A united filter method including a low-pass filter, a notch filter and a wavelet filter was proposed, and the detailed implementation steps and parameters of the united filters were given. The united filters could filter the noise in the pressure signal and recover the amplitude and phase of abnormal pressure, which could improve the accuracy of leak detection effectively. Finally, two cases were used to verify the united filters, and the results showed that the proposed filter method is effective.
Article
Full-text available
Delineation and Market Relevance of High-Frequency-Trading. High-Frequency-Trading (HFT) has become quite prominent in public and academia after the May 6th, 2010 "Flash Crash" and in the context of the recent financial crisis. However, the public discussion is mostly based on generalizations instead of a well-founded, research-based point of view, and the terminology of electronic trading is often used indiscriminately. Literature defines HFT as a subset of Algorithmic Trading. Therefore, and to foster the understanding of these terms, we first describe Algorithmic Trading. Based on this definition we will then specify HFT. Algorithmic Trading in the broadest sense is the generation and submission of buy and sell orders by an algorithm (Prix et al. 2007, p. 1). An algorithm in this context is defined as a set of instructions which processes market data in real-time and submits orders to a single or multiple market places without human intervention. Narrow definitions require the algorithm ...
Conference Paper
In this paper we analyze US stock market after-hours trading, that is, trading outside the regular trading hours of 09:30-16:00. During this time the market is thinly traded and the possibility of price (in)efficiency arises. Price spikes up or down, sometimes reaching several percent, can be observed. This pattern can be exploited by a simple automated trading strategy that buys low if the market drops and closes the position high on the next day when the market reopens. An empirical study using the most liquid stocks and exchange traded funds listed on the NASDAQ and NYSE exchanges for the years 2000 to 2012 is conducted. We create a portfolio of ~400 automated trading strategies. The average portfolio performance is 23 percent per annum with a Sharpe ratio of 4. This shows that prices are inefficient during after-hours trading in the US stock market. To test for significance we run an out-of-sample test from 2012 onwards.
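A toy version of the strategy described above (buy after a sufficiently large after-hours drop, exit at the next open) can be written in a few lines; the session prices are hypothetical and commissions are ignored, so this is only a sketch of the mechanics, not a replication of the study.

```python
# Toy after-hours mean-reversion backtest with hypothetical session data.

sessions = [  # hypothetical (close, after_hours_low, next_open) per day
    (100.0, 97.5, 99.6),
    (99.6, 99.4, 99.8),
    (99.8, 96.9, 99.5),
    (99.5, 99.2, 99.9),
]

def after_hours_strategy(days, drop_threshold=0.02):
    pnl = 0.0
    for close, ah_low, next_open in days:
        if (close - ah_low) / close >= drop_threshold:   # buy the dip after hours
            pnl += (next_open - ah_low) / ah_low          # exit at the next open
    return pnl

print(f"cumulative return: {after_hours_strategy(sessions):.2%}")
```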
Article
Financial markets are characterised by high levels of complexity and non-linearity. Information systems have often been applied to support investors by forecasting price changes in securities markets. In addition to the asset price, liquidity represents another financial variable that has a high relevance for investors because it constitutes a main determinant of total transaction costs. Previous research has shown that the level of liquidity is affected by the publication of corporate disclosures. To derive an optimal order execution strategy that minimises the transaction costs, investors as well as automated trading engines must be able to anticipate changes in the available market liquidity. However, there is no research on how to forecast the impact of corporate disclosures on market liquidity. Therefore, we propose an IT artefact that allows automated trading engines to appropriately react to news-related liquidity shocks. The system indicates whether the publication of a regulatory corporate disclosure will be followed by a positive liquidity shock, i.e., lower transaction costs compared to historical levels. Utilising text mining techniques, the content of the corporate disclosures is analysed to generate a trading signal. Furthermore, the trading signal is evaluated within a simulation-based use case that considers English and German corporate disclosures and is shown to be of economic value.
Article
Full-text available
In this study we evaluate the effectiveness of augmenting numerical market data with textual-news data, using data mining methods, for forecasting stock returns in intraday trading. Integrating these two sources of data not only enriches the information available for the forecasting model, but it can potentially capture joint patterns that may not otherwise be identified when each data source is employed separately. We start with market data and then gradually add various textual data representations, going from simple representations, such as word counts, to more advanced representations involving sentiment analysis. To find the incremental value of each data representation, we build an end-to-end recommendation process including data preprocessing, modeling, validation, trade recommendations and economic evaluation. Each component of the modeling process is optimized to remove human bias and to allow us to impartially compare the results of the various models. Additionally, we experiment with several forecasting algorithms to find the one that yields the “best” results according to a variety of performance criteria. We employ data representation procedures and modeling improvements beyond those used in previous related studies. The economic evaluation of the results is conducted using a simulation procedure that inherently accounts for transaction costs and eliminates biases that have potentially affected previous related data-mining studies. This research is one of the largest-scale data-mining studies for evaluating the effectiveness of integrating market data with textual news data for the purpose of stock investment recommendations. The results of our study are promising in that they show that augmenting market data with advanced textual data representation significantly improves stock purchase decisions. Best results are achieved when the approach is implemented with a nonlinear neural network forecasting algorithm.
Article
Algorithmic trading has been blamed for an increasing level of volatility in a number of financial markets. Adoption and deployment of algorithmic trading systems has increased and this is likely to continue, as regulation, competition and innovation drive the development of advanced technological tools. Expert and intelligent systems provide the mechanics for both reacting to and affecting a financial market that is now significantly faster and operating across multiple time zones and markets. Surprisingly, much of this innovation has escaped discussion within the Information Systems research community. This paper explores this growing arena by engaging with senior practitioners in the industry and using interviews and grounded theory (GT) analysis to uncover their adoption concerns. The paper generalises these issues within a framework and guidelines aimed at supporting algorithmic trading system adoption, deployment and development.
Chapter
More and more quantitative models have started to find their way into the investment management industry. Typically, the quantitative efforts at firms start with risk management functions and portfolio optimization. Needless to say, there are many areas beyond these where quantitative methods are valuable. First, for example, the usage of equity derivatives allows investors to change the risk and return characteristics of their equity investment portfolio. Second, in international portfolios, beyond asset-specific risk, portfolio managers are faced with currency risk and have to decide on how much of this risk they want to hedge. Quantitative optimization tools can be very helpful in dealing with both of these issues. Quantitative forecasting tools are increasing in importance. Momentum, reversals, and regression-based strategies have traditionally been the "bread and butter" for many traders, but other more sophisticated econometric techniques are now being used more broadly, such as vector autoregressive models, dynamic factor and state space models, and cointegration. A very important part of any modeling effort is the model selection process and the implementation of a rigorous testing methodology in order to avoid model overfitting and data snooping biases. Algorithmic trading (that is, trade execution based on quantitative models) is now a more widespread tool throughout the investment management industry and most major brokers offer these services in some form. Keywords: financial derivatives; currency management; hedge ratios; robust estimation and optimization; investment management; benchmark; forecasting; transaction costs; trade execution; algorithmic trading
Article
The widespread use of algorithmic trading has led to the question of whether the most suitable algorithm is always being used. We propose a practical framework to help traders qualitatively characterize algorithms as well as quantitatively evaluate comparative performance among various algorithms. We demonstrate the applicability of the quantitative model using historical data from orders executed through ITG Algorithms.
Article
We use transaction-level data and decompose the US equity premium into day (open to close) and night (close to open) returns. We document the striking result that the US equity premium over the last decade is solely due to overnight returns; the returns during the night are strongly positive, and returns during the day are close to zero and sometimes negative. This day and night effect holds for individual stocks, equity indexes, and futures contracts on equity indexes and is robust across the NYSE and Nasdaq exchanges. Night returns are consistently higher than day returns across days of the week, days of the month, and months of the year. The effect is driven in part by high opening prices which subsequently decline in the first hour of trading.
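The day/night decomposition used in this study splits each close-to-close return into a night (close-to-open) and a day (open-to-close) component; a minimal sketch with hypothetical open/close prices follows.

```python
# Decomposing close-to-close returns into night and day components.
# The (open, close) series is hypothetical.

prices = [  # hypothetical (open, close) per trading day
    (100.2, 100.0),
    (100.6, 100.4),
    (101.1, 100.9),
]

night, day = [], []
for (po_prev, pc_prev), (po, pc) in zip(prices, prices[1:]):
    night.append(po / pc_prev - 1)    # close(t-1) -> open(t)
    day.append(pc / po - 1)           # open(t)    -> close(t)

print("night returns:", [f"{r:.4%}" for r in night])
print("day returns:  ", [f"{r:.4%}" for r in day])
```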
Article
Actual investment performance reflects the underlying strategy of the portfolio manager and the execution costs incurred in realizing those objectives. Execution costs, especially in illiquid markets, can dramatically reduce the notional return to an investment strategy. This paper examines the interactions between cost, liquidity and volatility, and analyses their determinants using panel data for 42 countries from September 1996 to December 1998. We document wide variation in trading costs across countries; emerging markets, in particular, have significantly higher trading costs even after correcting for factors such as market capitalization and volatility. We analyse the inter-relationships between turnover, equity trading costs and volatility, and investigate the impact of these variables on equity returns. In particular, we show that increased volatility, acting through costs, reduces a portfolio's expected return. However, higher volatility reduces turnover also, mitigating the actual impact of higher costs on returns. Further, turnover is inversely related to trading costs, providing a possible explanation for the increase in turnover in recent years. The results demonstrate that the composition of global efficient portfolios can change dramatically when cost and turnover are taken into account. Copyright 2001 by Blackwell Publishers Ltd.