
In this paper, we present a new methodology for modelling intraday volume, which allows for a reduction of the execution risk in VWAP (Volume Weighted Average Price) orders. The results are obtained for all the stocks included in the CAC40 index at the beginning of September 2004. The models considered are based on decomposing traded volume into two parts: one reflecting volume changes due to market evolution, the other describing the stock-specific volume pattern. The dynamics of the specific volume part are captured by ARMA and SETAR models. The implementation of VWAP strategies allows some dynamic adjustment during the day in order to improve tracking of the end-of-day VWAP.
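The decomposition described above can be sketched in a few lines. The data and the AR(1) estimator below are illustrative assumptions, not the paper's CAC40 sample or its exact ARMA/SETAR specification:

```python
# Illustrative sketch (not the paper's exact estimation): split each
# stock's intraday turnover into a common market component (the
# cross-sectional average over stocks) and a stock-specific residual,
# then fit an AR(1) to the residual by least squares as a stand-in for
# the ARMA/SETAR models used in the paper.
def decompose(volumes):
    """volumes: dict stock -> list of turnover values per intraday bin."""
    n_bins = len(next(iter(volumes.values())))
    common = [sum(v[i] for v in volumes.values()) / len(volumes)
              for i in range(n_bins)]
    specific = {s: [v[i] - common[i] for i in range(n_bins)]
                for s, v in volumes.items()}
    return common, specific

def fit_ar1(x):
    """Least-squares AR(1): x_t ~ phi * x_{t-1} + c."""
    xs, ys = x[:-1], x[1:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    var = sum((a - mx) ** 2 for a in xs)
    phi = cov / var
    return phi, my - phi * mx

# Two hypothetical stocks, six intraday bins each.
volumes = {"AAA": [10, 12, 8, 9, 11, 13],
           "BBB": [20, 22, 18, 17, 21, 25]}
common, specific = decompose(volumes)
phi, c = fit_ar1(specific["AAA"])
```

By construction the specific parts sum to zero across stocks in each bin, so only the stock-specific deviations are left for the time-series model.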


... The hypotheses of the thesis are: 1. The proposed regression method δ-SVR leads to a decreased number of support vectors and improved flexibility of SVM. ...

... For p = 70, the SVC with knowledge about a margin has a smaller generalization error for every test except one (Table III.1). The testing performance gain varies from 0% to 3%. ...

... A new, third method was proposed by me in [32]: 1. Solve two-parameter subproblems analytically (the SMO algorithm). ...

In this thesis, we propose a regression method, called δ support vector regression (δ-SVR), that replaces a regression problem with binary classification problems solved by support vector machines (SVM). The results indicate that δ-SVR achieves generalization error comparable to ε-insensitive support vector regression (ε-SVR), with fewer support vectors and smaller generalization error over different values of ε and δ. We also propose a method called φ support vector classification (φ-SVC) for incorporating knowledge about the margin of an example in classification and regression problems, with two applications: decreasing the generalization error of reduced models, and incorporating nonlinear constraints into the SVM optimization problem. Moreover, we propose two SVM implementation improvements. The first, called the heuristic of alternatives (HoA), is a new heuristic for choosing parameters for the working set. The second, called the sequential multidimensional subsolver (SMS), is a new method for solving subproblems with more than two parameters. Finally, we present an application of the proposed methods to executing orders on exchanges.
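The core δ-SVR construction, as I read it, replaces each regression sample with two vertically shifted classification samples; any classifier separating the two shifted clouds traces the regression function in (x, y) space. In this hedged sketch a simple perceptron stands in for the SVM solver, and the data are synthetic:

```python
# Hedged sketch of the delta-SVR idea (not the thesis's exact algorithm):
# each regression sample (x, y) becomes two classification samples,
# (x, y + delta) labeled +1 and (x, y - delta) labeled -1.  A perceptron
# is used here as a stand-in for the SVM classifier.
def to_classification(xs, ys, delta):
    data = []
    for x, y in zip(xs, ys):
        data.append(((x, y + delta), +1))
        data.append(((x, y - delta), -1))
    return data

def perceptron(data, epochs=5000):
    w = [0.0, 0.0, 0.0]                    # weights for (x, y, bias)
    for _ in range(epochs):
        mistakes = 0
        for (x, y), label in data:
            if label * (w[0] * x + w[1] * y + w[2]) <= 0:
                w[0] += label * x
                w[1] += label * y
                w[2] += label
                mistakes += 1
        if mistakes == 0:                  # converged: all points separated
            break
    return w

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]                  # true relation: y = 2x + 1
dataset = to_classification(xs, ys, delta=0.5)
w = perceptron(dataset)

def predict(x):                            # decision surface solved for y
    return -(w[0] * x + w[2]) / w[1]
```

Because the separating surface must pass between the two shifted clouds, the recovered prediction at each training x lies within δ of the true value.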

... Moving average automatic trading strategies set buying and selling orders depending on the position of the average price for a given period with respect to the current market price (see [50]). Finally, weighted arithmetic average indexes are used as trading benchmarks in pension plans (see [11]). ...

... with ϑ_N(µ) ≡ µ/(N + 1) and ψ given by (11). ...

... which follows from (25) by iterated expectations and a change to the measure P with ϕ, ψ, φ given by (10), (11), (22). ...

We propose an accurate method for pricing arithmetic Asian options on the discrete or continuous average in a general model setting by means of a lower bound approximation. In particular, we derive analytical expressions for the lower bound in the Fourier domain. This is then recovered by a single univariate inversion and sharpened using an optimization technique. In addition, we derive an upper bound to the error from the lower bound price approximation. Our proposed method can be applied to computing the prices and price sensitivities of Asian options with fixed or floating strike price, discrete or continuous averaging, under a wide range of stochastic dynamic models, including exponential Lévy models, stochastic volatility models, and the constant elasticity of variance diffusion. Our extensive numerical experiments highlight the notable performance and robustness of our optimized lower bound for different test cases.
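As a hedged illustration of the lower-bound principle the paper sharpens (not its Fourier-domain method): the geometric average of prices never exceeds the arithmetic average, so a Monte Carlo call on the geometric average bounds the arithmetic Asian call from below, path by path. All parameters below are illustrative:

```python
import math, random

# Not the paper's optimized Fourier-domain bound: a minimal Monte Carlo
# illustration of the lower-bound principle.  By AM-GM the geometric
# average of GBM prices never exceeds the arithmetic average, so a call
# on the geometric average underprices the arithmetic Asian call.
def asian_calls_mc(s0=100.0, r=0.05, sigma=0.2, t=1.0, n_steps=12,
                   strike=100.0, n_paths=4000, seed=7):
    rng = random.Random(seed)
    dt = t / n_steps
    drift = (r - 0.5 * sigma * sigma) * dt
    vol = sigma * math.sqrt(dt)
    disc = math.exp(-r * t)
    arith = geom = 0.0
    for _ in range(n_paths):
        s, prices = s0, []
        for _ in range(n_steps):
            s *= math.exp(drift + vol * rng.gauss(0.0, 1.0))
            prices.append(s)
        a = sum(prices) / n_steps
        g = math.exp(sum(map(math.log, prices)) / n_steps)
        arith += max(a - strike, 0.0)
        geom += max(g - strike, 0.0)
    return disc * arith / n_paths, disc * geom / n_paths

arith_price, geom_price = asian_calls_mc()
```

The paper's contribution is a much sharper, analytically optimized lower bound; this sketch only shows why such bounds are natural for arithmetic averages.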

... They predicted the future value of the common component by averaging the 20-day historical data. Bialkowski, Darolles, and Le Fol (2008) used the ARMA model to predict the value of the specific component. In their work, the historical data from the first 20 trading days (i.e., high-frequency data from intervals of 20 min) are used to analyze 40 stocks from the CAC40 index. ...

... Section 3.3 will discuss whether using different stock portfolios for the PCA -for example, adding or reducing the number of stocks or using stocks from the same industry -will help to improve prediction. Finally, after decomposing the principal component, Bialkowski, Darolles, and Le Fol (2008) used the ARMA (1,1) and SETAR (1,1) models to make predictions regarding the specific component. The outcome of our empirical analysis indicates that in Chinese stock markets, although the specific component of the volume series is stationary, most of the series displays long memory. ...

... In the stock market, approximately 50% of the institutional investors use the VWAP strategy in transactions (Bank of America 2007; Bialkowski, Darolles, and Le Fol 2008). The VWAP strategy is rapidly becoming more popular because of its simplicity. ...

We investigate the modeling and forecasting of intra-daily volume time series in the Chinese stock market, with an application to the dynamic Volume Weighted Average Price (VWAP) method. The empirical results show that: (1) this method performs better than the traditional static VWAP strategy; (2) by adjusting the time scale (time window) and the composition of the stock portfolio according to the principal component analysis method, we can further improve the forecasting accuracy of the stock turnover series; (3) there is a significant long-memory characteristic in the specific component of the turnover series under the dynamic VWAP method; however, we find that using an ARFIMA model on these series cannot improve the prediction of the turnover series. We also analyze the reasons and provide some explanations.

... There is a large and active literature on time series analysis of high-frequency data from financial markets. While several types of financial time series have been the subject of analysis, this paper contributes to the literature on modeling and forecasting of intraday transaction volume (Białkowski, Darolles, & Le Fol, 2008; Brownlees, Cipollini, & Gallo, 2011; Humphery-Jenner, 2011). The interest in analyzing transaction volume data arises from its application to computing a benchmark price, known as the volume weighted average price (VWAP), commonly used in the industry to measure the market price impact of financial transactions. ...

... A key component for formulating a VWAP trading strategy is to obtain a good forecast for the volume weights w b,t . In particular, the trader does not need to predict the sequence of prices p it (Białkowski et al., 2008). The volume weights w b,t in turn depend on the trading volume a b,t in each bin b = 1, … , B over the trading day t. ...

... As a forecasting problem, a VWAP strategy therefore requires a sequence of forecasts for different step sizes over the trading day. Białkowski et al. (2008) and Brownlees et al. (2011) generate these forecasts from a one-step-ahead model by recursively iterating on the previous one-step-ahead forecasts. This method of obtaining multistep forecasts by recursively iterating on one-step forecasts is also known as the plug-in method. ...

This paper examines the performance of iterated and direct forecasts for the number of shares traded in high-frequency intraday data. Constructing direct forecasts in the context of formulating volume weighted average price trading strategies requires the generation of a sequence of multistep-ahead forecasts. I discuss nonlinear transformations to ensure nonnegative forecasts and lag length selection for generating a sequence of direct forecasts. In contrast to the literature based on low-frequency macroeconomic data, I find that direct multiperiod forecasts can outperform iterated forecasts when the conditioning information set is dynamically updated in real time.
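The two forecasting schemes contrasted above can be sketched on a toy series. The data below are synthetic and exactly AR(1); on such exactly linear data the two schemes coincide, and they differ only under model misspecification, which is the paper's focus:

```python
# "Iterated" chains a one-step AR(1) model h times; "direct" regresses
# x_{t+h} on x_t in a single least-squares pass.
def ols(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    var = sum((a - mx) ** 2 for a in xs)
    slope = cov / var
    return slope, my - slope * mx

def iterated_forecast(x_t, phi, c, h):
    for _ in range(h):
        x_t = phi * x_t + c
    return x_t

def direct_forecast(series, h, x_t):
    slope, intercept = ols(series[:-h], series[h:])
    return slope * x_t + intercept

series = [1.0, 1.5, 1.25, 1.375, 1.3125, 1.34375, 1.328125]
phi, c = ols(series[:-1], series[1:])      # one-step AR(1) fit
f_iter = iterated_forecast(series[-1], phi, c, h=2)
f_dir = direct_forecast(series, 2, series[-1])
```

The direct scheme fits a separate regression for each horizon h, which is why lag-length selection and nonnegativity transformations become the practical questions the paper addresses.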

... combining the information contained in the same dimension further improves accuracy; (2) the local volatility introduced into the model can flexibly capture the volatility clustering of the data and improve forecasting accuracy, which the empirical tests clearly support; (3) beyond the empirical analysis of our own model, we carry out a systematic empirical analysis and comparison of three typical, widely used decomposition-combination forecasting models on Chinese market data, including fairly detailed robustness checks, and attempt to explain and analyze their different performance from the perspective of model theory. ...

... For predicting the residual part, Bialkowski et al. [3] used two models, an autoregressive moving average, ARMA(1,1), and a self-exciting threshold autoregressive (SETAR) model:

e_{t,i,j} = ψ_1 ẽ_{t,i−1,j} + ψ_2 + ϵ_{t,i,j},  (2.4)

e_{t,i,j} = (ϕ_11 ẽ_{t,i−1,j} + ϕ_12) I(ẽ_{t,i−1,j}) + (ϕ_21 ẽ_{t,i−1,j} + ϕ_22)[1 − I(ẽ_{t,i−1,j})] + ϵ_{t,i,j},  (2.5)

where, ...
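A minimal sketch of the two specific-component recursions in eqs. (2.4) and (2.5); the coefficient values and the regime rule I(·) are illustrative assumptions, not the estimates from the paper, and the noise term is omitted:

```python
# Eq. (2.4): AR(1) with intercept.  Eq. (2.5): two-regime SETAR whose
# coefficients switch on the indicator I(.) of the lagged residual
# (here assumed to be a threshold at zero).
def ar1_step(e_prev, psi1=0.6, psi2=0.1):
    return psi1 * e_prev + psi2                    # eq. (2.4)

def setar_step(e_prev, phi1=(0.8, 0.05), phi2=(0.3, -0.05)):
    a, b = phi1 if e_prev >= 0.0 else phi2         # assumed threshold at 0
    return a * e_prev + b                          # eq. (2.5)

path = [1.0]
for _ in range(5):
    path.append(setar_step(path[-1]))
```

The SETAR model lets positive and negative deviations from the usual volume pattern decay at different speeds, which a single AR(1) cannot do.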

... In this paper, we propose a new dynamic approach based on factor models (see also Bialkowski et al. 2008). It assumes that intra-day volume can be decomposed into two parts, each predicted using separate time-series models. ...

... Then, with time passing, he or she will improve predictions and be able to beat a trader who predicted the whole U-shaped volume at the beginning of the trading day. Further details of the approach are available in Bialkowski et al. (2008). ...

This paper proposes a new dynamic approach to modelling intra-day trading volume based on factor models. It assumes that intra-day volume can be decomposed into two parts, each predicted using separate time-series models. By enabling more accurate prediction of intra-day volume, this methodology allows for a significant reduction in the cost of executing Volume Weighted Average Price orders.

... A better understanding and detailed insight of multi-seasonality patterns can be very helpful in capturing the dynamics of trading volume and consequently improving the performance of trading algorithms. In recent years, trading algorithms have been developed and used to reduce transaction costs as much as possible [6,15]. ...

... The concept of realized volatility relies only on intraday price dynamics, independent of trading volume dynamics. However, very few studies deal with intraday trading volume analyses [6,9,23]. It is well known that high-frequency data exhibit periodic patterns in trading activity, that is, intraday trading volume seasonality [13,20]. ...

The seasonal and trend decomposition of a univariate time series based on Loess (STL) has several advantages over traditional methods. It deals with any periodicity length, enables seasonality to change over time, allows missing values, and is robust to outliers. However, it does not handle trading-day variation by default. This study shows how to deal with this drawback. By applying multiple STL decompositions of 15-minute trading volume observations, three seasonal patterns were discovered: hourly, daily, and monthly. The research objective was not only to discover whether multi-seasonality exists in trading volume by employing high-frequency data, but also to determine which seasonal component is most time-varying, and which seasonal components are the strongest or weakest when comparing the variation in magnitude between them. The results indicate that hourly seasonality is the strongest, while daily seasonality changes the most. A better understanding of the multiple patterns in trading volume can be very helpful in improving the performance of trading algorithms.
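Not STL/Loess, but a crude seasonal-mean sketch of the same idea, assuming 15-minute bins so that an "hourly" season repeats every 4 observations (data synthetic):

```python
# Extract one layer of seasonality by averaging observations that share
# the same position within the seasonal period, then subtract it.
# STL would instead use Loess smoothing and allow the pattern to drift.
def seasonal_means(series, period):
    sums, counts = [0.0] * period, [0] * period
    for i, v in enumerate(series):
        sums[i % period] += v
        counts[i % period] += 1
    return [s / c for s, c in zip(sums, counts)]

def deseasonalize(series, period):
    means = seasonal_means(series, period)
    return [v - means[i % period] for i, v in enumerate(series)]

# Two synthetic "hours" of 15-minute volume with a fixed intra-hour shape.
series = [10, 4, 3, 8, 12, 6, 5, 10]
hourly = seasonal_means(series, period=4)
residual = deseasonalize(series, period=4)
```

Repeating the same extraction with periods of a day and a month on the residual would give a poor man's version of the multi-seasonal decomposition discussed above.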

... The slow incremental nature of the exchange serves two economic purposes: i) as a hedge against counter-party risk; and ii) as a means of reducing the "market-impact" from flooding the market with an excess supply of perishable goods which would reduce the "price". The latter strategy is similar to volume-participation algorithms for executing large trades of financial assets [Bialkowski et al., 2008]. ...

The complexity that we observe in nature can often be explained in terms of cooperative behavior. For example, the major transitions of evolution required the emergence of cooperation among the lower-level units of selection, which led to specialization through division of labor, ultimately resulting in spontaneous order. There are two aspects to address in explaining how such cooperation is sustained: how free-riders are prevented from free-riding on the benefits of cooperative tasks, and, just as importantly, how those social benefits arise. We review these problems from an economic perspective, and highlight how ideas from economics can help us to better understand how the benefits of social interactions arise, how they are sustained, and how they affect the underlying social dilemmas.

... Some recent papers [HJ11] [MK12] [FW13] extend the model and incorporate the new information coming to the market but rely on the crucial assumption that the total market volume is known beforehand. Other works [BDLF08] take a different route and focus on the empirical modeling of the market volumes. A recent paper [GR13] studies the stochastic control problem including a market impact term, while the work by Li [Li13] takes a different approach and studies the optimal placement of market and limit orders for a VWAP objective. ...

We study the problem of optimal execution of a trading order under a Volume Weighted Average Price (VWAP) benchmark, from the point of view of a risk-averse broker. The problem consists in minimizing the mean-variance of the slippage, with quadratic transaction costs. We devise multiple ways to solve it; in particular, we study how to incorporate the information coming from the market during the schedule. Most related works in the literature eschew the issue of imperfect knowledge of the total market volume. We instead incorporate it in our model. We validate our method with extensive simulation of order execution on real NYSE market data. Our proposed solution, using a simple model for market volumes, reduces the VWAP deviation RMSE of the standard "static" solution by 10% (and can simultaneously reduce transaction costs).
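The "static" benchmark that the adaptive solution improves on can be sketched as a proportional split of the parent order along the average historical volume curve (all numbers below are illustrative):

```python
# Static VWAP scheduling: allocate the parent order across intraday bins
# in proportion to the average per-bin volume over past days, fixed at
# the start of the day.  The adaptive approach above would revise this
# schedule as realized volume information arrives.
def static_vwap_schedule(hist_volumes, order_size):
    """hist_volumes: list of past days, each a list of per-bin volumes."""
    n_bins = len(hist_volumes[0])
    avg = [sum(day[i] for day in hist_volumes) / len(hist_volumes)
           for i in range(n_bins)]
    total = sum(avg)
    return [order_size * v / total for v in avg]

hist = [[100, 60, 40, 80],       # day 1: U-shaped intraday volume
        [120, 60, 40, 100]]      # day 2
schedule = static_vwap_schedule(hist, order_size=1000)
```

The schedule sums to the full order size and inherits the U-shape of the historical curve, trading more at the open and close than in the middle of the day.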

... Kissel and Glantz (2003, ch. 11) and Białkowski et al. (2008)). The latter authors decompose the intra-day turnover (volume relative to outstanding shares, a common practice to stabilize the series, cf. ...

The Volume Weighted Average Price (VWAP) mixes volumes and prices at intra-daily intervals and is a benchmark measure frequently used to evaluate a trader’s performance. Under suitable assumptions, splitting a daily order according to ex-ante volume predictions is a good strategy to replicate the VWAP. To bypass possible problems generated by local trends in volumes, we propose a novel Generalized Autoregressive Score (GAS) model for predicting volume shares (relative to the daily total), inspired by the empirical regularities of the observed series (intra-daily periodicity pattern, residual serial dependence). An application to six NYSE tickers confirms the suitability of the model proposed in capturing the features of intra-daily dynamics of volume shares.

... The strategy is then assessed against actual trade data from the Tokyo stock exchange. [Jedrzej Bialkowski and Le Fol, 2008] presented a new methodology for modelling intra-day volume which allows for a reduction of the execution risk in VWAP orders. [McCulloch and Kazakov, 2007] and [McCulloch and Kazakov, 2012] view it as a quadratic hedging problem under partial information and use the mean-variance method to minimize the variance of the difference between the market VWAP and the execution VWAP. ...

We consider a Volume Weighted Average Price (VWAP) trading algorithm in which instead of following the static curve passively, the algo may adjust its participation rate in each interval. We propose a framework in which the adjustment only makes use of the expected value of the price appreciation, captured by trading signals. In order to avoid extreme behaviors, we bound the adaptive trading curve within a so-called trading envelope. Using two examples of signals, the Forward/Backward and a CAC40 stock, we confirm the potential improvement of our adaptive framework compared to previous ones.

... We only consider the temporary price impact for fair comparison. VWAP (Volume-weighted Average Price) is another model-based strategy, which distributes orders in proportion to the (empirically estimated) market transaction volume in order to keep the execution price closely tracking the market average price ground truth (Kakade et al. 2004; Białkowski, Darolles, and Le Fol 2008). DDQN (Double Deep Q-network) is a value-based RL method (Ning, Ling, and Jaimungal 2018) and adopts state engineering optimized for individual instruments. ...

As a fundamental problem in algorithmic trading, order execution aims at fulfilling a specific trading order, either liquidation or acquirement, for a given instrument. Towards effective execution strategy, recent years have witnessed the shift from the analytical view with model-based market assumptions to model-free perspective, i.e., reinforcement learning, due to its nature of sequential decision optimization. However, the noisy and yet imperfect market information that can be leveraged by the policy has made it quite challenging to build up sample efficient reinforcement learning methods to achieve effective order execution. In this paper, we propose a novel universal trading policy optimization framework to bridge the gap between the noisy yet imperfect market states and the optimal action sequences for order execution. Particularly, this framework leverages a policy distillation method that can better guide the learning of the common policy towards practically optimal execution by an oracle teacher with perfect information to approximate the optimal trading strategy. The extensive experiments have shown significant improvements of our method over various strong baselines, with reasonable trading actions.

... In this case, investors urgently need dynamically adjusted algorithmic trading strategies. Bialkowski et al. [43] and Humphery-Jenner [44] present dynamic models to predict future market trading volume and improve the traditional VWAP trading strategies. From the perspective of demand and supply of liquidity, Obizhaeva and Wang [45] develop a limit order book model of liquidity-based algorithmic trading, and present the corresponding optimal algorithmic trading strategies. ...

... Białkowski et al. [14] decomposed the trading volume according to [15] and [16] into two parts: an average term representing the changes coming from market evolution, and a deviation term accounting for the opening and closure of arbitrage positions. ARMA(1,1) and SETAR models are used for estimating the latter part separately, based on forty stocks of the CAC40 index with 20-minute intervals. ...

This paper proposes a dynamic model to forecast intraday volume percentages by decomposing the trade volume into two parts: the average part as the intraday volume pattern, and the residual term as the abnormal changes. An empirical test on data spanning half a year of gold futures and S&P 500 futures reveals that a rolling average of the previous days' volume percentages shows great predictive ability for the average part. An SVM approach with an input pattern consisting of two categories is employed to forecast the residual term: one is the previous days' volume percentages in the same time interval, and the other is the most recent volume percentages. The study shows that this dynamic SVM-based forecasting approach outperforms the other commonly used statistical methods and greatly enhances the tracking performance of a VWAP strategy.
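A hedged sketch of the SVM input pattern described above; the exact feature layout, the window lengths k and m, and the synthetic residual matrix are my assumptions, not the paper's specification:

```python
# Build the two-category feature vector for forecasting the residual of
# bin i on day t: the same bin's residual on the previous k days, plus
# the m most recent residuals of the current day.
def residual_features(residuals, day, i, k=3, m=2):
    """residuals[d][b]: abnormal volume share on day d, intraday bin b."""
    same_bin = [residuals[day - d][i] for d in range(1, k + 1)]
    recent = [residuals[day][i - j] for j in range(1, m + 1)]
    return same_bin + recent

residuals = [
    [0.01, -0.02, 0.00, 0.01],
    [0.02, -0.01, -0.01, 0.00],
    [0.00, 0.01, -0.02, 0.01],
    [-0.01, 0.02, 0.00, -0.01],
]
feats = residual_features(residuals, day=3, i=3)
```

These vectors would then be fed, with the realized residual as the target, to an SVM regressor in place of the statistical models the paper benchmarks against.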

... The slow incremental nature of the exchange serves two economic purposes: i) as a hedge against counter-party risk; and ii) as a means of reducing the "market-impact" from flooding the market with an excess supply of perishable goods which would reduce the "price". The latter strategy is similar to volume-participation algorithms for executing large trades of financial assets [Bialkowski et al., 2008]. ...

The major transitions of evolution required the emergence of cooperation amongst the lower levels of selection. Many mathematical models have uncovered sufficient conditions for the evolution of cooperation amongst selfish agents, but within this framework there are as many plausible scenarios which lead to cooperative outcomes as there are scenarios in which defection prospers. A new approach to explaining reciprocity appeals to the same mechanisms which have systematically enabled an explosion of reciprocity and welfare in human societies, viz. markets. The field of biological markets conjectures that Adam Smith's "invisible hand" is a universal phenomenon of nature rather than a parochial artefact of human societies. In this paper I review this field and speculate how an understanding of the role of market interactions in nature can explain the major transitions in evolution and the corresponding explosive increase in the complexity of life.

... We now turn to the choice of dynamics for the Bid-Ask spread. In the economic literature, the Bid-Ask spread depends mainly on two factors: the value of the stock and the trading volume; see Potters and Bouchaud [114], Bialkowski et al. [13] and Lehalle [89]. ...

This PhD dissertation consists of three independent parts and deals with applications of stochastic control to finance. In the first part, we study the utility maximization problem in a market with defaults and total/partial information. The dynamic programming principle is used to characterize the value function. Given this characterization, we find a BSDE of which the value function is a solution. We also give an approximation of this value function. In the second part, we study BSDEs with jumps. We link BSDEs with jumps and Brownian BSDEs using the decomposition of processes in the reference filtration. With this link, we get a result of existence, a comparison theorem and a decomposition of Feynman-Kac formula. We use these techniques to work out the price of a European option in a complete market and the indifference price of a contingent claim in an incomplete market. Finally, in the third part, we use the error theory to explain the liquidity risk and to model the Bid-Ask spread. Then we solve an optimal liquidation problem for a large portfolio in discrete and deterministic time.

... Most of the existing literature on VWAP focuses on strategies and algorithms to execute orders as close as possible to the VWAP price (see e.g. Konishi (2002), Bialkowski et al. (2008), Fuh et al. (2010), Frei & Westray (2013)). On the other hand, surprisingly few results on actual pricing methodologies related to VWAP options have been published (Stace (2007), Novikov et al. (2014)). ...

Volume weighted average price (VWAP) options are a popular security type in many countries, but despite their popularity very few pricing models have been developed so far for VWAP options. This can be explained by the fact that the VWAP pricing problem is set in an incomplete market, since there is no underlying with which to hedge the volume risk, and hence there is no uniquely defined price. Any price which is obtained will include a market price of volume risk, which must be determined from the corresponding volume statistics. Our analysis strongly supports the hypothesis that the empirical volume statistics of ASX equities can be described reasonably well by fitted gamma distributions. Based on this observation, we suggest a simple gamma process-based model that allows for the exact analytic pricing of VWAP options in a rather straightforward way.
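The distributional claim above can be illustrated with a method-of-moments gamma fit (shape k = mean²/var, scale θ = var/mean); the volume figures below are synthetic, not ASX statistics, and the paper's own fitting procedure may differ:

```python
# Method-of-moments fit of a gamma distribution to volume samples:
# a Gamma(k, theta) has mean k*theta and variance k*theta^2, so matching
# the first two sample moments gives k = mean^2/var, theta = var/mean.
def fit_gamma_moments(samples):
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    return mean * mean / var, var / mean       # (shape k, scale theta)

volumes = [120.0, 95.0, 210.0, 150.0, 80.0, 175.0, 130.0, 160.0]
k, theta = fit_gamma_moments(volumes)
```

A fitted gamma law for volume is what makes the gamma-process model analytically tractable for VWAP option pricing.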

... It has critical implications for investment decisions. Many researchers and practitioners have found that threshold models are more suitable when fitting volume series; see, for example, Białkowski et al. (2008) and Sabiruzzaman et al. (2010). ...

This note investigates the self-weighted least absolute deviation estimation (SLADE) of a heavy-tailed continuous threshold autoregressive (TAR) model. It is shown that the SLADE is strongly consistent and asymptotically normal. The SLADE is global in the sense that the convergence rate is first obtained before deriving its limiting distribution. Moreover, a test for the continuity of the TAR model is considered. A sign-based portmanteau test is developed for diagnostic checking. An empirical example is given to illustrate the usefulness of our method. Combined with the results of Yang and Ling (2017), a complete asymptotic theory on the SLADE of a heavy-tailed TAR model is established. This enriches the asymptotic theory of statistical inference in threshold models.

... Stock trading volume prediction has a significant role in algorithmic trading systems [2,3,4]; it aims to predict the stock trading volume based on preceding transaction data. Recently, progress has been made towards more accurate volume prediction via various machine learning techniques. ...

Traditional knowledge distillation in classification problems transfers the knowledge via class correlations in the soft label produced by teacher models, which are not available in regression problems like stock trading volume prediction. To remedy this, we present a novel distillation framework for training a light-weight student model to perform trading volume prediction given historical transaction data. Specifically, we turn the regression model into a probabilistic forecasting model, by training models to predict a Gaussian distribution to which the trading volume belongs. The student model can thus learn from the teacher at a more informative distributional level, by matching its predicted distributions to that of the teacher. Two correlational distillation objectives are further introduced to encourage the student to produce consistent pair-wise relationships with the teacher model. We evaluate the framework on a real-world stock volume dataset with two different time window settings. Experiments demonstrate that our framework is superior to strong baseline models, compressing the model size by 5× while maintaining 99.6% prediction accuracy. The extensive analysis further reveals that our framework is more effective than vanilla distillation methods under low-resource scenarios.
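The distributional distillation idea above can be sketched with the closed-form KL divergence between two Gaussian predictions; the paper's exact loss may differ, and the numbers here are illustrative:

```python
import math

# Teacher and student each predict a Gaussian over (log-)volume; the
# student is trained to minimize the closed-form KL divergence to the
# teacher's predicted distribution, which carries more information than
# matching a single point estimate.
def kl_gaussians(mu_s, sigma_s, mu_t, sigma_t):
    """KL( N(mu_s, sigma_s^2) || N(mu_t, sigma_t^2) )."""
    return (math.log(sigma_t / sigma_s)
            + (sigma_s ** 2 + (mu_s - mu_t) ** 2) / (2 * sigma_t ** 2)
            - 0.5)

loss_match = kl_gaussians(10.0, 1.0, 10.0, 1.0)   # student matches teacher
loss_shift = kl_gaussians(10.5, 1.0, 10.0, 1.0)   # student's mean is off
```

The loss is zero exactly when the student reproduces the teacher's distribution and grows with both mean and variance mismatch, which is what lets the student learn "at the distributional level".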

... Trading volume prediction plays an important role in algorithmic trading strategies [4,5,7,14,18,27,43]. Much effort has been devoted to volume prediction [1,2,9,16,17,23,24,34,37]. Machine learning and deep learning methods have many applications in volume prediction. ...

Adversarial training is a method for enhancing neural networks to improve their robustness against adversarial examples. Besides addressing the security concerns of potential adversarial examples, adversarial training can also improve the performance of neural networks, train robust neural networks, and provide interpretability for neural networks. In this work, we take the first step of introducing adversarial training into time series analysis, taking the finance field as an example. Rethinking existing research on adversarial training, we propose adaptively scaled adversarial training (ASAT) for time series analysis, treating data at different time slots with time-dependent importance weights. Experimental results show that the proposed ASAT can improve both the accuracy and the adversarial robustness of neural networks. Besides enhancing neural networks, we also propose a dimension-wise adversarial sensitivity indicator to probe the sensitivities and importance of input dimensions. With the proposed indicator, we can explain the decision bases of black-box neural networks.

... Trading algorithms aim at minimizing these costs by splitting orders in order to find a better execution price (Frei and Westray 2015; Barzykin and Lillo 2019), and the crucial part is the decision of when to execute the orders in such a way as to minimize market impact or to achieve certain trading benchmarks (e.g. VWAP) (Brownlees et al. 2010; Satish et al. 2014; Bialkowski et al. 2008; Calvori et al. 2013; Kawakatsu 2018). Second, when different market venues are available, the algorithm must decide where to post the order, and the choice is likely the market where more volume is predicted to be available. ...

We study the problem of intraday short-term volume forecasting in cryptocurrency multi-markets. The predictions are built by using transaction and order book data from the different markets where the exchange takes place. Methodologically, we propose a temporal mixture ensemble, capable of adaptively exploiting different sources of data for the forecasting and providing a volume point estimate as well as its uncertainty. We provide evidence of the clear outperformance of our model with respect to econometric models. Moreover, our model performs slightly better than a Gradient Boosting Machine while offering much clearer interpretability of the results. Finally, we show that the above results are robust when restricting the prediction analysis to each volume quartile.

... Most recently, the authors of [48] highlighted the complexity of contrasting and selecting an appropriate ACD model using statistical contrasts. The authors of [49] used econometric models (ARMA and SETAR) to model intraday volume and divided the behavior of volume into two parts: on the one hand, the usual daily behavior, and on the other hand, abnormal behavior. The authors concluded that it is necessary to model price and volume jointly, but this factor had already been highlighted by the authors of [50], who defined the positive relationship between changes in price and volume. ...

The usual measures of market risk are based on the axiom of positive homogeneity while neglecting an important element of market information—liquidity. To analyze the effects of this omission, in the present study, we define the behavior of prices and volume via stochastic processes subordinated to the time elapsing between two consecutive transactions in the market. Using simulated data and market data from companies of different sizes and capitalization levels, we compare the results of measuring risk using prices compared to using both prices and volumes. The results indicate that traditional measures of market risk behave inversely to the degree of liquidity of the asset, thereby underestimating the risk of liquid assets and overestimating the risk of less liquid assets.

... Following this article, VWAP tracking has been attacked using a variety of different methods. McCulloch and Kazakov [26] view it as a quadratic hedging problem under partial information, whereas Kakade et al. [21] and Białkowski et al. [6] use online learning and dynamic volume approaches, and Humphery-Jenner [20] gives a VWAP trading rule which takes intraday noise into consideration. Finally, Bouchard and Dang [8] formulate it as a stochastic target problem and derive a viscosity solution characterization of the value function. ...

We consider the optimal liquidation of a position of stock (long or short) where trading has a temporary market impact on the price. The aim is to minimize both the mean and variance of the order slippage with respect to a benchmark given by the market volume-weighted average price (VWAP). In this setting, we introduce a new model for the relative volume curve which allows simultaneously for accurate data fit, economic justification, and mathematical tractability. Tackling the resulting optimization problem using a stochastic control approach, we derive and solve the corresponding Hamilton–Jacobi–Bellman equation to give an explicit characterization of the optimal trading rate and liquidation trajectory.

... Recurrent neural networks have been used in [10] to predict price-flip events in limit order books by classifying sequences of observations of the book depths and market orders. Forecasting the traded volume has also been used in [11] to improve the execution of VWAP orders. That is, by forecasting the traded volume one can track the VWAP price matching it at the end of the chosen time window. ...

The optimal execution of stock trades is a relevant and interesting problem, as it is key to maximizing profits and reducing risks when investing in the stock market. In the case of large orders, the problem becomes even more complex, as the impact of the order on the market has to be taken into account. The usual solution is to split large orders into a set of smaller suborders that must be executed within a prescribed time window. This leads to the problem of deciding when in the time window to execute each suborder. There are popular ways of executing the trading of these split orders, like those which try to track the "Time Weighted Average Price" and the "Volume Weighted Average Price", usually called TWAP and VWAP orders. This paper presents a strategy to optimize the splitting of large trade orders over a given time window. The strategy is based on the solution of an optimization problem that is applied following a receding horizon approach. This approach reduces the impact of prediction errors due to the uncertain market dynamics by using new values of the price time series as they become available as time goes on. Suborder size constraints are taken into account in both market and limit orders. The strategy relies on price and traded volume forecasts, but it is independent of the prediction technique used. The performance index weighs not only the financial cost of the suborders, but also the impact on the market and the forecasting accuracy. A tailored optimization algorithm is proposed for efficiently solving the corresponding optimization problem. Most of the computations of the algorithm can be parallelized. Finally, the proposed approach has been tested through a case study composed of stocks of the Chinese A-share market.

... VWAP (Volume-weighted Average Price) is another model-based strategy which distributes orders in proportion to the (empirically estimated) market transaction volume in order to keep the execution price closely tracking the ground-truth market average price (Kakade et al. 2004; Białkowski, Darolles, and Le Fol 2008). ...
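The proportional-distribution idea described in the snippet above can be sketched in a few lines. This is a generic illustration, not any cited paper's implementation; the profile values are invented.

```python
# Hypothetical sketch: slicing a parent order in proportion to an
# estimated intraday volume profile, as a classical VWAP strategy does.

def vwap_slices(parent_qty, volume_profile):
    """Split parent_qty across bins proportionally to volume_profile."""
    total = sum(volume_profile)
    # Integer child-order sizes; any rounding remainder goes to the last bin.
    slices = [int(parent_qty * v / total) for v in volume_profile]
    slices[-1] += parent_qty - sum(slices)
    return slices

# A stylized U-shaped profile over 6 intraday bins (illustrative numbers).
profile = [30, 15, 10, 10, 15, 20]
print(vwap_slices(10_000, profile))  # → [3000, 1500, 1000, 1000, 1500, 2000]
```

If the estimated profile matches the realized market volume, the average execution price of the slices tracks the market VWAP.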

As a fundamental problem in algorithmic trading, order execution aims at fulfilling a specific trading order, either liquidation or acquirement, for a given instrument. Towards effective execution strategy, recent years have witnessed the shift from the analytical view with model-based market assumptions to model-free perspective, i.e., reinforcement learning, due to its nature of sequential decision optimization. However, the noisy and yet imperfect market information that can be leveraged by the policy has made it quite challenging to build up sample efficient reinforcement learning methods to achieve effective order execution. In this paper, we propose a novel universal trading policy optimization framework to bridge the gap between the noisy yet imperfect market states and the optimal action sequences for order execution. Particularly, this framework leverages a policy distillation method that can better guide the learning of the common policy towards practically optimal execution by an oracle teacher with perfect information to approximate the optimal trading strategy. The extensive experiments have shown significant improvements of our method over various strong baselines, with reasonable trading actions.

... Conversely, VWAP focuses on the trading volume [31][32][33]. Generally, the intraday trading volume shows a U-shaped curve, with much trading at the beginning or end of the trading day and relatively few trades in the middle [34][35][36]. If a CD order focuses on trades that have less impact on the market, VWAP, which allocates relatively more trades in the middle of the day and relatively fewer trades in the early or late market, is more effective than TWAP [37]. ...

Research on stock market prediction has been actively conducted over time. Pertaining to investment, stock prices and trading volume are important indicators. While extensive research on stocks has focused on predicting stock prices, not much focus has been applied to predicting trading volume. The extensive trading volume by large institutions, such as pension funds, has a great impact on the market liquidity. To reduce the impact on the stock market, it is essential for large institutions to correctly predict the intraday trading volume using the volume weighted average price (VWAP) method. In this study, we predict the intraday trading volume using various methods to properly conduct VWAP trading. With the trading volume data of the Korean stock price index 200 (KOSPI 200) futures index from December 2006 to September 2020, we predicted the trading volume using dynamic time warping (DTW) and a genetic algorithm (GA). The empirical results show that the model using the simple average of the trading volume during the optimal period constructed by GA achieved the best performance. As a result of this study, we expect that large institutions will perform more appropriate VWAP trading in a sustainable manner, leading the stock market to be revitalized by enhanced liquidity. In this sense, the model proposed in this paper would contribute to creating efficient stock markets and help to achieve sustainable economic growth.
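The dynamic time warping measure used in the study above to match historical volume patterns can be sketched with the standard textbook dynamic program. This is a generic implementation, not the authors' code.

```python
# A minimal dynamic-time-warping (DTW) distance between two numeric
# sequences with absolute-difference local cost.

def dtw_distance(a, b):
    """DTW alignment cost between sequences a and b."""
    n, m = len(a), len(b)
    inf = float("inf")
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Allow match, insertion, or deletion steps.
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

print(dtw_distance([1, 2, 3], [1, 2, 2, 3]))  # identical shapes align at cost 0.0
```

Unlike a pointwise distance, DTW lets two volume curves with the same shape but slightly shifted timing be recognized as similar, which is why it suits intraday pattern matching.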

... We deliberately deviate from this as our goal is not index replication but optimal execution. For more information on index replication strategies and pricing of such contracts, the interested reader might refer to Guéant and Royer [17], Humphery-Jenner [22] and Białkowski et al. [5]. Recall that the VWAP itself is given by ...
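The expression truncated at the end of the excerpt above is the standard definition: over an execution window of $T$ intervals with prices $P_t$ and traded volumes $V_t$,

```latex
\mathrm{VWAP} \;=\; \frac{\sum_{t=1}^{T} P_t\, V_t}{\sum_{t=1}^{T} V_t}
```

A VWAP order is then judged by how closely its average execution price comes to this quantity.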

Optimal execution, i.e., the determination of the most cost-effective way to trade volumes in continuous trading sessions, has been a topic of interest in the equity trading world for years. Electricity intraday trading slowly follows this trend but is far from being well-researched. The underlying problem is a very complex one. Energy traders, producers, and electricity wholesale companies receive various position updates from customer businesses, renewable energy production, or plant outages and need to trade these positions in intraday markets. They have a variety of options when it comes to position sizing or timing. Is it better to trade all amounts at once? Should they split orders into smaller pieces? Taking the German continuous hourly intraday market as an example, this paper derives an appropriate model for electricity trading. We present our results from an out-of-sample study and differentiate between simple benchmark models and our more refined optimization approach that takes into account order book depth, time to delivery, and different trading regimes like XBID (Cross-Border Intraday Project) trading. Our paper is highly relevant as it contributes further insight into the academic discussion of algorithmic execution in continuous intraday markets and serves as an orientation for practitioners. Our initial results suggest that optimal execution strategies have a considerable monetary impact.

... All these variables represent public information, as long as they are visible to market members, or easily recoverable from the LOB. Some of these variables act as driving elements for automated trading algorithms like the volume-weighted average price (VWAP) or the time-weighted average price (TWAP), see Bialkowski et al. (2008) or Brownlees et al. (2010). To examine how these variables can convey information on the traders' identity, we adopt the following bivariate probit model: ...

In this paper, we consider joint estimation of objective and risk-neutral parameters for stochastic volatility option pricing models using both stock and option prices. A common strategy simplifies the task by limiting the analysis to just one option per date. We first discuss its drawbacks on the basis of model interpretation, estimation results and pricing exercises. We then turn the attention to a more flexible approach, that successfully exploits the wealth of information contained in large heterogeneous panels of options, and we apply it to actual S&P 500 index and index call options data. Our approach breaks the stochastic singularity between contemporaneous option prices by assuming that every observation is affected by measurement error, essentially recasting the problem as a non-linear filtering one. The resulting likelihood function is evaluated using a Monte Carlo Importance Sampling (MCIS) strategy, combined with a Particle Filter algorithm. The results provide useful intuitions on the directions that should be followed to extend the model, in particular by allowing jumps or regime switching in the volatility process.

This article discusses recent techniques and results in the area of forecasting intraday volume and intraday volume percentages. By exploring ways to predict volume, the authors seek to improve the performance of trading algorithms, many of which depend upon the volume that will trade while the order is active. Traditionally, algorithms use historical averages to predict volume over the lifetime of an order. The authors show that improving the prediction of volume boosts the performance of algorithms.
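The "historical averages" baseline the article refers to can be sketched as follows: forecast each intraday bin's volume share as the average of its fractional share over recent days. The bin counts below are made-up illustration data.

```python
# Hedged sketch of the traditional volume predictor: average the
# per-day fractional volume profiles across a set of past days.

def historical_profile(daily_bin_volumes):
    """Average fractional volume profile over the given days."""
    n_days = len(daily_bin_volumes)
    n_bins = len(daily_bin_volumes[0])
    shares = [0.0] * n_bins
    for day in daily_bin_volumes:
        total = sum(day)
        for i, v in enumerate(day):
            shares[i] += v / total / n_days  # each day's share, equally weighted
    return shares

history = [[50, 20, 30], [40, 20, 40], [60, 10, 30]]
print(historical_profile(history))  # per-bin expected volume fractions
```

Improving on this static average — for example with the dynamic models surveyed in the article — is what "boosts the performance of algorithms" that allocate against the predicted profile.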

Based on the concept that the presence of liquidity frictions can increase the daily traded volume, we develop an extended version of the mixture of distribution hypothesis model (MDH) along the lines of Tauchen and Pitts (1983) to measure the liquidity portion of volume. Our approach relies on a structural definition of liquidity frictions arising from the theoretical framework of Grossman and Miller (1988), which explains how liquidity shocks affect the way in which information is incorporated into daily trading characteristics. In addition, we propose an econometric setup exploiting the volatility-volume relationship to filter the liquidity portion of volume and infer the presence of liquidity frictions using daily data. Finally, based on FTSE 100 stocks, we show that the extended MDH model proposed here outperforms that of Andersen (1996) and that the liquidity frictions are priced in the cross-section of stock returns.

This paper presents a new method to estimate the fractional differencing parameters in the SARFIMA model. A technique of split cosine bell tapering is suggested to improve the EGPH method. The simulation study shows that the optimal split proportion and bandwidth for the EGPH with split cosine bell tapering method respectively are p = 0.1 and b = 0.9. The new method with the optimal parameters outperforms the EGPH and EGPH with cosine bell tapering. We further applied the EGPH method to estimate intraday volume series and high-frequency absolute return data. The results show that the seasonal fractionally differencing parameters are all estimated to be large, while the nonseasonal fractionally differencing parameters are all very small. This indicates that their long memory property may be mainly caused by the structure of long-range dependence at the seasonal lags instead of dependence at the nonseasonal lags.

We propose a model for decomposing a volume series based on the Fast Fourier Transform (FFT) algorithm. By setting a threshold for the power spectrum, the model extracts the periodic and nonperiodic components from the original volume series and then predicts them. By analyzing samples from four major stock indices, we find that thresholds that are too small or too large both degrade the performance of the FFT model. Appropriate thresholds are found at approximately the 93rd to 95th percentile for the four indices studied. The out-of-sample experiment for the 50 stocks of the Shanghai 50 Composite Index shows that the FFT model is superior to the classic moving average model in terms of both volume prediction and Volume-weighted Average Price (VWAP) tracking accuracy. Meanwhile, for almost all of the 50 stocks, the FFT model outperforms the Bialkowski et al. (2008) model in terms of volume-prediction accuracy. The two models perform comparably in terms of the VWAP tracking error.
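The power-spectrum thresholding idea can be sketched as below: keep only frequency components whose power exceeds a chosen percentile and call the result the periodic part. This is an illustrative reconstruction on synthetic data, not the paper's code; the 93rd-percentile default simply echoes the range the abstract reports.

```python
# Extract the high-power (periodic) component of a series via an FFT
# power-spectrum percentile threshold.
import numpy as np

def periodic_component(volume, pct=93):
    """Zero out low-power frequencies and invert back to the time domain."""
    spec = np.fft.rfft(volume)
    power = np.abs(spec) ** 2
    cutoff = np.percentile(power, pct)
    filtered = np.where(power >= cutoff, spec, 0)  # keep only strong frequencies
    return np.fft.irfft(filtered, n=len(volume))

# A noisy periodic toy series standing in for intraday volume.
t = np.arange(256)
vol = 100 + 30 * np.sin(2 * np.pi * t / 32) \
      + np.random.default_rng(0).normal(0, 5, 256)
smooth = periodic_component(vol)
print(round(float(np.corrcoef(vol, smooth)[0, 1]), 3))
```

On this toy series the recovered periodic part is highly correlated with the original, while most of the noise is removed; the remainder would be modeled separately as the nonperiodic component.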

We consider a setting where market microstructure noise is a parametric function of trading information, possibly with a remaining noise component. Assuming that the remaining noise is Op(1/n), allowing irregular times and jumps, we show that we can estimate the parameters at rate n, and propose a volatility estimator which enjoys n convergence rate. Simulation studies show that our method performs well even with model misspecification and rounding. Empirical studies demonstrate the practical relevance and advantages of our method. Furthermore, we find that a simple model can account for a high percentage of the total variation in microstructure noise.

Mean marks form a versatile toolbox in the analysis of marked point processes (MPPs). For ergodic processes, their definition is straightforward and practical application is well established. In the stationary non-ergodic case, though, different definitions of mark averages are possible and might be practically relevant. In this paper, the classical definition of mean marks is compared to a set of new characteristics for non-ergodic MPPs, which stand out due to the weighting of ergodicity classes. Another weighting can be introduced on the single-point level via weights given by the marks themselves. These intrinsically given weights and the weighting of ergodicity classes are closely related to each other meaning that for suitable choices of weights, a mean mark characteristic can be expressed in either way. Estimators for the different definitions of mean marks are discussed and their consistency and asymptotic normality are shown under certain conditions.

In this paper, we use intraday COMEX gold futures to evaluate and compare the trading performance of the volume weighted average price (VWAP) strategy, the time weighted average price (TWAP) strategy and the implementation shortfall (IS) strategy. We find that they can track the market price well only when price moves have no trend on the relevant day. Market impact cost and timing risk cost of the three strategies prove to be negatively correlated. Moreover, we find that the timing risk cost of the VWAP strategy is the highest and that of the IS strategy is the lowest, while the ordering of market impact cost is the opposite.

The volume weighted average price (VWAP) over a rolling number of days in the averaging period is used as a benchmark price by market participants and can be regarded as an estimate of the price that a passive trader will pay to purchase securities in a market. The VWAP is commonly used in brokerage houses as a quantitative trading tool and also appears in Australian taxation law to specify the price of share buy-backs of publicly listed companies. Most of the existing literature on VWAP focuses on strategies and algorithms to acquire market securities at a price as close as possible to the VWAP. In our setup, the volume process is modeled via a shifted squared Ornstein-Uhlenbeck process and a geometric Brownian motion is used to model the asset price. We derive analytical formulae for the moments of the VWAP and then use the moment matching approach to approximate the distribution of the VWAP. Numerical results for moments of the VWAP and call-option prices have been verified by Monte Carlo simulations.

This chapter discusses the high trading costs that can arise in emerging markets and considers ways to ameliorate those transaction costs. Transaction costs might be high in emerging markets due to factors including thin trading, poor regulation, and limits to direct market access or algorithmic trading. A portfolio manager or trader must consider these costs when constructing and rebalancing a portfolio or when executing a trade. The chapter considers several ways that traders can ameliorate transaction costs when executing a larger order. It also discusses how a portfolio manager could take an approach to rebalancing that recognizes the potentially large costs involved in rebalancing.

There are few intraday volume forecasting models in the literature, and they do not reflect on each other regarding forecast performance. This paper compares two models that are often referenced: the model of Bialkowski, Darolles and Le Fol (2008) to that of Brownlees, Cipollini and Gallo (2011) using intraday data that covers 11 years of 33 NYSE and NASDAQ shares. The former is found to produce more accurate forecasts, while its estimation is faster by several orders of magnitude.

The least squares estimator of the threshold autoregressive (TAR) model may not be consistent when its tail index is less than or equal to 2. Neither theory nor methodology can be applied to model fitting in this case. This paper develops a systematic procedure of statistical inference for the heavy-tailed TAR model. We first investigate the self-weighted least absolute deviation estimation for the model. It is shown that the estimated slope parameters are root-n-consistent and asymptotically normal, and the estimated thresholds are n-consistent, each of which converges weakly to the smallest minimizer of a compound Poisson process. Based on this theory, the Wald test statistic is considered for testing linear restrictions on the slope parameters, and a procedure is given for inference on the threshold parameters. We finally construct a sign-based portmanteau test for model checking. Simulations are carried out to assess the performance of our procedure, and a real example is given.
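A two-regime threshold autoregression of the kind analyzed above (and used as a SETAR model for the specific volume component in the headline paper) can be illustrated by simulation. The coefficients, threshold, and innovation distribution below are invented for illustration.

```python
# Simulate a two-regime SETAR(1) process: the AR coefficient switches
# when the lagged value crosses a threshold r.
import random

def simulate_setar(n, r=0.0, phi_low=0.7, phi_high=-0.3, seed=42):
    """y_t = phi_low*y_{t-1}+e_t if y_{t-1} <= r, else phi_high*y_{t-1}+e_t."""
    rng = random.Random(seed)
    y, path = 0.0, []
    for _ in range(n):
        phi = phi_low if y <= r else phi_high  # regime depends on lagged value
        y = phi * y + rng.gauss(0, 1)
        path.append(y)
    return path

path = simulate_setar(500)
print(len(path))
```

Fitting such a model requires estimating both the slope parameters and the threshold r, which is what makes the asymptotics in the heavy-tailed case delicate.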

The mixture of distribution hypothesis (MDH) model offers an appealing explanation for the positive relation between trading volume and volatility of returns. In this specification, the information flow constitutes the only mixing variable responsible for all changes. However, this single static latent mixing variable cannot account for the observed short-run dynamics of volume and volatility. In this paper, we propose a dynamic extension of the MDH that specifies the impact of information arrival on market characteristics in the context of liquidity frictions. We distinguish between short-term and long-term liquidity frictions. Our results highlight the economic value and statistical accuracy of our specification. First, based on some goodness of fit tests, we show that our dynamic two-latent factor model outperforms all competing specifications. Second, the information flow latent variable can be used to propose a new momentum strategy. We show that this signal improves once we allow for a second signal - the liquidity frictions latent variable - as the momentum strategies based on our model present better performance than those based on competing models.

Trading volume is one of the key measures of trading activity intensity and plays a crucial role in the financial market microstructure literature. In this paper, we examine the out-of-sample point and density forecasting performance of Bayesian Autoregressive Conditional Volume (ACV) models for intra-day volume data. Based on 5-min traded volume data for stocks quoted on the Warsaw Stock Exchange, a leading stock market in Central and Eastern Europe, we find that, in terms of point forecasts, the considered linear ACV models significantly outperform benchmarks such as the naïve and Rolling Means methods but not necessarily Autoregressive Moving Average (ARMA) models. Moreover, the point forecasts obtained within the exponential error ACV model are significantly superior to those calculated in other competing structures for which Burr or generalized gamma distributions are specified. The main finding from the analysis of density forecasts is that, in many cases, the linear ACV models with the Burr and generalized gamma distributions provide significantly better density forecasts than the linear ACV model with exponential innovations and the ARMA models in terms of the log-predictive score, calibration and sharpness.

In this paper, we apply Bayesian inference to model and forecast intraday trading volume, using Autoregressive Conditional Volume (ACV) models, and we evaluate the quality of volume point forecasts. In the empirical application, we focus on the analysis of both in‐ and out‐of‐sample performance of Bayesian ACV models estimated for 2‐minute trading volume data for stocks quoted on the Warsaw Stock Exchange in Poland. We calculate two types of point forecasts, using either expected values or medians of predictive distributions. We conclude that, in general, all considered models generate significantly biased forecasts. We also observe that the considered models significantly outperform such benchmarks as the naïve or Rolling Means forecasts. Moreover, in terms of the root mean squared forecast errors, the point predictions obtained within the ACV model with the exponential distribution emerge superior as compared to the ones calculated in structures with more general innovation distributions, although in many cases this characteristic turns out to be statistically insignificant. On the other hand, when comparing the mean absolute forecasts errors, the median forecasts obtained within the ACV models with the Burr and generalised gamma distribution are found to be statistically better than other forecasts.
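The ACV recursion at the core of the two studies above has the same shape as a GARCH/ACD update: the conditional expected volume is a weighted sum of the last observed volume and the last conditional expectation. A minimal sketch, with invented parameter values rather than estimates from the Warsaw Stock Exchange data:

```python
# One-step-ahead conditional expected volumes from an ACV(1,1)-style
# recursion: lambda_t = omega + alpha * x_{t-1} + beta * lambda_{t-1}.

def acv_expected_volumes(x, omega=0.1, alpha=0.2, beta=0.7):
    """Return the conditional expected volume before each observation in x."""
    lam = omega / (1 - alpha - beta)  # start at the unconditional mean
    out = []
    for xt in x:
        out.append(lam)
        lam = omega + alpha * xt + beta * lam  # update after observing xt
    return out

volumes = [1.2, 0.8, 1.5, 0.9, 1.1]
print([round(v, 3) for v in acv_expected_volumes(volumes)])
# → [1.0, 1.04, 0.988, 1.092, 1.044]
```

The papers' contribution lies in the Bayesian estimation and in the choice of innovation distribution (exponential, Burr, generalized gamma) layered on top of this recursion, not in the recursion itself.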

The volume weighted average price (VWAP) of a security is a key measure of execution quality for large orders often used by institutional investors. We propose a VWAP tracking model with general price and volume dynamics and transaction costs. We find the theoretically optimal VWAP tracking strategy in several special cases. With these solutions we investigate three questions empirically. Do dynamic strategies outperform static strategies? How important is the choice of the market impact model? Does capturing the relationship between trading volume and the variance of stock price returns play an important role in optimal VWAP execution? We find that static strategies are preferable to dynamic ones, that simpler market impact models that assume either constant or linear market impact, perform as well as more sophisticated, nonlinear, market impact models, and that capturing the relationship between trading volume and the variance of stock price returns improves the performance of VWAP execution significantly.

In this study, we aim to showcase the value of news sentiment data on a global scale. This is done by creating long/short equity strategies across 49 countries, focusing on Large- and Mid-cap companies. Strategies are based on the recently introduced Sum Excess Sentiment Indicator (SESI) that adjusts for daily news sentiment bias. Country-level strategies are combined into regional and global portfolios to ensure maximum diversification. Our key findings are that news sentiment works everywhere. Returns are positive in 41 out of 49 countries, with Information Ratios (IRs) greater than 1.0 in one out of three countries. Strong Regional performance. All regional portfolios yield positive returns, with three (Asia & Pacific, North America, and Europe) out of five regions producing IRs of 3.0 or higher for a 1-day holding period. Global Portfolios ensure maximum diversification. By combining country-level strategies into cap-weighted global portfolios, we are able to achieve IRs as high as 4.76 for 1-day and 2.61 for 1-week holding periods.

This paper attempts to fit a model of stock market prices in order to check the accuracy of its forecasts. It also offers a comparative evaluation of traditional indicators such as Bollinger Bands, SMA, EMA, VWAP, MACD and RSI to ascertain the most effective way to understand and forecast future price trajectories.

This study highlights the link between stock return volatility, operating performance, and stock returns. Prior studies suggest that there is a ‘low volatility’ anomaly, where firms with a low stock return volatility out-perform firms with a high stock return volatility. This paper confirms that low volatility stocks earn higher returns than high volatility stocks in emerging markets and developed markets outside of North America. We also show that low volatility stocks have higher operating returns and this might explain why low volatility stocks earn higher stock returns. These results provide a partial explanation for the ‘low volatility effect’ that is independent from the existence of market anomalies or per se inefficiencies that might otherwise drive a low volatility effect. We emphasize the importance of controlling for stock return volatility when analyzing operating performance and stock performance.

This paper reviews previous and current research on the relation between price changes and trading volume in financial markets, and makes four contributions. First, two empirical relations are established: volume is positively related to the magnitude of the price change and, in equity markets, to the price change per se. Second, previous theoretical research on the price-volume relation is summarized and critiqued, and major insights are emphasized. Third, a simple model of the price-volume relation is proposed that is consistent with several seemingly unrelated or contradictory observations. And fourth, several directions for future research are identified.

Decomposing returns into market and stock specific components is common practice and forms the basis of popular asset pricing models. But what about volume? Can volume be decomposed in the same way as returns? Lo and Wang (2000), in a recent paper, suggest such a decomposition. Our paper is in this line of work and, despite the similarity of the statistical approach, our contribution is twofold. First, we provide a theoretical model to explain the decomposition of volume. Our model is the first, to our knowledge, to justify the strategies of a new generation of traders, that we call liquidity arbitrageurs. Second, we propose a new efficient screening tool that allows practitioners to extract specific information from volume time series. We provide an empirical illustration of the relevance and the possible uses of our approach on daily data from the FTSE index from 2000 to 2002.

In an adverse selection model of a securities market with one informed trader and several liquidity traders, we study the implications of the assumption that the informed trader has more information on Monday than on other days. We examine the interday variations in volume, variance, and adverse selection costs, and find that on Monday the trading costs and the variance of price changes are highest, and the volume is lower than on Tuesday. These effects are stronger for firms with better public reporting and for firms with more discretionary liquidity trading.

We examine the implications of portfolio theory for the cross-sectional behavior of equity trading volume. Two-fund separation theorems suggest a natural definition for trading activity: share turnover. If two-fund separation holds, share turnover must be identical for all securities. If $(K + 1)$-fund separation holds, we show that turnover satisfies an approximately linear K-factor structure. These implications are examined empirically using individual weekly turnover data for NYSE and AMEX securities from 1962 to 1996. We find strong evidence against two-fund separation, and a principal-components decomposition suggests that turnover is well approximated by a two-factor linear model.

This paper presents a study of intra-day patterns of stock market activity and introduces duration based activity measures for single stocks and multiple assets. The proposed measures involve weighted durations, i.e. times necessary to sell (buy) a predetermined volume or value of stocks. As such, they capture dependencies between intra-trade durations, transaction volumes and prices, and can be interpreted as liquidity measures. This approach allows us to highlight the intra-day variations of liquidity, its costs and volatility, and to develop a liquidity based asset ordering. The extension to a multivariate analysis yields new insights into the dynamics of portfolio liquidity by revealing various aspects of asset substitution, including the effects of correlated trade intensities of portfolio components. Several examples are used to show that in practice, the proposed liquidity measures become efficient instruments for strategic block trading and optimal portfolio adjustments. The paper also contains an empirical study of asset activity on the Paris Bourse. We examine the liquidity dynamics throughout the day and reveal the existence of periodic patterns resulting from world-wide interactions of major stock markets. In the multivariate setup, we report evidence on common patterns and correlations of trade intensities of selected stocks.

The paper develops an empirical return volatility-trading volume model from a microstructure framework in which informational asymmetries and liquidity needs motivate trade in response to information arrivals. The resulting system modifies the so-called "Mixture of Distribution Hypothesis" (MDH). The dynamic features are governed by the information flow, modeled as a stochastic volatility process, and generalize standard ARCH specifications. Specification tests support the modified MDH representation and show that it vastly outperforms the standard MDH. The findings suggest that the model may be useful for analysis of the economic factors behind the observed volatility clustering in returns.

In this paper, we are dealing with financial high frequency data; any time an order reaches the market, any time a cancellation or transaction occurs, a new record is made, ending up with a huge amount of data. Hence the time interval between two events is not fixed, forbidding the use of standard statistical tools. In the recent literature, several authors proposed time deformation techniques to deal with this problem. The practical importance of time deformation is to give a preprocessing technique to obtain a regularly spaced grid of data. The main contribution of this paper is to survey most of the time deformations proposed in the literature in a general setting and to compare them from both a statistical and financial point of view. We provide a new trading strategy in which the time to invest is endogeneous. Moreover, we highlight the fact that changing time scale can improve the daily gain following such a strategy.

This paper investigates the effect of trade size on security prices. We show that trade size introduces an adverse selection problem into security trading because, given that they wish to trade, informed traders prefer to trade larger amounts at any given price. As a result, market makers' pricing strategies must also depend on trade size, with large trades being made at less favorable prices. Our model provides one explanation for the price effect of block trades and demonstrates that both the size and the sequence of trades matter in determining the price-trade size relationship.

This paper proposes a new statistical model for the analysis of data which arrive at irregular intervals. The model treats the time between events as a stochastic process and proposes a new class of point processes with dependent arrival rates. The conditional intensity is developed and compared with other self-exciting processes. Because the model focuses on the expected duration between events, it is called the autoregressive conditional duration (ACD) model. Asymptotic properties of the quasi maximum likelihood estimator are developed as a corollary to ARCH model results. Strong evidence is provided for duration clustering for the financial transaction data analyzed; both deterministic time-of-day effects and stochastic effects are important. The model is applied to the arrival times of trades and therefore is a model of transaction volume, and also to the arrival of other events such as price changes. Models for the volatility of prices are estimated with price-based durations, and examined from a market microstructure point of view.
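The ACD recursion for expected durations can be sketched in simulation form: expected durations follow the autoregressive update, and realized durations are the expectation times a unit-mean innovation. Parameter values here are illustrative only, not estimates from the transaction data the paper analyzes.

```python
# Simulate an ACD(1,1) process (Engle-Russell style):
#   psi_i = omega + alpha * x_{i-1} + beta * psi_{i-1},  x_i = psi_i * eps_i,
# with unit-mean exponential innovations eps_i, producing duration clustering.
import random

def simulate_acd(n, omega=0.1, alpha=0.1, beta=0.8, seed=7):
    rng = random.Random(seed)
    psi = omega / (1 - alpha - beta)  # unconditional mean duration
    durations = []
    for _ in range(n):
        x = psi * rng.expovariate(1.0)  # unit-mean exponential innovation
        durations.append(x)
        psi = omega + alpha * x + beta * psi  # expected-duration update
    return durations

d = simulate_acd(1000)
print(round(sum(d) / len(d), 2))  # sample mean near the unconditional mean 1.0
```

Because psi carries over from one event to the next, short durations tend to be followed by short durations — the duration clustering the paper documents in transaction data.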

Volume profiles are a primary component of VWAP execution strategies. A study of simulated VWAP execution performance using intraday tick data indicates that innovations beyond a simple smile profile yield almost no benefit. Factors beyond the provider's control, such as stock characteristics like liquidity and volatility, have a much greater impact. These results indicate that other aspects of VWAP algorithms, such as the tactics on individual trades intraday, are more likely to yield significant benefits than refinements on the volume profile.
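The "simple smile profile" referenced above can be sketched as a normalized U-shaped weighting over intraday bins — heavy at the open and close, light at midday. The quadratic form and curvature value below are arbitrary illustrative choices, not the study's specification.

```python
# A normalized U-shaped (smile) intraday volume profile.

def smile_profile(n_bins, curvature=3.0):
    """Return volume fractions over n_bins intervals, U-shaped in time."""
    mid = (n_bins - 1) / 2
    # Quadratic in distance from midday, scaled by the curvature parameter.
    raw = [1.0 + curvature * ((i - mid) / mid) ** 2 for i in range(n_bins)]
    total = sum(raw)
    return [w / total for w in raw]

profile = smile_profile(13)
print(round(profile[0], 3), round(profile[6], 3), round(profile[-1], 3))
```

The study's point is that further refining this curve buys little; the open and close bins dominate regardless, and execution tactics within each bin matter more.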

This study tests Karpoff's (1987) costly short sales hypothesis that attempts to explain the asymmetric relationship between stock price changes and trading volume. Since short sales are disallowed on the Stock Exchange of Singapore, the data set offers a polar case of costly short sales. We document an asymmetric price change-volume relationship that is consistent with previous empirical evidence based on U.S. data sets. More importantly, we find a polar case of the asymmetric price change-volume relationship that supports Karpoff's (1987) costly short sales hypothesis: a positive correlation between positive price change and volume, but no significant relationship between negative price change and volume.

M.Sc. thesis, University of Manitoba, 1994. Includes bibliographical references (leaves 59-61).

The trading systems used on financial markets differ in their matching procedures, the norms selected for writing contracts, the presence or absence of intermediaries to ensure liquidity, market transparency, and so on. In this article, we measure the direct effect on market characteristics of a modification of the matching procedure, namely the matching frequency and the choice of a single price or multiple prices for contracts concluded at the same date.

We examine the contemporaneous correlation as well as the lead–lag relation between trading volume and return volatility in all stocks comprising the Dow Jones industrial average (DJIA). We use 5-minute intraday data and measure return volatility by the exponential generalized autoregressive conditional heteroscedasticity method. Contrary to the mixture of distribution hypothesis, the vast majority of DJIA stocks show no contemporaneous correlation between volume and volatility. However, we find evidence of significant lead–lag relations between the two variables in a large number of the DJIA stocks, in accordance with the sequential information arrival hypothesis.

This paper presents a framework to model duration, volume and returns simultaneously, obtaining an econometric reduced form that incorporates causal and feedback effects among these variables. The methodology is applied to two groups of stocks, classified according to trade intensity. We find that: (1) all stocks exhibit trading volume clustering (which is significantly higher for frequently traded stocks); (2) times of greater activity coincide with a higher number of informed traders present in the market only for the frequently traded stocks; (3) the more frequently traded stocks converge more rapidly (in calendar time) to their long-run equilibrium, after an initial perturbation.

This paper derives a static optimal execution strategy of a VWAP trade, in which the optimal execution strategy can be calculated by an iteration of a single variable optimization, rather than by a multivariable optimization. Analytical solutions are derived in some cases. We show that optimal execution times lag behind expected market trading volume distribution since price volatility tends to have a positive correlation with market trading volume. In a basket trade, execution error can be reduced by spreading out execution times according to the correlation of price movement. We confirm our theoretical results with actual trading data and simulations.

This paper analyzes stock returns at the close across the stocks of the Russell 1000 using (a) Transaction-level data for the period June 1997–July 1998, and (b) The complete record of all market-on-close (MOC) order imbalance indications. The last 5 min of the trading day explains a disproportionate fraction of the variation in daily returns, consistent with the hypothesis that institutional trading interest induces a common component to stock returns at the end of the day. This phenomenon reflects a higher demand for immediacy in the closing period. We find systematic return reversals following order imbalance publications consistent with temporary price pressure related to liquidity trading.

Relative intra-day cumulative volume is intra-day cumulative volume divided by final total volume. If intra-day cumulative volume is modeled as a Cox (doubly stochastic Poisson) point process, then using initial enlargement of filtration with the filtration of the Cox process enlarged by knowledge of final volume, it is shown that relative intra-day volume conditionally has a binomial distribution and is a novel generalization of a binomial point process: the doubly stochastic binomial point process. Re-scaling the intra-day traded volume to a relative volume between 0 (no volume traded) and 1 (daily trading completed) allows empirical intra-day volume distribution information for all stocks to be used collectively to estimate and identify the random intensity component of the doubly stochastic binomial point process and closely related Cox point process.
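The rescaling that the abstract describes is straightforward: divide intraday cumulative volume by the day's final total, so every stock's profile lives on [0, 1] and profiles can be pooled across stocks. A minimal sketch with assumed bin volumes:

```python
# Illustrative sketch: rescaling intraday cumulative volume to relative
# volume in [0, 1], as used to pool intraday profiles across stocks.
# Bin volumes below are assumptions, not data from the paper.

def relative_cumulative_volume(bin_volumes):
    """Cumulative volume per bin divided by the day's total volume."""
    total = sum(bin_volumes)
    out, cum = [], 0
    for v in bin_volumes:
        cum += v
        out.append(cum / total)
    return out

rel = relative_cumulative_volume([300, 150, 100, 150, 300])
```

By construction the result is nondecreasing, starts at the first bin's share of the day, and ends at exactly 1 (daily trading completed), which is what allows the binomial point-process machinery to condition on final volume.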

This paper develops an empirical return volatility-trading volume model from a microstructure framework in which informational asymmetries and liquidity needs motivate trade in response to information arrivals. The resulting system modifies the so-called 'mixture of distribution hypothesis' (MDH). The dynamic features are governed by the information flow, modeled as a stochastic volatility process, and generalize standard autoregressive conditional heteroskedasticity specifications. Specification tests support the modified MDH representation and show that it vastly outperforms the standard MDH. The findings suggest that the model may be useful for analysis of the economic factors behind the observed volatility clustering in returns. Copyright 1996 by American Finance Association.

A complete transactions record is defined to be ultra-high frequency data. The transaction arrival times and associated characteristics can be analyzed by marked point processes. The ACD model developed by Engle and Russell (1998) is then applied to IBM transactions data to develop semi-parametric hazard estimates and measures of conditional variances. Both returns and variances are negatively influenced by surprisingly long durations as suggested by asymmetric information models of market micro-structure.

This paper develops an inferential theory for factor models of large dimensions. The principal components estimator is considered because it is easy to compute and is asymptotically equivalent to the maximum likelihood estimator (if normality is assumed). We derive the rate of convergence and the limiting distributions of the estimated factors, factor loadings, and common components. The theory is developed within the framework of large cross sections (N) and a large time dimension (T), to which classical factor analysis does not apply. We show that the estimated common components are asymptotically normal with a convergence rate equal to the minimum of the square roots of N and T. The estimated factors and their loadings are generally normal, although not always so. The convergence rate of the estimated factors and factor loadings can be faster than that of the estimated common components. These results are obtained under general conditions that allow for correlations and heteroskedasticities in both dimensions. Stronger results are obtained when the idiosyncratic errors are serially uncorrelated and homoskedastic. A necessary and sufficient condition for consistency is derived for large N but fixed T. Copyright The Econometric Society 2003.
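The principal components estimator discussed above has a compact numerical form: with a T x N panel X, the estimated factors are (up to scale) the leading eigenvectors of XX', and the loadings follow by regression. A hedged sketch on simulated data (dimensions, noise level, and variable names are assumptions for illustration):

```python
# Sketch: principal components estimation of a large-dimensional factor
# model X (T x N) = F L' + e, using simulated data.
import numpy as np

rng = np.random.default_rng(0)
T, N, r = 200, 50, 2
F_true = rng.standard_normal((T, r))
L_true = rng.standard_normal((N, r))
X = F_true @ L_true.T + 0.1 * rng.standard_normal((T, N))

# Factors: r leading eigenvectors of X X', scaled by sqrt(T)
# (the usual normalization F'F / T = I).
eigval, eigvec = np.linalg.eigh(X @ X.T)
F_hat = np.sqrt(T) * eigvec[:, -r:]

# Loadings by least squares given the normalization, then the
# estimated common component.
L_hat = X.T @ F_hat / T
common = F_hat @ L_hat.T
```

The estimated common component F_hat @ L_hat.T is close to the true one even though factors and loadings are only identified up to rotation, which mirrors the paper's focus on the common components' convergence rate.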

This paper proposes a new statistical model for the analysis of data which arrives at irregular intervals. The model treats the time between events as a stochastic process and proposes a new class of point processes with dependent arrival rates. The conditional intensity is developed and compared with other self-exciting processes. The model is applied to the arrival times of financial transactions and therefore is a model of transaction volume, and also to the arrival of other events such as price changes. Models for the volatility of prices are estimated, and examined from a market microstructure point of view.
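The conditional intensity in this class of models is driven by an autoregressive recursion on expected durations: in the ACD(1,1) case, psi_i = omega + alpha * x_{i-1} + beta * psi_{i-1}, where x_i are observed durations between events. A minimal sketch of that recursion, with parameter values chosen purely for illustration:

```python
# Illustrative sketch of the ACD(1,1) recursion (Engle-Russell style):
# expected duration psi_i = omega + alpha * x_{i-1} + beta * psi_{i-1}.
# omega, alpha, beta below are assumed values, not estimates.

def acd_expected_durations(durations, omega=0.1, alpha=0.1, beta=0.8):
    """Conditional expected durations, initialized at the unconditional mean."""
    psi = [omega / (1.0 - alpha - beta)]  # unconditional mean duration
    for x in durations[:-1]:
        psi.append(omega + alpha * x + beta * psi[-1])
    return psi

psi = acd_expected_durations([1.2, 0.5, 2.0, 0.8])
```

Stationarity requires alpha + beta < 1; the recursion makes clusters of short durations (high transaction intensity) persist, which is what links the model to transaction volume and volatility.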

This paper concerns the relationship between the variability of the daily price change and the daily volume of trading on speculative markets. Our work extends the theory of speculative markets in two ways. First, we derive from economic theory the joint probability distribution of the price change and the trading volume over any interval of time within the trading day. And second, we determine how this joint distribution changes as more traders enter (or exit from) the market. The model's parameters are estimated by FIML using daily data from the 90-day T-bills futures market. The results of the estimation can reconcile a conflict between the price variability-volume relationship for this market and the relationship obtained by previous investigators for other speculative markets.

As a centralized, computerized, limit order market, the Paris Bourse is particularly appropriate for studying the interaction between the order book and order flow. Descriptive methods capture the richness of the data and distinctive aspects of the market structure. Order flow is concentrated near the quote, while the depth of the book is somewhat larger at nearby valuations. We analyze the supply and demand of liquidity. For example, thin books elicit orders and thick books result in trades. To gain price and time priority, investors quickly place orders within the quotes when the depth at the quotes or the spread is large. Consistent with information effects, downward (upward) shifts in both bid and ask quotes occur after large sales (purchases). Copyright 1995 by American Finance Association.

A measure of execution cost, or market impact cost, is developed; it is the difference between a transaction price and the volume weighted average price for that day. Fourteen thousand institutional trades are examined. Market impact costs average five basis points. Commission costs average eighteen basis points. Total costs average twenty-three basis points. Total costs vary only slightly across brokers and vary greatly across money managers. There is no trade-off between commission costs and market impact costs. Copyright 1988 by American Finance Association.
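The cost measure described above is simply the signed deviation of the transaction price from the day's VWAP, usually quoted in basis points. A minimal sketch (function name and sign convention are assumptions; positive means a cost for the trader):

```python
# Sketch of the market impact cost measure: transaction price versus the
# day's VWAP, in basis points. Sign convention: positive = cost.

def impact_cost_bps(trade_price, day_vwap, side):
    """side = +1 for a buy, -1 for a sell."""
    return side * (trade_price - day_vwap) / day_vwap * 1e4

# Buying 5 bps above the day's VWAP costs 5 bps of impact.
cost = impact_cost_bps(100.05, 100.0, side=+1)
```

With this convention, buying above VWAP and selling below it both register as positive costs, which is what allows impact costs to be averaged across a sample of institutional trades as in the study.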

In this paper we develop some econometric theory for factor models of large dimensions. The focus is the determination of the number of factors (r), which is an unresolved issue in the rapidly growing literature on multifactor models. We first establish the convergence rate for the factor estimates that will allow for consistent estimation of r. We then propose some panel criteria and show that the number of factors can be consistently estimated using the criteria. The theory is developed under the framework of large cross-sections (N) and large time dimensions (T). No restriction is imposed on the relation between N and T. Simulations show that the proposed criteria have good finite sample properties in many configurations of the panel data encountered in practice. JEL Classification: C13, C33, C43. Keywords: factor analysis, asset pricing, principal components, model selection.

Manganelli, S., 2002. Duration, volume, and volatility impact of trades. European Central Bank Working Paper Series No. 125.

McCulloch, J., 2004. Relative volume as a doubly stochastic binomial point process. Working Paper.

Darolles, S., Le Fol, G., 2003. Trading volume and arbitrage. Working Paper, CREST.

Madhavan, A., 2002. VWAP strategies. Transaction Performance: The Changing Face of Trading, Investment Guides Series. Institutional Investor Inc., pp. 32–38.

Table 9: Comparison of intraday volume models' performance, for the period September 2, 2003 to October 6, 2003, theoretical PCA-SETAR model. Columns: Company; MAPE and MSPE, each summarized by Mean, Std, Min, Max, and Q95.
Table 12: Summary of in-sample estimated costs of execution of a VWAP order for the period from September 2, 2003 to December 16, 2003, classical approach. Columns: Company; MAPE and MSPE, each summarized by Mean, Std, Min, Max, and Q95.

Easley, D., O'Hara, M., 1987. Price, trade size, and information in securities markets. Journal of Financial Economics 19, 69–90.

Table 28: Comparison of VWAP predictions, based on mean absolute percentage error (MAPE), for the period from September 2 to December 16, 2003. Result of in-sample estimation.

Model     Mean    STD     Min     Max     Q95
PC-SETAR  0.0706  0.0825  0.0017  0.4526  0.2030

Gouriéroux, C., Le Fol, G., 1998. Effet des modes de négociation sur les échanges. Revue Economique 49 (3), 795–808.

Table 29: Robustness check: comparison of VWAP predictions, based on mean absolute percentage error (MAPE), for the period from January 2 to April 20, 2004. Result of in-sample estimation.

Model     Mean    STD     Min     Max     Q95
PC-SETAR  0.0679  0.0681  0.0010  0.3792  0.1908