In this paper we present an exact maximum likelihood treatment for the estimation of a Stochastic Volatility in Mean (SVM) model based on Monte Carlo simulation methods. The SVM model incorporates the unobserved volatility as an explanatory variable in the mean equation. The same extension is developed elsewhere for Autoregressive Conditional Heteroskedastic (ARCH) models, known as the ARCH in Mean (ARCH-M) model. The estimation of ARCH models is relatively easy compared with that of the Stochastic Volatility (SV) model. However, efficient Monte Carlo simulation methods for SV models have been developed to overcome some of these problems. The details of modifications required for estimating the volatility-in-mean effect are presented in this paper together with a Monte Carlo study to investigate the small-sample properties of the SVM estimators. Taking these developments of estimation methods into account, we regard SV and SVM models as practical alternatives to their ARCH counterparts.
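
The model described in this abstract can be made concrete with a short simulation. The following is a minimal sketch, not the authors' exact specification: the parameter values, and the choice to place exp(h_t) (the variance) rather than the standard deviation in the mean equation, are illustrative assumptions.

```python
import numpy as np

# Sketch of an SVM data-generating process (illustrative parameters):
#   h_t = mu + phi*(h_{t-1} - mu) + sigma_eta * eta_t   (log-variance, AR(1))
#   y_t = a + d * exp(h_t) + exp(h_t / 2) * eps_t       (volatility-in-mean)
rng = np.random.default_rng(0)

def simulate_svm(T=1000, a=0.0, d=0.1, mu=-1.0, phi=0.95, sigma_eta=0.2):
    h = np.empty(T)
    # draw h_1 from the stationary distribution of the AR(1) process
    h[0] = mu + sigma_eta / np.sqrt(1 - phi**2) * rng.standard_normal()
    for t in range(1, T):
        h[t] = mu + phi * (h[t-1] - mu) + sigma_eta * rng.standard_normal()
    # volatility enters the mean equation as an explanatory variable (d != 0)
    y = a + d * np.exp(h) + np.exp(h / 2) * rng.standard_normal(T)
    return y, h

y, h = simulate_svm()
print(y.shape, h.shape)  # (1000,) (1000,)
```

Recovering d from series generated this way is exactly the small-sample exercise the Monte Carlo study in the paper investigates.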

... Chan (2017) developed a univariate time series model with time-varying parameters and stochastic volatility in order to investigate the time-varying effects of stochastic volatility on the level of a time series. The model developed by Chan (2017) is based on the stochastic volatility in mean (SVM) model of Koopman and Hol Uspensky (2002), which was, indeed, developed for financial time series as an alternative to the ARCH in mean (ARCH-M) model of Engle et al. (1987). ...

... The model developed by Chan (2017) is based on the stochastic volatility in mean (SVM) model of Koopman and Hol Uspensky (2002), which was, indeed, developed for financial time series as an alternative to the ARCH in mean (ARCH-M) model of Engle et al. (1987). Our study uses the TVP-SVM model of Chan (2017) in order to investigate the dynamic effect of output uncertainty on output growth. ...

... In this paper, we allow the impact of output volatility on output growth to be time-varying, capturing any structural instability in the macroeconomic environment that may alter the output volatility–output growth relationship. The stochastic volatility (SV) model of Koopman and Hol Uspensky (2002) fits time series with conditional heteroskedasticity better. Compared to GARCH models, volatility in the SV model is specified as a latent stochastic process that allows for volatility shocks. ...

By means of a stochastic volatility in mean model that allows for time-varying parameters in the conditional mean, and quarterly data for the G7 countries, this article examines the dynamic nexus between the volatility of output and economic growth for the G7 countries. This approach allows us to model parameter time-variation so as to reflect changes in the effect of volatility appearing in both the conditional mean and the conditional variance. The evidence in this article indicates that the effect of output volatility on output growth is strongly time-varying and quite analogous across all the G7 countries, with a break around 1973. The effect of output volatility on growth becomes more negative after 1973, with negative and statistically significant estimates after 1973 or the early 1990s. Our estimates show a reversal of the declining trend and a significant increase in output volatility in the late 2000s, indicating that the Subprime Crisis brought a temporary break in the Great Moderation. However, the Great Moderation seems to have been generally restored by the mid-2010s. The effect of output growth on output volatility is insignificant for all countries except Italy and the US, for which the estimates are positive and statistically significant. Our estimates also show that output volatility is counter-cyclical for all countries.

... To conduct our empirical investigation, we develop a SVAR model with stochastic volatility-in-mean (SVAR-SVM) where the volatility of system variables is modeled in a similar spirit to the stochastic volatility-in-mean model originally developed by Koopman and Hol Uspensky (2002). Technically, our proposed model contributes to the related literature with respect to three important dimensions. ...

... We consider the general class of the stochastic volatility model (called SVM model) in the spirit of the seminal work of Koopman and Hol Uspensky (2002), where the conditional variance of the observable variables enters into the conditional mean equation. ...

... We consider the following generalization of the state-space structural vector autoregression (SVAR) with stochastic volatility (SV) which is very close to the SVM formula specified by Koopman and Hol Uspensky (2002), and Lemoine and Mougin (2010). ...

We propose an extended SVAR model to investigate the responses of the macroeconomic volatility to financial uncertainty shocks. The empirical model features the time-varying stochastic volatility-in-mean process where parameters allow for (i) the bilateral simultaneity between the shocks hitting the level and volatility of the endogenous variables, and (ii) the feedback from the endogenous variables to the volatility. Using the U.S. data, our findings show that macroeconomic volatility arises as an endogenous response to a rise in financial uncertainty. Moreover, shutting down the volatility feedback leads financial uncertainty shocks to react more strongly to macroeconomic variables. Consequently, the effects of financial uncertainty on macroeconomic volatility become more severe, especially in the short horizon.

... Because the TVP-SVM model with time-varying parameters can correct the estimation errors due to structural breaks in the energy returns, we can better address structural instability. Unlike in the SVM model of Koopman and Hol Uspensky [31], in the TVP-SVM model the volatility enters the conditional mean as a covariate whose effect on the variable of interest is direct and time-varying. Therefore, this study is better able to assess whether the nexus between the energy returns and the energy price volatility has changed over time, especially for natural gas and petroleum product prices. ...

... In this paper, we build a two-stage approach in which we first test for the existence of structural breakpoints in the volatility of the nine energy products by using the iterated cumulative sum of squares (ICSS) algorithm [32], and find that there exist multiple structural breakpoints in energy volatility. Hence, it may not be appropriate to estimate the volatility feedback of energy prices using a traditional constant-parameter model such as the GARCH-M model [33] or the SVM model [31]. To this end, secondly, we employ the TVP-SVM model proposed by Chan [29], which can handle structural instability and capture the time-varying influence of the energy price volatility on the energy returns. ...
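
The full ICSS algorithm of Inclán and Tiao applies its test statistic iteratively over subsamples, but its core, the centered cumulative sum of squares, can be sketched briefly. The 5% critical value of 1.358 for sqrt(T/2)·max|D_k| quoted below is the commonly tabulated asymptotic value, stated here as an assumption rather than taken from the cited paper.

```python
import numpy as np

def icss_dk(a):
    """Centered cumulative sum of squares D_k (Inclan-Tiao statistic).
    A large |D_k| suggests a variance break near index k."""
    a = np.asarray(a, float)
    T = a.size
    C = np.cumsum(a**2)            # C_k = sum of squares up to k
    k = np.arange(1, T + 1)
    return C / C[-1] - k / T       # D_k = C_k/C_T - k/T

rng = np.random.default_rng(1)
# Simulated series whose variance quadruples halfway through the sample:
x = np.concatenate([rng.standard_normal(500), 2 * rng.standard_normal(500)])
D = icss_dk(x)
k_hat = int(np.argmax(np.abs(D))) + 1          # candidate breakpoint
stat = np.sqrt(x.size / 2) * np.abs(D).max()   # compare with ~1.358 (5% level)
print(k_hat, stat > 1.358)
```

With such a pronounced break, the statistic far exceeds the critical value and the candidate breakpoint lands near the true midpoint.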

... The log-volatility h_t follows a stationary AR(1) process with |φ| < 1, initialized with h_1 ~ N(µ, σ²/(1−φ²)), where σ² is the conditional variance. Model (3)-(4) generalizes the original setup in Koopman and Hol Uspensky [31] by permitting the conditional mean of y_t to have time-varying parameters, namely both α_t and β_t. The vector of coefficients γ_t = (α_t, β_t)′ evolves based on a random walk process as shown in (5): ...
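
The TVP-SVM data-generating process described in this snippet can be sketched as follows. The parameter values and the innovation scale of the random walk are illustrative assumptions, not those of the cited paper.

```python
import numpy as np

# Sketch of a TVP-SVM process: log-volatility h_t is a stationary AR(1),
# and the mean coefficients gamma_t = (alpha_t, beta_t)' follow a random walk.
rng = np.random.default_rng(2)

def simulate_tvp_svm(T=500, mu=-1.0, phi=0.9, sigma_eta=0.2, sigma_gamma=0.05):
    h = np.empty(T)
    gamma = np.empty((T, 2))               # gamma_t = (alpha_t, beta_t)'
    h[0] = mu + sigma_eta / np.sqrt(1 - phi**2) * rng.standard_normal()
    gamma[0] = (0.0, 0.1)
    y = np.empty(T)
    y[0] = gamma[0, 0] + gamma[0, 1] * np.exp(h[0]) \
        + np.exp(h[0] / 2) * rng.standard_normal()
    for t in range(1, T):
        h[t] = mu + phi * (h[t-1] - mu) + sigma_eta * rng.standard_normal()
        gamma[t] = gamma[t-1] + sigma_gamma * rng.standard_normal(2)  # random walk
        # volatility-in-mean with a time-varying coefficient beta_t
        y[t] = gamma[t, 0] + gamma[t, 1] * np.exp(h[t]) \
            + np.exp(h[t] / 2) * rng.standard_normal()
    return y, h, gamma

y, h, g = simulate_tvp_svm()
print(y.shape, g.shape)  # (500,) (500, 2)
```

The random walk on γ_t is what lets the volatility-feedback coefficient drift over time, which is the structural-instability feature these snippets emphasize.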

In this paper, the time-varying volatility feedback of nine series of energy prices is investigated by employing the time-varying parameter stochastic volatility in mean (TVP-SVM) model. The major findings and conclusions can be grouped as follows: Significant differences exist in the time-varying volatility feedback among the nine major energy products. Specifically, crude oil and diesel price volatility has a remarkable positive time-varying effect on their returns. Yet the returns for natural gas and most petroleum products are negatively affected by their price volatility over time. Furthermore, obvious structural break features exist in the time-varying volatility feedback of energy prices, which coincide with the breakpoints in the energy volatility. This indicates that factors such as major global economic and geopolitical events that cause sudden structural breaks in energy volatility may also affect the volatility feedback of the energy price. Moreover, the volatility feedback in energy prices weakens and can even have no impact on energy returns in some special periods when the energy price volatility is extremely high.

... However, the results have depended on considerable pre-processing of these series, avoiding the problem of simultaneous estimation of the mean and variance. To address this problem, Koopman and Uspensky (2002) introduced the SV in mean (SVM) model, allowing the unobserved volatility to enter the mean return equation as an explanatory variable. Based on Monte Carlo simulation methods, they derived an exact maximum likelihood estimation procedure by assuming normality of the innovations. ...

... Empirical applications of time-varying modeling of volatility using SVM models in developed countries have been studied by Koopman and Uspensky (2002) and Leão, Abanto-Valle, and Chen (2017). However, empirical evidence in Latin American markets is scarce; see Abanto-Valle, Migon, and Lachos (2011). ...

... It is important to note that the upper end of the credibility interval is very close to zero in all markets except the U.S. Therefore, the posterior mean of the β₂ parameter, which measures both the ex ante relationship between returns and volatility and the volatility feedback effect, is negative for all series and statistically significant for all the series with the exception of Peru. Following Koopman and Uspensky (2002), the (negative) volatility feedback effect dominates the positive effect which links the returns with the expected volatility. Our estimates are more negative compared to those of Koopman and Uspensky (2002), where the hypothesis that β₂ = 0 can never be rejected at the conventional 5% significance level. ...

The Stochastic Volatility in Mean (SVM) model of Koopman and Uspensky (2002) is revisited. An empirical study of five Latin American indexes is performed in order to see the impact of volatility on the mean of the returns. Markov Chain Monte Carlo (MCMC) Hamiltonian dynamics is used to estimate latent volatilities and parameters. Our findings show that volatility has a negative impact on returns, indicating that the volatility feedback effect is stronger than the effect related to the expected volatility. This result is clear and opposite to the finding of Koopman and Uspensky (2002).

... In contrast to the earlier works [3, 30–34], we show that the proposed model in this paper can capture the volatility feedback effect and the effect of the dynamic offset between supply and demand. The proposed method directly simulates the smoothed log-volatility from its posterior distribution using a proposal distribution, which approximates the target distribution and is different from the auxiliary mixture sampler [19] in which the return equation is first transformed into a linear equation. ...

... Therefore, the potential causality of the volatility feedback effect runs from volatility to prices. A positive relation between expected returns and expected volatility conforms to the capital asset pricing model (CAPM), since rational risk-averse investors need higher expected returns during more variable periods (see, e.g., Koopman and Uspensky [32]). Unfortunately, the SV model cannot capture the financial volatility feedback effect and measure the relationship between expected return and expected volatility. ...

... Unfortunately, the SV model cannot capture the financial volatility feedback effect or measure the relationship between expected return and expected volatility. To solve this problem, Koopman and Uspensky [32] proposed the stochastic volatility-in-mean (SVM) model, which includes the square of the unobservable volatility as an explanatory variable in the mean term of the return equation as follows: ...

The non-linear market microstructure (MM) model for financial time series modeling is a flexible stochastic volatility model with demand surplus and market liquidity. The estimation of the model is difficult, since the unobservable surplus demand is a time-varying stochastic variable in the return equation, and the market liquidity arises both in the mean term and in the variance term of the return equation in the MM model. A fast and efficient Markov Chain Monte Carlo (MCMC) approach based on an efficient simulation smoother algorithm and an acceptance-rejection Metropolis–Hastings algorithm is designed to estimate the non-linear MM model. Since the simulation smoother algorithm makes use of the band diagonal structure and positive definiteness of the Hessian matrix of the logarithmic density, it can quickly draw the market liquidity. In addition, we discuss the MM model with a Student-t heavy-tailed distribution that can be utilized to address the presence of outliers in typical financial time series. Using the presented modeling method to analyze daily returns of the S&P 500 index through point forecasts and density forecasts, we find clear support for time-varying volatility, the volatility feedback effect, market microstructure theory, and Student-t heavy tails in the financial time series. Through this method, one can use the estimated market liquidity and surplus demand, which is much smoother than the strongly stochastic return process, to assist transaction decision making in the financial market.

... Chan [60] developed a univariate time series model with time-varying parameters and stochastic volatility in order to investigate the time-varying effects of stochastic volatility on the level of the series. The model developed by Chan [60] is based on the volatility in mean (SVM) model of Koopman and Hol Uspensky [61], which was, indeed, developed for financial time series as an alternative to the ARCH-M model of Engle et al. [62]. ...

... The model developed by Chan [60] is based on the volatility in mean (SVM) model of Koopman and Hol Uspensky [61], which was, indeed, developed for financial time series as an alternative to the ARCH-M model of Engle et al. [62]. Following the study of Chan [60], our study is the first in the literature to consider the time-varying impact of the volatility of precious metal price returns on the precious metal price returns. ...

... In this paper, we allow the impact of metal price return volatility (uncertainty) on the metal price returns to be time-varying, capturing any structural instability in the macroeconomic environment that may alter the return–return volatility relationship. Although there are several approaches to measuring metal price return volatility, a popular approach is the stochastic volatility (SV) model of Koopman and Hol Uspensky [61], which usually fits time series that show conditional heteroskedasticity better. Compared to GARCH models, where volatility is specified as a deterministic function, volatility in SV models is specified as a latent stochastic process that allows for volatility shocks. ...

... The results indicate that volatility has a negative impact on returns, suggesting that the volatility feedback effect is stronger than the effect related to the expected volatility. This result is clear and opposite to the finding of Koopman and Uspensky (2002). The other countries present negative values, but the upper tails of the intervals are near the zero value. ...

... Frequently, the volatility of daily stock returns has been estimated with SV models, but the results have relied on an extensive pre-modeling of these series to avoid the problem of simultaneous estimation of the mean and variance. Koopman and Uspensky (2002) ... HMC (Duane et al., 1987; Neal, 2011) for updating the log-volatilities at once and RMHMC (Girolami and Calderhead, 2011; Nugroho and Morimoto, 2015) for parameters from the mean and volatility equations at once in two blocks. ...

... The value of β₁ that measures the correlation of returns is, as expected, small and very similar to the first-order autocorrelation coefficients reported in Table 1. ... The (negative) volatility feedback effect dominates the positive effect which links the returns with the expected volatility. Our estimates are more negative compared to those of Koopman and Uspensky (2002), where the hypothesis that β₂ = 0 can never be rejected at the conventional 5% significance level. Therefore, the volatility feedback effect is clearly dominant in our results (except for Peru) in comparison to those of Koopman and Uspensky (2002). ...

Using a stochastic volatility in mean (SVM) model, we carry out an empirical study of five Latin American indexes to see the impact of volatility on the mean of the returns. We use MCMC Hamiltonian dynamics. The results indicate that volatility has a negative impact on returns, suggesting that the volatility feedback effect is stronger than the effect related to the expected volatility. This result is clear and opposite to the finding of Koopman and Uspensky (2002). The other countries present negative values, but the upper tail of the intervals is close to zero.

... Third, it provides additional support to the empirical relevance of the stochastic volatility methodology developed by Chan. The model is an extension of the SVM model of [47]. One can view the SVM model as an alternative to the ARCH-M model of [48]. ...

... Thus, it proposes an efficient Markov chain Monte Carlo (MCMC) algorithm to evaluate the likelihood/posterior distribution. The TVP-SVM model of [49] is an extension of the stochastic volatility model due to [47]. As discussed in [49], the model has the following specification: y_t is an observed variable; x_t is a k × 1 vector of covariates; β_t is a k × 1 vector of time-varying parameters; the disturbances of the y_t and h_t equations are mutually and serially uncorrelated; h_t is the logarithmic volatility, which follows a stationary AR(1) process with |φ| < 1. ...

... As outlined above, the model given by Eqs. (1)-(2) is a generalization of the SVM model of [47] that allows the conditional mean of y_t to have time-varying parameters α_t and β_t. The vector of coefficients γ_t = (α_t, β_t′)′ is assumed to evolve as a first-order random walk process: ...

This article investigates the effect of output volatility on output growth in Barbados, a small island developing state located in the Caribbean, based on data from 1976 through 2018. I conduct the analysis using the Stochastic Volatility in Mean model with time-varying parameters. The evidence suggests that output growth volatility, measured by stochastic volatility, is negatively linked with output growth at the aggregate level. I also examine disaggregate output data in order to understand this link better. The sectoral evidence suggests a positive link for services, a negative link for manufacturing, and a weak link for the agriculture and industrial sectors.

... Chan (2017) developed a model where the stochastic volatility has a direct and time-varying impact on the variable considered. He builds upon the stochastic volatility in mean (SVM) model of Koopman and Uspensky (2002). The latter model was originally advanced for financial time series as an alternative to the ARCH-M model of Engle et al. (1987). ...

... In this paper, we allow the impact of oil price uncertainty on the oil price to be time-varying, capturing any structural instability in the macroeconomic environment that may alter the price–price uncertainty relationship. Although there are several approaches to measuring oil price uncertainty, a popular approach is the stochastic volatility (SV) model of Koopman and Hol Uspensky (2002), which usually fits time series that show conditional heteroskedasticity better. Compared to GARCH models, where volatility is specified as a deterministic function, volatility in SV models is specified as a latent stochastic process that allows for volatility shocks. ...

High price volatility in oil markets creates uncertainty and risk, and an increased risk premium may feed back into prices. This study investigates the dynamic nexus between the oil price and its volatility for oil spot and futures markets by means of a stochastic volatility in the mean model with time-varying parameters in the conditional mean. The study finds substantial time-variation in the impact of oil price volatility on oil price returns in both the spot and the 1-month to 10-month futures markets. The oil price return volatility has a positive impact on the oil price return series over the sample period from the mid-1980s to 2017, except for four very short periods, which correspond to the collapse of OPEC in 1986, the invasion of Kuwait in 1990/91, the Asian crisis in 1997/2000 and the Global Financial Crisis in 2008. While the oil price return volatility has a positive impact on oil prices, it has a limited negative impact on oil prices during periods corresponding to these historical events. Moreover, the findings from this study point to the existence of a negative and small effect of the lagged oil return series on its volatility for both the spot and futures markets.

... Bauwens et al. (2012) have given an exceptional account of various aspects of volatility models and their applications. The discrete-time model due to Koopman and Uspensky (2002) for regularly spaced data is applied to India's monthly spices exports for modelling and forecasting of the unobserved volatility process. Spices are among the most important commercial crops in India and have great medicinal value. ...

... In a way, the model assumes a form of state space representation. The SV-M model proposed by Koopman and Uspensky (2002) can be written as follows, where y_t denotes the observation, σ_t is the latent volatility, d is the parameter measuring the volatility-in-mean effect, and σ* is a scale parameter that removes the need for a constant term in the stationary first-order log-volatility autoregressive process. ...

In the early years of time-series data analysis, in both linear and nonlinear models, researchers mainly focused on modelling the mean equation. But recently, concerns about conditional variance, or volatility, in the data have increased interest in the area of time-series modelling. Spices are an indispensable crop of India, as they contribute substantially to agriculture in terms of farm income, employment and export earnings, which directly affect the GDP. It has been seen that there are sudden and high fluctuations in the spices export time series data. It is known that knowledge of volatility can be a good piece of information for a decision-making process. To this end, the Stochastic Volatility in Mean (SVM) model was proposed. In this paper, a methodology for estimating the SVM model using a particle filter is developed. The developed methodology is further illustrated using a volatile dataset. Statistical measures are also computed to validate the importance of the developed methodology. © 2018 Oriental Scientific Publishing Company. All rights reserved.
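
A bootstrap particle filter is one generic way to evaluate the SVM likelihood that this abstract alludes to. The sketch below is a standard sequential importance resampling filter under an assumed SVM parameterization, not the paper's specific algorithm; all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

def svm_loglik_pf(y, a, d, mu, phi, sigma_eta, N=2000):
    """Bootstrap particle filter log-likelihood for a simple SVM:
         h_t = mu + phi*(h_{t-1} - mu) + sigma_eta*eta_t
         y_t = a + d*exp(h_t) + exp(h_t/2)*eps_t
    A generic sketch under assumed dynamics, not the cited paper's method."""
    T = len(y)
    # initialize particles from the stationary AR(1) distribution
    h = mu + sigma_eta / np.sqrt(1 - phi**2) * rng.standard_normal(N)
    loglik = 0.0
    for t in range(T):
        if t > 0:
            h = mu + phi * (h - mu) + sigma_eta * rng.standard_normal(N)
        sd = np.exp(h / 2)
        # Gaussian measurement log-density of y_t given each particle
        logw = -0.5 * np.log(2 * np.pi) - np.log(sd) \
            - 0.5 * ((y[t] - a - d * np.exp(h)) / sd) ** 2
        m = logw.max()                       # log-sum-exp for stability
        w = np.exp(logw - m)
        loglik += m + np.log(w.mean())       # incremental likelihood
        idx = rng.choice(N, N, p=w / w.sum())  # multinomial resampling
        h = h[idx]
    return loglik

y = 0.1 * rng.standard_normal(200)           # toy "returns" series
ll = svm_loglik_pf(y, 0.0, 0.1, -4.0, 0.95, 0.2)
print(ll)
```

Maximizing this simulated log-likelihood over the parameters (or embedding it in an MCMC scheme) is the usual way such a filter is used for estimation.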

... Uncertainty shocks are often modelled as time-varying volatility in economic activity or economic policy that can influence market participants' decisions and macroeconomic aggregates (Bloom, 2009; Basu and Bundick, 2017; Fernández-Villaverde et al., 2015). Although there are several approaches to measuring uncertainty in a time series, a popular approach is the stochastic volatility (SV) model of Koopman and Hol Uspensky (2002), which usually fits time series that show conditional heteroskedasticity better. Compared to GARCH models, where volatility is specified as a deterministic function, volatility in SV models is specified as a latent stochastic process that allows for volatility shocks. ...
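
The contrast drawn here can be made concrete: in a GARCH(1,1) model the conditional variance is a deterministic recursion on past data, while in an SV model the log-variance receives its own shocks. A minimal side-by-side simulation with illustrative parameter values:

```python
import numpy as np

rng = np.random.default_rng(4)
T = 1000
eps = rng.standard_normal(T)

# GARCH(1,1): variance is a deterministic function of past returns and variances
omega, alpha, beta = 0.05, 0.1, 0.85
sig2_garch = np.empty(T)
sig2_garch[0] = omega / (1 - alpha - beta)   # unconditional variance
r = np.empty(T)
r[0] = np.sqrt(sig2_garch[0]) * eps[0]
for t in range(1, T):
    sig2_garch[t] = omega + alpha * r[t-1]**2 + beta * sig2_garch[t-1]
    r[t] = np.sqrt(sig2_garch[t]) * eps[t]

# SV: log-variance is a latent AR(1) process driven by its OWN innovations
mu, phi, sigma_eta = 0.0, 0.95, 0.2
h = np.empty(T)
h[0] = mu
for t in range(1, T):
    h[t] = mu + phi * (h[t-1] - mu) + sigma_eta * rng.standard_normal()
sig2_sv = np.exp(h)

print(sig2_garch.mean(), sig2_sv.mean())
```

Given the data and parameters, the GARCH variance path is fully determined, whereas the SV path would change on every re-simulation of its innovation sequence; that extra shock is what "allows for volatility shocks" means in the snippet above.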

We study the effects of energy market uncertainty shocks on the energy transition in the 28 European Union countries from 1990 to 2015 using annual frequency data. We assess the effects of oil price as well as energy market supply, demand, and residual price shocks using a time-varying parameter panel data stochastic volatility model. We show the importance of reducing energy market uncertainty for the success of a clean energy transition in Europe, as uncertainties have strong time-varying effects on the transition from fossil fuels to renewable energy. The oil price and residual energy price uncertainties are the key factors encouraging the renewable energy transition that reduces the vulnerability of economies to energy shocks. Energy supply shocks affect the transition negatively, while the demand shocks work similarly to residual energy price shocks, requiring a robust energy base that is less volatile. The paper also discusses policy recommendations.

... On the econometric methodology for the effects of output volatility on output, the main methodology to assess this relationship is an ARCH-type model that provides the direct effect of a variable's conditional variance on its level (Evans 1991). Koopman and Uspensky (2002) propose the SVM model, which models volatility as a stochastic variable rather than as a deterministic variable, as occurs in ARCH-type models. Chan (2017) extends the SVM model with TVPs to explore the effect of volatility on output growth (the TVP-SVM model). ...

This article assesses the effect of output growth volatility on output growth within a stochastic-volatility-in-mean model with a time-varying framework for a small open economy: Turkey. Until now, the empirical evidence on industrial production has mainly revealed that this relationship is negative. However, in further examining different sectors and sub-sectors of industrial production, we find the sign of the relationship changes depending on the sector. Moreover, there is limited evidence that the sign of the relationship changes over time. Thus, the evidence reveals that the nature of the output growth volatility–output growth relationship is not uniform across sectors.

... Note that the SV term σ_t is part of the measurement Equation (A14), which resembles the SV in mean model of Koopman and Uspensky (2002), with the state transition shown in Equation (A2). Nevertheless, the TVPM-SV model with bias correction is still conditionally linear, so that we use the simulated ML method introduced in the previous section for estimation. ...

This paper studies the evolution of long‐run output and technical progress growth rates in the G‐7 countries during the post‐war period by considering the concept of the natural rate of growth. We use time‐varying parameter models that incorporate both stochastic volatility and a Heckman‐type two‐step estimation procedure that deals with the possible endogeneity problem in the econometric models. Our results show a significant decline in long‐run growth rates that is not associated with the detrimental effects of the Great Recession, and that the rate of growth of technical progress appears to be behind the slowdown in long‐run GDP growth.

... Various extensions of the basic SV model of Taylor (1986) have been proposed. These include the model with leverage effects (Harvey & Shephard 1996), models with heavy-tailed distributions (Liesenfeld & Jung 2000), the volatility in mean model (Koopman & Hol Uspensky 2002), models with jumps (Eraker et al. 2003), and the model with moving average innovations (Chan 2013), among others. However, the estimation of these models is more complicated than that of GARCH-type models due to the stochastic evolution of volatility. ...

... Such models typically rule out that the level of the volatility directly affects the conditional mean of the predictive regression. This assumption is relaxed in Koopman and Hol Uspensky (2002) and Chan (2017) by assuming that the volatilities enter the conditional mean equation and thus exert a direct effect on the quantity of interest. ...

Successful forecasting models strike a balance between parsimony and flexibility. This is often achieved by employing suitable shrinkage priors that penalize model complexity but also reward model fit. In this note, we modify the stochastic volatility in mean (SVM) model proposed in Chan (2017) by introducing state-of-the-art shrinkage techniques that allow for time-variation in the degree of shrinkage. Using a real-time inflation forecast exercise, we show that employing more flexible prior distributions on several key parameters slightly improves forecast performance for the United States (US), the United Kingdom (UK) and the Euro Area (EA). Comparing in-sample results reveals that our proposed model yields qualitatively similar insights to the original version of the model.

... Methods proposed to alleviate these concerns include variants of stochastic volatility in mean (SVM) models. This modeling approach assumes time variation in the second moments of shocks to economic series, that also affect the respective first moments in dynamic time series models (see Koopman and Hol Uspensky, 2002). The time-varying volatilities are considered as a measure of uncertainty, establishing a unified framework for estimating uncertainty and its effects jointly. ...

This paper investigates the time-varying impacts of international macroeconomic uncertainty shocks. We use a global vector autoregressive (GVAR) specification with drifting coefficients and factor stochastic volatility in the errors to model six economies jointly. The measure of uncertainty is constructed endogenously by estimating a scalar driving the innovation variances of the latent factors, and is included also in the mean of the process. To achieve regularization, we use Bayesian techniques for estimation, and introduce a set of hierarchical global-local shrinkage priors. The adopted priors center the model on a constant parameter specification with homoscedastic errors, but allow for time-variation if suggested by likelihood information. Moreover, we assume coefficients across economies to be similar, but provide sufficient flexibility via the hierarchical prior for country-specific idiosyncrasies. The results point towards pronounced real and financial effects of uncertainty shocks in all countries, with differences across economies and over time.

... As the NSE stock market is not open 24 hours, the intra-day covariance misses the covariance contribution from the time the market closes until it opens on the next working day. We follow the approach of Martens (2002) and Koopman and Hol Uspensky (2002), where a scaling factor is used to convert intra-day volatility into a measure of volatility for the whole day. The scaling factor for the returns of the i-th stock is computed as ...
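
The snippet truncates before the formula. One common construction of such a scaling factor, shown here purely as a hypothetical illustration (not necessarily the exact formula of Martens (2002) or Koopman and Hol Uspensky (2002)), inflates the intra-day variance by the estimated overnight variance share:

```python
import numpy as np

def overnight_scaling_factor(oc_returns, co_returns):
    """Hypothetical scaling factor converting open-to-close (intra-day)
    variance into a whole-day measure by adding back the close-to-open
    (overnight) variance share. Illustrative only: the cited papers'
    exact formula is truncated in the snippet above."""
    var_oc = np.var(oc_returns)   # intra-day (open-to-close) variance
    var_co = np.var(co_returns)   # overnight (close-to-open) variance
    return 1.0 + var_co / var_oc

rng = np.random.default_rng(5)
oc = 0.010 * rng.standard_normal(250)   # simulated intra-day returns
co = 0.005 * rng.standard_normal(250)   # simulated overnight returns
f = overnight_scaling_factor(oc, co)
print(f)  # always > 1: whole-day variance exceeds intra-day variance
```

Multiplying an intra-day variance estimate by f then yields a whole-day variance measure, which is the role the scaling factor plays in the paper.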

The Hierarchical risk parity (HRP) approach to portfolio allocation, introduced by Lopez de Prado (2016), applies graph theory and machine learning to build a diversified portfolio. Like the traditional risk-based allocation methods, HRP is a function of the estimate of the covariance matrix; however, it does not require its invertibility. In this paper, we first study the impact of covariance misspecification on the performance of the different allocation methods. Next, we study, under an appropriate covariance forecast model, whether the machine learning based HRP outperforms the traditional risk-based portfolios. For our analysis, we use the test for superior predictive ability on out-of-sample portfolio performance to determine whether the observed excess performance is significant or occurred by chance. We find that when the covariance estimates are crude, inverse volatility weighted portfolios are the most robust, followed by the machine learning based portfolios. Minimum variance and maximum diversification are the most sensitive to covariance misspecification. HRP occupies the middle ground; it is less sensitive to covariance misspecification than the minimum variance or maximum diversification portfolios, while not as robust as the inverse volatility weighted portfolio. We also study the impact of different rebalancing horizons and how the portfolios compare against a market-capitalization weighted portfolio.
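To make the robustness comparison concrete, here is a minimal sketch of two of the allocation rules discussed above: inverse-volatility weighting, which uses only the diagonal of the covariance matrix, and unconstrained minimum variance, which requires inverting it. This is generic textbook logic, not the authors' code:

```python
import numpy as np

def inverse_volatility_weights(cov):
    """Inverse-volatility portfolio: weights proportional to 1/sigma_i.
    Uses only the diagonal of the covariance matrix, so it never needs
    the matrix inverse -- one reason it is robust to misspecification."""
    inv_vol = 1.0 / np.sqrt(np.diag(cov))
    return inv_vol / inv_vol.sum()

def minimum_variance_weights(cov):
    """Unconstrained minimum-variance portfolio: w proportional to
    inv(Sigma) @ 1.  Requires solving with the full covariance matrix,
    which is why estimation error hits this allocation much harder."""
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)
    return w / w.sum()
```

The contrast in inputs (diagonal only versus the full inverse) is exactly what drives the sensitivity ranking the paper reports.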

... Some studies (see, e.g., French et al., 1987; Chou, 1988; Campbell and Hentschel, 1992; Ghysels et al., 2002) have reported consistently positive and significant estimates of the risk premium, while others (see, e.g., Campbell, 1987; Turner et al., 1989; Breen et al., 1989; Chou et al., 1992; Glosten et al., 1993) document negative values, unstable signs, or otherwise insignificant estimates. Moreover, the contemporaneous risk-return tradeoff appears sensitive to the use of ARCH as opposed to stochastic volatility formulations (Koopman and Uspensky, 1999), the length of the return horizon (Harrison and Zhang, 1999), along with the instruments and conditioning information used in empirically estimating the relationship (Harvey, 2001; Brandt and Kang, 2002). As we show below, these conflicting results are not necessarily inconsistent with the basic ICAPM model, in that the risk-return tradeoff relationship depends importantly on the particular volatility measure employed in the empirical investigations. ...

... As many recent studies have pointed out the widespread nature of structural instability in many macroeconomic time series (Stock and Watson, 1996, 2007; Cogley and Sargent, 2001; Kim, Nelson and Piger, 2004), Chan (2017) extends the SVM model of Koopman and Hol Uspensky (2002) to allow for time-varying parameters and re-examines the relationship between inflation and inflation uncertainty. By allowing the coefficient associated with the volatility in the conditional mean to change over time, Chan (2017) finds substantial time-variation in this relationship using data from the US, the UK and Germany. ...

This paper investigates whether the relationship between inflation and inflation uncertainty has changed and whether the change in this relationship has been gradual or abrupt. We extend the time‐varying parameter with stochastic volatility in mean model (TVP‐SVM) to include a mixture innovation disturbance in the time‐varying parameter process. The proposed model produces more reliable estimates and allows us to investigate the occurrence of breaks in the gradually evolving process of the time‐varying coefficients. Using data for the US, Germany, Canada, New Zealand, the UK, France, Italy, Spain and Australia, we find that: (i) the relationship between inflation and inflation uncertainty substantially varies over time; (ii) there is strong support for the existence of abrupt changes in the US inflation–inflation uncertainty relationship; (iii) our empirical results for Canada and New Zealand show that the correlation between inflation and inflation uncertainty has been much weaker since the early 1990s, which coincides with the timing of the implementation of inflation targeting.

... Based on the SV-T model introduced by Harvey [4] (1994) and the joint distribution indicated by Mahieu and Schotman [5] (1998), Jacquier [6] derived the parameter posterior distribution of the thick-tailed model. Koopman and Hol Uspensky [7] (2002) proposed an extended model and analyzed the relationship between risks and expected returns. Research on stock market volatility in China started late; however, as the capital market has improved, more and more research results have been obtained. ...

... where k_t and q_t are the jump size and jump indicator, respectively, and log volatility follows an AR(1) process. Next, we outline the SV in mean (SV-M) model given by Koopman and Hol Uspensky (2002), as a counterpart to the GARCH-M model, where the SV enters the observation equation as shown below: ...
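The observation equation referenced above is truncated in the excerpt. It is commonly written as follows; this is a reconstruction in the spirit of Koopman and Hol Uspensky (2002), and the notation may differ from the cited paper:

```latex
\begin{aligned}
y_t &= \mu + d\,\sigma_t^2 + \sigma_t \varepsilon_t,
  & \varepsilon_t &\sim \mathcal{N}(0,1),\\
h_{t+1} &= \gamma + \phi\,(h_t - \gamma) + \sigma_\eta \eta_t,
  & \eta_t &\sim \mathcal{N}(0,1), \qquad \sigma_t^2 = \exp(h_t),
\end{aligned}
```

where the coefficient $d$ captures the volatility-in-mean (volatility feedback) effect, the SV analogue of the GARCH-M risk-premium parameter.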

We examine and compare a large number of generalized autoregressive conditional heteroskedastic (GARCH) and stochastic volatility (SV) models using series of Bitcoin and Litecoin price returns to assess the model fit for dynamics of these cryptocurrency price returns series. The various models examined include the standard GARCH(1,1) and SV with an AR(1) log-volatility process, as well as more flexible models with jumps, volatility in mean, leverage effects, t-distributed and moving average innovations. We report that the best model for Bitcoin is SV-t while it is GARCH-t for Litecoin. Overall, the t-class of models performs better than other classes for both cryptocurrencies. For Bitcoin, the SV models consistently outperform the GARCH models and the same holds true for Litecoin in most cases. Finally, the comparison of GARCH models with GARCH-GJR models reveals that the leverage effect is not significant for cryptocurrencies, suggesting that these do not behave like stock prices.

... Similar to the GARCH-M, the stochastic volatility in mean model proposed by Koopman and Hol Uspensky (2002) accommodates the possibility of volatility feedback: ...

This study employs a prominent model comparison criterion, namely the Bayes factor, to compare three commonly used GARCH models with their stochastic volatility (SV) counterparts in modelling the dynamics of inflation rates. By using consumer price index (CPI) data from 18 developed countries to evaluate these models, we find that the GARCH models are generally outperformed by their stochastic volatility counterparts. Furthermore, the stochastic volatility in mean (SV-M) model is shown to be the best for all 18 countries considered. The paper also examines which model characteristics play a main role in modelling inflation rates. It turns out that inflation volatility feedback is a crucial feature that we should take into consideration when modelling inflation rates. The relevance of a leverage effect, however, is found to be rather ambiguous. Finally, the forecasting results using the log predictive score confirm these findings.
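As a toy illustration of the Bayes factor machinery used in such comparisons (here for a conjugate normal location model with an analytic marginal likelihood, not the actual GARCH/SV computations), one can compare two prior specifications directly:

```python
import numpy as np

def log_marginal_likelihood(y, sigma2, tau2):
    """Analytic log marginal likelihood of y_t = theta + e_t,
    e_t ~ N(0, sigma2), under the prior theta ~ N(0, tau2);
    marginally y ~ N(0, sigma2 * I + tau2 * J)."""
    n = len(y)
    cov = sigma2 * np.eye(n) + tau2 * np.ones((n, n))
    _, logdet = np.linalg.slogdet(cov)
    quad = y @ np.linalg.solve(cov, y)
    return -0.5 * (n * np.log(2 * np.pi) + logdet + quad)

# Bayes factor of a tight-prior model against a diffuse-prior model;
# the data and hyperparameters below are purely illustrative.
y = np.array([0.1, -0.2, 0.05, 0.15, -0.1])
bf = np.exp(log_marginal_likelihood(y, 1.0, 0.1)
            - log_marginal_likelihood(y, 1.0, 100.0))
```

A Bayes factor above 1 favors the first model; for data this close to zero, the needlessly diffuse prior is penalized, which is the same Occam-razor mechanism that drives the GARCH-versus-SV comparison in the study.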

... Different from Chan, Koop, and Potter (2013), Chan (2017) recently adopts a stochastic volatility in mean approach following Koopman and Hol Uspensky (2002). This approach allows for time-varying coefficients in the conditional mean, using the unobserved components model of Stock and Watson (2007) as its departure point. ...

Recent inflation dynamics in the United States (US) have called into question the driving forces of inflation in the long run. Although the US recorded one of its longest economic recovery periods and labour market conditions improved after the Global crisis of 2008-2009, inflation remained relatively low. Starting from this evidence, the purpose of our paper is to shed light on the influence of inflation uncertainty and labour market conditions on the US inflation level. To this end, we use two bounded measures of inflation uncertainty, and we compare a linear with an asymmetric Autoregressive Distributed Lag (ARDL) framework. We show that both inflation uncertainty and labour market conditions explain long-run US inflation. However, these results are sensitive to the way inflation uncertainty is computed. Moreover, contrary to recent claims regarding the vanishing role of the labour market in explaining US inflation in the long run, we show that the labour market influence is stronger in the post-crisis than in the pre-crisis period. Therefore, monetary policymakers cannot abstract from labour market developments when anticipating the US inflation level.

... For example, Huang and Hueng (2008) stated that, in the period through 2003, the number of periods in which the realized market risk premium for US stock markets was negative made up 47.23% of the total sample. Similarly, Sandoval and Saens (2004) reported that for the 1995-2002 period this ratio was approximately 50%, 48%, 53% and 46% in the Argentine, Brazilian, Chilean and Mexican stock markets, respectively. ...

... Allowing ARMA components in UC and UCSV models, as in Harvey (2011), Kim et al. (2014), Mertens and Nason (2017) and Cecchetti et al. (2017), has also been shown to improve inflation forecasting. Chan (2017) and Koopman and Hol Uspensky (2002) consider UCSV models with SV also entering the trend equation linearly. In multivariate settings, Mertens (2016) and Chan et al. (2018) use survey inflation expectations to anchor trend inflation. ...

The unobserved components time series model with stochastic volatility has gained much interest in econometrics, especially for the purpose of modelling and forecasting inflation. We present a feasible simulated maximum likelihood method for parameter estimation from a classical perspective. The method can also be used for evaluating the marginal likelihood function in a Bayesian analysis. We show that our simulation‐based method is computationally feasible, for both univariate and multivariate models. We assess the performance of the method in a Monte Carlo study. In an empirical study, we analyse U.S. headline inflation using different univariate and multivariate model specifications.
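A naive version of the simulated likelihood idea can be sketched as follows: draw latent log-volatility paths from their AR(1) prior (used here as a crude importance density) and average the conditional likelihoods. Real implementations tailor the importance density carefully (e.g. Durbin and Koopman, 1997); this sketch only illustrates the mechanics and is not the paper's method:

```python
import numpy as np

def sv_loglik_naive_mc(y, mu_h, phi, sigma_eta, n_draws=2000, seed=0):
    """Naive simulated (integrated) log-likelihood for a basic SV model
        y_t = exp(h_t / 2) * eps_t,  h_t ~ AR(1),
    sampling latent paths from the AR(1) prior."""
    rng = np.random.default_rng(seed)
    T = len(y)
    h = np.empty((n_draws, T))
    h[:, 0] = mu_h + sigma_eta / np.sqrt(1 - phi**2) * rng.standard_normal(n_draws)
    for t in range(1, T):
        h[:, t] = (mu_h + phi * (h[:, t - 1] - mu_h)
                   + sigma_eta * rng.standard_normal(n_draws))
    # log p(y | h) summed over t, for each simulated path
    logp = -0.5 * (np.log(2 * np.pi) + h + y**2 * np.exp(-h)).sum(axis=1)
    # log-mean-exp across paths for numerical stability
    m = logp.max()
    return m + np.log(np.mean(np.exp(logp - m)))
```

The prior sampler has very high variance for long series, which is precisely why the literature develops efficient importance densities instead.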

... Such models typically rule out the possibility that the level of volatility directly affects the conditional mean of the predictive regression. This assumption is relaxed in Koopman and Hol Uspensky (2002) and Chan (2017), who assume that the volatilities enter the conditional mean equation and thus exert a direct effect on the quantity of interest. ...

Successful forecasting models strike a balance between parsimony and flexibility. This is often achieved by employing suitable shrinkage priors that penalize model complexity but also reward model fit. In this note, we modify the stochastic volatility in mean (SVM) model proposed in Chan (2017) by introducing state‐of‐the‐art shrinkage techniques that allow for time‐variation in the degree of shrinkage. Using a real‐time inflation forecast exercise, we show that employing more flexible prior distributions on several key parameters sometimes improves forecast performance for the United States (US), the United Kingdom (UK) and the Euro Area (EA). Comparing in‐sample results reveals that our proposed model yields qualitatively similar insights to the original version of the model.

... There is a long tradition of using importance sampling to evaluate the integrated likelihood of stochastic volatility models. Earlier papers, such as Durbin and Koopman (1997), Shephard and Pitt (1997), Koopman and Hol Uspensky (2002), Frühwirth-Schnatter and Wagner (2008), and McCausland (2012), have focused mostly on univariate stochastic volatility models. ...

Vector autoregressions (VARs) with multivariate stochastic volatility are widely used for structural analysis. Often the structural model identified through economically meaningful restrictions--e.g., sign restrictions--is supposed to be independent of how the dependent variables are ordered. But since the reduced-form model is not order invariant, results from the structural analysis depend on the order of the variables. We consider a VAR based on the factor stochastic volatility that is constructed to be order invariant. We show that the presence of multivariate stochastic volatility allows for statistical identification of the model. We further prove that, with a suitable set of sign restrictions, the corresponding structural model is point-identified. An additional appeal of the proposed approach is that it can easily handle a large number of dependent variables as well as sign restrictions. We demonstrate the methodology through a structural analysis in which we use a 20-variable VAR with sign restrictions to identify 5 structural shocks.

... The standard practice in volatility analyses includes the following steps: a) use of high-frequency daily, weekly or monthly adjusted return data, that is, returns minus mean returns or differences in log returns of successive periods, as in this paper (Liu and Morley, 2009); b) use of summary statistics to describe the key return characteristics for different periods, including tests of normality (Shin, 2005; Xu, 1999; Koopman and Hol Uspensky, 2002; Aggarwal et al., 2001; Rousan and Al-Khouri, 2005; Asai and McAleer, 2007; Wong and Cheung, 2010); c) providing visual perspectives on the observed volatilities by plotting the volatilities for each series; d) determining the optimal number of lags for the models through critical examination of autocorrelation and partial autocorrelation statistics, accompanied by the Ljung-Box Q statistics; e) comparing alternative models using selected information criteria for assessing model fitness, for example the Akaike Information Criterion (AIC), Schwarz Criterion (SC) and Hannan-Quinn Criterion (HQC), and the log-likelihood function (Rousan and Al-Khouri, 2005); f) assessing stability or stationarity of the models and the reliability of the volatility estimates, using known results on the required parameter values and/or their sums; g) using dummy variables as appropriate in the return and/or volatility models to isolate the effects of day of the week, month of the year, year, or other external variables and policy shocks on the returns and volatilities (Islam and Watanapalachaikul, 2005, pp. 134-144; Aliyu, 2012; Engle and Ng, 1993; Roh, 2007, p. 920; Aggarwal et al., 2001, p. 53; Batra, 2004, p. 17); and h) finding meaningful interpretations of model parameters in terms of volatility persistence, asymmetry or leverage (Engle and Ng, 1993; Roh, 2007). In this section we employ a good number of these steps, as appropriate, for our modelling objectives. ...
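Step (d) above relies on the Ljung-Box Q statistic. A minimal pure-NumPy version of that statistic (a standard textbook formula, not the cited authors' code) is:

```python
import numpy as np

def ljung_box_q(x, max_lag=10):
    """Ljung-Box Q statistic for lag selection / whiteness checks:
        Q = n (n + 2) * sum_{k=1}^{m} rho_k^2 / (n - k).
    Under the null of no autocorrelation up to lag m, Q ~ chi^2(m)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    denom = (xc**2).sum()
    q = 0.0
    for k in range(1, max_lag + 1):
        rho_k = (xc[k:] * xc[:-k]).sum() / denom   # lag-k autocorrelation
        q += rho_k**2 / (n - k)
    return n * (n + 2) * q
```

A strongly trending or persistent series yields a large Q, signalling that more lags (or a volatility model) are needed.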

This paper presents recent results in augmented ARCH-GARCH volatility modelling of the Nigerian Stock Market (NSM) for the study period 2000-2010 which encompasses the 2004 bank reforms in Nigeria and the 2007 global financial crisis. The paper applies five candidate GARCH models capable of tracking key aspects of volatility dynamics in a stock market to the All Shares Index and returns of the NSM for 2000-2010, in order to identify factors which drive volatility in the system and select suitable best-fit models for this period. The results indicate September, and years 2000, 2003, 2008, and 2009 as significant volatility drivers. The best-fit model out of the five explored is the GARCH-M (1,1). Implications of these results for systematic stock market characterisation and development (SSMCD) are explored in the paper.

This paper presents recent results in augmented ARCH-GARCH volatility modelling of the Nigerian Stock Market (NSM) for the pre-2004 bank reforms period. The paper applies five candidate GARCH models capable of tracking key aspects of volatility dynamics in a stock market to the All Shares Index and returns of the NSM for 2000-2010, in order to identify factors which drive volatility in the system and select suitable best-fit models for this period. The results indicate the presence of asymmetric effects associated with March, July and the year 2003 (a year before the reforms) as significant volatility drivers, suggesting a one-year lead-in effect of the reforms on the NSM, probably due to policy announcement effects and expectation formation on the part of market participants. The best-fit model out of the five explored is the GJR-GARCH (1,1) model. Implications of these results for stock market characterisation and development are explored in the paper.

... In this work, we allow the impact of the volatility on the returns to be time-varying in the SV in mean model, similar to Chan [24]; the SV in mean model was initially developed by Koopman and Hol Uspensky [53]. The TVP-SVM model is sufficiently flexible to capture stochastic volatility shocks and to analyse the relationship between the returns and volatility. ...

This paper investigates volatility dependence between global crude oil and China's agriculture futures by employing a quantile-on-quantile approach. The time-varying parameter stochastic volatility in mean model is used to evaluate the conditional volatility. The empirical results demonstrate the heterogeneous dependence between crude oil volatility and volatility in China's agricultural futures across quantiles. The absolute volatility spillover exhibits an overall increasing trend with higher quantiles of agricultural volatility, and the volatility dependence is asymmetric across violent/stable market situations. Moreover, extremely high or low quantiles of oil volatility exert a considerable influence, while crude oil volatility does not influence the agricultural volatility in the normal mode of the crude oil market. Furthermore, a high persistence is noted in the volatility dynamics, and the impacts of volatility on the returns exhibit substantial time variation. These findings could have important economic implications for portfolio managers and policymakers in different economic and financial circumstances.

... Another comparative advantage of the proposed MCL technique is that only trivial modifications are needed to extend the basic SV model to allow for heavy-tailed errors, leverage effects and explanatory variables; see, for example, Sandmann and Koopman (1998). Koopman and Hol Uspensky (2001) proposed the SV-in-Mean model, in which the mean may also be influenced by changes in the conditional volatility. Liesenfeld and Richard (2003) generalized the importance sampling method employed by Danielsson (1994) by making use of the efficient importance sampling procedure proposed by Richard and Zhang (1996). ...

... The jump indicator and jump size are modeled precisely as in the GARCH-J model. The fourth model, SV-M, is the SV in mean model of Koopman & Hol Uspensky (2002). Here the stochastic volatility enters the observation equation as a covariate: ...

... Following Koopman and Hol Uspensky (2002) and Chan (2017), we extend the SVL model by introducing the stochastic volatility as a covariate in the outcome equation. This extended model takes the following form ...

In this paper, we introduce a new Bayesian chi-squared test based on an adjusted quadratic loss function for testing a simple null hypothesis. We show that the asymptotic null distribution of our suggested test is a central chi-squared distribution under some assumptions required for the Bayesian large sample theory. We refer to our test as the Bayesian robust chi-squared test, since it is robust to parametric misspecification in the alternative model. That is, the limiting null distribution of our test is a central chi-squared distribution irrespective of parametric misspecification in the alternative model. In addition to being robust to parametric misspecification, our test also shares properties of the test suggested by Li et al. (2015) based on a quadratic loss function. We provide four examples to illustrate the implementation of our suggested Bayesian test statistic.

In this study, we employ generalized autoregressive conditional heteroscedastic (GARCH) and stochastic volatility models to investigate the dynamics of wheat, corn, and soybean prices. We find that the stochastic volatility model provides the highest persistence of the volatility estimation in all cases. In addition, based on the monthly data, we find that the jump process and asymmetric effect do not exist in agricultural commodity prices. Finally, by estimating Value at risk (VaR) for these agricultural commodities, we find that the upsurge in agricultural prices in 2008 may have been caused by financialization.
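The VaR computation mentioned above can be illustrated with the simplest historical-simulation estimator. This is a generic textbook method; the study's own VaR figures come from its fitted GARCH/SV models:

```python
import numpy as np

def historical_var(returns, alpha=0.05):
    """Historical-simulation value at risk: the loss threshold exceeded
    with probability alpha, read off the empirical return distribution.
    Returned as a positive number (a loss)."""
    return -np.quantile(returns, alpha)
```

For example, a 5% VaR of 0.03 means that on the worst 5% of days the loss exceeds 3%; model-based VaR replaces the empirical quantile with one from the fitted conditional distribution.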

This paper studies the effects of world oil price uncertainty on China's economy from both empirical and theoretical angles. First, we use a vector autoregression model with stochastic volatility in mean to explore the relation between world oil price uncertainty and real economic activity in China. We find that a one standard deviation increase in world oil price uncertainty reduces electricity production by almost 0.2 percent. We then use a canonical New-Keynesian model, solved by a third-order perturbation method, to explain this phenomenon: the households' precautionary saving channel depresses real activity when oil price uncertainty is higher.

In order to analyze the stock market bubble phenomenon, a vector autoregressive moving average (VARMA) model with non-Gaussian innovations and stochastic volatility components (VARMA-t-SV) is constructed for financial modeling. Given the estimation complexity of the VARMA-t-SV model, the Kronecker structure of the likelihood function is exploited to speed up computation. We then develop the corresponding Markov chain Monte Carlo (MCMC) sampling method to test the covariance structure specifications. Model comparisons show that VARMA models with flexible covariance structures perform better. The parameter estimation results show that the fat-tail and heteroscedasticity features improve performance compared to the standard form. Finally, using data from Chinese financial markets, the effects of monetary policy on stock market bubbles are analyzed with the VARMA-t-SV model. The empirical studies provide evidence supporting the rational asset price bubble theory, namely that tightening monetary policy may not succeed in shrinking asset price bubbles, which offers guidance for regulators and investors.

The volatility of returns on financial assets is not constant over time, as many valuation models (mainly for derivatives) developed during the 1980s assume. The complexity of heteroscedasticity, and the differences in results when volatility is estimated with different methodologies such as historical, implied or stochastic calculation, make this subject too extensive a field to be covered fully in this work. However, stochastic volatility has been widely accepted in recent years. The Markov chain Monte Carlo (MCMC) method is explained and used to estimate the distribution of the prices of the Mexican oil basket as a stochastic variable. In the univariate case, MCMC supposes that we can estimate the distribution of a latent (hidden) variable from the behavior of another variable observed a posteriori, with the help of Bayesian inference; this method allows efficient inference, independent of the underlying process, through an algorithm. The results show that stochastic volatility correctly fits the behavior of oil prices.

We develop a factor stochastic volatility model that incorporates leverage effects, return asymmetry, and heavy tails across all systematic and idiosyncratic model components. Our model leads to a flexible high-dimensional dependence structure that allows for time-varying correlations, tail dependence and volatility response to both systematic and idiosyncratic return shocks. We develop an efficient Markov chain Monte Carlo (MCMC) algorithm for posterior estimation based on the particle Gibbs, ancestor sampling, particle efficient importance sampling methods and interweaving strategy. To obtain parsimonious specifications in practice, we build computationally efficient model selection directly into our estimation algorithm. We validate the performance of our proposed estimation method via simulation studies with different model specifications. An empirical study for a sample of US stocks shows that return asymmetry is a systematic phenomenon and our model outperforms other factor models for value-at-risk evaluation.

Seven GARCH and stochastic volatility (SV) models are used to empirically model and compare the volatility of the returns of four commodities: gold, copper, oil and natural gas. The results show evidence of fat tails and random jumps created by supply/demand imbalances, episodes of international instability, geopolitical tensions and market speculation, among other factors. We also find evidence of a leverage effect in oil and copper, derived from their dependence on global economic activity, and of an inverse leverage effect in gold and natural gas, consistent with the former's role as a safe-haven asset and with uncertainty about the future supply of the latter. Furthermore, in most cases there is no evidence of a volatility-in-mean effect. Finally, we find that the best-performing return volatility models are GARCH-t for gold.

Bayesian inference with a large dataset is computationally intensive, as Markov chain Monte Carlo simulation requires a complete scan of the dataset for each proposed parameter update. To reduce the number of data points evaluated at each iteration of posterior simulation, we develop a double marginalized subsampling method, which is applicable to a wide array of microeconometric models including Tobit, Probit, regressions with non-Gaussian errors, heteroscedasticity and stochastic volatility, hierarchical longitudinal models, time-varying-parameter regressions, Gaussian mixtures, etc. We also provide an extension to double pseudo-marginalized subsampling, which has more applications beyond conditionally conjugate models. With rank-one update of the cumulative statistics, both methods target the exact posterior distribution, from which a parameter draw can be obtained with every single observation. Simulation studies demonstrate the statistical and computational efficiency of the marginalized sampler. The methods are also applied to a real-world massive dataset on the incidentally truncated mortgage rates.

We propose a moving average stochastic volatility in mean model and a moving average stochastic volatility model with leverage. For parameter estimation, we develop efficient Markov chain Monte Carlo algorithms and illustrate our methods, using simulated and real data sets. We compare the proposed specifications against several competing stochastic volatility models, using marginal likelihoods and the observed-data Deviance information criterion. We also perform a forecasting exercise, using predictive likelihoods, the root mean square forecast error and Kullback-Leibler divergence. We find that the moving average stochastic volatility model with leverage better fits the four empirical data sets used.

This study aims to investigate the effect of climate uncertainty on global commodity markets. To do so, I modify Theodoridis's (2018) time-varying factor-augmented VAR (FAVAR) with stochastic volatility in mean model. By incorporating the information from a large data set with an efficiently constructed dynamic factor structure, I not only overcome the omitted variable problem but also maintain the efficiency of the estimator to identify the nonlinear climate effects. Moreover, I apply Chang et al.'s (2017) endogenous regime switching in mean model for the climate variable, to overcome the statistical problem generated by the periodicity. The main empirical results can be summarized as follows. First, climate uncertainty generates an inflationary pressure on agricultural food, non-energy, and energy commodities for the El Niño years. Second, individual items such as maize and soybeans are more sensitive than the aggregated commodity indices to the effect of climate uncertainty. Third, climate uncertainty generates a negative supply shock, whereas market uncertainty generates a negative demand shock on the individual agricultural items.

This paper studies contagion across equity markets using squared index returns as a proxy for stock market volatility, which has not been examined before. The study covers squared stock index returns in 35 markets, including the US, UK, Euro Zone and the BRICS countries (Brazil, Russia, India, China and South Africa). Results from the conditional heteroskedasticity long memory model show evidence of long memory in the squared stock returns of all 35 stock indices studied. Empirical findings show evidence of contagion during the global financial crisis (GFC) and the Euro Zone crisis (EZC). The intensity of contagion varies depending on its source, which implies that the effects of shocks are not symmetric and may have led to some structural changes. The effect of contagion is also studied by decomposing the level series into explained and unexplained behaviors.

In this study, the risk-return relationship is examined a total of 378 times across the stock markets of 7 developed and 9 developing countries, using 24 different GARCH model specifications. Taken as a whole, the findings show that the risk parameter is positive and statistically significant in only 22% of all cases. Evaluated on a country basis, results consistent with theoretical expectations hold most often for the stock markets of Mexico (83%), Brazil (63%), the US (58%) and France (46%).

This paper focuses on how explicit structural shocks that characterize the endogenous character of international oil price changes affect the output volatility of the U.S. crude oil and natural gas mining industries. To this end, we employ a modified structural vector autoregressive (SVAR) model to decompose real oil-price changes into four components: U.S. supply shocks, non-U.S. supply shocks, aggregate demand shocks, and oil-specific demand shocks mainly driven by precautionary demand. The results indicate that output volatility of the U.S. crude oil and natural gas mining industries has significantly negative responses to U.S. supply shocks, aggregate demand shocks, and oil-specific demand shocks, while showing no significant response to non-U.S. supply shocks. Variance decomposition and historical decomposition confirm that U.S. supply shocks account for most of the explained variation in output volatility among the four structural oil shocks. Moreover, oil-specific demand shocks explain more of the variation than aggregate demand shocks for the crude oil mining industry, but the opposite is true for the natural gas mining industry.

In this paper we examine the finite sample bias of the sample skewness estimator for financial returns. We show that the bias of conventional sample skewness comes from two sources: the covariance between past return and future volatility, known as the leverage effect, and the covariance between past volatility and future return, commonly referred to as the volatility feedback effect. We derive explicit expressions for this bias and propose a nearly unbiased skewness estimator under mild assumptions. Our simulation study shows that the proposed estimator leads to almost unbiased skewness estimates with a slightly elevated mean squared error, and can reduce the bias of the skewness coefficient estimates by 40%. In our empirical application, we find that bias-corrected average skewness better predicts future market returns compared to the case without bias correction. This leads to an improved performance of skewness-based portfolios in terms of Sharpe ratio, certainty equivalence and transaction cost.

In this book, Andrew Harvey sets out to provide a unified and comprehensive theory of structural time series models. Unlike the traditional ARIMA models, structural time series models consist explicitly of unobserved components, such as trends and seasonals, which have a direct interpretation. As a result the model selection methodology associated with structural models is much closer to econometric methodology. The link with econometrics is made even closer by the natural way in which the models can be extended to include explanatory variables and to cope with multivariate time series. From the technical point of view, state space models and the Kalman filter play a key role in the statistical treatment of structural time series models. The book includes a detailed treatment of the Kalman filter. This technique was originally developed in control engineering, but is becoming increasingly important in fields such as economics and operations research. This book is concerned primarily with modelling economic and social time series, and with addressing the special problems which the treatment of such series poses. The properties of the models and the methodological techniques used to select them are illustrated with various applications. These range from the modelling of trends and cycles in US macroeconomic time series to an evaluation of the effects of seat belt legislation in the UK.

A simulation smoother in state space time series analysis is a procedure for drawing samples from the conditional distribution of state or disturbance vectors given the observations. We present a new technique for this which is both simple and computationally efficient. The treatment includes models with diffuse initial conditions and regression effects. Computational comparisons are made with the previous standard method. Two applications are provided to illustrate the use of the simulation smoother for Gibbs sampling for Bayesian inference and importance sampling for classical inference.

We undertake a comprehensive investigation of price and volume co-movement using daily New York Stock Exchange data from 1928 to 1987. We adjust the data to take into account well-known calendar effects and long-run trends. To describe the process, we use a seminonparametric estimate of the joint density of current price change and volume conditional on past price changes and volume. Four empirical regularities are found: (i) positive correlation between conditional volatility and volume; (ii) large price movements are followed by high volume; (iii) conditioning on lagged volume substantially attenuates the “leverage” effect; and (iv) after conditioning on lagged volume, there is a positive risk-return relation.

Changes in variance, or volatility, over time can be modelled using the approach based on autoregressive conditional heteroscedasticity (ARCH). However, the generalizations to multivariate series can be difficult to estimate and interpret. Another approach is to model variance as an unobserved stochastic process. Although it is not easy to obtain the exact likelihood function for such stochastic variance models, they tie in closely with developments in finance theory and have certain statistical attractions. This article sets up a multivariate model, discusses its statistical treatment and shows how it can be modified to capture common movements in volatility in a very natural way. The model is then fitted to daily observations on exchange rates.

New techniques for the analysis of stochastic volatility models are developed. A Metropolis algorithm is used to construct a Markov Chain simulation tool. The exact solution to the filtering/smoothing problem of inferring about the unobserved variance states is a by-product of the authors' method. In addition, multistep-ahead predictive densities can be constructed. The authors illustrate their method by analyzing stock data. Sampling experiments are conducted to compare the performance of Bayes estimators to method of moments and quasi-maximum likelihood estimators proposed in the literature. In both parameter estimation and filtering, the Bayes estimators outperform these other approaches.

The expectation of the excess holding yield on a long bond is postulated to depend upon its conditional variance. Engle's ARCH model is extended to allow the conditional variance to be a determinant of the mean and is called ARCH-M. Estimation and inference procedures are proposed, and the model is applied to three interest rate data sets. In most cases the ARCH process and the time-varying risk premium are highly significant. A collection of LM diagnostic tests reveals the robustness of the model to various specification changes such as alternative volatility or ARCH measures, regime changes, and interest rate formulations. The model explains and interprets the recent econometric failures of the expectations hypothesis of the term structure. Copyright 1987 by The Econometric Society.
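The ARCH-M mechanism described above can be sketched in a few lines: the conditional variance h_t from an ARCH(1) recursion feeds back into the mean equation as a risk premium. A minimal simulation, assuming an illustrative delta * h_t premium form (Engle et al. also consider sqrt(h_t)) and made-up parameter values:

```python
import numpy as np

def simulate_arch_m(n, beta=0.0, delta=0.1, omega=0.2, alpha=0.3, seed=0):
    """Simulate an ARCH(1)-in-mean process: the conditional variance h_t
    enters the mean through the risk-premium term delta * h_t.
    (Illustrative parameterization, not the paper's exact specification.)"""
    rng = np.random.default_rng(seed)
    y = np.empty(n)
    h = np.empty(n)
    eps_prev = 0.0
    for t in range(n):
        h[t] = omega + alpha * eps_prev ** 2          # ARCH(1) variance
        eps = np.sqrt(h[t]) * rng.standard_normal()   # shock with variance h_t
        y[t] = beta + delta * h[t] + eps              # in-mean effect
        eps_prev = eps
    return y, h

y, h = simulate_arch_m(1000)
```

Because h_t = omega + alpha * eps_{t-1}^2, the simulated variance is bounded below by omega, and a positive delta induces higher expected returns in high-volatility spells.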

New techniques for the analysis of stochastic volatility models in which the logarithm of conditional variance follows an autoregressive model are developed. A cyclic Metropolis algorithm is used to construct a Markov-chain simulation tool. Simulations from this Markov chain converge in distribution to draws from the posterior distribution enabling exact finite-sample inference. The exact solution to the filtering/smoothing problem of inferring about the unobserved variance states is a by-product of our Markov-chain method. In addition, multistep-ahead predictive densities can be constructed that reflect both inherent model variability and parameter uncertainty. We illustrate our method by analyzing both daily and weekly data on stock returns and exchange rates. Sampling experiments are conducted to compare the performance of Bayes estimators to method of moments and quasi-maximum likelihood estimators proposed in the literature. In both parameter estimation and filtering, the Bayes estimators outperform these other approaches.
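The model class analyzed above — a log-variance following an AR(1) — can be simulated directly, which is the usual starting point for sampling experiments of this kind. A minimal sketch with illustrative parameter values (not the authors' cyclic Metropolis sampler):

```python
import numpy as np

def simulate_sv(n, mu=-1.0, phi=0.95, sigma_eta=0.2, seed=1):
    """Simulate the canonical stochastic volatility model:
    y_t = exp(h_t / 2) * eps_t,
    h_t = mu + phi * (h_{t-1} - mu) + sigma_eta * eta_t,
    with eps_t, eta_t independent standard normals."""
    rng = np.random.default_rng(seed)
    h = np.empty(n)
    # draw h_0 from the stationary distribution of the AR(1) log-variance
    h[0] = mu + rng.standard_normal() * sigma_eta / np.sqrt(1 - phi ** 2)
    for t in range(1, n):
        h[t] = mu + phi * (h[t - 1] - mu) + sigma_eta * rng.standard_normal()
    y = np.exp(h / 2) * rng.standard_normal(n)
    return y, h

y, h = simulate_sv(500)
```

The persistence parameter phi close to one produces the volatility clustering that both the Metropolis-based and quasi-maximum likelihood estimators in the abstract are designed to recover.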

A maximum likelihood approach for the analysis of stochastic volatility models is developed. The method uses a recursive numerical integration procedure that directly calculates the marginal likelihood. Only conventional integration techniques are used, making this approach both flexible and simple. Experimentation shows that the method matches the performance of the best estimation tools currently in use. New stochastic volatility models are introduced and estimated. The model that best fits recent stock-index data is characterized by a highly non-Gaussian stochastic volatility innovation distribution. This model dominates a model that includes an autoregressive conditional heteroscedastic effect in the stochastic volatility process and a model that includes a stochastic volatility effect in the conditional mean.

In this paper we provide methods for estimating non-Gaussian time series models. These techniques rely on Markov chain Monte Carlo to carry out simulation smoothing and Bayesian posterior analysis of parameters, and on importance sampling to estimate the likelihood function for classical inference. The time series structure of the models is used to ensure that our simulation algorithms are efficient.

This new edition updates Durbin & Koopman's important text on the state space approach to time series analysis. The distinguishing feature of state space time series models is that observations are regarded as made up of distinct components such as trend, seasonal, regression elements and disturbance terms, each of which is modelled separately. The techniques that emerge from this approach are very flexible and are capable of handling a much wider range of problems than the main analytical system currently in use for time series analysis, the Box-Jenkins ARIMA system. Additions to this second edition include the filtering of nonlinear and non-Gaussian series. Part I of the book obtains the mean and variance of the state, of a variable intended to measure the effect of an interaction and of regression coefficients, in terms of the observations. Part II extends the treatment to nonlinear and non-normal models. For these, analytical solutions are not available so methods are based on simulation.
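The linear Gaussian core of the state space treatment is the Kalman filter. As a hedged illustration (not the book's code), here is the filter for the simplest structural model, the local level, with a large initial variance standing in for the diffuse initialization discussed in the text:

```python
import numpy as np

def kalman_filter_local_level(y, sigma2_eps, sigma2_eta, a0=0.0, p0=1e7):
    """Kalman filter for the local-level model
    y_t = alpha_t + eps_t,  alpha_{t+1} = alpha_t + eta_t,
    returning filtered state means and variances. A very large p0
    approximates a diffuse prior on the initial state."""
    n = len(y)
    a = np.empty(n)   # filtered state mean
    p = np.empty(n)   # filtered state variance
    a_pred, p_pred = a0, p0
    for t in range(n):
        f = p_pred + sigma2_eps          # prediction-error variance
        k = p_pred / f                   # Kalman gain
        v = y[t] - a_pred                # one-step prediction error
        a[t] = a_pred + k * v
        p[t] = p_pred * (1 - k)
        a_pred = a[t]                    # state transition is the identity
        p_pred = p[t] + sigma2_eta
    return a, p

y = np.array([1.0, 1.2, 0.8, 1.1, 0.9])
a, p = kalman_filter_local_level(y, sigma2_eps=0.5, sigma2_eta=0.1)
```

With a near-diffuse p0, the first filtered mean essentially equals the first observation, after which the gain settles toward its steady-state value.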

This chapter evaluates the most important theoretical developments in ARCH-type modeling of time-varying conditional variances. The coverage includes the specification of univariate parametric ARCH models, general inference procedures, conditions for stationarity and ergodicity, continuous-time methods, aggregation and forecasting of ARCH models, multivariate conditional covariance formulations, and the use of model selection criteria in an ARCH context. Additionally, the chapter contains a discussion of the empirical regularities pertaining to the temporal variation in financial market volatility. Motivated in part by recent results on optimal filtering, a new conditional variance model for better characterizing stock return volatility is also presented.

Recently suggested procedures for simulating from the posterior density of states given a Gaussian state space time series are refined and extended. We introduce and study the simulation smoother, which draws from the multivariate posterior distribution of the disturbances of the model, so avoiding the degeneracies inherent in state samplers. The technique is important in Gibbs sampling with non-Gaussian time series models, and for performing Bayesian analysis of Gaussian time series.

This paper discusses and documents the algorithms of SsfPack 2.2. SsfPack is a suite of C routines for carrying out computations involving the statistical analysis of univariate and multivariate models in state space form. The emphasis is on documenting the link we have made to the Ox computing environment. SsfPack allows for a full range of different state space forms: from a simple time-invariant model to a complicated time-varying model. Functions can be used which put standard models such as ARMA and cubic spline models in state space form. Basic functions are available for filtering, moment smoothing and simulation smoothing. Ready-to-use functions are provided for standard tasks such as likelihood evaluation, forecasting and signal extraction. We show that SsfPack can be easily used for implementing, fitting and analysing Gaussian models relevant to many areas of econometrics and statistics. Some Gaussian illustrations are given.

We use the modified Black-Scholes model and a random variance option pricing model to study prices of European currency options traded in Geneva. The options, which cannot be exercised early, include calls and puts on the dollar/Swiss franc exchange rate. In the empirical analysis, we examine the model fit and the biases with respect to the strike price, time to maturity, and volatility. There is some evidence of mispricing and there are small gains available by trading with the random variance model.

In this paper, we examine the pricing of European call options on stocks that have variance rates that change randomly. We study continuous time diffusion processes for the stock return and the standard deviation parameter, and we find that one must use the stock and two options to form a riskless hedge. The riskless hedge does not lead to a unique option pricing function because the random standard deviation is not a traded security. One must appeal to an equilibrium asset pricing model to derive a unique option pricing function. In general, the option price depends on the risk premium associated with the random standard deviation. We find that the problem can be simplified by assuming that volatility risk can be diversified away and that changes in volatility are uncorrelated with the stock return. The resulting solution is an integral of the Black-Scholes formula and the distribution function for the variance of the stock price. We show that accurate option prices can be computed via Monte Carlo simulations and we apply the model to a set of actual prices.
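The integral-of-Black-Scholes representation described above lends itself to Monte Carlo: simulate the variance path, average it over the option's life, and plug the resulting volatility into the Black-Scholes formula. A sketch under illustrative assumptions (a simple mean-reverting variance process truncated at zero, made-up parameters; not the authors' exact specification):

```python
import numpy as np
from math import log, sqrt, exp, erf

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(s, k, r, sigma, T):
    """Black-Scholes European call price."""
    d1 = (log(s / k) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return s * norm_cdf(d1) - k * exp(-r * T) * norm_cdf(d2)

def sv_call_mc(s, k, r, v0, kappa, theta, xi, T,
               n_steps=50, n_paths=2000, seed=0):
    """Monte Carlo call price when volatility risk is diversifiable and
    uncorrelated with the stock: average the Black-Scholes price over
    the distribution of each path's time-averaged variance."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    v = np.full(n_paths, v0)
    v_sum = np.zeros(n_paths)
    for _ in range(n_steps):
        v_sum += v * dt
        v = np.maximum(v + kappa * (theta - v) * dt
                       + xi * np.sqrt(v * dt) * rng.standard_normal(n_paths),
                       0.0)
    mean_vol = np.sqrt(v_sum / T)  # per-path average volatility
    return float(np.mean([bs_call(s, k, r, sig, T) for sig in mean_vol]))

price = sv_call_mc(100, 100, 0.05, 0.04, 2.0, 0.04, 0.3, 1.0)
```

With the variance started at its long-run level (vol around 20%), the simulated price sits close to the corresponding flat-volatility Black-Scholes value, as the abstract's integral representation suggests.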

Diffusion models for volatility have been used to price options, while ARCH models predominate in descriptive studies of asset volatility. This paper compares a discrete-time approximation of a popular diffusion model with ARCH models. These volatility models have many similarities, but the models make different assumptions about how the magnitude of price responses to information alters volatility and the amount of subsequent information. Several volatility models are estimated for daily DM/$ exchange rates from 1978 to 1990. Copyright 1994 Blackwell Publishers.

The stochastic volatility model is used to estimate daily asset price dynamics. The model is estimated by integrating latent volatility out of the joint density of prices and volatility to obtain the marginal density of prices. Due to the high dimension of the integral, no conventional integration technique is applicable. A Monte Carlo method, called simulated maximum likelihood, is used to obtain the marginal density, where the latent variable is simulated conditional on available information. The model is estimated on 2,022 observations from the S&P 500 index. For comparison, ARCH-type models are estimated with the same data.

This paper discusses the Monte Carlo maximum likelihood method of estimating stochastic volatility (SV) models. The basic SV model can be expressed as a linear state space model with log chi-square disturbances. The likelihood function can be approximated arbitrarily accurately by decomposing it into a Gaussian part, constructed by the Kalman filter, and a remainder function, whose expectation is evaluated by simulation. No modifications of this estimation procedure are required when the basic SV model is extended in a number of directions likely to arise in applied empirical research. This compares favorably with alternative approaches. The finite sample performance of the new estimator is shown to be comparable to the Monte Carlo Markov chain (MCMC) method.

Attempts to quantify the relationship between stock returns and volatility have produced conflicting conclusions in recent U.S. studies. Our paper examines this issue in the U.K. context for the first time using daily, weekly, fortnightly and monthly returns on the Financial Times All Share Index from January 1965 to December 1989. Volatility estimates are obtained from monthly sample variances and ARCH models. Expected returns are shown to have had a positive, though not statistically significant, relationship with expected volatility. The relationship between the unexpected components of the returns and volatility series is less clear: we find evidence for a negative relationship but only when volatility expectations are represented by standard deviations.

State space models are considered for observations which have non-Gaussian distributions. We obtain accurate approximations to the loglikelihood for such models by Monte Carlo simulation. Devices are introduced which improve the accuracy of the approximations and which increase computational efficiency. The loglikelihood function is maximised numerically to obtain estimates of the unknown hyperparameters. Standard errors of the estimates due to simulation are calculated. Details are given for the important special cases where the observations come from an exponential family distribution and where the observation equation is linear but the observation errors are non-Gaussian. The techniques are illustrated with a series for which the observations have a Poisson distribution and a series for which the observation errors have a t-distribution.

It appears that volatility in equity markets is asymmetric: returns and conditional volatility are negatively correlated. We provide a unified framework to simultaneously investigate asymmetric volatility at the firm and the market level and to examine two potential explanations of the asymmetry: leverage effects and volatility feedback. Our empirical application uses the market portfolio and portfolios with different leverage constructed from Nikkei 225 stocks. We reject the pure leverage model of Christie (1982) and find support for a volatility feedback story. Volatility feedback at the firm level is enhanced by strong asymmetries in conditional covariances. Conditional betas do not show significant asymmetries. We document the risk premium implications of these findings. Article published by Oxford University Press on behalf of the Society for Financial Studies in its journal, The Review of Financial Studies.

This paper develops a new method for the analysis of stochastic volatility (SV) models. Since volatility is a latent variable in SV models, it is difficult to evaluate the exact likelihood. In this paper, a non-linear filter which yields the exact likelihood of SV models is employed. Solving a series of integrals in this filter by piecewise linear approximations with randomly chosen nodes produces the likelihood, which is maximized to obtain estimates of the SV parameters. A smoothing algorithm for volatility estimation is also constructed. Monte Carlo experiments show that the method performs well with respect to both parameter estimates and volatility estimates. We illustrate the method by analysing daily stock returns on the Tokyo Stock Exchange. Since the method can be applied to more general models, the SV model is extended so that several characteristics of daily stock returns are allowed, and this more general model is also estimated.

Many economic and financial time series have been found to exhibit dynamics in variance; that is, the second moment of the time series innovations varies over time. Many possible model specifications are available to capture this phenomenon, but to date, the class of models most widely used is that of autoregressive conditional heteroskedasticity (ARCH) models. ARCH models provide parsimonious approximations to volatility dynamics and have found wide use in macroeconomics and finance. The family of ARCH models is the subject of this paper. In Section II, we sketch the rudiments of a rather general univariate time-series model, allowing for dynamics in both the conditional mean and variance. In Section III, we provide motivation for the models. In Section IV, we discuss the properties of the models in depth, and in Section V, we discuss issues related to estimation and testing. In Section VI, we detail various important extensions and applications of the model. We conclude in Section VII with speculations on productive directions for future research.

In this paper, Markov chain Monte Carlo sampling methods are exploited to provide a unified, practical likelihood-based framework for the analysis of stochastic volatility models. A highly effective method is developed that samples all the unobserved volatilities at once using an approximating offset mixture model, followed by an importance reweighting procedure. This approach is compared with several alternative methods using real data. The paper also develops simulation-based methods for filtering, likelihood evaluation and model failure diagnostics. The issue of model choice using non-nested likelihood ratios and Bayes factors is also investigated. These methods are used to compare the fit of stochastic volatility and GARCH models. All the procedures are illustrated in detail.

The authors find support for a negative relation between conditional expected monthly return and conditional variance of monthly return using a GARCH-M model modified by allowing (1) seasonal patterns in volatility, (2) positive and negative innovations to returns having different impacts on conditional volatility, and (3) nominal interest rates to predict conditional variance. Using the modified GARCH-M model, they also show that monthly conditional volatility may not be as persistent as was thought. Positive unanticipated returns appear to result in a downward revision of the conditional volatility, whereas negative unanticipated returns result in an upward revision of conditional volatility. Copyright 1993 by American Finance Association.

This paper analyzes the relation of stock volatility with real and nominal macroeconomic volatility, economic activity, financial leverage, and stock trading activity using monthly data from 1857 to 1987. An important fact, previously noted by Robert R. Officer (1973), is that stock return variability was unusually high during the 1929-39 Great Depression. While aggregate leverage is significantly correlated with volatility, it explains a relatively small part of the movements in stock volatility. The amplitude of the fluctuations in aggregate stock volatility is difficult to explain using simple models of stock valuation, especially during the Great Depression. Copyright 1989 by American Finance Association.

One option-pricing problem which has hitherto been unsolved is the pricing of a European call on an asset which has a stochastic volatility. This paper examines this problem. The option price is determined in series form for the case in which the stochastic volatility is independent of the stock price. Numerical solutions are also produced for the case in which the volatility is correlated with the stock price. It is found that the Black-Scholes price frequently overprices options and that the degree of overpricing increases with the time to maturity. Copyright 1987 by American Finance Association.

The aim of this survey paper is to provide an account of some of the important developments in the autoregressive conditional heteroskedasticity (ARCH) model since its inception in a seminal paper by Engle (1982). This model takes account of many observed properties of asset prices, and therefore, various interpretations can be attributed to it. We start with the basic ARCH models and discuss their different interpretations. ARCH models have been generalized in different directions to accommodate more and more features of the real world. We provide a comprehensive treatment of many of the extensions of the original ARCH model. Next we discuss estimation and testing for ARCH models and note that these models lead to some interesting and unique problems. There have been numerous applications and we mention some of these as we present different models. The paper includes a glossary of the acronyms for the models we describe. Copyright 1993 by Blackwell Publishers Ltd

It seems plausible that an increase in stock market volatility raises required stock returns, and thus lowers stock prices. We develop a formal model of this volatility feedback effect using a simple model of changing variance (a quadratic generalized autoregressive conditionally heteroskedastic, or QGARCH, model). Our model is asymmetric and helps to explain the negative skewness and excess kurtosis of U.S. monthly and daily stock returns over the period 1926–1988. We find that volatility feedback normally has little effect on returns, but it can be important during periods of high volatility.

This paper characterizes the rate of convergence of discrete-time multinomial option prices. We show that the rate of convergence depends on the smoothness of option payoff functions, and is much lower than commonly believed because option payoff functions are often of all-or-nothing type and are not continuously differentiable. To improve the accuracy, we propose two simple methods, an adjustment of the discrete-time solution prior to maturity and smoothing of the payoff function, which yield solutions that converge to their continuous-time limit at the maximum possible rate enjoyed by smooth payoff functions. We also propose an intuitive approach that systematically derives multinomial models by matching the moments of a normal distribution. A highly accurate trinomial model also is provided for interest rate derivatives. Numerical examples are carried out to show that the proposed methods yield fast and accurate results. Copyright Blackwell Publishers, Inc. 2000.
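For intuition on the convergence being analyzed, a standard Cox-Ross-Rubinstein binomial pricer (an illustrative baseline, not the authors' adjusted or smoothed schemes) shows the discrete-time price approaching the continuous-time Black-Scholes limit as the number of steps grows:

```python
import numpy as np
from math import exp, sqrt

def crr_call(s, k, r, sigma, T, n):
    """Cox-Ross-Rubinstein binomial price of a European call; converges to
    the Black-Scholes value as n grows, at a rate governed by the smoothness
    of the payoff (the point the abstract makes for multinomial trees)."""
    dt = T / n
    u = exp(sigma * sqrt(dt))
    d = 1.0 / u
    q = (exp(r * dt) - d) / (u - d)        # risk-neutral up probability
    disc = exp(-r * dt)
    # terminal stock prices and payoffs
    st = s * u ** np.arange(n, -1, -1) * d ** np.arange(0, n + 1)
    v = np.maximum(st - k, 0.0)
    # backward induction through the tree
    for _ in range(n):
        v = disc * (q * v[:-1] + (1 - q) * v[1:])
    return float(v[0])

coarse = crr_call(100, 100, 0.05, 0.2, 1.0, 50)
fine = crr_call(100, 100, 0.05, 0.2, 1.0, 800)
```

For these at-the-money parameters the Black-Scholes value is about 10.45; the 800-step tree sits much closer to it than the 50-step tree, and the kinked payoff is exactly why naive refinement converges slowly for discontinuous (all-or-nothing) claims.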

This paper examines the relation between stock returns and stock market volatility. We find evidence that the expected market risk premium (the expected return on a stock portfolio minus the Treasury bill yield) is positively related to the predictable volatility of stock returns. There is also evidence that unexpected stock market returns are negatively related to the unexpected change in the volatility of stock returns. This negative relation provides indirect evidence of a positive relation between expected risk premiums and volatility.

The dangers of shouting ``fire'' in a crowded theater are well understood, but the dangers of rushing to the exit in the financial markets are more complex. Yet, the two events share several features, and I analyze why people crowd into theaters and trades, why they run, what determines the risk, whether to return to the theater or trade when the dust settles, and how much to pay for assets (or tickets) in light of this risk. These theoretical considerations shed light on the recent global liquidity crisis and, in particular, the quant event of 2007.

This paper introduces an ARCH model (exponential ARCH) that (1) allows correlation between returns and volatility innovations (an important feature of stock market volatility changes), (2) eliminates the need for inequality constraints on parameters, and (3) allows for a straightforward interpretation of the "persistence" of shocks to volatility. In the above respects, it is an improvement over the widely-used GARCH model. The model is applied to study volatility changes and the risk premium on the CRSP Value-Weighted Market Index from 1962 to 1987. Copyright 1991 by The Econometric Society.

Methods for the systematic application of Monte Carlo integration with importance sampling to Bayesian inference are developed. Conditions under which the numerical approximation converges almost surely to the true value with the number of Monte Carlo replications, and its numerical accuracy may be assessed reliably, are given. Importance sampling densities are derived from multivariate normal or Student-t approximations to the posterior density. These densities are modified by automatic rescaling along each axis. The concept of relative numerical efficiency is introduced to evaluate the adequacy of a chosen importance sampling density. Applications in two illustrative models are presented. Copyright 1989 by The Econometric Society.
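The core procedure — Monte Carlo integration with self-normalized importance weights drawn from a normal approximation to the posterior — can be sketched as follows. The relative numerical efficiency diagnostic is approximated here by the effective-sample-size ratio, an illustrative proxy rather than the paper's exact estimator:

```python
import numpy as np

def importance_sampling_mean(log_target, prop_mean, prop_std, n=20000, seed=0):
    """Estimate the mean of an unnormalized target density by importance
    sampling from a normal proposal, with self-normalized weights.
    Returns the weighted mean and an ESS-based efficiency ratio."""
    rng = np.random.default_rng(seed)
    x = rng.normal(prop_mean, prop_std, n)
    # log proposal density, constants dropped (they cancel after normalizing)
    log_q = -0.5 * ((x - prop_mean) / prop_std) ** 2 - np.log(prop_std)
    log_w = log_target(x) - log_q
    w = np.exp(log_w - log_w.max())       # stabilize before normalizing
    w /= w.sum()
    mean = float(np.sum(w * x))
    ess = 1.0 / float(np.sum(w ** 2))     # effective sample size
    return mean, ess / n                  # efficiency in (0, 1]

# target: standard normal log density up to a constant; deliberately
# mismatched, over-dispersed proposal
mean, eff = importance_sampling_mean(lambda x: -0.5 * x ** 2, 0.5, 1.5)
```

An over-dispersed proposal keeps the weights bounded; a proposal with thinner tails than the target would make the weight variance explode, which is precisely what the efficiency diagnostic is meant to flag.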

We use a multivariate generalized autoregressive conditional heteroskedasticity (M-GARCH) model to examine three stock indexes and their associated futures prices: the New York Stock Exchange Composite, S&P 500, and Toronto 35. The North American context is significant because markets in Canada and the United States share similar structures and regulatory environments. Our model allows examination of dependence in volatility as it captures time variation in volatility and cross-market influences. Estimated time variation in volatility is significant and the volatilities are highly positively correlated. Yet, we find that the correlation in North American index and futures markets has declined over time.

A natural generalization of the ARCH (Autoregressive Conditional Heteroskedastic) process introduced in Engle (1982) to allow for past conditional variances in the current conditional variance equation is proposed. Stationarity conditions and autocorrelation structure for this new class of parametric models are derived. Maximum likelihood estimation and testing are also considered. Finally an empirical example relating to the uncertainty of the inflation rate is presented.
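The proposed generalization, GARCH(1,1), adds the lagged conditional variance to the ARCH recursion, so today's variance responds to both yesterday's squared shock and yesterday's variance. A minimal simulation with illustrative parameter values:

```python
import numpy as np

def simulate_garch11(n, omega=0.05, alpha=0.1, beta=0.85, seed=0):
    """Simulate a GARCH(1,1) process:
    h_t = omega + alpha * eps_{t-1}^2 + beta * h_{t-1}.
    Covariance stationarity requires alpha + beta < 1."""
    rng = np.random.default_rng(seed)
    h = np.empty(n)
    eps = np.empty(n)
    h[0] = omega / (1 - alpha - beta)     # start at unconditional variance
    eps[0] = np.sqrt(h[0]) * rng.standard_normal()
    for t in range(1, n):
        h[t] = omega + alpha * eps[t - 1] ** 2 + beta * h[t - 1]
        eps[t] = np.sqrt(h[t]) * rng.standard_normal()
    return eps, h

eps, h = simulate_garch11(2000)
```

With alpha + beta = 0.95 the variance process is highly persistent, reproducing the volatility clustering that motivates the stationarity and autocorrelation results in the abstract.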

The analysis of non-Gaussian time series using state space models is considered from both classical and Bayesian perspectives. The treatment in both cases is based on simulation using importance sampling and antithetic variables; Monte Carlo Markov chain methods are not employed. Non-Gaussian disturbances for the state equation as well as for the observation equation are considered. Methods for estimating conditional and posterior means of functions of the state vector given the observations, and the mean square errors of their estimates, are developed. These methods are extended to cover the estimation of conditional and posterior densities and distribution functions. Choice of importance sampling densities and antithetic variables is discussed. The techniques work well in practice and are computationally efficient. Their use is illustrated by applying to a univariate discrete time series, a series with outliers and a volatility series.