May 2025 · 2 Reads · 1 Citation · Journal of Financial Economics
December 2024 · 10 Reads
This paper provides a solution to the evaluation of treatment effects in selective samples when neither instruments nor parametric assumptions are available. We provide sharp bounds for average treatment effects under a conditional monotonicity assumption for all principal strata, i.e., units characterizing the complete intensive and extensive margins. Most importantly, we allow for a large share of units whose selection is indifferent to treatment, e.g., due to non-compliance. The existence of such a population is crucially tied to the regularity of sharp population bounds, so conventional asymptotic inference for methods such as Lee bounds can be misleading. This problem can be resolved by basing inference on smoothed outer identification regions. We provide semiparametrically efficient debiased machine learning estimators for both regular and smooth bounds that can accommodate high-dimensional covariates and flexible functional forms. Our study of active labor market policy reveals the empirical prevalence of the aforementioned indifference population and supports results from previous impact analyses under much weaker assumptions.
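For context on the baseline the abstract generalizes, below is a minimal sketch of classical Lee (2009) trimming bounds for the average treatment effect on the always-selected population, assuming selection is weakly increasing in treatment. The function name lee_bounds and the variable names are illustrative; the paper's refinements (indifferent units, smoothed outer regions, debiased machine learning with covariates) are not reproduced here.

```python
# Illustrative Lee-type trimming bounds (not the paper's estimator).
import numpy as np

def lee_bounds(d, s, y):
    """d: 0/1 treatment, s: 0/1 selection, y: outcome (used only where s == 1)."""
    p1 = s[d == 1].mean()                     # selection rate under treatment
    p0 = s[d == 0].mean()                     # selection rate under control
    q = (p1 - p0) / p1                        # excess selection share to trim (monotonicity)

    y1 = y[(d == 1) & (s == 1)]               # selected outcomes, treated arm
    y0_mean = y[(d == 0) & (s == 1)].mean()   # point-identified control mean

    # Trimming the top (bottom) q-fraction of treated outcomes gives the lower
    # (upper) bound on E[Y(1) | always selected]; subtracting the control mean
    # bounds the average treatment effect on that stratum.
    lower = y1[y1 <= np.quantile(y1, 1 - q)].mean() - y0_mean
    upper = y1[y1 >= np.quantile(y1, q)].mean() - y0_mean
    return lower, upper
```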
August 2023 · 5 Reads · Journal of Banking & Finance
March 2023 · 1 Read · 1 Citation · Econometric Theory
We consider a sequential treatment problem with covariates. Given a realization of the covariate vector, instead of targeting the treatment with highest conditional expectation, the decision-maker targets the treatment which maximizes a general functional of the conditional potential outcome distribution, e.g., a conditional quantile, trimmed mean, or a socioeconomic functional such as an inequality, welfare, or poverty measure. We develop expected regret lower bounds for this problem and construct a near minimax optimal sequential assignment policy.
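As a purely illustrative companion to this description, the sketch below picks, for a given covariate vector, the arm with the largest locally estimated trimmed mean rather than the conditional mean; the nearest-neighbor estimator, the function name choose_arm, and the tuning constants are assumptions for exposition, not the near minimax optimal policy constructed in the paper.

```python
# Hypothetical covariate-dependent targeting rule: maximize a local trimmed
# mean (a robust functional of the conditional outcome distribution).
import numpy as np
from scipy.stats import trim_mean

def choose_arm(x, history, k=50, trim=0.1):
    """history: dict mapping arm -> list of (covariate vector, outcome) pairs."""
    scores = {}
    for arm, obs in history.items():
        X = np.array([c for c, _ in obs])
        Y = np.array([o for _, o in obs])
        nearest = np.argsort(np.linalg.norm(X - x, axis=1))[:k]  # k closest covariate vectors
        scores[arm] = trim_mean(Y[nearest], trim)                # local robust target
    return max(scores, key=scores.get)
```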
January 2023 · 16 Reads · 1 Citation · SSRN Electronic Journal
September 2022 · 18 Reads · 2 Citations · Journal of Econometrics
We study the problem of a decision maker who must provide the best possible treatment recommendation based on an experiment. The desirability of the outcome distribution resulting from the policy recommendation is measured through a functional capturing the distributional characteristic that the decision maker is interested in optimizing. This could be, e.g., its inherent inequality, welfare, level of poverty, or its distance to a desired outcome distribution. If the functional of interest is not quasi-convex or if there are constraints, the optimal recommendation may be a mixture of treatments, which vastly expands the set of recommendations that must be considered. We characterize the difficulty of the problem by obtaining maximal expected regret lower bounds. Furthermore, we propose two (near) regret-optimal policies. The first policy is static and thus applicable regardless of whether subjects arrive sequentially during the experimentation phase. The second policy exploits sequential arrival by successively eliminating inferior treatments, thus concentrating the sampling effort where it is most needed.
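To give a concrete, hedged impression of the second policy's elimination idea, the sketch below drops arms whose estimated quantile is confidently below the current leader, using a DKW-style band on the empirical distribution function; the function name successive_elimination, the confidence construction, and the constants are illustrative assumptions, not the paper's construction.

```python
# Illustrative successive elimination targeting a quantile of the outcome
# distribution; surviving arms keep receiving samples, eliminated arms do not.
import numpy as np

def successive_elimination(draw, n_arms, tau=0.5, rounds=20, batch=100, delta=0.05):
    """draw(arm, n) -> array of n outcomes; tau is the targeted quantile level."""
    active = list(range(n_arms))
    samples = {a: np.empty(0) for a in active}
    for _ in range(rounds):
        for a in active:                                    # sample every surviving arm
            samples[a] = np.append(samples[a], draw(a, batch))
        n = len(samples[active[0]])
        eps = np.sqrt(np.log(2 * n_arms * rounds / delta) / (2 * n))  # CDF band half-width
        lcb = {a: np.quantile(samples[a], max(tau - eps, 0.0)) for a in active}
        ucb = {a: np.quantile(samples[a], min(tau + eps, 1.0)) for a in active}
        best_lcb = max(lcb.values())
        active = [a for a in active if ucb[a] >= best_lcb]  # drop clearly inferior arms
        if len(active) == 1:
            break
    return max(active, key=lambda a: np.quantile(samples[a], tau))
```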
September 2022 · 25 Reads · 37 Citations · Journal of Econometrics
We develop a GMM approach for estimation of log-normal stochastic volatility models driven by a fractional Brownian motion with unrestricted Hurst exponent. We show that a parameter estimator based on the integrated variance is consistent and, under stronger conditions, asymptotically normally distributed. We inspect the behavior of our procedure when integrated variance is replaced with a noisy measure of volatility calculated from discrete high-frequency data. The realized estimator contains sampling error, which skews the fractal coefficient toward “illusive roughness.” We construct an analytical approach to control the impact of measurement error without introducing nuisance parameters. In a simulation study, we demonstrate convincing small sample properties of our approach based on both integrated and realized variance over the entire memory spectrum. We show that the bias correction attenuates any systematic deviation in the parameter estimates. Our procedure is applied to empirical high-frequency data from numerous leading equity indexes. With our robust approach, the Hurst index is estimated around 0.05, confirming roughness in stochastic volatility.
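The snippet below is not the paper's GMM procedure; it is the simpler scaling-regression diagnostic familiar from the rough-volatility literature (Gatheral, Jaisson, and Rosenbaum), included only to make the Hurst exponent concrete: the second-order variogram of log volatility grows like the lag raised to the power 2H, so the slope of a log-log regression estimates 2H.

```python
# Scaling-regression estimate of the Hurst exponent of a log-volatility path
# (a diagnostic sketch, not the GMM estimator described in the abstract).
import numpy as np

def hurst_from_log_vol(log_vol, lags=range(1, 21)):
    m2 = [np.mean((log_vol[h:] - log_vol[:-h]) ** 2) for h in lags]   # variogram at each lag
    slope, _ = np.polyfit(np.log(list(lags)), np.log(m2), 1)          # slope is roughly 2H
    return slope / 2.0
```

Applied directly to realized rather than latent variance, an estimate of this kind inherits exactly the measurement-error bias the abstract warns about, which is what the paper's analytical correction addresses.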
June 2022 · 327 Reads · 91 Citations · Journal of Financial Econometrics
We inspect how accurate machine learning (ML) is at forecasting realized variance of the Dow Jones Industrial Average index constituents. We compare several ML algorithms, including regularization, regression trees, and neural networks, to multiple heterogeneous autoregressive (HAR) models. ML is implemented with minimal hyperparameter tuning. In spite of this, ML is competitive and beats the HAR lineage, even when the only predictors are the daily, weekly, and monthly lags of realized variance. The forecast gains are more pronounced at longer horizons. We attribute this to higher persistence in the ML models, which helps to approximate the long memory of realized variance. ML also excels at locating incremental information about future volatility from additional predictors. Lastly, we propose an ML measure of variable importance based on accumulated local effects. This shows that while there is agreement about the most important predictors, there is disagreement on their ranking, helping to reconcile our results.
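A minimal sketch of the comparison described above: a HAR regression and an off-the-shelf learner fitted to the same three predictors (daily, weekly, and monthly realized-variance lags). The simulated data, the random-forest choice, and the train/test split are placeholders; the paper's full design (many assets, several horizons, extra predictors, ALE-based importance) is not reproduced.

```python
# HAR (OLS on daily/weekly/monthly RV lags) versus a random forest on the
# same features, evaluated out of sample on simulated placeholder data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
log_rv = np.zeros(1500)
for t in range(1, 1500):                    # persistent AR(1) proxy for log realized variance
    log_rv[t] = 0.97 * log_rv[t - 1] + 0.25 * rng.normal()
rv = np.exp(log_rv - 9)

idx = np.arange(22, len(rv))
X = np.column_stack([
    rv[idx - 1],                                           # daily lag
    np.array([rv[t - 5:t].mean() for t in idx]),           # weekly average
    np.array([rv[t - 22:t].mean() for t in idx]),          # monthly average
])
y = rv[idx]

split = len(idx) // 2
har = LinearRegression().fit(X[:split], y[:split])          # HAR = OLS on the three lags
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X[:split], y[:split])

for name, model in [("HAR", har), ("Random forest", rf)]:
    mse = np.mean((model.predict(X[split:]) - y[split:]) ** 2)
    print(f"{name} out-of-sample MSE: {mse:.3e}")
```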
November 2020 · 22 Reads · 16 Citations
Consider a setting in which a policy maker assigns subjects to treatments, observing each outcome before the next subject arrives. Initially, it is unknown which treatment is best, but the sequential nature of the problem permits learning about the effectiveness of the treatments. While the multi-armed bandit literature has shed much light on the situation when the policy maker compares the effectiveness of the treatments through their mean, much less is known about other targets. This is restrictive, because a cautious decision maker may prefer to target a robust location measure such as a quantile or a trimmed mean. Furthermore, socio-economic decision making often requires targeting purpose-specific characteristics of the outcome distribution, such as its inherent degree of inequality, welfare, or poverty. In the present paper, we introduce and study sequential learning algorithms when the distributional characteristic of interest is a general functional of the outcome distribution. Minimax expected regret optimality results are obtained within the subclass of explore-then-commit policies, and for the unrestricted class of all policies.
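To fix ideas, here is a hedged sketch of an explore-then-commit policy for a robust distributional target (a trimmed mean): a fixed exploration budget is split evenly across arms, after which the policy commits to the arm with the best estimated functional. The exploration fraction, the functional, and the name explore_then_commit are illustrative choices, not the optimal tuning derived in the paper.

```python
# Explore-then-commit with a trimmed-mean target instead of the sample mean.
import numpy as np
from scipy.stats import trim_mean

def explore_then_commit(draw, n_arms, horizon, explore_frac=0.2, trim=0.1):
    """draw(arm) -> one outcome; returns the committed arm and all outcomes."""
    n_explore = max(1, int(explore_frac * horizon) // n_arms)         # pulls per arm in exploration
    explored = {a: [draw(a) for _ in range(n_explore)] for a in range(n_arms)}
    best = max(explored, key=lambda a: trim_mean(explored[a], trim))  # robust commit rule
    remaining = horizon - n_arms * n_explore
    committed = [draw(best) for _ in range(remaining)]                # exploitation phase
    return best, explored, committed
```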
October 2020 · 42 Reads
In this paper, we develop a generalized method of moments approach for joint estimation of the parameters of a fractional log-normal stochastic volatility model. We show that, with an arbitrary Hurst exponent, an estimator based on integrated variance is consistent. Moreover, under stronger conditions we also derive a central limit theorem. These results stand even when integrated variance is replaced with a realized measure of volatility calculated from discrete high-frequency data. However, in practice a realized estimator contains sampling error, the effect of which is to skew the fractal coefficient toward "roughness". We construct an analytical approach to control this error. In a simulation study, we demonstrate convincing small sample properties of our approach based on both integrated and realized variance over the entire memory spectrum. We show that the bias correction attenuates any systematic deviation in the estimated parameters. Our procedure is applied to empirical high-frequency data from numerous leading equity indexes. With our robust approach, the Hurst index is estimated around 0.05, confirming roughness in integrated variance.
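The self-contained simulation below illustrates the mechanism described here rather than the paper's bias correction: measurement error added to log volatility flattens its small-lag variogram, dragging a naive scaling estimate of the Hurst exponent below its true value, i.e., toward spurious roughness. The true H of 0.5 and the noise scale are arbitrary choices for the demonstration.

```python
# Demonstration of how measurement error biases a naive roughness estimate.
import numpy as np

def fbm(n, H, rng):
    """Fractional Brownian motion on 1..n via Cholesky of its covariance."""
    t = np.arange(1, n + 1, dtype=float)[:, None]
    cov = 0.5 * (t ** (2 * H) + t.T ** (2 * H) - np.abs(t - t.T) ** (2 * H))
    return np.linalg.cholesky(cov) @ rng.standard_normal(n)

def hurst_scaling(x, lags=range(1, 21)):
    m2 = [np.mean((x[h:] - x[:-h]) ** 2) for h in lags]
    return np.polyfit(np.log(list(lags)), np.log(m2), 1)[0] / 2

rng = np.random.default_rng(1)
log_vol = 0.3 * fbm(2000, H=0.5, rng=rng)            # smooth benchmark: true H = 0.5
noisy = log_vol + 0.2 * rng.standard_normal(2000)    # stand-in for realized-measure error

print("H estimate, latent volatility :", round(hurst_scaling(log_vol), 3))
print("H estimate, noisy measurement:", round(hurst_scaling(noisy), 3))   # comes out lower
```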
... Additionally, the findings of Christensen et al. (2023) based on high-frequency data indicate limited price discovery in the days following earnings announcements. However, their findings are confined to 50 liquid firms due to the extensive size of high-frequency data, while we look at all US stocks and aim to capture the speed of price discovery using daily data. ...
January 2023 · SSRN Electronic Journal
... Inequality SWFs have been studied in Kasy (2016), Kitagawa and Tetenov (2021) or Kock et al. (2023). A popular SWF for an outcome Y is W = E[Y](1 − G(Y)). ...
September 2022 · Journal of Econometrics
... The SV framework has inspired extensive methodological developments: Tauchen and Pitts [19] and Taylor [20] pioneered the application of stochastic principles to financial volatility modeling; Chib, Nardari, and Shephard [21] advanced Bayesian estimation techniques for high-dimensional multivariate SV models with time-varying correlations; Jensen and Maheu [22] introduced a semiparametric Bayesian approach incorporating Markov chain Monte Carlo methods to address distributional uncertainty; Fernández-Villaverde, Guerrón-Quintana, and Rubio-Ramírez [23] developed computationally efficient particle filtering algorithms tailored for large-scale SV models. Recent innovations continue to expand the SV paradigm, as evidenced by contributions from Rømer [24], Yazdani, Hadizadeh, and Fakoor [25], Bolko, Christensen, Pakkanen et al. [26], and Chan [27], among others. Notwithstanding these advancements, SV models remain computationally intensive, particularly for parameter estimation and short-term forecasting. ...
September 2022 · Journal of Econometrics
... Researchers must tackle multiple moving elements to predict market volatility with confidence. Traditional approaches to predicting market volatility have struggled because financial markets remain unpredictable and volatile (Rouf et al., 2021; Christensen et al., 2023). As a result, there has been growing interest in employing machine learning to enhance volatility forecasting, since, as Zhang and Lei (2022) explain, it has proved more effective than prior methods. ...
June 2022 · Journal of Financial Econometrics
... M.) introduced in 2000 the Multifractal Random Walk (MRW) model [21,22,28], which formally corresponds to H → 0+. Later, this super-rough regime was also observed in real data [12,13,23,28,29], as clarified recently in [24] (more on this later, and see also [3]). ...
January 2020 · SSRN Electronic Journal
... Athey et al. (2022) and Qin and Russo (2024) assess the trade-off between the in-sample welfare and the super-population welfare of best-arm identification and study how to balance them out. Recent advances in bandit algorithms in the econometrics literature include those studied in Adusumilli (2021); Dimakopoulou et al. (2017); Kock et al. (2022); Kuang and Wager (2023), to list but a few relevant papers. Application and feasible implementation of EXP4.P algorithms to policy learning have, to our knowledge, not been studied in the policy learning literature. ...
November 2020
... Such limit theorems are in the same direction as those of this paper, conceptually. Recently, theories of asymptotic expansion have been developed and applied: Yoshida [37,39], Podolskij and Yoshida [21], Podolskij et al. [19,20], Nualart and Yoshida [18], Yamagishi and Yoshida [32]. ...
Reference: Asymptotic expansion for batched bandits
August 2020 · The Annals of Applied Probability
... Vetter (2015) develops empirical estimations of integrated volatility of volatility, and Bull (2017) uses a wavelet-thresholding approach. Ebner et al. (2018) apply Fourier inference to stochastic volatility models. Christensen et al. (2019) build a new test based on a nonparametric estimator of the empirical distribution function of stochastic variance and, more recently, Li et al. (2021) consider tests where jumps are present. ...
June 2019 · Journal of Econometrics
... Although some of the measures introduced above can partially mitigate the impact of microstructure, their reliance on high-frequency data still exposes them to microstructure noise (Christensen et al., 2017). Consequently, the techniques for smoothing out noise can also be used in realised volatility measures. ...
December 2016 · Journal of Econometrics
... Zhang et al. (2011) even considered the presence of microstructure noise when deriving Edgeworth expansions for realized volatility and other microstructure-noise-robust estimators. Hounyo and Veliyev (2016) established the formal validity of Edgeworth expansions for the realized volatility estimators given in the above references. In this paper, we develop the theory of Edgeworth expansion for a spot volatility estimator and use it to construct corrected confidence intervals that refine conventional confidence intervals based on the normal approximation. ...
February 2016 · Econometrics Journal