Source publication
We propose and evaluate a technique for instrumental variables estimation of linear models with conditional heteroskedasticity. The technique uses approximating parametric models for the projection of right-hand side variables onto the instrument space, and for conditional heteroskedasticity and serial correlation of the disturbance. Use of paramet...
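To fix ideas, the following minimal Python sketch mimics the spirit of this technique: it fits a simple parametric model for the projection of the right-hand-side variable onto the instrument space, fits a simple parametric model for the conditional variance of the disturbance, and uses the resulting variance-weighted fitted value as the instrument. The data-generating process and both parametric specifications are illustrative assumptions, not the authors' exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 2000
z = rng.standard_normal(T)                       # basic instrument
v = rng.standard_normal(T)
u = np.sqrt(0.5 + 0.5 * z**2) * rng.standard_normal(T) + 0.8 * v   # heteroskedastic, endogenous
x = 1.0 + 0.7 * z + v                            # right-hand-side variable, correlated with u
y = 2.0 + 1.5 * x + u                            # true slope is 1.5
X = np.column_stack([np.ones(T), x])

# Step 1: parametric projection of x onto the instrument space (here: a constant and z).
Z1 = np.column_stack([np.ones(T), z])
x_hat = Z1 @ np.linalg.lstsq(Z1, x, rcond=None)[0]

# Step 2: preliminary IV fit, then a parametric model (linear in z^2) for the
# conditional variance of the disturbance, fitted to the squared IV residuals.
Zp = np.column_stack([np.ones(T), x_hat])
u_hat = y - X @ np.linalg.solve(Zp.T @ X, Zp.T @ y)
H = np.column_stack([np.ones(T), z**2])
h_hat = np.clip(H @ np.linalg.lstsq(H, u_hat**2, rcond=None)[0], 1e-3, None)

# Step 3: IV estimation with the variance-weighted fitted value as instrument.
W = np.column_stack([np.ones(T) / h_hat, x_hat / h_hat])
print("IV estimate of (constant, slope):", np.linalg.solve(W.T @ X, W.T @ y))
```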
Similar publications
As a consequence of the adoption of the Manila Amendments to the STCW Convention and Code, the ECDIS model course would need to be reviewed and updated. Accordingly, three Polish maritime academies/universities present a common position on the steps necessary to revise and update the existing IMO model course on the Operational Use of ECDIS. Note...
Citations
... This method addresses potential endogeneity concerns, where explanatory variables may be correlated with the error term, leading to inconsistent or biased estimates (Wooldridge, Wadud, & Lye, 2016). The use of lagged values mitigates reverse-causality concerns and ensures that the direction of influence runs from the independent variables to the dependent variable (West, Wong, & Anatolyev, 2009). Moreover, lagged variables account for dynamic effects, capturing the influence of past values of the explanatory variables on current outcomes (Arellano & Bond, 1991). ...
Using data from Chinese A-share listed enterprises from 2007 to 2023, this study employs a two-way fixed effects model to examine the effects of internal control quality, CEO duality, ownership concentration, and financial distress on corporate fraud. The benchmark results indicate that higher internal control quality mitigates the risk of corporate fraud, while CEO duality, ownership concentration, and financial distress increase this risk. Robustness checks utilizing lagged variables confirm these findings. Heterogeneity analysis reveals that in highly leveraged firms, CEO duality and ownership concentration significantly increase fraud risk, whereas internal control quality reduces it. In low-leverage firms, internal control quality reduces fraud, while CEO duality and financial distress increase fraud risk. Analysis based on business-cycle heterogeneity shows the importance of robust internal controls in both fast and slow cycles, with varied effects of CEO duality and ownership concentration. Industry analysis indicates that internal controls are crucial in both heavily and less polluted industries, with ownership concentration and financial distress having significant impacts in less polluted sectors. Policymakers should mandate stricter internal control requirements and regular audits to ensure compliance and effectiveness.
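As a concrete illustration of the lagged-variable robustness check described above, the sketch below lags the explanatory variables by one period within each firm before re-estimating a two-way fixed effects regression. The simulated panel, the variable names (fraud, icq, duality), and the clustering choice are hypothetical stand-ins, not the study's actual data or specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_firms, n_years = 50, 10
df = pd.DataFrame({
    "firm": np.repeat(np.arange(n_firms), n_years),
    "year": np.tile(np.arange(n_years), n_firms),
})
df["icq"] = rng.standard_normal(len(df))                  # internal control quality (hypothetical)
df["duality"] = rng.integers(0, 2, len(df))               # CEO duality dummy (hypothetical)
df["fraud"] = -0.3 * df["icq"] + 0.2 * df["duality"] + rng.standard_normal(len(df))

# Lag the explanatory variables within each firm so current outcomes cannot feed back.
df = df.sort_values(["firm", "year"])
df["icq_l1"] = df.groupby("firm")["icq"].shift(1)
df["duality_l1"] = df.groupby("firm")["duality"].shift(1)
est_df = df.dropna()

# Two-way fixed effects via firm and year dummies, with standard errors clustered by firm.
fit = smf.ols("fraud ~ icq_l1 + duality_l1 + C(firm) + C(year)", data=est_df).fit(
    cov_type="cluster", cov_kwds={"groups": est_df["firm"]})
print(fit.params[["icq_l1", "duality_l1"]])
```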
... Gali and Gertler, 1999; Ravenna and Walsh, 2006; Murray, 2006; Gali et al., 2007), I use the lags of variables as instruments. As has been discussed by West et al. (2009), the basis for using lags of variables as instruments is that, usually, if a given variable is a legitimate instrument, then so are its lags. Furthermore, the use of lags of variables helps overcome the possibility of endogeneity and bi-directional causality: it is unlikely that the current value of the dependent variable affects the past values of the variables (namely, the past cannot be caused by the future). ...
Labor-income and consumption taxes are often referred to as the primary causes of the labor wedge and of differences in hours worked across countries. While this can potentially be true in the long run, its premise for explaining the cyclical behavior of the labor wedge is questionable. Using US data over 1955–2019, this paper first studies whether taxation explains the cyclical behavior of the labor wedge. It is shown that the tax wedge, which combines both types of taxes, fails to account for the countercyclicality of the labor wedge. I then study other factors that may raise the labor wedge during recessions, such as credit frictions on the firms' side and price markups, and find that credit frictions are the primary reason for this behavior. The empirical findings are consistent with the model-based results; the model with credit frictions successfully generates countercyclical behavior of the labor wedge, whereas the model without credit frictions does not.
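The instrument-validity logic invoked in the passage above (if a given variable is a legitimate instrument, then so are its lags, and the past cannot be caused by the present) can be illustrated with a minimal two-stage least squares sketch in which an endogenous regressor is instrumented by its own lags; this works here because the disturbance is serially uncorrelated. The data-generating process and lag choices are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
T, rho, beta = 5000, 0.8, 1.0
e = rng.standard_normal(T)
u = 0.6 * e + rng.standard_normal(T)        # contemporaneous correlation => endogenous regressor
x = np.zeros(T)
for t in range(1, T):
    x[t] = rho * x[t - 1] + e[t]
y = beta * x + u

# Lags 1 and 2 of x are uncorrelated with the serially uncorrelated u_t, hence valid instruments.
Y = y[2:]
X = np.column_stack([np.ones(T - 2), x[2:]])
Z = np.column_stack([np.ones(T - 2), x[1:-1], x[:-2]])

Xhat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]              # first stage
b_2sls = np.linalg.lstsq(Xhat, Y, rcond=None)[0]             # second stage
b_ols = np.linalg.lstsq(X, Y, rcond=None)[0]
print("OLS slope (inconsistent): %.3f   2SLS slope: %.3f" % (b_ols[1], b_2sls[1]))
```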
... for known observed variables w but unknown d(·), Newey (1990) shows that the optimal instrument can be constructed via nonparametric estimation of d(w) using nearest-neighbour and series approximation methods. In time series settings, the form of the optimal instrument depends additionally on the dynamic structure of the data (see also Hayashi and Sims, 1983; Hansen et al., 1988; Heaton & Ogaki, 1991; Anatolyev, 2003; West et al., 2009). While attractive in principle, economic models often do not specify the aspects of the data generation process needed to construct the optimal instrument. ...
The 2013 Nobel Prize for Economics was awarded to Eugene Fama, Lars Hansen and Robert Shiller for their work on empirical asset pricing. Hansen's primary contribution to the cited work was the development of the generalised method of moments (GMM), a statistical method that has proved such a valuable tool for testing the validity of empirical asset pricing models. The public announcement of the award also acknowledges the wider impact of GMM on empirical analysis in economics and beyond, referring to the 1982 Econometrica paper in which Hansen introduced the method as ‘one of the most influential in econometrics’. In this paper, we reflect on how the GMM-based inference framework has evolved since 1982, reviewing developments on four main issues: model diagnostic testing, moment selection, identification and inference in misspecified models. We also illustrate the broader influence of GMM on econometrics by briefly exploring the connections between GMM and three other estimation methods: indirect inference, moment inequality based techniques, and a group of techniques that can be presented equivalently within either the generalised empirical likelihood or info-metric frameworks.
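For readers who want the basic GMM machinery referenced above in executable form, here is a compact two-step GMM estimator for a linear instrumental variables model together with Hansen's J test of the overidentifying restrictions. The simulated design (one regressor, three instruments) is an illustrative assumption and is not drawn from the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
T = 3000
Zmat = rng.standard_normal((T, 3))                       # three instruments
v = rng.standard_normal(T)
x = Zmat @ np.array([0.8, 0.5, 0.3]) + v
u = 0.5 * v + rng.standard_normal(T)                     # endogeneity through v
y = 1.5 * x + u
X = x[:, None]

def linear_gmm(y, X, Z, W):
    # b = (X'Z W Z'X)^{-1} X'Z W Z'y
    return np.linalg.solve(X.T @ Z @ W @ Z.T @ X, X.T @ Z @ W @ Z.T @ y)

W1 = np.linalg.inv(Zmat.T @ Zmat / T)                    # first-step weight (2SLS-type)
b1 = linear_gmm(y, X, Zmat, W1)
g = Zmat * (y - X @ b1)[:, None]                         # moment contributions at the first step
W2 = np.linalg.inv(g.T @ g / T)                          # efficient second-step weight
b2 = linear_gmm(y, X, Zmat, W2)

gbar = (Zmat * (y - X @ b2)[:, None]).mean(axis=0)
J = T * gbar @ W2 @ gbar                                 # Hansen J statistic
dof = Zmat.shape[1] - X.shape[1]
print("two-step GMM slope: %.3f  J = %.2f  p-value = %.3f" % (b2[0], J, 1 - stats.chi2.cdf(J, dof)))
```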
... In this case, the quarter-of-birth variables may be considered as "main" instruments and the interactions may be considered as other instruments. Another example arises in the situation considered by West et al. (2009). Suppose that we consider a (possibly misspecified) linear (in parameters) model for the relationship between the endogenous regressors and instruments. ...
This paper proposes the shrinkage generalized method of moments estimator to address the "many moment conditions" problem in the estimation of conditional moment restriction models. This estimator is obtained as the minimizer of the function constructed by modifying the GMM objective function, such that we shrink the effect of a subset of moment conditions that are less important and used only for efficiency. We provide the closed form of the shrinkage parameter that minimizes the asymptotic mean squared error. A simulation study shows encouraging results.
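One way to read the shrinkage idea described above is sketched below: a block of "efficiency only" moment conditions is scaled by a factor lam in [0, 1] inside the GMM objective, so that lam = 0 uses only the main moment and lam = 1 reproduces standard one-step GMM. The scaling scheme, the fixed first-step weight, and the grid of lam values are illustrative assumptions; the paper itself derives the MSE-minimizing shrinkage parameter in closed form.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(4)
T = 1000
z_main = rng.standard_normal(T)
z_extra = rng.standard_normal((T, 2))                    # weak "efficiency only" instruments
x = 0.9 * z_main + 0.1 * z_extra.sum(axis=1) + rng.standard_normal(T)
y = 1.5 * x + rng.standard_normal(T)
Z = np.column_stack([z_main, z_extra])
W = np.linalg.inv(Z.T @ Z / T)                           # fixed first-step weight
D = np.diag([1.0, 0.0, 0.0])                             # selects the main moment

def shrinkage_gmm(lam):
    A = D + lam * (np.eye(3) - D)                        # scale the extra moments by lam
    def Q(b):
        gbar = (Z * (y - b * x)[:, None]).mean(axis=0)
        return gbar @ (A @ W @ A) @ gbar
    return minimize_scalar(Q).x

for lam in (0.0, 0.5, 1.0):
    print("lam = %.1f  ->  slope estimate = %.3f" % (lam, shrinkage_gmm(lam)))
```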
... In addition to the LGMM estimator, we consider the OLS estimator and the infeasible GLS estimator, which uses the true skedastic function as GLS weights. Furthermore, we compare the LGMM estimator to two other efficient estimators: the optimal instrumental variables estimator for heteroskedastic AR models (Kuersteiner, 2002; West, Wong and Anatolyev, 2009) and Carrasco and Florens' (2000) estimator. The mean bias, median bias, standard deviation, and root mean squared error (RMSE) for all estimators are reported in Tables 1 and 2 for sample sizes T = 100 and 500, respectively. As expected, the OLS estimator is characterized by a substantial downward bias that seems to increase with the degree of conditional heteroskedasticity. ...
This paper investigates statistical properties of the local GMM (LGMM) estimator for some time series models defined by conditional moment restrictions. First, we consider Markov processes with possible conditional heteroskedasticity of unknown form and establish the consistency, asymptotic normality, and semi-parametric efficiency of the estimator. Second, inspired by simulation results showing that the LGMM estimator has a significantly smaller bias than the OLS estimator, we undertake a higher-order asymptotic expansion and analyze the bias properties of the LGMM estimator. The structure of the asymptotic expansion of the LGMM estimator reveals an interesting contrast with the OLS estimator that helps to explain the bias reduction in the LGMM estimator. The practical importance of these findings is evaluated in terms of a bond and option pricing exercise based on a diffusion model for spot interest rate.
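The flavor of the Monte Carlo comparison described in the citing passage can be reproduced in a few lines: an AR(1) with ARCH(1) errors is estimated by OLS and by infeasible GLS that weights with the true skedastic function, and bias and RMSE are tabulated. The design parameters are illustrative; the LGMM and optimal-IV estimators themselves are not implemented here.

```python
import numpy as np

rng = np.random.default_rng(5)
T, R, rho = 100, 2000, 0.6
ols = np.empty(R)
gls = np.empty(R)
for r in range(R):
    y, h, eps = np.zeros(T), np.ones(T), np.zeros(T)
    for t in range(1, T):
        h[t] = 0.3 + 0.6 * eps[t - 1] ** 2               # ARCH(1) skedastic function
        eps[t] = np.sqrt(h[t]) * rng.standard_normal()
        y[t] = rho * y[t - 1] + eps[t]
    x, yy, ht = y[:-1], y[1:], h[1:]
    ols[r] = (x @ yy) / (x @ x)                          # OLS
    gls[r] = ((x / ht) @ yy) / ((x / ht) @ x)            # infeasible GLS with true 1/h_t weights
for name, est in (("OLS", ols), ("infeasible GLS", gls)):
    print("%-15s bias % .4f   RMSE %.4f" % (name, est.mean() - rho, np.sqrt(((est - rho) ** 2).mean())))
```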
... A convenient tool is the optimality condition (5). Generally, employing the linear subclass of instruments delivers efficiency gains, often substantial, compared to the use of the basic instrument or a finite number of its lags (Stambaugh, 1993; Kuersteiner, 2002), especially in multiperiod problems (Kuersteiner, 2001; West, Wong and Anatolyev, 2002). Sometimes, though, a special structure of conditional heteroskedasticity may eliminate any such gains. ...
... In both models, none of the conventional GMM J-tests rejects the null of correct specification at the 5% significance level. In both models, the conventional GMM and linearly optimal IV estimates turn out to be close, with the latter seeming more precise (though the confidence intervals for conventional GMM may not be reliable, as documented in Tauchen, 1986, and West, Wong and Anatolyev, 2002). The nonlinearly optimal IV estimates are even more precise, but differ in value quite sizably from the linear IV estimates. ...
This article surveys estimation in stationary time-series models using the approach of optimal instrumentation. We review tools that allow construction and implementation of optimal instrumental variables estimators in various circumstances - in single- and multiperiod models, in the absence and presence of conditional heteroskedasticity, by considering linear and non-linear instruments. We also discuss issues adjacent to the theme of optimal instruments. The article is directed primarily towards practitioners, but econometric theorists and teachers of graduate econometrics may also find it useful.
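A simple simulation conveys the efficiency point made in these passages: when the regressor loads on both the current and the lagged value of the basic instrument, an instrument formed as a linear combination of z_t and z_{t-1} (estimated here by a first-stage regression) yields a visibly smaller sampling spread than z_t alone. The data-generating process is an illustrative assumption, and no claim of full optimality is made for the combination used.

```python
import numpy as np

rng = np.random.default_rng(6)
T, R, beta = 400, 2000, 1.0
b_basic, b_comb = np.empty(R), np.empty(R)
for r in range(R):
    z = rng.standard_normal(T)
    v = rng.standard_normal(T)
    u = 0.7 * v + rng.standard_normal(T)                 # endogeneity through v
    x = 0.5 * z + 0.4 * np.roll(z, 1) + v
    y = beta * x + u
    x, y, zc, zl = x[1:], y[1:], z[1:], np.roll(z, 1)[1:]
    b_basic[r] = (zc @ y) / (zc @ x)                     # IV with the basic instrument z_t only
    Z = np.column_stack([zc, zl])                        # z_t and z_{t-1}
    xhat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]      # estimated linear combination
    b_comb[r] = (xhat @ y) / (xhat @ x)
print("sampling s.d., basic instrument z_t:        %.4f" % b_basic.std())
print("sampling s.d., combination of z_t, z_{t-1}: %.4f" % b_comb.std())
```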
... This leads to tractable theories that allow one to construct feasible instruments that attain the asymptotic efficiency bound relative to the restricted space of instruments. This is done in Kuersteiner (2002) for conditionally heteroskedastic AR models, in Kuersteiner (2001) for ARMA models with conditionally heteroskedastic innovations, and in West, Wong and Anatolyev (2002) for more general stationary time series models. Generally, employing the subclass of instruments delivers efficiency gains, often substantial, compared to the use of initially given instruments or a finite number of their lags. ...
... The additional instruments that we use in comparisons are the basic instrument z_t implied by the OLS estimator, and the West-Wong-Anatolyev instrument (West, Wong and Anatolyev, 2002), which is optimal in the class of linear combinations of the present and past basic instruments, and thus attains the efficiency bound in the class of GMM estimators that use lags of the basic instrument as instruments. An interesting feature of the present example is the asymptotic equivalence of the West-Wong-Anatolyev instrument and the one that would be optimal if there were no conditional heteroskedasticity (the proof of this fact is in the aforementioned Appendix). ...
... There is sufficient evidence, however, that these suffer from a number of small-sample deficiencies, mainly due to the need to estimate the efficient weighting matrix; thus we do not consider such estimators here. For their detailed consideration in conditionally heteroskedastic environments, see Tauchen (1986) and West, Wong and Anatolyev (2002). ...
The form of the optimal instrument for general multiperiod conditional moment restrictions is highly nonlinear and usually cannot be solved analytically. We show how to construct instruments approximately satisfying the optimality conditions, evaluate asymptotic variances of corresponding instrumental variables estimators in specific examples, and verify their behavior in finite samples. The asymptotic properties of approximately optimal instruments are favorable, and the finite sample properties of their feasible versions are advantageous compared to competitors. We also illustrate the proposed method with an application to ultra-high frequency data.
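To see why the multiperiod case is harder, the sketch below sets up a moment condition whose disturbance is MA(1), as arises with overlapping forecast horizons: the slope is still estimated by ordinary IV with the basic instrument and its lag, but the variance requires a Newey-West correction matching the MA order. The data-generating process is an illustrative assumption, and the approximately optimal instrument construction of the paper is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(7)
T, beta = 3000, 1.0
z = rng.standard_normal(T)
e = rng.standard_normal(T)
v = rng.standard_normal(T)
u = e + 0.5 * np.roll(e, 1) + 0.6 * v                    # MA(1) disturbance, correlated with x via v
x = 0.8 * z + 0.3 * np.roll(z, 1) + v
y = beta * x + u

s = 2                                                    # discard observations affected by np.roll wraparound
yy, xx = y[s:], x[s:]
Z = np.column_stack([z[s:], np.roll(z, 1)[s:]])          # z_t and z_{t-1} as instruments
xhat = Z @ np.linalg.lstsq(Z, xx, rcond=None)[0]
b = (xhat @ yy) / (xhat @ xx)

# Newey-West long-run variance of the moment xhat_t * u_t with one lag (matching MA(1)).
g = xhat * (yy - b * xx)
n = len(g)
S = g @ g / n + 2 * 0.5 * (g[1:] @ g[:-1]) / n           # Bartlett weight 1 - 1/(1+1) = 0.5 at lag 1
var_b = S / (xhat @ xx / n) ** 2 / n
print("IV slope = %.3f   Newey-West s.e. = %.3f" % (b, np.sqrt(var_b)))
```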
We must infer what the future situation would be without our interference, and what changes will be wrought by our actions. Fortunately, or unfortunately, none of these processes is infallible, or indeed ever accurate and complete.
This paper analyzes the higher order asymptotic properties of Generalized Method of Moments (GMM) estimators for linear time series models using many lags as instruments. A data dependent moment selection method based on minimizing the approximate mean squared error is developed. In addition, a new version of the GMM estimator based on kernel weighted moment conditions is proposed. It is shown that kernel weighted GMM can reduce the asymptotic bias compared to standard GMM. Kernel weighting also helps to simplify the problem of selecting the optimal number of instruments. A feasible procedure similar to optimal bandwidth selection is proposed for the kernel weighted GMM estimator.
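The many-lags setting described in this abstract can be illustrated with a short Monte Carlo: an endogenous regressor is instrumented by twenty of its own lags, once by standard 2SLS and once by a one-step GMM whose moment conditions are downweighted at distant lags with Bartlett kernel weights. This is only a sketch of the weighting idea under a fixed identity-style weight; the paper's kernel-weighted estimator, its efficient weighting, and its data-dependent bandwidth selection are not reproduced, and all design parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(8)
T, R, beta, p, bw = 150, 1000, 1.0, 20, 6
kw = np.maximum(1 - np.arange(1, p + 1) / (bw + 1), 0.0)     # Bartlett weights for lags 1..p
b_2sls, b_kern = np.empty(R), np.empty(R)

for r in range(R):
    e = rng.standard_normal(T + p)
    u = 0.5 * e + rng.standard_normal(T + p)                 # disturbance correlated with x's innovation
    x = np.zeros(T + p)
    for t in range(1, T + p):
        x[t] = 0.5 * x[t - 1] + e[t]
    y = beta * x + u
    Z = np.column_stack([x[p - j - 1:T + p - j - 1] for j in range(p)])   # lags 1..p of x
    xx, yy = x[p:], y[p:]
    # standard 2SLS with all p lagged instruments
    xhat = Z @ np.linalg.lstsq(Z, xx, rcond=None)[0]
    b_2sls[r] = (xhat @ yy) / (xhat @ xx)
    # one-step GMM with kernel-weighted moment conditions
    mx, my = (Z * kw).T @ xx / T, (Z * kw).T @ yy / T
    b_kern[r] = (mx @ my) / (mx @ mx)

print("mean bias, 2SLS with %d lag instruments:   %+.4f" % (p, b_2sls.mean() - beta))
print("mean bias, kernel-weighted moments:        %+.4f" % (b_kern.mean() - beta))
```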