May 2019 · 33 Reads · 10 Citations
Federal Reserve Bank of Richmond Working Papers
May 2019 · 29 Reads · 1 Citation
We find disparate trend variations in TFP and labor growth across major U.S. production sectors and study their implications for the post-war secular decline in GDP growth. We describe how capital accumulation and the network structure of U.S. production interact to amplify the effects of sectoral trend growth rates in TFP and labor on trend GDP growth. We derive expressions that conveniently summarize this long-run amplification effect by way of sectoral multipliers. These multipliers are quantitatively large and for some sectors exceed three times their value added shares. We estimate that sector-specific factors have historically accounted for approximately 3/4 of long-run changes in GDP growth, leaving common or aggregate factors to explain only 1/4 of those changes. Trend GDP growth fell by nearly 3 percentage points over the post-war period, with the Construction sector alone contributing roughly 1 percentage point of that decline between 1950 and 1980. Idiosyncratic changes to trend growth in the Durable Goods sector then contributed an almost 2 percentage point decline in trend GDP growth between 2000 and 2018. Remarkably, no sector has contributed any steady, significant increase to the trend growth rate of GDP in the past 70 years.
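The amplification mechanism described above can be illustrated with a stylized input-output computation (a textbook Leontief-inverse sketch with invented shares, not the paper's estimated expressions, which also account for capital accumulation):

```python
import numpy as np

# Illustrative 3-sector economy (all shares invented for this sketch).
# beta: value-added shares; Gamma[i, j]: expenditure share of input j
# in sector i's production.
beta = np.array([0.5, 0.3, 0.2])
Gamma = np.array([
    [0.2, 0.1, 0.1],
    [0.1, 0.3, 0.1],
    [0.0, 0.2, 0.2],
])

# Domar-style long-run multipliers: lambda' = beta' (I - Gamma)^{-1}.
# A sector's multiplier exceeds its value-added share whenever other
# sectors use its output as an input; this is the network amplification
# channel the abstract refers to.
lam = beta @ np.linalg.inv(np.eye(3) - Gamma)

for i, (b, m) in enumerate(zip(beta, lam)):
    print(f"sector {i}: value-added share {b:.2f}, multiplier {m:.2f}")
```

In this stylized setting, a permanent 1 percentage point decline in sector i's trend TFP growth lowers trend GDP growth by roughly lam[i] percentage points, which is how a single sector such as Construction or Durable Goods can move aggregate trend growth by as much as the abstract reports.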
October 2018 · 35 Reads · 4 Citations
October 2018 · 74 Reads · 187 Citations
The classic papers by Newey and West (1987) and Andrews (1991) spurred a large body of work on how to improve heteroscedasticity- and autocorrelation-robust (HAR) inference in time series regression. This literature finds that using a larger-than-usual truncation parameter to estimate the long-run variance, combined with Kiefer-Vogelsang (2002, 2005) fixed-b critical values, can substantially reduce size distortions, at only a modest cost in (size-adjusted) power. Empirical practice, however, has not kept up. This article therefore draws on the post-Newey-West/Andrews literature to make concrete recommendations for HAR inference. We derive truncation parameter rules that choose a point on the size-power tradeoff to minimize a loss function. If Newey-West tests are used, we recommend the truncation parameter rule S = 1.3T^{1/2} and (nonstandard) fixed-b critical values. For tests of a single restriction, we find advantages to using the equal-weighted cosine (EWC) test, where the long-run variance is estimated by projections onto Type II cosines, using ν = 0.4T^{2/3} cosine terms; for this test, fixed-b critical values are, conveniently, t_ν or F. We assess these rules using first an ARMA/GARCH Monte Carlo design, then a dynamic factor model design estimated using 207 quarterly U.S. macroeconomic time series.
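As a concrete illustration of the EWC recommendation, here is a minimal sketch (my own illustrative implementation, not the authors' code) of an EWC t-test of a slope coefficient, using ν = 0.4T^{2/3} Type II cosine projections of the score and t_ν critical values:

```python
import numpy as np
from scipy import stats

def ewc_ttest(y, x):
    """EWC t-test of H0: beta = 0 in y_t = alpha + beta*x_t + u_t.

    Illustrative sketch of the equal-weighted cosine rule summarized
    above: nu = 0.4*T^(2/3) Type II cosine projections of the score,
    with t_nu used as the reference (fixed-b) distribution.
    """
    T = len(y)
    X = np.column_stack([np.ones(T), x])
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    u = y - X @ b
    z = (x - x.mean()) * u                       # score for the slope

    nu = max(1, round(0.4 * T ** (2 / 3)))       # number of cosine terms
    s = (np.arange(1, T + 1) - 0.5) / T
    Lam = np.array([np.sqrt(2.0 / T) * np.dot(np.cos(np.pi * j * s), z)
                    for j in range(1, nu + 1)])  # Type II cosine projections
    omega = np.mean(Lam ** 2)                    # long-run variance of z_t

    sx2 = np.mean((x - x.mean()) ** 2)
    se = np.sqrt(omega / T) / sx2                # HAR standard error of slope
    tstat = b[1] / se
    pval = 2 * stats.t.sf(abs(tstat), df=nu)     # fixed-b critical values: t_nu
    return tstat, pval, nu
```

Averaging the ν squared projections makes the variance estimate behave like Ω·χ²_ν/ν in large samples, which is why the t_ν reference distribution is exact in the limit rather than an ad hoc correction.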
May 2018 · 205 Reads · 524 Citations
The Economic Journal
External sources of as-if randomness — that is, external instruments — can be used to identify the dynamic causal effects of macroeconomic shocks. One method is a one-step instrumental variables regression (local projections-IV); a more efficient two-step method involves a vector autoregression. We show that, under a restrictive instrument validity condition, the one-step method is valid even if the vector autoregression is not invertible, so comparing the two estimates provides a test of invertibility. If, however, lagged endogenous variables are needed as control variables in the one-step method, then the conditions for validity of the two methods are the same.
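A minimal sketch of the one-step method (local projections-IV) as described in the abstract; variable names are illustrative, and lagged controls and HAR standard errors are omitted for brevity:

```python
import numpy as np

def lp_iv(y, policy, instrument, horizons=12):
    """One-step local projections - IV (illustrative sketch).

    For each horizon h, estimate y_{t+h} = beta_h * policy_t + error
    by 2SLS, instrumenting policy_t with the external instrument z_t.
    """
    betas = []
    for h in range(horizons + 1):
        yh = y[h:]                        # y_{t+h}
        p = policy[:len(policy) - h]      # policy_t
        z = instrument[:len(instrument) - h]
        # With one instrument and one endogenous regressor, 2SLS reduces
        # to a ratio of covariances: beta_h = cov(z, y_{t+h}) / cov(z, p).
        beta_h = np.cov(z, yh)[0, 1] / np.cov(z, p)[0, 1]
        betas.append(beta_h)
    return np.array(betas)                # dynamic causal effect by horizon
```

The two-step alternative discussed in the abstract would instead fit a VAR and trace out the impulse response; comparing the two estimates is what furnishes the invertibility test.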
November 2017 · 7 Reads · 25 Citations
Many questions in economics involve long-run or "trend" variation and covariation in time series. Yet, time series of typical lengths contain only limited information about this long-run variation. This paper suggests that long-run sample information can be isolated using a small number of low-frequency trigonometric weighted averages, which in turn can be used to conduct inference about long-run variability and covariability. Because the low-frequency weighted averages have large-sample normal distributions, large-sample valid inference can often be conducted using familiar small-sample normal inference procedures. Moreover, the general approach is applicable for a wide range of persistent stochastic processes that go beyond the familiar I(0) and I(1) models.

INTRODUCTION

This paper discusses inference about trends in economic time series. By "trend" we mean the low-frequency variability evident in a time series after forming moving averages such as low-pass (cf. Baxter and King, 1999) or Hodrick and Prescott (1997) filters. To measure this low-frequency variability we rely on projections of the series onto a small number of trigonometric functions (e.g., discrete Fourier, sine, or cosine transforms). The fact that a small number of projection coefficients capture low-frequency variability reflects the scarcity of low-frequency information in the data, leading to what is effectively a "small-sample" econometric problem. As we show, it is still relatively straightforward to conduct statistical inference using the small sample of low-frequency data summaries. Moreover, these low-frequency methods are appropriate for both weakly and highly persistent processes.

Before getting into the details, it is useful to fix ideas by looking at some data. Figure 1 plots per-capita GDP growth rates (panel A) and price inflation (panel B) for the United States using quarterly data from 1947 through 2014, where both are expressed in percentage points at an annual rate. The plots show the raw series and two "trends." The first trend was constructed using a band-pass moving average filter designed to pass cyclical components with periods longer than T/6 ≈ 11 years, and the second is the full-sample projection of the series onto a constant and twelve cosine functions with periods 2T/j for j = 1, …, 12, also designed to capture variability at periods longer than 11 years.
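The cosine projection described in this excerpt is straightforward to compute; the following sketch (an illustrative implementation, not the authors' replication code) reproduces the second "trend" construction, a regression on a constant and q = 12 cosines with periods 2T/j:

```python
import numpy as np

def low_freq_trend(x, q=12):
    """Project x_t onto a constant and q Type II cosines (periods 2T/j).

    Sketch of the low-frequency projection described above: with q = 12,
    the fitted values retain variability at periods longer than roughly
    T/6, i.e. about 11 years for the 1947-2014 quarterly samples shown
    in the paper's Figure 1.
    """
    T = len(x)
    s = (np.arange(1, T + 1) - 0.5) / T
    Psi = np.column_stack(
        [np.ones(T)] +
        [np.sqrt(2) * np.cos(np.pi * j * s) for j in range(1, q + 1)]
    )
    coef = np.linalg.lstsq(Psi, x, rcond=None)[0]  # q+1 projection coefficients
    return Psi @ coef                              # the fitted low-frequency "trend"
```

The q + 1 projection coefficients are the "small sample" the paper works with: inference about long-run variability is then carried out on these few, approximately normal summaries rather than on all T observations.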
June 2017 · 243 Reads · 53 Citations
U.S. output has expanded only slowly since the recession trough in 2009, even though the unemployment rate has essentially returned to a precrisis, normal level. We use a growth-accounting decomposition to explore explanations for the output shortfall, giving full treatment to cyclical effects that, given the depth of the recession, should have implied unusually fast growth. We find that the growth shortfall has almost entirely reflected two factors: the slow growth of total factor productivity, and the decline in labor force participation. Both factors reflect powerful adverse forces that are largely unrelated to the financial crisis and recession—and that were in play before the recession.
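The growth-accounting logic behind this decomposition can be sketched with the standard Cobb-Douglas identity (a stylized textbook version, not the authors' exact implementation, which treats cyclical effects in much more detail):

```python
def growth_accounting(dy, dk, dl, alpha=0.35):
    """Cobb-Douglas growth accounting (stylized textbook sketch).

    With Y = A * K^alpha * L^(1-alpha), log-differencing gives
        dlog Y = dlog A + alpha * dlog K + (1 - alpha) * dlog L,
    so TFP growth is measured as the residual.  alpha = 0.35 is a
    conventional capital share, not the paper's calibration.
    """
    da = dy - alpha * dk - (1 - alpha) * dl
    return {"tfp": da, "capital": alpha * dk, "labor": (1 - alpha) * dl}

# Example with made-up growth rates (percent per year):
print(growth_accounting(dy=1.5, dk=2.0, dl=0.5))
```

In the paper's application, the labor term is further decomposed so that the decline in labor force participation can be separated from cyclical employment effects; the abstract's conclusion is that the TFP and participation pieces, not crisis-related terms, account for the shortfall.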
May 2017 · 203 Reads · 72 Citations
Journal of Economic Perspectives
This review tells the story of the past 20 years of time series econometrics through ten pictures. These pictures illustrate six broad areas of progress in time series econometrics: estimation of dynamic causal effects; estimation of dynamic structural models with optimizing agents (specifically, dynamic stochastic equilibrium models); methods for exploiting information in "big data" that are specialized to economic time series; improved methods for forecasting and for monitoring the economy; tools for modeling time variation in economic relationships; and improved methods for statistical inference. Taken together, the pictures show how 20 years of research have improved our ability to undertake our professional responsibilities. These pictures also remind us of the close connection between econometric theory and the empirical problems that motivate the theory, and of how the best econometric theory tends to arise from practical empirical problems.
January 2017 · 26 Reads · 94 Citations
Brookings Papers on Economic Activity
U.S. output has expanded only slowly since the recession trough in 2009, even though the unemployment rate has essentially returned to a precrisis, normal level. We use a growth-accounting decomposition to explore explanations for the output shortfall, giving full treatment to cyclical effects that, given the depth of the recession, should have implied unusually fast growth. We find that the growth shortfall has almost entirely reflected two factors: the slow growth of total factor productivity, and the decline in labor force participation. Both factors reflect powerful adverse forces that are largely unrelated to the financial crisis and recession—and that were in play before the recession.
April 2016 · 225 Reads · 138 Citations
American Economic Review
The US economy has performed better when the president of the United States is a Democrat rather than a Republican, almost regardless of how one measures performance. For many measures, including real GDP growth (our focus), the performance gap is large and significant. This paper asks why. The answer is not found in technical time series matters nor in systematically more expansionary monetary or fiscal policy under Democrats. Rather, it appears that the Democratic edge stems mainly from more benign oil shocks, superior total factor productivity (TFP) performance, a more favorable international environment, and perhaps more optimistic consumer expectations about the near-term future. (JEL D72, E23, E32, E65, N12, N42).
... Separately, Barrios et al. (2012), Müller and Watson (2024), and Conley and Kelly (2025) demonstrate that applying standard variance estimators to spatially correlated data often yields spurious findings. The simulation experiments conducted in Section 2 generate results that are consistent with this view. ...
January 2024
Econometrica
... In this study, the coefficient of variation (CV) was used to estimate and compare the lithological units with homogeneous/heterogeneous magnetic data sources, as given by Müller and Watson (2022) in Equation 21: CV = (σ/μ) × 100%, where σ is the standard deviation of the magnetic data set and μ is the mean of the magnetic data set. According to Singh (2001) and Fotheringham et al. (2024), CV < 60% indicates low variability (homogeneous) while CV > 60% implies high variability (heterogeneous). ...
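For concreteness, the CV rule quoted in this snippet amounts to the following small computation (a sketch; the 60% threshold is the one cited above):

```python
import numpy as np

def cv_classify(values, threshold=60.0):
    """Coefficient of variation, CV = 100 * sigma / mu (percent).

    Classification rule from the snippet above: CV < 60% indicates low
    variability (homogeneous); CV > 60% indicates high variability
    (heterogeneous).
    """
    cv = 100.0 * np.std(values) / np.mean(values)
    return cv, ("homogeneous" if cv < threshold else "heterogeneous")
```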
January 2022
Econometrica
... There are high-frequency financial variables and low-frequency macroeconomic variables. Classifying these variables according to their frequency is a common approximation in the time series econometrics literature (see, e.g., Bańbura et al. (2010), Müller and Watson (2015), Engle (2000)). We follow this approach. ...
November 2017
... It is, however, worth noting that the extension of the theoretical insights from the location model to regression models requires that x_t ε_t is stationary and the partial sum of x_t^2 is roughly linear (T^{-1} Σ_{t=1}^{⌊rT⌋} x_t^2 ≈ r σ_x^2 for 0 ≤ r ≤ 1), which may be implausible in some applications. In those situations, I refer the readers to, for instance, Müller and Watson (2023) and Ibragimov and Müller (2010), respectively, for valid inference approaches. ...
Reference: Optimal HAR inference
September 2022
... Converting fields such that 2 years out of every 3 are planted with corn has the potential to increase SOC stocks by 172.9 million Mg C. While that figure is 723.8 million Mg C less than the gain obtained through conversion to corn mono-cropping, it nevertheless gives another indication of the estimated potential to increase carbon sequestration. If this result is multiplied by a mean Social Cost of Carbon estimate of $678/t C [33], the total benefits are estimated at over $117 billion. However, carbon sequestration is not the only societal goal, as food security, the financial health of farmers, and other environmental challenges are also important concerns. ...
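The dollar figure quoted above follows directly from the stated quantities; a quick arithmetic check (treating Mg C and t C as the same metric tonne of carbon):

```python
# Arithmetic check for the figures quoted in the snippet above.
sequestration_mg_c = 172.9e6   # additional carbon stock, Mg C
scc_per_t_c = 678.0            # mean Social Cost of Carbon, $ per t C
benefit = sequestration_mg_c * scc_per_t_c
print(f"${benefit / 1e9:.1f} billion")  # ~ $117.2 billion, matching the text
```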
September 2022
Nature
... To study the demand for labor, capital, energy, and nonenergy intermediate inputs in the Quebec manufacturing industry, we use data on unit prices and quantities of each factor within each manufacturing SME. Firm databases often do not include quantities and prices of factors and output, but only expenditures on inputs and revenues derived from output sales (Foerster et al. 2022). Data on input prices and quantities per manufacturing SME are constructed by incorporating bond yields using the calculation technique proposed by Harper et al. (1989) and by assuming perfect competition in markets, which allows us to treat the prices of manufacturing inputs/outputs as those facing firms, so that firms buy inputs and sell their outputs at these prices. ...
May 2022
Journal of Political Economy
... The empirical validity of the Phillips curve, however, has been debated. Influential studies such as Ball et al. (2011) and Stock and Watson (2020) have questioned its robustness, arguing that structural changes in the economy, such as globalization, labor market rigidities, and a higher degree of inflation anchoring, have weakened the relationship between inflation and economic slack. More recent studies, such as Hazell et al. (2022) and Ball and Mazumder (2019), have sought to revive the Phillips curve by incorporating nonlinearities and state-dependent effects in its modelling and estimation. ...
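For reference, the relationship whose stability is under debate is usually written as an expectations-augmented (New Keynesian) Phillips curve; this is a textbook formulation, not the specification of any one study cited above:

```latex
% Textbook expectations-augmented Phillips curve:
%   \pi_t : inflation;  \mathbb{E}_t[\pi_{t+1}] : expected inflation
%   x_t   : economic slack (e.g., the unemployment gap)
\pi_t = \beta\,\mathbb{E}_t[\pi_{t+1}] - \kappa\,x_t + \varepsilon_t
```

Stronger anchoring of expectations and a flatter slope κ are the two channels through which the studies above argue the inflation-slack link has weakened.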
December 2020
Journal of Money Credit and Banking
... We consider the projections described in , obtained from Bayesian hierarchical models and expert elicitations. Müller et al. (2022) and Raftery and Ševčíková (2023) present detailed descriptions of the GDP per capita and population models, respectively. To obtain these projections, we utilize the dataset . ...
October 2020
Review of Economics and Statistics
... The financial econometrics literature on volatility is largely silent on the links between asset return volatility and its determinants; instead, the focus is on modeling volatility rather than the underlying macroeconomic factors (Bollerslev et al., 2010). Volatility is one of the most important quantities for an investor, and much economic and financial theory depends on it (Park & Linton, 2012). Asymmetry in volatility is rare in emerging and frontier markets; asymmetry in correlations concerns the Hungarian stock market; and the relationship between volatility and correlations is positive and significant in the majority of countries. ...
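As a concrete instance of the volatility modeling this snippet refers to, a minimal GARCH(1,1) variance recursion (a textbook sketch in the spirit of the literature cited, with placeholder parameters rather than estimates):

```python
import numpy as np

def garch11_variance(returns, omega=1e-6, alpha=0.08, beta=0.9):
    """Conditional variance recursion of a GARCH(1,1) model:

        sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}

    Parameters here are illustrative placeholders; in practice they
    would be estimated by maximum likelihood.
    """
    sigma2 = np.empty(len(returns))
    sigma2[0] = np.var(returns)  # initialize at the sample variance
    for t in range(1, len(returns)):
        sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2
```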
March 2010
... These include panel data (Almuzara and Sancibrian, 2024), nonlinear specifications (Caravello and Martinez-Bruera, 2024; Gonçalves, Herrera, Kilian, and Pesavento, 2024), simultaneous confidence bands (Montiel Olea and Plagborg-Møller, 2019), variance decompositions (Plagborg-Møller and Wolf, 2022), and certain more technical structural shock identification schemes (Uhlig, 2005; Baumeister and Hamilton, 2015). For brevity, we touch only briefly on proxy or instrumental variable identification, and we abstract from weak instrument issues, even though these are likely to be important in practice (Montiel Olea, Stock, and Watson, 2021). Excellent reviews of LPs, VARs, and the relationship between them include Kilian and Lütkepohl (2017), Stock and Watson (2018), Baumeister and Hamilton (2024), and Jordà and Taylor (2025). ...
August 2020
Journal of Econometrics