Article
PDF available

Abstract

Claims that the parameters of an econometric model are invariant under changes in either policy rules or expectations processes entail super exogeneity and encompassing implications. Super exogeneity is always potentially refutable, and when both implications are involved, the Lucas critique is also refutable. We review the methodological background; the applicability of the Lucas critique; super exogeneity tests; the encompassing implications of feedback and feedforward models; and the role of incomplete information. The approach is applied to money demand in the U.S.A. to examine constancy, exogeneity, and encompassing, and reveals the Lucas critique to be inapplicable to the model under analysis.
http://www.tandfonline.com/doi/abs/10.1080/07474939208800238
... See Bodkin, Klein and Marwah (1991) for a comprehensive history of macroeconometric model building. See also Favero and Hendry (1992). Models built for India are, by and large, based on the Keynesian framework, thus underplaying the role of the supply side. ...
... Johansen's procedure for testing cointegration is based on estimating the VECM (4) by maximum likelihood (Johansen and Juselius, 1990). Note also that since the r cointegrating vectors appearing in D are non-unique, these have to be normalized. Typically, using the normalization, one would obtain the cointegrating vectors as the rows of a matrix ...
... In the present exercise, we have considered a total of 13 basic macroeconomic variables, among them Total Exports at 1993-94 prices (TE) and Total Imports at 1993-94 prices. The cointegrating vectors are non-unique because, for any non-singular r × r matrix F, FD is also a set of r linearly independent cointegrating vectors of Y_t. Such restrictions are usually called over-identifying restrictions. ...
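The non-uniqueness noted in this excerpt is easy to illustrate numerically: any non-singular r × r transformation of the cointegrating matrix spans the same space, and a normalization pins down a unique representative. A minimal NumPy sketch with made-up matrices (not taken from the cited study):

```python
import numpy as np

# Illustrative (hypothetical) cointegrating matrix D: r = 2 vectors,
# k = 4 variables; each row is one cointegrating vector.
D = np.array([[2.0, -1.0, 0.5, 3.0],
              [4.0,  1.0, -2.0, 1.0]])

# For any non-singular r x r matrix F, FD spans the same cointegration
# space, so D is identified only up to such transformations.
F = np.array([[1.0, 2.0],
              [0.0, 3.0]])
FD = F @ D
assert np.linalg.matrix_rank(np.vstack([D, FD])) == 2  # same row space

# Triangular normalization: premultiply by the inverse of the leading
# r x r block, so each vector has a unit coefficient on "its" variable.
D_tri = np.linalg.solve(D[:, :2], D)
FD_tri = np.linalg.solve(FD[:, :2], FD)
# D_tri and FD_tri coincide: the normalized vectors are unique.
```

Under the triangular normalization the first r columns form an identity matrix, which is why the normalized vectors no longer depend on the arbitrary transformation F.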
... To interpret the outcome we need to assume that all quantile-specific parameters remain fixed at their full-sample estimates as the system is subjected to counterfactual shocks. This assumption can be phrased and tested in terms of the "super-exogeneity" of certain variables; see Engle et al. (1983) and Favero and Hendry (1992). Accordingly, the policy interventions should be small enough to not cause pronounced variation in deterministic parameters; see e.g. ...
Technical Report
Full-text available
Macroprudential policymakers assess medium-term downside risks to the real economy arising from financial imbalances and implement policies aimed at managing those risks. In doing so, they face an inherent intertemporal trade-off between the expected growth and downside risks. This paper reviews the literature on Growth-at-Risk, embeds it in the wider literature on macroprudential policy, and proposes an empirical risk management framework that combines insights from the two literatures, by forecasting the entire real GDP growth distribution with a structural quantile vector autoregressive model. It accounts for direct and indirect interactions between financial vulnerabilities, financial stress and real GDP growth and allows for potential non-linear amplification effects. The framework provides policymakers with a macro-financial stress test to monitor downside risks to the economy and a macroprudential stance metric to quantify when interventions may be beneficial.
... But Marschak, Tinbergen and Klein were also agreed that very few changes in policy-making are capable of changing the macro relationships included in their models. Econometricians including Sims (1982), Favero and Hendry (1992) and Ericsson and Irons (1995) conclude, after empirically investigating the policy-instability of model parameters, that the scope of the Lucas critique is very narrow indeed: the impact of changes in policy regime on model parameters is mostly negligible, and traditional macro-econometric models still perform well for policy evaluation (see Sergi 2017; Hendry and Muellbauer 2018). In an ironic twist, micro-founded DSGE models are found to fail the self-imposed Lucas test. ...
Article
Full-text available
The Rebuilding Macroeconomic Theory Project, led by David Vines and Samuel Wills (2020), is an important, albeit long overdue, initiative to rethink a failing mainstream macroeconomics. Professors Vines and Wills, who must be congratulated for stepping up to the challenge of trying to make mainstream macroeconomics relevant again, call for a new multiple-equilibrium and diverse (MEADE) paradigm for macroeconomics. Their idea is to start with simple models, ideally two-dimensional sketches, that explain mechanisms that can cause multiple equilibria. These mechanisms should then be incorporated into larger DSGE models in a new, multiple-equilibrium synthesis – to see how the fundamental pieces of the economy fit together, subject to it being “properly micro-founded”. This paper argues that the MEADE paradigm is bound to fail, because it maintains the DSGE model as the unifying framework at the center of macroeconomic analysis. The paper reviews 10 fundamental weaknesses inherent in DSGE models which make these models irreparably useless for macroeconomic policy analysis. Mainstream macroeconomics must put DSGE models, once and for all, in the Museum of Implausible Economic Models – and learn important lessons from non-DSGE macroeconomic approaches.
... 278). Statistical tests for the notion of "super exogeneity" are proposed by Favero and Hendry (1992), Engle and Hendry (1993), and Hendry and Santos (2010). These tests rely on analyzing to what extent parameter values are sensitive to exogenous interventions on the purported cause. ...
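The common logic of these tests can be sketched in a few lines: identify interventions in the marginal process of the conditioning variable, then check whether the corresponding dummies matter in the conditional model. The simulation below is a hedged illustration with invented numbers, not any of the cited procedures verbatim:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 200

# Marginal process of z: a policy intervention shifts its mean at T/2.
d = (np.arange(T) >= T // 2).astype(float)   # intervention dummy
z = 1.0 + 2.0 * d + rng.normal(size=T)

# Conditional model with an invariant parameter (super exogeneity holds):
y = 0.5 * z + rng.normal(scale=0.5, size=T)

# Test: add the marginal-model dummy to the conditional regression and
# check its significance; insignificance supports super exogeneity.
X = np.column_stack([np.ones(T), z, d])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
s2 = resid @ resid / (T - X.shape[1])
cov = s2 * np.linalg.inv(X.T @ X)
t_dummy = beta[2] / np.sqrt(cov[2, 2])
# |t_dummy| below the usual ~2 threshold -> invariance not rejected.
```

If the conditional parameter instead shifted with the intervention, the dummy (or its interaction with z) would pick up the shift and the test would reject, which is exactly how such tests expose a failure of super exogeneity.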
Preprint
This paper shows that testability of reverse causality is possible even in the absence of exogenous variation, such as in the form of instrumental variables. Instead of relying on exogenous variation, we achieve testability by imposing relatively weak model restrictions. Our main assumption is that the true functional relationship is nonlinear and error terms are additively separable. In contrast to existing literature, we allow the error to be heteroskedastic, which is the case in most economic applications. Our procedure builds on reproducing kernel Hilbert space (RKHS) embeddings of probability distributions to test conditional independence. We show that the procedure provides a powerful tool to detect the causal direction in both Monte Carlo simulations and an application to German survey data. We can infer the causal direction between income and work experience (proxied by age) without relying on exogenous variation.
Thesis
Full-text available
In this thesis we defend the claim that, although most of the available literature concludes that Granger-causal relations exist between tourism activity and economic activity, this knowledge cannot be translated into information directly useful for economic policy, for three sets of reasons. First, because the methodological procedures adopted ignore crucial aspects of the concept of Granger causality. Second, because the available literature reveals the presence of bias, possibly associated with the preference of study authors and journal editors for presenting and publishing statistically significant results. Third, because the empirical effect systematically detected is not genuine, that is, it is confined to short time horizons. To meet our objectives, we relied first on a systematic review of the literature and then on a critical review of it in light of the various concepts and meanings of causality in economics and in econometrics. We then carried out a series of meta-regression analyses of 78 empirical effects relating to the hypothesis of Granger causality from tourism to output and 74 empirical effects relating to the hypothesis of Granger causality from output to tourism. These empirical effects were collected from 51 distinct studies and cover the analysis of 42 different countries. In our meta-regression analyses we additionally confirmed the statistical relevance of the level of economic development as a determinant of the empirical effect associated with Granger causality from output to tourism, and of the degree of specialization in tourism as a determinant of the empirical effect associated with Granger causality in both directions.
We found, somewhat surprisingly, that the Granger-causal empirical effect from tourism to output is positively and statistically significantly associated with the population size of the countries studied and meta-analyzed. We also found that several methodological choices are liable to contribute to obtaining significantly larger empirical effects, which reinforced the conclusions regarding the possible presence of bias in the literature analyzed. Keywords: tourism; economic growth; Granger causality; systematic literature reviews; meta-regression analyses.
Article
An analysis of government programs for macroeconomic stabilization in selected countries is carried out to establish their compliance with the scientific approaches that determine the political choice in favor of monetary and/or fiscal instruments for stimulating economic activity. The analysis is based on a revision of the substantive provisions of the neoclassical synthesis and the new macroeconomic consensus, in order to highlight the peculiarities of how macroeconomic processes, the nature of cyclical fluctuations, and the ways of dampening and correcting them are interpreted. It is established that the conclusions of the new neoclassical synthesis (the New Consensus in Macroeconomics), which combines the new Keynesian approach with the real-business-cycle approach, are the most popular in the political sphere. They are, however, adjusted in various ways depending on the government's priority (the desire to achieve full employment; price stabilization; economic growth and balance of payments; efficient use of limited resources), and they provide a mostly short-term planning horizon. This complicates the exit from the "vicious circle" of economic policy, in which its dynamic development becomes hostage to the need for constant adaptation to changing conditions that policy itself, through its own adjustment, brings about. It was found that in the Coronavirus crisis the priority became combating simultaneous supply and demand shocks, and unemployment in particular, through a combination of monetary and fiscal policy tools, including regulatory competition in the form of neoprotectionism,
defined by us as a set of principles, tools and methods of regulatory policy in international trade, international capital movements and foreign investment, as well as international monetary, financial and credit relations, whose imperative is to stimulate socio-economic development and economic growth by creating conditions for increasing the economic activity of all economic entities.
Article
Full-text available
Abstract: This article frames the study of causality in economics with time-series data within the broader context of the philosophical analysis of causality. To that end, we show that the spectrum of approaches existing in economics results from the answers that have so far been given to the ontological, epistemological and pragmatic problems of causality within this specific discipline. We further conclude that, far from being the only available, correct or adequate answer, the ubiquitous Granger causality is just one of several possible answers. From a more general perspective, our aim is to provide a concise yet articulated and historically grounded view of the origins, evolution and problems of causal analysis in economics.
Article
This paper examines some recent techniques designed to draw inferences about the credibility of changes in macroeconomic policy regimes. An alternative two-step approach, based on the decomposition between permanent and transitory components of a "credibility variable" is proposed. The methodology is then used to test for the existence of a credibility effect in the Cruzado stabilization plan implemented in Brazil in 1986.
Article
Full-text available
A framework is proposed for interpreting recent developments in time series econometrics, emphasizing the problems of linking economics and statistics. There are six main expository themes: models are viewed as (reduced) reparameterizations of data processes through marginalizing and conditioning; the latter operation is related to the economic notion of contingent plans based on weakly exogenous variables; a typology of dynamic equations clarifies the properties of conditional models; estimation of unknown parameters is treated using estimator generating equations; tests are interrelated in terms of the efficient score statistic; finally, the concept of encompassing rival hypotheses (separate or nested) provides an 'overview' criterion for evaluating empirical estimates which have been selected to satisfy conventional criteria. The discussion is illustrated by an estimated model of the demand for money.
Article
The author considers the following problem. The estimation of the coefficients of a stochastic equation is based on certain assumptions, which may be incorrect. When these assumptions are incorrect, it is possible to indicate the consequences of the erroneous specification. This idea is applied to aggregation problems, to curvilinear relations, and to elasticities of substitution. The criterion of maximum multiple correlation for choosing among several specifications is also examined.
Article
There has been increasing concern recently over the use of the simple first order Markov form to model error autocorrelation in regression analysis. The consequence of misspecifying the error model will be especially serious when the regressors include lagged values of the dependent variable. The purpose of this paper is to develop Lagrange multiplier tests of the assumed error model against specified ARMA alternatives. It is shown that all of the tests can be regarded as asymptotic tests of the significance of a coefficient of determination, and a table is provided which gives details of two general tests and several special cases.
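The T·R² form of the LM statistic described in this abstract can be reproduced directly: fit the regression by OLS, regress the residuals on the original regressors plus p lagged residuals, and refer n·R² to a χ²(p) critical value. A self-contained simulation with an invented data-generating process (not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
T = 300

# DGP: y_t = 1 + x_t + u_t with AR(1) errors u_t = 0.6 u_{t-1} + eps_t,
# while the fitted model (wrongly) assumes serially uncorrelated errors.
x = rng.normal(size=T)
u = np.zeros(T)
for t in range(1, T):
    u[t] = 0.6 * u[t - 1] + rng.normal()
y = 1.0 + x + u

# OLS fit of the misspecified static model.
X = np.column_stack([np.ones(T), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
e = y - X @ beta

# LM test against an ARMA(p) alternative: auxiliary regression of e_t on
# X and e_{t-1}, ..., e_{t-p}; the statistic is n * R^2 ~ chi2(p).
p = 2
ez = e[p:]
Z = np.column_stack([X[p:], *[e[p - j: len(e) - j] for j in range(1, p + 1)]])
g, *_ = np.linalg.lstsq(Z, ez, rcond=None)
r = ez - Z @ g
R2 = 1.0 - (r @ r) / ((ez - ez.mean()) @ (ez - ez.mean()))
LM = len(ez) * R2   # compare with the chi2(2) 5% critical value, 5.99
```

With serially correlated errors in the DGP, the statistic comfortably exceeds the 5% critical value, signaling that the assumed error model is wrong; under a correct specification n·R² would be small.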
Article
Having estimated a linear regression with p coefficients, one may wish to test whether m additional observations belong to the same regression. This paper presents systematically the tests involved, relates the prediction interval (for m = 1) and the analysis of covariance (for m > p) within the framework of general linear hypothesis (for any m), and extends the results to testing the equality between subsets of coefficients.
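The predictive form of this test can be sketched compactly: compare the residual sum of squares from the first n observations with that from the pooled n + m observations, and refer the scaled difference to F(m, n − p). A minimal illustration with simulated data (all numbers invented):

```python
import numpy as np

rng = np.random.default_rng(2)
n, m, p = 80, 10, 2   # n base observations, m new ones, p coefficients

# Stable regression: the m additional observations follow the same DGP.
x = rng.normal(size=n + m)
y = 2.0 + 0.5 * x + rng.normal(size=n + m)
X = np.column_stack([np.ones(n + m), x])

def rss(yv, Xv):
    """Residual sum of squares from an OLS fit."""
    b, *_ = np.linalg.lstsq(Xv, yv, rcond=None)
    r = yv - Xv @ b
    return r @ r

# Chow predictive test: do the m extra points belong to the same regression?
rss1 = rss(y[:n], X[:n])    # first n observations only
rss_full = rss(y, X)        # all n + m observations pooled
F = ((rss_full - rss1) / m) / (rss1 / (n - p))
# Under stability, F ~ F(m, n - p); a value near 1 is expected here.
```

A structural break in the extra observations (say, a changed intercept) would inflate rss_full relative to rss1 and push F past the critical value, which is the case the test is designed to detect.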
Article
The relation between the demand for money balances and its determinants is a fundamental building block in most theories of macroeconomic behavior. The demand for money is a critical component in the formulation of monetary policy, and a stable demand function for money has long been perceived as a prerequisite for the use of monetary aggregates in the conduct of policy. The repeated breakdown of existing empirical models in the face of newly emerging data has fostered a vast industry devoted to examining and improving the demand for money function. This process has been aided by a growing arsenal of econometric techniques that has permitted more sophisticated examinations of dynamics, functional forms, and expectations. These techniques have also provided researchers with a wide variety of diagnostic tests to evaluate the adequacy of particular specifications. The chapter reviews underlying theoretical models to re-examine measurement and specification issues such as the definition of money and the appropriate scale and opportunity cost variables. It discusses the estimation issues, criticisms, and modifications in the partial adjustment model.
Article
Three econometric methodologies, associated respectively with David Hendry, Christopher Sims and Edward Leamer have been advocated and practiced by their adherents in recent years. A number of good papers have appeared about each methodology, but little has been written in a comparative vein. This paper is concerned with that task. It provides a statement of the main steps to be followed in using each of the methodologies and comments upon the strengths and weaknesses of each approach. An attempt is made to contrast and compare the techniques used, the information provided, and the questions addressed by each of the methodologies. It is hoped that such a comparison will aid researchers in choosing the best way to examine their particular problem.