Clive W. J. Granger's research while affiliated with University of California, San Diego and other places

Publications (296)

Article
Even though the trend components of economic time series were among the first to be distinguished, even today the trend remains relatively little understood. As Phillips (2005) notes, no one understands trends, but everyone sees them in the data. Economists and econometricians can give plenty of examples of trends, such as straight lines, exponenti...
Article
Phillips' (1958) original curve involves a nonlinear relationship between inflation and unemployment. We consider how his original results change due to updated theoretical and empirical studies, increased computer power, enlarged datasets, increases in data frequency and developed time series econometric models. In the linear models, there was weak...
Book
This volume explains recent theoretical developments in the econometric modelling of relationships between different statistical series. The statistical techniques explored analyse relationships between different variables, over time, such as the relationship between variables in a macroeconomy. Examples from Professor Terasvirta's empirical work a...
Book
This book contains an extensive up-to-date overview of nonlinear time series models and their application to modelling economic relationships. It considers nonlinear models in stationary and nonstationary frameworks, and both parametric and nonparametric models are discussed. The book contains examples of nonlinear models in economic theory and pre...
Article
This paper describes how the notion of cointegration came about, and discusses some generalizations to indicate where the topic may go next. In particular, some issues in the analysis of possibly cointegrated quantile time series are discussed.
Chapter
For the first three-quarters of the 20th century the main workhorse of applied econometrics was the basic regression.
Chapter
Decisions in the fields of economics and management have to be made in the context of forecasts about the future state of the economy or market. As decisions are so important as a basis for these fields, a great deal of attention has been paid to the question of how best to forecast variables and occurrences of interest. There are several distinct...
Article
This chapter considers the role of pragmatics in econometrics, looking at specific examples in econometrics and associated areas, and discusses the advantages and disadvantages of following a pragmatic approach. A great deal of the econometric literature starts off with a set of formal assumptions, or axioms, and then derives theories and specific models...
Article
Full-text available
Application of econometric principles and techniques (VAR-MGARCH) to risk analytics and forecasting in operations management, healthcare, security and other verticals. Forecasting is an underestimated field of research in supply chain management. Recently advanced methods are coming into use. Initial results are encouraging, but often require chang...
Article
Full-text available
Forecasting is a necessity in almost any operation. However, the tools of forecasting are still primitive in view of the great strides made by research and the increasing abundance of data made possible by automatic identification technologies, such as radio frequency identification (RFID). The relationship of various parameters that may change an...
Article
In this paper we consider the effects of nonlinear transformations on integrated processes and unit root tests performed on such series. A test that is invariant to monotone data transformations is proposed. It is shown that series are generally not cointegrated with nonlinear transformations of themselves, but the same transformation applied to a...
Article
The idea of fractional differencing is introduced in terms of the infinite filter that corresponds to the expansion of (1-B)^d. When the filter is applied to white noise, a class of time series is generated with distinctive properties, particularly in the very low frequencies, which provide potentially useful long-memory forecasting properties. Such m...
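As an illustrative aside, the weights of the (1-B)^d filter follow a simple recursion on the binomial coefficients. The sketch below is a minimal numpy illustration, not code from the paper; truncating the infinite filter at the start of the sample is an assumption of this sketch:

```python
import numpy as np

def frac_diff_weights(d, n):
    """First n weights of the expansion of (1 - B)^d in powers of B."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        # binomial recursion: w_k = w_{k-1} * (k - 1 - d) / k
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

def frac_diff(x, d):
    """Apply the (1 - B)^d filter, truncated at the start of the sample."""
    w = frac_diff_weights(d, len(x))
    return np.convolve(x, w)[:len(x)]
```

For d = 1 this reduces to ordinary first differencing, while for 0 < d < 1/2 the weights decay hyperbolically rather than geometrically, which is the source of the long-memory behaviour at very low frequencies.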
Article
Using the theory of divergent series, a class of linear filters is considered that, when applied to a constant, generates a wide class of deterministic trends in mean. If these filters are applied to a white noise, series are produced that have changing variances and may correspond to fractionally integrated series. Some other trend-generating mechanis...
Article
This is a report on our studies of the systematical use of mixed-frequency datasets. We suggest that the use of high-frequency data in forecasting economic aggregates can increase the accuracy of forecasts. The best way of using this information is to build a single model that relates the data of all frequencies, for example, an ARMA model with mis...
Article
Although linear models have been the central focus of econometrics for most of the twentieth century, great developments in non-linear models took place from the latter part of the century. This paper questions the future development of non-linear models in economics and shows (via White's Theorem) that any non-linear model can be approximated by a...
Chapter
Simulations have shown that if two independent time series, each being highly autocorrelated, are put into a standard regression framework, then the usual measures of goodness of fit, such as t and R-squared statistics, will be badly biased and the series will appear to be ‘related’. This possibility of a ‘spurious relationship’ between variables i...
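The simulation described above is easy to reproduce. Below is a minimal numpy sketch (sample size, replication count and seed are arbitrary choices for illustration): one independent random walk is regressed on another, and we record how often the slope's t-statistic exceeds the conventional 5% critical value:

```python
import numpy as np

rng = np.random.default_rng(0)

def spurious_t(T, rng):
    """t-statistic of the slope from regressing one independent random walk on another."""
    x = np.cumsum(rng.standard_normal(T))  # independent random walk
    y = np.cumsum(rng.standard_normal(T))  # independent random walk
    X = np.column_stack([np.ones(T), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    s2 = resid @ resid / (T - 2)           # residual variance
    cov = s2 * np.linalg.inv(X.T @ X)      # OLS covariance matrix
    return beta[1] / np.sqrt(cov[1, 1])

t_stats = np.array([spurious_t(100, rng) for _ in range(500)])
reject_rate = np.mean(np.abs(t_stats) > 1.96)
```

Although the two series are independent by construction, the t-test rejects far more often than the nominal 5%, which is the hallmark of a spurious regression.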
Chapter
INTRODUCTION, HISTORY, AND DEFINITIONS; SIMULATIONS; THEORY; SPURIOUS REGRESSIONS WITH STATIONARY PROCESSES; RELATED PROCESSES
Article
Complicated and sophisticated global models are available and widely used (but commonly without model evaluation procedures); hence, the question of how one can evaluate a global model is worth investigating. We discuss whether or not these global models together can be fully utilized and, if so, how this might be achieved.
Article
Looking ahead thirty years is a difficult task, but is not impossible. In this paper we illustrate how to evaluate such long-term forecasts. Long-term forecasting is likely to be dominated by trend curves, particularly the simple linear and exponential trends. However, there will certainly be breaks in their parameter values at some unknown points,...
Article
Brazil has long ago removed most of the perverse government incentives that stimulated massive deforestation in the Amazon in the 1970s and 1980s, but the highly controversial policy concerning road building still remains. While data is now abundantly available due to the constant satellite surveillance of the Amazon, the analytical methods typical...
Article
Full-text available
Forecasting is a necessity almost in any operation. However, the tools of forecasting are still primitive in view of the great strides made by research and the increasing abundance of data made possible by automatic identification technologies, such as radio frequency identification (RFID). The relationship of various parameters that may change an...
Article
New York stock price series are analyzed by a new statistical technique. It is found that short-run movements of the series obey the simple random walk hypothesis proposed by earlier writers, but that the long-run components are of greater importance than suggested by this hypothesis. The seasonal variation and the ‘business-cycle’ components are sh...
Article
Full-text available
SUMMARY A model of the form x_t − x_{t−1} = e_t, where x_t is the price of a share at time t and e_t forms a sequence of independent random variates, is postulated as a model of the price-determining mechanism of stock markets. The form of the distribution function of the e_t's is investigated. In opposition to suggestions that have been made in connection with...
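The postulated mechanism is straightforward to simulate. The numpy sketch below is purely illustrative; the normal innovations and the starting price are assumptions of this sketch (the paper itself is precisely about questioning the form of the innovation distribution):

```python
import numpy as np

rng = np.random.default_rng(42)
T = 2000
e = rng.standard_normal(T)   # i.i.d. innovations e_t (assumed normal here)
x = 100.0 + np.cumsum(e)     # simulated share price: x_t - x_{t-1} = e_t

d = np.diff(x)               # price changes recover the innovations e[1:]
# lag-1 sample autocorrelation of the price changes
r1 = np.corrcoef(d[:-1], d[1:])[0, 1]
```

Differencing the simulated price recovers the innovations exactly, and the near-zero lag-1 autocorrelation of the price changes is consistent with the random-walk hypothesis.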
Article
The first two influential books on economic forecasting are by Henri Theil [1961, second edition 1965. Economic Forecasts and Policy. North-Holland, Amsterdam] and by George Box and Gwilym Jenkins [1970. Time Series Analysis, Forecasting and Control. Holden Day, San Francisco]. Theil introduced advanced mathematical statistical techniques and consi...
Article
This paper compares the out-of-sample forecasting performance of three long-memory volatility models (i.e., fractional integrated (FI), break and regime switching) against three short-memory models (i.e., GARCH, GJR and volatility component). Using S&P 500 returns, we find that structural break models produced the best out-of-sample forecasts, if...
Article
Full-text available
When forecasts of the future value of some variable, or the probability of some event, are used for purposes of ex ante planning or decision making, then the preferences, opportunities and constraints of the decision maker will all enter into the ex post evaluation of a forecast, and the ex post comparison of alternative forecasts. After a presenti...
Article
Full-text available
Brazil has long ago removed most of the perverse government incentives that stimulated massive deforestation in the Amazon in the 70s and 80s, but one highly controversial policy remains: Road building. While data is now abundantly available due to the constant satellite surveillance of the Amazon, the analytical methods typically used to analyze t...
Article
Full-text available
Almost all fishery models assume time-invariant parameter values of the underlying biological growth function, except for an i.i.d. error term. We examine the economic implications of cyclical growth parameters, which are frequently observed in many real-world fisheries, in both single- and multi-species models. Neither optimal harvest rates nor optima...
Article
The long memory characteristic of financial market volatility is well documented and has important implications for volatility forecasting and option pricing. When fitted to the same data, different volatility models calculate the unconditional variance differently and could have very different volatility persistent parameters. Hence, they produce...
Article
One method of describing the properties of a fitted autoregressive model of order p is to show the p roots that are implied by the lag operator. Considering autoregressive models fitted to 215 US macro series, with lags chosen by either the Bayesian (Schwarz) information criterion or the Akaike information criterion, the roots are found to constitute a...
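As a sketch of the computation involved, the roots implied by a fitted AR(p) lag polynomial 1 − φ₁z − ... − φₚz^p can be obtained directly with numpy. The coefficient values below are hypothetical, for illustration only:

```python
import numpy as np

def ar_roots(phi):
    """Roots of the AR lag polynomial 1 - phi_1 z - ... - phi_p z^p.

    The fitted process is stationary when every root lies outside
    the unit circle; roots close to 1 indicate high persistence.
    """
    phi = np.asarray(phi, dtype=float)
    # np.roots wants coefficients ordered from the highest power down
    poly = np.r_[-phi[::-1], 1.0]
    return np.roots(poly)

# hypothetical AR(2) fit: x_t = 0.75 x_{t-1} - 0.125 x_{t-2} + e_t
roots = ar_roots([0.75, -0.125])
```

Here the roots are 2 and 4, both outside the unit circle, so this hypothetical fitted model is stationary.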
Article
Virtually all nonlinear economic models with independent, identically distributed stochastic shocks and time-invariant structural parameters will generate persistent, partially predictable heteroskedasticity (“volatility clustering”) in their key dependent variables. This paper offers some examples of this phenomenon, derives i.i.d. shock, time-inv...
Article
Advances in SCM DSS: Potential to Improve Forecasting Accuracy, by Datta and Granger (July 2006). Forecasting is a necessity almost in any operation. However, the tools of forecasting are still primitive in view of the great strides made by research and the increasing abundance of data made possible by automatic identification technol...
Article
Full-text available
We present here a new way of building vine copulas that allows us to create a vast number of new vine copulas, allowing for more precise modeling in high dimensions. To deal with this great number of copulas we present a new, efficient selection methodology using a lattice structure on the vine set. Our model allows for many degrees of freedom,...
Article
This is an overview of how empirical finance has evolved since 1960 with some suggestions about the future developments. The original attention just to modelling the first two moments will change to considering the whole distribution. Particular attention is paid to the ‘long-memory’ property and some comments are made about the relevance of contin...
Article
More and more data, greatly increased computing power, a rising number of research enthusiasts, an increased number of finance journals, and sophisticated techniques have been the characteristics of empirical finance in the past 30 years. Topics of current interest relate to conditional means, conditional variances, and conditional distributions. T...
Article
Full-text available
A comparison is presented of 93 studies that conducted tests of volatility-forecasting methods on a wide range of financial asset returns. The survey found that option-implied volatility provides more accurate forecasts than time-series models. Among the time-series models, no model is a clear winner, although a possible ranking is as follows: hist...
Article
Full-text available
This paper presents a set of questions prepared by Clive Granger with responses by David Hendry on the use of PcGets (see Hendry and Krolzig, 2001) in data modeling and as a new research tool. PcGets is an Ox package (see Doornik, 2001) implementing automatic general-to-specific (Gets) modeling for linear regression models based on the theory of re...
Article
Since the paper by David Hendry and me, which is at the center of this discussion, was written by us separately, I think it is worth recording how this paper started. David was attending the Allied Social Sciences Association meeting in San Diego in January 2004 and stayed with my wife and me. In preparing a talk I realized that over my career I ha...
Article
There exist a variety of reasons for the failure to find a unique cointegrating relationship between economic time series where one would normally be expected on the basis of economic theory. Among these are the testing procedure, the span of the data set, the choice of lag length in generating the test statistic, the presence of structural breaks, and...
Article
Full-text available
The paper outlines a methodology for analyzing daily stock returns that relinquishes the assumption of global stationarity. Giving up this common working hypothesis reflects our belief that fundamental features of the financial markets are continuously and significantly changing. Our approach approximates the nonstationary data locally by stationar...
Article
Full-text available
Standard fisheries models used in economics and for management purposes almost always assume parameter values of the fishing system are stable. In this paper, we put forth models where the parameters of the biological growth model systematically change over time. The models considered are fundamentally different from those in the literature (e.g.,...
Article
The modern world has influenced the approach to empirical modeling and consequently the approach to methodology in general. The question of whether to base a model on an economic theory is easier when several models can be constructed, but an empirical evaluation analysis is required. Starting with a widely specified model and using a reduction pro...
Article
Stock & Watson (1999) consider the relative quality of different univariate forecasting techniques. This paper extends their study on forecasting practice, comparing the forecasting performance of two popular model selection procedures, the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). This paper considers several...
Article
In this paper we compare the relative efficiency of different methods of forecasting the aggregate of spatially correlated variables. Small sample simulations confirm the asymptotic result that improved forecasting performance can be obtained by imposing a priori constraints on the amount of spatial correlation in the system. One way to do so is to...
Article
A transformed metric entropy measure of dependence is studied which satisfies many desirable properties, including being a proper measure of distance. It is capable of good performance in identifying dependence even in possibly nonlinear time series, and is applicable for both continuous and discrete variables. A nonparametric kernel density implem...
Article
This paper shows that occasional breaks generate slowly decaying autocorrelations and other properties of I(d) processes, where d can be a fraction. Some theory and simulation results show that it is not easy to distinguish the long memory property arising from the occasional-break process from that of the I(d) process. We compare two time seri...
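The occasional-break mechanism is simple to illustrate. In the numpy sketch below (the break probability, sample size and seed are arbitrary choices for this illustration), short-memory noise around a rarely shifting mean exhibits a sample autocorrelation that is still clearly positive at long lags, mimicking long memory:

```python
import numpy as np

rng = np.random.default_rng(1)
T = 5000
p = 0.005  # small probability of a mean shift at each date

# occasional breaks: shifts arrive rarely, the level is piecewise constant
shifts = rng.standard_normal(T) * (rng.random(T) < p)
level = np.cumsum(shifts)
x = level + rng.standard_normal(T)  # short-memory noise around a breaking mean

def acf(y, k):
    """Sample autocorrelation of y at lag k."""
    y = y - y.mean()
    return (y[:-k] @ y[k:]) / (y @ y)

acf_50 = acf(x, 50)  # still far from zero despite i.i.d. noise
```

A genuinely short-memory process would show an autocorrelation near zero by lag 50; the slow decay here comes entirely from the infrequent level shifts.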
Article
The two prize winners in Economics this year would describe themselves as "Econometricians," so I thought that I should start by explaining that term. One can begin with the ancient subject of Mathematics which is largely concerned with the discovery of relationships between deterministic variables using a rigorous argument. (A deterministic variab...
Article
The standard method of model building is to consider just a single specification and then discuss its estimation, interpretation, and output. In practice many specifications are available so the usual technique is to choose the best, according to some criterion such as by testing, and then use that to produce an output. However, this means that any...
Article
The authors argue that economics as a science is held back through the use of “asterisk” reporting. We concur that many authors do a poor job of disseminating their results; however, we consider the call to reject the tools a little overblown. The tools are very useful for many of the most important parts of economics as a science, such as testing theo...
Article
George E. P. Box and Gwilym M. Jenkins (1970) and others had previously proposed methods for analyzing a single integrated series, but an important feature was missing in the joint analysis of two or more such series. It turns out that the difference between a pair of integrated series can be stationary, and this property is known as "cointegration"....
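The property can be illustrated in a few lines. In this hypothetical numpy sketch, two series driven by a common I(1) stochastic trend are individually nonstationary, yet their difference is stationary, which is the defining feature of cointegration:

```python
import numpy as np

rng = np.random.default_rng(7)
T = 4000
trend = np.cumsum(rng.standard_normal(T))  # common I(1) stochastic trend
x = trend + rng.standard_normal(T)         # both series are integrated ...
y = trend + rng.standard_normal(T)
spread = y - x                             # ... but their difference is I(0)

var_levels = x.var()       # large: the level wanders with the trend
var_spread = spread.var()  # small: near the variance of the stationary noise
```

The levels inherit the wandering variance of the shared trend, while the spread's variance stays close to that of the stationary noise, so the pair (x, y) is cointegrated by construction.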
Article
A definition for a common factor for bivariate time series is suggested by considering the decomposition of the conditional density into the product of the marginals and the copula, with the conditioning variable being a common factor if it does not directly enter the copula. We show the links between this definition and the idea of a common factor...
Article
When considering the relative quality of forecasts the method of comparison is relevant: should we use vertical measures, such as mean square forecasting error, or the recently developed horizontal measure time distance. Four models for inflation in the US are considered based on univariate time series, a leading indicator, a univariate model combi...
Article
Financial market volatility is an important input for investment, option pricing, and financial market regulation. The emphasis of this review article is on forecasting instead of modelling; it compares the volatility forecasting findings in 93 papers published and written in the last two decades. Provided in this paper as well are volatility defin...
Article
The paper asks the question: as time series analysis moves from consideration of conditional mean values and variances to unconditional distributions, do some of the familiar concepts devised for the first two moments continue to be helpful in the more general area? Most seem to generalize fairly easily, such as the concepts of breaks, seasonality,...
Article
We introduce a new index that explores the linkage between business-cycle fluctuations and deviations from long-run economic relationships. This index is virtually a measure of the distance between an attractor, a space spanned by the associated cointegrating vectors, and a point in the n-dimensional Euclidean space. The index is applied to U.S. qu...
Article
In modeling series with leading or lagging indicators, it is useful to compare models in terms of time–distance. This paper formalizes the concept of time–distance in terms of various metrics, and investigates their behavior. For evaluating forecasts, time–distance metrics are shown to be more useful than standard measures (such as mean squared for...
Article
The relationships between the variables measuring socio-economic status (SES) and those measuring various aspects of health and mortality were analyzed. Various causality definitions were used with statistical data, which was considered a vector of time series, for the analysis. The SES variables considered in the analysis included wealth, income,...
Article
Building large models, with little dynamics, was long considered to be an alternative to small dimensional time series models involving many lags. The advantages of one modelling methodology are compared to others; such as the size of the model, the use of economic theory, and simultaneity in specification. The question of how to evaluate the possi...
Article
Full-text available
A multi-disciplinary team of authors analyze the economics of Brazilian deforestation using a large data set of ecological and economic variables. They survey the most up to date work in this field and present their own dynamic and spatial econometric analysis based on municipality level panel data spanning the entire Brazilian Amazon from 1970 to...
Article
Full-text available
Modeling financial returns on longer time intervals under the assumption of stationarity is, at least intuitively, given the pace of change in the world's economy, a choice hard to defend. Relinquishing the global stationarity hypothesis, this paper conducts a data analysis focused on the size of the returns, i.e. the absolute values of returns, under...
Article
The efficient market hypothesis gives rise to forecasting tests that mirror those adopted when testing the optimality of a forecast in the context of a given information set. However, there are also important differences arising from the fact that market efficiency tests rely on establishing profitable trading opportunities in ‘real time’. Forecast...
Article
Investor risk is a complicated concept in practice and is not well captured by measures of volatility as is well understood by uncertainty theory. Rather than asking statisticians to attempt to measure risk, it may be better to listen to decision theorists, but their suggestions are not very practical. Diversification is clearly helpful in reducing...
Article
Possibly hitherto unnoticed cointegrating relationships among integrated components of data series are identified. If the components are cointegrated, the data are said to have hidden cointegration. The implication of hidden cointegration on modeling data series themselves is discussed through what we call crouching error correction models. We show...
Article
Full-text available
This paper presents evidence, using data from Consensus Forecasts, that there is an "attraction" to conform to the mean forecasts; in other words, views expressed by other forecasters in the previous period influence individuals' current forecast. The paper then discusses--and provides further evidence on--two important implications of this finding...
Article
It has long been investigated in the finance literature whether beta responds asymmetrically to good and bad news, as measured by large positive and negative returns respectively. In this paper we define three market scenarios, namely, bad, usual and good, conditional on the quantiles of the market returns distribution. We investigate th...
Article
Many standard structural models in economics have the property that they induce persistent, partially predictable heteroskedasticity ("volatility clustering") in their key dependent variables, even when their underlying stochastic shock variables are all serially independent and homoskedastic, and their structural parameters are all time-invariant....
Article
Dominant properties of various kinds can be defined for distributions including trends, strong seasonality, business cycles, and a persistent component. We say that the joint distribution of X and Y, conditional on W, has a common factor if W is a dominant component but does not appear in the copula, only in the conditional marginal distribut...
Article
Financial market volatility is an important input for investment, option pricing and financial market regulation. In this review article, we compare the volatility forecasting findings in 93 papers published and written in the last two decades. This article is written for general readers in Economics, and its emphasis is on forecasting instead of m...
Article
This paper establishes practical criteria for selecting amongst hypothetical data generating processes in cases where the series has long memory and an exponential distribution, which implies that the innovations have extremely fat tails.
Article
A survey of nonlinear multivariate macro empirical models is attempted. Although theory may suggest that nonlinearity is to be expected, empirical studies have difficulty in discovering strong consistent effects. Regime switching techniques appear to be the most successful, and evidence of nonlinearity is most often found for interest rates. Most of the s...
Article
Financial market volatility is an important input for investment, option pricing and financial market regulation. In this review article, we compare the volatility forecasting findings in 72 papers published and written in the last decade. This article is written for general readers in Economics, and its emphasis is on forecasting instead of modell...
Article
This paper, using daily returns on 30 Dow Jones Industrial stocks for the period 1991-1999, investigates the possibility of portfolio diversification when there are negative large movements in the stock returns (i.e. when the market is bearish). We estimate the quantiles of stock return distributions using non-parametric and parametric methods that...
Article
It is pointed out that if the generating mechanism is a fraction integrated process I(d), where d can be less than 1/2, but a simple ARMA model is fitted, a consistent estimation procedure is likely to produce a unit root. Thus the properties of the fitted model will be quite unlike those of the generating mechanism.
Article
The asymptotic distributions of cointegration tests are approximated using the Gamma distribution. The tests considered are for the I(1), the conditional I(1), as well as the I(2) model. Formulae for the parameters of the Gamma distributions are derived from response surfaces. The resulting approximation is flexible, easy to implement and more accu...
Conference Paper
In recent years, a number of power spectra have been estimated from economic data and the majority have been found to be of a similar shape. A number of implications of this shape are discussed, particular attention being paid to the reality of business cycles, stability and control problems, and model building.
Conference Paper
An estimate of the number of floods per century that can be expected at any given point of a river would obviously be an important piece of information when any expensive flood-prevention scheme is under discussion. Gumbel (1958) has discussed such an estimation for a non-tidal stretch of river, and has shown how to derive estimates from existing f...
Conference Paper
By considering the model generating the sum of two or more series, it is shown that the mixed ARMA model is the one most likely to occur. As most economic series are both aggregates and are measured with error it follows that such mixed models will often be found in practice. If such a model is found, the possibility of resolving the series into si...
Conference Paper
New York stock price series are analyzed by a new statistical technique. It is found that short-run movements of the series obey the simple random walk hypothesis proposed by earlier writers, but that the long-run components are of greater importance ...
Article
A variable is defined to be self-generating if it can be forecast efficiently from its own past only. Conditions are derived for certain linear combinations to be self-generating in error correction models. Interestingly, there are only two candidates for self-generation in an error correction model. They are cointegrating relationships and common...
Article
New York stock price series are analyzed by a new statistical technique. It is found that short-run movements of the series obey the simple random walk hypothesis proposed by earlier writers, but that the long-run components are of greater importance ...
Conference Paper
This chapter presents the time series analysis of error-correction models. The main purpose of error-correction models is to capture the time-series properties of variables, through the complex lag-structures allowed, while at the same time incorporating an economic theory of an equilibrium type. By using error-correction models, a link is formed a...

Citations

... The IT, PSTR, and GPSTR models are competing models, as non-linearity tests reject the null hypothesis of the linear model. As argued by Granger (2001), it is well known that nonlinear models are inclined to overfit the data and thus, out-of-sample forecasting evaluation is recommended. Hence, we employ cross-validation to calculate the mean squared error (MSE). ...
... In making any empirical investigation of macroeconomic time series variables, it is advisable to establish whether the variables under study are stationary, because running a regression analysis on non-stationary time series data can produce biased and unreliable outcomes (Granger & Newbold, 1974; Perron & Phillips, 1988). Thus, the unit root test was necessary for finding the order of integration among the study variables. ...
... This asymptotic result improves the previous studies of Deng (2014), where β = O_p(1). Simply speaking, the spirit of our methodology follows the suggestion of Granger (2001) saying that the proper reaction to having a possible spurious relationship is to add lagged dependent and independent variables until the errors appear to be white noise. ...
... Among them, the Normal copula and Student-t copula cannot describe the asymmetric correlation between variables, and the Clayton copula can only describe the lower tail correlation, while the Gumbel copula can only describe the upper tail correlation. In this paper, we used the Symmetrized Joe-Clayton (SJC) copula proposed by Patton [52][53][54] to describe the interdependence between two markets, as the SJC copula can accurately describe the asymmetric upper and lower tail relations among different variables. The expression of the distribution function of the SJC copula is: ...
... This phenomenon is known as asymmetric dependence structures, see e.g. Silvapulle and Granger (2001), Campbell et al. (2002), Okimoto (2008), Ang and Chen (2002), Hong et al. (2007), Chollete et al. (2009), Aas et al. (2009), Garcia and Tsafack (2011). ...
... probability values, which indicate the number of cointegrating equations in the model. There are a number of conventional techniques available to estimate the long-run coefficient parameters, including ordinary least squares (OLS) regression, two-stage least squares (2SLS) regression, the generalized method of moments (GMM) estimator, the ARDL bounds testing approach, etc.; however, each has certain limitations in obtaining robust estimates. For instance, OLS violates the basic error-term properties and largely yields biased estimates if the variables possess a difference-stationary series (Granger 2010). 2SLS regression is generally used for handling an endogeneity issue among the regressors (Cumby et al. 1983), while the GMM estimator is used to absorb more than one endogeneity issue in the model (Baum et al. 2003). ...
... where Y_t ∈ R^p, m(I_{t−1}) = E(Y_t | I_{t−1}) is the conditional mean almost surely (a.s.) of Y_t given the conditioning set I_{t−1}, ε_t = Y_t − E(Y_t | I_{t−1}) is by construction a martingale difference sequence (MDS) with respect to I_{t−1}, and the conditioning set at time t is I_t = {Y_t, Y_{t−1}, ...}. In parametric multivariate time series modeling, the form of m(·) is usually specified up to a finite-dimensional parameter, including the vector autoregressive moving average (VARMA) model in Lütkepohl (2005) and other nonlinear conditional mean models in Tsay (1998), Teräsvirta et al. (2010), Teräsvirta and Yang (2014), and Dahlhaus (2017), to name a few. These linear/nonlinear multivariate conditional mean models have been widely applied in empirical macroeconomics and statistics. ...
... In regions where there is no causality in any direction, as in Europe & Central Asia, Sub-Saharan Africa, and the Middle East & North Africa, the results suggest the same specification model issue, which implies that using the variables in those regions for forecasting purposes would entail limitations. In that regard, this research defines forecasting as an extrapolation based on empirical models, a different concept from prediction (Granger, 2012). In the context of this research, a prediction of FS would occur in a model of the form: ...
... Throughout the intense period of colonization of the Amazon during the 1960s-1990s, the Brazilian government formulated policies to encourage the occupation of the region, offering reduced taxes and subsidized credit for the installation of new businesses, especially in the agricultural sector (ANDERSEN; GRANGER, 2007). The timber industry grew as an important economic activity in the region, while cattle ranching followed into the cleared areas to become an extensive activity. ...
... In this case a loss function is usually used, but we can also choose the distance criterion proposed by Granger and Jeon (2003). Armstrong and Collopy (2000) stress that these measures are not independent of the unit of measurement unless they are expressed as percentages. The purpose of using the above-mentioned indicators is related to the characterization of distribution errors. ...