Article
PDF available

Criteria for Evaluation of Econometric Models

... Therefore modelers should rather focus on constructing models with high descriptive power. Nevertheless, different variations of these criteria can be found in comprehensive literature overviews such as Dhrymes et al. [17] or Radzicki [41]. Of particular relevance is the conclusion of Dhrymes et al.: 'validation becomes a problem-dependent or decision-dependent process, differing from case to case as the proposed use of the model under consideration changes' [17]. ...
... Therefore, the model description offered by the framework described in Section 3 and Section 4 becomes a road map for the selection of validation techniques on a case-by-case basis; [44], [6], and [32] offer comprehensive overviews of practical validation tests. Our position is therefore close to that of Dhrymes et al. [17], with the addition that the model description given by our framework can be used to guide the selection of validation tests. In a certain sense, this translates Mingers' 'multimethodology' idea [35] to the issues of validation. ...
Chapter
Full-text available
As the world has evolved to become ever more dependent on complex ecosystems of large, interacting systems, it has become ever more important to be able to reason rigorously about the design, construction, and behaviour not only of individual systems—which may include aspects related to all of people, process, and technology—but also of their assembly into ecosystems. In such situations, it is inevitable that no one type of model—such as mathematical models of dynamical systems, logical models of languages, or discrete event simulation models—will be sufficient to describe all of the aspects of ecosystems about which rigorous reasoning is required. We propose here a meta-theoretical framework, the ‘triangle framework’, within which different types of models may be categorized and their interactions, especially during the construction of models, can be understood. Its explicit goals are to facilitate a better understanding of the nature of models and to provide a more inclusive language for the description of heterogeneous models. Specifically, we identify three qualities of models, each derived from modelling goals—conceptuality, mathematicality, and executability—and explain how models will, typically, have all of these qualities to varying extents. We also show how the framework supports an analysis of how models can be co-designed by their various stakeholders within an identified translation zone within the process of model construction. We explore our ideas in the concrete setting of models encountered in a range of surveyed security papers, drawn from a diverse collection of security conferences. Although descriptive in nature, we envision this framework as a necessary first step in the development of a methodology for heterogeneous model design and construction, diverse enough to characterize the myriad of model types used in the field of information security while at the same time addressing validation concerns that can reduce their usability in the area of security decision-making.
... 2 Models relating economic outcomes to annual average temperatures are not directly comparable to those estimating responses to daily average temperatures, given presumed non-linearities in the response function. 3 For example, see Friedman (1953); Dhrymes et al. (1972); Cooley and LeRoy (1981); Leamer (1978, 1983); White (1996); Yatchew (1998); Hansen et al. (2011); and Belloni et al. (2014). 4 See Keynes (1939), Koopmans (1947), Leamer (1978), Leamer (1983), Hendry et al. (1990), Chatfield (1996), and Sullivan et al. (1999), among others. ...
Article
Econometric models of temperature impacts on GDP are increasingly used to inform global warming damage assessments. But theory does not prescribe estimable forms of this relationship. By estimating 800 plausible specifications of the temperature-GDP relationship, we demonstrate that a wide variety of models are statistically indistinguishable in their out-of-sample performance, including models that exclude any temperature effect. This full set of models, however, implies a wide range of climate change impacts by 2100, yielding considerable model uncertainty. The uncertainty is greatest for models that specify effects of temperature on GDP growth that accumulate over time; the 95% confidence interval that accounts for both sampling and model uncertainty across the best-performing models ranges from 84% GDP losses to 359% gains. Models of GDP levels effects yield a much narrower distribution of GDP impacts centered around 1–3% losses, consistent with damage functions of major integrated assessment models. Further, models that incorporate lagged temperature effects are indicative of impacts on GDP levels rather than GDP growth. We identify statistically significant marginal effects of temperature on poor country GDP and agricultural production, but not rich country GDP, non-agricultural production, or GDP growth.
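As a rough illustration of the specification-search exercise this abstract describes, the sketch below fits a small grid of hypothetical temperature-GDP specifications (polynomial order crossed with lag depth) and ranks them by cross-validated out-of-sample RMSE. The synthetic data, variable names, and grid are invented stand-ins for the paper's 800 specifications, and a real replication would use time-series-aware splits rather than shuffled folds.

# Sketch: compare many candidate temperature-GDP specifications out of sample.
# All data and names here are invented for illustration.
import itertools
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold, cross_val_score

rng = np.random.default_rng(0)
n = 400
temp = rng.normal(14, 8, n)                      # country-year temperature (degrees C)
gdp_growth = 0.02 - 0.00003 * temp**2 + rng.normal(0, 0.02, n)

def build_features(temp, poly_order, lags):
    """Assemble a design matrix: polynomial temperature terms plus crude lags."""
    cols = [temp**p for p in range(1, poly_order + 1)]
    for lag in range(1, lags + 1):
        cols.append(np.roll(temp, lag))          # np.roll as a toy lag operator
    return np.column_stack(cols)[lags:], lags    # drop rows invalidated by the roll

results = {}
for poly_order, lags in itertools.product([1, 2, 3], [0, 1, 3, 5]):
    X, cut = build_features(temp, poly_order, lags)
    y = gdp_growth[cut:]
    rmse = -cross_val_score(LinearRegression(), X, y,
                            scoring="neg_root_mean_squared_error",
                            cv=KFold(5, shuffle=True, random_state=0)).mean()
    results[(poly_order, lags)] = rmse

# Specifications whose CV-RMSE values are close are, in the paper's sense,
# statistically hard to distinguish on out-of-sample performance alone.
for spec, rmse in sorted(results.items(), key=lambda kv: kv[1]):
    print(f"poly={spec[0]} lags={spec[1]}  CV-RMSE={rmse:.5f}")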
... A reverse-engineering approach to empirical economics contrasts with more conventional model-centric approaches that indirectly simulate market dynamics with models fitted to the data, and presume that good fits imply real-world correspondence [17,18]. However, relying on goodness-of-fit to empirically validate models commits the logical fallacy of 'affirming the consequent': If A, then B; B, therefore A. In the context of model validation: If the model is true, it provides a good fit; this model provides a good fit, therefore it is true [19]. ...
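The fallacy the excerpt describes is easy to demonstrate numerically: below, a structurally wrong model fits nearly as well as the true one, so goodness-of-fit alone cannot confirm either. The data-generating process is invented for illustration.

# Sketch: two structurally different models, near-identical fit.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
x = rng.uniform(0.5, 3.0, 200)
y = np.log(x) + rng.normal(0, 0.05, 200)     # true process: logarithmic

# Model A: the true logarithmic form.
fit_a = LinearRegression().fit(np.log(x).reshape(-1, 1), y)
# Model B: a structurally wrong quadratic polynomial.
X_b = np.column_stack([x, x**2])
fit_b = LinearRegression().fit(X_b, y)

print("R2 (log model):      ", r2_score(y, fit_a.predict(np.log(x).reshape(-1, 1))))
print("R2 (quadratic model):", r2_score(y, fit_b.predict(X_b)))
# Both fits are good; the good fit of Model B does not make it true.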
Article
Full-text available
An empirical question of long-standing interest is how price promotions affect a brand’s sale shares in the fast-moving consumer-goods market. We investigated this question with concurrent promotions and sales records of specialty beer brands pooled over Tesco stores in the UK. Most brands were continuously promoted, rendering infeasible a conventional approach of establishing impact against an off-promotion sales baseline, and arguing in favor of a dynamics approach. Moreover, promotion/sales records were volatile without easily-discernable regularity. Past work conventionally attributed volatility to the impact of exogenous random shocks on stable markets, and reasoned that promotions have only an ephemeral impact on sales shares in stationary mean-reverting stochastic markets, or a persistent freely-wandering impact in nonstationary markets. We applied new empirical methods from the applied sciences to uncover an overlooked alternative: ‘systematic persistence’ in which promotional impacts evolve systematically in an endogenously-unstable market governed by deterministic-nonlinear dynamics. We reconstructed real-world market dynamics from the Tesco dataset, and detected deterministic-nonlinear market dynamics. We used reconstructed market dynamics to identify a complex network of systematic interactions between promotions and sales shares among competing brands, and quantified/characterized the dynamics of these interactions. For the majority of weeks in the study, we found that: (1) A brand’s promotions drove down own sales shares (a possibility recognized in the literature), but ‘cannibalized’ sales shares of competing brands (perhaps explaining why brands were promoted despite a negative marginal impact on own sales shares); and (2) Competitive interactions between brands owned by the same multinational brewery differed from those with outside brands. In particular, brands owned by the same brewery enjoyed a ‘mutually-beneficial’ relationship in which an incremental increase in the sales share of one marginally increased the sales share of the other. Alternatively, the sales shares of brands owned by different breweries preyed on each other’s market shares.
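A minimal sketch of the kind of embedding-based diagnostic alluded to here, assuming nothing beyond NumPy: reconstruct state space with time-delay coordinates and check whether nearest-neighbour forecast skill collapses on a shuffled surrogate, a rough signal of deterministic structure. The chaotic logistic-map series is an invented stand-in for the Tesco promotion/sales records, and this is not the authors' exact method.

# Sketch: time-delay embedding plus a leave-one-out nearest-neighbour forecast test.
import numpy as np

def delay_embed(series, E, tau=1):
    """Stack E lagged copies of the series into state-space vectors."""
    n = len(series) - (E - 1) * tau
    return np.column_stack([series[i * tau : i * tau + n] for i in range(E)])

def simplex_skill(series, E=3):
    """Correlation between one-step nearest-neighbour forecasts and outcomes."""
    X = delay_embed(series, E)
    targets = series[E:]                 # one step ahead of each embedded point
    X = X[:-1]                           # last point has no observed target
    preds = []
    for i in range(len(X)):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf                    # leave self out
        preds.append(targets[np.argmin(d)])
    return np.corrcoef(preds, targets)[0, 1]

rng = np.random.default_rng(2)
x = np.empty(500); x[0] = 0.4
for t in range(499):                     # chaotic logistic map as a test series
    x[t + 1] = 3.9 * x[t] * (1 - x[t])

print("skill on series:   ", simplex_skill(x))
print("skill on surrogate:", simplex_skill(rng.permutation(x)))

High skill on the original series combined with near-zero skill on the shuffled surrogate is consistent with deterministic-nonlinear dynamics; a purely stochastic series scores similarly poorly in both cases.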
Article
Full-text available
The primary purpose of the paper is to enable deeper insight into the measurement of economic forecast accuracy. The paper employs the systematic literature review as its research methodology and is the first systematic review of the measures of economic forecast accuracy conducted in scientific research. The citation-based analysis confirms the growing interest of researchers in the topic. Research on economic forecast accuracy is continuously developing and improving with the adoption of new methodological approaches. An overview of the limits and advantages of the methods used to assess forecast accuracy not only facilitates the selection and application of appropriate measures in future analytical works but also contributes to a better interpretation of the results. In addition to the presented advantages and disadvantages, the chronological presentation of methodological development (measures, tests, and strategies) provides insight into the possibilities for further upgrading and improving the methodological framework. The review of empirical findings, in addition to giving insight into existing results, indicates insufficiently researched topics. All in all, the results presented in this paper can be a good basis and inspiration for creating new scientific contributions in future works.
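For concreteness, here is a short sketch of several accuracy measures such a review typically covers (MAE, RMSE, MAPE, and the scaled MASE), computed on invented series; the measure definitions are standard, but the data and scenario are assumptions, and real comparisons would add tests such as Diebold-Mariano.

# Sketch: common forecast accuracy measures on an invented forecast exercise.
import numpy as np

def accuracy_measures(actual, forecast, insample):
    err = actual - forecast
    mae = np.mean(np.abs(err))
    rmse = np.sqrt(np.mean(err**2))
    mape = 100 * np.mean(np.abs(err / actual))     # undefined if actual == 0
    # MASE scales MAE by the in-sample naive (random-walk) forecast error.
    naive_mae = np.mean(np.abs(np.diff(insample)))
    mase = mae / naive_mae
    return {"MAE": mae, "RMSE": rmse, "MAPE%": mape, "MASE": mase}

rng = np.random.default_rng(3)
insample = 100 + np.cumsum(rng.normal(0, 1, 60))   # history used for MASE scaling
actual = insample[-1] + np.cumsum(rng.normal(0, 1, 12))
forecast = actual + rng.normal(0, 1.5, 12)         # an imperfect 12-step forecast

for name, value in accuracy_measures(actual, forecast, insample).items():
    print(f"{name}: {value:.3f}")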
Article
Full-text available
Many studies have dealt with the determinants of elderly migration, but none have done so from a time series perspective. This article focuses on the timing of elderly migration, or the factors that cause migration levels to be higher some years than other years. A model is developed for net migration of the elderly into rapidly growing areas. It is tested using data for three rapidly growing states: Florida, Arizona and Nevada. The model is found to explain a large proportion of the variation in annual levels of elderly net migration into these states and to provide estimates that track very well both inside and outside the sample period. The potential usefulness of the model in forecasting elderly and total population appears to be substantial.
Chapter
Early twentieth century research into genetic pathways initiated a revolution in hybrid crops, industrial farming, and eugenics, and initially motivated the work on structural equation models. Geneticist and statistician Sewall Wright’s seminal work in path analysis laid the groundwork for the Chicago and Scandinavian Schools of structural equation modeling. Though rooted in the natural sciences, structural equation models rose to prominence through their utility for determining relationships and structures of unobserved, latent constructs in the social sciences. Advances in computing in the 1970s gave the increasingly complex structural equation model methodologies the necessary capabilities to handle larger and larger datasets. Automation drove the move from the pairwise Pearsonian correlations of path analysis, to structures of canonical correlations in partial least squares path analysis, and finally to the full covariance structure methods of the Chicago and Scandinavian Schools.
Chapter
Early structural equation models were developed around structures of canonical correlations through statistics developed in the 1930s. These evolved into partial least squares path analysis (PLS-PA). Hermann Wold developed multiple approaches to analyzing structures of latent constructs, culminating in the computer implementations of his research assistant Jan-Bernhard Lohmöller. Lohmöller’s software popularized structural equation models as a tool for interpreting survey research into structural models built upon pairs of latent constructs. Because of its shortcomings as a statistical tool, Wold considered PLS-PA results to be only “plausible” and suitable for exploratory data analysis. This chapter surveys the uses, misuses, and pitfalls of PLS-PA in data analysis. It additionally explores common misconceptions about PLS-PA, such as the function of resampling and sample size formulas.
Chapter
Partial least squares path methods elicit only pairwise relationships between latent constructs, though they allude to a more complete “plausible” structure. In the 1950s the Cowles Commission at the University of Chicago, and in the 1970s Karl Jöreskog of Uppsala University, extended structural equation model methods to incorporate all of the information in the covariance structure, in what are called full-information methods. These methods placed structural equation models on secure statistical footing, allowing confirmatory testing of models of latent constructs. These models are substantially more complex than the crude methods used in PLS-PA, but provide performance measures that moved structural equation models from exploratory analysis into full-fledged hypothesis testing.
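To make the contrast with PLS-PA concrete, here is an illustrative full-information fit in lavaan-style syntax using the semopy Python package (assumed installed via pip install semopy); the model, variable names, and data are all invented.

# Sketch: maximum-likelihood SEM over the full covariance structure with semopy.
import numpy as np
import pandas as pd
from semopy import Model

rng = np.random.default_rng(4)
n = 300
eta1 = rng.normal(size=n)                      # latent exogenous construct
eta2 = 0.6 * eta1 + rng.normal(scale=0.8, size=n)
data = pd.DataFrame({f"y{i}": load * (eta1 if i <= 3 else eta2)
                     + rng.normal(scale=0.5, size=n)
                     for i, load in zip(range(1, 7), [1, .8, .7, 1, .9, .6])})

desc = """
# measurement model: three indicators per latent construct
Eta1 =~ y1 + y2 + y3
Eta2 =~ y4 + y5 + y6
# structural model
Eta2 ~ Eta1
"""
model = Model(desc)
model.fit(data)                                # ML estimation by default
print(model.inspect())                         # loadings, path, variances

Unlike PLS-PA, the estimation here uses all of the information in the sample covariance matrix at once, which is what supports confirmatory fit testing.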
Article
Full-text available
This paper discusses the econometric methodology of general-to-specific modeling, in which the modeler simplifies an initially general model that adequately characterizes the empirical evidence within his or her theoretical framework. Central aspects of this approach include the theory of reduction, dynamic specification, model selection procedures, model selection criteria, model comparison, encompassing, computer automation, and empirical implementation. This paper thus reviews the theory of reduction, summarizes the approach of general-to-specific modeling, and discusses the econometrics of model selection, noting that general-to-specific modeling is the practical embodiment of reduction. This paper then summarizes fifty-seven articles key to the development of general-to-specific modeling.
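A stylized sketch of the general-to-specific loop described here: start from a deliberately general regression and repeatedly drop the least significant regressor until every survivor passes its t-test. Real Gets implementations (e.g. Autometrics) also apply diagnostic and encompassing tests at each reduction; this toy version uses t-test p-values only, and all names and data are invented.

# Sketch: toy general-to-specific reduction by backward elimination.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 200
X = pd.DataFrame(rng.normal(size=(n, 6)), columns=[f"x{i}" for i in range(1, 7)])
y = 1.0 + 2.0 * X["x1"] - 1.5 * X["x2"] + rng.normal(scale=1.0, size=n)

def general_to_specific(y, X, alpha=0.05):
    """Drop the highest-p-value regressor until all survivors are significant."""
    kept = list(X.columns)
    while kept:
        fit = sm.OLS(y, sm.add_constant(X[kept])).fit()
        pvals = fit.pvalues.drop("const")
        if pvals.max() <= alpha:
            return fit                       # every remaining regressor passes
        kept.remove(pvals.idxmax())          # delete the weakest term, refit
    return sm.OLS(y, np.ones(len(y))).fit()  # intercept-only fallback

final = general_to_specific(y, X)
print(final.params)                          # should retain roughly x1 and x2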
Article
Full-text available
Refined olive oil (ROO) and extra virgin olive oil (EVOO) categories are different products with respect to their objective quality. Nevertheless, this quality gap is not reflected in the purchase behaviour of consumers in Spain, which is the main producer country worldwide. On the basis of economic theory, the price gap could be a part of the explanation; however, the objective price gap between EVOO and ROO has been on average around €0.40 kg⁻¹ since the 2007/2008 crop year in Spain. Therefore, this paper contributes to a more in-depth understanding of those factors, besides price, affecting consumers’ decision-making process in olive oil markets. We examine how consumers build their purchase preferences towards two products, namely EVOO and ROO, based on their evaluative judgements shaped by person-related and environmental factors. In doing so, a theoretical model is proposed and an empirical application in southern Spain is presented, using variance-based structural equation modelling (SEM) by means of partial least squares path modelling (PLS). The results show how attitudes towards EVOO and ROO play a key role in explaining both EVOO and ROO consumption. In addition, taste preferences are shown to have an overriding moderator effect on the relationship between attitude towards ROO and consumption. Negative anticipated consequences regarding EVOO are central in shaping consumers’ attitude towards ROO and also influence attitude towards EVOO itself. Meanwhile, healthy shopping habits mainly affect attitude towards EVOO, and the perceived value of private brands influences attitude towards ROO.
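As a sketch of the variance-based machinery behind such a study, the following from-scratch two-block PLS iteration (mode A weights, reduced to a single attitude-to-consumption path) estimates a standardized path coefficient from invented indicator data; it is a didactic reduction, not the authors' full model.

# Sketch: minimal two-block PLS path estimation with mode A outer weights.
import numpy as np

def pls_two_blocks(X1, X2, tol=1e-8, max_iter=200):
    """Estimate latent scores for two indicator blocks and the path X1 -> X2."""
    X1 = (X1 - X1.mean(0)) / X1.std(0)      # standardize indicators
    X2 = (X2 - X2.mean(0)) / X2.std(0)
    w1 = np.ones(X1.shape[1]); w2 = np.ones(X2.shape[1])
    for _ in range(max_iter):
        lv1 = X1 @ w1; lv1 /= lv1.std()     # outer estimates of latent scores
        lv2 = X2 @ w2; lv2 /= lv2.std()
        # inner approximation: each latent is proxied by the other block's score
        new_w1 = X1.T @ lv2 / len(lv2)      # mode A: correlation-based weights
        new_w2 = X2.T @ lv1 / len(lv1)
        done = max(np.abs(new_w1 - w1).max(), np.abs(new_w2 - w2).max()) < tol
        w1, w2 = new_w1, new_w2
        if done:
            break
    lv1 = X1 @ w1; lv1 /= lv1.std()
    lv2 = X2 @ w2; lv2 /= lv2.std()
    return np.corrcoef(lv1, lv2)[0, 1]      # standardized path coefficient

rng = np.random.default_rng(6)
n = 250
attitude = rng.normal(size=n)               # invented latent "attitude"
consumption = 0.5 * attitude + rng.normal(scale=0.9, size=n)
X1 = np.column_stack([attitude + rng.normal(scale=0.4, size=n) for _ in range(3)])
X2 = np.column_stack([consumption + rng.normal(scale=0.4, size=n) for _ in range(3)])
print("estimated path coefficient:", pls_two_blocks(X1, X2))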