Article

25 Years of IIF Time Series Forecasting: A Selective Review


Abstract

We review the past 25 years of time series research that has been published in journals managed by the International Institute of Forecasters (Journal of Forecasting 1982-1985; International Journal of Forecasting 1985-2005). During this period, over one third of all papers published in these journals concerned time series forecasting. We also review highly influential works on time series forecasting that have been published elsewhere during this period. Enormous progress has been made in many areas, but we find that there are a large number of topics in need of further development. We conclude with comments on possible future research directions in this field.


... Time-series forecasting has been a topic of extensive research for years because of its use in many domains like weather and econometric forecasting. Reference [5] is a recent and comprehensive review of this research over the past 25 years. The motivation of making systems and DBMSs easier to manage (ideally self-tuning) has attracted recent interest in forecasting (e.g., [13,14]). ...
... Our goal in this paper is not to propose new synopses or transformers for forecasting (plenty of such operators are listed in [5]), but to develop algorithms that can compose existing operators into a good plan for a given query and dataset. While [13,14] show how various synopses can forecast performance problems and failures in real enterprise environments, transformations are selected manually in [13,14]. ...
... For example, [23] develops techniques to maintain a large number of LCCs in real-time. Reference [5] urges the time-series forecasting community to develop multivariate synopses that are robust and easy to use as well as good at multi-step-ahead forecasting. Commercial DBMSs have made rapid strides towards becoming self-tuning (e.g., see [17]). ...
Conference Paper
Forecasting future events based on historic data is useful in many domains like system management, adaptive query processing, environmental monitoring, and financial planning. We describe the Fa system where users and applications can pose declarative forecasting queries---both one-time queries and continuous queries---and get forecasts in real-time along with accuracy estimates. Fa supports efficient algorithms to generate execution plans automatically for forecasting queries from a novel plan space comprising operators for transforming data, learning statistical models from data, and doing inference using the learned models. In addition, Fa supports adaptive query-processing algorithms that adapt plans for continuous forecasting queries to the time-varying properties of input data streams. We report an extensive experimental evaluation of Fa using synthetic datasets, datasets collected on a testbed, and two real datasets from production settings. Our experiments give interesting insights on plans for forecasting queries, and demonstrate the effectiveness and scalability of our plan-selection algorithms.
... Similarly, the values measured by a temperature sensor indoors are not expected to change in normal conditions. The level of fuel or the distance traveled by a vehicle, the temperature of an area, the altitude of an airplane, the intensity detected by a light sensor during the day, the number of persons entering a mall over a certain period, etc., are examples of values that can be estimated with a prediction function (which can be built using different techniques [15,27]). ...
... If the query consists only of a value constraint, the format of predicted tuples is: <tpi, VM=I>, where the validity mark is given just by the validity interval I. For example, a tuple <s1, type1, t1, 3t+2, [5,15]> indicates that the sensor s1 sat-isfies the constraint during the interval [5,15] by considering the prediction function 3t + 2 (with timestamp t1). ...
... If the query consists only of a value constraint, the format of predicted tuples is: <tpi, VM=I>, where the validity mark is given just by the validity interval I. For example, a tuple <s1, type1, t1, 3t+2, [5,15]> indicates that the sensor s1 sat-isfies the constraint during the interval [5,15] by considering the prediction function 3t + 2 (with timestamp t1). ...
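The snippet's example can be made concrete with a small sketch. Assuming (hypothetically) a per-tuple linear prediction function 3t + 2 and a value constraint requiring the predicted value to stay within [17, 47], the validity interval works out to [5, 15], matching the tuple above; the function and helper names below are illustrative, not the paper's API.

```python
def predicted_value(t):
    # Linear prediction function carried by the tuple, here 3t + 2.
    return 3 * t + 2

def validity_interval(constraint, t_start, t_end, step=1):
    """Return the (first, last) time steps whose predicted values satisfy
    the constraint (assumes one contiguous run, as holds for a monotone
    linear prediction function)."""
    valid = [t for t in range(t_start, t_end + 1, step)
             if constraint(predicted_value(t))]
    if not valid:
        return None
    return (valid[0], valid[-1])

# A hypothetical constraint: the predicted value must lie in [17, 47],
# which 3t + 2 satisfies exactly for t in [5, 15].
interval = validity_interval(lambda v: 17 <= v <= 47, 0, 30)  # (5, 15)
```

In this spirit, the query processor can emit a validity mark VM=[5,15] without waiting for new tuples from the sensor.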
Conference Paper
Full-text available
Networks of sensors are used in many different fields, from industrial applications to surveillance applications. A common feature of these applications is the necessity of a monitoring infrastructure that analyzes a large number of data streams and outputs values that satisfy certain constraints. In this paper, we present a query processor for monitoring queries in a network of sensors with prediction functions. Sensors communicate their values according to a threshold policy, and the proposed query processor leverages prediction functions to compare tuples efficiently and to generate answers even in the absence of new incoming tuples. Two types of constraints are managed by the query processor: window-join constraints and value constraints. Uncertainty issues are considered to assign probabilistic values to the results returned to the user. Moreover, we have developed an appropriate buffer management strategy that takes into account the contributions of the prediction functions contained in the tuples. We also present some experimental results that show the benefits of the proposal.
... However, the ARIMA model cannot consider the influence of different time series. In the late 1970s and early 1980s, it was gradually found that linear models did not apply to many practical applications [7]. By then, several applicable nonlinear models had been proposed, such as the bilinear model [27], the threshold autoregressive (TAR) model [31], and the autoregressive conditional heteroskedasticity (ARCH) model [8]. ...
... By then, several applicable nonlinear models had been proposed, such as the bilinear model [27], the threshold autoregressive (TAR) model [31], and the autoregressive conditional heteroskedasticity (ARCH) model [8]. However, the analysis and prediction of nonlinear time series are still in their infancy [7]. ...
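As a minimal illustration of the threshold autoregressive idea mentioned in the snippet, the sketch below switches the AR(1) coefficient depending on which regime the last observation falls in; the threshold and coefficients are hypothetical values chosen for illustration, not estimates from any paper.

```python
def tar_forecast(x_last, threshold=0.0, phi_low=0.8, phi_high=0.3):
    """One-step-ahead forecast from a two-regime threshold AR(1):
    the AR coefficient switches depending on whether the last
    observation is below or above the threshold."""
    phi = phi_low if x_last <= threshold else phi_high
    return phi * x_last

# The same magnitude is damped differently per regime,
# which a single linear AR(1) cannot express.
f_neg = tar_forecast(-2.0)  # lower regime:  0.8 * -2.0 = -1.6
f_pos = tar_forecast(2.0)   # upper regime:  0.3 *  2.0 =  0.6
```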
Article
Full-text available
Predicting the demand for electric vehicle charging energy is essential to increase utilization for the company, reduce costs for both car owners and the company, and alleviate the burden on electric grid stations. However, many factors may affect energy consumption at the station level, such as the growing popularity of EVs, the time of day of plug-in, workdays, holidays, random consumption, etc. To overcome the above challenges of avoiding overcharge, better managing dispatching stations, and reducing energy wastage, we perform a comprehensive data analysis on EV charging stations and propose a novel deep learning based approach. Our research is based on charging data obtained from a Chinese energy service provider, including the stations' charging process and geographic information. In the forecasting part, we propose a Temporal Encoder-Decoder LSTM (T-LSTM-Enc) concatenated with a Temporal LSTM (T-LSTM-Ori-TimeFeatures), which aims to address the issue of charging demand prediction. The T-LSTM-Enc pre-trains the data to extract hidden relationships, and the T-LSTM-Ori-TimeFeatures captures the time features that drive changes in the charging data. We build our approach using temporal dependencies to capture the short-term, long-term, and trend characteristics for charging demand prediction. To show the efficiency of the proposed method, we evaluate our model using two datasets of energy consumption at EV charging stations, and the results show that our approach gives promising performance.
... The validation and influence of random forests (RF) for regression has been examined and analyzed in [3]. Another method, boosting, emphasizes misclassified cases by adding new examples, building an ensemble model that achieves competitive performance for time series prediction [4][5]. As the most common form of boosting, AdaBoost [6] was assessed against other ML algorithms such as support vector machines (SVM) [7] and merged with that algorithm to further improve prediction performance [8]. ...
Conference Paper
Forecasting energy consumption is critical in decision-making for efficient energy saving, improving the stability of the power grid, and preventing supply-demand discrepancy. To predict the day-ahead load for the demand of the city of Kirkuk, two scenarios were presented. First, three individual machine learning algorithms were benchmarked: the generalized linear model (GLM), the artificial neural network (ANN), and the random forest (RF). Second, the predictive capabilities of the individual models were compared with ensemble models. The results indicate that the predictive models may be improved using simple ensemble learning strategies such as averaging the predicted results. This study also presents future research directions to improve the models' prediction capabilities.
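The averaging strategy this abstract refers to reduces to a few lines of code; the model names and load values below are hypothetical placeholders, not the study's data.

```python
def average_ensemble(predictions):
    """Combine per-model forecasts by simple averaging, the ensemble
    strategy the study suggests can improve on individual models."""
    n = len(predictions[0])
    return [sum(p[i] for p in predictions) / len(predictions)
            for i in range(n)]

# Hypothetical day-ahead load forecasts (MW) from three models.
glm = [100.0, 110.0, 120.0]
ann = [98.0, 112.0, 118.0]
rf = [102.0, 108.0, 122.0]
combined = average_ensemble([glm, ann, rf])  # [100.0, 110.0, 120.0]
```

Averaging needs no extra training, which is why it is often the first ensemble strategy tried.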
... The conventional method of stock market prediction mainly focuses on time-series analysis. De Gooijer et al. [3] reviewed the papers published in journals managed by the International Institute of Forecasters (Journal of Forecasting 1982-1985; International Journal of Forecasting 1985-2005) and found that over one-third of all the papers published in these journals focused on time-series forecasting. In particular, conventional time-series analysis methods include the autoregressive model (AR) [4], the moving average model (MA) [5], the autoregressive moving average model (ARMA) [6], and the autoregressive integrated moving average model (ARIMA) [7]. ...
Article
Full-text available
Stock market prediction has been identified as a very important practical problem in the economic field. However, the timely prediction of the market is generally regarded as one of the most challenging problems due to the stock market's characteristics of noise and volatility. To address these challenges, we propose a deep learning-based stock market prediction model that considers investors' emotional tendency. First, we propose to involve investors' sentiment for stock prediction, which can effectively improve the model prediction accuracy. Second, the stock pricing sequence is a complex time sequence with different scales of fluctuations, making accurate prediction very challenging. We propose to gradually decompose the complex sequence of stock prices by adopting empirical modal decomposition (EMD), which yields better prediction accuracy. Third, we adopt LSTM due to its advantages of analyzing relationships among time-series data through its memory function. We further revised it by adopting an attention mechanism to focus more on the most critical information. Experiment results show that the revised LSTM model can not only improve prediction accuracy, but also reduce time delay. It is confirmed that investors' emotional tendency is effective in improving the predicted results; the introduction of EMD can improve the predictability of stock price sequences; and the attention mechanism can help LSTM to efficiently extract specific information relevant to the current mission objectives from the information ocean.
... The fundamental part of this analysis is to determine the amount of energy to be contracted, linked to a certain effective purchase period. Different approaches can be employed for forecasting [15][16][17], including the ARIMA model [18] and artificial neural networks [19,20]. ...
Article
The essential difference between the Free Contracting Environment (FCE) and the Regulated Contracting Environment (RCE) is the possibility of freely negotiating energy terms and prices with suppliers. Disconnected from the tariffs regulated by the government, in the FCE, consumers bear the costly difference between the contracted energy and that consumed. This cost can be reduced with accurate knowledge of the consumer profile, based on the analysis of historical data. In this article, a methodology is proposed to evaluate the migration of consumers to the FCE. In a case study, graphical statistical techniques help identify the profile of a consumer in the city of Rio de Janeiro, subgroup A4 and with green tariff modality, in the period from 2016 to 2019. Then, classical and artificial neural network-based methods are used to forecast consumption twelve months ahead. In particular, Long Short-Term Memory (LSTM) networks performed better than Autoregressive Integrated Moving Average (ARIMA) models. Finally, economic and financial indicators demonstrate that this consumer's decision to migrate to the FCE, taken prior to the analysis performed in this case study, was the right one.
... The combination of the aforementioned methods and other forecasting methods can grasp the factors of commodity price changes and get better forecasting results. Abramson and Finizza [13] proposed a combined prediction model of belief network and probability model to predict oil prices; Feng et al. [15] combined the Autoregressive Integrated Moving Average (ARIMA) model with the neural network to predict oil prices; Fan et al. [16] used the wavelet neural network to predict oil prices while optimizing the network input by principal component analysis. The results show that the combined forecasting method has higher accuracy than each of the methods. ...
Article
Gasoline is the lifeblood of the national economy. The forecasting of gasoline prices is difficult because of frequent price fluctuations, its complex nature, diverse influencing factors, and low accuracy of prediction results. Previous studies mainly focus on forecasting gasoline prices in a single region by single time series analysis which ignores the daily price co-movement of different series from multiple regions. Because price co-movement may contain useful information for price forecasting, this paper proposes the Lasso-CNN ensemble model that combines statistical models and deep neural networks to forecast gasoline prices. In this model, the Least Absolute Shrinkage and Selection Operator (Lasso) screens and chooses the correlated time series to enhance the performance of forecasting and avoid overfitting, while Convolutional Neural Network (CNN) takes the selected multiple series as its input and then forecasts the gasoline prices in a certain region. Forecasting results of gasoline prices at the national level and regional levels by using the new method demonstrate that the new approach provides more accurate results for the predictions of gasoline prices than those results generated by alternative methods. Thus, the relevant series can enhance the performance of forecasting and help to gain better results.
... Research on time series analytics has a rich background in past research works [1,2], and its pivotal importance has grown recently with the growth of data volumes [3][4][5]. Due to the significance of the field, tools that are reliable, scalable, and accurate in forecasting are in high demand. ...
Article
Full-text available
A prominent area of data analytics is "time-series modeling", where it is possible to forecast future values of a variable using its previous data. Numerous usage examples, including the economy, the weather, stock prices, and the development of a corporation, demonstrate its significance. Experiments with time-series forecasting utilizing machine learning (ML), deep learning (DL), and AutoML are conducted in this paper. Its primary contribution consists of addressing the forecasting problem by experimenting with additional ML and DL models and AutoML frameworks and expanding the AutoML experimental knowledge. In addition, it contributes by breaking down barriers found in past experimental studies in this field by using more sophisticated methods. The datasets this empirical research utilized were secondary quantitative data on the real prices of the currently most used cryptocurrencies. We found that AutoML for time series is still in the development stage and necessitates more study to be a viable solution, since it was unable to outperform manually designed ML and DL models. The demonstrated approaches may be utilized as a baseline for predicting time-series data.
... Therefore, having an efficient and reliable forecasting model plays an important role in this industry. In the past decades, time series forecasting methods have been at the core of attention of researchers and practitioners in this field [3][4][5][6][7]. These methods are usually simple and understandable. ...
... Although research in time series analysis found a place in the past [1,2], its significance increased recently with the growth of data volumes resulting from users, industries, and markets. The research in time series data has a rich background with pivotal importance in different applications including economic, weather, stock price, business development, and other use cases. ...
Article
Full-text available
Time-series forecasting is a significant discipline of data modeling where past observations of the same variable are analyzed to predict the future values of the time series. Its prominence lies in different use cases where it is required, including economic, weather, stock price, business development, and other use cases. In this work, a review was conducted on the methods of analyzing time series starting from the traditional linear modeling techniques until the automated machine learning (AutoML) frameworks, including deep learning models. The objective of this review article is to support identifying the time-series forecasting challenge and the different techniques to meet the challenge. This work can be additionally an assist and a reference for researchers and industries demanding to use AutoML to solve the problem of forecasting. It identifies the gaps of the previous works and techniques used to solve the problem of forecasting time series.
... Journal of Forecasting (IJF) (De Gooijer & Hyndman, 2005). Often these studies were of an empirical nature, using one or more benchmark methods/ models as a comparison. ...
Thesis
Full-text available
This research forecasts the short-term weekly sales of Edlee Food Services using the Box-Jenkins ARIMA model. The data used to develop the forecast model were the weekly total sales of all seven outlets of Edlee Food Services from December 2007 until March 2010. The forecasting model used is a non-seasonal ARIMA(1,1,2). From this study, we can conclude that ARIMA models can be used for forecasting by other businesses for planning and decision-making purposes.
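The "1" in the middle of ARIMA(1,1,2) denotes one round of differencing before an ARMA(1,2) model is fitted. A minimal sketch of that differencing step, and of inverting it to return forecasts to the original scale (with hypothetical sales figures), is:

```python
def difference(series):
    """First differences: the I(1) step of an ARIMA(p,1,q) model."""
    return [b - a for a, b in zip(series, series[1:])]

def undifference(last_level, diffs):
    """Invert differencing to recover values on the original scale
    by cumulatively adding differences to a starting level."""
    levels = []
    for d in diffs:
        last_level += d
        levels.append(last_level)
    return levels

sales = [120, 132, 129, 141]          # hypothetical weekly sales
d = difference(sales)                  # [12, -3, 12]
restored = undifference(sales[0], d)   # [132, 129, 141]
```

In a full ARIMA fit, the ARMA part models `d` and its forecasts are un-differenced back to the sales scale the same way.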
... However, physical reality demonstrates enough problems for taking into account nonlinear phenomena even if they seem to be quite clear on the whole, for example [Karagueuzian et al., 2013;Sun & Zhou, 2014;Schilder et al., 2015;Leonov et al., 2016;Vega et al., 2018;Rezgui & Lowenberg, 2018;Wang et al., 2019;Kolokolov & Monovskaya, 2019a;Benham et al., 2019;Tartaruga et al., 2019;Lenci & Rega, 2019]. At the same time, pragmatic daily activity relates mainly to comparatively stable systems, to which bifurcation phenomena seem alien; and thus handy trends and performance indexes on averaged dynamics have been looking preferable, for example [de Gooijer & Hyndman, 2006;Fildes et al., 2008;Climents & Hendry, 2012;Hyndman & Athanasopoulos, 2014;Tetlock & Gardner, 2015;Timmermann, 2019;Ahmed & Khalid, 2019]. So, many scientific, economic, social, and other arguments are considered before agreeing that results of bifurcation analysis can be crucial for one or another practical case. ...
Article
The paper concludes the series of the research works on the interdisciplinary analytical approach (the so-called bifurcation-fractal analytics) to forecast the dynamics of pulse systems on the basis of modified bifurcation diagrams. These diagrams are intended to integrate incompatible-in-traditional-spaces images (mainly transients from the phase space and evolution pictures from the parametric space) in order to analyze whether the running transient converges towards the domain of a desirable behavior taking into account its qualitative and quantitative characteristics. The interdisciplinary context of the analytics appears because the novel mathematical images need physical meaning and further this research should be continued for proposing the expedient engineering decision. Here, the advantages to forecast the evolutionary tendencies in different scales of the modified bifurcation diagrams are illustrated by computer-based simulations. Translations of the mathematical images into their physical implementations follow the principle of the bifurcation poker and the principle of the spatial nonuniformity. The experimental results to verify these principles are presented. Variants of engineering proposals are commented in comparison with traditional abilities of the small-signal design. The results systematized in the paper confirm that fundamental obstacles to forecast abnormal evolutionary changes in the dynamics of pulse energy converters are absent. Prospects of the outcome from the experience accumulated in engineering to similar problem statements independently of a pulse system nature are finally discussed. And the regulatory hypothesis on local climate dynamics is considered in this connection. 
This discussion is quite suitable here because a local climate system can be described as a converter of the solar energy under a specific pulse control realized by the astronomic forcing and there are circumstances, in which electrotechnical simulations can remain a unique way to accelerate the research on prerequisites and sequels of the forthcoming climate changes. Then the bifurcation-fractal analytics provides an initial theoretical foundation to this research.
... In this field notable contributions include the work of Wiener [26] and others [19]. Detailed overview on the advances of time series modelling and forecasting techniques over the last decades are given in [8] and references inside. Moreover, large scale benchmarking of popular modelling and forecasting techniques are given in [15]. ...
Chapter
Full-text available
Time series modelling and forecasting techniques have a wide spectrum of applications in several fields including economics, finance, engineering and computer science. Most available modelling and forecasting techniques are applicable to a specific underlying phenomenon and its properties and lack generality of application, while more general forecasting techniques require substantial computational time for training and application. Herewith, we present a general modelling framework based on a recursive Schur-complement technique that utilizes a set of basis functions, either linear or non-linear, to form a model for a general time series. The basis functions need not be orthogonal, and their number is determined adaptively based on fitting accuracy. Moreover, no assumptions are required for the input data. The coefficients for the basis functions are computed using a recursive pseudoinverse matrix, thus they can be recomputed for different input data. The case of sinusoidal basis functions is presented. Discussions around the stability of the resulting model and the choice of basis functions are also provided. Numerical results depicting the applicability and effectiveness of the proposed technique are given.
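The core least-squares step behind such basis-function models can be sketched without the chapter's recursive Schur-complement machinery: for a two-function basis, the coefficients solve the 2x2 normal equations directly. The basis choice and data below are illustrative assumptions, not the chapter's test cases.

```python
import math

def fit_basis(ts, ys, basis):
    """Least-squares coefficients c for y ~ c1*basis[0](t) + c2*basis[1](t),
    solved via the 2x2 normal equations (A^T A) c = A^T y."""
    a11 = sum(basis[0](t) ** 2 for t in ts)
    a12 = sum(basis[0](t) * basis[1](t) for t in ts)
    a22 = sum(basis[1](t) ** 2 for t in ts)
    b1 = sum(basis[0](t) * y for t, y in zip(ts, ys))
    b2 = sum(basis[1](t) * y for t, y in zip(ts, ys))
    det = a11 * a22 - a12 * a12  # nonzero if the basis columns are independent
    c1 = (a22 * b1 - a12 * b2) / det
    c2 = (a11 * b2 - a12 * b1) / det
    return c1, c2

# Recover the coefficients of y = 2 + 3*sin(t) from noiseless samples.
basis = [lambda t: 1.0, lambda t: math.sin(t)]
ts = [0.1 * k for k in range(50)]
ys = [2 + 3 * math.sin(t) for t in ts]
c = fit_basis(ts, ys, basis)  # approximately (2.0, 3.0)
```

With more basis functions the same normal equations grow to an n x n system, which is where a recursive pseudoinverse update becomes attractive.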
... Traditional stock market forecasting methods focus on time series analysis. De Gooijer et al. [1] reviewed papers published in journals administered by the International Institute of Forecasters (Journal of Forecasting 1982-1985; International Journal of Forecasting 1985-2005) and found that in these journals more than a third of the published papers focused on time series prediction. In particular, traditional time series analysis methods include the autoregressive (AR), moving average (MA), autoregressive moving average (ARMA), and autoregressive integrated moving average (ARIMA) models [2]. ...
Article
Full-text available
This paper is an analysis of investor sentiment in the stock market based on the bidirectional encoder representations from transformers (BERT) model. First, we extracted sentiment values from online information published by stock investors, using the BERT model. Second, these sentiment values were weighted by attention to compute the investor sentiment indicator. Finally, the relationship between investor sentiment and stock yield was analyzed through a two-step cross-sectional regression validation model. The experiments found that investor sentiment in online reviews had a significant impact on stock yield. The experiments show that the BERT model used in this paper can achieve an accuracy of 97.35% for the analysis of investor sentiment, which is better than both LSTM and SVM methods.
... Even though an enormous body of research has been conducted on forecasting methods and new methods continue to emerge, a significant proportion of these new forecasting methods are based on the two famous forecasting benchmarks: exponential smoothing and ARIMA [3]. Of the two, exponential smoothing is more frequently preferred over ARIMA as a basis of comparison for new approaches, both in its original form and in forecast combinations, due to its simplicity and proven accuracy [4]. ...
Article
Full-text available
Forecasting is a crucial step in almost all scientific research and is essential in many areas of industrial, commercial, clinical and economic activity. There are many forecasting methods in the literature, but exponential smoothing stands out due to its simplicity and accuracy. Despite the fact that exponential smoothing is widely used and has been in the literature for a long time, it suffers from some problems that potentially affect the model's forecast accuracy. An alternative forecasting framework, called Ata, was recently proposed to overcome these problems and to provide improved forecasts. In this study, the forecast accuracies of Ata and exponential smoothing are compared on data sets with no trend or a linear trend. The results of this study are obtained using simulated data sets with different sample sizes and variances. Forecast errors are compared within both short- and long-term forecasting horizons. The results show that the proposed approach outperforms exponential smoothing for both types of time series data when forecasting the near and distant future. The methods are implemented on the U.S. annualized monthly interest rates for services data, and their forecasting performance is also compared for this data set.
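For reference, simple exponential smoothing, the baseline that frameworks like Ata are compared against, reduces to a one-line level update. The series and smoothing constant below are hypothetical, chosen only to make the arithmetic easy to follow.

```python
def ses_forecast(series, alpha=0.3):
    """Simple exponential smoothing: maintain a level updated as
    level = alpha * y + (1 - alpha) * level, initialized at the first
    observation. The forecast for any horizon is the final level."""
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level  # flat h-step-ahead forecast

rates = [4.0, 4.2, 4.1, 4.4]  # hypothetical interest-rate series
f = ses_forecast(rates, alpha=0.5)  # 4.25
```

The single parameter alpha controls how quickly old observations are discounted, which is the simplicity that makes the method such a common benchmark.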
... These methods can be classified as qualitative and quantitative. For quantitative methods, different statistical models have been developed ( Abdollahzade et al. 2015;Gooijer and Hyndman 2005). However, with the advances and improvements of computational techniques, artificial intelligence (AI) techniques have occupied a relevant spot in the developed models (AcostaCervantes et al. 2013;Aladag et al. 2009;Andrawis et al. 2011;Du et al. 2014). ...
Chapter
Full-text available
Nowadays, good supply chain management is most important to guarantee a competitive advantage and to accomplish the value promise offered to the company's clients. To this end, it is important to reduce the uncertainty associated with demand, and it is important that the demand forecast is as accurate as possible. To achieve this, it is necessary to know the features of the demand to be forecast and, based on this, to build or choose the best and most accurate model or technique, normally the one with the fewest errors. Many statistical techniques exist, but for some 20 years, many heuristic algorithms have been developed that allow the variance associated with demand to be absorbed and forecasting errors to be reduced, with better results than those obtained by statistical methods. These methods are normally adaptive and allow techniques to be hybridized to construct different models. This document reviews these adaptive techniques, such as neural networks (NN) and hybrid methods, and in particular models based on case-based reasoning (CBR).
... In recent decades, numerous time series forecasting models have been proposed. A review of the 25-year period up to 2005 can be found in De Gooijer and Hyndman (2005). It is clear that time series forecasting methods are still dominated by two major techniques: ARIMA (Box et al., 1970) and exponential smoothing (ES) (Brown, 1959). ...
Article
Full-text available
Exponential smoothing models are simple, accurate and robust forecasting models, and because of this they are widely applied in the literature. Holt's linear trend method is a valuable extension of exponential smoothing that helps deal with trending data. In this study we propose a modified version of Holt's linear trend method that eliminates the initialization issue faced when fitting the original model and simplifies the optimization process. The proposed method is compared empirically with the most popular forecasting algorithms based on exponential smoothing and Box-Jenkins ARIMA with respect to its predictive performance on the M3-Competition data set and is shown to outperform its competitors.
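For context, a minimal sketch of the original Holt's linear trend recursions follows, using the common initialization (level = first value, trend = first difference) whose issues the abstract's modification aims to eliminate; the smoothing constants are illustrative.

```python
def holt_forecast(series, h, alpha=0.5, beta=0.5):
    """Holt's linear trend method: maintain a level and a trend,
    update both each period, and extrapolate h steps ahead.
    Initialization: level = y0, trend = y1 - y0 (the conventional
    scheme for the original model, not the paper's modified one)."""
    level, trend = series[0], series[1] - series[0]
    for y in series[1:]:
        prev_level = level
        level = alpha * y + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return level + h * trend

# A perfectly linear series is extrapolated exactly.
ys = [10.0, 12.0, 14.0, 16.0]
f = holt_forecast(ys, h=2)  # 20.0
```

The dependence of the recursions on the two initial values is exactly why initialization choices matter for short series.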
... Although this error measure for a prediction model presents serious problems when the measured value is zero [40], it can be used without issue given the physiological range of glucose (40-400 mg/dl) [1] in which we are working. To improve the predictive capacity of our PLS regression model, the spectra were processed: the electrical impedance spectrum was differentiated twice so that the presence of peaks became more evident [35], as shown in Figure 3.12. ...
Thesis
Full-text available
Diabetes mellitus is a chronic disease in which blood glucose reaches very high levels. These abnormal glucose levels can damage large and small blood vessels and can cause blindness, kidney disease, limb amputations, and cardiovascular disease. According to Mexico's National Health Information System, more than 10 million people are currently diagnosed with diabetes, and it is the leading cause of overall mortality in Mexico, accounting for 13.6% of deaths. In diabetic patients, frequent monitoring of blood glucose concentrations is crucial for following an effective treatment and thus reducing mortality. However, self-testing requires the diabetic person to draw a drop of blood from the fingertip. This method is uncomfortable and painful for the patient, especially when glucose must be measured several times a day, and it also carries a risk of infection. For this reason, efforts have been made to develop a non-invasive glucose meter based on optical or electrical methods. This work describes a joint technique based on electrical impedance measurements and near-infrared (NIR) spectroscopy to predict capillary glucose concentrations. Ten healthy volunteers participated in the experiment, with glucose concentrations from 94 to 223 mg/dl. The root mean square error of prediction (RMSEP) of the joint electro-optical technique was 21.96 mg/dl, and the mean absolute percentage error was 13.31%. Clarke error grid analysis showed that 77.86% of the values fell in zone A, 22.14% in zone B, and none in zones C–E. These results provide preliminary evidence that a joint electro-optical technique can be used to predict capillary glucose concentrations non-invasively.
Further studies should be carried out to evaluate the usefulness and limitations of this technique in long-term tests, as well as to consider the effects of environmental and physiological factors on the prediction model.
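The first snippet above notes that the percentage error measure used there breaks down when the measured value is zero, but is safe over the physiological glucose range. A minimal sketch of MAPE with that guard (the function name and the numbers are illustrative, not from the thesis):

```python
def mape(actual, forecast):
    """Mean absolute percentage error, in percent.

    Undefined when any actual value is zero, hence the guard; over the
    physiological glucose range (40-400 mg/dl) the guard can never trigger.
    """
    if any(a == 0 for a in actual):
        raise ValueError("MAPE is undefined when an actual value is zero")
    n = len(actual)
    return 100.0 / n * sum(abs((a - f) / a) for a, f in zip(actual, forecast))

# Two glucose readings (mg/dl) and their predictions:
print(mape([100.0, 200.0], [90.0, 220.0]))  # 10.0
```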
... An alternative is to employ a demand model based on a specific forecasting technique, such as a time series forecasting technique. De Gooijer and Hyndman (2005) provide a comprehensive review of time series forecasting techniques. Moreover, Poler and Mula (2011) propose an automatic selection method among these techniques to better adapt to different work settings and reduce forecast errors. ...
Article
This research discusses procurement planning problems arising in global sourcing. The main difficulty is caused by the geographically long distance between buyer and supplier, which results in long lead times when maritime transport is used. Customer demands for finished products usually evolve during the shipment, so extra costs are incurred due to unpredictable overstocks or stockouts. This thesis presents adaptive planning approaches to make adequate long-distance procurement plans in a cost-efficient manner. Firstly, an adaptive procurement planning framework is presented. The framework deploys demand forecasting and optimal planning in a rolling-horizon scheme. In each sub-horizon, demands are assumed to follow some known distribution patterns, while the distribution parameters are estimated based on up-to-date demand forecasts and forecast accuracy. Then a portable processing module is presented to transform the sub-horizon planning problem into an equivalent standard lot-sizing problem with stochastic demands. Secondly, optimal or near-optimal procurement planning methods are developed to minimize expected total costs, including setup, inventory holding and stockout penalty, in sub-horizons. Two extreme stockout assumptions are considered: backorder and lost sale (or outsourcing). The proposed methods can serve as benchmarks to evaluate other methods. Numerical tests have validated the high efficiency and effectiveness of both the sub-horizon planning methods and the overall adaptive planning approaches.
... There is no consensus on which is best, so it is generally recommended to use several. We shall consider all measures listed in the surveys of [6,7] and the article [20]. ...
Article
Full-text available
Only two Croston-style forecasting methods are currently known for handling stochastic intermittent demand with possible demand obsolescence: TSB and HES, both shown to be unbiased. When an item becomes obsolescent, TSB's forecasts decay exponentially, while HES's decay hyperbolically. We describe a third variant, called Linear-Exponential Smoothing, that is also unbiased, decays linearly to zero in finite time, is asymptotically the best variant for handling obsolescence, and performs well in experiments.
... In general, a wide range of statistical and artificial intelligence techniques have been developed for process forecasting. Statistical time series methods are based on the assumption that the data have an internal structure that can be identified using simple and partial autocorrelations [1], [2], [3], [4]. Time series forecasting methods detect and exploit such structure. ...
Article
This paper presents a model for predicting the next-day energy production of a photovoltaic solar plant. The model is capable of forecasting the next-day production profile of such a system merely by using the information obtained from the plant itself and the global solar radiation values for the previous operation days. This prediction is key in many photovoltaic systems in order to interact with conventional electrical grids. For example, Spanish legislation requires this type of information for large photovoltaic plants; in fact, deviations from the predicted values are financially penalized. A three-stage procedure is used to build the model, which is capable of learning specific information about each facility and of using this information to fit the prediction. The model binds the use of regression techniques with a special type of probabilistic finite automata developed from machine learning. The yearly energy prediction error is less than 20 percent, which is a significant improvement over previously proposed models, whose errors are around 25 percent.
... Next we survey error measures, largely based on [10,11]. No one error measure is generally accepted as useful on intermittent demand, and opinion is highly divided [24]. ...
Article
Full-text available
To compare different forecasting methods on demand series we require an error measure. Many error measures have been proposed, but when demand is intermittent some become inapplicable, some give counter-intuitive results, and there is no agreement on which is best. We argue that almost all known measures rank forecasters incorrectly on intermittent demand series. We propose several new error measures with wider applicability, and correct forecaster ranking on several intermittent demand patterns. We call these "mean-based" error measures because they evaluate forecasts against the (possibly time-dependent) mean of the underlying stochastic process instead of point demands.
... The existing diagnostic techniques [8], [14–17] do not fully solve the problem of predicting loss of operational stability in PECs, owing to some specific features of the nonlinear dynamics in systems of this class, as well as to the high modulation frequency [18]. Nevertheless, approaches to solving this problem are starting to form. ...
Article
The possibility of diagnosing bifurcation phenomena in the dynamics of a PWM converter by the symbolic and the spectral methods is explored. The investigation is carried out using an experimental DC PWM drive setup.
... In any comparison of forecasting methods we must choose accuracy measures. [9] lists 17 measures, noting that a "bewildering array of accuracy measures have been used to evaluate the performance of forecasting methods", that no single method is generally preferred, and that some are not well-defined on data with intermittent demand. We shall use measures that have been recently recommended, such as those scaled by the naive one-step difference term Σ_{i=2}^{n} |y_i − y_{i−1}|. ...
Article
Croston's method is generally viewed as superior to exponential smoothing when demand is intermittent, but it has the drawbacks of bias and an inability to deal with obsolescence, in which an item's demand ceases altogether. Several variants have been reported, some of which are unbiased on certain types of demand, but only one recent variant addresses the problem of obsolescence. We describe a new hybrid of Croston's method and Bayesian inference called Hyperbolic-Exponential Smoothing, which is unbiased on non-intermittent and stochastic intermittent demand, decays hyperbolically when obsolescence occurs and performs well in experiments.
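The accuracy-measure snippet above ends with the naive one-step difference sum Σ|y_i − y_{i−1}|, which is the scaling term used by the mean absolute scaled error (MASE). A minimal sketch of that scaling, with made-up numbers:

```python
def mase(actual, forecast, history):
    """Mean absolute scaled error: out-of-sample absolute errors divided by
    the in-sample mean absolute naive difference, sum|y_i - y_{i-1}| / (n - 1)."""
    n = len(history)
    scale = sum(abs(history[i] - history[i - 1]) for i in range(1, n)) / (n - 1)
    errors = [abs(a - f) for a, f in zip(actual, forecast)]
    return sum(errors) / (len(errors) * scale)

history = [10, 12, 11, 13, 12]            # naive differences 2, 1, 2, 1 -> scale 1.5
print(mase([14, 13], [13, 13], history))  # (1 + 0) / (2 * 1.5) = 0.333...
```

Unlike percentage-based measures, this scaling stays well-defined on intermittent series as long as the in-sample demand is not entirely constant.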
... In recent decades, numerous time series forecasting models have been proposed. De Gooijer and Hyndman [18] performed a review of the last 25 years. Time series forecasting software tools usually offer a variety of techniques, some of which give the user the possibility to automatically define parameters or even to select the best forecasting method. ...
Chapter
Companies survive in saturated markets by trying to be more productive and more efficient. In this context, managing finished-goods inventories more accurately becomes critical for make-to-stock production companies. In this paper, an inventory replenishment expert system with the objectives of improving service quality and reducing holding costs is proposed. The Inventory Replenishment Expert System (IRES) is based on periodic-review inventory control and time series forecasting techniques. IRES proposes the most effective replenishment strategy for each supply class derived from an ABC-XYZ analysis.
... A large variety of statistical and artificial intelligence techniques have been developed for process forecasting [14]. Time series methods are based on the assumption that the data have an internal structure, such as autocorrelation, trend, or seasonal variation. ...
Conference Paper
Full-text available
Variations of solar irradiance are known to have a significant influence on electric power generation by solar energy systems. With high connection densities of PV systems in the low voltage (LV) network, this might degrade electric power quality. The present study describes a multiplicative ARMA model to generate instantaneous series of global irradiation. The data set used in this work, consisting of five-minute global irradiance data, was recorded at a radiometric station located in southern Spain (Córdoba) during a four-year period (1994–1997). The development of these models is based on removing the annual periodicity and seasonal variation of solar radiation. The proposed method considers fitting an AR model to the data, with the order of the model selected on the basis of seven different criteria. The predicted values of solar radiation are compared with the observed data series, and it was found that this approach leads to optimal predictions.
... Prediction on data streams in particular, which involve huge amounts of data and high-speed updates, is an important technique in many domains. Many prediction approaches have been proposed so far [4]. One of the traditional approaches is exponential smoothing, a heuristic method based on a weighted mean of past data. ...
Conference Paper
In this paper, we propose a new technique for time-series prediction. We assume that time-series data occur depending on an event which is not observed directly, and we estimate future data as the output of the most likely event at that time. We model time series based on event sequences using a Hidden Markov Model (HMM), and extract time-series patterns as trained HMM parameters. However, we cannot apply the HMM approach to data stream prediction in a straightforward manner. This is because the Baum-Welch algorithm, the traditional unsupervised HMM training algorithm, requires many stored historical data points and scans them many times. We therefore apply the incremental Baum-Welch algorithm, an online HMM training method, and estimate HMM parameters dynamically to adapt to new time-series patterns. We present experimental results to show the validity of our method.
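The snippet above describes exponential smoothing as a heuristic weighted mean of past data. A minimal sketch of simple exponential smoothing (the function name and values are illustrative):

```python
def ses_forecast(series, alpha):
    """Simple exponential smoothing: the level is updated as a weighted mean
    of the newest observation and the previous level, so the weights on past
    observations decay geometrically with age."""
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level  # flat one-step-ahead forecast

print(ses_forecast([10.0, 12.0, 11.0, 13.0], alpha=0.5))  # 12.0
```

Because each update touches only the current level, the method needs no stored history, which is what makes it attractive in streaming settings.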
... In particular, valuable tools for forecasting and time series processing appear in statistics and signal processing. Reference [8] is a recent and comprehensive review of this research over the past 25 years. References [6] and [11] present additional work in this area. ...
Article
Full-text available
Time series data is common in many settings, including scientific and financial applications. In these applications, the amount of data is often very large. We seek to support prediction queries over time series data. Prediction relies on model building, which can be too expensive to be practical if it is based on a large number of data points. We propose to use statistical tests of hypotheses to choose a proper subset of data points to use for a given prediction query interval. This involves two steps: choosing a proper history length and choosing the number of data points to use within this history. Further, we use an I/O-conscious skip list data structure to provide samples of the original data set. Based on the statistics collected for a query workload, which we model as a probability mass function (PMF) over query intervals, we devise a randomized algorithm that selects a set of pre-built models (PMs) to construct, subject to a maintenance cost constraint when there are updates. Given this set of PMs, we discuss interesting query processing strategies for not only point queries, but also range, aggregation, and JOIN queries. We conduct a comprehensive empirical study on real world datasets to verify the effectiveness of our approaches and algorithms.
Chapter
Big data analysis has been the key to anomaly detection in industrial systems using the Industrial Internet of Things (IIoT). How to effectively detect anomalies using industrial spatial-temporal sensor data is a challenging issue. Deep learning-based anomaly detection methods have been widely used for abnormal detection and fault identification, with limited success. The Temporal Convolutional Network (TCN) has the advantages of a parallel structure, a larger receptive field and stable gradients. In this work, we propose a new industrial anomaly detection model based on TCN, called IAD-TCN. In order to highlight the features related to anomalies and improve the detection ability of the model, we also introduce an attention mechanism into the model. Experimental results on real industrial datasets show that the IAD-TCN model outperforms the traditional TCN model, the long short-term memory (LSTM) model, and the bidirectional long short-term memory (BiLSTM) model.
Article
Full-text available
Because of its complexity, nonlinearity, and volatility, stock market forecasting is either highly difficult or yields very unsatisfactory outcomes when traditional time series or machine learning techniques are used. To cope with this problem and improve prediction accuracy for the complex stock market, we propose a new hybrid method based on a new version of EMD and a deep learning technique known as the long short-term memory (LSTM) network. The forecasting precision of the proposed hybrid ensemble method is evaluated using the KSE-100 index of the Pakistan Stock Exchange. Using a new version of EMD that uses the Akima spline interpolation technique instead of cubic spline interpolation, the noisy stock data are first divided into multiple components, technically known as intrinsic mode functions (IMFs), varying from high to low frequency, and a single monotone residue. The highly correlated sub-components are then used to build the LSTM network. By comparing the proposed hybrid model with a single LSTM and other ensemble models such as the support vector machine (SVM), Random Forest, and Decision Tree, its prediction performance is thoroughly evaluated. Three alternative statistical metrics, namely root mean square error (RMSE), mean absolute error (MAE) and mean absolute percentage error (MAPE), are used to compare the aforementioned techniques. The empirical results show that the suggested hybrid Akima-EMD-LSTM model beats all other models considered in this study and is therefore recommended as an effective model for the prediction of non-stationary and nonlinear complex financial time series data.
Article
Demand forecasting is a crucial process for any company, whether supplier, manufacturer or retailer. The literature contains a large body of research on time series forecasting techniques. In many cases, however, selecting the best time series forecasting model for each demand history remains a complex problem. This article proposes a new procedure for the automatic determination of the parameters of time series forecasting methods, based on measuring forecast accuracy with a rolling-horizon ex-ante prediction simulation method.
Article
This paper investigates the multiscale properties and the evolution patterns of risk connectedness in global equity markets using a sample from January 03, 2000 to December 24, 2018. A GARCH-EVT-VaR model is used to measure the systemic risk of global equity markets. We apply a univariate Generalised Autoregressive Conditional Heteroskedasticity (GARCH) model and Extreme Value Theory (EVT) to measure the Value at Risk (VaR) of global equity markets. We then introduce the maximal overlap discrete wavelet transform (MODWT) method and partial correlation coefficients into complex network theory to construct multiscale and partial correlation networks in global equity markets. We find evidence of strong risk connectedness among global equity markets in both the overall and multiscale networks, and their topological properties vary across time-frequency horizons. The US and Eurozone equity markets play a predominant role in the process of risk transmission. Most developing markets seem to remain inactive in the multiscale networks of risk connectedness. The rolling-window analysis based on the time-frequency domain demonstrates that risk networks tend to be more concentrated not only at the mid-term scales but also during the financial crisis. Empirical results reveal multiscale risk connectedness characteristics, providing useful information for regulators to formulate macroprudential supervision policy and for investors to update their portfolio strategies and risk prevention measures.
Article
We study a practical problem of predicting the upcoming events in data streams using a novel approach. Treating event time orders as relationship types between event entities, we build a dynamic knowledge graph and use it to predict future event timing. A unique aspect of this knowledge graph embedding approach for prediction is that we enhance conventional knowledge graphs with the notion of "states"---in what we call the ephemeral state nodes---to characterize the state of a data stream over time. We devise a complete set of methods for learning relevant events, for building the event-order graph stream from the original data stream, for embedding and prediction, and for theoretically bounding the complexity. We evaluate our approach with four real world stream datasets and find that our method results in high precision and recall values for event timing prediction, ranging between 0.7 and nearly 1, significantly outperforming baseline approaches. Moreover, due to our choice of efficient translation-based embedding, the overall throughput that the stream system can handle, including continuous graph building, training, and event predictions, is over one thousand to sixty thousand tuples per second even on a personal computer---which is especially important in resource constrained environments, including edge computing.
Article
Organizations rely on accurate demand forecasts to make production and ordering decisions in a variety of supply chain positions. There is significant research on time series forecasting techniques, and a variety of forecasting methods are available in the market. However, selecting the most accurate forecasting model for a given time series has become a complicated decision. Prior studies of forecasting methods have used either in-sample or out-of-sample performance as the basis for model selection procedures, but typically fail to incorporate both in their decision-making framework. In this research, we develop an expert system for time series forecasting model selection, using both relative in-sample performance and out-of-sample performance simultaneously to train classifiers. These classifiers are employed to automatically select the best performing forecasting model without the need for decision-maker intervention. The new model selection scheme bridges the gap between using in-sample and out-of-sample performance separately. The best performing model on the validation set is not necessarily selected by the expert system, since both in-sample and out-of-sample information are essential in the selection process. The performance of the proposed expert system is tested using the monthly dataset from the M3-Competition, and the results demonstrate an overall minimum of 20% improvement in the optimality gap compared to the train/validation method. The new forecasting expert system is also applied to a real case study dataset obtained from MonarchFx (a distributed logistics solutions provider). The result demonstrates a robust predictive capability with lower mean squared errors, which allows organizations to achieve a higher level of accuracy in demand forecasts.
Article
Forecast combination has been proved to be an effective way to improve forecasting accuracy. Most of the combining methods now available belong to performance-based weighting strategies, which judge the individual models mainly on the basis of their in-sample forecasting accuracy. Less attention has been paid to the characteristics underlying the distribution or the shape of forecasts from individual forecasters. However, the information hidden in the distributions is of great value, because differences in shape indicate distinct responses to the same pattern of a given time series. In this paper, a cloud model based hybrid method for combining forecasts (CMBCF) is proposed. In general, the new framework attempts to extract the local distribution characteristics of forecast series by transforming the series into several cloud models. After a similarity comparison of the series represented in the form of cloud models, CMBCF assigns dynamic weights to individual models and constructs the final combined forecast. The experimental results based on widely used time series data sets demonstrate the advantage of CMBCF over several traditional and state-of-the-art combining strategies.
Conference Paper
Solar radiation forecasting is important for multiple fields, including solar energy power plants connected to the grid. To address the need for hourly solar radiation forecasts, this paper proposes the use of statistical and data mining techniques that allow different solar radiation hourly profiles for different days to be found and established. A new method is proposed for forecasting solar radiation hourly profiles using the daily clearness index. The proposed method was checked using data recorded in Malaga. The results show that it is possible to forecast hourly global solar radiation for a day with an energy error of around 10%, which is a significant improvement on previously reported errors.
Conference Paper
This paper compares four different methods to obtain maximum likelihood estimates of the parameters of a Gompertz-lognormal diffusion process, where no analytical solution for the likelihood equations exists. A recursive method, a Newton-Raphson algorithm, a Simulated Annealing algorithm and an Evolutionary Algorithm to obtain estimates are proposed. The four methods are compared using a simulated data set. The results are compared with simulated paths of the process in terms of several error measurements.
Article
Stock market prediction is a complex and tedious task that involves the processing of large amounts of data stored in ever-growing databases. The vacillating nature of the stock market requires the use of data mining techniques like clustering for stock market analysis and prediction. Genetic algorithms and neural networks have the ability to handle complex data. In this chapter, we propose a fuzzy-based neuro-genetic algorithm, the Fuzzy-based Evolutionary Approach to Self-Organizing Map (FEASOM), to cluster stock market data. Genetic algorithms are used to train the Kohonen network for better and more effective prediction. The algorithm is tested on real stock market data of companies like Intel, General Motors, Infosys, Wipro, Microsoft, IBM, etc. The algorithm consistently outperformed a regression model, the backpropagation algorithm and the Kohonen network in predicting stock market values.
Article
Full-text available
In this work we present a large scale comparison study for the major machine learning models for time series forecasting. Specifically, we apply the models on the monthly M3 time series competition data (around a thousand time series). There have been very few, if any, large scale comparison studies for machine learning models for the regression or the time series forecasting problems, so we hope this study would fill this gap. The models considered are multilayer perceptron, Bayesian neural networks, radial basis functions, generalized regression neural networks (also called kernel regression), K-nearest neighbor regression, CART regression trees, support vector regression, and Gaussian processes. The study reveals significant differences between the different methods. The best two methods turned out to be the multilayer perceptron and the Gaussian process regression. In addition to model comparisons, we have tested different preprocessing methods and have shown that they have different impacts on the performance.
Chapter
Full-text available
In this paper, the three main forecasting topics that are currently getting the most attention in electric power systems are addressed: load, wind power and electricity prices. Each of these time series exhibits its own stylized features and is therefore forecasted in a very different manner. The complete set of forecasting models and techniques included in this review constitutes a guided tour of power systems forecasting. Keywords: Electricity markets; Electricity price forecasting; Short-term load forecasting; Time series models; Wind power forecasting
Conference Paper
Full-text available
Hot-spot event accessing has recently received considerable attention in event stream historical analysis systems. Predicates in SQL (Structured Query Language) requests usually show short-term similarity in event stream systems, which means that events frequently queried recently might be queried again in the near future. This paper proposes a prediction model to forecast query predicates and then choose them for speculative execution. We propose an adaptive two-level scoring (TLS) prediction algorithm, which can adjust parameters according to system resource usage conditions. We introduce two metrics, accuracy rate and efficiency rate, for query prediction evaluation, and make a detailed analysis of system costs. Our experimental results in the DBroker system demonstrate that the TLS algorithm and the local speculative execution method can significantly reduce query response time.
Article
Forecasting industrial end-use natural gas consumption is an important prerequisite for efficient system operation and a basis for planning decisions. This paper presents a novel prediction model that provides forecasting in a medium-term horizon (1–3 years) with a very high resolution (days) based on a decomposition approach. The forecast is obtained by the combination of three different components: one that captures the trend of the time series, a seasonal component based on the Linear Hinges Model, and a transitory component to estimate daily variations using explanatory variables. The flexibility of the model allows describing demand patterns in a very wide range of historical profiles. Furthermore, the proposed method combines a very simple representation of the forecasting model, which allows the expert to integrate judgmental analysis and adjustment of the statistical forecast, with accuracy and high computational efficiency. Realistic case studies are provided.
Article
Demand forecasting is an essential process for any firm, whether it is a supplier, manufacturer or retailer. A large number of research works about time series forecasting techniques exist in the literature, and there are many time series forecasting tools. In many cases, however, selecting the best time series forecasting model for each time series to be dealt with is still a complex problem. In this paper, a new automatic selection procedure for time series forecasting models is proposed. The selection criterion has been tested using the set of monthly time series of the M3 Competition and two basic forecasting models, obtaining interesting results. This selection criterion has been implemented in a forecasting expert system and applied to a real case, a firm that produces steel products for construction, which automatically performs monthly forecasts on tens of thousands of time series. As a result, the firm has increased the level of success of its demand forecasts.
Article
Full-text available
We assess the usefulness of pre-testing for seasonal roots, based on the HEGY approach, for out-of-sample forecasting. It is shown that if there are shifts in the deterministic seasonal components then the imposition of unit roots can partially robustify sequences of rolling forecasts, yielding improved forecast accuracy. The analysis is illustrated with two empirical examples where more accurate forecasts are obtained by imposing more roots than is warranted by HEGY. The issue of assessing forecast accuracy when predictions of any one of a number of linear transformations may be of interest is also addressed.
Article
Full-text available
Aggregating information by combining forecasts from two or more forecasting methods is an alternative to using just a single method. In this paper we provide extensive empirical results showing that combined forecasts obtained through weighted averages can be quite accurate. Five procedures for estimating weights are investigated, and two appear to be superior to the others. These two procedures provide forecasts that are more accurate overall than forecasts from individual methods. Furthermore, they are superior to forecasts found from a simple unweighted average of the same methods.
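The abstract above reports that weighted averages of forecasts can beat the individual methods being combined. As an illustrative sketch of one common weight-estimation procedure (inverse mean-squared-error weighting; the specific scheme and the numbers are ours, not the paper's):

```python
def inverse_mse_weights(past_errors):
    """Weight each method in inverse proportion to its past mean squared
    error, normalised so the weights sum to one."""
    mses = [sum(e * e for e in errs) / len(errs) for errs in past_errors]
    inv = [1.0 / m for m in mses]
    total = sum(inv)
    return [w / total for w in inv]

def combine(forecasts, weights):
    """Weighted-average combined forecast."""
    return sum(w * f for w, f in zip(weights, forecasts))

# Method 1 has past MSE 1, method 2 has past MSE 4 -> weights 0.8 and 0.2.
w = inverse_mse_weights([[1.0, -1.0], [2.0, -2.0]])
print(combine([100.0, 110.0], w))  # 102.0
```

The combined forecast leans toward the historically more accurate method while still drawing on the other, which is the intuition behind performance-based weighting.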
Article
A vector of continuous proportions consists of the proportions of some total accounted for by its constituent components. An example is the proportions of world motor vehicle production by Japan, the USA and all other countries. We consider the situation where time series data are available and where interest focuses on the proportions rather than the actual amounts. Reasons for analysing such times series include estimation of the underlying trend, estimation of the effect of covariates and interventions, and forecasting. We develop a state space model for time series of continuous proportions. Conditionally on the unobserved state, the observations are assumed to follow the Dirichlet distribution, often considered to be the most natural distribution on the simplex. The state follows the Dirichlet conjugate distribution which is introduced here. Thus the model, although based on the Dirichlet distribution, does not have its restrictive independence properties. Covariates, trends, seasonality and interventions may be incorporated in a natural way. The model has worked well when applied to several examples, and we illustrate with components of world motor vehicle production.
Article
This paper describes a Bayesian approach to forecasting. The principles of Bayesian forecasting are discussed and the formal inclusion of “the forecaster” in the forecasting system is emphasized as a major feature. The basic model, the dynamic linear model, is defined together with the Kalman filter recurrence relations and a number of model formulations are given. Multi‐process models introduce uncertainty as to the underlying model itself, and this approach is described in a more general fashion than in our 1971 paper. Applications to four series are described in a sister paper. Although the results are far from exhaustive, the authors are convinced of the great benefits which the Bayesian approach offers to forecasters.
Article
Four predictors for a temporally and contemporaneously aggregated variable are compared. The predictors are obtained by (1) temporally and contemporaneously aggregating forecasts from a multivariate model for the disaggregate data, (2) forecasting the temporally and contemporaneously aggregated variable directly, (3) forecasting the contemporaneously aggregated series and temporally aggregating the forecasts, and (4) forecasting the temporally aggregated system of variables and contemporaneously aggregating the forecasts. The basis for comparison is the mean squared forecast error. Under this criterion the first predictor is optimal if the data generation process is assumed known. Under special circumstances the third predictor may become optimal if the process has to be specified and estimated on the basis of the available data set.
Article
When considering the relative quality of forecasts, the method of comparison is relevant: should we use vertical measures, such as mean squared forecast error, or the recently developed horizontal measure, time distance? Four models for inflation in the US are considered, based on a univariate time series, a leading indicator, a univariate model combining the specifications of the two models, and a bivariate model. According to the mean squared forecast errors an AR(1) model is superior, but it performs much less well than models using a leading indicator when considered in terms of time distance. These results hold both for standard procedures and for the bootstrap reality check. (C) 2002 Published by Elsevier B.V. on behalf of International Institute of Forecasters.
Article
The simplifying operators in ARIMA (autoregressive integrated moving average) models determine the form of the corresponding forecast functions. For example, regular differences imply polynomial trends and seasonal differences certain periodic functions. The same functions also arise in the context of many other forecast procedures, such as regressions on time, exponential smoothing and Kalman filtering. In this paper we describe how the various methods update the coefficients in these forecast functions and discuss their similarities and differences. In addition, we compare the forecasts from seasonal ARIMA models and the forecasts from Winters' additive and multiplicative smoothing methods.
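The link between simplifying operators and forecast functions can be made concrete. For two regular differences, setting future shocks to zero gives a two-term forecast recursion whose solution is a straight line (a small sketch of the idea, not code from the paper):

```python
import numpy as np

# For a model with two regular differences, (1 - B)^2 y_t = e_t,
# zeroing future shocks gives the forecast recursion
#     f(h) = 2*f(h-1) - f(h-2),
# started from the last two observed values. Its solution is a
# straight line, illustrating how the simplifying operator fixes
# the form of the forecast function.
def double_diff_forecasts(y_last_two, horizon):
    f = list(y_last_two)  # [y_{T-1}, y_T]
    for _ in range(horizon):
        f.append(2 * f[-1] - f[-2])
    return np.array(f[2:])

f = double_diff_forecasts([10.0, 12.0], horizon=5)
print(f)  # [14. 16. 18. 20. 22.] — a line with slope y_T - y_{T-1} = 2
```

A seasonal difference would, by the same logic, replicate the last observed seasonal pattern in the forecast function.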
Article
The Bayesian Steady Forecasting model is generalized to a very wide class of processes other than the normal by defining the time series on the decision space. Examples of such processes are presented including a Beta‐Binomial process, a Poisson‐Gamma process and a Student‐t sample distribution steady model. Simple updating relations are given for most of the processes discussed.
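A minimal sketch of the kind of simple updating the paper describes, here for a Poisson-Gamma steady model with a power discount factor. The parameter names, the discount mechanism and the chosen values are my own illustrative assumptions, not the paper's exact formulation:

```python
# Poisson-Gamma steady model sketch: between time steps the Gamma
# parameters are discounted by a factor delta, which flattens the
# distribution and lets old data fade; each Poisson count then
# updates the parameters conjugately.
def poisson_gamma_steady(counts, delta=0.8, a0=1.0, b0=1.0):
    a, b = a0, b0
    forecasts = []
    for y in counts:
        a, b = delta * a, delta * b   # discount: forget old data
        forecasts.append(a / b)       # one-step forecast mean
        a, b = a + y, b + 1.0         # conjugate Poisson update
    return forecasts

f = poisson_gamma_steady([3, 4, 2, 5, 30, 28, 31])
```

Because the discount keeps the effective sample size bounded, the forecast mean adapts within a few steps when the count level jumps.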
Article
This article considers forecasting a single time series when there are many predictors (N) and time series observations (T). When the data follow an approximate factor model, the predictors can be summarized by a small number of indexes, which we estimate using principal components. Feasible forecasts are shown to be asymptotically efficient in the sense that the difference between the feasible forecasts and the infeasible forecasts constructed using the actual values of the factors converges in probability to 0 as both N and T grow large. The estimated factors are shown to be consistent, even in the presence of time variation in the factor model.
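The approach can be sketched in a few lines: extract the first principal component from the predictor panel and use it as a regressor for a one-step-ahead forecast. This is a simplified illustration on simulated data, not the authors' full procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an approximate one-factor model: N predictors all load on
# a persistent common factor, and the target is the factor plus noise.
T, N = 200, 50
factor = np.zeros(T)
for t in range(1, T):
    factor[t] = 0.9 * factor[t - 1] + rng.standard_normal()
loadings = rng.standard_normal(N)
X = np.outer(factor, loadings) + 0.5 * rng.standard_normal((T, N))
y = factor + 0.3 * rng.standard_normal(T)

# Estimate the factor as the first principal component of X,
# via the SVD of the demeaned predictor matrix.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
f_hat = Xc @ Vt[0]

# Diffusion-index style forecast: regress y_{t+1} on the estimated
# factor at time t, then project one step ahead.
beta = np.polyfit(f_hat[:-1], y[1:], 1)
forecast = np.polyval(beta, f_hat[-1])
```

The estimated component recovers the latent factor up to sign and scale; the regression step absorbs both, which is why the forecast does not depend on the sign indeterminacy of the principal component.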
Article
This new edition updates Durbin & Koopman's important text on the state space approach to time series analysis. The distinguishing feature of state space time series models is that observations are regarded as made up of distinct components such as trend, seasonal, regression elements and disturbance terms, each of which is modelled separately. The techniques that emerge from this approach are very flexible and are capable of handling a much wider range of problems than the main analytical system currently in use for time series analysis, the Box-Jenkins ARIMA system. Additions to this second edition include the filtering of nonlinear and non-Gaussian series. Part I of the book obtains the mean and variance of the state, of a variable intended to measure the effect of an interaction and of regression coefficients, in terms of the observations. Part II extends the treatment to nonlinear and non-normal models. For these, analytical solutions are not available so methods are based on simulation.
Article
Two separate sets of forecasts of airline passenger data have been combined to form a composite set of forecasts. The main conclusion is that the composite set of forecasts can yield lower mean-square error than either of the original forecasts. Past errors of each of the original forecasts are used to determine the weights to attach to these two original forecasts in forming the combined forecasts, and different methods of deriving these weights are examined.
Article
The purpose of this paper is to draw international comparisons of the coherence of indexes of leading economic indicators with selected telecommunications traffic series. The traffic series under consideration are total Australian telephone outgoing and U.S. outgoing telephone to Australia with data consisting of monthly observations spanning the period 1970–1983. The response of the telecommunications traffic to these indexes is analysed using cross-spectral techniques. Additionally, a dynamic regression forecasting model for Australian traffic is estimated using the Australian index as an explanatory variable. In comparison to an ARIMA model for the telecommunications data this model reduces post-sample MSE by 19 percent.
Article
A procedure for deriving the variance of the forecast error for Winters' additive seasonal forecasting system is given. Both point and cumulative T-step ahead forecasts are dealt with. Closed form expressions are given in the cases when the model is (i) trend-free and (ii) non-seasonal. The effects of renormalization of the seasonal factors are also discussed. The fact that the error variance for this system can be infinite is discussed, and the relationship of this property to the stability of the system is indicated. Some recommendations are given about what to do in these circumstances.
Article
The vector ARIMA (VARIMA) model is a multivariate generalization of the univariate ARIMA model. VARIMA can accommodate assumptions on exogeneity and on contemporaneous relationships. Exogenous forecasts and non-zero future shocks make it possible to generate alternative forecasts. In a case study, VARIMA describes developments in the 1970s well and competes successfully with judgemental methods and ARIMA in providing a general outlook of the early 1980s.
Article
Dynamic Bayesian models are developed for application in nonlinear, non-normal time series and regression problems, providing dynamic extensions of standard generalized linear models. A key feature of the analysis is the use of conjugate prior and posterior distributions for the exponential family parameters. This leads to the calculation of closed, standard-form predictive distributions for forecasting and model criticism. The structure of the models depends on the time evolution of underlying state variables, and the feedback of observational information to these variables is achieved using linear Bayesian prediction methods. Data analytic aspects of the models concerning scale parameters and outliers are discussed, and some applications are provided.
Article
A vector of continuous proportions consists of the proportions of some total accounted for by its constituent components. An example is the proportions of U.S. tax revenues from each of personal tax, corporate tax and social tax. We consider the situation where time series data are available and where interest focuses on the proportions rather than the actual amounts. We develop a state space model for time series of continuous proportions in which, conditionally on the unobserved state, the observations follow the Dirichlet distribution. The state follows the Dirichlet-conjugate (DC) distribution, which is introduced here. Thus the model, while based on the Dirichlet distribution, does not have its restrictive independence properties.
Article
Many short-term forecasting systems are based on exponentially weighted moving averages. It is usual to forecast the cumulative demand over a lead time or production horizon, and to describe this forecast in terms of its mean and variance. When the forecast horizon is fixed, the variance is often taken as the product of the number of periods and the variance per period. This is a serious error and typically underestimates the variance by a factor of about two. This paper details the need for a proper awareness of the correction factors.
Article
In this book, Andrew Harvey sets out to provide a unified and comprehensive theory of structural time series models. Unlike the traditional ARIMA models, structural time series models consist explicitly of unobserved components, such as trends and seasonals, which have a direct interpretation. As a result the model selection methodology associated with structural models is much closer to econometric methodology. The link with econometrics is made even closer by the natural way in which the models can be extended to include explanatory variables and to cope with multivariate time series. From the technical point of view, state space models and the Kalman filter play a key role in the statistical treatment of structural time series models. The book includes a detailed treatment of the Kalman filter. This technique was originally developed in control engineering, but is becoming increasingly important in fields such as economics and operations research. This book is concerned primarily with modelling economic and social time series, and with addressing the special problems which the treatment of such series poses. The properties of the models and the methodological techniques used to select them are illustrated with various applications. These range from the modelling of trends and cycles in US macroeconomic time series to an evaluation of the effects of seat belt legislation in the UK.
Article
We study the most basic Bayesian forecasting model for exponential family time series, the power steady model (PSM) of Smith, in terms of observable properties of one-step forecast distributions and sample paths. The PSM implies a constraint between location and spread of the forecast distribution. Including a scale parameter in the models does not always give an exact solution free of this problem, but it does suggest how to define related models free of the constraint. We define such a class of models which contains the PSM. We concentrate on the case where observations are non-negative. Probability theory and simulation show that under very mild conditions almost all sample paths of these models converge to some constant, making them unsuitable for modelling in many situations. The results apply more generally to non-negative models defined in terms of exponentially weighted moving averages. We use these and related results to motivate, define and apply very simple models based on directly specifying the forecast distributions.
Article
A dynamic, linear model for the analysis of univariate time series is proposed. It encompasses many of the common statistical models as special cases such as multiple regression, exponential smoothing and mixed autoregressive-moving average processes. Its distinguishing feature is that it relies on only one primary source of randomness. It therefore not only provides a simpler framework for the study of dynamic models but also eliminates the need for the contentious system variance matrix which has been credited with hampering the use of recursive forecasting methods in practice. The associated Kalman filter is also derived.
Article
Despite increasing applications of artificial neural networks (NNs) to forecasting over the past decade, opinions regarding their contribution are mixed. Evaluating research in this area has been difficult, due to lack of clear criteria. We identified eleven guidelines that could be used in evaluating this literature. Using these, we examined applications of NNs to business forecasting and prediction. We located 48 studies done between 1988 and 1994. For each, we evaluated how effectively the proposed technique was compared with alternatives (effectiveness of validation) and how well the technique was implemented (effectiveness of implementation). We found that eleven of the studies were both effectively validated and implemented. Another eleven studies were effectively validated and produced positive results, even though there were some problems with respect to the quality of their NN implementations. Of these 22 studies, 18 supported the potential of NNs for forecasting and prediction. © 1998 John Wiley & Sons, Ltd.
Article
Forecasting the effects of changes in advertising or pricing strategies on a company's sales or market share is an important task faced by marketing managers. This paper applies a time series approach, intervention analysis, to several marketing policy applications illustrating the flexibility and value of the method for testing hypotheses and providing forecasts. Empirical evidence is presented for two different marketing situations, one that involves a change in advertising and another that involves offering price specials.