Forecasting
... However, a panel of the US National Research Council (2000) carried out a broad assessment of UN mortality forecasts and concluded that the existence of a maximum life span is conceivable, but that it is unlikely that the possible maximum would be reached within the coming decades. This phenomenon is sometimes called 'assumption drag' (Ascher, 1978), and it has also been noted in demographic forecasts prepared by national statistical agencies (Keilman, 1990, 1997). For instance, the sharp decline in birth rates in the 1970s and the improved life expectancies of men in the 1970s, after a period of stagnation, were only gradually accommodated in forecasts. ...
Introduction: Are population processes easy to predict? The relative inertia of population stocks suggests that this is the case. Indeed, errors in population forecasts five to ten years into the future are often smaller than the errors of economic forecasts over a similar period (Ascher, 1978). However, population flows are much harder to predict (Keilman, 1990), so in the long run, population processes are much more uncertain than generally recognized. Yet many tasks of social policy, such as the planning of schools and health care, require information about the likely development of population variables twenty or thirty years into the future. Analyses of the sustainability of pension systems require an even longer view, so the US Office of the Actuary, for example, routinely prepares forecasts seventy-five years into the future (Andrews and Beekman, 1987). One way the uncertainty in population variables manifests itself is through changing views, over time, of the demographic future. For instance, a forecast of a particular population made in 2000 may differ from one made ten years earlier. New data for the period 1990-2000, different interpretations of historical developments before 1990, refined techniques of analysis and prediction: all of these shape different conditions for the forecast made in 2000 compared to the one made in 1990. As an example, consider Table 2.1. It shows UN forecasts of the 2050 old-age dependency ratio (OADR), i.e. the ratio of the elderly population (aged 65+) to the working-age population (aged 20-64). © Cambridge University Press, 2008, 2009.
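The OADR definition given above is simple enough to make concrete. The following is an illustrative sketch only (the age structure is made up, not taken from the UN forecasts the passage discusses):

```python
def oadr(pop_by_age):
    """Old-age dependency ratio: population aged 65+ divided by the
    working-age population aged 20-64 (as defined in the text)."""
    elderly = sum(n for age, n in pop_by_age.items() if age >= 65)
    working = sum(n for age, n in pop_by_age.items() if 20 <= age <= 64)
    return elderly / working

# Toy population: 1000 people at every single year of age 0-100.
pop = {age: 1000 for age in range(101)}
# 36 elderly ages (65-100) over 45 working ages (20-64):
print(round(oadr(pop), 3))  # → 0.8
```

A real computation would of course use projected age distributions for the target year (here, 2050) rather than a uniform toy population.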
Background
Despite the significance of demand forecasting accuracy for the registered nurse (RN) workforce, few studies have evaluated past forecasts.
Purpose
This paper examined the ex post accuracy of past forecasting studies of RN demand and explored the determinants of forecast accuracy.
Methods
Data were collected by systematically reviewing national reports and articles on RN demand forecasts. The mean absolute percentage error (MAPE) was used to measure forecasting error by comparing each forecast with actual demand (employed RNs). Nonparametric tests (the Mann–Whitney test and the Kruskal–Wallis test) were used to analyze differences in the MAPE across methodological and researcher factors.
Results
A total of 105 forecast horizons and 196 forecasts were analyzed. The average MAPE over all forecast horizons was 34.8%. Among the methodological factors, the determinant that most commonly affected forecast accuracy was the RN productivity assumption. The MAPE increased with both the length of the forecast horizon and the length of the data period. No significant differences were found among the researcher factors.
Conclusions
To improve demand forecast accuracy, future studies need to accurately measure RN workload and productivity in a manner consistent with the real world.
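The accuracy measure used in the study above, MAPE, is straightforward to state in code. A minimal sketch, with hypothetical RN demand figures invented purely for illustration:

```python
def mape(actual, forecast):
    """Mean absolute percentage error, in percent: the average of
    |actual - forecast| / actual over all forecast targets."""
    assert len(actual) == len(forecast) and all(a != 0 for a in actual)
    errors = [abs(a - f) / a for a, f in zip(actual, forecast)]
    return 100.0 * sum(errors) / len(errors)

# Hypothetical numbers (not from the study): actual employed RNs vs.
# the demand forecast for three target years.
employed_rns = [180_000, 190_000, 200_000]
forecast     = [150_000, 170_000, 260_000]
print(round(mape(employed_rns, forecast), 1))  # → 19.1
```

Note that MAPE weights a given absolute error more heavily when the actual value is small, which is one reason it is usually reported alongside the forecast horizon length, as the study does.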
This Intergovernmental Panel on Climate Change Special Report (IPCC-SREX) explores the challenge of understanding and managing the risks of climate extremes to advance climate change adaptation. Extreme weather and climate events, interacting with exposed and vulnerable human and natural systems, can lead to disasters. Changes in the frequency and severity of the physical events affect disaster risk, but so do the spatially diverse and temporally dynamic patterns of exposure and vulnerability. Some types of extreme weather and climate events have increased in frequency or magnitude, but populations and assets at risk have also increased, with consequences for disaster risk. Opportunities for managing risks of weather- and climate-related disasters exist or can be developed at any scale, local to international. Prepared following strict IPCC procedures, SREX is an invaluable assessment for anyone interested in climate extremes, environmental disasters and adaptation to climate change, including policymakers, the private sector and academic researchers.
Concepts from the Enlightenment and the historical origins of modern social sciences are used to discuss how futures studies deserves recognition as a social science in its own right and as a needed component of the curricula of other disciplines as well, especially in public administration. In focus groups, undergraduate students who had just completed a course in futures studies identified what they would emphasize if they become teachers of our field. They would emphasize critical thinking, individual relevance and empowerment, interrelatedness, technology as a two-sided agent of change, a risk management approach to understanding crises and opportunities, past efforts to anticipate possible futures, developing scenarios using the Societal, Technological, Economic, Environmental, and Political framework, environmental scanning and backcasting, and especially the importance of Enlightenment values in framing preferred futures. As teachers, they would use technology extensively but were sharply divided on whether futures studies should be taught in an online only format.
This paper focuses on problems and their causes and cures in policy and planning for large-infrastructure projects. First, it identifies as the main problem in major infrastructure developments pervasive misinformation about the costs, benefits, and risks involved. A consequence of misinformation is cost overruns, benefit shortfalls, and waste. Second, it explores the causes of misinformation and finds that political-economic explanations best account for the available evidence: planners and promoters deliberately misrepresent costs, benefits, and risks in order to increase the likelihood that it is their projects, and not those of their competition, that gain approval and funding. This results in the ‘survival of the unfittest’, in which often it is not the best projects that are built, but the most misrepresented ones. Finally, it presents measures for reforming policy and planning for large-infrastructure projects with a focus on better planning methods and changed governance structures, the latter being more important.
Many statistical agencies routinely produce population forecasts, and revise these forecasts when new data become available, or when current demographic trends indicate that an update is necessary. When a forecaster strongly revises a forecast for a certain target year (for instance, the life expectancy in 2050) from one forecast round to the next, this indicates large uncertainty in mortality predictions. The aim of this chapter is to shed more light on the uncertainty in mortality forecasts by analysing the extent to which life expectancy predictions for 2030 and 2050 were revised in subsequent rounds of population forecasts published by statistical agencies in selected countries. It updates and extends earlier work that focused on United Nations and Eurostat forecasts published between 1994 and 2004. There the conclusion was that life expectancy forecasts for 18 European countries for the year 2050 had been revised upwards systematically, by around 2 years on average over the 10-year publication period. A recent analysis based on official population forecasts for Norway published in the period 1999–2018 led to the same conclusion. Here we will show that the period of upward revisions seems to have ended for some European countries.
The evolution of 3G and Long Term Evolution (LTE) has brought a remarkable advantage to mobility-based digital solutions by providing ubiquitous Internet services to a plethora of users with various needs. Nowadays, the dramatic adoption of smart devices and increasing consumption of mobile Internet have been accelerating mobile technology development activities toward a 5G ecosystem. Therefore, industry players need to make managerial decisions and design strategies based on forecasts related to the introduction of 5G technology. The aim of this research is to enhance the understanding of technological forecasting and technology life cycle in the mobile telecommunication industry. Specifically, the study was carried out in Turkey to contribute to the engineering management field on the topic. This contribution will be realized by forecasting the ideal launch time of 5G services for a leading telecommunication company in the market, which has been integrating new mobile technologies and densifying its existing network at an unprecedented rate in recent years. Our findings reveal that the most appropriate time for the 5G deployment and launch of mobile telecommunication services in Turkey would be in August 2020. This is the time when most of the forecast parameters used for making decisions on network investments are predicted to reach maturity.
It seems that the least accurate population forecasts are those published during great historical turning points, both economic and political. Several studies have analysed forecast accuracy in Western countries, but the post-1990 development of the post-communist countries has not been analysed in this respect. The general goal of the study is to show how difficult to predict, and how poorly predicted, demographic processes proved to be during the major societal and economic turning points after the post-communist transformation began. To do this, the study first provides an exact measurement of forecast accuracy in transitioning Slovakia and Czechia. The key finding is that forecasters either did not recognise the beginning of something completely "new" or underestimated the dimensions of the turning points and the turns in recent trends. Thus, "assumption drag" shows up much more frequently than any kind of over-reaction. Implicitly, the research reopens the perennial question of whether methodological and mathematical improvements are more (or less) important than deep insight into the forecasted processes. Secondly, the study demonstrates the range and dimension of the changes that shape the demographic present and future. Here, an alternative future is built and simulated: what the populations would have looked like if the socialist system had not collapsed. This is a simple but effective way to demonstrate the extent of the turnover since 1989.
The future U.S. population will grow slowly, age significantly, and have larger minority components. These factors' effects on recreational behavior have received extensive attention, but analyses of the future number of participants have inadequately examined the simultaneous effects of demographic variables.
The results from a national cohort-component population projection employing age, race/ethnicity, and activity-specific rates of cohort participation through the year 2025 suggest that the number of participants will grow slowly due primarily to age effects, but race/ethnicity effects will also be important. The results demonstrate that detailed demographic analyses must be utilized to plan services for the future population of participants.
Failures of countries in setting and achieving renewable energy targets are prevalent, raising uncertainty about the overall contribution of renewable energy to global emission reductions. Lack of policy and incorrect modelling analysis are among the sources of the failures. Thus understanding these two sources is crucial to improve confidence about renewables. We assess errors in the projections of renewable energy capacity and production in the United States and European Union countries, which have high commitments to green energy supply. Our results show that solar energy has the lowest uncertainty due to having the most achievable projections of capacity and production. On the other hand, other renewables may entail attractive policies, and further research is needed related to advancing reliable technology and accurate weather predictions. Our findings also provide ranges of projection uncertainty of six renewable energy technologies and, at the same time, draw attention to ways to rectify the dominant errors in the renewable energy projections.
Over a decade ago, Bretschneider and Gorr proposed two directions for future research in government forecasting: one was to extend the research on developing and evaluating alternative forecasting methods and the other, to look at forecasting as a human activity and examine how organizational factors affect forecasting. What has happened since then? To see partially what has been done and what remains to be done, this paper provides a review of the literature on government revenue forecasting with a primary focus on the state level and identifies areas for future research in government revenue forecasting.
Fiscal stress has forced local governments to pay increasing attention to revenue trends and has increased the importance of financial forecasting in local government. After reviewing the role of revenue forecasting in financial planning and discussing the use of regression and econometric analysis in revenue forecasting, this article applies this technique to forecast several key revenue components in a medium-sized city. Three general conclusions may be drawn: (1) systematic revenue forecasting and long-range planning are necessities, not luxuries, (2) risk aversion to "technical" revenue forecasting can be overcome, and (3) the implementation of a systematic revenue forecasting system does not require a battery of "rocket scientists." As municipal revenue bases come to rely less on relatively stable property taxes and more on less stable sources such as sales taxes, fees, and charges, the use of a regression and econometric based model should prove increasingly fruitful.
The cohort-component population projection algorithm has generally been viewed as having one purpose, namely population forecasting. And it has been ‘canonized’ as the one best method for this purpose. A more fruitful view might be to see it first and foremost as a theoretical model of population dynamics, useful for many different purposes. At the same time, other approaches to population forecasting should be given greater attention, approaches with both advantages and disadvantages compared to the cohort-component approach (See Chap. 4 above).
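Whether viewed as a forecasting canon or, as argued above, as a theoretical model of population dynamics, the cohort-component algorithm itself is compact. A minimal one-sex sketch under strong simplifying assumptions (single-year ages, no migration, constant rates; all names and numbers here are illustrative, not from the source):

```python
import numpy as np

def project(pop, survival, fertility, years):
    """One-sex cohort-component projection.
    pop: population counts by single year of age 0..A.
    survival[a]: probability of surviving from age a to a+1.
    fertility[a]: births per person of age a per year."""
    pop = pop.astype(float).copy()
    for _ in range(years):
        births = float(fertility @ pop)   # component 1: new age-0 cohort
        aged = pop[:-1] * survival[:-1]   # component 2: survivors age by one year
        pop = np.concatenate(([births], aged))  # oldest age group dies out
    return pop

# Toy example with ages 0, 1, 2:
p = project(np.array([100, 100, 100]),
            np.array([0.9, 0.8, 0.0]),   # hypothetical survival probabilities
            np.array([0.0, 0.5, 0.0]),   # hypothetical fertility at age 1
            years=1)
print(p)  # births 50, survivors 90 and 80
```

The chapter's point is well illustrated by this structure: the same bookkeeping serves equally as a forecast engine (plug in assumed future rates) and as a theoretical model (vary the rates to study population dynamics).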
This article proposes a unifying theory, or Golden Rule, of forecasting. The Golden Rule of Forecasting is to be conservative. A conservative forecast is consistent with cumulative knowledge about the present and the past. To be conservative, forecasters must seek out and use all knowledge relevant to the problem, including knowledge of methods validated for the situation. Twenty-eight guidelines are logically deduced from the Golden Rule. A review of evidence identified 105 papers with experimental comparisons; 102 support the guidelines. Ignoring a single guideline increased forecast error by more than two-fifths on average. Ignoring the Golden Rule is likely to harm accuracy most when the situation is uncertain and complex, and when bias is likely. Non-experts who use the Golden Rule can identify dubious forecasts quickly and inexpensively. To date, ignorance of research findings, bias, sophisticated statistical procedures, and the proliferation of big data, have led forecasters to violate the Golden Rule. As a result, despite major advances in evidence-based forecasting methods, forecasting practice in many fields has failed to improve over the past half-century.
This chapter elaborates the proposals in Box 1.1 for opening the climate change regime to adaptive governance. For that purpose we pull together the historical case materials in previous chapters and relevant theoretical material. Recall that Chapter 2 reviewed the evolution of scientific management in the climate change regime and exceptions that point toward adaptive governance. Chapter 3 reviewed Barrow as a microcosm of things to come as signs of climate change become more obvious at lower latitudes, including steps toward adaptive governance. Beyond these historical case materials, however, various aspects of the proposals for adaptive governance have been accepted or recommended in general literature on climate change, environmental hazards, and related policies, and in more theoretical literature on science, policy, and decision making. These convergent sources from different and larger bodies of experience add support to the proposals for adaptive governance in climate change. In particular, these convergent sources document a latent but coherent frame of reference in which the case materials become more than mere historical curiosities. They become foundations for an alternative frame to understand and reduce net losses from climate change. The established frame in the climate change regime is not the only construction of the relevant past and possible futures.
When economics moves from “description” to “prescription,” the research shifts from basic science to applied science. Chapter 13 emphasizes the interdependence between prediction and prescription in economics, which is connected with the “descriptive” and “normative” realms. The interest moves from prediction as a test to a guide for policy-making, which involves the use of prediction for public policy as well as quantitative and qualitative considerations.
In this sphere of economics, the insufficiency of prediction is clear, as is the need for prescription. Thus, it may be a move from predictivist instrumentalism to the possible primacy of prescription. In any case, economic prescription needs values. The axiological content of prescription is relevant and involves internal and external values of prescriptions. It seems clear that the relation between prediction and prescription in economics is a central tenet for future developments.
After the aims and processes, prediction in economics requires the analysis of the issues on evaluation and limits. Chapter 12 begins with the use of prediction as a test, both in economic theory and in applied economics. The evaluation of predictions in the context of economic models is considered taking into account the problem of uncertainty and, consequently, forecast uncertainty. The appraisal of economic predictions is focused on the criteria of prediction as a test, which follows several steps: (a) main criteria in the appraisal of predictions; (b) methodological processes to the assessment of predictions (i.e., different kinds of testing); (c) the case of econometrics (as a tertium quid between laboratory experimentation and thought experiments); and (d) the existence of predictive errors and their economic costs. The limits and obstacles of prediction in economics are also studied: the limits of predictability—both epistemological and ontological—and the obstacles to predictors.
Models for forecasting changes in mortality, morbidity, and disability in elderly populations are essential to national and state policies and health and social programs. The rapid growth of the elderly and oldest-old populations has implications for the size and long-term fiscal soundness of programs such as U.S. Social Security and Medicare. Less well understood are the qualitative health and functional changes of future elderly populations and how such changes affect federal and state health policy, public and private health-care providers, and private acute and long-term care (LTC) insurance.
This chapter describes a new regional travel demand forecasting method, based on micro-simulation and dynamic analysis. In this method, socioeconomic and demographic forecasting is combined with dynamic travel demand forecasting to more accurately depict complex travel behavior. The system has two components: a micro-simulator of household socioeconomics and demographics, and a dynamic model system of household car ownership and mobility. Many explanatory variables that are exogenous to other forecasting models are endogenous in this system. Future travel behavior is predicted for each simulation year by creating an entire temporal path of change in socioeconomic, demographic, and travel demand variables. Most model parameters were estimated using observations from five waves of the Dutch National Mobility Panel (DNMP) from 1984–1988. Other sources of information were also used to estimate key parameters. This chapter reviews the model structure, data requirements, estimation methods, and assumptions. Examples of forecasting for the year 2010 illustrate the predictive capability and limitations of the new forecasting method.
Can the opinions of individual experts be amalgamated into a collective ‘best estimate’ that is more likely to be correct than the individual estimates? This question is of interest to risk analysts in taking account of differences among experts within probabilistic (and other) risk studies, and to decision makers in judging among the competing conclusions of the risk studies themselves. Several mathematically and conceptually ingenious methods, Bayesian and otherwise, have been proposed for synthesizing expert opinions, especially when expressed as probability estimates. I argue that (except in special circumstances where a large body of experience with phenomena known to be similar to those under study in virtually all relevant aspects makes the effort hardly worthwhile) no such method can be shown to be more successful than method R: adopt the estimate of a randomly chosen expert. This result is reached by criticizing a supposed parallel (in the Lewis panel’s evaluation of the Rasmussen reactor safety report’s methodology) between the track records of scientists and bookies in making judgments. The analogy wrongly presupposes that a trait of ‘reliability’ can be ascribed, in one degree or another, to both a scientist and a measuring instrument. I also argue that expert opinion pooling depends for its justification on some very debatable assumptions about the nature of probability and of science. The fundamental issue is whether mistaken agreement and thoroughgoing disagreement are the rule or the exception in science, especially in areas of importance to public policy.
That all data refer to the past and all use of data concerns the future implies a line between past and future drawn at "now." Without continuities that make extrapolation across that line possible, statistical data would be useless; indeed, the very possibility of purposeful behavior would be in doubt (Keyfitz 1977).
The mechanism by which economic, demographic, and social variables affect total national births is known with greater certainty than is the distribution of these births across regions. Indeed, the U.S. Bureau of the Census predicts convergence of regional fertility rates, whereas the MIT-Harvard Joint Center predicts a further widening of regional differences (Jackson et al. 1981). Consequently, a two-step forecasting procedure is presented here. An economic-demographic model produces forecasts of total live births and other demographic variables, and then the regional shares of national total live births are forecast. Since the mechanism by which total live births are distributed across regions is not known, a parsimonious estimation approach is adopted which uses historical data on birth shares. An alternative approach would be to estimate a separate economic-demographic model for each region, but before attempting to do so, the simpler and less costly approach developed here should be explored. Although concentrating on regional births, the approach taken in this chapter could also be used to estimate marriages and divorces by region and births, marriages, and divorces by state.
Modern systems analysis really began as “scientific management” with Frederick Taylor in 1911, gathered steam with World War II’s operations research, and was propelled forward by complex weapon system design and strategic analysis in the 1950s. It then burst full force onto the national scene with RAND’s systems analysts in the early 1960s. By 1967, Max Ways wrote in FORTUNE’S survey “The Road to 1977”:
The further advance of this new style (systems analysis) is the most significant prediction that can be made about the next ten years. By 1977, this new way of dealing with the future will be recognized at home and abroad as a salient American characteristic. (1967: 94)
Since World War II there has been a remarkable proliferation in the use of mathematical modeling under the labels of systems analysis, decision analysis, operations analysis, management science, and policy analysis. Technology forecasting, impact, and transfer have also reflected the trend. However, serious limitations are inherent in these reductionist approaches and they have created a serious gap between the model and the real-world decision maker. This discussion focuses on the problem and a means to overcome it.
This paper discusses the evolution of linked economic-demographic models. The benefits of linking models are explored and illustrated using the major national and regional econometric models in the U.S. and Canada. The current status of linked models is evaluated and future directions of development are suggested.
If death rates at each age in France remained at current levels over the lifetimes of babies born in France this year, then more than half the babies would live to celebrate their 80th birthdays. Among baby girls, two-thirds would become octogenarians and half would reach age 85. Death rates have been declining in France (and in most other developed countries as well) for many decades. In particular, death rates among octogenarians and nonagenarians have fallen substantially since 1950. Extrapolating these rates of improvement into the future yields an astonishing result: half of all French babies may survive to celebrate their 95th birthdays and half of French girl babies may become centenarians.
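The extrapolation logic in the passage above can be sketched numerically. The death-rate schedule below is hypothetical (a simple exponential in age, not actual French data), and the 1% annual improvement rate is likewise an assumption chosen only to show the mechanism:

```python
import math

def survival_to(age, m, improvement=0.0):
    """Probability of surviving from birth to `age`.
    m: dict mapping age -> annual death rate (hazard).
    With improvement r > 0, the cohort faces rate m[a] * (1 - r)**a at
    age a, i.e. rates keep falling over the cohort's lifetime; with
    r = 0, current rates are frozen (a period life table)."""
    s = 1.0
    for a in range(age):
        s *= math.exp(-m[a] * (1 - improvement) ** a)
    return s

# Toy death rates rising ~9% per year of age (illustrative only):
m = {a: 0.0001 * math.exp(0.09 * a) for a in range(101)}
frozen = survival_to(80, m)                     # rates held at current levels
improving = survival_to(80, m, improvement=0.01)  # extrapolated declines
print(frozen < improving)  # → True: improvement raises survival to 80
```

The contrast between the two calls mirrors the passage's contrast: freezing current rates understates cohort survival whenever mortality keeps declining, which is why extrapolating the observed improvements yields such striking survivorship figures.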
Since the 1960s, the US has depended primarily on a ‘command and control’ approach to managing and regulating environmental and public health risks. These prescriptions have succeeded in reducing some categories of gross pollution. However, they can also be overly rigid and economically inefficient, stifle innovation and competitive ability, and sometimes produce unintended consequences that increase pollution or risk. Such methods are inappropriate for responding to the risks from complex and highly uncertain problems. We propose a conceptual framework for developing regulation and management practices from first principles. This framework ties policy implementation to fundamental values and beliefs about how the world works. It helps to reveal the implications of different policy and regulatory strategies and shows that certain values and beliefs, and the strategies that stem from them, are more appropriate for such issues than other, more traditional approaches. In addition, we discuss how the policy development process itself reflects underlying values and is as important as the content of the policy itself. In a context where risks are highly uncertain and solutions are not definitive, useful content can be developed and improved only through an effective process.
The author summarizes the most important lessons he has learned in more than 40 years of studying planning from the perspectives of planning theory, planning methods, and computer applications in planning. He suggests that planners' traditional professional roles, models, and methods often fail to adequately consider alternative futures and unnecessarily restrict meaningful public participation. He proposes that these problems can be overcome, but not easily, by adopting more community-centered approaches of planning with the public and by using simple, easy-to-use, and understandable models and methods. The author recognizes that following his advice in practice raises many difficult questions that he cannot currently answer.
Olivier Tabatoni and Pierre A. Michel, L'évaluation de l'entreprise, Presses Universitaires de France, Paris, 1979, 180 pages.
This study considers the accuracy of national population forecasts of the Netherlands and the Czechoslovak Socialist Republic (CSSR) produced by the statistical agencies of these countries after World War II. Attempts are made to link patterns of ex-post errors to changes in forecasting methodology. We look at the demographic components employed in each forecast, the procedure to extrapolate fertility and the level at which assumptions for each component are formulated. Errors in total population size, fertility, mortality and foreign migration, and age structure are considered. We discuss trends in errors and methodology since 1950 and compare the situations in the two countries. The findings suggest that methodology has only a very limited impact on the accuracy of national population forecasts.
This article presents results from the first statistically significant study of cost escalation in transportation infrastructure projects. Based on a sample of 258 transportation infrastructure projects worth $90 billion (U.S.), it is found with overwhelming statistical significance that the cost estimates used to decide whether important infrastructure should be built are highly and systematically misleading. The result is continuous cost escalation of billions of dollars. The sample used in the study is the largest of its kind, allowing for the first time statistically valid conclusions regarding questions of cost underestimation and escalation for different project types, different geographical regions, and different historical periods. Four kinds of explanation of cost underestimation are examined: technical, economic, psychological, and political. Underestimation cannot be explained by error and is best explained by strategic misrepresentation, i.e., lying. The policy implications are clear: In debates and decision making on whether important transportation infrastructure should be built, those legislators, administrators, investors, media representatives, and members of the public who value honest numbers should not trust the cost estimates and cost-benefit analyses produced by project promoters and their analysts.
In the context of intense interest in identifying what works in mental health, we sought to establish a consensus on what does not work: discredited psychological assessments and treatments used with children and adolescents. Applying a Delphi methodology, we engaged a panel of 139 experts in a two-stage survey. Participants reported their familiarity with 67 treatments and 35 assessment techniques and rated each on a continuum from "not at all discredited" to "certainly discredited." The composite results suggest considerable convergence in what is considered discredited and offer a first step toward identifying discredited procedures in modern mental health practice for children and adolescents. It may prove as useful, and easier, to identify what does not work for youth as it is to identify what does work, as in evidence-based practice compilations. In either case, we can simultaneously avoid consensually identified discredited practices to eradicate what does not work and use inclusively defined evidence-based practices to promote what does work.
Approaches taken by states in their revenue forecasting are extremely diverse. This research identifies six institutional structures that states utilize in their revenue forecasting processes. The results show that the “typical” state utilizes a non-consensual approach to forecast formulation with the forecast being done by a single executive agency or cabinet office and with the executive having the final say in the forecast. The “typical” state will not have an economic advisory council, but will utilize faculty from local universities. The “typical” state updates its forecast about every six months and the forecasters perceive their forecast as binding the state budget.
Resource estimates are generally assumed to be the direct product of geological and engineering information. Historical analysis, using a perspective suggested by the sociology of science, demonstrates that social factors influence the magnitude and variation among resource estimates made during the same historical period. In terms of magnitude and variation, three chronological patterns in the estimates of United States crude oil resources can be discerned, and it is argued (1) that each pattern reflects the ideological orientation present at that time, and (2) that changes from one ideological orientation to another can be traced to changes in the political-economic environment of the oil industry.
In this paper, we argue and show how system dynamics modelling can be combined with exploratory modelling and analysis in order to address grand societal challenges, which are almost without exception characterized by dynamic complexity and deep uncertainty. Addressing such issues requires the systematic exploration of different hypotheses related to model formulation and model parametrization and their effect on the kinds of behavioral dynamics that can occur. Through two cases, we illustrate the combination of system dynamics and exploratory modelling. The first case illustrates its use for discovering different types of dynamics related to metal and mineral scarcity. The second case illustrates its use for worst-case discovery in water scarcity. We conclude that exploratory system dynamics modelling represents a promising approach for addressing deeply uncertain dynamically complex societal challenges. Copyright © 2013 John Wiley & Sons, Ltd.
Planners and decision makers in the present-day, fast-changing economic environment prefer educated guesses about the future movement of business-related variables. Forecasting of macro and micro variables is increasingly considered necessary by decision makers. Several forecasting methods are available, and forecasting for the short term as well as the long term is easily done, thanks to recent developments in statistical packages and high-speed computers. Researchers often use forecast errors to evaluate the efficiency of forecasting methods, and the present study is an attempt in this regard. It explains the meaning and relevance of statistical/technical analysis in forecasting share prices in the stock market in the short run. Forecasts were made for the weekly share price of ICICI Bank, and the results obtained using three extrapolative statistical methods are compared with the actual share prices of ICICI Bank in the forecast period.
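The abstract above refers to using forecast errors to compare the efficiency of forecasting methods. A common summary statistic for this purpose is the mean absolute percentage error (MAPE); the sketch below is a minimal illustration (the series values are invented for the example, not taken from the ICICI Bank study):

```python
def mape(actual, forecast):
    """Mean absolute percentage error, in percent.

    Averages |actual - forecast| / |actual| over all periods,
    then scales to a percentage.
    """
    errors = [abs(a - f) / abs(a) for a, f in zip(actual, forecast)]
    return 100.0 * sum(errors) / len(errors)

# Hypothetical weekly prices and one method's forecasts for the same weeks.
actual = [100.0, 110.0, 105.0]
forecast = [98.0, 115.0, 100.0]

print(round(mape(actual, forecast), 2))  # → 3.77
```

Comparing the MAPE of several extrapolative methods over the same hold-out period gives a simple ranking of their out-of-sample accuracy.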
A body of 800 experts, worldwide, on international relations was surveyed in 1977 concerning their short-term (next 5 years) and long-term (next 25 years) predictions for international conflict, political change, and economic development. We present their predictions for both time-frames, and for the 5-year predictions compare them with events that actually occurred. We compare the predictions, and their relative success for the short-term predictions, by methodological orientation (behavioral or traditional) and nationality (United States, Japanese, and Other) of the predictor. Traditionalists succeeded somewhat better than behavioralists (at least outside their subjects of expertise), and both the Americans and the Others had greater success than the Japanese, though none of the differences between groups was great. The experts typically predicted little or no change in events or trends, rather than predicting major change. This is true for short- and long-range predictions. Within a generally optimistic set of predictions for the next 25 years, the Japanese group stands out as especially so, particularly on international cooperation, war-avoidance, and prosperity.
Small-area population forecasts are used for a wide variety of planning and budgeting purposes. Using 1970 to 2005 data for incorporated places and unincorporated areas in Florida, we evaluate the accuracy of forecasts made with several extrapolation techniques, averages, and composite methods, and we assess the effects of differences in population size, growth rate, and length of forecast horizon on forecast errors. We further investigate the impact of adjusting forecasts to account for changes in special populations and annexations. The findings presented in this study will help practitioners make informed decisions when they construct and analyze small-area population forecasts.
This article raises some questions about the relationship between program evaluation and forecasting. Contrasting the two fields in terms of mind-set, purpose, problems, advantages, and use shows that although their modes of inquiry are indeed very different in many ways, there is also interdependence between the two in several areas. It is argued that each can be greatly strengthened by better understanding and use of the other's techniques.
Over the past decade sustainable development has increasingly been adopted as an objective of government policy, and "planning for sustainable development" is now a real-world activity of officials and ministries. A survey of conceptual issues, of three environmental economists' proposals to achieve sustainability, and of initial practical experience in the industrialised countries suggests that planning for environmentally sustainable development involves an ambitious attempt to reconcile environmental and development objectives. It is argued that, to the extent that this proves possible, it will be a radically disjointed process involving the interaction of many agencies and actors. At the heart of the challenge of planning for sustainable development is an issue of institutional design to which political science, despite the hesitancy it has so far displayed in engaging with the issue, could yet make a meaningful contribution.
When compared with that of regression or other forecasting techniques, a spreadsheet's presentation of data is open and accessible. The spreadsheet's main advantage is that it quickly recalculates all formulas and graphs after each input. It allows students to tinker with modeled situations, see immediate results, and thus become familiar with what-if modeling, the most widely used planning tool of management and government. Two spreadsheets are included that facilitate orderly presentation of social change concepts, including time trends, additive versus geometric growth, accelerating and decelerating cross-impacts, and leading influences and lagging effects. Keywords: forecasting, spreadsheet, social change.
City governments in the U.S. enter into agreements with private developers to provide infrastructure such as roads, drainage systems, and utilities. From a normative perspective, public managers should calculate the costs to city government to execute these public/private deals before they agree to participate. Four cost categories deserve attention: (1) initial expenses required for project approval by city officials (approval spending); (2) spending for changes during project construction (momentum spending); (3) public spending to entice private investment (incentive spending); and (4) city government spending to maintain and operate the new infrastructure (operations spending). This paper calculates spending categories in a case study of Alliance Airport, a major public/private development in Fort Worth, Texas, requiring $63 million of direct and indirect public expenditures. Officials estimated less than 25 percent of total spending before project approval. Two-thirds of the public costs are foregone revenues associated with incentive spending by city government. The findings suggest city administrators and legislators should focus more closely on a precise calculation of these four types of costs before agreeing to public/private economic development deals.
Projections of future populations are integral to many planning applications, yet are often poorly understood. This analysis focuses on the implications of the choices planners make when they construct projections. Specifically, it examines the impact of length of base period, analyzes the error structure of projection techniques for counties in the aggregate and by size and growth rates, investigates the role of averaging, and compares the performance of trend extrapolation and cohort—component methods. The article concludes by discussing forecast complexity, data quality, the role of assumptions, and other considerations of forecasting in a planning context.
This article examines the value of developing large-scale projection models. Although its primary focus is on methods that would integrate demographic and economic models to yield housing stock forecasts, the considerations apply to large-scale modeling efforts in general. A case study provides the context for practical considerations. Technical, theoretical, institutional, practical, ethical, and planning issues are considered. The institutional context within which large models are developed and used can affect the information content of forecasts. In realistic contexts, simpler projection models backed by qualitative studies can produce better forecasts than large projection models that absorb most of an agency's research budget. Theoretical and econometric considerations, as well as track record, may favor the use of simpler projection models. The commitment to modeling, however, can help staff development and reduce the role of expediency, wishful thinking, and political manipulation of forecasts. The article stresses the need for a balance between quantitative and qualitative methods.
Two reviews published in 1988 examined vendors selling demographic data for use on microcomputers (Levine 1988a, 1988b). Several changes have occurred in the past two years in the way demographic data are packaged and distributed. This review will consider the changes in the data distributed by the U.S. Bureau of the Census and three private vendors: CACI, Inc., National Decision Systems, and Woods and Poole Economics, Inc.