Management Science

Published by INFORMS
Online ISSN: 1526-5501
Print ISSN: 0025-1909
A free-standing clinic is an outpatient abortion facility that is separate from a hospital, privately owned, operated for profit, and limited in its surgical services to abortions. Using queueing theory, a methodology is developed to determine the optimal design of such a clinic in order to maximize its profits. The methodology involved the following components: 1) formulating a description of the free-standing clinic, 2) developing the clinic's objective function, 3) formulating the stopping rule, and 4) determining the clinic configuration with maximum profitability. The three main limitations of the methodology are the size of the n-space it can easily handle, the complexity of the system under study, and the absence of social welfare variables in the objective function. However, by optimally designing a clinic through use of the methodology, abortion clinic management can indirectly improve conditions that affect the mental and physical health of the patients. An example of a clinic that was actually designed using the methodology is presented.
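The abstract does not specify the queueing model used; as a minimal sketch of this kind of configuration search, one can treat the clinic as an M/M/c/K queue, where the number of procedure rooms c and total patient capacity K are the design variables. The arrival rate, fee, and room cost below are invented for the demo, not figures from the paper:

```python
from math import factorial

def mmck_probs(lam, mu, c, K):
    """Steady-state probabilities p_0..p_K of an M/M/c/K queue."""
    a = lam / mu  # offered load
    unnorm = []
    for n in range(K + 1):
        if n <= c:
            unnorm.append(a ** n / factorial(n))
        else:
            unnorm.append(a ** n / (factorial(c) * c ** (n - c)))
    total = sum(unnorm)
    return [u / total for u in unnorm]

def hourly_profit(lam, mu, c, K, fee, room_cost):
    """Expected profit rate: fee earned on admitted patients minus room cost."""
    p = mmck_probs(lam, mu, c, K)
    throughput = lam * (1 - p[K])  # arrivals that are not turned away
    return fee * throughput - room_cost * c

# enumerate a small design space and keep the most profitable configuration
lam, mu = 4.0, 1.0  # arrivals/hour and service rate per room (illustrative)
best = max(((c, K, hourly_profit(lam, mu, c, K, fee=100.0, room_cost=60.0))
            for c in range(1, 8) for K in range(c, c + 6)),
           key=lambda t: t[2])
```

Enumerating the configuration space directly, as here, mirrors the abstract's limitation about the size of the n-space the methodology can handle: the search grows combinatorially with the number of design variables.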
 
This paper assesses the validity and accuracy of firms' backward patent citations as a measure of knowledge flows from public research by employing a newly constructed data set that matches patents to survey data at the level of the research and development lab. Using survey-based measures of the dimensions of knowledge flows, we identify sources of systematic measurement error associated with backward citations to both patent and nonpatent references. We find that patent citations reflect the codified knowledge flows from public research, but they appear to miss knowledge flows that are more private and contract based in nature, as well as those used in firm basic research. We also find that firms' patenting and citing strategies affect patent citations, making citations less indicative of knowledge flows. In addition, an illustrative analysis examining the magnitude and direction of measurement error bias suggests that measuring knowledge flows with patent citations can lead to substantial underestimation of the effect of public research on firms' innovative performance. Throughout our analyses we find that nonpatent references (e.g., journals, conferences, etc.), not the more commonly used patent references, are a better measure of knowledge originating from public research. This paper was accepted by Lee Fleming, entrepreneurship and innovation.
 
The extensive use of information technologies by organizations to collect and share personal data has raised strong privacy concerns. To respond to the public's demand for data privacy, a class of clustering-based data masking techniques is increasingly being used for privacy-preserving data sharing and analytics. Although they address reidentification risks, traditional clustering-based approaches for masking numeric attributes typically do not consider the disclosure risk of categorical confidential attributes. We propose a new approach to deal with this problem. The proposed method clusters data such that the data points within a group are similar in the nonconfidential attribute values, whereas the confidential attribute values within a group are well distributed. To accomplish this, the clustering method, which is based on a minimum spanning tree (MST) technique, uses two risk-utility trade-off measures in the growing and pruning stages of the MST technique, respectively. As part of our approach we also propose a novel cluster-level microperturbation method for masking data that overcomes a common problem of traditional clustering-based methods for data masking, which is their inability to preserve important statistical properties such as the variance of attributes and the covariance across attributes. We show that the mean vector and the covariance matrix of the masked data generated using the microperturbation method are unbiased estimates of the original mean vector and covariance matrix. An experimental study on several real-world data sets demonstrates the effectiveness of the proposed approach. This paper was accepted by Sandra Slaughter, information systems.
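A minimal sketch of the cluster-level microperturbation idea (the MST growing and pruning stages are omitted, and the data, cluster labels, and normal-noise mechanism here are illustrative assumptions, not the paper's exact procedure): within each cluster, records are replaced with draws whose mean and covariance match the cluster's sample moments, so the masked data's mean vector and covariance matrix estimate the originals without bias:

```python
import numpy as np

def microperturb(X, labels, rng):
    """Cluster-level microperturbation: within each cluster, replace records
    with draws from a normal whose mean and covariance equal the cluster's
    sample moments, preserving variances and covariances in expectation."""
    Xm = np.empty_like(X, dtype=float)
    for g in np.unique(labels):
        idx = np.where(labels == g)[0]
        mu = X[idx].mean(axis=0)
        cov = np.cov(X[idx], rowvar=False)
        Xm[idx] = rng.multivariate_normal(mu, cov, size=len(idx))
    return Xm

rng = np.random.default_rng(0)
# correlated synthetic data and arbitrary cluster labels, for illustration only
X = rng.normal(size=(300, 3)) @ np.array([[1.0, 0.5, 0.0],
                                          [0.0, 1.0, 0.3],
                                          [0.0, 0.0, 1.0]])
labels = rng.integers(0, 3, size=300)
Xm = microperturb(X, labels, rng)
```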
 
Governments in certain countries have tried over the last two decades to draw most industry to the largest cities. The effect has generally been a large increase in both the population and land area of the cities, as the nearby rural areas are taken out of agricultural production and urbanized. Where the land is especially fertile, this can represent a definite loss to the country's agricultural capabilities. By studying the Nash equilibria we show that, given political pressures, there is a very real danger that this process will continue until all, or practically all, of the land has been urbanized.
 
A class of optimization problems is investigated in which some of the functions, continuous in all their arguments, have continuous right- and left-hand derivatives that are unequal at a point called the corner. For this nonclassical problem, a set of first-order necessary conditions for stationarity is determined for an optimal path which may have arcs lying on a corner for a nonzero length of time. Enough conditions are provided to construct an extremal path. This, in part, is achieved by noting that the corner defines a manifold in which the derivatives of all the functions are uniquely defined. Two examples, representing possible aggregate production and employment planning models, illustrate the theory.
 
--Editorial by Erich Fromm, National University of Mexico
 
The purpose of this paper is to explore the contributions of information systems to the organization. A descriptive model is presented which identifies (1) expected predictors of performance and the use of an information system and (2) the relationship between the use of a system and performance. The results of a study of sales force performance and the use of a sales information system are presented to provide empirical evidence to support the model. The results confirm the general relationships among classes of variables in the model, but specific relations among variables are complex and depend heavily on the environment of the organization.
 
This study investigates the effect of knowledge distribution and group structure on performance in MBA game teams. We found that group performance was contingent on the distribution of knowledge within the group and networks of social relationships among group members. Studying 39 teams of MBA students in two management simulation games, we found that, in general, groups that had broadly distributed knowledge, i.e., groups made up of members who had general knowledge, outperformed groups that had knowledge concentrated in different members, i.e., groups made up of members who had specialized or both specialized and general knowledge. However, the advantage that the former enjoyed over the latter disappeared when groups of specialists or mixed groups had decentralized network structures.
 
Ludwig von Bertalanffy, in 1951, and Kenneth Boulding, in 1956, wrote articles which have provided a modern foundation for general systems theory [Bertalanffy, L. von. 1951. General system theory: A new approach to unity of science. Human Biol. (December) 303–361; Boulding, K. 1956. General systems theory: The skeleton of science. Management Sci. (April) 197–208.]. We build on that foundation in applying general systems theory to management. The general theory is reviewed for the reader. Next, it is applied as a theory for business, and an illustrative model of the systems concept is developed to show the business application. Finally, the systems concept is related to the traditional functions of a business, i.e., planning, organizing, control, and communications.
 
Systems analyses frequently require estimates of the average time that it takes to perform repetitive tasks. These time estimates may be easy to obtain; for example, if one man is performing a single operation, then a simple division of his total working time by the frequency of the task repetition gives an estimate of the average performance time. However, in many instances, work consolidation requires that individuals be called upon to perform several different operations. For example, crews may be sent into the field to carry out a number of different tasks, for which the relative rate of occurrence may change from day to day. This introduction of multiple operations implies that the required separate time estimates can no longer be obtained by a simple division. This paper discusses linear model estimation of the average time taken to perform specific tasks; the estimation is possible even when a number of different tasks are performed by an individual or group. The models are based on the conceptual relation that the time taken to perform a task, multiplied by its rate of occurrence in the considered time period and summed over all possible tasks, should equal the total time taken by all tasks. Estimation is also possible when direct observations are difficult to obtain (as in stopwatch procedures) because the proposed models do not require direct time observations but rather utilize linear combinations of the individual time parameters. An actual application is discussed.
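The conceptual relation — per-task time multiplied by task count, summed over tasks, equals total time — can be fitted by ordinary least squares across observation periods. A small sketch, with counts, times, and noise invented for the demo:

```python
import numpy as np

# rows: observation periods; columns: counts of each task performed that period
counts = np.array([[12, 3, 5],
                   [8, 6, 2],
                   [10, 4, 7],
                   [6, 9, 3],
                   [11, 2, 6]], dtype=float)
true_times = np.array([0.5, 1.2, 0.8])  # hours per task (assumed, for the demo)
rng = np.random.default_rng(1)
total_hours = counts @ true_times + rng.normal(0, 0.1, size=5)

# least-squares fit of the relation  sum_j (time_j * count_ij) = total_i
est_times, *_ = np.linalg.lstsq(counts, total_hours, rcond=None)
```

Only the task counts and the total working time per period are observed; the individual task times are recovered as regression coefficients, which is why no stopwatch observations of single tasks are needed.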
 
Education is America's biggest single business. It uses more dollars than any other. Education is non-profit, so there is no immediate measure of profit in dollars to assess its performance. Yet the success of the university is at the very root of the economic and cultural development, and the welfare, of the community. How should it be organized and managed? How can its performance be assessed so that its job is done most effectively?
 
--Editorial by Russell L. Ackoff, University of Pennsylvania
 
This paper describes a methodology for management information systems design which employs a formalized framework for significantly involving manager-users in the design process. The process seeks to develop a system design on the basis of a criterion which considers both technical cost-benefit considerations and the manager's perception of the potential utility of the system. A key element of the methodology is the development of descriptive and normative system models which are based on the concept of a "linear responsibility chart." These models serve as the basis for the negotiated development of a consensus system model which defines the framework for the decision-oriented analysis of information requirements. The process of information analysis involves joint manager-analyst activities which are aimed at the explication of the implicit decision models which are used for decision making.
 
Guest editorial about organization of TIMS meetings and suggestions for further improvements, including the use of management science methods in planning and organizing the meetings.
 
The solution of today's major social problems will come from more and better technology--not from less technology. For technology is just another name for human knowledge. We need to deepen our scientific knowledge, broaden our repertory of alternatives, and strengthen our technology of decision procedures. Above all, we need a more profound understanding of Man himself, for all human problems have their roots in our own nature.
 
What is management? In recent years, a number of schools or approaches to management have evolved. What is rather upsetting to the manager is that there are a variety of unrelated approaches without any suggestion of their relationship to each other. Even more upsetting is that some of these approaches seem to be based on the disciplines of particular researchers rather than their ability to help managers. The study explores the rationale underlying two predominant orientations. One is concerned with the process of management, while the other emphasizes the function of management. The managerial process involves such intuitive principles as planning, organizing, and staffing. The managerial function involves arranging equipment to perform functions such as procurement, production, and adaptation. The goal is to illustrate the relationship of these approaches to each other and to specific managerial needs. The study was conducted among top managers in six municipal organizations. The research utilized a questionnaire and decision-making simulation for gathering data on the relationship of each managerial orientation. The results indicate that each orientation deals with different areas of managerial action. Key managerial processes focus on integrating, making decisions, recording information, motivating, and negotiating. The managerial functions, in this study, are part of three subsystem efforts which are administrative, adaptive, and technical. When management is defined as a process, it has no relationship to functions; when it is defined as functions of specific subsystems, there is no positive relationship to the process.
 
This paper provides exploratory evidence on the cross-sectional association between process management techniques and two profit measures: return on assets and return on sales. Using a sample of firms in two industries (automotive and computer) and four countries (Canada, Germany, Japan, and the United States), we find that certain process management techniques improve profitability while others have little effect on financial performance. In particular, long-term partnerships with suppliers and customers are associated with higher performance in both industries. The value of other techniques such as statistical process control, process capability studies, and cycle time analysis, on the other hand, appears to vary by industry, reflecting differences in the stages of the two industries' process management practices. Finally, computer organizations following an innovation-oriented strategy earned significantly higher accounting returns regardless of the process management techniques employed, suggesting that these techniques have only a second-order effect on performance in this industry.
 
--Editorial by Martin K. Starr, Columbia University
 
The Virtual Design Team (VDT) extends and operationalizes Galbraith's (1973) information-processing view of organizations. VDT simulates the micro-level information processing, communication, and coordination behavior of participants in a project organization and predicts several measures of participant and project-level performance. VDT-1 (Cohen 1991) and VDT-2 (Christiansen 1993) modeled project organizations containing actors with perfectly congruent goals engaged in complex but routine engineering design work within static organization structures. VDT-3 extends the VDT-2 work process representation to include measures of activity flexibility, complexity, uncertainty, and interdependence strength. It explicitly models the effects of goal incongruency between agents on their information processing and communication behavior while executing more flexible tasks. These extensions allow VDT to model more flexible organizations executing less routine work processes. VDT thus bridges rigorously between cognitive and social psychological micro-organization theory and sociological and economic macro-organization theory for project teams. VDT-3 has been used to model and simulate the design of two major subsystems of a complex satellite launch vehicle. This case study provides initial evidence that the micro-contingency theory embodied in VDT-3 can be used to predict organizational breakdowns, and to evaluate alternative organizational changes to mitigate identified risks. VDT thus supports true "organizational engineering" for project teams.
 
Maximum Likelihood Estimates of Manhattan Hotel Failure, 1898–1980
In this study, we examine how experience at the level of the organization, the population, and the related group affects the failure of Manhattan hotels. We find organizational experience has a U-shaped effect on failure; that organizations enjoy reduced failure as a function of population experience before their founding, but not after; and that related organizations provide experience that lowers failure, but it matters whether their experience is local or non-local, and if it was acquired before or after the relationship was established. These results indicate both the difficulty of applying different types of experience to reduce the risk of organizational failure, and the relevance of experience for the evolution of organizational populations.
 
The familiar square-root formula for the optimal economic order quantity was derived originally by Harris in 1915 (Harris, F. W. 1915. What quantity to make at once. The Library of Factory Management, Vol. V. Operation and Costs. A. W. Shaw Company, Chicago, 47--52.). Since all citations found to Harris's paper over the past 35 years are incorrect, it seems that no one has been able to locate the paper during this period. The paper's apparent misplacement probably stems from an inaccurate citation given by Raymond in 1931 (Raymond, F. E. 1931. Quantity and Economy in Manufacture. McGraw-Hill, New York.). Harris's paper exemplifies skillful presentation of the concepts of management science, and still merits reading. Today's researcher can learn much from Harris's modeling approach, and would do well to emulate his expositional clarity and motivational effectiveness.
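The square-root formula the paper discusses balances setup cost against holding cost. A quick sketch with illustrative figures (the demand rate D, setup cost S, and holding cost H below are invented for the demo):

```python
from math import sqrt

def eoq(demand_rate, setup_cost, holding_cost):
    """Harris's economic order quantity: the lot size minimizing the sum of
    setup and holding costs, Q* = sqrt(2*D*S / H)."""
    return sqrt(2 * demand_rate * setup_cost / holding_cost)

def total_cost(q, demand_rate, setup_cost, holding_cost):
    """Annual setup cost plus annual holding cost at lot size q."""
    return demand_rate / q * setup_cost + holding_cost * q / 2

# illustrative numbers: 1200 units/year, $50 per setup, $6/unit/year to hold
q_star = eoq(demand_rate=1200, setup_cost=50, holding_cost=6)
```

Checking that q_star beats nearby lot sizes in total_cost is a one-line verification of the optimality Harris derived.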
 
This paper examines the daily return variability of the S&P 500 and the Dow Jones indices over the 1928--1989 period. We use the traditional close-to-close standard deviation of returns, two alternative estimators incorporating the daily high and low of the index, and a robust estimator to measure the volatility of stock index returns. The 1980s were the third most volatile decade behind the 1920s and 30s. To a large extent, this was caused by the anomalous behavior of the fourth quarter of 1987. Returns in the 1980s had far more skewness and kurtosis than in any other decade studied; these results were not entirely due to 1987, as returns in 1988 and 1989 had large measures of both skewness and kurtosis. The frequency of extreme-return events increased in the 1980s, but was still dramatically less than in the 1920s and 30s. When extreme negative days occurred in the 1980s, the losses tended to be more severe than in the previous four decades. Extreme-return days are preceded by significant losses and are intertemporally clustered. There is no evidence of short-term market reversals after either positive or negative jumps in stock index returns.
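The abstract does not name its two high-low estimators; as an illustration of the kind of estimator it describes, the sketch below compares the traditional close-to-close measure with Parkinson's range estimator (a common high-low estimator, assumed here, not necessarily the authors' choice) on simulated price paths of known volatility:

```python
import numpy as np

def close_to_close_vol(daily_returns):
    """Traditional estimator: sample standard deviation of daily log returns."""
    return np.std(daily_returns, ddof=1)

def parkinson_vol(high, low):
    """Parkinson's high-low range estimator of daily volatility."""
    hl = np.log(high / low)
    return np.sqrt(np.mean(hl ** 2) / (4 * np.log(2)))

# simulate intraday paths so both estimators can be checked against a known sigma
rng = np.random.default_rng(2)
true_sigma, days, steps = 0.01, 1000, 100
incr = rng.normal(0.0, true_sigma / np.sqrt(steps), size=(days, steps))
paths = 100.0 * np.exp(np.cumsum(incr, axis=1))  # each day starts at 100
high, low, close = paths.max(axis=1), paths.min(axis=1), paths[:, -1]
daily_ret = np.log(close / 100.0)

cc = close_to_close_vol(daily_ret)
pk = parkinson_vol(high, low)
```

Range-based estimators use more of each day's information than the close alone, which is why papers of this kind report them alongside the close-to-close standard deviation.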
 
These brief comments on the preceding article by Charnes and Cooper are limited to those of their remarks that bear on the evaluation of Kantorovich's long paper of 1939 reprinted in the July 1960 issue of Management Science.
 
A survey of more than 4,000 significant innovations and innovating firms in the UK from 1945 to 1983 shows that the scope and organisation of technological activities vary greatly as functions of firms' principal activities and size. 1. Technological opportunities and threats are greatest in firms in chemicals and engineering. Opportunities in such science-based and specialist supplier firms in general emerge horizontally (in related product markets) and downstream (in user sectors). In scale-intensive (e.g. steel, vehicles) and supplier-dominated (e.g. printing, construction) firms, opportunities tend to be upstream in related production technologies. Breakthrough innovations in science-based firms also induce clusters of technological opportunities upstream for suppliers, horizontally for partners, and downstream for users. Their effective exploitation requires diversity of firms' technological activities greater than that strictly required for current output. 2. The nature of technological opportunities, and of organisation for their exploitation, also varies with firm size. Firms with fewer than 1,000 employees have major opportunities with specialised strategies in mechanical engineering and instruments. The prevalence of broad front technological strategies, and of divisionalisation, increases sharply with firm size, together with dependence on formal R and D activities. The size of innovating divisions has diminished sharply over the period. Divisionalisation improves the "goodness of fit" between the core business of innovating divisions and the innovations themselves, but 40% have remained consistently outside the core business of divisions. 3. These findings help identify the key tasks of technological strategy in firms in different industries, and of different sizes.
Thus, in large firms, divisionalisation can create the small size of unit conducive to effective implementation, but it cannot absolve central management from the continuous task of matching technological opportunities with organisational forms and boundaries.
 
In memoriam for Judith Larrivee (1947-1993), Technical Editor of Management Science.
 
My predecessors, the three former presidents of The Institute of Management Sciences, have already established something of a tradition for the TIMS presidential address. Having been involved with the affairs of the Institute during most of a year, they talked about its aims and activities. This address also will concern these activities; in fact, that is its title: The Institute in Action. Before this address is over I hope you will have reflected on something which you already know and that you will carry this thought with you, if no other, when you leave. You are the Institute and the Institute in Action is you in action.
 
The focus of this paper is on three major questions: (1) what is the theoretical rationale for the strategic group concept?; (2) does strategic group membership have performance implications?; and (3) are strategic groups and membership in strategic groups stable characteristics of industries? A statistical procedure is proposed to longitudinally identify strategic groups. The empirical setting is the U.S. pharmaceutical industry over the period 1963--82. While performance differences are found in terms of market share, profitability differences between groups are not observed. Neither are differences found in terms of risk and risk-adjusted performance. These results are attributed to the existence of performance variation within each strategic group. A dynamic model of competitive repositioning is proposed which helps integrate the findings from the study.
 
Approximately two hundred books on mathematical programming have been published during the past eight years. Most workers in this field have immediate access to only a minor fraction of these, but would have at least vicarious access to many of the rest through book reviews if only they knew where to look. That is the primary motivation for this bibliography of book reviews published during the years 1965-1972. A secondary reason is simply to furnish a reasonably comprehensive list of recent books on mathematical programming.
 
This paper presents the first R and D price indexes based largely on actual prices and expenditures reported by firms. The results, based on a carefully designed sample of about 100 firms, show that the GNP deflator, which is used in the official government R and D statistics, has tended to underestimate the rate of inflation in R and D. For managerial and economic purposes, such as R and D budgeting and productivity studies, these indexes should be of widespread use.
 
This paper analyzes Japanese manufacturing firms' export behavior and their performance. In a slow growth era, leading firms' and minor firms' export ratios tended to be lower than those of follower (medium market share) firms in the same industry; the relationship between export ratio and relative firm size appears to be inverted-U shaped. An affiliation with a large industrial group has no positive relationship with export ratio, and other firm-specific factors do not show consistent relationships with it in the years 1971 to 1985. Exports and firm performance do not have a positive relationship. Competitive environment in the domestic market is related to the export strategy of Japanese firms.
 
Researchers in all academic disciplines benefit from an understanding of the intellectual development of their field. This understanding is essential for conducting studies which build systematically on prior research. The purpose of this study is to document the intellectual development of the ideas represented by published research in Management Information Systems (MIS) based on an author co-citation analysis. The resulting mapping is intended to serve as a benchmark for future assessments of MIS as a field as well as a means for documenting the emergence of new research specialties. The study sought to identify (1) the subfields which constitute MIS research, (2) the reference disciplines of these subfields, (3) the diffusion of the ideas represented by these subfields to other disciplines, and (4) which of these subfields represent active areas of current MIS research. Nine invisible colleges, or informal clusters of research were uncovered. These nine empirically defined conceptual groupings collectively define the intellectual foundations of MIS as well as the forces currently shaping MIS research. Four of the clusters represent early MIS research themes which are still popular, based on subsequent citation patterns. Despite the centrality of the concept of the organization to widely-accepted definitions of MIS, the results suggest that MIS research is not well-grounded in organization theory nor have MIS research results been widely diffused in the organizational literature. Suggestions for developing a better link between MIS and organizational theory are presented based on the concept of organizational effectiveness.
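Author co-citation analysis, as used in the study, starts from a matrix counting how often pairs of authors are cited together; clusters of that matrix are the "invisible colleges." A toy sketch (the author names and reference lists below are invented):

```python
import numpy as np
from itertools import combinations

# reference lists of five (toy) papers, each a set of cited authors
papers = [{"A", "B", "C"}, {"A", "B"}, {"B", "C", "D"}, {"C", "D"}, {"A", "C"}]
authors = sorted(set().union(*papers))
idx = {a: i for i, a in enumerate(authors)}

# co-citation counts: how often each pair of authors is cited together;
# clustering this matrix is what surfaces the informal research groupings
C = np.zeros((len(authors), len(authors)))
for refs in papers:
    for a, b in combinations(sorted(refs), 2):
        C[idx[a], idx[b]] += 1
        C[idx[b], idx[a]] += 1
```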
 
Reply to: "Notes on Hofshi and Korsh 1972" (Joglekar, Prafulla. Notes on Hofshi and Korsh 1972. Management Sci. 22 (6) 717-720).
 
Rami Hofshi and James Korsh [Hofshi, Rami, James Korsh. 1972. A measure of an individual's power in a group. Management Sci. 19 (1, September)] (hereafter referred to as H & K) made an important attempt to define a measure of an individual's power in a group. Whereas the direction of their effort is significantly different from earlier efforts in assigning numerical weights to each individual of an organization, there are a number of logical flaws in their model. These comments are meant to identify the inadequacies of the H & K model.
 
So, will management be automated by 1975? To the extent that automation means systems planning (and it does) the answer is yes. To the extent that automation means the utilization through investment of the latest machines to help do the required work (and it does) the answer is yes. To the extent that automation means flexible and fast reaction to customer needs (and it does) the answer is yes. To the extent that automation means the utilization of mathematical techniques beyond that of simple arithmetic (and it does) the answer is yes. To the extent that automation means that all decisions can be made in advance; that people are not the most important ingredient in a business; that business judgment, skill and entrepreneurship are not needed (and it does not mean these things), management will not be automated by 1975 or ever.
 
This study describes the U.S. government's role in the development of manufacturing technologies and in the transfer of manufacturing innovations to private industry. The goals, organization, technologies, methods of technology transfer, user groups, and methods of program assessment for 13 government programs in this area are described and their implications for national manufacturing innovation and productivity are discussed.
 
We analyzed the total equity returns of indexes from Australia, Canada, Germany, Japan, the UK, and the US and total fixed income returns of indexes from all but Australia (excluded due to lack of data) to see if the returns of stocks and bonds were statistically different across markets during the 1980s. At the end of 1989, these countries represented over 87% of the market capitalization of the Morgan Stanley Capital International (MSCI) World Equity Index and over 88% of the Salomon Brothers World Bond Index. This study used monthly observations from January 1980 through December 1989 and examined returns based in local currency and hedged and unhedged US dollars. We found that sample mean stock and bond returns during the 1980s were statistically indistinguishable across countries. However, because the sample variances were so large relative to the sample means, it would have been difficult to detect differences in population means by any test. We found evidence of variance heterogeneity, which may be explainable by other economic factors. We also found that intercountry stock and bond correlations were not significantly different. Thus, we confirmed the results of other researchers, such as Jobson and Korkie (1981), but in a broader global context using more asset types and different statistical tests. Our work suggests reducing the number of input estimates to a MV global asset allocation problem. For practitioners trying to put MV analysis to use, these findings could have a significant effect on the practice of asset allocation.
 
The objective of this paper is to suggest some directions in the 1980s for the development of the theory and practice of planned change. The author makes two basic assumptions in approaching this objective. First, the two fields most widely associated with planned change, organization development and planning, are increasingly addressing similar or related problems and, as a result, offer significant opportunities to contribute to the development of the other. Second, we will not learn very much about directions for the field of planned change if discussion proceeds without consideration of the kinds of concrete problems we will likely face in the 1980s. The opportunities suggested in the first assumption will become clear by presenting some illustrative problems to be faced in the 1980s. The illustrative problems presented are highly interdependent. Organizations which attempt to solve these problems will find that many of their goals compete with or facilitate the goals of other organizations. Such problems and organizational consequences cannot be readily dealt with through central, comprehensive planning. An alternative is to create forums within and between organizations which include representation from all parties with an affected interest in the problems at issue. Action research is proposed as a problem-solving mode appropriate to such forums. It is a mode of problem-solving which searches for common values underlying different interests and encourages commitment to solving jointly defined problems.
 
This paper explores the impact of adjustments to the inputs on total returns, terminal wealth, and portfolio turnover in an unconstrained monthly mean-variance (MV) asset allocation over time. It is well known that MV allocations are very sensitive to small forecast errors in the means and covariances. This sensitivity is especially pronounced for errors in means. One way to control this sensitivity to forecast errors is to use Stein estimation. We examined three naive applications of Stein estimation for six individual country stock indexes, five country bond indexes, and five cash indexes. This study has two major conclusions. First, each of the suggested adjustments to the inputs dominates an unadjusted-input MV optimization: adjusted-input portfolios have higher mean return, less variance, and greater terminal wealth than unadjusted-input portfolios. Second, these improvements become even greater with transaction costs.
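The abstract does not specify the three Stein variants; one naive James-Stein-style application, shrinking sample mean returns toward the grand mean, can be sketched as follows. The shrinkage formula is a standard textbook form and the return data are simulated, so this is an illustration of the idea rather than the authors' procedure:

```python
import numpy as np

def js_shrink_means(returns):
    """Naive James-Stein-style shrinkage of asset mean returns toward the
    grand mean -- one simple way to damp MV sensitivity to errors in means."""
    means = returns.mean(axis=0)
    grand = means.mean()
    n, k = returns.shape
    s2 = returns.var(axis=0, ddof=1).mean() / n  # pooled sampling variance of a mean
    dispersion = ((means - grand) ** 2).sum()
    w = max(0.0, 1.0 - (k - 3) * s2 / dispersion) if dispersion > 0 else 0.0
    return grand + w * (means - grand)

rng = np.random.default_rng(3)
rets = rng.normal(0.005, 0.04, size=(120, 6))  # 120 months, 6 assets (illustrative)
raw = rets.mean(axis=0)
shrunk = js_shrink_means(rets)
```

Because the sample variances swamp the sample means, as the previous abstract notes, the shrinkage weight tends to pull estimates strongly toward the grand mean, which is the effect that stabilizes the MV optimization.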
 
In this paper, a predictive model is designed to provide the Educational Department of China with the information necessary to draw up the program (1985--2000) for education development. The model contains two submodels: a predictive model of the annual number of teachers holding each title, and a fuzzy predictive model of the annual number of teachers in each age group. The first submodel is a Markov chain model and the second is a fuzzy predictive model. Before establishing the second submodel, the fuzzy set of judging indices for the promotion of teachers and the reasonable age ranges for promotion to each title were specified, and the fuzzy membership functions for every judging index were established using the fuzzy statistical method. The paper thus presents a general method for establishing such models, which are widely applicable: not only can they be used in educational departments, but they can also support promotion policy in other administrative departments.
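The Markov chain submodel can be sketched as follows: the count of teachers in each title evolves by an annual transition matrix. The titles, transition probabilities, and head counts below are invented for illustration (the abstract gives none), and off-matrix flows such as retirement or new hiring are ignored in this sketch.

```python
import numpy as np

# Hypothetical annual transition matrix between four titles:
# assistant, lecturer, associate professor, professor.
# Row i gives the probability of moving from title i to title j in one year.
P = np.array([
    [0.80, 0.20, 0.00, 0.00],
    [0.00, 0.85, 0.15, 0.00],
    [0.00, 0.00, 0.90, 0.10],
    [0.00, 0.00, 0.00, 1.00],
])

counts = np.array([400.0, 300.0, 200.0, 100.0])  # teachers per title today

def project(counts, P, years):
    """Propagate the title distribution forward one year at a time."""
    for _ in range(years):
        counts = counts @ P
    return counts

print(np.round(project(counts, P, 5)))
```

Since each row of `P` sums to one and no one enters or leaves, the total head count is preserved; the distribution simply drifts toward the higher titles, which is the quantity the planning program needs year by year.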
 
Engineering manpower constitutes an important national resource where demand and supply must be in reasonable balance to assure continued economic growth. The paper establishes a demand analysis to predict the need for engineers in 15 industry groups through 1987, subject to various economic scenarios. It is shown that for total engineering employment in any industry segment the constant dollar gross national product fraction per engineer is only a slowly varying function of time. A least-squares fit of this ratio versus time is shown to be an adequate predictor of employment and can be used for extrapolation with confidence. A parallel analysis is also developed for engineers employed in research and development. It is shown that, in general, the need for new engineers due to economic growth is smaller than the need to make up for attrition due to death, retirement, promotions out of engineering, and voluntary leaving of the profession. The high mobility rate of engineers out of the profession is significant and its importance has not previously been documented. The paper concludes with a preliminary discussion of the balance between supply and demand. Even with healthy economic growth, the nation is likely to produce a small excess of engineering graduates over the near future. However, there will be a shortage of engineers holding advanced degrees for entry into research and development, even if R&D sees only moderate growth. Better industry-university cooperation is called for to encourage engineers to undertake advanced degrees, to assess actual future needs, and to advise students to select appropriate engineering disciplines.
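The extrapolation step the abstract describes (a least-squares fit of the slowly varying GNP-fraction-per-engineer ratio versus time) can be sketched in a few lines. The yearly ratio values below are fabricated for illustration; the paper's actual series is not given in the abstract.

```python
import numpy as np

# Hypothetical series: constant-dollar GNP fraction per engineer by year.
# The claim is that this ratio drifts slowly, so a straight-line fit
# extrapolates well.
years = np.array([1975, 1976, 1977, 1978, 1979, 1980])
ratio = np.array([1.05, 1.03, 1.02, 1.00, 0.99, 0.97])  # index units

slope, intercept = np.polyfit(years, ratio, 1)  # linear least squares

def predict_engineers(year, gnp_fraction):
    """Employment forecast: the sector's projected GNP fraction divided
    by the extrapolated GNP-fraction-per-engineer ratio."""
    return gnp_fraction / (slope * year + intercept)

print(round(slope * 1987 + intercept, 3))  # extrapolated ratio for 1987
```

Dividing a scenario's projected GNP fraction for an industry group by this extrapolated ratio yields the engineering employment forecast for that scenario.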
 
On p. 673, we stated that we had been refused permission to replicate our study using the Markstrat game (Hogarth & Makridakis [Hogarth, R. M., S. Makridakis. 1981. The value of decision making in a complex environment: an experimental approach. Management Sci. 27(1) 93--107.]) and questioned whether our results had dampened willingness to conduct this kind of research using Markstrat. These statements were based on a misunderstanding. We sincerely regret their negative implications.
 
Levy and Levy (Management Science, 2002) present data that, according to their claims, violate prospect theory. They suggest that prospect theory's hypothesis of an S-shaped value function, concave for gains and convex for losses, is incorrect. However, all the data of Levy and Levy are perfectly consistent with the predictions of prospect theory, as can be verified by simply applying prospect theory formulas. The mistake of Levy and Levy is that they incorrectly assumed that probability weighting could be ignored.
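"Simply applying prospect theory formulas" means evaluating each gamble with both the S-shaped value function and the probability weighting function. A minimal sketch, using the standard Tversky-Kahneman (1992) functional forms and their published parameter estimates (the specific gamble below is an invented example, not one of Levy and Levy's):

```python
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """S-shaped value function: concave for gains, convex for losses,
    with loss aversion coefficient lam."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

def weight(p, gamma):
    """Inverse-S probability weighting function."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def cpt_mixed(gain, p_gain, loss, p_loss):
    """Prospect theory evaluation of a gamble with one gain and one
    loss outcome (gamma = 0.61 for gains, 0.69 for losses)."""
    return (weight(p_gain, 0.61) * value(gain)
            + weight(p_loss, 0.69) * value(loss))

# Dropping weight() and using raw probabilities -- the omission the
# comment attributes to Levy and Levy -- changes the computed value
# and can flip the ranking of two gambles.
print(cpt_mixed(100, 0.5, -50, 0.5))
```

Note that at p = 0.5 the weighting function gives roughly 0.42 rather than 0.5, which is exactly the distortion that cannot be ignored when testing the theory against choice data.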
 
2025 Initial Task Compilation
Air Force 2025 was a study directed by the Chief of Staff of the United States Air Force to identify key system concepts and technologies for achieving air and space dominance in the year 2025. The study was a large effort in which over 200 military experts participated for more than one year. We developed a Value-Focused Thinking model, which we used to evaluate which futuristic system concepts have the greatest potential to ensure future U.S. air and space dominance. We named the value model Foundations 2025 because it represented a return to the basics of air and space dominance. We used the "silver standard" approach for value hierarchy development. The participants identified key verbs to describe tasks that must be performed in 2025 to ensure air and space dominance. The value hierarchy was developed bottom-up by aggregating these verbs into higher order tasks using affinity diagrams. Using the value hierarchy, we used multiattribute decision analysis techniques to develop an additive value model with 134 attributes. The Foundations 2025 value model was successfully used to score 43 futuristic system concepts and provide insights about the most promising system concepts and technologies. The analysis results directly supported the study director and the senior leadership of the United States Air Force.
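An additive value model of the kind the study describes scores each alternative as a weights-sum of single-attribute value functions, V(x) = Σ w_i v_i(x_i), with the weights summing to one. The attributes, weights, and concept scores below are invented purely for illustration; the real Foundations 2025 model had 134 attributes and scored 43 system concepts.

```python
def additive_value(scores, weights, value_fns):
    """Score an alternative as the weighted sum of its single-attribute
    values. Weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[a] * value_fns[a](scores[a]) for a in weights)

# Three hypothetical attributes on a 0-10 raw scale, each normalized
# to [0, 1] by a linear single-attribute value function.
value_fns = {
    "awareness": lambda x: x / 10,
    "reach": lambda x: x / 10,
    "lethality": lambda x: x / 10,
}
weights = {"awareness": 0.5, "reach": 0.3, "lethality": 0.2}

concept_a = {"awareness": 8, "reach": 4, "lethality": 6}
concept_b = {"awareness": 5, "reach": 9, "lethality": 7}

print(round(additive_value(concept_a, weights, value_fns), 2))  # 0.64
print(round(additive_value(concept_b, weights, value_fns), 2))  # 0.66
```

The same mechanics scale to a full hierarchy: leaf attributes get local weights within their parent task, and multiplying down the tree yields the 134 global weights used to rank the system concepts.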
 
Top-cited authors
Patrick Harker
  • Federal Reserve Bank Of Philadelphia
D. J. Wu
  • Georgia Institute of Technology
Todd R. Zenger
  • University of Utah
Rosemarie Ham Ziedonis
  • Boston University
Vibhanshu Abhishek
  • Carnegie Mellon University