Decision Sciences

Published by Wiley
Online ISSN: 1540-5915
Publications
Article
Allocation of at most n recoverable items among m demand locations is considered. Items are assumed to be demanded singly and independently and are eventually returned to the demand location. If a stockout occurs, the demand is lost and a penalty corresponding to lost profit is assessed. The allocation that minimizes the sum of expected stockout costs plus holding costs is readily obtained. An example involving allocation of rental cars among outlets is solved in detail.
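The article's exact model is not reproduced here, but a minimal sketch may clarify the style of computation involved: if each location is treated as an Erlang loss system (single demands, lost sales, eventual returns), the expected cost is convex in the stock level, so items can be allocated one at a time by marginal analysis. The outlet parameters below are hypothetical.

```python
# A greedy marginal-analysis allocation, assuming each location behaves as an
# Erlang loss system: demands arrive at rate lam, an item is out for 1/mu time
# units on average, and a demand finding no item on hand is lost at penalty p.
def erlang_b(s, a):
    """Blocking probability with s items and offered load a = lam / mu."""
    b = 1.0
    for k in range(1, s + 1):
        b = a * b / (k + a * b)
    return b

def cost(s, lam, mu, p, h):
    """Expected stockout cost rate plus holding cost at stock level s."""
    return p * lam * erlang_b(s, lam / mu) + h * s

def allocate(n, locations):
    """Add items one at a time where the marginal cost reduction is largest;
    this is optimal here because the loss cost is convex in s."""
    stock = [0] * len(locations)
    for _ in range(n):
        gains = [cost(stock[i], *loc) - cost(stock[i] + 1, *loc)
                 for i, loc in enumerate(locations)]
        best = max(range(len(locations)), key=gains.__getitem__)
        if gains[best] <= 0:   # "at most n": stop when another item no longer pays
            break
        stock[best] += 1
    return stock

# Hypothetical rental-car outlets: (demand rate, return rate, penalty, holding cost)
outlets = [(8.0, 1.0, 50.0, 2.0), (4.0, 1.0, 50.0, 2.0), (2.0, 1.0, 50.0, 2.0)]
print(allocate(20, outlets))   # busier outlets receive more cars
```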
 
Article
In this paper a framework is developed to seek information from the decision-maker about uncertain events, and to use this information for identifying a preferred alternative. Existing methods are examined critically to determine the best strategy for eliciting subjective probabilities in the context of decision-making. It is shown how a preferred alternative can be identified with less than perfect knowledge of the probabilities.
 
Article
It is demonstrated in this paper that early uncertainty resolution is preferred to delayed uncertainty resolution when comparing sets of uncertain cash flow sequences in an N-period setting. The implicit and often unwarranted assumptions regarding uncertainty resolution that are made in the three widely used capital budgeting methods are demonstrated. Finally, it is demonstrated that the generally accepted procedure for measuring uncertainty resolution is restrictive in its underlying assumptions.
 
Article
Large-scale mixed-integer linear programming (MILP) models may easily prove extraordinarily difficult to solve, even with efficient commercially implemented MILP solution codes. Drawing on experience gained in solving and analyzing three intertemporal investment planning MILP models for electric power supply, this note offers several practical suggestions for reducing computer solution times for general production-allocation MILP models. Solution time reduction stems from judicious use of the powerful computational capabilities of existing commercial linear programming codes in conjunction with information known or to be learned by the practitioner about the model's structure.
 
Article
This article describes the design of a stock market exercise to be run with a business game (or a simulated business game). It reports on an exercise in which around 100 traders participated in a market, trading four stocks. The results obtained are reported; however, the work was primarily a feasibility study to see whether very large group behavior could be studied in a controlled experiment. The experience gleaned from the experiment indicates that this is a fruitful approach.
 
Article
This paper illustrates the application of “fuzzy subsets” concepts to goal programming in a fuzzy environment. In contrast to a typical goal-programming problem, the goals are stated imprecisely when the decision environment is fuzzy. The paper first considers a fuzzy goal-programming problem with multiple goals having equal weights associated with them. A solution approach based on linear programming is developed. Next, the solution approach is extended to the case where unequal fuzzy weights are associated with multiple goals. Numerical examples are provided for both cases to illustrate the solution procedure.
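As a concrete illustration, the following is a minimal sketch of the standard max-min (Zimmermann-type) linear program for equally weighted fuzzy goals; the goal coefficients, tolerances, and crisp constraint are hypothetical, and this is a generic formulation rather than necessarily the paper's exact one.

```python
# Max-min LP for fuzzy goals: each goal "a_i . x roughly >= g_i" gets a linear
# membership that is 0 at g_i - t_i and 1 at g_i; we maximize the smallest
# membership lambda subject to a crisp resource constraint.
import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, 2.0],      # goal 1: x1 + 2*x2 roughly >= 10
              [3.0, 1.0]])     # goal 2: 3*x1 + x2 roughly >= 12
g = np.array([10.0, 12.0])     # aspiration levels
t = np.array([4.0, 6.0])       # tolerances (membership reaches 0 at g - t)

# Variables (x1, x2, lam); maximize lam == minimize -lam.
# mu_i >= lam  <=>  a_i . x - t_i * lam >= g_i - t_i.
c = [0.0, 0.0, -1.0]
A_ub = np.vstack([np.hstack([-A, t.reshape(-1, 1)]),   # fuzzy goal rows
                  [[1.0, 1.0, 0.0]]])                  # crisp: x1 + x2 <= 5
b_ub = np.concatenate([t - g, [5.0]])
res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None), (0, 1)])
print("x =", res.x[:2], "min membership =", res.x[2])  # x = (2, 3), lambda = 0.5
```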
 
Article
The purpose of this study was to estimate the impact of changes in Federal tax policy on investment behavior in the chemicals and allied products industry and to examine the possibility of a differential impact of such changes across manufacturing industries. The estimated impact for the chemicals and allied products industry was compared with the impact on total manufacturing found in earlier studies. The model used was the neo-classical model of capital accumulation as formulated initially by Dale Jorgenson. The study concludes that changes in tax policy have had a measurable impact on investment behavior in the chemicals and allied products industry, an impact that was greater as a percentage of gross investment than that found for total manufacturing in earlier studies.
 
Article
In an advertising strategy, it is preferable to optimize a sequence of periodic decisions, rather than optimizing each period's decision separately. Thus, a dynamic technique such as optimal control, which is used in this study, should be employed. Necessary conditions, as inferred from a parsimonious advertising model, are tested using data from the tobacco industry. Parameter estimation involves the use of non-linear regression. The estimates permit trajectories of optimal advertising expenditures and optimal market shares to be constructed for filter cigarettes. When compared with the actual trajectories, the optimizing trajectories exhibit a striking correspondence to reality. Therefore, interesting results can be obtained using a simple model, provided the optimizing method takes into account the dynamic nature of managerial decision-making and provided powerful methods for estimating model parameters are used.
 
Article
This paper reports the identification and ranking of major manufacturing issues that American manufacturing managers and academics must focus on and resolve in the 1990s to be competitive on a global basis. Vice-presidents of manufacturing from over 75 companies across the United States were asked in a Delphi study to rank the key strategic and tactical issues facing American manufacturing in the next three to five years. Based on three rounds of a Delphi study, quality management, manufacturing strategy, and process technology emerged as the top-ranked strategic issues. The top-ranked tactical issues were quality control, manufacturing planning and control systems, and work force supervision. These issues were valid across diverse industry groups, since the industrial and educational background of the respondents was shown to have no impact on the final consensus rankings and opinions reported in this paper. Factor analysis of the responses by the panel revealed that certain issues tend to be consistently viewed together, and were interpreted accordingly to provide insights into the ranking of issues. Apart from building a consensus among experts on key manufacturing issues of present and future importance, this study can also be helpful for setting future academic research and pedagogical priorities for the field of production and operations management.
 
Article
Supply chain management is a central and important area for academic research due to its impact on firms competing in today's global economy. Managing the flow of materials from supply sources to the ultimate customer represents a major challenge for today's managers. To assist managers, the concept of supply chain management has been adopted by many business leaders as an important way to assist in designing, planning, and controlling the network of facilities and tasks that comprise the many stages of the supply chain. In turn, the flow of academic research in the area has increased to provide a better set of guidelines for effective implementation and execution. This article sets the stage for recently completed research concentrating on supply chain management issues. First, a definition of supply chain management is provided and compared to recent usage in this area and in logistics management. Also, a framework is provided that structures this dynamic and complex management task. Second, a review of past research is presented to illustrate the many paths supply chain management has traveled, and important contributions to supply chain management understanding and decision making. Third, recently completed research articles are introduced that have been selected to be part of this special issue of Decision Sciences. And fourth, future research directions for supply chain management that need to be pursued by interested investigators are discussed.
 
Article
Computer-generated graphics are becoming increasingly available to decision makers. Despite claims on the part of vendors that the use of graphics will improve decision speed and quality over traditional methods of data display, the available evidence is far from supportive. Initial studies show graphics to be no more effective in communicating information than tables. Correct interpretation of graphical displays appears to require training, which most users lack. Furthermore, there is evidence that those features that make a graph visually attractive—such as color, design complexity, and realism—may actually detract from accurate comprehension. This paper summarizes the literature dealing with the human use of graphics, develops several propositions based on persistent trends in the literature, and suggests directions for future research.
 
Article
The status of Quantitative Methods in business education is perhaps less “standard” and less understood than any other field. This is probably even more true at the graduate than at the undergraduate level. Courses which are normally considered to be Quantitative Methods courses are often housed in a wide range of departments. Some courses which would normally be classified as Quantitative Methods courses are also housed in colleges other than Business Administration. In order to determine the current status of graduate quantitative methods curricula in schools of business, a survey was made of the member institutions of the American Association of Collegiate Schools of Business. This paper presents the survey findings.
 
Article
Since the discipline of quantitative methods varies more widely from institution to institution than any other discipline, a survey was conducted to establish a profile of the undergraduate quantitative methods curricula. The survey was made among member institutions of the American Association of Collegiate Schools of Business in order to determine the current status of the structure and orientation of undergraduate programs in quantitative methods, faculty background and requirements, undergraduate course requirements for each of the major areas in business, and directions of future changes. This paper presents the survey results.
 
Article
Conventional approaches to determining optimal abandonment of a project under uncertainty either assume risk-neutrality or impose a mean-variance criterion. Risk-neutrality is unrealistic while the mean-variance criterion precludes determination of the optimal strategy without consideration of covariances of returns among projects. Further, the use of variance of present value as a risk measure may result in the “optimality” of a time 0 strategy that involves maintaining a position at time t that will be “suboptimal” and would not be maintained. The use of the multiperiod capital asset pricing model (CAPM) as a decision criterion is consistent with contemporary theory of market behavior and remedies the deficiencies of the mean-variance approach noted above. Computationally, the optimal strategy for abandonment, when the commitment must be made at time 0 (a lease, say), can be determined with little difficulty beyond that of mean-variance models. When time of abandonment can remain unspecified, the value of the prospect that abandonment will occur at the optimal time can be determined, though the technique necessary is considerably more complicated. In both cases, the marginal costs of commitments that limit discretion over abandonment can be determined and attributed to those commitments.
 
Article
A pollution damage function is developed, using data from a study on the sensitivity of morbidity to ambient air concentrations of individual pollutants. This function is used in a model which incorporates the costs and technology for pollution abatement in the St. Louis Airshed. A set of air quality standards is determined which minimizes morbidity subject to a given control budget constraint. The results indicate that there should be a greater reduction of the pollutants associated with stationary sources than of those characteristic of the automobile.
 
Article
The U.S. service sector loses 2.3% of all scheduled labor hours to unplanned absences, but in some industries, the total cost of unplanned absences approaches 20% of payroll expense. The principal reasons for unscheduled absences (personal illness and family issues) are unlikely to abate anytime soon. Despite this, most labor scheduling systems continue to assume perfect attendance. This oversight masks an important but rarely addressed issue in services management: how to recover from short-notice, short-term reductions in planned capacity. In this article, we model optimal responses to unplanned employee absences in multi-server queueing systems that provide discrete, pay-per-use services for impatient customers. Our goal is to assess the performance of alternate absence recovery strategies under various staffing and scheduling regimes. We accomplish this by first developing optimal labor schedules for hypothetical service environments with unreliable workers. We then simulate unplanned employee absences, apply an absence recovery model, and compute system profits. Our absence recovery model utilizes recovery strategies such as holdover overtime, call-ins, and temporary workers. We find that holdover overtime is an effective absence recovery strategy provided sufficient reserve capacity (maximum allowable work hours minus scheduled hours) exists. Otherwise, less precise and more costly absence recovery methods such as call-ins and temporary help service workers may be needed. We also find that choices for initial staffing and scheduling policies, such as planned overtime and absence anticipation, significantly influence the likelihood of successful absence recovery. To predict the effectiveness of absence recovery policies under alternate staffing/scheduling strategies and operating environments, we propose an index based on initial capacity reserves.
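The reserve-capacity idea above lends itself to a quick computation. A minimal sketch, with hypothetical workers and an assumed 40-hour weekly cap:

```python
# Reserve capacity = maximum allowable work hours minus scheduled hours;
# holdover overtime can absorb an absence only if enough reserve exists.
scheduled = {"Ann": 36, "Ben": 40, "Cal": 32, "Dee": 24}   # hours this week
max_hours = 40                       # assumed weekly cap per worker

reserve = {w: max_hours - h for w, h in scheduled.items()}

absent, absent_hours = "Ben", 8      # Ben misses one 8-hour shift
available = sum(r for w, r in reserve.items() if w != absent)
print("reserve by worker:", reserve)
print("holdover overtime suffices" if available >= absent_hours
      else "need call-ins or temporary workers")
```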
 
Article
Recently developed large sample inference procedures for least absolute value (LAV) regression are examined via Monte Carlo simulation to determine when sample sizes are large enough for the procedures to work effectively. A variety of different experimental settings were created by varying the disturbance distribution, the number of explanatory variables and the way the explanatory variables were generated. Necessary sample sizes range from as small as 20 when disturbances are normal to as large as 200 in extreme outlier-producing distributions.
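For readers unfamiliar with the estimator under study, LAV regression itself can be computed as a linear program; the large-sample inference procedures examined in the paper are not reproduced here. A minimal sketch on synthetic heavy-tailed data:

```python
# LAV regression as an LP: minimize sum(u + v) subject to y = X b + u - v,
# u, v >= 0, so u + v equal the absolute residuals at the optimum.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.standard_t(df=2, size=n)   # heavy tails

p = X.shape[1]
c = np.concatenate([np.zeros(p), np.ones(2 * n)])   # cost only on u, v
A_eq = np.hstack([X, np.eye(n), -np.eye(n)])        # X b + u - v = y
bounds = [(None, None)] * p + [(0, None)] * (2 * n)
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds)
print("LAV estimates:", res.x[:p])                  # close to (1, 2)
```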
 
Article
Several linear programming methods have been suggested as discrimination procedures. A least absolute deviations regression procedure is developed here which is simpler to use and does not suffer from any lack of invariance. A simulation study shows it to be at least as effective as any of the methods previously discussed for normal and heavy-tailed distributions.
 
Article
Previous research has indicated that minimum absolute deviations (MAD) estimators tend to be more efficient than ordinary least squares (OLS) estimators in the presence of large disturbances. Via Monte Carlo sampling this study investigates cases in which disturbances are normally distributed with constant variance except for one or more outliers whose disturbances are taken from a normal distribution with a much larger variance. It is found that MAD estimation retains its advantage over OLS through a wide range of conditions, including variations in outlier variance, number of regressors, number of observations, design matrix configuration, and number of outliers. When no outliers are present, the efficiency of MAD estimators relative to OLS exhibits remarkably slight variation.
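A minimal Monte Carlo sketch in the spirit of this design, estimating MAD via median regression; the sample size, design, and outlier contamination below are illustrative assumptions, not the study's exact settings.

```python
# Contaminated-normal Monte Carlo: errors are N(0,1) except two gross
# outliers from N(0,10^2); compare slope MSE for the OLS and MAD estimators.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n, reps, beta = 40, 500, 2.0
x = rng.uniform(0, 10, size=n)
X = sm.add_constant(x)

ols_est, mad_est = [], []
for _ in range(reps):
    e = rng.normal(0, 1, size=n)
    e[:2] = rng.normal(0, 10, size=2)               # the outlier observations
    y = 1.0 + beta * x + e
    ols_est.append(sm.OLS(y, X).fit().params[1])
    mad_est.append(sm.QuantReg(y, X).fit(q=0.5).params[1])

mse = lambda est: float(np.mean((np.array(est) - beta) ** 2))
print("slope MSE  OLS:", mse(ols_est), " MAD:", mse(mad_est))
```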
 
Article
Designing efficient physical data bases is a complex activity, involving the consideration of a large number of factors. Mathematical programming-based optimization models for physical design make many simplifying assumptions; thus, their applicability is limited. In this article, we show that heuristic algorithms can be successfully used in the development of very good physical data base designs. Two heuristic optimization algorithms are proposed in the context of a generic and abstract model for physical design. One algorithm is based on generic principles of heuristic optimization. The other is based on capturing and using problem-specific information in the heuristics. The goodness of the algorithms is demonstrated over a wide range of problems and factor values.
 
Article
This paper develops the idea of animation in user interfaces designed for decision support systems (DSS), proposes a framework to investigate the efficacy of animation in these interfaces, and reports on a study that examined the effects of properties of animation specified by the framework. Based on a review of selected background literature, principal properties affecting the efficacy of animation in user interfaces designed for DSS are identified and the effects on decision quality of three of these properties are hypothesized. To evaluate these hypotheses, data was collected in a laboratory experiment involving two different tasks. The results for both tasks indicate that animation in user interfaces designed for DSS should employ parallel as opposed to sequential navigation interactivity techniques. The decision quality of subjects that used a parallel navigation technique was significantly greater than that of those that used a sequential navigation interactivity technique. The results regarding the efficacy of image abstraction and transition effects varied by task. For one task, decision quality was significantly greater for subjects that used realistic as opposed to abstract images, but decision quality did not vary by transition effect. For the other task, decision quality was significantly greater for subjects that used gradual as compared to abrupt transition, but image abstraction had no effect on decision quality.
 
Article
This paper examines difficulties with the use of weights to solve multiple objective decision support models: misunderstanding of the meaning of weights, issues of commensurability and, most important, the likely inability of weights alone to isolate the decision maker's most-preferred point. The constraint method is shown to be an attractive alternative.
 
Article
The allocation of time in academe is influenced by artificial constraints such as wage-fund pools and quotas on promotion and tenure. These constraints affect the potential productivity of professors. An analysis of the time-allocation process and of rent-seeking implications for that process suggests various ways to shift supply curves to counteract the effects of these constraints.
 
Article
We believe that partnering with industry can lead to research that is relevant, rigorous, and refreshing. Our experiences show that the potential benefits of partnering with industry are enormous, but that this is not an easy route for academics interested in operations research modeling or empirical methods. The need for grounded business research is greater now than ever, and, while academics have made great progress, there are still numerous opportunities to demonstrate the relevance of our research. We discuss how to establish industry contacts, identify fruitful academic-industry projects, and publish the resulting research.
 
Article
Alignment between organizational critical success factors (CSFs) and competencies is widely believed to improve performance. This study examines the performance implications of alignment between CSFs and one source of competence, the organization's information technology (IT) capability. The effects of three antecedent factors (environmental uncertainty, integration, and IT management sophistication) are also examined. This paper uses survey data from 244 large academic institutions, along with some secondary data. Following the profile deviation approach to measure alignment, the academic institutions are divided into three clusters based on their CSFs: the academic comprehensives, the reputed giants, and the small educators. The ideal profile of IT capability is next developed for each cluster in terms of four dimensions: information retrieval, electronic communication, computing facilities for students, and computer-aided education. Alignment is then computed for each institution as the proximity of its IT capability profile to the ideal IT capability profile for the cluster to which it belongs. The results suggest that alignment facilitates both perceived IT success and organizational performance. Moreover, sophisticated IT management facilitates both alignment and perceived IT success, environmental uncertainty facilitates perceived IT success but not alignment, and integration facilitates neither alignment nor perceived IT success.
 
Article
This field study provides variables that can be used by practitioners as action levers and by future researchers as the basis for theoretical development. Conclusions from relevant literature and findings from interviews with interdisciplinary research management identified forty variables that were viewed as important to interdisciplinary research project success. After adjusting the data for reliability attenuation, these variables were further analyzed to identify the best prediction equation. The findings suggested that project longevity and open discussion of disagreements were the best predictors of performance.
 
Article
A Markov-chain faculty planning model to be used with institution-specific data is presented to describe and better understand the complex phenomenon of faculty movement through an institution and its relationship to salary costs, composition of the faculty, and faculty turnover rate. The model updates the earlier work done at Stanford University and Oregon State University by adding states for fixed-term appointments and for part-time FTEs, to reflect accurately the current hiring trends at many institutions. The model is implemented and tested at two different institutions. The findings suggest that the model is a viable means of gaining useful insights and quantitative data on the faculty profile, salary costs, expected departures, and part-time trends. Further, when used as a planning tool, the model appears comprehensive and flexible enough to analyze the probable effects, both in the short and long run, of alternative personnel policies on the faculty composition.
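A minimal sketch of the kind of Markov-chain projection involved, including a fixed-term state of the sort the model adds; the states, transition fractions, and hiring vector are hypothetical rather than the paper's institution-specific data.

```python
# Each year a faculty member stays in rank, moves to another state, or
# departs; "departed" is absorbing and accumulates total departures.
import numpy as np

states = ["assistant", "associate", "full", "fixed-term", "departed"]
P = np.array([
    [0.78, 0.12, 0.00, 0.00, 0.10],   # assistant
    [0.00, 0.85, 0.08, 0.00, 0.07],   # associate
    [0.00, 0.00, 0.94, 0.00, 0.06],   # full
    [0.00, 0.00, 0.00, 0.70, 0.30],   # fixed-term
    [0.00, 0.00, 0.00, 0.00, 1.00],   # departed (absorbing)
])
hires = np.array([12, 2, 0, 6, 0])    # new appointments added each year

n = np.array([100.0, 80.0, 60.0, 20.0, 0.0])   # starting head counts
for year in range(10):
    n = n @ P + hires                  # flow the faculty, then add new hires
print(dict(zip(states, np.round(n, 1))))
```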
 
Article
While it is widely understood that faculty in various disciplines tend to publish at different rates and in different forms, knowledge of these differences is too limited to facilitate systematically differentiated performance appraisal and reward systems. In this study, theory concerning knowledge production system characteristics as influences on individual performance is applied to academic research occupations using a classification scheme developed by Biglan [4]. Regression analysis is applied to a general (industry-wide) sample of United States faculty, with publishing patterns as performance measures. Each dimension of the classification scheme is found to have predictive validity. Output patterns are consistent with a conceptualization of research occupations in terms of (1) transformational/technological processes, (2) research mission, and (3) input/subject matter characteristics. The results offer a basis for generating disciplinary publishing norms and differentiated reward systems.
 
Figure: Percentage of services and manufacturing in Gross Domestic Product for selected countries.
Article
Services are now a larger portion of the economy than manufacturing for every nation on Earth, and services are an overwhelming portion of Western economies. While decision-making research has begun responding to this change, much of the scholarly work still addresses manufacturing issues. Particularly revealing is the field of operations management (OM), in which the proportion of manuscripts dedicated to services has been estimated at 3%, 6%, and 7.5% by various authors. We investigate several possible reasons for the neglect of services in research, including the difficulty in defining services, viewing services as derivative activities, a lack of defined processes, a lack of scale in services, and the effect of variability on service performance. We argue that times have changed, and none of these reasons is valid anymore. We sound the warning that failure to emphasize services in our research and teaching may signal the decline of the discipline. We note the proportion of OM faculty in business schools has shrunk in the past 10 years. Finally, we examine a selection of service research agendas and note several directions for high-impact, innovative research to revitalize the decision sciences. With practitioners joining the call for more research in services, the academic community has an exciting opportunity to embrace services and reshape its future.
 
Article
The objective of this paper is to illustrate an application of a quantitative decision-rule strategy which incorporates the Wald classification statistic. An empirical example of this classification technique using data obtained from The University of Texas at Austin Measurement and Evaluation Center is presented. This study originated as an attempt to illustrate the usefulness of this infrequently used statistical technique as a practical tool for “predicting” academic performance. A method for measuring the operational effectiveness of the decision-rule used is also illustrated. This study emphasizes classification as an important part of statistical analysis and focuses on the Wald classification statistic to illustrate one type of problem for which it is suitable.
 
Article
Conjoint analysis studies typically utilize orthogonal fractional factorial experimental designs to construct a set of hypothetical stimuli. Occasionally, these designs include environmentally correlated attributes that can lead to stimulus profiles that are not representative of the subject's environment. To date, no one has proposed a remedy well-grounded in statistical theory. This note presents a new methodology utilizing combinatorial optimization procedures for creating modified fractional factorial designs that are as “orthogonal” as possible, which do not contain nonrepresentative stimulus profiles.
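A minimal sketch of one way such a combinatorial procedure can work: a single-cell swap heuristic that reduces the sum of squared interattribute correlations (zero would be fully orthogonal) while excluding a forbidden, nonrepresentative profile. The attributes, exclusion rule, and objective are illustrative assumptions, not the note's exact method.

```python
# Hill-climb on single-cell swaps: accept a move only if it is legal (no
# forbidden profile) and strictly reduces the nonorthogonality measure.
import numpy as np

rng = np.random.default_rng(3)
runs, levels = 16, [3, 3, 2, 2]            # four hypothetical attributes

def nonorthogonality(D):
    C = np.corrcoef(D, rowvar=False)
    return float(np.sum(np.triu(C, 1) ** 2))

def forbidden(row):                        # assumed nonrepresentative profile:
    return row[0] == 2 and row[1] == 0     # e.g., premium brand at lowest price

D = rng.integers(0, np.array(levels), size=(runs, len(levels)))
for i in range(runs):                      # repair any forbidden starting rows
    while forbidden(D[i]):
        D[i] = rng.integers(0, np.array(levels))

best = nonorthogonality(D)
for _ in range(20000):                     # single-cell swap hill climbing
    i, j = rng.integers(runs), rng.integers(len(levels))
    old = D[i, j]
    D[i, j] = rng.integers(levels[j])
    v = nonorthogonality(D)
    if forbidden(D[i]) or not v < best:
        D[i, j] = old                      # revert illegal or non-improving move
    else:
        best = v
print("sum of squared column correlations:", round(best, 4))
```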
 
Article
As new technological innovations are rapidly introduced and changed, identifying an individual characteristic that has a persistent effect on the acceptance decisions across multiple technologies is of substantial value for the successful implementation of information systems. Augmenting prior work on individual innovativeness within the context of information technology, we developed a new measure of adopter category innovativeness (ACI) and compared its effectiveness with the existing measure of personal innovativeness in IT (PIIT). Further, we examined two alternative models in which the role of individual innovativeness was theorized differently—either as a moderator of the effects the perceived innovation characteristics of usefulness, ease of use, and compatibility have on future use intention (moderator model) or as a direct determinant of the innovation characteristics (direct determinant model). To ensure the generalizability of the study findings, two field studies (N = 634) were conducted, each of which examined the two models (moderator and direct determinant) and measured individual innovativeness using the two measures (ACI and PIIT). Study 1 surveyed the online buying practices of 412 individuals, and Study 2 surveyed personal digital assistant adoption of 222 healthcare professionals. Across the markedly different adoption contexts, the study results consistently show that individual innovativeness is a direct determinant of the innovation characteristics, and the two measures share many commonalities. The new measure offers some additional utilities not found in the PIIT measure by allowing individuals to be directly classified and mapped into adopter categories. Implications are drawn for future research and practice.
 
Article
The often paradoxical relationship between investment in information technology and gains in productivity has recently been attributed to a lack of user acceptance of information technology innovations. Diverse streams of research have attempted to explain and predict user acceptance of new information technologies. A common theme underlying these various research streams is the inclusion of the perceived characteristics of an innovation as key independent variables. Furthermore, prior research has utilized different outcomes to represent user acceptance behavior. In this paper we focus on individuals' perceptions about the characteristics of the target technology as explanatory and predictive variables for acceptance behavior, and present an empirical study examining the effects of these perceptions on two frequently used outcomes in the context of the innovation represented by the World Wide Web. The two outcomes examined are initial use of an innovation and intentions to continue such use in the future, that is, to routinize technology use. Two research questions motivated and guided the study. First, are the perceptions that predict initial use the same as those that predict future use intentions? Our results confirm, as hypothesized by prior research, that innovation characteristics do explain acceptance behavior. The results further reveal that the specific characteristics that are relevant for each acceptance outcome are different. The second research question asks if perceived voluntariness plays a role in technology acceptance. Results show that external pressure has an impact on adopters' acceptance behavior. Theoretical and practical implications that follow are presented.
 
Article
The proliferation of innovative and exciting information technology applications that target individual “professionals” has made the examination or re-examination of existing technology acceptance theories and models in a “professional” setting increasingly important. The current research represents a conceptual replication of several previous model comparison studies. The particular models under investigation are the Technology Acceptance Model (TAM), the Theory of Planned Behavior (TPB), and a decomposed TPB model, potentially adequate in the targeted healthcare professional setting. These models are empirically examined and compared, using the responses to a survey on telemedicine technology acceptance collected from more than 400 physicians practicing in public tertiary hospitals in Hong Kong. Results of the study highlight several plausible limitations of TAM and TPB in explaining or predicting technology acceptance by individual professionals. In addition, findings from the study also suggest that instruments that have been developed and repeatedly tested in previous studies involving end users and business managers in ordinary business settings may not be equally valid in a professional setting. Several implications for technology acceptance/adoption research and technology management practices are discussed.
 
Article
Despite the development of increasingly sophisticated and refined multicriteria decision-making (MCDM) methods, an examination of the experimental evidence indicates that users most often prefer relatively unsophisticated methods. In this paper, we synthesize theories and empirical findings from the psychology of judgment and choice to provide a new theoretical explanation for such user preferences. Our argument centers on the assertion that the MCDM method preferred by decision makers is a function of the degree to which the method tends to introduce decisional conflict. The model we develop relates response mode, decision strategy, and the salience of decisional conflict to user preferences among decision aids. We then show that the model is consistent with empirical results in MCDM studies. Next, the role of decisional conflict in problem formulation aids is briefly discussed. Finally, we outline future research needed to thoroughly test the theoretical mechanisms we have proposed.
 
Article
Once a process is stabilized using control charts, it is necessary to determine whether this process is capable of producing the desired quality, as determined by the specifications, without the use of some additional inspection procedure such as 100 percent inspection or acceptance sampling. One common method of making this determination is the use of process capability ratios. However, this approach may lead to erroneous decisions due to the omission of economic information. This paper attempts to remedy this situation by developing economic models to examine the profitability of different inspection policies. These models employ the quadratic loss function to represent the economic cost of quality from external failures, which is commonly omitted or overlooked. Moreover, assuming a normal distribution for the quality characteristic allows the use of simplified formulas that are provided. Thus the calculations can be made using standard normal tables and a calculator. Additionally, these economic models may be used to determine if additional inspection procedures should be reinstated if the quality of the process were to decline, to make capital budgeting decisions involving new equipment that produces parts of a higher quality, and to determine the preferred 100 percent inspection plan or acceptance sampling plan.
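A minimal sketch of the economic comparison described above, using the quadratic loss L(x) = k(x - T)^2 and a normal quality characteristic; the process parameters and cost figures are hypothetical.

```python
# Without inspection the expected per-unit loss is k*(sigma^2 + (mu - T)^2);
# with 100% inspection, out-of-spec units are screened out at a scrap cost
# and only in-spec units incur the quadratic loss.
from scipy import stats
from scipy.integrate import quad

k, T = 4.0, 10.0                  # loss coefficient and target
LSL, USL = 9.0, 11.0              # specification limits
mu, sigma = 10.3, 0.5             # process mean and standard deviation
c_insp, c_scrap = 0.05, 1.50      # hypothetical per-unit inspection, scrap costs

dist = stats.norm(mu, sigma)
loss_no_insp = k * (sigma ** 2 + (mu - T) ** 2)        # E[k * (X - T)^2]

p_out = dist.cdf(LSL) + dist.sf(USL)                   # fraction screened out
in_spec_loss, _ = quad(lambda x: k * (x - T) ** 2 * dist.pdf(x), LSL, USL)
loss_insp = c_insp + c_scrap * p_out + in_spec_loss

print(f"no inspection: {loss_no_insp:.3f}  100% inspection: {loss_insp:.3f}")
```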
 
Article
Several analysts have advocated the TIC (true interest cost) criterion for the selection of bids in the tax-free bond market. The proponents, however, have overlooked the bias introduced by the TIC method. An analysis is developed which shows that use of TIC can result in economically incorrect rankings for the issuer. We prove, under conditions of certainty, that given any Bid A, it is possible to construct Bid B with coupon rates arbitrarily close to Bid A such that the TIC criterion leads to acceptance of the bid with the higher present value of interest costs.
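A small numeric sketch of the kind of ranking reversal the analysis establishes; the bids, maturity structure, and discount rate are hypothetical.

```python
# TIC is the internal rate of return on the debt-service stream, while the
# issuer arguably cares about present value at its own discount rate. These
# two hypothetical bids on a $1,000 issue rank differently under the two.
import numpy as np

def tic(price, payments):
    """Rate r solving price = sum_t payments[t] / (1 + r)^t (positive root)."""
    poly = list(payments[::-1]) + [-price]     # polynomial in x = 1/(1+r)
    roots = np.roots(poly)
    x = max(r.real for r in roots if abs(r.imag) < 1e-9 and r.real > 0)
    return 1.0 / x - 1.0

def pv(payments, rate):
    return sum(p / (1 + rate) ** (t + 1) for t, p in enumerate(payments))

bid_a = [500.0, 400.0, 300.0]                  # front-loaded debt service
bid_b = [260.0, 380.0, 580.0]                  # back-loaded debt service
for name, ds in (("A", bid_a), ("B", bid_b)):
    print(name, "TIC:", round(tic(1000.0, ds), 4),
          "PV @ 3%:", round(pv(ds, 0.03), 2))
# TIC favors Bid B (lower rate), yet at a 3% discount rate Bid A has the
# lower present value of debt service: the two criteria disagree.
```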
 
Article
Dual-resource constrained queuing systems contain fewer servers than service facilities. This study uses computer simulation to evaluate several server assignment procedures in a dual-resource system. A field study serves as the basis for developing a model with two service facilities in parallel, a single server, and deterministic information access and transfer delays that can be applied to job shops, computer operating systems, and elevators. Several findings, useful in server assignment decision making, resulted from the study. If first-come, first-served sequencing is used, delaying server assignment at a facility until all jobs are completed reduces both the mean and the variance of job flow time. If shortest-process-time-first sequencing is used, an assignment rule is tested that delays a server at a facility until a sufficiently short job is estimated to have arrived elsewhere. This rule performs best overall in terms of both the mean and variance of flow time. Methods to implement this decision rule easily are discussed.
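A much-simplified sketch of the setting: one server, two parallel facilities, exponential times, and a fixed transfer delay, comparing two assignment rules. The study's deterministic information access delays and richer sequencing rules are not modeled.

```python
# One server, two parallel FCFS queues, exponential interarrival and service
# times, and a fixed travel delay whenever the server moves between
# facilities. Rates and the delay are hypothetical.
import random

def simulate(switch_when_empty, horizon=200000.0, lam=0.4, mu=1.0, delay=0.3):
    random.seed(42)
    t, loc = 0.0, 0
    queues = ([], [])                       # arrival times of waiting jobs
    next_arr = [random.expovariate(lam), random.expovariate(lam)]
    flow, served = 0.0, 0
    while t < horizon:
        for f in (0, 1):                    # admit everything that has arrived
            while next_arr[f] <= t:
                queues[f].append(next_arr[f])
                next_arr[f] += random.expovariate(lam)
        if queues[loc]:
            a = queues[loc].pop(0)
            t += random.expovariate(mu)     # serve one job FCFS
            flow += t - a
            served += 1
            if not switch_when_empty and queues[1 - loc]:
                loc, t = 1 - loc, t + delay
        elif queues[1 - loc]:
            loc, t = 1 - loc, t + delay     # walk over to the other facility
        else:
            t = min(next_arr)               # idle until the next arrival
    return flow / served

for rule in (True, False):
    label = "switch only when empty:" if rule else "reassess after each job:"
    print(label, round(simulate(rule), 3))  # mean flow time under each rule
```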
 
Article
The selection of an efficient set of access paths is critical to the design of large multiuser data bases. This task is formulated as an integer, linear mathematical program, and an approach to its solution is presented. A series of experiments support the assertion that this approach is sufficient for quickly producing an efficient, if not optimal, set of access paths for data-base problems of practical significance.
 
Article
The problem of patient no-shows (patients who do not arrive for scheduled appointments) is significant in many health care settings, where no-show rates can vary widely. No-shows reduce provider productivity and clinic efficiency, increase health care costs, and limit the ability of a clinic to serve its client population by reducing its effective capacity. In this article, we examine the problem of no-shows and propose appointment overbooking as one means of reducing the negative impact of no-shows. We find that patient access and provider productivity are significantly improved with overbooking, but that overbooking causes increases in both patient wait times and provider overtime. We develop a new clinic utility function to capture the trade-offs between these benefits and costs, and we show that the relative values that a clinic assigns to serving additional patients, minimizing patient waiting times, and minimizing clinic overtime will determine whether overbooking is warranted. From the results of a series of simulation experiments, we determine that overbooking provides greater utility when clinics serve larger numbers of patients, no-show rates are higher, and service variability is lower. Even with highly variable service times, many clinics will achieve positive net results with overbooking. Our analysis provides valuable guidance to clinic administrators about the use of appointment overbooking to improve patient access, provider productivity, and overall clinic performance.
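A minimal simulation sketch of the overbooking trade-off; the slot structure, show probability, service distribution, and utility weights are hypothetical rather than the article's calibrated values.

```python
# Patients are booked into fixed-length slots (booking more than the slot
# count is overbooking), each shows independently, and one provider serves
# them in order; utility trades served patients against wait and overtime.
import numpy as np

rng = np.random.default_rng(2)

def simulate_day(booked, slots=20, slot_len=15.0, p_show=0.7,
                 mean_svc=15.0, cv=0.5):
    """Return (patients seen, total patient wait, provider overtime), minutes."""
    arrivals = np.sort(rng.choice(slots, size=booked) * slot_len)
    shows = arrivals[rng.random(booked) < p_show]
    svc = rng.gamma(1 / cv**2, mean_svc * cv**2, size=len(shows))
    t, wait = 0.0, 0.0
    for a, s in zip(shows, svc):
        start = max(t, a)
        wait += start - a
        t = start + s
    return len(shows), wait, max(0.0, t - slots * slot_len)

def utility(booked, days=2000, v=30.0, cw=0.25, co=1.0):
    seen, wait, ot = np.array([simulate_day(booked) for _ in range(days)]).mean(axis=0)
    return v * seen - cw * wait - co * ot

for b in (20, 22, 24, 26, 28):
    print(b, "booked -> expected daily utility", round(utility(b), 1))
```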
 
Article
This field study investigated the use of nine information sources (personal subscriptions to periodicals, company library, data bases, superiors, subordinates, peers, internal documents, consultants, and other outsiders) for environmental scanning by 362 professionals employed in the corporate headquarters of two large commercial organizations. The usage frequency of essentially all information sources was positively related to perceived accessibility; however, usage frequency also tended to be positively related to environmental complexity. These results suggest that the information-gathering requirements associated with an individual's job may necessitate the use of less accessible sources, and this finding represents a break with prior research.
 
Article
Past research suggests that problem solving and/or decision behavior can be altered and improved by changes in the way information is accessed and displayed. Also, researchers have found that the usefulness of different information display formats is contingent on the characteristics of the problem task. This research investigated the impact on problem solving when accessing and using information from linear and nonlinear systems. Also, the research investigated problem-solving performance of linear and nonlinear systems when applied to different combinations of problem tasks. In a laboratory setting, linear and nonlinear systems were developed to conduct this experiment. This experiment used 64 graduate business students in a two-factor repeated-measures design employing a multivariate analysis of variance to analyze the data. Repeated measures were conducted to analyze the experimental group under both linear and nonlinear treatments. The findings from the study support the notion that the nonlinear system resulted in superior problem solving and higher levels of user satisfaction than the linear system. Specifically, the nonlinear system enabled users to make faster and more accurate decisions on perceptual problem tasks than did the linear system. For analytical problem tasks, users performed faster with the nonlinear system; however, there was no significant difference in accuracy. User satisfaction was higher with the nonlinear system under both perceptual and analytical tasks.
 
Article
This paper reports the findings of a laboratory experiment designed to investigate the relationship of category width (CW) cognitive style with accountants' perceptions of accounting information. Subjects drawn from large accounting firms in Sydney, Brisbane, and Melbourne, Australia, were classified into broad, medium, and narrow categories following the test devised by Pettigrew [19]. Subjects were requested to state their level of confidence in decisions they had made after receiving (1) conventional accounting information or (2) conventional accounting information and human resources accounting (HRA) information using a one-group pretest-posttest design. The results indicated a significant relationship between CW cognitive style and the accountants' confidence in their decisions. Furthermore, CW cognitive style moderated the accounting-information/decision-making relationship.
 
Article
The validity of an occupation goal-expectancy model was evaluated using the actual position attainment behavior of professional public accounting firm employees. Actual position attainment behavior was monitored over a four-year period. The hypothesized model relationships between the model variables, goal choice behavior, and position attainment were tested using linear multiple discriminant analysis and simple classification matrices. A within-subject analysis was undertaken. The findings generally support the hypothesized relationships between expected utility, goal choice, and position attainment and the model's applicability within large public accounting firms.
 
Article
The nucleolus linear programming (LP) method for allocating joint costs associated with a shared resource often stops short of uniquely specifying all savings allocations. Barton [1] recently presented the minimum total propensity to disrupt (MTPD) as a secondary criterion for uniquely determining those allocations not provided by the initial LP solution. Barton demonstrated the uniqueness of his solution on a specific example with a four-entity grand coalition. We show a practical approach for implementing Barton's two-step procedure. Our purpose is to make the nucleolus/MTPD model readily accessible to a wide range of practitioners using commonly available spreadsheet tools. We also demonstrate the global optimality of the allocations that the model provides.
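A minimal sketch of the first step only: the least-core LP that begins a nucleolus computation, on a hypothetical three-entity savings game, solved here with scipy's LP solver in place of the spreadsheet tools the authors use. Barton's MTPD tie-breaking second step is not reproduced.

```python
# Maximize the minimum coalition excess eps subject to allocating exactly
# v(N); excess of coalition S is sum_{i in S} x_i - v(S) >= eps.
from itertools import combinations
from scipy.optimize import linprog

players = (0, 1, 2)
v = {(0,): 0, (1,): 0, (2,): 0,                # coalition savings (hypothetical)
     (0, 1): 40, (0, 2): 50, (1, 2): 60,
     (0, 1, 2): 100}

# Variables (x0, x1, x2, eps); maximize eps == minimize -eps.
c = [0, 0, 0, -1]
A_ub, b_ub = [], []
for size in (1, 2):
    for S in combinations(players, size):
        A_ub.append([-(i in S) for i in players] + [1])   # -sum x_i + eps <= -v(S)
        b_ub.append(-v[S])
A_eq, b_eq = [[1, 1, 1, 0]], [v[players]]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(None, None)] * 4)
print("allocation:", res.x[:3], "min excess:", res.x[3])
# approx (23.33, 33.33, 43.33) with min excess 16.67 for this game
```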
 
Article
The purpose of this paper is to derive the conditions under which disaggregated accounting data contribute to more accurate forecasts of corporate performance. A comparison formula is derived and applied to actual data. The results obtained indicate that disaggregated data do not necessarily produce better forecasts of corporate performance than do aggregated data. The paper concludes with implications of the results for some reporting issues.
 
Top-cited authors
Viswanath Venkatesh
  • Virginia Polytechnic Institute and State University
Fred D. Davis
  • University of Arkansas
Roger G. Schroeder
  • University of Minnesota Twin Cities
Ram Narasimhan
  • Michigan State University
Robert Beaudoin Handfield
  • North Carolina State University