
David Kelton
- PhD
- Professor at University of Cincinnati
About
145 Publications
44,826 Reads
17,394 Citations
Current institution
University of Cincinnati (September 2002 - present)
Publications (145)
Computer simulation is a highly advantageous method for understanding and improving healthcare operations with a wide variety of possible applications. Most computer-simulation studies in emergency medicine have sought to improve allocation of resources to meet demand, or to assess the impact of hospital and other system policies on emergency depar...
Simulation is often used in papers and studies across diverse fields like logistics, supply chains, health care, manufacturing, and defense. But simulations must be properly done, including input and model building, designing/analyzing the simulations, and model verification/validation. Unfortunately, simulation studies are not always done well, ev...
In any setting, a proportion of incident active tuberculosis (TB) reflects recent transmission ("recent transmission proportion"), whereas the remainder represents reactivation. Appropriately estimating the recent transmission proportion has important implications for local TB control, but existing approaches have known biases, especially where dat...
Decades ago, simulation was famously characterized as a “method of last resort,” to which analysts should turn only “when all else fails.” In those intervening decades, the technologies supporting simulation—computing hardware, simulation-modeling paradigms, simulation software, design-and-analysis methods—have all advanced dramatically. We offer a...
Tuberculosis (TB) is an infectious disease that can progress rapidly after infection or enter a period of latency that can last many years before reactivation. Accurate estimation of the proportion of TB disease representing recent versus remote (long ago) transmission is critical to disease-control policymaking (e.g., high rates of recent transmis...
This paper evaluates sequential procedures for estimating the steady-state density of a stochastic process, typically (though not necessarily) observed by simulation, with or without intra-process independence. The procedure computes sample densities at certain points and uses Lagrange interpolation to estimate the density f(x) for each user-specif...
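As a rough illustration of the interpolation idea only (not the paper's procedure, whose sequential run-length control is omitted), the sketch below computes sample densities on a fixed grid and Lagrange-interpolates f(x) at a query point; the grid, window half-width h, and exponential test data are all assumptions.

```python
import numpy as np

def lagrange_interp(xs, ys, x):
    """Evaluate the Lagrange interpolating polynomial through (xs, ys) at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

def density_at(data, grid, x, h):
    """Sample density at each grid point (fraction of observations within a
    half-width-h window, divided by window width), then Lagrange-interpolate
    f(x) from the 3 grid points nearest x (a quadratic interpolant)."""
    dens = np.array([np.mean(np.abs(data - g) <= h) / (2 * h) for g in grid])
    idx = np.argsort(np.abs(grid - x))[:3]
    return lagrange_interp(grid[idx], dens[idx], x)

rng = np.random.default_rng(1)
data = rng.exponential(scale=2.0, size=100_000)  # stand-in for simulation output
grid = np.linspace(0.1, 8.0, 25)
print(density_at(data, grid, x=1.0, h=0.2))      # true f(1.0) = 0.5*exp(-0.5), about 0.303
```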
Objectives: Crowding and limited resources have increased the strain on acute care facilities and emergency departments worldwide. These problems are particularly prevalent in developing countries. Discrete event simulation is a computer-based tool that can be used to estimate how changes to complex health care delivery systems such as emergency d...
Rationale: Household contact tracing has recently been endorsed for global tuberculosis (TB) control, but its potential population-level impact remains uncertain.
Objectives: To project the maximum impact of household contact tracing for TB in a moderate-burden setting.
Methods: We developed a stochastic, agent-based simulation model of a simp...
Tuberculosis (TB) transmission is a key factor for disease-control policy, but the timing and distribution of transmission and the role of social contacts remain obscure. We develop an agent-based simulation of a TB epidemic in a single population, and consider a hierarchically structured contact network in three levels, typical of airborne disease...
We apply service-operations-management concepts to improve the efficiency and equity of voting systems. Recent elections in the United States and elsewhere have been plagued by long lines, excessive waiting times, and perceptions of unfairness. We build models for the waiting lines at voting precincts using both traditional steady-state queueing me...
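For context on the traditional steady-state queueing side, here is a minimal sketch of the standard M/M/s (Erlang-C) mean-wait computation that such precinct models can build on; the arrival rate, service rate, and machine count below are illustrative assumptions, not figures from the paper.

```python
import math

def erlang_c(s, a):
    """Erlang-C probability that an arrival must wait in an M/M/s queue,
    with offered load a = lam/mu (requires a < s for stability)."""
    inv = sum(a**k / math.factorial(k) for k in range(s))
    top = a**s / math.factorial(s) * s / (s - a)
    return top / (inv + top)

def mean_wait(lam, mu, s):
    """Steady-state expected waiting time in queue, Wq, for M/M/s."""
    a = lam / mu
    return erlang_c(s, a) / (s * mu - lam)

# e.g. voters arriving at 3/min, 1.5-min average service, 5 voting machines
print(mean_wait(lam=3.0, mu=1 / 1.5, s=5))   # roughly 2.3 minutes
```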
Providing equal access to public service resources is a fundamental goal of democratic societies. Growing research interest in public services (e.g., health care, humanitarian relief, elections) has increased the importance of considering objective functions related to equity. This article studies discrete resource allocation problems where the dec...
We consider the problem of resource allocation (RA) in the control of epidemics where a fixed budget is allocated among competing healthcare interventions to achieve the best health benefits, and propose a simulation-optimization framework to address a general form of the problem. While traditional approaches to the epidemic RA problem suffer from...
This is a companion article to Kasaie and Kelton (2013: Kasaie, P. and Kelton, W. D. 2013. Simulation optimization for allocation of epidemic-control resources. IIE Transactions on Healthcare Systems Engineering, 3(2): 78–93), and provides an extended discussion on the calibration, analysis, and op...
Guided by Little's law, decision and control models for operations in reentrant line manufacturing (RLM) systems are commonly set up to minimize the total work-in-process (WIP), which in turn indirectly minimizes cycle time (CT). By viewing the problem fundamentally differently, we re-formulate it as one that seeks to select the best cost function...
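For reference, the relationship the abstract invokes is Little's law, written here in its manufacturing (WIP/throughput/cycle-time) form:

```latex
% Little's law: work-in-process equals throughput times cycle time,
% so for a fixed throughput TH, minimizing WIP also minimizes CT.
\mathrm{WIP} = \mathrm{TH} \times \mathrm{CT}
\qquad\Longrightarrow\qquad
\mathrm{CT} = \frac{\mathrm{WIP}}{\mathrm{TH}}
```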
Employing mathematical modeling and analytical optimization techniques, traditional approaches to the resource-allocation (RA) problem for control of epidemics often suffer from unrealistic assumptions, such as linear scaling of costs and benefits, independence of populations, and positing that the epidemic is static over time. Analytical solutions...
Providing equitable voting experiences across voting precincts has been noted as an important goal in elections. We seek to provide equity to all voters so that no one particular group of voters is disadvantaged or disenfranchised. This paper uses the average absolute differences of waiting times across all precincts as a performance metric for equ...
Stochastic simulations involve random inputs, so produce random outputs too. This introductory tutorial is meant to call attention to the need to model and generate such inputs in ways that may not be the standard or defaults in simulation-modeling software, yet can be critical to model validity (a.k.a. getting right rather than wrong answers). The...
A Quasi-Independent (QI) subsequence is a subset of time series observations obtained by systematic sampling. Because the observations appear to be independent, as determined by the runs tests, classical statistical techniques can be used on those observations directly. This paper discusses implementation of a sequential procedure to determine the...
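A minimal sketch of one standard runs test (runs above/below the sample median) of the kind such procedures rely on; the sequential step, increasing the sampling lag k until the subsequence passes, is only hinted at, and the helper names are hypothetical.

```python
import math

def runs_test_z(x):
    """Runs test for independence (runs above/below the sample median).
    Returns the normal-approximation z statistic; |z| < 1.96 is consistent
    with independence at the 5% level."""
    med = sorted(x)[len(x) // 2]
    signs = [v > med for v in x if v != med]          # drop ties with the median
    n1 = sum(signs)
    n2 = len(signs) - n1
    runs = 1 + sum(signs[i] != signs[i - 1] for i in range(1, len(signs)))
    mean = 2 * n1 * n2 / (n1 + n2) + 1
    var = (2 * n1 * n2 * (2 * n1 * n2 - n1 - n2)) / ((n1 + n2) ** 2 * (n1 + n2 - 1))
    return (runs - mean) / math.sqrt(var)

def qi_subsequence(x, k):
    """Systematic sampling: keep every k-th observation; in a sequential
    procedure, k would be increased until runs_test_z passes."""
    return x[::k]
```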
This work presents recent research to develop a series of optimization- and simulation-based decision support tools to aid in operational maritime mission planning for integration into the Globally Networked Maritime Headquarters with Maritime Operations Center. We present three tools: Global Fleet Station Mission Planner recommends route and missi...
This paper discusses a unified approach for estimating, via a histogram, the steady-state distribution of a stochastic process observed by simulation. The quasi-independent (QI) procedure increases the simulation run length progressively until a certain number of essentially independent and identically distributed samples are obtained. It is known...
Many decision-and-control algorithms have been proposed for autonomous unmanned aerial vehicles (UAVs). The nature of this problem, with large decision spaces and the desire for optimal performance criteria, indicates that closed-form analysis of any approach is nearly impossible, suggesting a simulation-based performance evaluation of relevant sce...
When lack of control is detected on a control chart, the usual recommendation is to ‘search for an assignable cause’, or ‘take remedial action’, etc., without specific advice on how to proceed. One important situation is a continuously adjustable machine whose mean output is being monitored by a Shewhart X̄-chart. When lack of control is indicated by...
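For reference, the standard 3-sigma limits behind such a chart (the paper's contribution, how best to adjust the machine when these limits are breached, is not reproduced here):

```latex
% Shewhart X-bar chart: subgroup means of size n are plotted against
% control limits around the target mu_0; a point outside the limits
% signals lack of control, and the naive remedial action is to adjust
% the machine by mu_0 - xbar.
\mathrm{UCL},\,\mathrm{LCL} \;=\; \mu_0 \pm \frac{3\sigma}{\sqrt{n}}
```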
Batch means are sample means of subsets of consecutive subsamples from a simulation output sequence. Independent and normally distributed batch means are not only the requirement for constructing a confidence interval for the mean of the steady-state distribution of a stochastic process, but are also the prerequisite for other simulation procedures...
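A minimal sketch of the basic nonoverlapping batch-means interval this line of work builds on, assuming the batch size b is already large enough that the batch means are roughly i.i.d. normal; choosing b is exactly the hard part these procedures address.

```python
import math
import statistics

def batch_means_ci(y, b, t_crit=1.96):
    """Confidence interval for the steady-state mean from one long run:
    split y into k nonoverlapping batches of size b, treat the batch means
    as approximately i.i.d. normal, and form a classical interval.
    t_crit should really be the t quantile with k-1 degrees of freedom;
    1.96 is the large-k normal approximation."""
    k = len(y) // b
    means = [sum(y[i * b:(i + 1) * b]) / b for i in range(k)]
    grand = statistics.fmean(means)
    hw = t_crit * statistics.stdev(means) / math.sqrt(k)
    return grand - hw, grand + hw
```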
A summary and an analysis are given for an experimental performance evaluation of WASSP, an automated wavelet-based spectral method for constructing an approximate confidence interval on the steady-state mean of a simulation output process such that ...
We develop and evaluate the validity and power of two specific tests for the transition probabilities in a Markov chain estimated from aggregate frequency data. The two null hypotheses considered are (1) constancy of the diagonal elements of the one-step transition probability matrix and (2) an arbitrarily chosen transition probability’s being equa...
Chapters 3–6 discussed generating the basic random numbers that drive a stochastic simulation, as well as algorithms to convert them into realizations of random input structures like random variables, random vectors, and stochastic processes. Still, there could be different ways to implement such methods and algorithms in a given simulation model,...
This paper discusses implementation of a sequential procedure to estimate the steady-state density of a stochastic process. The procedure computes sample densities at certain points and uses Lagrange interpolation to estimate the density f(x). Even though the proposed sequential procedure is a heuristic, it does have a strong basis. Our empiric...
A mean-time comparison of the algorithms of Floyd, Dantzig, Tabourier, and of repeated application of several single-source algorithms, for the all-pairs shortest-path problem with arbitrary arc lengths clearly demonstrates the superiority of the Tabourier procedure for networks in which an average of at least 25% of the potential direct arcs are p...
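For concreteness, here is Floyd's algorithm, one of the compared baselines, in compact form; the three-node test network is an invented example.

```python
def floyd_warshall(w):
    """All-pairs shortest paths for an n x n weight matrix w (float('inf')
    where no direct arc exists); handles arbitrary (including negative)
    arc lengths as long as there is no negative cycle."""
    n = len(w)
    d = [row[:] for row in w]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d

INF = float('inf')
print(floyd_warshall([[0, 3, INF], [INF, 0, -1], [2, INF, 0]]))
```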
Nonstationary Poisson processes are appropriate in many applications, including disease studies, transportation, finance, and social policy. The authors review the risks of ignoring nonstationarity in Poisson processes and demonstrate three algorithms for generation of Poisson processes with piecewise-constant instantaneous rate functions, a capabi...
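One standard generation approach for such processes is Lewis-Shedler thinning, sketched below for a piecewise-constant rate function; whether it matches any of the three algorithms the authors demonstrate is not claimed, and the rates and breakpoints are invented.

```python
import random

def nspp_thinning(rates, breakpoints, T):
    """Arrival times on [0, T] for a nonstationary Poisson process with
    piecewise-constant rate: rates[i] applies on [breakpoints[i], breakpoints[i+1]).
    Candidates come from a homogeneous process at the maximum rate and are
    accepted with probability lam(t)/lam_max (thinning)."""
    lam_max = max(rates)
    def lam(t):
        for i in range(len(rates)):
            if breakpoints[i] <= t < breakpoints[i + 1]:
                return rates[i]
        return 0.0
    arrivals, t = [], 0.0
    while True:
        t += random.expovariate(lam_max)
        if t >= T:
            return arrivals
        if random.random() <= lam(t) / lam_max:
            arrivals.append(t)

# e.g. a midday surge: rate 2/hr, then 6/hr, then 3/hr over an 8-hour day
print(len(nspp_thinning([2, 6, 3], [0, 3, 5, 8], T=8)))
```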
http://deepblue.lib.umich.edu/bitstream/2027.42/5937/5/bam4521.0001.001.pdf http://deepblue.lib.umich.edu/bitstream/2027.42/5937/4/bam4521.0001.001.txt
This paper discusses two sequential procedures to construct proportional half-width confidence intervals for a simulation estimator of the steady-state quantile and tolerance intervals for a stationary stochastic process having the (reasonable) property that the autocorrelation of the underlying process approaches zero with increasing lag. At each...
Two-stage selection procedures have been widely studied and applied to determine appropriate sample sizes for selecting the best of k designs. However, standard “indifference-zone” procedures are derived with a statistically conservative least-favorable-configuration assumption. The enhanced two-stage selection (ETSS) is a procedure that takes into...
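For orientation, here is a sketch of the classical Rinott-type second-stage rule that ETSS improves on; ETSS additionally exploits the observed differences among sample means, which this baseline ignores. The critical constant h is a tabulated value and is treated as a given assumption here.

```python
import math
import statistics

def second_stage_sizes(first_stage, h, d_star):
    """Classical indifference-zone second-stage sample sizes: for each
    design's first-stage sample, N_i = max(n0, ceil((h * S_i / d*)**2)),
    where S_i is the first-stage standard deviation and d* is the
    indifference amount. ETSS replaces this least-favorable-configuration
    rule with one that also uses the differences of the sample means."""
    n0 = len(first_stage[0])
    return [max(n0, math.ceil((h * statistics.stdev(s) / d_star) ** 2))
            for s in first_stage]
```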
We summarize the results of an experimental performance evaluation of using an empirical histogram to approximate the steady-state distribution of the underlying stochastic process. We use a runs test to determine the required sample size for simulation output analysis and construct a histogram by computing sample quantiles at certain grid points....
With the identification of a novel coronavirus associated with the severe acute respiratory syndrome (SARS), computational analysis of its RNA genome sequence is expected to give useful clues to help elucidate the origin, evolution, and pathogenicity ...
This paper both describes and exemplifies the logical style for preparing manuscripts for the INFORMS Journal on Computing. The intent is not for authors to spend time imitating the cosmetic style of a paper appearing in the journal or to produce near-camera-ready copy, but rather only to get the logical and organizational style correct, including...
Two-stage indifference-zone selection procedures have been widely studied and applied. It is known that most indifference-zone selection procedures also guarantee multiple comparisons with the best confidence intervals with half-width corresponding to the indifference amount. We provide the statistical analysis of multiple comparisons with control...
This tutorial introduces some of the ideas, issues, challenges, solutions, and opportunities in deciding how to experiment with simulation models to learn about their behavior. Careful planning, or designing, of simulation experiments is generally a great help, saving time and effort by providing efficient ways to estimate the effects of changes in...
In order to get more people to use and understand simulation, improved teaching of simulation to beginners is important. The panel members share their experience in teaching the classic systems of simulation, used for several decades, to novice students. 1 INGOLF STÅHL: Although discrete simulation is a very powerful tool for the analysis of many dif...
How many runs should you make? How should you interpret and analyze the output? This tutorial introduces some of the ideas, issues, challenges, solutions, and opportunities in deciding how to experiment with simulation models to learn about their behavior. Careful planning, or designing, of simulation experiments is generally a great help, saving...
Two-stage indifference-zone selection procedures have been widely studied and applied. It is known that most indifference-zone selection procedures also guarantee multiple comparisons with the best confidence intervals with half-width corresponding to the indifference amount. We provide the statistical analysis of multiple comparisons with a contro...
This paper discusses implementation of a sequential procedure to determine the simulation run length and construct a confidence interval for the mean of a steady-state simulation. The quasi-independent (QI) procedure increases the simulation run length progressively until a certain number of essentially independent and identically distributed syste...
In order to get more people to use and understand simulation, improved teaching of simulation to beginners is important. The panel members share their experience in teaching the classic systems of simulation, used for several decades, to novice students.
Subject classifications: Simulation: Random number generation, Simulation: Random variable generation, Simulation: Statistical analysis, Computers/computer science: Software. Experts now recognize that small linear congruential generators (LCGs) with moduli around 2^31 or so should no longer be used as...
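To make the point concrete, here is the classic "minimal standard" multiplicative LCG with modulus 2^31 - 1; its period of about 2.1 billion is exhausted in seconds on modern hardware, which is why such generators are now considered too small.

```python
def lcg(seed, a=16807, c=0, m=2**31 - 1):
    """The Lewis-Goodman-Miller 'minimal standard' LCG:
    x_{k+1} = (a * x_k + c) mod m, returned as U(0,1) variates x/m.
    With c = 0 and a nonzero seed, the period is at most m - 1."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m

gen = lcg(12345)
print([next(gen) for _ in range(3)])
```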
A simulation model for detailed micro-level examination of ongoing and proposed probation-and-parole operations was developed and exercised for a wide variety of external conditions and alternative internal-decision scenarios. We describe the system and model, report on statistical experimental-design and sensitivity analyses, and discuss implicati...
This paper discusses implementation of a sequential procedure to construct proportional half-width confidence intervals for a simulation estimator of the steady-state quantiles and histograms of a stochastic process. Our quasiindependent (QI) procedure increases the simulation run length progressively until a certain number of essentially independe...
This panel discusses goals and educational strategies for teaching simulation in academia. Clearly, there is considerable material to cover in a single course or a sequence thereof in, say, an undergraduate program. The issue is how to motivate and empower students to analyze complex problems correctly and to prevent the pitfall of misusing the con...
With recent advances in parallel computation, distributed simulation has become a viable way of dealing with time-consuming simulations. For distributed simulations to run efficiently, care must be taken in assigning the tasks (work) in the simulated system to the available physical processors in the computer system. An inefficient assignment can r...
Simulation software has made great advances in recent years along the dimensions of modeling capabilities, animated graphics, and ease of use. There have also been real improvements in terms of both generality to model a wide variety of systems, and in specialization for quick and accurate modeling in specific application domains. And, of course, t...
This paper discusses implementation of a sequential procedure to construct proportional half-width confidence intervals for a simulation estimator of the steady-state quantiles and histograms of a stochastic process. Our quasi-independent (QI) procedure increases the simulation run length progressively until a certain number of essentially independ...
This tutorial introduces some of the ideas, issues, challenges, solutions, and opportunities in deciding how to experiment with a simulation model to learn about its behavior. Careful planning, or designing, of simulation experiments is generally a great help, saving time and effort by providing efficient ways to estimate the effects of changes in...
The use of batch means is a well-known technique for estimating the variance of mean point estimators computed from a simulation experiment. This paper discusses implementation of a sequential procedure to determine the batch size for constructing confidence intervals for a simulation estimator of the steady-state mean of a stochastic process. Our...
This tutorial introduces some of the ideas, issues, challenges, solutions, and opportunities in deciding how to experiment with a simulation model to learn about its behavior. Careful planning, or designing, of simulation experiments is generally a great help, saving time and effort by providing efficient ways to estimate the effects of changes in...
This paper discusses implementation of a sequential quantile-estimation algorithm for highly correlated steady-state simulation output. Our primary focus is on issues related to computational and storage requirements of order statistics. The algorithm can compute exact sample quantiles and process sample sizes up to several billion without storing a...
This paper discusses the implementation of two sequential procedures to construct confidence intervals for a simulation estimator of the steady-state mean of a stochastic process. Our quasi-independent-mean (QIM) methods attempt to obtain i.i.d. samples. We show that our sequential procedures give valid confidence intervals. The two assumptions req...
The article introduces some of the ideas, issues, challenges, solutions, and opportunities in deciding how to experiment with a simulation model to learn about its behavior. Careful planning, or designing, of simulation experiments is generally a great help, saving time and effort by providing efficient ways to estimate the effects of changes in th...
The use of batch means is a well-known technique for estimating the variance of mean point estimators computed from a simulation experiment. This paper discusses implementation of a sequential procedure to determine the batch size for constructing confidence intervals for a simulation estimator of the steady-state mean of a stochastic process. Our...
The paper discusses the implementation of a two-stage procedure to determine the simulation run length for selecting the best of k designs. We propose an Enhanced Two-Stage Selection (ETSS) procedure. The number of additional replications at the second stage for each design is determined by both the variances of the sample means and the differences...
This paper discusses implementation of a two-stage procedure to determine the simulation run length for selecting the best of k designs. We propose an Enhanced Two-Stage Selection (ETSS) procedure. The number of additional replications at the second stage for each design is determined by both the variances of the sample means and the differences of...
The use of batch means is a well-known technique for estimating the variance of mean point estimators computed from a simulation experiment. This paper discusses implementation of a sequential procedure to determine the batch size for constructing confidence intervals for a simulation estimator of the steady-state mean of a stochastic process. Our...
This paper discusses implementation of a two-stage procedure to determine the simulation run length for selecting the best of k designs. We propose an Enhanced Two-Stage Selection (ETSS) procedure. The number of additional replications at the second stage for each design is determined by both the variances of the sample means and the differences of...
This tutorial introduces some of the ideas, issues, challenges, solutions, and opportunities in deciding how to experiment with a simulation model to learn about its behavior. Careful planning, or designing, of simulation experiments is generally a great help, saving time and effort by providing efficient ways to estimate the effects of changes in...
This paper discusses the implementation of a sequential quantile-estimation algorithm for highly correlated steady-state simulation output. The primary focus is on issues related to computational and storage requirements of order statistics. The algorithm can compute exact sample quantiles and process sample sizes up to several billion without stor...
Simulation with Arena is used to analyze a controlled conveyor network with merging configuration (CNMC). We use simulation to realize the logic in a queueing-theoretic model (QTM), and to analyze the behavior of CNMCs under various conditions. We also examine the performance of QTM while keeping or violating the QTM assumptions and constraints. Si...
This paper describes, in general terms, methods to help design the runs for simulation models and interpret their output. Statistical methods are described for several different purposes, and related problems like comparison, variance reduction, sensitivity estimation, metamodeling and optimization are mentioned. The main point is to call attention...
This paper discusses recent developments in simulation research, current directions, as well as how research interacts with practice and software development. Among the methodological research areas considered are modeling techniques, generation of random numbers and processes, and statistical design and analysis. Projections for future research ar...
This panel looks at the issue of teaching simulation. It brings together three individuals with a wide diversity of academic and industrial experience to discuss the key issues that should be taught in a simulation course. Questions discussed include: Should a simulation language or general modeling concepts be taught in a simulation course? Should...
This paper describes, in general terms, methods for interpreting the output from simulation models. Statistical methods are described for several different purposes, and related problems like design, comparison, variance reduction, sensitivity estimation, metamodeling, and optimization are mentioned. The main point is to call attention to the chall...
Simulation experiments are sampling experiments by their very nature. Statistical issues dominate all aspects of a well-designed simulation study – model validation, selection of input distributions and associated parameters, experiment design frameworks, output analysis methodologies, model sensitivity, and forecasting are examples of some of the...
When estimating a Markov-process model from observed micro data, it may be necessary to assume a closure state for the process—a state that can account for entities that may come into or go out of existence. When collecting data to estimate such a model, we may not be able to “see” the closure state explicitly; we must posit some value for the numbe...
We develop a method for constructing confidence regions on the mean vectors of multivariate processes that is based on a vector autoregressive (VAR) representation of the data-generating process. A confidence-region-construction algorithm for a general autoregressive model is given. We establish the asymptotic validity of the confidence-region esti...
We propose a new procedure for providing confidence-interval estimators of the mean of a covariance-stationary process. The procedure, a modification of the method of batch means, is an improvement over existing methods when the process displays strong correlation and a comparatively small number of observations is available. We assign weights to t...
An important factor affecting the performance of distributed simulations running on parallel-processing computers is the allocation of logical processes to the available physical processors. An inefficient allocation can result in excessive communication times and unfavorable load conditions. This leads to long run times, possibly giving performanc...
The authors investigate the effect of input-distribution specification on the validity of output from simple queuing models. In particular, the use of various kinds of empirical distributions for approximating service-time distributions is studied. It is shown that, when the approximating distributions were compared on the basis of variance and bia...
We compare, by means of factorially designed Monte Carlo simulation experiments, the performance of (macro-data) restricted least-squares point estimators with that of (micro-data) maximum likelihood estimators for Markov-process models. We find, by various measures of estimator accuracy, that micro data are approximately ten times more valuable th...
Provides a forum for the discussion of philosophical issues, but more importantly sheds light on the quantitative relative merits of various approaches for specifying input distributions and processes. The authors give position statements and evidence on the alternative approaches, including their practicality.
We investigate the effect of input-distribution specification on the validity of output from simple queueing models. In particular, the use of various kinds of empirical distributions for approximating service-time distributions is studied.
Deals mainly with queueing models, but gives the properties of many useful statistical distributions and algorithms for generating them.