Theodore T. Allen
The Ohio State University | OSU · Department of Integrated Systems Engineering
Ph.D. Industrial and Operations Engineering, UMich
About
179 Publications
34,129 Reads
3,086 Citations
Introduction
Theodore T. Allen currently works at the Department of Integrated Systems Engineering at The Ohio State University. Theodore does research in security and efficiency analytics, including supply chain optimization; cybersecurity (vulnerability management, optimized incident response, cyber resiliency experimentation, and vulnerability intelligence); digitalization in manufacturing, including vision systems; and public awareness campaigns relating to cybersecurity.
Education
September 1992 - March 1997
September 1991 - August 1992
September 1986 - June 1991
Publications (179)
Using empirical models to predict whether sections within pipes have defects can save inspection costs and, potentially, avoid oil spills. Optimal Classification Tree (OCT) formulations offer potentially desirable combinations of interpretability and prediction accuracy on unseen pipes. Approaches based on powerful state-of-the-art OCT formulations...
Images can provide critical information for quality engineering. Exploratory image data analysis (EIDA) is proposed here as a special case of EDA (exploratory data analysis) for quality improvement problems with image data. The EIDA method aims to obtain useful information from the image data to identify hypotheses for additional exploration relati...
DHL Supply Chain North America, helped by The Ohio State University, developed and implemented a suite of software called the Transportation Network Optimizer. The four modules relate to the same large-scale vehicle routing integer program, including outsourcing. The software helped DHL save over $116M through improved bidding and outsourcing by r...
Background
Hand hygiene (HH) matters because it decreases pathogen transmission that can cause infection. Automatic alcohol-based hand rub (ABHR) dispensers are widely adopted in healthcare facilities as the preferred means of HH. Traditional automatic dispensers have a large supply of batteries in the dispenser housing, whereas energy-on-the-refil...
With the continuous improvement and observable benefits of electric vehicles (EVs), major logistic companies are introducing more EVs into their conventional fleets. This gives rise to a new type of vehicle routing problem with mixed vehicles, where heterogeneous internal combustion vehicles (ICVs) and electric vehicles are considered in route plan...
All newborns experience low blood glucose levels when they first initiate carbohydrate metabolism. Some levels remain low, with potential seizures and severe brain injury. Predicting newborns at higher risk is clinically useful because newborns can have their blood sugar raised with breastfeeding, donor milk, formula, or oral dextrose gels. Additio...
The use of artificial intelligence continues to increase. In healthcare, there has been a recent increase in AI applications to real-time individual patient clinical care, as opposed to population-based research or quality improvement efforts. However, the expertise to evaluate and implement these solutions is limited and often congregates in acade...
This study explores how EEG connectivity measures in children with ADHD ages 7–10 (n = 140) differ from an age-matched nonclinical database. We differentiated connectivity in networks, Brodmann area pairs, and frequencies. Subjects were in the International Collaborative ADHD Neurofeedback study, which explored neurofeedback for ADHD. Inclusion cri...
This paper considers a heterogeneous vehicle routing problem with common carriers and time regulations implemented at a major logistics company and contributing to an estimated $160M in savings. In our problem, the objective is to minimize the network costs by considering heterogeneous fleet routes, outsourcing options, time windows, and drivers’ l...
Objective
Reduce nurse response time for emergency and high-priority alarms by increasing discriminability between emergency and all other alarms and suppressing redundant and likely false high-priority alarms in a secondary alarm notification system (SANS).
Background
Emergency alarms are the most urgent, requiring immediate action to address a d...
Methods based on Gaussian stochastic process (GSP) models and expected improvement (EI) functions have been promising for box-constrained expensive optimization problems. These include robust design problems with environmental variables having set-type constraints. However, the methods that combine GSP and EI sub-optimizations suffer from the follo...
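For context, a minimal sketch of the standard expected improvement criterion that GSP/EI methods build on, using scikit-learn's Gaussian process regressor on invented toy data; the modified sub-optimizations this paper studies are not reproduced.

```python
# Standard EI for minimization under a Gaussian process posterior.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

def expected_improvement(X_cand, gp, f_best):
    """EI(x) = (f_best - mu)Phi(z) + sigma*phi(z), z = (f_best - mu)/sigma."""
    mu, sigma = gp.predict(X_cand, return_std=True)
    sigma = np.maximum(sigma, 1e-12)          # guard against zero variance
    z = (f_best - mu) / sigma
    return (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

# Toy usage: fit on a few evaluated points, rank a candidate grid by EI.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (8, 1))
y = np.sin(6 * X[:, 0]) + 0.1 * rng.standard_normal(8)
gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)
X_cand = np.linspace(0, 1, 101)[:, None]
ei = expected_improvement(X_cand, gp, y.min())
print("next point to evaluate:", X_cand[np.argmax(ei)])
```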
There is a great need for creating optimized schedules. Yet, some individuals have had less than desirable experiences with "optimal" scheduling. This could have been due to prioritization of the wrong criteria, leading to schedules that did not make practical sense, or that were math-intensive and could not be easily interpreted. A...
Immersive technology such as virtual, augmented, and mixed reality has been used in entertainment. Applying this technology for educational purposes is a natural extension. We tested the ability of immersive technology to enhance medical education within a scenario about progressively worsening tension pneumothorax using a virtual patient. The goal...
DHL Supply Chain North America moves more than 20 million packages each year. DHL transportation planners perform routing and cost-deduction tasks for many business projects. We refer to the associated planning problem as the Vehicle Routing Problem with Time Regulations and Common Carriers (VRPTRCC). Unlike ordinary vehicle routing problems,...
This paper proposes a new method to generate edited topics or clusters to analyze images for prioritizing quality issues. The approach is associated with a new way for subject matter experts to edit the cluster definitions by “zapping” or “boosting” pixels. We refer to the information entered by users or experts as “high-level” data and we are appa...
This paper focuses on the inverse problem of predicting inputs from measured outputs in the context of linear systems in steady-state. For system identification, we propose forward network identification regression (FNIR) and experimental planning involving simultaneously perturbing more than a single gene concentration using D-optimal designs. The...
Many business situations can be called “games” because outcomes depend on multiple decision makers with differing objectives. Yet, in many cases, the payoffs for all combinations of player options are not available, but the ability to experiment off-line is available. For example, war-gaming exercises, test marketing, cyber-range activities, and ma...
“Provisioning” in election systems refers to determination of the number of voting resources (poll books, poll workers, or voting machines) needed to ensure that all voters can expect to wait no longer than an appropriate amount, even the voter who waits the longest. Provisioning is a common problem for election officials and legislatures. A relate...
Election lines are more than a nuisance. In recent elections, needing to wait in lines deterred hundreds of thousands from voting and likely changed the winner in multiple cases. Part of the challenge is that even after the voter reaches the front of the line in some locations, it can require more than ten or twenty minutes to cast a ballot. Moreov...
In cybersecurity, incomplete inspection, resulting mainly from computers being turned off during the scan, leads to a challenge for scheduling maintenance actions. This article proposes the application of partially observable decision processes to derive cost‐effective cyber maintenance actions that minimize total costs. We consider several types o...
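A minimal sketch of the belief update that partially observable decision processes rest on, where incomplete scans act as noisy observations of host state; all probabilities are invented for illustration, and the action dependence is dropped for brevity:

```python
# Bayes belief update over a hidden host state (0 = clean, 1 = vulnerable).
import numpy as np

P = np.array([[0.9, 0.1],        # transition: rows = current state,
              [0.2, 0.8]])       # cols = next state (invented)
O = np.array([[0.7, 0.3],        # observation model: P(scan result | state);
              [0.3, 0.7]])       # scans are imperfect, e.g. host was off

def update(belief, obs):
    """b'(s') proportional to O[s', obs] * sum_s P[s, s'] * b(s)."""
    predicted = belief @ P                 # propagate through transitions
    posterior = O[:, obs] * predicted      # weight by observation likelihood
    return posterior / posterior.sum()

b = np.array([0.5, 0.5])
for obs in [1, 1, 0]:            # two 'vulnerable' readings, then one 'clean'
    b = update(b, obs)
    print(b)
```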
There is growing interest in using AI-based algorithms to support clinician decision-making. An important consideration is how transparent complex algorithms can be for predictions, particularly with respect to imminent mortality in a hospital environment. Understanding the basis of predictions, the process used to generate models and recommendatio...
Not every project needs to be planned and not every missed due date requires a notification. Also, the costs of the project managers themselves are not always negligible. Here, we propose models to clarify the appropriate level of planning and information flows among project participants. Ideally, a given task becomes available to start at exactly...
This article attempts to clarify how a technique called Fast (Finite Scenario-Based) Bayesian Reinforcement Learning (FBRL) works. Unlike Q-learning and other methods which require large amounts of data, possibly through fast simulators, FBRL treasures the real-world data points that it obtains and will obtain. It does this by thinking about model be...
Linear regression models are not the only curve-fitting methods in wide use. Also, these methods are not useful for analyzing data for categorical responses. In this chapter, so-called “kriging” models, “artificial neural nets” (ANNs), and logistic regression methods are briefly described. ANNs and logistic regression methods are relevant for categ...
This chapter contains two descriptions of real projects in which a student played a major role in saving millions of dollars: the printed circuit board study and the wire harness voids study. The objectives of this chapter include: (1) providing direct evidence that the methods are widely used and associated with monetary savings and (2) challengin...
The phrase “statistical quality control” (SQC) refers to the application of statistical methods to monitor and evaluate systems and to determine whether changing key input variable (KIV) settings is appropriate. Specifically, SQC is associated with Shewhart’s statistical process charting (SPC) methods. These SPC methods include several charting pro...
Response surface methods (RSM) are primarily relevant when the decision-maker desires (1) to create a relatively accurate prediction of engineered system input-output relationships and (2) to “tune” or optimize thoroughly of the system being designed. Since these methods require more runs for a given number of factors than screening using fractiona...
Design for six sigma (DFSS) methods can be viewed as part of six sigma or an alternative method as described in Chap. 1. These methods generally involve teams that have control over the design nominals and specifications. Having this “design control” often means that the teams have relatively great power to improve the system quality. It has been s...
Design of experiments (DOE) methods are among the most complicated and useful of statistical quality control techniques. DOE methods can be an important part of a thorough system optimization, yielding definitive system design or redesign recommendations. These methods all involve the activities of experimental planning, conducting experiments, and...
The methods presented in this chapter are primarily relevant when it is desired to determine simultaneously which of many possible changes in system inputs cause average outputs to change. “Factor screening ” is the process of starting with a long list of possibly influential factors and ending with a usually smaller list of factors believed to aff...
As is the case for other six sigma-related methods, practitioners of six sigma have demonstrated that it is possible to derive value from design of experiments (DOE) and regression with little or no knowledge of statistical theory. However, understanding the implications of probability theory can be intellectually satisfying and enhance the chances...
The selection of confirmed key input variable (KIV) settings is the main outcome of a six sigma project. The term “optimization problem” refers to the selection of settings to formally maximize or minimize a quantitative objective. Chapter 6 described how formal optimization methods are sometimes applied in the assumption phase of projects...
Some people view statistical material as a way to push students to sharpen their minds, but as having little vocational or practical value. Furthermore, practitioners of six sigma have demonstrated that it is possible to derive value from statistical methods while having little or no knowledge of statistical theory. However, understanding the impli...
If the project involves an improvement to existing systems, the term “control” is used to refer to the final six sigma project phase in which tentative recommendations are confirmed and institutionalized. This follows because inspection controls are being put in place to confirm that the changes do initially increase quality and that they continue...
This chapter focuses on the definition of a project, including the designation of who is responsible for what progress by when. By definition, those applying six sigma methods must answer some or all of these questions in the first phase of their system improvement or new system design projects. Also, according to what may be regarded as a defining...
“Tolerance design” refers to the selection of specifications for individual components using formal optimization. Specifications might relate to the acceptable length of a shaft, for example, or the acceptable resistance of a specific resistor in a printed circuit board. Choices about the specifications are important in part because conforming comp...
In Chap. 2, it was suggested that projects are useful for developing recommendations to change system key input variable (KIV) settings. The measure phase in six sigma for improvement projects quantitatively evaluates the current or default system KIVs, using thorough measurements of key output variables (KOVs) before changes are made. This informat...
In Chap. 3, the development and documentation of project goals was discussed. Chapter 4 described the process of evaluating relevant systems, including measurement systems, before any system changes are recommended by the project team. The analyze phase involves establishing cause-and-effect relationships between system inputs and outputs.
The purposes of this chapter are to review many of the most powerful statistical techniques from the previous chapters and to illustrate their application with Minitab® software. To focus the review, ten questions about simple experimental systems are asked and answered using Minitab and previously mentioned techniques.
In Chap. 4, it is claimed that perhaps the majority of quality problems are caused by variation in quality characteristics. The evidence is that typically only a small fraction of units fail to conform to specifications. If characteristic values were consistent, then either 100% of units would conform or 0%. Robust design methods seek to reduce th...
In Chap. 5, methods were described with goals that included clarifying the input-output relationships of systems. The purpose of this chapter is to describe methods for using the information from previous phases to tune the inputs and develop tentative recommendations. The phrase “improvement phase” refers to the situation in which an existing syst...
Regression is a family of curve-fitting methods for (1) predicting average response performance for new combinations of factors and (2) understanding which factor changes cause changes in average outputs. In this chapter, the uses of regression for prediction and performing hypothesis tests are described. Regression methods are perhaps the most wid...
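A minimal sketch of the two uses of regression named above, prediction and hypothesis testing, assuming statsmodels and synthetic data:

```python
# OLS fit used both to predict average response and to test factor effects.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
x1, x2 = rng.uniform(-1, 1, 30), rng.uniform(-1, 1, 30)
y = 5 + 2.0 * x1 + 0.0 * x2 + rng.standard_normal(30)  # x2 has no true effect

X = sm.add_constant(np.column_stack([x1, x2]))
fit = sm.OLS(y, X).fit()
print(fit.params)    # fitted coefficients (prediction model)
print(fit.pvalues)   # hypothesis tests: expect x1 significant, x2 not
```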
In this chapter, two additional case studies illustrate design of experiments (DOE) and regression being applied in real-world manufacturing. The first study involved the application of screening methods for identifying the cause of a major quality problem and resolving that problem. The second derives from Allen et al. (2000) and relates to the ap...
In the previous chapters several methods are described for achieving various objectives. Each of these methods can be viewed as representative of many other similar methods developed by researchers. Many of these methods are published in such respected journals as the Journal of Quality Technology, Technometrics, and The Bell System Technical Journ...
The purposes of this chapter are (1) to describe some of the main contributions of Edwards Deming, (2) to clarify the motivating potential score and the implications for job design, (3) to explain the relevance of the field of human factors engineering, and (4) to offer brief highlights from the change management literature. Each of these subjects...
The purposes of this chapter are: (1) to describe six sigma strategy and (2) to propose opportunities for additional research and evolution of six sigma. Part I of this book describes several methods that can structure activities within a project. Part II focuses on design of experiment (DOE) methods that can be used inside six sigma projects. DOE...
This book provides an accessible one-volume introduction to Lean Six Sigma and statistics in engineering for students and industry practitioners. Lean production has long been regarded as critical to business success in many industries. Over the last ten years, instruction in Six Sigma has been linked more and more with learning about the elements...
Freestyle text data such as surveys, complaint transcripts, customer ratings, or maintenance squawks can provide critical information for quality engineering. Exploratory text data analysis (ETDA) is proposed here as a special case of exploratory data analysis (EDA) for quality improvement problems with freestyle text data. The ETDA method seeks to...
Many decision problems are set in changing environments. For example, determining the optimal investment in cyber maintenance depends on whether there is evidence of an unusual vulnerability, such as "Heartbleed," that is causing an especially high rate of incidents. This gives rise to the need for timely information to update decision models so th...
This paper proposes methods for forward and inverse system modeling using Bayesian and least squares regression. These methods are based on both space-filling design criteria for multiple response problems and linear optimality criteria focusing on D-optimality. Modeling with and without the constant term is considered motivated by the case study a...
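A minimal sketch of the D-optimality criterion referenced above: among candidate designs, prefer the one maximizing det(X'X) for the assumed linear model. The one-factor grid and three-run budget are invented:

```python
# D-criterion comparison across all 3-run subsets of a candidate grid,
# assuming a first-order model with a constant term.
import numpy as np
from itertools import combinations

def model_matrix(points):
    X = np.asarray(points, dtype=float)
    return np.column_stack([np.ones(len(X)), X])   # constant term included

def d_criterion(points):
    X = model_matrix(points)
    return np.linalg.det(X.T @ X)

candidates = [(-1.0,), (-0.5,), (0.0,), (0.5,), (1.0,)]  # 1-factor grid
best = max(combinations(candidates, 3), key=d_criterion)
print("best 3-run design:", best)   # favors designs spanning both extremes
```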
We developed two in-classroom seminar wargames to analyze and teach the prospective effects of proposed courses of action in response to cyberattacks. These were part of a research effort for the Army Cyber Command and Second Army (ARCYBER / 2A), under an NSF grant. These quick and simple wargames illustrate well the advantages such games can provi...
Handover communication improvement initiatives typically employ a "one size fits all" approach. A human factors perspective has the potential to guide how to tailor interventions to roles, levels of experience, settings, and types of patients. We conducted ethnographic observations of sign-outs by attending and resident physicians in 2 medical inte...
Control charting cyber vulnerabilities is challenging because the same vulnerabilities can remain from period to period. Also, hosts (personal computers, servers, printers, etc.) are often scanned infrequently and can be unavailable during scanning. To address these challenges, control charting of the period-to-period demerits per host using a hybr...
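A rough sketch of a demerits-per-host chart in this spirit; the severity weights and counts are invented, and the hybrid handling of infrequently scanned hosts is not reproduced:

```python
# Demerits weight vulnerability counts by severity class per scanned host.
import numpy as np

weights = {"critical": 10, "high": 5, "medium": 1}      # assumed weights
# counts[severity] = vulnerabilities found per host in one scan period
counts = {"critical": np.array([0, 1, 0, 2, 0]),
          "high":     np.array([2, 3, 1, 4, 2]),
          "medium":   np.array([5, 8, 4, 9, 6])}

demerits = sum(w * counts[k] for k, w in weights.items())  # per-host totals
dbar = demerits.mean()
# Poisson-based variance of weighted counts gives classical demerit limits
var = sum(w**2 * counts[k].mean() for k, w in weights.items())
ucl, lcl = dbar + 3 * np.sqrt(var), max(dbar - 3 * np.sqrt(var), 0)
print(f"center={dbar:.1f}, LCL={lcl:.1f}, UCL={ucl:.1f}")
print("out-of-control hosts:", np.where((demerits > ucl) | (demerits < lcl))[0])
```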
Job shop scheduling (JSS) problems have been studied for over six decades. Many of them are proved to be non-deterministic polynomial-time (NP) hard, which means that they are intractable and the computation time increases exponentially as the problem size goes up. Some assumptions have been made in the previous studies about JSS problems in orde...
Two-colour microarrays are used to study differential gene expression on a large scale. Experimental planning can help reduce the chances of wrong inferences about whether genes are differentially expressed. Previous research on this problem has focused on minimising estimation errors (according to variance-based criteria such as A-optimality) on t...
In practical applications, information about the accuracy or ‘fidelity’ of alternative surrogate systems may be ambiguous and difficult to determine. To address this problem, we propose to treat surrogate system fidelity level as a categorical factor in optimal response surface design. To design the associated experiments, we apply the Expected Int...
The frequencies of cyber attacks and known cyber vulnerabilities continue to increase and there is a need for models to focus limited administrator attention and build cases for additional resources. A related challenge is the scarcity of available data partly because of security concerns. In this paper, we propose a method based on Markov decision...
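To make the idea concrete, a minimal value-iteration sketch for a maintenance Markov decision process; states, costs, and transition probabilities are invented, whereas the paper estimates such quantities from scan data:

```python
# Value iteration for a small patch-or-wait maintenance MDP.
import numpy as np

states = ["low", "medium", "high"]        # host vulnerability condition
actions = ["wait", "patch"]
cost = np.array([[0.0, 2.0, 10.0],        # expected incident cost if waiting
                 [1.0, 1.0, 1.0]])        # flat patching cost
P = np.array([
    [[0.8, 0.15, 0.05],                   # wait: condition tends to worsen
     [0.0, 0.7, 0.3],
     [0.0, 0.0, 1.0]],
    [[1.0, 0.0, 0.0],                     # patch: host returns to "low"
     [1.0, 0.0, 0.0],
     [1.0, 0.0, 0.0]]])

gamma, V = 0.95, np.zeros(3)
for _ in range(500):                      # iterate to (near) fixed point
    Q = cost + gamma * P @ V              # Q[a, s] = c(s,a) + gamma*E[V(s')]
    V = Q.min(axis=0)
policy = Q.argmin(axis=0)                 # cost-minimizing action per state
print({s: actions[a] for s, a in zip(states, policy)})
```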
An efficient, radial basis function-based extension of multifidelity sequential Kriging optimization was developed. This method, referred to as multifidelity sequential radial basis optimization (MFSRBO) addresses multicriteria optimization involving more than a single type of model representing more than a single discipline, and takes into account...
The purpose of this article is to explore the interaction of two opposing forces: the force of wealth accumulation in spurring job creation and the force of satisfaction and the “magic number” in causing job destruction. An agent-based model is proposed to explore the potentially competing effects of two hypothesized economic forces...
In this article, we attempt to simulate the election lines in four central Florida counties in the 2012 presidential election. To do this, we estimate the numbers of booths at all locations and the service times using data about poll closing times and numbers of ballot items at all 479 locations. Then, we investigate the relevance of an optimizatio...
Many inputs for simulation optimization models are assumed to come from known distributions. When such distributions are obtained from small sample sizes, the parameters of these distributions may be associated with an “uncertainty set” or ranges. The presence of this uncertainty means that one or more solutions may be optimal depending on which pa...
We apply service-operations-management concepts to improve the efficiency and equity of voting systems. Recent elections in the United States and elsewhere have been plagued by long lines, excessive waiting times, and perceptions of unfairness. We build models for the waiting lines at voting precincts using both traditional steady-state queueing me...
Providing equal access to public service resources is a fundamental goal of democratic societies. Growing research interest in public services (e.g., health care, humanitarian relief, elections) has increased the importance of considering objective functions related to equity. This article studies discrete resource allocation problems where the dec...
This case study documents the transition of an undergraduate software laboratory from face-to-face only instruction to a blended-learning model motivated, in part, by instructor cost savings. To assure that quality in learning outcomes was preserved, we implemented the transition using a randomized experiment. Participating students were randomly assign...
This article proposes a method for Pareto charting that is based on unsupervised, freestyle text such as customer complaint, rework, scrap, or maintenance event descriptions. The proposed procedure is based on a slight extension of the latent Dirichlet allocation method to form multifield latent Dirichlet allocation. The extension is the usage of f...
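A rough sketch of the single-field version of this idea, using scikit-learn's ordinary latent Dirichlet allocation rather than the proposed multifield extension, on invented complaint snippets:

```python
# Pareto chart of dominant LDA topics across freestyle-text records.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = ["paint scratch on door panel", "door panel misaligned",
        "engine noise at idle", "engine stalls when cold",
        "paint bubbling near trim", "rattle noise from engine bay"]
X = CountVectorizer().fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

dominant = lda.transform(X).argmax(axis=1)       # dominant topic per record
topics, counts = np.unique(dominant, return_counts=True)
for t in np.argsort(-counts):                    # Pareto: largest first
    print(f"topic {topics[t]}: {counts[t]} records")
```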
This paper explores the issue of model misspecification, or bias, in the context of response surface design problems involving quantitative and qualitative factors. New designs are proposed specifically to address bias and compared with five types of alternatives ranging from types of composite to D-optimal designs using four criteria including D-e...
This paper proposes a plot-based method for fractional factorial data analysis. The proposed plot is called a "response-probability model analysis plot" (RPMAP) because it displays the predicted responses associated with alternative models and decisions versus the model posterior probabilities. Benefits of the proposed method include unique informa...
In particular, the “Decide” module provides perhaps the most critically important way to build structure into simulations. Decide can route entities based on a probabilistic condition such as whether or not parts conform to specifications. More commonly, perhaps, it can route based on system conditions. For example, an entity might enter the shorte...
This chapter describes practical information relevant to simulation projects. In general, the inspiration for system changes can come from many sources including from competitors. Also, the creativity involved with identifying alternative systems is generally critical to project success. Simulation is usually only useful for evaluating hypothetical...
Computer speeds continue to increase. At the same time, the complexity and realism of simulations also continues to increase. For example, 20 replicates of the voting systems simulation in Chap. 7 involve approximately 10 million simulated voters. Currently, a standard PC requires several minutes to yield the expected worst precinct closing time es...
In this chapter, computer simulation approaches in addition to discrete event simulation are described. The focus is primarily on agent-based modeling which is defined as the activity of simulating system-wide properties as they derive from the actions and interactions of autonomous entities. This contrasts with system dynamics and other differenti...
Queuing theory models provide these benefits with at least two types of associated costs. First, users need to make a limiting set of assumptions about arrivals and service distributions. These assumptions might not apply to any reasonable approximation in some cases of interest. Making them could lead to inadvisable recommendations. Second, queuin...
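For reference, the classical steady-state M/M/s expected-wait computation (Erlang C) on which such queueing models rest; the arrival and service rates below are illustrative, not from the chapter:

```python
# Mean wait in queue for an M/M/s system via the Erlang C formula.
from math import factorial

def mms_expected_wait(lam, mu, s):
    """Wq for Poisson arrivals (lam), exp. service (mu), s servers; lam < s*mu."""
    rho = lam / (s * mu)
    a = lam / mu
    p0 = 1.0 / (sum(a**k / factorial(k) for k in range(s))
                + a**s / (factorial(s) * (1 - rho)))
    erlang_c = (a**s / (factorial(s) * (1 - rho))) * p0   # P(wait > 0)
    return erlang_c / (s * mu - lam)

# e.g., 100 voters/hour arriving, 22 voters/hour per booth, 5 booths
print(f"Wq = {60 * mms_expected_wait(100, 22, 5):.1f} minutes")
```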
This chapter describes the standard process for performing discrete event simulations to estimate expected waiting times. In doing so, it describes event-based “controllers” that generate chronologies differentiating discrete event simulation from other types of statistical simulations.
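A minimal sketch of the event-based controller idea: a simulation clock and a time-ordered event list drive a single-server queue with assumed exponential arrival and service rates:

```python
# Event-list controller for a single-server FIFO queue.
import heapq, random

random.seed(0)
events = [(random.expovariate(1.0), "arrival")]   # (time, kind) event list
queue, waits, t_free, clock = [], [], 0.0, 0.0
while clock < 1000.0:
    clock, kind = heapq.heappop(events)           # controller: advance clock
    if kind == "arrival":                         # record and reschedule
        queue.append(clock)
        heapq.heappush(events, (clock + random.expovariate(1.0), "arrival"))
    if queue and clock >= t_free:                 # server idle: begin service
        waits.append(clock - queue.pop(0))
        t_free = clock + random.expovariate(1.2)
        heapq.heappush(events, (t_free, "departure"))
print(f"mean wait = {sum(waits) / len(waits):.2f}")
```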
This chapter introduces a software package used widely for instruction and real-world decision-support often referred to as ARENA software. ARENA is effectively a suite of software which includes the ARENA simulator, the Input Analyzer (for input analysis), and the Process Analyzer (PAN) (for output analysis). All of these are produced by Rockwell...
This chapter describes methods for gathering and analyzing real world data to support discrete event simulation modeling. In this phase, the distribution approximations for each process including arrivals are estimated using a combination of field observations (e.g., based on stopwatch timings) and assumption-making. In many cases, the time and cos...
After input analysis, model building, and model validation, decision support is not immediately available. The simulation team simply has a model to predict outputs or responses for given combinations of input or factor settings. Showing related animations and the results from a single system is rarely sufficient. While the process of building the...
There is a growing body of knowledge describing the economic and social challenge faced by the United States because of the small (14%) and decreasing number of students pursuing Science, Technology, Engineering, and Mathematics (STEM) majors. We propose a simple two-period, agent-based simulation based on socia