Jason L. Loeppky

University of British Columbia - Okanagan · Department of Statistics

Ph.D.

About

35 Publications · 7,202 Reads · 1,104 Citations

Publications (35)
Article
Full-text available
We consider the challenge of numerically comparing optimization algorithms that employ random restarts under the assumption that only limited test data is available. We develop a bootstrapping technique to estimate the incumbent solution of the optimization problem over time as a stochastic process. The asymptotic properties of the estimator are ex...
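The abstract is truncated, but the core device it describes — treating the best-so-far (incumbent) value as a stochastic process and bootstrapping over restarts — can be sketched. A minimal sketch, assuming the test data arrive as a matrix of objective values with one row per restart; the array names and synthetic values are illustrative, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical test data: objective values from 20 random restarts,
# each recorded over 100 iterations (lower is better).
runs = 5 - 0.1 * rng.standard_normal((20, 100)).cumsum(axis=1)

# Incumbent (best-so-far) trajectory within each restart.
incumbent = np.minimum.accumulate(runs, axis=1)

# Bootstrap over restarts to estimate the mean incumbent trajectory
# with a pointwise 95% percentile band.
B = 2000
boot = np.empty((B, runs.shape[1]))
for b in range(B):
    idx = rng.integers(0, runs.shape[0], size=runs.shape[0])
    boot[b] = incumbent[idx].mean(axis=0)

lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
print(lo[-1], hi[-1])  # uncertainty about the final incumbent value
```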
Article
High intensity interval training (HIIT) elicits health benefits, but it is unclear how HIIT impacts sedentary behaviour. In this preliminary study, we compared the effects of supervised HIIT or moderate intensity continuous training (MICT) on sedentary time in overweight/obese adults. In both groups, the percentage of time spent in sedentary activities...
Article
Gaussian processes are widely used in the analysis of data from a computer model. Ideally, the analysis will yield accurate predictions with correct coverage probabilities of credible intervals. In this paper, we first review several existing Bayesian implementations in the literature. We show that Bayesian approaches with squared-exponential corre...
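A minimal sketch of the model class discussed above — a Gaussian process with squared-exponential correlation — using fixed hyperparameters rather than a full Bayesian treatment; the toy data, length-scale, and jitter are assumptions for illustration:

```python
import numpy as np

def sq_exp(X1, X2, ell=0.3, sigma2=1.0):
    # Squared-exponential correlation between two point sets.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return sigma2 * np.exp(-0.5 * d2 / ell**2)

# Toy "computer model" output on a 1-D input.
X = np.linspace(0, 1, 8)[:, None]
y = np.sin(2 * np.pi * X[:, 0])

K = sq_exp(X, X) + 1e-8 * np.eye(len(X))   # jitter for numerical stability
L = np.linalg.cholesky(K)

Xs = np.linspace(0, 1, 200)[:, None]
Ks = sq_exp(X, Xs)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
mean = Ks.T @ alpha                                    # predictive mean
v = np.linalg.solve(L, Ks)
var = sq_exp(Xs, Xs).diagonal() - (v**2).sum(axis=0)   # predictive variance

# 95% credible band whose empirical coverage one would then check.
band = 1.96 * np.sqrt(np.maximum(var, 0))
```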
Article
Structural health monitoring is widely applied in industrial sectors as it reduces costs associated with maintenance intervals and manual inspections of damage in sensitive structures, while enhancing their operational safety. A major concern and current challenge in developing “robust” structural health monitoring systems, however, is the impact of...
Article
Water utilities often rely on water main failure prediction models to develop an effective maintenance, rehabilitation and replacement (M/R/R) action plan. However, understanding water main failure is difficult due to various uncertainties. In this study, a Bayesian updating based water main failure prediction framework is developed to...
Article
Current performance-based building design considers the maximum interstorey drift (MISD) ratio as the main structural performance indicator. Observations from past earthquakes and reported studies, however, have highlighted that the residual interstorey drift (RISD) ratio has become an important factor in assessing the post-earthquake safety of buildings, an...
Article
Full-text available
Background & Aims: Long chain omega-3 polyunsaturated fatty acids (n-3 PUFA), such as docosahexaenoic acid (DHA), are widely considered beneficial for infant health and development. The aim of this meta-analysis was to summarize the evidence related to the clinical outcomes of long chain n-3 PUFA supplementation maternally and in fortified formula/t...
Article
Full-text available
Statistical methods based on a regression model plus a zero-mean Gaussian process (GP) have been widely used for predicting the output of a deterministic computer code. There are many suggestions in the literature for how to choose the regression component and how to model the correlation structure of the GP. This article argues that comprehensive,...
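For reference, the model class the abstract refers to is the standard one in the computer-experiments literature: a regression mean plus a zero-mean GP. In conventional notation (assumed here, not necessarily the article's), with regression matrix F and correlation vector r(x_0),

\[
y(x) = f(x)^\top \beta + Z(x), \qquad \operatorname{Cov}\{Z(x), Z(x')\} = \sigma^2 R(x, x'),
\]
\[
\hat{y}(x_0) = f(x_0)^\top \hat{\beta} + r(x_0)^\top R^{-1}\,(y - F\hat{\beta}), \qquad \hat{\beta} = (F^\top R^{-1} F)^{-1} F^\top R^{-1} y .
\]

Here \(\hat{\beta}\) is the generalized least-squares estimate and the second term in \(\hat{y}(x_0)\) is the GP correction to the regression fit.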
Article
Constrained blackbox optimization is a difficult problem, with most approaches coming from the mathematical programming literature. The statistical literature is sparse, especially in addressing problems with nontrivial constraints. This situation is unfortunate because statistical methods have many attractive properties: global scope, handling noi...
Article
Full-text available
Space-filling designs are central to studying complex systems: they allow one to understand the overall behaviour of the response over the input space and to construct models with reduced uncertainty. In many applications, constraints imposed on the inputs result in a non-rectangular and sometimes non-convex input space. In these cases t...
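One simple way to realize the constrained setting above — a sketch under assumptions: oversample a Latin hypercube, reject points violating a made-up constraint, then greedily keep a maximin subset. The constraint, sizes, and seeds are placeholders, not the paper's construction:

```python
import numpy as np
from scipy.stats import qmc

# Hypothetical non-rectangular region: the triangle x1 + x2 <= 1 in [0, 1]^2.
def feasible(X):
    return X[:, 0] + X[:, 1] <= 1.0

# Oversample, reject infeasible candidates, then greedy maximin selection.
cand = qmc.LatinHypercube(d=2, seed=1).random(2000)
cand = cand[feasible(cand)]

n = 20
design = [cand[0]]
for _ in range(n - 1):
    # Distance from every candidate to its nearest already-chosen point.
    d = np.linalg.norm(
        cand[:, None, :] - np.array(design)[None, :, :], axis=2
    ).min(axis=1)
    design.append(cand[np.argmax(d)])    # add the most isolated candidate
design = np.array(design)
```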
Article
In this paper, we investigate the problem of assessing statistical methods and effectively summarizing results from simulations. Specifically, we consider problems of the type where multiple methods are compared on a reasonably large test set of problems. These simulation studies are typically used to provide advice on an effective method for analy...
Article
Full-text available
Adaptive and sequential experiment design is a well-studied area in numerous domains. We survey and synthesize the work of the online statistical learning paradigm referred to as multi-armed bandits, integrating the existing research as a resource for a certain class of online experiments. We first explore the traditional stochastic model of a multi...
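As a concrete instance of the traditional stochastic model mentioned above, here is the classic UCB1 index policy on Bernoulli arms — a standard textbook algorithm, not code from the paper; the arm means are invented:

```python
import numpy as np

def ucb1(means, horizon=10_000, seed=0):
    """UCB1: pull the arm with the largest mean-plus-exploration-bonus index."""
    rng = np.random.default_rng(seed)
    k = len(means)
    counts = np.zeros(k)
    values = np.zeros(k)
    for t in range(1, horizon + 1):
        if t <= k:                                   # play each arm once
            arm = t - 1
        else:
            arm = int(np.argmax(values + np.sqrt(2 * np.log(t) / counts)))
        reward = rng.random() < means[arm]           # Bernoulli reward
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]
    return counts

print(ucb1([0.3, 0.5, 0.7]))  # pulls concentrate on the best (0.7) arm
```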
Article
Full-text available
This is a discussion of the paper "Modeling an Augmented Lagrangian for Improved Blackbox Constrained Optimization" (Gramacy, R. B., Gray, G. A., Digabel, S. L., Lee, H. K. H., Ranjan, P., Wells, G., and Wild, S. M., Technometrics, 61, 1–38, 2015).
Article
Full-text available
Determining optimal surveillance networks for an emerging pathogen is difficult since it is not known beforehand what the characteristics of a pathogen will be or where it will emerge. The resources for surveillance of infectious diseases in animals and wildlife are often limited and mathematical modeling can play a supporting role in examining a w...
Conference Paper
Full-text available
Restless bandits model the exploration vs. exploitation trade-off in a changing (non-stationary) world. Restless bandits have been studied in both the context of continuously-changing (drifting) and change-point (sudden) restlessness. In this work, we study specific classes of drifting restless bandits selected for their relevance to modelling an o...
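For drifting (as opposed to change-point) restlessness, one generic recipe is discounted UCB, which geometrically down-weights past rewards so the index can track a slowly moving mean. This is a sketch of that general idea, not the specific classes studied in the paper; the discount factor and drift paths are invented:

```python
import numpy as np

def discounted_ucb(mu_paths, gamma=0.98, seed=0):
    # Discounted UCB for drifting Bernoulli arms: old observations decay
    # geometrically, so the index can follow a slowly changing mean.
    rng = np.random.default_rng(seed)
    horizon, k = mu_paths.shape
    N = np.zeros(k)                 # discounted pull counts
    S = np.zeros(k)                 # discounted reward sums
    pulls = np.zeros(k, dtype=int)
    for t in range(horizon):
        if t < k:                   # initialize by playing each arm once
            arm = t
        else:
            arm = int(np.argmax(S / N + np.sqrt(2 * np.log(N.sum()) / N)))
        reward = rng.random() < mu_paths[t, arm]
        N *= gamma                  # discount everything seen so far
        S *= gamma
        N[arm] += 1
        S[arm] += reward
        pulls[arm] += 1
    return pulls

# Two arms whose means drift linearly and cross mid-horizon.
T = 5000
mu = np.column_stack([np.linspace(0.8, 0.2, T), np.linspace(0.2, 0.8, T)])
print(discounted_ucb(mu))           # pulls should shift toward arm 1 over time
```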
Article
Water distribution networks (WDNs) are among the most important and expensive municipal infrastructure assets that are vital to public health. Municipal authorities strive for implementing preventive (or proactive) programs rather than corrective (or reactive) programs. The ability to predict the failure of pipes in WDNs is vital in the proactive i...
Article
Full-text available
We propose a new approach to comparing simulated observations that enables us to determine the significance of the underlying physical effects. We utilize the methodology of experimental design, a subfield of statistical analysis, to establish a framework for comparing simulated position-position-velocity data cubes to each other. We propose three...
Article
A mixture experiment is characterized by having two or more inputs that are specified as a percentage contribution to a total amount of material. In such situations, the input variables are correlated because they must sum to one. Consequently, additional care must be taken when fitting statistical models or visualizing the effect of one or more in...
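A short sketch of why the sum-to-one constraint changes the modeling: the standard Scheffé quadratic model drops the intercept and pure quadratic terms, which are not identifiable on the simplex. The design points and responses below are placeholders:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative 3-component simplex-lattice design: rows sum to one.
X = np.array([
    [1, 0, 0], [0, 1, 0], [0, 0, 1],
    [0.5, 0.5, 0], [0.5, 0, 0.5], [0, 0.5, 0.5], [1/3, 1/3, 1/3],
])
y = rng.standard_normal(len(X))      # placeholder responses

# Scheffé quadratic model: linear blending terms plus cross products only.
F = np.column_stack([
    X,
    X[:, 0] * X[:, 1], X[:, 0] * X[:, 2], X[:, 1] * X[:, 2],
])
beta, *_ = np.linalg.lstsq(F, y, rcond=None)
```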
Article
Consider the weighted maxmin dispersion problem of locating point(s) in a given region X ⊆ ℝ^n that is/are furthest from a given set of m points. The region is assumed to be convex under componentwise squaring. We show that this problem is NP-hard even when X is a box and the weights are equal. We then propose a convex relaxation of this problem for...
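In display form, the problem stated in the abstract (writing p_1, …, p_m for the given points and w_i for the weights; this notation is assumed):

\[
\max_{x \in X} \; \min_{1 \le i \le m} \; w_i \,\lVert x - p_i \rVert^2, \qquad X \subseteq \mathbb{R}^n .
\]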
Conference Paper
We describe position-position-velocity data cubes derived from the star-forming interstellar medium (SFISM). The physical characteristics and evolution of the SFISM must be deduced from the incomplete information present in the observed data. Astrophysicists can simulate the evolution of the SFISM from first principles but the standard comparisons...
Article
In this paper we define a new class of designs for computer experiments. A projection array based design defines sets of simulation runs with properties that extend the conceptual properties of orthogonal array based Latin hypercube sampling, particularly to underlying design structures other than orthogonal arrays. Additionally, we illustrate how...
Article
Sequential experiment design strategies have been proposed for efficiently augmenting initial designs to solve many problems of interest to computer experimenters, including optimization, contour and threshold estimation, and global prediction. We focus on batch sequential design strategies for achieving maturity in global prediction of discrepancy...
Article
Full-text available
We provide reasons and evidence supporting the informal rule that the number of runs for an effective initial computer experiment should be about 10 times the input dimension. Our arguments quantify two key characteristics of computer codes that affect the sample size required for a desired level of accuracy when approximating the code via a Gaussi...
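The rule itself is one line of arithmetic; a minimal sketch of applying it to size an initial Latin hypercube design (the dimension d = 4 is an arbitrary example):

```python
from scipy.stats import qmc

d = 4                    # input dimension of the computer code
n = 10 * d               # the informal n = 10d rule
design = qmc.LatinHypercube(d=d, seed=0).random(n)  # n x d initial design
```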
Article
Computer models simulating a physical process are used in many areas of science. Due to the complex nature of these codes it is often necessary to approximate the code, which is typically done using a Gaussian process. In many situations the number of code runs available to build the Gaussian process approximation is limited. When the initial desig...
Article
The investigation of complex physical systems utilizing sophisticated computer models has become commonplace with the advent of modern computational facilities. In many applications, experimental data on the physical systems of interest is extremely expensive to obtain and hence is available in limited quantities. The mathematical systems implement...
Article
In many industrial applications, the experimenter is interested in estimating some of the main effects and two-factor interactions. In this article we rank two-level orthogonal designs based on the number of estimable models containing a subset of main effects and their associated two-factor interactions. By ranking designs in this way, the experim...
Article
In recent years there has been considerable attention paid to robust parameter design as a strategy for variance reduction. Of particular concern is the selection of a good experimental plan in light of the two different types of factors in the experiment (control and noise) and the asymmetric manner in which effects of the same order are treated....
Article
Full-text available
Computer models to simulate physical phenomena are now widely available in engineering and science. Before relying on a computer model, a natural first step is often to compare its output with physical or field data, to assess whether the computer model reliably represents the real world. Field data, when available, can also be used to calibrate or...
Article
In many industrial settings, experimental replication is sacrificed for run size. This can present serious difficulties in the analysis. Further restrictions are often placed on the experiment due to the inability or cost of completely randomizing the run order. Two situations where this arises are in blocked fractional factorial experiments and fr...
Article
The terms curvature and interaction traditionally are not defined or used in the context of mixture experiments because curvature and interaction effects are partially confounded due to the mixture constraint that the component proportions sum to 1. Instead, the term nonlinear blending traditionally is defined and used. However, just as the concept...
Article
A glass composition variation study (CVS) for high-level waste (HLW) stored at the Idaho National Engineering and Environmental Laboratory (INEEL) is being statistically designed and performed in phases over several years. The purpose of the CVS is to investigate and model how HLW-glass properties depend on glass composition within a glass composit...
Article
Full-text available
Transformations can help small sample likelihood/Bayesian inference by improving the approximate normality of the likelihood/posterior. In this article we investigate when one can expect an improvement for a one-dimensional random function (Gaussian process) model. The log transformation of the range parameter is compared with an alternative (the...
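A sketch of the object under study: the 1-D GP log-likelihood evaluated over the range parameter on the log scale, whose shape the paper compares against an alternative transformation. The kernel form, data, and grid are illustrative assumptions:

```python
import numpy as np

def gp_loglik(log_theta, X, y):
    # Zero-mean, unit-variance GP log-likelihood (up to a constant)
    # as a function of the log range parameter.
    theta = np.exp(log_theta)
    R = np.exp(-0.5 * (X[:, None] - X[None, :]) ** 2 / theta**2)
    R += 1e-8 * np.eye(len(X))
    L = np.linalg.cholesky(R)
    a = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return -0.5 * y @ a - np.log(np.diag(L)).sum()

rng = np.random.default_rng(3)
X = np.sort(rng.random(15))
y = np.sin(4 * X) + 0.05 * rng.standard_normal(15)

# Profile over log(theta); the paper asks when this profile is closer
# to a normal shape than the profile over theta itself.
grid = np.linspace(-4, 1, 100)
ll = np.array([gp_loglik(g, X, y) for g in grid])
```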
Article
Full-text available
Thesis (Ph.D.), Simon Fraser University, 2004. Includes bibliographical references.
