Bradley Jones
SAS Institute · JMP Division
Ph.D. Applied Economics
About
107 Publications · 71,776 Reads
3,724 Citations
Publications (107)
Nonregular fractional factorial designs are a preferable alternative to regular resolution IV designs because they avoid confounding 2‐factor interactions. As a result, nonregular designs can estimate and identify a few active 2‐factor interactions. However, due to the sometimes complex alias structure of nonregular designs, standard factor screeni...
There is limited literature on screening when some factors are at three levels and others are at two levels. This topic has seen renewed interest of late following the introduction of the definitive screening design structure by Jones and Nachtsheim (2011)...
In practice, optimal screening designs for arbitrary run sizes are traditionally generated using the D-criterion with factor settings fixed at +/- 1, even when considering continuous factors with levels in [-1, 1]. This paper identifies cases of undesirable estimation variance properties for such D-optimal designs and argues that generally A-optima...
This case study describes a novel use of a definitive screening design (DSD) run in a three-step process to create a special resin. In the first step, polymerization, three factors were varied. The two most important properties of the resin result from the polymerization step. Due to a useful feature of a DSD, the measured values of these propertie...
The prediction profiler, as it is now called in the statistical software JMP, was introduced at a talk with discussion entitled, “An Interactive Graph for Exploring Multidimensional Response Surfaces.” The talk was part of a session at the Joint Statistics Meetings in Atlanta, Georgia in 1991. The original intent of the plot was to provide a graph...
The purpose of this article is to persuade experimenters to choose A-optimal designs rather than D-optimal designs for screening experiments. The primary reason for this advice is that the A-optimality criterion is more consistent with the screening objective than the D-optimality criterion. The goal of screening experiments is to identify an activ...
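As an illustrative sketch (not code from the article) of the two criteria the abstract contrasts, the D-criterion is the determinant of the information matrix X'X, while the A-criterion is the trace of its inverse, i.e. the sum of the parameter-estimate variances up to sigma^2:

```python
import numpy as np

def d_criterion(X):
    # D-optimality maximizes det(X'X)
    return np.linalg.det(X.T @ X)

def a_criterion(X):
    # A-optimality minimizes trace((X'X)^-1), the sum of the
    # parameter-estimate variances (up to sigma^2)
    return np.trace(np.linalg.inv(X.T @ X))

# A 2^3 full factorial with intercept: orthogonal, so both criteria
# attain their best possible values for 8 runs.
levels = np.array([-1.0, 1.0])
grid = np.array([[a, b, c] for a in levels for b in levels for c in levels])
X = np.column_stack([np.ones(8), grid])
print(d_criterion(X))   # det(8*I_4) = 4096
print(a_criterion(X))   # trace(I_4 / 8) = 0.5
```

For orthogonal designs the two criteria agree on the optimum; the article's argument concerns the non-orthogonal run sizes where they rank candidate designs differently.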
In developing screening experiments for two‐level factors, practitioners typically are familiar with regular fractional factorial designs, which are orthogonal, globally D‐optimal (i.e., 100% D‐efficient), and exist if N is a power of two. In addition, nonregular D‐optimal orthogonal designs can be generated for almost any N a multiple of four, the m...
This article considers how to provide pure-error degrees of freedom for definitive screening designs (DSDs) through partial replication. We provide two methods for obtaining pure-error degrees of freedom while minimizing the additional cost. We compare the properties of the two methods and make recommendations for the practitioner.
This paper describes the construction and analysis of definitive screening designs.
In this paper, we propose a new method for constructing supersaturated designs that is based on the Kronecker product of two carefully-chosen matrices. The construction method leads to a partitioning of the columns of the design such that the columns within a group are correlated to the others within the same group, but are orthogonal to any factor...
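A toy sketch of the general idea, with matrices chosen here purely for illustration (the paper's carefully-chosen matrices differ): the Kronecker product of a matrix with orthogonal columns and a small two-level matrix yields a supersaturated candidate design whose cross-group column correlations vanish while within-group correlations remain:

```python
import numpy as np

# Hypothetical ingredients; H's orthogonal columns make columns from
# different Kronecker "groups" orthogonal, since the inner product of
# kron columns factorizes as (H_p . H_q) * (B_j . B_k).
H = np.array([[1, 1], [1, -1]])          # 2x2 Hadamard matrix
B = np.array([[1, 1, 1], [1, -1, 1]])    # small +/-1 matrix (illustrative)
D = np.kron(H, B)                        # 4 runs, 6 columns: supersaturated
S = D.T @ D
print(S[0, 3])   # cross-group inner product: 0
print(S[0, 2])   # within-group inner product: nonzero
```

With 6 columns in 4 runs the design is supersaturated (m > n - 1), so some within-group correlation is unavoidable; the construction confines it to known groups.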
A common occurrence in practical design of experiments is that one factor, called a nested factor, can only be varied for some but not all the levels of a categorical factor, called a branching factor. In this case, it is possible, but inefficient, to proceed by performing two experiments. One experiment would be run at the level(s) of the branchin...
This article presents a case study of developing a space-filling design (SFD) for a constrained mixture experiment when the experimental region is specified by single-component constraints (SCCs), linear multiple-component constraints (LMCCs), and nonlinear multiple-component constraints (NMCCs). Traditional methods and software for designing const...
Using maximum likelihood (ML) estimation for discrete choice modeling of small datasets causes two problems. The first problem is that the data may exhibit separation, in which case the ML estimates do not exist. Also, provided they exist, the ML estimates are biased. In this paper, we show how to adapt Firth's penalized likelihood estimation for u...
Space‐filling designs allow for exploration of responses when each continuous input factor is set to many different values over its range. While typical space‐filling designs treat all the input factors as continuous, some problems necessitate the use of nominal input factors. In such cases, it is desirable that the design for the continuous inputs...
In this article, we consider a situation in which an investigator has run an initial screening experiment, has eliminated some inactive factors, and has undertaken a second stage of experimentation in an effort to identify the optimal operating conditions. If sequential experimentation is possible, the standard approach to this problem is to identi...
Definitive screening designs (DSDs) were recently introduced by Jones and Nachtsheim (2011b). The use of three-level factors and the desirable aliasing structure of the DSDs make them potentially suitable for identifying main effects and second-order terms in one stage of experimentation. However, as the number of active effects approaches the numb...
When experimental resources are significantly constrained, resolution V fractional factorial designs are often prohibitively large for experiments with 6 or more factors. Resolution IV designs may also be cost prohibitive, as additional experimentation may be required to de-alias active 2-factor interactions (2FI). This paper introduces 20-run no-c...
Replicating runs in designed experiments is good practice. The most important reason to replicate runs is to allow for a model-independent estimate of the error variance. Without the pure error degrees of freedom provided by replicated runs, the error variance will be biased if the fitted model is missing an active effect. This work provides a repl...
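The model-independent error estimate that replication provides is the standard pure-error calculation: pool the within-replicate-set sums of squares and degrees of freedom. A minimal sketch with made-up data:

```python
import numpy as np

# Two settings run with replicates (data invented for illustration).
replicate_sets = [
    np.array([10.1, 9.8, 10.3]),   # 3 replicates of one factor setting
    np.array([15.2, 15.6]),        # 2 replicates of another setting
]
# Pure-error sum of squares: deviations from each set's own mean,
# so the estimate does not depend on any fitted model.
ss_pe = sum(((y - y.mean()) ** 2).sum() for y in replicate_sets)
df_pe = sum(len(y) - 1 for y in replicate_sets)   # 2 + 1 = 3 degrees of freedom
sigma2_hat = ss_pe / df_pe
print(df_pe)   # 3
```

Because each deviation is taken from its own replicate-set mean, sigma2_hat stays unbiased even if the fitted model omits an active effect, which is exactly the point the abstract makes.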
Since their introduction by Jones and Nachtsheim (2011), Definitive Screening Designs (DSDs) have seen application in fields as diverse as bio-manufactur...
We compare cost-efficient alternatives for the full factorial 2^4 design, the regular 2^(5-1) fractional factorial design, and the regular 2^(6-1) fractional factorial design that can fit the model consisting of all the main effects as well as all the two-factor interactions. For 4 and 5 factors we examine orthogonal arrays with 12 and 20 runs, respective...
In mixture experiments, the factors under study are proportions of the ingredients of a mixture. The special nature of the factors necessitates specific types of regression models, and specific types of experimental designs. Although mixture experiments usually are intended to predict the response(s) for all possible formulations of the mixture and...
The primary aim of screening experiments is to identify the active factors; that is, those having the largest effects on the response of interest. Large factor effects can be either main effects, two-factor interactions (2FIs), or even strong curvature effects. Because the number of runs in a screening experiment is generally on the order of the nu...
Recent work in two-level screening experiments has demonstrated the advantages of using small foldover designs, even when such designs are not orthogonal for the estimation of main effects. In this paper, we provide further support for this argument and develop a fast algorithm for constructing efficient two-level foldover (EFD) designs. We show th...
In some designed experiments, measurements of characteristics of the experimental units may be available prior to performing the runs. If the investigators believe that these measured characteristics may have some effect on the response of interest, then it seems natural to include these characteristics as factors in the experiment even though they...
Jones and Nachtsheim (2011) proposed a new class of screening designs called definitive screening designs. As originally presented, these designs are three-level designs for quantitative factors that provide estimates of main effects that are unbiased by any second-order effect and require only one more than twice as many runs as there are factors....
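The "one more than twice as many runs as factors" structure can be sketched via the well-known conference-matrix construction: stack C, -C, and a centre run, where C has a zero diagonal, +/-1 off-diagonal entries, and orthogonal columns. Below is one valid order-6 conference matrix (the specific matrix is our choice for illustration):

```python
import numpy as np

# One order-6 conference matrix: zero diagonal, +/-1 elsewhere, C'C = 5*I.
C = np.array([
    [ 0,  1,  1,  1,  1,  1],
    [ 1,  0,  1, -1, -1,  1],
    [ 1,  1,  0,  1, -1, -1],
    [ 1, -1,  1,  0,  1, -1],
    [ 1, -1, -1,  1,  0,  1],
    [ 1,  1, -1, -1,  1,  0],
])
assert np.array_equal(C.T @ C, 5 * np.eye(6, dtype=int))  # conference property

# Definitive screening design for m = 6 factors: 2m + 1 = 13 runs.
D = np.vstack([C, -C, np.zeros((1, 6), dtype=int)])
print(D.shape)                            # (13, 6)
# The mirrored fold-over keeps main-effect columns mutually orthogonal.
print((D.T @ D == 10 * np.eye(6)).all())  # True
```

The fold-over pairing of each run with its mirror image is what decouples main effects from all second-order terms, as the abstract states.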
We consider screening experiments where an investigator wishes to study many factors using fewer observations. Our focus is on experiments with two-level factors and a main effects model with intercept. Since the number of parameters is larger than the number of observations, traditional methods of inference and design are unavailable. In 1959, Box...
For deterministic computer simulations, Gaussian process models are a standard procedure for fitting data. These models can be used only when the study design avoids having replicated points. This characteristic is also desirable for one-dimensional projections of the design, since it may happen that one of the design factors has a strongly nonlinea...
Resolution III regular fractional factorial designs for 9–14 factors in 16 runs are standard designs for factor screening in industrial experimentation because of their economical run size. However, for all these designs, the main effects are completely confounded with some two-factor interaction(s), so experimenters must frequently either augment...
In many discrete choice experiments set up for product innovation, the number of attributes is large, which results in a substantial cognitive burden for the respondents. To reduce the cognitive burden in such cases, Green suggested in the early '70s the use of partial profiles that vary only the levels of a subset of the attributes. In this paper,...
In general, modeling data from blocked and split-plot response surface experiments requires the use of generalized least squares and the estimation of two variance components. The literature on the optimal design of blocked and split-plot response surface experiments, however, focuses entirely on the precise estimation of the fixed factor effects a...
Space-filling designs allow for exploration of responses with many different settings for each input factor. While much research has been done using rectangular design spaces, it is not uncommon to have constraints on the design region where some combinations are impossible or undesirable to run. In this article, we present an intuitive method for...
The no-confounding (NC) designs introduced by Jones and Montgomery (2010) are 16-run fractional factorials for six to eight factors having partial aliasing of the main effects by a few two-factor interactions but avoiding any complete confounding of any main effects or two-factor interactions with each other. These designs potentially allow for una...
The singular value decomposition of a real matrix always exists and is essentially unique. Based on the singular value decomposition of the design matrices of two general 2-level fractional factorial designs, new necessary and sufficient conditions for the determination of combinatorial equivalence or non-equivalence of the corresponding designs ar...
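A quick sketch of how the SVD gives a necessary condition of this kind: combinatorially equivalent designs (related by row/column permutations and sign switches) share the same singular values, so differing spectra prove non-equivalence, while matching spectra alone do not prove equivalence. This is a simplified illustration, not the paper's full set of conditions:

```python
import numpy as np

def spectrum(X):
    # Sorted singular values: invariant under row/column permutation
    # and sign switching of columns.
    return np.sort(np.linalg.svd(X, compute_uv=False))

X1 = np.array([[1, 1], [1, -1], [-1, 1], [-1, -1]])
X2 = X1[[2, 0, 3, 1]] * np.array([1, -1])   # permute rows, flip one column
print(np.allclose(spectrum(X1), spectrum(X2)))   # True: spectra agree
```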
Bradley Jones states that this article addresses problems that engineers and scientists experience in collaborating with statisticians. He aims to contribute to the authors' methodology by suggesting an alternative approach for creating the w-trace with the same computational effort. His approach is inspired by the method proposed by Sambo, Borrott...
Market segmentation is a key concept in marketing research. Identification of consumer segments helps in setting up and improving a marketing strategy. Hence, the need is to improve existing methods and to develop new segmentation methods. We introduce two new consumer indicators that can be used as segmentation basis in two-stage methods, the forc...
Recently, Jones and Nachtsheim (2011) proposed a new class of designs called definitive screening designs (DSDs). These designs have three levels, provide estimates of main effects that are unbiased by any second-order effect, require only one more than twice as many runs as there are factors, and avoid confounding of any pair of second-order effec...
Space-filling designs are a common choice of experimental design strategy for computer experiments. This article compares space-filling design types based on their theoretical prediction variance properties with respect to the Gaussian process model. An analytical solution for calculating the integrated prediction variance (IV) of the Gaussian proc...
Strip-plot designs are commonly used in situations where the production process consists of two process stages involving hard-to-change factors and where it is possible to apply the second stage to semifinished products from the first stage. In this paper, we focus on three-stage processes. As opposed to the three-stage strip-plot designs in the lit...
Many industrial experiments involve restricted rather than complete randomization. This often leads to the use of split-plot designs, which limit the number of independent settings of some of the experimental factors. These factors, named whole-plot factors, are often, in some way, hard to change. The remaining factors, called subplot factors, are...
Computational methods to explore posterior distributions, in particular Markov chain Monte Carlo (MCMC), have played a dominant role in Bayesian statistics over the last 30 years. These methods have enabled statisticians and researchers to tackle problems that defy closed-form solution, greatly expanding the scope of Bayesian analysis. Joseph's ing...
A panel of prominent experts, who represent many different areas of academia, research, and industry, answered a series of questions about the present and future of statistical engineering (SE). The experts discussed a new formal definition of SE that encompasses the integration of statistical thinking with the application of statistical methods an...
A panel of prominent experts, who represent different areas of academia, research, and industry, answered questions from diverse areas of industry, government, and academia about the changing roles for statisticians in the SE workplace and discussed some of the opportunities and challenges for the future. The members talked of the opportunities for s...
Response surface experiments often involve only quantitative factors, and the response is fit using a full quadratic model in these factors. The term response surface implies that interest in these studies is more on prediction than parameter estimation because the points on the fitted surface are predicted responses. When computing optimal designs...
In a discrete choice experiment, each respondent chooses the best product or service sequentially from many groups or choice sets of alternative goods. The alternatives, called profiles, are described by level combinations from a set of predefined attributes. Respondents sometimes make their choices on the basis of only one dominant attribute rather...
One attractive feature of optimum design criteria, such as D-and A-optimality, is that they are directly related to statistically interpretable properties of the designs that are obtained, such as minimizing the volume of a joint confidence region for the parameters. However, the assumed relationships with inferential procedures are valid only if t...
In a discrete choice experiment, each respondent chooses the best product or service sequentially from many groups or choice sets of alternative goods. The alternatives are described by levels of a set of predefined attributes and are also referred to as profiles. Respondents often find it difficult to trade off prospective goods when every attribu...
There are many situations in which the requirements of a standard experimental design do not fit the research requirements of the problem. Three such situations occur when the problem requires unusual resource restrictions, when there are constraints on the design region, and when a nonstandard model is expected to be required to adequately explain...
"This is an engaging and informative book on the modern practice of experimental design. The authors' writing style is entertaining, the consulting dialogs are extremely enjoyable, and the technical material is presented brilliantly but not overwhelmingly. The book is a joy to read. Everyone who practices or teaches DOE should read this book." -Dou...
Key concepts · Case: a robust and optimal process experiment · Peek into the black box · Background reading · Summary
Key concepts · The setup of a comparative experiment · Summary
Recently, the use of Bayesian optimal designs for discrete choice experiments, also called stated choice experiments or conjoint choice experiments, has gained much attention, stimulating the development of Bayesian choice design algorithms. Characteristic for the Bayesian design strategy is that it incorporates the available information about peop...
JMP is a statistical software environment that enables scientists, engineers, and business analysts to make discoveries through data exploration. One powerful method for beginning the process of discovery employs statistically designed experiments. A well-designed experiment ensures that the resulting data have large information content. We support...
Most two-level fractional factorial designs used in practice involve independent or fully confounded effects (so-called regular designs). For example, for 16 runs and 6 factors, the classical resolution IV design with defining relation I = ABCE = BCDF = ADEF has become the de facto gold standard. Recent work has indicated that non-regular orthogona...
For some experimenters, a disadvantage of the standard optimal design approach is that it does not consider explicitly the aliasing of specified model terms with terms that are potentially important but are not included in the model. For example, when constructing an optimal design for a first-order model, aliasing of main effects and interactions...
Due to the increasing interest in market segmentation in modern marketing research, several methods for dealing with consumer heterogeneity and for revealing market segments have been described in the literature. In this study, the authors compare eight two-stage segmentation methods that aim to uncover consumer segments by classifying subject-speci...
Screening designs are attractive for assessing the relative impact of a large number of factors on a response of interest. Experimenters often prefer quantitative factors with three levels over two-level factors because having three levels allows for some assessment of curvature in the factor–response relationship. Yet, the most familiar screening...
The Gaussian process (GASP) model has found widespread use as a surrogate model for results from deterministic computer model output. In this paper, we compare the fits of GASP models to specific space-filling designs based on their accuracy in predicting responses at previously unsampled locations. This is done empirically using several test funct...
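A minimal GASP-style interpolator for deterministic simulator output, using a squared-exponential kernel with hyperparameters fixed by hand (in practice they are estimated; the toy "simulator" and settings here are our own, not the paper's):

```python
import numpy as np

def sqexp(a, b, ell=0.3):
    # Squared-exponential correlation between 1-D input vectors a and b.
    d = a[:, None] - b[None, :]
    return np.exp(-(d / ell) ** 2)

x_train = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
y_train = np.sin(2 * np.pi * x_train)            # toy deterministic "simulator"
K = sqexp(x_train, x_train) + 1e-10 * np.eye(5)  # tiny jitter for stability
alpha = np.linalg.solve(K, y_train)

def predict(x_new):
    # Zero-mean GP posterior mean at new inputs.
    return sqexp(x_new, x_train) @ alpha

# With deterministic output the GP interpolates the training runs exactly,
# which is why such designs must avoid replicated points.
print(np.allclose(predict(x_train), y_train, atol=1e-6))   # True
```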
The resolution IV regular fractional factorial designs in 16 runs for six, seven, and eight factors are in standard use. They are economical and provide clear estimates of main effects when three‐factor and higher‐order interactions are negligible. However, because the two‐factor interactions are completely confounded, experimenters are frequently...
Problem: The dairy company FrieslandCampina had an opportunity to redesign the production process for its coffee cream. This product has a very specific viscosity, and the redesigned process had to result in the same viscosity as the old one.
Approach: For an effective redesign, the investigators wanted to obtain a simple model linking the settings...
We introduce a new algorithm for generating near G-optimal designs for second-order models over cuboidal regions. The algorithm involves the use of Brent's minimization algorithm with coordinate exchange to create designs for 2 to 5 factors. Designs created using this new method either match or exceed the G-efficiency of previously reported designs...
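The paper pairs coordinate exchange with Brent's continuous minimizer for the G-criterion; as a simplified sketch of the exchange loop itself, here is coordinate exchange over a discrete candidate set {-1, 0, 1} using the easier D-criterion (all choices here are ours, for illustration only):

```python
import numpy as np

def model_matrix(D):
    # Full quadratic model in 2 factors: intercept, main effects,
    # interaction, and pure quadratic terms (6 parameters).
    x1, x2 = D[:, 0], D[:, 1]
    return np.column_stack([np.ones(len(D)), x1, x2, x1 * x2, x1**2, x2**2])

def logdet(D):
    sign, ld = np.linalg.slogdet(model_matrix(D).T @ model_matrix(D))
    return ld if sign > 0 else -np.inf

# Start from a deliberately spoiled 9-run design: the 3x3 grid with one
# corner overwritten by a duplicate centre run.
D = np.array([[a, b] for a in (-1.0, 0.0, 1.0) for b in (-1.0, 0.0, 1.0)])
D[0] = [0.0, 0.0]
start = logdet(D)

improved = True
while improved:                      # sweep until no coordinate improves
    improved = False
    for i in range(D.shape[0]):
        for j in range(D.shape[1]):
            for level in (-1.0, 0.0, 1.0):
                best, old = logdet(D), D[i, j]
                D[i, j] = level      # try the exchange; keep only if better
                if logdet(D) > best + 1e-9:
                    improved = True
                else:
                    D[i, j] = old
print(logdet(D) >= start)   # True: exchanges only ever accept improvements
```

The G-criterion version replaces logdet with the (harder to compute) maximum prediction variance over the region, and replaces the discrete level loop with a continuous line search.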
The use of simulation as a modeling and analysis tool is widespread. Simulation is an enabling tool for experimenting virtually on a validated computer environment. Often the underlying function for a computer experiment result has too much curvature to be adequately modeled by a low-order polynomial. In such cases, finding an appropriate experime...
The past decade has seen rapid advances in the development of new methods for the design and analysis of split-plot experiments. Unfortunately, the value of these designs for industrial experimentation has not been fully appreciated. In this paper, we review recent developments and provide guidelines for the use of split-plot designs in industrial...
The cost of experimentation can often be reduced by forgoing complete randomization. A well-known design with restricted randomization is a split-plot design, which is commonly used in industry when some experimental factors are harder to change than others or when a two-stage production process is studied. Split-plot designs are also often used in...
In an effort to speed the development of new products and processes, many companies are turning to computer simulations to avoid the time and expense of building prototypes. These computer simulations are often complex, taking hours to complete one run. If there are many variables affecting the results of the simulation, then it makes sense to desi...
Recently, Kessels et al. (2006) developed a way to produce Bayesian G- and V-optimal designs for the multinomial logit model. These designs allow for precise response predictions, which is the goal of conjoint choice experiments. The authors showed that the G- and V-optimality criteria outperform the D- and A-optimality criteria for prediction. Howe...
Experimental design in nonlinear settings is complicated by the fact that the efficiency of a design depends on the unknown parameter values. Thus good designs need to be efficient over a range of likely parameter values. Bayesian design criteria provide a natural framework for achieving such robustness, by averaging local design criteria over a pr...
In industrial experimentation, there is growing interest in studies that span more than one processing step. Convenience often dictates restrictions in randomization in passing from one processing step to another. When the study encompasses three processing steps, this leads to split-split-plot designs. We provide an algorithm for computing D-optim...
When comparing different designs for an experiment, optimality criteria and other measures often depend on the correctness of the assumed model. In this article we develop and illustrate an approach for comparing designs given the potential effect of bias due to an underspecified model. We illustrate this approach using graphical summaries of the e...
Supersaturated designs are an increasingly popular tool for screening factors in the presence of effect sparsity. The advantage of this class of designs over resolution III factorial designs or Plackett–Burman designs is that n, the number of runs, can be substantially smaller than the number of factors, m. A limitation associated with most supersa...
In this paper, we argue that some of the prior parameter distributions used in the literature for the construction of Bayesian optimal designs are internally inconsistent. We rectify this error and provide practical advice on how to properly specify the prior parameter distribution. Also, we present two pertinent examples to illustrate that Bayesia...
We introduce a new class of supersaturated designs using Bayesian D-optimality. The designs generated using this approach can have arbitrary sample sizes, can have any number of blocks of any size, and can incorporate categorical factors with more than two levels. In side-by-side diagnostic comparisons based on the E(s2) criterion for two-level exp...
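The E(s^2) diagnostic mentioned here is the average squared off-diagonal element of X'X over all column pairs; smaller is better, and 0 (full orthogonality) is impossible when the design is truly supersaturated (m > n - 1). A short sketch of the computation:

```python
import numpy as np

def e_s2(X):
    # Average squared off-diagonal entry of X'X over all column pairs.
    S = X.T @ X
    iu = np.triu_indices(X.shape[1], k=1)
    return (S[iu] ** 2).mean()

# Orthogonal columns give E(s^2) = 0 ...
X = np.array([[1, 1], [1, -1], [-1, 1], [-1, -1]])
print(e_s2(X))    # 0.0
# ... while correlated columns inflate it.
X2 = np.array([[1, 1], [1, 1], [-1, -1], [-1, 1]])
print(e_s2(X2))   # 4.0
```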
Recent progress in model-robust designs has focused on maximizing estimation capacities. However, for a given design, two competing models may be both estimable and yet difficult or impossible to discriminate in the model selection procedure. In this paper, we propose several criteria for gauging the capability of a design for model discrimination....
We introduce a new method for generating optimal split-plot designs. These designs are optimal in the sense that they are efficient for estimating the fixed effects of the statistical model that is appropriate given the split-plot design structure. One advantage of the method is that it does not require the prior specification of a candidate set. T...