Bradley Rearden
Oak Ridge National Laboratory (ORNL) · Reactor and Nuclear Systems Division
PhD, Nuclear Engineering - Texas A&M University
About
131 Publications · 21,599 Reads
1,450 Citations
Publications (131)
New multigroup cross section libraries and associated covariance files that are optimized for the analysis of sodium-cooled fast reactors were generated with the AMPX tools distributed with the SCALE code system. These new libraries account for the fast neutron flux spectrum with sufficient resolution of resonances in the higher energy range to rep...
The subject of UACSA Phase IV was the quantification of covariances between neutron multiplication factors keff of criticality safety benchmark experiments due to uncertainties of system parameters shared by different experiments and the investigation of the impact of these covariances on criticality safety validation. Generally, these covariances...
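For orientation, a first-order sketch of how such covariances arise (generic notation assumed here, not taken from the benchmark specification): if experiments i and j share uncertain system parameters p with standard deviations \sigma_p, and S_{i,p} denotes the sensitivity of keff of experiment i to parameter p, then for independent parameters

    \mathrm{cov}(k_i, k_j) \approx \sum_p S_{i,p}\, S_{j,p}\, \sigma_p^2, \qquad c_{ij} = \frac{\mathrm{cov}(k_i, k_j)}{\sigma_i\, \sigma_j},

so experiments built from the same fuel batch or characterized by the same measurement technique acquire a nonzero correlation c_{ij} even when their Monte Carlo statistics are independent.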
A well-established knowledge of nuclear phenomena including fission, reaction cross sections, and structure/decay properties is critical for applications ranging from the design of new reactors to nonproliferation to the production of radioisotopes for the diagnosis and treatment of illness. However, the lack of a well-quantified, predictive theore...
Criticality safety analyses rely on the availability of relevant benchmark experiments to determine justifiable margins of subcriticality. When a target application lacks neutronically similar benchmark experiments, validation studies must provide justification to the regulator that the impact of modeling and simulation limitations is well understo...
Best Estimate Plus Uncertainty (BEPU) is an approach that incorporates predictive simulations into safety assessment processes. Since the 1980s, when best-estimate simulations were first accepted in safety assessment, the approach was associated only with the domain of system thermal hydraulics (SYS TH). Nowadays, due to a progres...
This paper summarizes the development of a capability for estimating the sensitivity of integral experiment results to evaluated nuclear data resonance parameters in the SCALE TSUNAMI module as the first step in using integral experiment results to improve nuclear data evaluations.
Criticality safety analyses rely on the availability of relevant benchmark experiments to determine justifiable margins of subcriticality. When a target application lacks neutronically similar benchmark experiments, validation studies must provide justification that the impact of modeling and simulation (M&S) limitations is well understood for the...
Criticality safety analyses rely on the availability of relevant benchmark experiments to determine justifiable margins of subcriticality. When a target application lacks neutronically similar benchmark experiments, validation studies must provide justification that the impact of modeling and simulation limitations is well understood for the applic...
Best Estimate Plus Uncertainty (BEPU) is fundamentally premised on the rigorous use of high-fidelity simulations in the safety assessment process. Although a shared definition of BEPU has not yet been presented, it appears to serve as a Decision Making (DM) support tool. BEPU of course requires consistent, experiment-based UQ; otherwise it could make little se...
This paper summarizes efforts to improve the efficiency of Cf-252 production at Oak Ridge National Laboratory's High Flux Isotope Reactor by using sensitivity analysis to identify potential Cf-252 isotope production target design optimizations. The Generalized Perturbation Theory sensitivity coefficient capabilities of the TSUNAMI-3D code within th...
As a basis for systematic sensitivity and uncertainty analyses of fast spectrum systems with modules of the SCALE code package, new multigroup cross section libraries are generated considering typical hard neutron flux spectra and a sufficient resolution of the resonances in the fast energy range. The performance of the new libraries is investigated in te...
This paper investigates the strength of statistical metrics for predicting the onset and magnitude of bias in Monte Carlo tally estimates due to fission source undersampling in eigenvalue simulations. Previous studies found that metrics that had shown potential for predicting undersampling biases in flux and eigenvalue estimates in multigroup sim...
This study focuses on understanding the phenomenon in Monte Carlo simulations known as undersampling, in which Monte Carlo tally estimates may not encounter a sufficient number of particles during each generation to obtain unbiased tally estimates. Steady-state Monte Carlo simulations were performed using the KENO Monte Carlo tools within the SCALE...
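As a concrete illustration of one standard fission-source convergence diagnostic, the sketch below computes the Shannon entropy of a binned source distribution in Python. This is a generic example, not a SCALE/KENO interface, and not necessarily one of the metrics evaluated in these studies; a low or still-drifting entropy across generations is a warning sign that the source is unconverged or sampled by too few histories.

    import numpy as np

    def shannon_entropy(sites, bounds, nbins=(8, 8, 8)):
        """Shannon entropy of a binned fission-source distribution."""
        hist, _ = np.histogramdd(sites, bins=nbins, range=bounds)
        p = hist.ravel() / hist.sum()
        p = p[p > 0]  # 0 * log2(0) is taken as 0
        return -np.sum(p * np.log2(p))

    # Illustrative use with a toy sampler standing in for tracked fission sites
    rng = np.random.default_rng(42)
    bounds = [(0.0, 1.0)] * 3
    for generation, n_sites in enumerate([100, 1000, 10000]):
        sites = rng.random((n_sites, 3))
        print(generation, shannon_entropy(sites, bounds))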
The sensitivity and uncertainty analysis tools of the Oak Ridge National Laboratory SCALE nuclear modeling and simulation code system that have been developed over the last decade have proven indispensable for numerous application and design studies for nuclear criticality safety and reactor physics. SCALE contains tools for analyzing the uncertain...
The need to model geometrically complex systems with improved ease of use and fidelity and the desire to extend the Tools for Sensitivity and UNcertainty Analysis Methodology Implementation (TSUNAMI) analysis to advanced applications have motivated the development of a methodology for calculating sensitivity coefficients in continuous-energy (CE) M...
The sensitivity and uncertainty analysis sequences of SCALE compute the sensitivity of keff to each constituent multigroup cross section using perturbation theory based on forward and adjoint transport computations with several available codes. Versions 6.0 and 6.1 of SCALE, released in 2009 and 2010, respectively, include important additions to t...
This study explored the potential of using Markov chain convergence diagnostics to predict the prevalence and magnitude of biases due to undersampling in Monte Carlo eigenvalue and flux tally estimates. Five metrics were applied to two models of pressurized water reactor fuel assemblies and their potential for identifying undersampling biases was e...
The paper presents uncertainty and sensitivity analyses of criticality calculations with respect to nuclear data, performed with the SCALE module TSUNAMI, which is based on general perturbation theory, and with the tools SAMPLER (to be available with the next SCALE release) and XSUSA, which are based on a random sampling approach. The investigation mainly use...
The TSUNAMI (Tools for Sensitivity and UNcertainty Analysis Methodology Implementation) capabilities within the SCALE code system make use of sensitivity coefficients for an extensive number of criticality safety applications, such as quantifying the data-induced uncertainty in the eigenvalue of critical systems, assessing the neutronic similarity...
This work introduces a new approach for calculating the sensitivity of generalized neutronic responses to nuclear data uncertainties using continuous-energy Monte Carlo methods. The GEneralized Adjoint Responses in Monte Carlo (GEAR-MC) method has enabled the calculation of high resolution sensitivity coefficients for multiple, generalized neutron...
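A hedged sketch of the generalized perturbation theory setting behind this (standard notation assumed, not quoted from the paper): for a ratio response R = \langle \Sigma_1, \phi \rangle / \langle \Sigma_2, \phi \rangle, the sensitivity to a data parameter splits into a direct term, evaluated with the flux held fixed, and an indirect term that accounts for the flux perturbation through a generalized adjoint function \Gamma^* satisfying

    \left( A^{\dagger} - \tfrac{1}{k} B^{\dagger} \right) \Gamma^* = \frac{\Sigma_1}{\langle \Sigma_1, \phi \rangle} - \frac{\Sigma_2}{\langle \Sigma_2, \phi \rangle},

where A and B are the transport and fission operators. The GEAR-MC contribution described in the abstract is the estimation of this adjoint-weighted indirect term within continuous-energy Monte Carlo.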
Three methods for calculating continuous-energy eigenvalue sensitivity coefficients were developed and implemented into the SHIFT Monte Carlo code within the Scale code package. The methods were used for several simple test problems and were evaluated in terms of speed, accuracy, efficiency, and memory requirements. A promising new method for calcu...
The need to model geometrically complex systems with improved ease of use and fidelity and the desire to extend the Tools for Sensitivity and UNcertainty Analysis Methodology Implementation (TSUNAMI) analysis to advanced applications have motivated the development of a methodology for calculating sensitivity coefficients in continuous-energy (CE) M...
SCALE is a widely used suite of tools for nuclear systems modeling and simulation that provides comprehensive, verified and validated, user-friendly capabilities for criticality safety, reactor physics, radiation shielding, and sensitivity and uncertainty analysis. For more than 30 years, regulators, licensees, and research institutions around the...
The Working Party on International Nuclear Data Evaluation Cooperation (WPEC) of the Nuclear Science Committee under the Nuclear Energy Agency (NEA/OECD) established a Subgroup (called “Subgroup 33”) in 2009 on “Methods and issues for the combined use of integral experiments and covariance data.” The first stage was devoted to producing the descrip...
Covariance data computational methods and data used for sensitivity and uncertainty analysis within the SCALE nuclear analysis code system are presented. Applications in criticality safety calculations and used nuclear fuel analysis are discussed.
A new covariance data library based on ENDF/B-VII.1 was recently processed for the SCALE nuclear analysis code system. The multigroup covariance data are discussed here, along with testing and application results for critical benchmark experiments. The cross section covariance library, along with covariances for fission product yields and decay dat...
The separation of concerns in the development of numerical models not only leads to a separation into components; depending on their purpose, these components may also be written in different programming languages. The sensitivity analysis of a numerical model provides quantitative information about the dependencies of the model outputs with respec...
This report has been issued by the WPEC Subgroup 33, whose mission was to review methods and issues of the combined use of integral experiments and covariance data, with the objective of recommending a set of best and consistent practices in order to improve evaluated nuclear data files. In particular, it is shown that the statistical adjustment me...
A new statistical sampling sequence called Sampler has been developed for the SCALE code system. Random values for the input multigroup cross sections are determined by using the XSUSA program to sample uncertainty data provided in the SCALE covariance library. Using these samples, Sampler computes perturbed self-shielded cross sections and propagat...
This paper presents a method for determining partial biases and bias uncertainties for application in fission product burnup credit criticality safety analysis. The contribution of each nuclide to the overall system keff bias and the bias uncertainty are determined via the generalized linear least squares method. Where experimental benchmarks are...
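A sketch of the generalized linear least squares (GLLS) update referenced here, in generic data-adjustment notation (assumed, not quoted from the paper): with prior parameters x_0, prior covariance C, sensitivity matrix S, experimental covariance V, and discrepancy vector d between measured and calculated responses,

    x' = x_0 + C S^{\mathsf{T}} \left( S C S^{\mathsf{T}} + V \right)^{-1} d, \qquad C' = C - C S^{\mathsf{T}} \left( S C S^{\mathsf{T}} + V \right)^{-1} S C.

Decomposing the resulting shift in a computed keff nuclide by nuclide through S is what yields the partial biases described above.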
The Verified, Archived Library of Inputs and Data (VALID) at Oak Ridge National Laboratory (ORNL) contains high-quality, independently reviewed models and results that improve confidence in analysis and validation. VALID is developed and maintained according to a procedure of the SCALE quality assurance (QA) plan. This paper reviews the origins of...
The sensitivities of the keff eigenvalue to neutron cross sections have become commonly used in similarity studies and as part of the validation algorithm for criticality safety assessments. To test calculations of the sensitivity coefficients, a benchmark study (Phase III) has been established by the OECD-NEA/WPNCS/EG UACSA (Expert Group on Uncert...
This study introduced two new approaches for calculating the importance weighting function for Contributon and CLUTCH eigenvalue sensitivity coefficient calculations, and compared them in terms of accuracy and applicability. The necessary levels of mesh refinement and mesh convergence for obtaining accurate eigenvalue sensitivity coefficients were...
This study introduced three approaches for calculating the importance weighting function for Contributon and CLUTCH eigenvalue sensitivity coefficient calculations, and compared them in terms of accuracy and applicability. The necessary levels of mesh refinement and mesh convergence for obtaining accurate eigenvalue sensitivity coefficients were de...
A new statistical sampling sequence called Sampler has been developed for the SCALE code system. Random values for the input multigroup cross sections are determined by using the XSUSA program to sample uncertainty data provided in the SCALE covariance library. Using these samples, Sampler computes perturbed self-shielded cross sections and propaga...
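In the same spirit, here is a minimal sampling-based uncertainty propagation sketch in Python; the model, nominal data, and covariance below are toy placeholders, not SCALE or XSUSA interfaces.

    import numpy as np

    rng = np.random.default_rng(0)

    sigma0 = np.array([1.0, 2.0, 0.5])        # nominal multigroup data (toy)
    rel_cov = np.array([[0.04, 0.01, 0.00],   # relative covariance (toy)
                        [0.01, 0.09, 0.02],
                        [0.00, 0.02, 0.16]])
    cov = rel_cov * np.outer(sigma0, sigma0)  # absolute covariance

    def model(sigma):
        # Placeholder for a transport calculation returning keff.
        return 1.0 + 0.05 * sigma[0] - 0.02 * sigma[1] + 0.01 * sigma[2]

    samples = rng.multivariate_normal(sigma0, cov, size=500)
    keff = np.array([model(s) for s in samples])
    print(f"mean keff = {keff.mean():.5f}, std = {keff.std(ddof=1):.5f}")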
This paper describes the Monte Carlo codes KENO V.a and KENO-VI in SCALE that are primarily used to calculate multiplication factors and flux distributions of fissile systems. Both codes allow explicit geometric representation of the target systems and are used internationally for safety analyses involving fissile materials. KENO V.a has limiting g...
In SCALE 6, the Tools for Sensitivity and UNcertainty Analysis Methodology Implementation (TSUNAMI) modules calculate the sensitivity of keff or reactivity differences to the neutron cross-section data on an energy-dependent, nuclide-reaction-specific basis. These sensitivity data are useful for uncertainty quantification, using the comprehensive...
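For reference, the quantities involved, in standard TSUNAMI-style notation (assumed here): the relative sensitivity of keff to a cross section \sigma is

    S_{k,\sigma} = \frac{\sigma}{k} \frac{\partial k}{\partial \sigma},

and folding the sensitivity vector S with a relative nuclear data covariance matrix C gives the data-induced relative variance of keff by the sandwich rule,

    \left( \frac{\Delta k}{k} \right)^2 \approx S\, C\, S^{\mathsf{T}}.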
This report has been issued by WPEC Subgroup 33, whose mission was to study methods and issues of the combined use of integral experiments and covariance data, with the objective of recommending a set of best and consistent practices in order to improve evaluated nuclear data files. In a first step, the subgroup reviewed and assessed the existing n...
The Standardized Computer Analysis for Licensing Evaluation (SCALE) code system developed at Oak Ridge National Laboratory (ORNL) includes Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI). The TSUNAMI code suite can quantify the predicted change in system responses, such as keff, reactivity differences, or r...
The Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) software developed at Oak Ridge National Laboratory (ORNL) as part of the Scale code system provide unique methods for code validation, gap analysis, and experiment design. For TSUNAMI analysis, sensitivity data are generated for each application and each existi...
Self-shielding effects are often ignored in S/U analysis because methods do not currently exist to obtain self-shielded covariance data. An alternative to using self-shielded covariance data is to compute "complete sensitivity coefficients" that include the effects of self-shielding. The complete sensitivities are combined...
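A chain-rule sketch of what "complete" means here (notation assumed): writing each self-shielded cross section \sigma_j as a function of the underlying data \sigma_x, the complete sensitivity adds an implicit contribution to the explicit one,

    S_{k,\sigma_x}^{\mathrm{complete}} = S_{k,\sigma_x}^{\mathrm{explicit}} + \sum_j S_{k,\sigma_j}^{\mathrm{explicit}}\, S_{\sigma_j,\sigma_x},

where S_{\sigma_j,\sigma_x} is the sensitivity of the self-shielded value \sigma_j to \sigma_x. This is what allows unshielded covariance data to be applied consistently.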
The computational bias of criticality safety computer codes must be established through the validation of the codes to critical experiments. A large collection of suitable experiments has been vetted by the International Criticality Safety Benchmark Experiment Program (ICSBEP) and made available in the International Handbook of Evaluated Criticalit...
An assessment was previously performed to evaluate modeling capabilities and quantify preliminary biases and uncertainties associated with the modeling methods and data utilized in designing a nuclear reactor such as a beryllium-reflected, highly enriched uranium (HEU)-O2 fission surface power (FSP) system for space nuclear power. The conclusion of...
The SCALE code system developed at Oak Ridge National Laboratory provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor physics, radiation shielding, and sensitivity and uncertainty analysis. For more than 30 years, regulators, licensees, and research institutions around the world have used SCALE fo...
The SCALE TSUNAMI-3D sensitivity and uncertainty analysis sequence computes the sensitivity of keff to each constituent multigroup cross section using adjoint techniques with the KENO Monte Carlo codes. A new technique to simultaneously obtain the product of the forward and adjoint angular flux moments within a single Monte Carlo calculation has b...
This primer presents examples in the application of the SCALE/TSUNAMI tools to generate keff sensitivity data for one- and three-dimensional models using TSUNAMI-1D and -3D and to examine uncertainties in the computed keff values due to uncertainties in the cross-section data used in their calculation. The proper use of unit cell data a...
The National Aeronautics and Space Administration (NASA) is providing funding to the Department of Energy to assess, develop, and test nuclear technologies that could provide surface power to a lunar outpost. Sufficient testing of this fission surface power (FSP) system will need to be completed to enable a decision by NASA for flight development....
VIBE is a new graphical user interface to identify and interpret sensitivity data from SCALE/TSUNAMI that was distributed with SCALE 6.0. VIBE enables users to group-collapse sensitivity data and then sort and filter the collapsed data to identify important processes in applications or experiments, providing an improved means of preselecting experi...
One of the challenges associated with implementation of burnup credit is the validation of criticality calculations used in the safety evaluation; in particular the availability and use of applicable critical experiment data. The purpose of the validation is to quantify the relationship between reality and calculated results. Validation and determi...
Oak Ridge National Laboratory (ORNL) staff used the SCALE TSUNAMI tools to provide a demonstration evaluation of critical experiments considered for use in validation of current and anticipated operations involving ²³³U at the Radiochemical Development Facility (RDF). This work was reported in ORNL/TM-2008/196 issued in January 2009. This paper pre...
Computational methods and data used for sensitivity and uncertainty analysis within the SCALE nuclear analysis code system are presented. The methodology used to calculate sensitivity coefficients and similarity coefficients and to perform nuclear data adjustment is discussed. A description is provided of the SCALE-6 covariance library based on END...
The TSUNAMI codes of the Oak Ridge National Laboratory SCALE code system were applied to a burnup credit application to demonstrate the use of sensitivity and uncertainty analysis with recent cross section covariance data for criticality safety code and data validation. The use of sensitivity and uncertainty analysis provides for the assessment of...
Ideally, computational method validation is performed by modeling critical experiments that are very similar, neutronically, to the model used in the safety analysis. Similar, in this context, means that the neutron multiplication factors (keff) of the safety analysis model and critical experiment model are affected in the same way to the sam...
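One widely used measure of this kind of similarity is the correlation coefficient c_k, sketched here in assumed notation: with S_a and S_e the energy- and reaction-dependent sensitivity vectors of the application and the experiment, and C the nuclear data covariance matrix,

    c_k = \frac{S_a\, C\, S_e^{\mathsf{T}}}{\sqrt{\left( S_a\, C\, S_a^{\mathsf{T}} \right) \left( S_e\, C\, S_e^{\mathsf{T}} \right)}},

so a c_k near 1 indicates that the same data uncertainties drive keff in both systems.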
An advanced automatic differentiation tool for Fortran 90 software has been developed at Oak Ridge National Laboratory. This tool, called GRESS 90, has a code-coupling feature to propagate derivatives relative to the input of one code through a series of codes that utilize the results of one calculation as the input in the next to determine a final...
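The following is not GRESS 90 itself, but a minimal forward-mode automatic differentiation sketch in Python that illustrates the chain-rule propagation idea behind the code-coupling feature: seeding a derivative of 1 on the input makes every downstream value carry its derivative with respect to that input, so the derivative of one stage's output rides along into the next stage.

    from dataclasses import dataclass

    @dataclass
    class Dual:
        """A value plus its derivative w.r.t. the original input."""
        val: float
        der: float

        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other, 0.0)
            return Dual(self.val + other.val, self.der + other.der)

        __radd__ = __add__

        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other, 0.0)
            return Dual(self.val * other.val,
                        self.der * other.val + self.val * other.der)

        __rmul__ = __mul__

    # Two toy "codes" chained: the output of code_a is the input of code_b.
    def code_a(x):
        return x * x + 3.0

    def code_b(y):
        return 2.0 * y * y

    x = Dual(1.5, 1.0)  # seed dx/dx = 1
    result = code_b(code_a(x))
    print(result.val, result.der)  # 55.125 and d(result)/dx = 63.0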
Since the release of the Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) codes in SCALE [1], the use of sensitivity and uncertainty analysis techniques for criticality safety applications has greatly increased within the user community. In general, sensitivity and uncertainty analysis is transitioning from a tech...
In the criticality code validation of common systems, many paths may exist to a correct bias, bias uncertainty, and upper subcritical limit. The challenge for the criticality analyst is to select an efficient, defensible, and safe methodology to consistently obtain the correct values. One method of testing criticality code validation techniques is...
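One common construction of these quantities, sketched here (specific conventions vary by validation methodology):

    \mathrm{USL} = 1 + \beta - \Delta\beta - \mathrm{MOS},

where \beta is the computational bias determined from the benchmark suite (typically credited only when negative), \Delta\beta is its uncertainty at the chosen confidence level, and MOS is an administrative margin of subcriticality.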
Framatome ANP, Sandia National Laboratories (SNL), Oak Ridge National Laboratory (ORNL), and the University of Florida are cooperating on the U.S. Department of Energy Nuclear Energy Research Initiative (NERI) project 2001-0124 to design, assemble, execute, analyze, and document a series of critical experiments to validate reactor physics and criti...
The KENO V.a and KENO-VI three-dimensional Monte Carlo criticality computer codes in the SCALE (Standardized Computer Analyses for Licensing Evaluation) computer software system developed at Oak Ridge National Laboratory are widely used and accepted around the world for criticality safety analyses. As part of current development efforts to improve...