Kenneth M. Hanson, PhD
Los Alamos National Laboratory | LANL · Retired
About
221 Publications
21,030 Reads
4,791 Citations
Introduction
Kenneth M. Hanson is retired from the Los Alamos National Laboratory. Kenneth has done research in Applied Mathematics, Probability Theory, Statistics, Bayesian Reconstruction, Image Analysis and Medical Imaging. His latest project is 'Tomographic ultrasound mammography.'
Additional affiliations
September 2011 - September 2016
Education
September 1963 - October 1970
September 1958 - June 1963
Publications (221)
To celebrate the 50th anniversary of the SPIE Medical Imaging meetings, I present photographs taken over the last three decades, selected with the goal of highlighting the enthusiasm and energy of the participants, which has led to the ongoing success of these meetings. Links are given to access complete albums for individual meetings.
Breast ultrasound tomography is an emerging imaging modality to reconstruct the sound speed, density, and ultrasound attenuation of the breast in addition to ultrasound reflection/beamforming images for breast cancer detection and characterization. We recently designed and manufactured a new synthetic-aperture breast ultrasound tomography prototype...
Ultrasound tomography has great potential to provide quantitative estimations of physical properties of breast tumors for accurate characterization of breast cancer. We design and manufacture a new synthetic-aperture breast ultrasound tomography system with two parallel transducer arrays. The distance between the two transducer arrays is adjustable fo...
Ultrasound transmission tomography usually generates low-resolution breast images. We improve sound-speed reconstructions using ultrasound waveform tomography with both transmission and reflection data. We validate the improvement using computer-generated synthetic-aperture ultrasound transmission and reflection data for numerical breast phantoms....
Imaging breast microcalcifications is crucial for early detection and diagnosis of breast cancer. It is challenging for current clinical ultrasound to image breast microcalcifications. However, new imaging techniques using data acquired with a synthetic-aperture ultrasound system have the potential to significantly improve ultrasound imaging. We re...
Meta-analysis aims to combine results from multiple experiments. For example, a neutron reaction rate or cross section is typically measured in multiple experiments, and a single estimate and its uncertainty are provided for users of the estimated reaction rate. It is often difficult to combine estimates from multiple laboratories because there can...
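As a baseline for what such a combination does when the data are consistent, here is a minimal sketch of the standard inverse-variance weighted rule (not the method of the paper, which is designed for the harder, discrepant case; the numbers are hypothetical):

```python
import numpy as np

def combine_estimates(values, sigmas):
    """Inverse-variance weighted mean and its uncertainty, assuming
    independent, Gaussian, mutually consistent measurements."""
    w = 1.0 / np.asarray(sigmas, dtype=float) ** 2
    mean = np.sum(w * np.asarray(values, dtype=float)) / np.sum(w)
    return mean, np.sqrt(1.0 / np.sum(w))

# hypothetical cross-section estimates (barns) from three experiments
print(combine_estimates([1.05, 0.98, 1.02], [0.04, 0.05, 0.03]))
```

When the scatter between laboratories exceeds the quoted uncertainties, this simple rule understates the combined uncertainty, which is exactly the difficulty the abstract points to.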
During the early 90s, I engaged in a productive and enjoyable collaboration with Robert Wagner and his colleague, Kyle Myers. We explored the ramifications of the principle that the quality of an image should be assessed on the basis of how well it facilitates the performance of appropriate visual tasks. We applied this principle to algorithms used...
Ultrasound reflection imaging is a promising imaging modality for detecting small, early-stage breast cancers. Properly accounting for ultrasound scattering from heterogeneities within the breast is essential for high-resolution and high-quality ultrasound breast imaging. We develop a globally optimized Fourier finite-difference method for ultrasou...
Least‐squares data analysis is based on the assumption that the normal (Gaussian) distribution appropriately characterizes the likelihood, that is, the conditional probability of each measurement d, given a measured quantity y, p(d|y). On the other hand, there is ample evidence in nuclear physics of significant disagreements among measurements, whi...
Purpose: To improve resolution and reduce speckle in ultrasound breast images by accounting for ultrasound scattering from breast heterogeneities during reflectivity image reconstruction. Method and Materials: X-ray mammography often fails to detect cancers in dense breasts, while breast ultrasound has the potential to detect them. Breast heterogenei...
By drawing an analogy between the logarithm of a probability distribution and a physical potential, it is natural to ask the question, "what is the effect of applying an external force on model parameters?" In Bayesian inference, parameters are frequently estimated as those that maximize the posterior, yielding the maximum a posteriori (MAP) soluti...
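Writing φ(x) = −ln p(x|d) for the posterior potential, the effect of a constant external force f can be sketched as a shifted stationarity condition (a schematic restatement of the stated analogy, not the paper's full development):

```latex
\varphi(x) = -\ln p(x \mid d), \qquad
\nabla \varphi(\hat{x}_{\mathrm{MAP}}) = 0
\quad\longrightarrow\quad
\nabla \varphi(\hat{x}_f) = f .
```

The displacement of the forced solution away from the MAP estimate then probes the local stiffness, i.e. the curvature (inverse covariance), of the posterior.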
An experimental program at the Los Alamos Neutron Science Center (LANSCE) has been developed to precisely measure differential fission cross sections over 10 decades in incident neutron energy for a range of actinides relevant to advanced nuclear reactor designs and transmutation concepts. As the need for uncertainty quantification (UQ) and covarian...
Relationships between statistics and physics often provide a deeper understanding that can lead to new or improved algorithmic approaches to solving statistics problems. It is well known that the negative logarithm of a probability distribution is analogous to a physical potential. I describe a novel approach, borrowed from physics, to estimating spec...
We present an approach to uncertainty quantification for nuclear applications that combines the covariance evaluation of differential cross-section data and the error propagation from matching a criticality experiment using a neutron-transport calculation. We have studied the reduction in uncertainty of 239Pu fission cross sections by using a one-d...
We present an approach to uncertainty quantification for nuclear applications, which combines the covariance evaluation of differential cross-section data and the error propagation from matching a criticality experiment using a neutron transport calculation. We have studied the effect on Pu-239 fission cross sections of using a one-dimensional neu...
This article is the third in a series of papers concerning the importance of simulation code validation to the US Department of Energy Accelerated Strategic Computing Initiative (ASCI) program [1]. The series started with a review by John Garcia of the critical need for advanced validation techniques in the ASCI program, which was created to make u...
The evaluation of neutron cross sections as a function of energy is fraught with inconsistent measurements. I describe a Bayesian approach to deal with the inconsistencies by probabilistically modeling the possibility of discrepant data and data sets with long-tailed likelihood functions. Systematic normalization uncertainties in each data set are...
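A minimal sketch of the long-tailed idea, with a Student-t likelihood standing in for the Gaussian (an illustrative choice; the specific form and the treatment of normalization uncertainties in the paper may differ):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def neglog_like(y, d, sig, nu=4.0):
    """Minus log Student-t likelihood; small nu gives long tails
    that tolerate discrepant measurements."""
    z2 = ((d - y) / sig) ** 2
    return np.sum(0.5 * (nu + 1.0) * np.log1p(z2 / nu))

d = np.array([1.02, 0.97, 1.00, 1.55])   # hypothetical data with one outlier
sig = np.full_like(d, 0.03)
y_hat = minimize_scalar(lambda y: neglog_like(y, d, sig),
                        bounds=(0.5, 2.0), method="bounded").x
print(y_hat)   # stays near 1.0; the least-squares answer is the plain mean, 1.135
```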
This set of tutorials provides an overview of Bayesian and probabilistic modeling, with special emphasis on cross-section evaluation. The fundamentals of the Bayesian approach are presented and illustrated with many examples. I will show how one can, with appropriate probabilistic modeling, cope with the usual goblins of data analysis: outliers, in...
Peelle's Pertinent Puzzle (PPP) states a seemingly plausible set of measurements with their covariance matrix, which produce an implausible answer. To answer the PPP question, we describe a reasonable experimental situation that is consistent with the PPP solution. The confusion surrounding the PPP arises in part because of its imprecise statement,...
The standard Monte Carlo (MC) technique is often used to estimate the integral of a function of many variables. When the cost of function evaluation is very high, however, as in the area of simulation science, it becomes important to look for more efficient techniques for numerical integration. The technique of quasi-Monte Carlo (QMC) offers improv...
We utilize data from Hopkinson-bar experiments and quasi-static compression experiments to characterize uncertainties for parameters governing the Preston-Tonks-Wallace (PTW) (1) plastic deformation model for a variety of materials. This particular plastic deformation model is designed to be valid over a range of input conditions, which include str...
In performing statistical studies in simulation science, it is often necessary to estimate the value of an integral of a function that is based on a simulation calculation. Because simulations tend to be costly in terms of computer time and resources, the ability to accurately estimate integrals with a minimum number of function evaluations is a cr...
Key to understanding the uncertainties in a physics simulation code is the full characterization of the uncertainties in the physics submodels incorporated in it. I demonstrate an approach to determining the parameters of material models in a simulation code that combines the principles of physics and probabilistic Bayesian analysis. The focus...
This document summarizes the Preston-Tonks-Wallace (PTW) model, and presents some examples of typical stress-strain relations.
The goal in Quasi-Monte Carlo (QMC) is to improve the accuracy of integrals estimated by the Monte Carlo technique through a suitable specification of the sample point set. Indeed, the errors from N samples typically drop as N^-1 with QMC, which is much better than the N^-1/2 dependence obtained with Monte Carlo estimates based on random point sets...
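These convergence rates are easy to exhibit numerically; a minimal sketch using a scrambled Sobol' sequence from scipy.stats.qmc, with a made-up smooth integrand whose integral over [0,1]^4 is exactly 1:

```python
import numpy as np
from scipy.stats import qmc

def f(x):                                   # smooth test integrand on [0,1]^4
    return np.prod(1.0 + 0.5 * np.cos(np.pi * x), axis=1)

d, rng = 4, np.random.default_rng(0)
for m in (8, 10, 12):                       # n = 2**m samples
    n = 2 ** m
    err_mc = abs(f(rng.random((n, d))).mean() - 1.0)
    pts = qmc.Sobol(d, scramble=True, seed=0).random_base2(m)
    err_qmc = abs(f(pts).mean() - 1.0)
    print(f"n={n:5d}  MC error={err_mc:.1e}  QMC error={err_qmc:.1e}")
```

Quadrupling the sample size shrinks the QMC error by roughly a factor of four, versus roughly a factor of two for plain Monte Carlo.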
We present an approach for assessing the uncertainties in simulation code outputs in which one focuses on the physics submodels incorporated into the code. Through a Bayesian analysis of a hierarchy of experiments that explore various aspects of the physics submodels, one can infer the sources of uncertainty, and quantify them. As an example of thi...
The hybrid Markov Chain Monte Carlo technique affords a robust means for sampling multidimensional probability density functions with high efficiency, provided one can calculate the gradient of ϕ = minus-log-probability in a time comparable to calculating the probability itself. The latter condition is met using the technique of adjoint differentia...
A technique to estimate the radial dependence of the noise power spectrum of images is proposed in which the calculations are conducted solely in the spatial domain of the noise image. The noise power spectrum averaged over a radial spatial-frequency interval is obtained from the variance of a noise image that has been convolved with a small kernel...
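The core of the spatial-domain trick is Parseval's relation: the variance of the convolved noise image equals the noise power spectrum integrated over the kernel's passband. A minimal numpy check with white noise, where the answer is known in advance (the kernel and image size are illustrative):

```python
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(1)
noise = rng.normal(0.0, 1.0, (512, 512))        # white noise: flat NPS

kernel = np.array([[ 0., -1.,  0.],
                   [-1.,  4., -1.],
                   [ 0., -1.,  0.]]) / np.sqrt(20.0)   # unit-L2-norm high-pass kernel

filtered = fftconvolve(noise, kernel, mode="valid")
# Parseval: var(filtered) = average over frequency of NPS(f) * |K(f)|^2.
# For white noise and a unit-norm kernel this reduces to the input variance.
print(filtered.var())    # ~1.0
```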
The Markov Chain Monte Carlo technique provides a means for drawing random samples from a target probability density function (pdf). MCMC allows one to assess the uncertainties in a Bayesian analysis described by a numerically calculated posterior distribution. This paper describes the Hamiltonian MCMC technique in which a momentum variable is intr...
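A minimal sketch of one Hamiltonian MCMC update with a leapfrog integrator for a target p(x) ∝ exp(−φ(x)); the step size and trajectory length are illustrative tuning parameters, not values from the paper:

```python
import numpy as np

def hmc_step(x, phi, grad_phi, eps=0.1, n_leap=20, rng=np.random.default_rng()):
    """One Hamiltonian MCMC update for the target p(x) ~ exp(-phi(x))."""
    p = rng.normal(size=x.shape)                     # fresh momentum draw
    xn = x.copy()
    pn = p - 0.5 * eps * grad_phi(xn)                # half momentum step
    for i in range(n_leap):
        xn = xn + eps * pn                           # full position step
        if i < n_leap - 1:
            pn = pn - eps * grad_phi(xn)             # full momentum step
    pn = pn - 0.5 * eps * grad_phi(xn)               # final half momentum step
    dH = phi(xn) - phi(x) + 0.5 * (pn @ pn - p @ p)  # change in total energy
    return xn if np.log(rng.random()) < -dH else x   # Metropolis accept/reject

# usage: sample a standard 2-D Gaussian, phi(x) = x.x/2, grad phi(x) = x
phi = lambda x: 0.5 * x @ x
grad = lambda x: x
x, samples = np.zeros(2), []
for _ in range(1000):
    x = hmc_step(x, phi, grad)
    samples.append(x)
```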
When using simulation codes, one often has the task of minimising a scalar objective function with respect to numerous parameters. This situation occurs when trying to fit (assimilate) data or trying to optimise an engineering design. For simulations in which the objective function to be minimised is reasonably well behaved, that is, is differentia...
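In practice the recipe is to hand the objective and its (analytic or adjoint-computed) gradient to a quasi-Newton optimizer; a minimal sketch with a toy quadratic misfit standing in for the expensive simulation (all names and numbers are made up):

```python
import numpy as np
from scipy.optimize import minimize

d = np.array([1.0, 2.0, 3.0])                        # "data" to be matched

def model(x):                                        # cheap stand-in for a simulation
    return np.array([x[0], x[0] + x[1], x[0] + 2 * x[1]])

def objective(x):                                    # sum-of-squares misfit
    r = model(x) - d
    return 0.5 * r @ r

def gradient(x):                                     # J^T r, with J the model Jacobian
    J = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
    return J.T @ (model(x) - d)

res = minimize(objective, x0=np.zeros(2), jac=gradient, method="BFGS")
print(res.x)   # -> [1.0, 1.0]
```

Supplying the gradient lets BFGS converge in few objective evaluations, the property that matters when each evaluation is a full simulation run.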
We present an uncertainty analysis of data taken using the Rossi technique in which the horizontal oscilloscope sweep is driven sinusoidally in time while the vertical axis follows the signal amplitude. The analysis is aimed at determining the logarithmic derivative of the amplitude as a function of time. Within the Bayesian framework used, inferen...
The ubiquitous problem of estimating the background of a measured spectrum is solved with Bayesian probability theory. A mixture model is used to capture the defining characteristics of the problem, namely that the background is smoother than the signal. The smoothness property is quantified in terms of a cubic spline basis where a variable degree...
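The defining asymmetry, a background that is smoother than the signal and sits beneath it, can be illustrated even without the paper's full Bayesian mixture machinery; a simplified sketch using a smoothing spline with iterative down-weighting of peak channels:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def estimate_background(x, counts, n_iter=8, smooth=None):
    """Iteratively reweighted smoothing-spline background: channels that sit
    well above the current fit (signal peaks) are down-weighted, so the
    spline relaxes onto the smooth background under the peaks."""
    w = np.ones(len(x))
    bg = np.asarray(counts, dtype=float)
    for _ in range(n_iter):
        spl = UnivariateSpline(x, counts, w=w, s=smooth)
        bg = spl(x)
        r = counts - bg
        neg = r[r < 0]
        sig = neg.std() if neg.size else r.std()   # noise scale from below-fit scatter
        w = np.where(r > 2.0 * sig, 0.05, 1.0)     # suppress peak channels
    return bg

# usage on a made-up spectrum: linear background plus one Gaussian peak
x = np.linspace(0.0, 1.0, 400)
rng = np.random.default_rng(2)
spec = 5.0 + 3.0 * x + 40.0 * np.exp(-0.5 * ((x - 0.6) / 0.02) ** 2)
bg = estimate_background(x, spec + rng.normal(0.0, 0.3, x.size))
```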
SPIE Conferences have long been an incubator for new ideas in imaging and optical technologies. A huge number of new developments in medical imaging have been first publicized at SPIE conferences. In these notes I will provide an overview of the development of image processing in medical imaging, as represented in the SPIE press.
Deformable geometric models fit very naturally into the context of Bayesian analysis. The prior probability of boundary shapes is taken to be proportional to the negative exponential of the deformation energy used to control the boundary. This probabilistic interpretation is demonstrated using a Markov-Chain Monte-Carlo (MCMC) technique, which permits...
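The probabilistic interpretation stated here can be written compactly (α is a strength parameter introduced for illustration):

```latex
p(\text{boundary}) \;\propto\; e^{-\alpha\, E_{\mathrm{def}}(\text{boundary})},
\qquad
p(\text{boundary}\mid\text{data}) \;\propto\;
p(\text{data}\mid\text{boundary})\; e^{-\alpha\, E_{\mathrm{def}}(\text{boundary})} .
```

MCMC then draws boundary samples from the posterior on the right, and the spread of those sampled boundaries is the uncertainty estimate.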
This impromptu talk was presented to introduce the basics of the Markov Chain Monte Carlo technique, which is being increasingly used in Bayesian analysis. The aim of MCMC is to produce a sequence of parameter vectors that represent random draws from a probability density function (pdf). The pdf of interest in Bayesian analysis is typically the poste...
In the Rossi technique for displaying a time-dependent signal, the horizontal sweep of an oscilloscope is driven sinusoidally in time and the vertical sweep is driven by the signal to be recorded. The major advantage of this unusual recording technique is that the trace is confined to the central region of the oscilloscope face, the region in which...
A general probabilistic technique for estimating background contributions to measured spectra is presented. A Bayesian model is used to capture the defining characteristics of the problem, namely, that the background is smoother than the signal. The signal is allowed to have positive and/or negative components. The background is represented in term...
We address the issue of reconstructing the shape of an object with constant interior activity from a set of projections. We directly estimate the position of a triangulated surface describing the boundary of the object by incorporating prior knowledge about the unknown shape. The inverse problem is addressed in a Bayesian framework. We compute the...
A probabilistic framework is presented for assessing the uncertainties in simulation predictions that arise from model parameters derived from uncertain measurements. A probabilistic network facilitates both conceptualizing and computationally implementing an analysis of a large number of experiments in terms of many intrinsic models in a logically...
In this article we build on our past attempts to reconstruct a 3D, time-varying bolus of radiotracer from first-pass data obtained by the dynamic SPECT imager, FASTSPECT, built by the University of Arizona. The object imaged is a CardioWest total artificial heart. The bolus is entirely contained in one ventricle and its associated inlet and outlet...
Because of its unique characteristics for imaging tissue, there is growing medical interest in the diagnostic procedure of optical tomography. In this procedure, one attempts to reconstruct the properties of a tissue sample from measurements of how infrared light pulses are propagated through the sample. The reconstructions show the spatial distrib...
Rheumatoid arthritis (RA) is one of the most common diseases of human joints. This progressive disease is characterized by an inflammation process that originates in the inner membrane (synovialis) of the capsule and spreads to other parts of the joint. In early stages the synovialis thickness and the permeability of this membrane change. This leads...
Deformable geometric models can be used in the context of Bayesian analysis to solve ill-posed tomographic reconstruction problems. The uncertainties associated with a Bayesian analysis may be assessed by generating a set of random samples from the posterior, which may be accomplished using a Markov-Chain Monte-Carlo (MCMC) technique. We demonstrat...
Currently available tomographic image reconstruction schemes for optical tomography (OT) are mostly based on the limiting assumptions of small perturbations and a priori knowledge of the optical properties of a reference medium. Furthermore, these algorithms usually require the inversion of large, full, ill-conditioned Jacobian matrices. In this wo...
Currently available image reconstruction schemes for optical tomography (OT) are mostly based on the limiting assumptions of small perturbations and a priori knowledge of the optical properties of a reference medium. Furthermore, these algorithms usually require the inversion of large, ill-conditioned Jacobian matrices. In this work a gradient-base...
A standard approach to solving inversion problems that involve many parameters uses gradient-based optimization to find the parameters that best match the data. We will discuss enabling techniques that facilitate application of this approach to large-scale computational simulations, which are the only way to investigate many complex physical phenom...
The authors have developed a computer application, called the Bayes Inference Engine (BIE), to enable one to make inferences about models of a physical object from radiographs taken of it. In the BIE, calculational models are represented by a data-flow diagram that can be manipulated by the analyst in a graphical-programming environment. The authors demon...
The Markov Chain Monte Carlo (MCMC) technique provides a means to generate a random sequence of model realizations that sample the posterior probability distribution of a Bayesian analysis. That sequence may be used to make inferences about the model uncertainties that derive from measurement uncertainties. This paper presents an approach to improv...
There is a growing need to assess the uncertainties in the predictions made by simulation codes. This assessment process, which is called validation, requires a complete methodology for characterizing the uncertainties in simulation codes. The analysis of any single experiment yields the uncertainties in model parameters that derive from uncertain...
Currently available tomographic image reconstruction schemes for photon migration tomography (PMT) are mostly based on the limiting assumptions of small perturbations and a priori knowledge of the optical properties of a reference medium. In this work a model-based iterative image reconstruction (MOBIIR) method is presented, which does not require...
We demonstrate the reconstruction of a 3D, time-varying bolus of radiotracer from first-pass data obtained by the dynamic SPECT imager, FASTSPECT, built by the University of Arizona. The object imaged is a CardioWest Total Artificial Heart. The bolus is entirely contained in one ventricle and its associated inlet and outlet tubes. The model for the...
We address the issue of reconstructing the shape of an object with uniform interior activity from a set of projections. We estimate directly from projection data the position of a triangulated surface describing the boundary of the object while incorporating prior knowledge about the unknown shape. This inverse problem is addressed in a Bayesian fr...
Quantitative high-resolution electron microscopy (QHREM) involves the detailed comparison of experimental high-resolution images with image simulation based on a model and weighted by the estimated uncertainty in the experimental results. For simple metals, such as Al, models have been systematically improved using nonlinear least-squares methods t...
Many problems in physics and modern computing are inverse problems -- problems where the desired output is known, and the task is to find the set of input parameters that will best reproduce that output in a hydrodynamics code (hydrocode). Optimization methods tackle this type of problem, and a central task in applying optimization methods is to be...
Experimental data often can only be interpreted by means of a computational simulation that approximately models the physical situation. We will discuss techniques that facilitate application to complex, large-scale simulations of the standard approach to inversion in which gradient-based optimization is used to find the parameters that best match...
When calculating the sensitivities of hydrocode results, one generally chooses either equation-based or code-based methods. The automatic differentiation (AD) programs ADIFOR, GRESS and TAMC utilize a code-based approach, determining adjoint code on a line-by-line basis, producing an enhanced source that is compiled and run to give the derivatives...
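The line-by-line flavor of code-based (source-transformation) AD is easy to show on a toy function; a hand-written sketch of the kind of adjoint code such tools generate (the function itself is made up):

```python
import math

def forward(a, b):
    u = a * b              # line 1
    y = u + math.sin(a)    # line 2
    return y, u            # keep intermediates the reverse sweep needs

def adjoint(a, b, u, ybar=1.0):
    """Reverse sweep: one adjoint statement per forward line, in reverse order."""
    ubar = ybar                    # from line 2: y = u + sin(a)
    abar = ybar * math.cos(a)
    abar += ubar * b               # from line 1: u = a * b
    bbar = ubar * a
    return abar, bbar              # = dy/da, dy/db

y, u = forward(2.0, 3.0)
print(adjoint(2.0, 3.0, u))        # (3 + cos(2), 2)
```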
This paper addresses the issue of reconstructing the unknown field of absorption and scattering coefficients from time-resolved measurements of diffused light in a computationally efficient manner. The intended application is optical tomography, which has generated considerable interest in recent times. The inverse problem is posed in the Bayesian f...
We address the issue of reconstructing an object of constant interior density in the context of 3D tomography where there is prior knowledge about the unknown shape. We explore the direct estimation of the parameters of a chosen geometrical model from a set of radiographic measurements, rather than performing operations (segmentation for example) o...
Bayesian analysis is especially useful to apply to low-count medical imaging data, such as gated cardiac SPECT, because it allows one to solve the nonlinear, ill-posed, inverse problems associated with such data. One advantage of the Bayesian approach is that it quantifies the uncertainty in estimated parameters through the posterior probability. T...
We address the issue of how to make decisions about the degree of smoothness demanded of a flexible contour used to model the boundary of a 2D object. We demonstrate the use of a Bayesian approach to set the strength of the smoothness prior for a tomographic reconstruction problem. The Akaike Information Criterion is used to determine whether to al...
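For concreteness, the AIC trade-off invoked here is a penalized likelihood comparison; a toy sketch with made-up fit results (k is the number of contour parameters):

```python
def aic(neg_log_like_max, k):
    """Akaike Information Criterion: 2k - 2 ln L_max.
    Lower is better; adding contour nodes is accepted only if the
    improved fit outweighs the penalty of one unit per parameter."""
    return 2 * k + 2 * neg_log_like_max

# hypothetical fits with increasing numbers of boundary nodes
for k, nll in [(8, 120.4), (12, 110.9), (16, 108.8)]:
    print(k, aic(nll, k))
# 12 nodes wins here: going to 16 gains too little likelihood.
```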