
Bernd Hamann
University of California, Davis (UCD) · Department of Computer Science
Ph.D., Arizona State University, 1991
About
510 Publications · 93,037 Reads
10,645 Citations
Introduction
I am a Professor of Computer Science at the University of California, Davis, U.S.A. My main areas of interest are data analysis and visualization, geometric design and modeling, and computer graphics.
Education
June 1988 - June 1988
October 1985 - May 1988
October 1983 - July 1986
Publications (510)
The diagnostic approach of interstitial lung diseases (ILDs) may be a challenge and requires a multidisciplinary discussion (MDD). While the definition of histological patterns of surgical lung biopsy by pathologists is helpful to the final diagnosis of MDD, its recognition could be challenging due to various degrees of lung remodeling and distor...
Understanding flow traffic patterns in networks, such as the Internet or service provider networks, is crucial to improving their design and building them robustly. However, as networks grow and become more complex, it is increasingly cumbersome and challenging to study how the many flow patterns, sizes and the continually changing source-destinati...
We present a finite element (FE) approach that deforms a given meshed CAD-based simulation model to a shape represented by a triangulation. We use an FE solver to calculate a smooth deformation field that we apply to the simulation mesh. The FE load case derives displacement boundaries from computed distance estimates between source and target mesh...
Today’s supercomputing capabilities allow ocean scientists to generate simulation data at increasingly higher spatial and temporal resolutions. However, I/O bandwidth and data storage costs limit the amount of data saved to disk. In situ methods are one solution to generate reduced data extracts, with the potential to reduce disk storage requiremen...
We present a finite-element-based approach to support quality assurance for assembly processes of non-rigid sheet metal parts in the automotive and aerospace industries. These parts have adjustable mechanical boundaries, making it possible to compensate geometrical deviations caused by the production process of components. Every part produced must...
We present a method for approximating surface data of arbitrary topology by a model of smoothly connected B-spline surfaces. Most of the existing solutions for this problem use constructions with limited degrees of freedom or they address smoothness between surfaces in a post-processing step, often leading to undesirable surface behavior in proximi...
Detecting anomalies in time series data of hydrocarbon reservoir production is crucially important. Anomalies can result for different reasons: gross errors, system availability, human intervention, or abrupt changes in the series. They must be identified due to their potential to alter the series correlation, influence data-driven forecast, and af...
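The anomaly categories the abstract lists (gross errors, abrupt changes) can be illustrated with the simplest family of detectors. The following rolling z-score sketch is not the paper's method; the function name, window size, and threshold are hypothetical:

```python
# Hypothetical sketch: flag points in a production time series whose
# value deviates strongly from the trailing window's mean.
from statistics import mean, stdev

def rolling_zscore_anomalies(series, window=5, threshold=3.0):
    """Return indices deviating from the trailing-window mean by more
    than `threshold` standard deviations."""
    anomalies = []
    for i in range(window, len(series)):
        w = series[i - window:i]
        mu, sigma = mean(w), stdev(w)
        if sigma > 0 and abs(series[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

data = [10.0, 10.2, 9.9, 10.1, 10.0, 10.1, 50.0, 10.2, 9.8]
print(rolling_zscore_anomalies(data))  # → [6], the abrupt spike
```

Real production data would need the more robust, correlation-aware treatment the paper describes; this only shows the basic idea of deviation-based flagging.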
This paper presents an iterative finite element (FE)–based method to calculate the gravity-free shape of nonrigid parts from an optical measurement performed on a non-over-constrained fixture. Measuring these kinds of parts in a stress-free state is almost impossible because deflections caused by their weight occur. To solve this problem, a simulat...
During large-scale simulations, intermediate data products such as image databases have become popular due to their low relative storage cost and fast in-situ analysis. Serving as a form of data reduction, these image databases have become an increasingly accepted basis for data analysis. We present an image-space detection and classification system for...
The goal of visual surface inspection is to analyze an object’s surface and detect defects by looking at it from different angles. Developments over the past years have made it possible to partially automate this process. Inspection systems use robots to move cameras and obtain pictures that are evaluated by image processing algorithms. Setting up...
Data assimilation is an important and time-consuming process in petroleum reservoir numerical simulation. It produces a set of calibrated models used to forecast and optimize oil and gas production. The process focuses on reducing uncertainties related to reservoir properties, yielding numerical reservoir models that plausibly reproduce measured da...
The authors introduce an integrative approach for the analysis of the high-dimensional parameter space relevant for decision-making in the context of quality control. Typically, a large number of parameters influence the quality of a manufactured part in an assembly process, and our approach supports the visual exploration and comprehension of the...
Numerical simulations use past reservoir behavior to calibrate models used to predict future performance. Traditionally, this process is carried out deterministically through history matching and most current approaches focus on developing probabilistic procedures, called data assimilation, whereby reservoir simulation models are calibrated to repr...
As more advanced and complex survey telescopes are developed, the size and scale of data being captured grows at increasing rates. Across various domains, data compression through wavelets has enabled the reduction of data size and increase in computation efficiency. In this paper, we provide qualitative and quantitative tests of a new wavelet-base...
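As a toy illustration of the wavelet idea behind this line of work (not the paper's transform, which uses more sophisticated bases), a single-level Haar decomposition with truncation of small detail coefficients:

```python
# Minimal sketch of wavelet-based compression: one Haar level,
# small detail coefficients zeroed, then reconstruction.
def haar_forward(x):
    avg = [(x[2*i] + x[2*i+1]) / 2 for i in range(len(x) // 2)]
    det = [(x[2*i] - x[2*i+1]) / 2 for i in range(len(x) // 2)]
    return avg, det

def haar_inverse(avg, det):
    out = []
    for a, d in zip(avg, det):
        out.extend([a + d, a - d])
    return out

signal = [4.0, 4.2, 8.0, 8.1, 1.0, 1.1, 6.0, 5.9]
avg, det = haar_forward(signal)
det = [d if abs(d) > 0.2 else 0.0 for d in det]  # truncate small details
approx = haar_inverse(avg, det)
err = max(abs(a - b) for a, b in zip(signal, approx))
print(err)  # maximum error equals the largest truncated coefficient, 0.1
```

Truncating detail coefficients is what trades accuracy for storage; the paper's qualitative and quantitative tests measure exactly this trade-off for astronomical survey data.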
We introduce an innovative ensemble analysis framework for organizing, searching, and comparing results produced by hundreds of physical simulations. Our web-based approach is built on standard technologies, utilizes a scalable and modular design, and is suitable for displaying the results of in situ analysis and extreme-scale simulations.
This paper introduces a method for detecting endpoints of partially overlapping straight fibers in three-dimensional voxel image data. The novel approach directly determines fiber endpoints without the need for more expansive single-fiber segmentation. In the context of fiber-reinforced polymers, endpoint information is of practical significance as...
We introduce an approach for the interactive visual analysis of weighted, dynamic networks. These networks arise in areas such as computational neuroscience, sociology, and biology. Network analysis remains challenging due to complex time-varying network behavior. For example, edges disappear/reappear, communities grow/vanish, or overall network to...
This paper concerns the use of compression methods applied to large scientific data. Specifically, the paper addresses the effect of lossy compression on approximation error. Computer simulations, experiments, and imaging technologies generate terabyte-scale datasets, making new approaches that couple compression with data analysis necessary. Lossless...
We present an integrated interactive framework for the visual analysis of time-varying multivariate data sets. As part of our research, we performed in-depth studies concerning the applicability of visualization techniques to obtain valuable insights. We consolidated the considered analysis and visualization methods in one framework, called TV-MV A...
This paper concerns virtual reality (VR) environments and innovative, natural interaction techniques for them. The presented research was driven by the goal to enable users to invoke actions with their body physically, causing the correct action of the VR environment. The paper introduces a system that tracks a user’s movements that are recognized...
Finding bottlenecks and eliminating them to increase the overall flow of a network often appears in real world applications, such as production planning, factory layout, flow related physical approaches, and even cyber security. In many cases, several edges can form a bottleneck (cascaded bottlenecks). This work presents a visual analytics methodol...
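The graph-theoretic notion underneath can be sketched independently of the visual-analytics methodology: the classic max-flow computation, whose limiting edges form the (possibly cascaded) bottleneck. A minimal Edmonds-Karp sketch, illustrative only:

```python
# Illustrative max-flow/min-cut computation (Edmonds-Karp), showing
# how bottleneck edges limit the total flow of a network.
from collections import deque

def max_flow(edges, source, sink):
    """`edges` is a list of (u, v, capacity) tuples."""
    res = {}                                    # residual capacities
    for u, v, c in edges:
        res.setdefault(u, {})[v] = res.get(u, {}).get(v, 0) + c
        res.setdefault(v, {}).setdefault(u, 0)  # reverse edge
    total = 0
    while True:
        parent = {source: None}                 # BFS for an augmenting path
        q = deque([source])
        while q and sink not in parent:
            u = q.popleft()
            for v, c in res[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if sink not in parent:
            return total                        # no path left: flow is maximal
        path, v = [], sink                      # walk back to the source
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        aug = min(res[u][v] for u, v in path)   # bottleneck on this path
        for u, v in path:
            res[u][v] -= aug
            res[v][u] += aug
        total += aug

edges = [("s", "a", 3), ("s", "b", 2), ("a", "t", 2), ("b", "t", 3), ("a", "b", 1)]
print(max_flow(edges, "s", "t"))  # → 5
```

The paper's contribution is the visual identification of such bottlenecks and stable regions, not this algorithm; the snippet only fixes the underlying quantity being visualized.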
This paper tackles the non-trivial image-processing task to segment hook-ended fibers in three-dimensional images. For this purpose, a novel segmentation method is presented that relies on the following observation: For a single fiber the configurations of principal curvatures that can occur on its surface are limited. Deviations from these configu...
The remote analysis and visualization of raw large turbulence datasets is challenging. Current accurate direct numerical simulations (DNS) of turbulent flows generate datasets with billions of points per time-step and several thousand time-steps per simulation. Until recently, the analysis and visualization of such datasets was restricted to scient...
This paper studies the influence of the definition of neighborhoods and methods used for creating point connectivity on topological analysis of scalar functions. It is assumed that a scalar function is known only at a finite set of points with associated function values. In order to utilize topological approaches to analyze the scalar-valued point...
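The basic ingredient can be sketched in a few lines: build a k-nearest-neighbor relation over the scattered points and report local maxima of the sampled function. This is an illustrative fragment, not the paper's full topological analysis:

```python
# k-NN neighborhoods over scattered points, used to find local maxima
# of a function known only at sample sites (illustrative sketch).
def knn(points, i, k):
    order = sorted(range(len(points)),
                   key=lambda j: sum((a - b) ** 2
                                     for a, b in zip(points[i], points[j])))
    return order[1:k + 1]  # skip the point itself (distance 0)

def local_maxima(points, values, k=3):
    return [i for i in range(len(points))
            if all(values[i] > values[j] for j in knn(points, i, k))]

pts = [(0, 0), (1, 0), (0, 1), (1, 1), (2, 2)]
vals = [1.0, 3.0, 2.0, 0.5, 4.0]
print(local_maxima(pts, vals, k=2))  # → [1, 2, 4]
```

Note that the k-NN relation is not symmetric, so two nearby points (indices 1 and 2 here) can both be reported as maxima; this sensitivity of topological results to the neighborhood definition is precisely what the paper studies.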
We introduce a framework for distributed or co-located teams to collaborate highly efficiently using diverse mobile devices for design and assessment of complex systems. Our framework enhances the efficiency of collaborations arising in design, simulation or data analysis, including visualization. First, we investigate which requirements on such a...
Modern HPC centers comprise clusters, storage, networks, power and cooling infrastructure, and more. Analyzing the efficiency of these complex facilities is a daunting task. Increasingly, facilities deploy sensors and monitoring tools, but with millions of instrumented components, analyzing collected data manually is intractable. Data from an HPC c...
We present a clustering-based interactive approach to multivariate data analysis, motivated by the specific needs of scintillation data. Ionospheric scintillation is a rapid variation in the amplitude and/or phase of radio signals traveling through the ionosphere. This spatial and time-varying phenomenon is of great interest since it affects the re...
The problem of finding bottlenecks in flow networks often appears in real-world applications like production planning, factory layout, flow-related physical approaches, and even cyber security. This work introduces intuitive visual mechanisms to enable domain experts and users to visually analyze stable regions of a network and identify critical trans...
We present a methodology to analyze and visualize an ensemble of finite pointset method (FPM) simulations that model the viscous fingering process of salt solutions inside water. In the course of the simulations, the solutions form structures with increased salt concentration, called viscous fingers. These structures are of primary interest to dom...
The wear behavior of cutting tools directly affects the quality of the machined part. The measurement and evaluation of wear is a time-consuming process and is subjective. Therefore, an image-based wear measure that can be computed automatically based on given image series of cutting tools and an objective way to review the resulting wear is pr...
Utilizing the vector units of current processors for ray tracing single rays through Bounding Volume Hierarchies has been accomplished by increasing the branching factor of the acceleration structure to match the vector width. A high branching factor allows vectorized bounding box tests but requires a complex control flow for the calculation of a f...
Memory performance is often a major bottleneck for high-performance computing (HPC) applications. Deepening memory hierarchies, complex memory management, and non-uniform access times have made memory performance behavior difficult to characterize, and users require novel, sophisticated tools to analyze and optimize this aspect of their codes. Exis...
The representation of data quality within established high-dimensional data visualization techniques such as scatterplots and parallel coordinates is still an open problem. This work offers a scale-invariant measure based on Pareto optimality that is able to indicate the quality of data points with respect to the Pareto front. In cases where datase...
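The core notion can be illustrated directly. A minimal Pareto-front computation, assuming minimization in every dimension (the paper's scale-invariant quality measure itself is not reproduced here):

```python
# A point is on the Pareto front if no other point dominates it,
# i.e., is at least as good in every dimension and better in one.
def dominates(p, q):
    return (all(a <= b for a, b in zip(p, q))
            and any(a < b for a, b in zip(p, q)))

def pareto_front(points):
    return [p for p in points
            if not any(dominates(q, p) for q in points)]

pts = [(1, 4), (2, 2), (4, 1), (3, 3), (5, 5)]
print(pareto_front(pts))  # → [(1, 4), (2, 2), (4, 1)]
```

Here (3, 3) and (5, 5) are dominated by (2, 2) and drop out; a quality measure like the paper's would then grade points by their relation to this front.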
Background: There exists a need for effective and easy-to-use software tools supporting the analysis of complex Electrocorticography (ECoG) data. Understanding how epileptic seizures develop or identifying diagnostic indicators for neurological diseases require the in-depth analysis of neural activity data from ECoG. Such data is multi-scale and is...
Despite their increasing popularity, virtual environments still lack useful and natural interaction techniques. We present a multi-modal interaction interface, designed for smartwatches and smartphones for fully immersive environments. Our approach enhances the efficiency of interaction in virtual worlds in a natural and intuitive way. We have...
We propose a new method using the distribution of extrema of Laplacian eigenfunctions for two-dimensional (2D) shape description and matching. We construct a weighted directed graph, which we call signed natural neighbor graph, to represent a Laplacian eigenfunction of a shape. The nodes of this sparse graph are the extrema of the corresponding eig...
This paper introduces an improved approach for the volume data registration of the human retina. Volume data registration refers to calculating a near-optimal transformation between two volumes with an overlapping region and stitching them together. The iterative closest point (ICP) algorithm is a registration method that deals with registration between p...
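The ICP loop mentioned above can be sketched compactly. The following translation-only toy version in 2D illustrates the iterate-match-align structure; the paper's improved variant registers full volumes and estimates richer transformations:

```python
# Translation-only ICP sketch: repeatedly match each source point to
# its nearest destination point, then shift by the mean residual.
def icp_translation(src, dst, iters=10):
    tx, ty = 0.0, 0.0
    for _ in range(iters):
        moved = [(x + tx, y + ty) for x, y in src]
        pairs = []                       # nearest-neighbor correspondences
        for p in moved:
            q = min(dst, key=lambda d: (d[0] - p[0]) ** 2 + (d[1] - p[1]) ** 2)
            pairs.append((p, q))
        dx = sum(q[0] - p[0] for p, q in pairs) / len(pairs)
        dy = sum(q[1] - p[1] for p, q in pairs) / len(pairs)
        tx, ty = tx + dx, ty + dy        # accumulate translation update
    return tx, ty

src = [(0, 0), (1, 0), (0, 1)]
dst = [(2, 3), (3, 3), (2, 4)]           # src shifted by (2, 3)
print(icp_translation(src, dst))         # approximately (2.0, 3.0)
```

The first iteration matches all points to the nearest corner and overshoots partially; subsequent iterations refine the correspondences until the estimate converges to the true offset.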
We describe a specialized methodology for segmenting 2D microscopy digital images of freshwater green microalgae. The goal is to obtain representative algae shapes to extract morphological features to be employed in a posterior step of taxonomical classification of the species. The proposed methodology relies on the seeded region growing principle...
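A generic seeded-region-growing fragment illustrates the principle named in the abstract; the algae-specific criteria of the paper's specialized method are not reproduced:

```python
# Grow a region from a seed pixel, absorbing 4-neighbors whose
# intensity is within a tolerance of the current region mean.
def region_grow(img, seed, tol=0.2):
    h, w = len(img), len(img[0])
    region = {seed}
    total = img[seed[0]][seed[1]]
    stack = [seed]
    while stack:
        r, c = stack.pop()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < h and 0 <= nc < w and (nr, nc) not in region:
                if abs(img[nr][nc] - total / len(region)) <= tol:
                    region.add((nr, nc))
                    total += img[nr][nc]
                    stack.append((nr, nc))
    return region

img = [[0.9, 0.9, 0.1],
       [0.8, 0.9, 0.1],
       [0.1, 0.1, 0.1]]
print(sorted(region_grow(img, (0, 0))))  # → the bright 2x2 block
```

Updating the running mean as the region grows, rather than comparing against the seed alone, makes the criterion adapt to gradual intensity changes inside the object.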
Modern cyber-physical production systems (CPPS) connect different elements like machine tools and workpieces. The constituent elements are often equipped with high-performance sensors as well as information and communication technology, enabling them to interact with each other. This leads to an increasing amount and complexity of data that requires...
We present ECoG ClusterFlow, a novel interactive visual analysis tool for the exploration of high-resolution Electrocorticography (ECoG) data. Our system detects and visualizes dynamic high-level structures, such as communities, using the time-varying spatial connectivity network derived from the high-resolution ECoG data. ECoG ClusterFlow provides...
We present an analysis and visualization prototype using the concept of a flow topology graph (FTG) for characterization of flow in constrained networks, with a focus on discrete fracture networks (DFN), developed collaboratively by geoscientists and visualization scientists. Our method allows users to understand and evaluate flow and transport in...
We present Brain Modulyzer, an interactive visual exploration tool for functional magnetic resonance imaging (fMRI) brain scans, aimed at analyzing the correlation between different brain regions when resting or when performing mental tasks. Brain Modulyzer combines multiple coordinated views—such as heat maps, node link diagrams and anatomical vie...
Scattered data interpolation and approximation techniques allow for the reconstruction of a scalar field based upon a finite number of scattered samples of the field. In general, the fidelity of the reconstruction with respect to the original scalar field tends to deteriorate as the number of samples decreases. For the situation of very sparse samp...
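For context, the simplest member of the scattered-data family discussed here is Shepard (inverse-distance-weighted) interpolation. This is an illustrative sketch, not the paper's reconstruction method for very sparse samples:

```python
# Shepard / inverse-distance-weighted interpolation of a scalar field
# from scattered samples (xi, yi, fi).
def shepard(samples, x, y, power=2):
    num = den = 0.0
    for xi, yi, fi in samples:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0:
            return fi            # exact interpolation at a sample site
        w = 1.0 / d2 ** (power / 2)
        num += w * fi
        den += w
    return num / den

samples = [(0, 0, 1.0), (1, 0, 3.0), (0, 1, 3.0), (1, 1, 5.0)]
print(shepard(samples, 0.5, 0.5))  # → 3.0 (equidistant from all samples)
```

With very few samples, such a global inverse-distance scheme flattens toward the data mean far from the sites, an example of the fidelity deterioration the abstract describes.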
Event traces are valuable for understanding the behavior of parallel programs. However, automatically analyzing a large parallel trace is difficult, especially without a specific objective. We aid this endeavor by extracting a trace’s logical structure, an ordering of trace events derived from happened-before relationships, while taking into accoun...
We describe a method for combining and visualizing a set of overlapping volumetric data sets with high resolution but limited spatial extent. Our system combines the calculation of a registration metric with ray casting for direct volume rendering on the graphics processing unit (GPU). We use the simulated annealing algorithm to find a registration...
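The simulated-annealing search named in the abstract can be sketched on a toy 1D objective standing in for the volumetric registration metric; the function names, schedule, and objective are hypothetical, and the GPU ray-casting part is omitted:

```python
# Simulated-annealing sketch: random perturbations, always accept
# improvements, accept worsenings with probability exp(-delta / t).
import math
import random

def anneal(objective, x0, steps=2000, t0=1.0, seed=7):
    rng = random.Random(seed)
    x, best = x0, x0
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-9       # linear cooling schedule
        cand = x + rng.uniform(-0.5, 0.5)     # random perturbation
        delta = objective(cand) - objective(x)
        if delta < 0 or rng.random() < math.exp(-delta / t):
            x = cand
            if objective(x) < objective(best):
                best = x                       # keep best state seen
    return best

metric = lambda x: (x - 3.2) ** 2              # toy misalignment metric
print(anneal(metric, x0=0.0))                  # converges near 3.2
```

The appeal for registration is that the early high-temperature phase can escape local minima of the similarity metric before the schedule turns the search greedy.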
The book discusses novel visualization techniques driven by the needs in medicine and life sciences as well as new application areas and challenges for visualization within these fields. It presents ideas and concepts for the visual analysis of data from scientific studies of living organs or for the delivery of healthcare. Target scientific domains inc...
This paper compares the effectiveness of various multi-resolution geometric representation methods, such as B-spline, Daubechies, Coiflet and Dual-tree wavelets, curvelets and surfacelets, to capture the structure of fully developed turbulence using a truncated set of coefficients. The turbulence dataset is obtained from a Direct Numerical Simulati...
Asynchrony and non-determinism in Charm++ programs present a significant challenge in analyzing their event traces. We present a new framework to organize event traces of parallel programs written in Charm++. Our reorganization allows one to more easily explore and analyze such traces by providing context through logical structure. We describe seve...
We present a method for segmenting 2D microscopy images of freshwater green microalgae. Our approach is based on a specialized level set method, leading to efficient and highly accurate algae segmentation. The level set formulation of our problem allows us to generate an algae's boundary curve as the result of an evolving level curve, based on comp...
This paper introduces an improved method for detecting objects of interest (galaxies and stars) in astronomical images. After applying a global detection scheme, further refinement is applied by dividing the entire image into several irregularly sized sub-regions using the watershed segmentation method. A more refined detection procedure is perform...
Reconstruction of hand-held laser scanner data is used in industry primarily for reverse engineering. Traditionally, scanning and reconstruction are separate steps, and the operator of the laser scanner has no feedback from the reconstruction results. On-line reconstruction of the CAD geometry allows for such immediate feedback. We propose a method...
A central problem in image processing and computer vision is the computation of corresponding interest points in a given set of images. Usually, interest points are considered as independent elements described by some local information. Due to the limitations of such an approach, many incorrect correspondences can be obtained. A specific contributi...
With the continuous rise in complexity of modern supercomputers, optimizing the performance of large-scale parallel programs is becoming increasingly challenging. Simultaneously, the growth in scale magnifies the impact of even minor inefficiencies – potentially millions of compute hours and megawatts in power consumption can be wasted on avoidable...
Processing images of underwater environments of Antarctic lakes is challenging due to poor lighting conditions, low saturation and noise. This paper presents a novel pipeline for dense point cloud scene reconstruction from underwater stereo images and video obtained with low-cost consumer recording hardware. Features in stereo frames are selected a...
Optimizing memory access is critical for performance and power efficiency. CPU manufacturers have developed sampling-based performance measurement units (PMUs) that report precise costs of memory accesses at specific addresses. However, this data is too low-level to be meaningfully interpreted and contains an excessive amount of irrelevant or unint...
As the design, development and execution of manufacturing processes continue to spread out across the world, globally distributed enterprises demand new paradigms. Distance collaboration tools are becoming increasingly important in order to maintain synergies between spatially distributed entities enabling effective cooperation over large distances...
Performance visualization comprises techniques that aid developers and analysts in improving the time and energy efficiency of their software. In this work, we discuss performance as it relates to visualization and survey existing approaches in performance visualization. We present an overview of what types of performance data can be collected and...
Merge trees represent the topology of scalar functions. To assess the topological similarity of functions, one can compare their merge trees. To do so, one needs a notion of a distance between merge trees, which we define. We provide examples of using our merge tree distance and compare this new measure to other ways used to characterize topologica...
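A merge tree distance is beyond a snippet, but the construction it builds on can be sketched: 0-dimensional sublevel-set persistence of a 1D function via union-find, recording when components born at local minima merge. This is an illustrative ingredient, not the paper's distance measure:

```python
# Sublevel-set merge events of a 1D function: each local minimum is
# born at its value and dies when its component merges into an older
# (deeper) one -- the "elder rule".
def find(parent, i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]   # path compression
        i = parent[i]
    return i

def persistence_1d(values):
    order = sorted(range(len(values)), key=lambda i: values[i])
    parent, birth, pairs = {}, {}, []
    for i in order:                      # sweep vertices by ascending value
        parent[i], birth[i] = i, values[i]
        for j in (i - 1, i + 1):
            if j in parent:
                ri, rj = find(parent, i), find(parent, j)
                if ri == rj:
                    continue
                if birth[ri] <= birth[rj]:
                    ri, rj = rj, ri      # make rj the elder root
                pairs.append((birth[ri], values[i]))  # younger one dies
                parent[ri] = rj
    root = find(parent, order[0])
    pairs.append((birth[root], float("inf")))  # global min never dies
    return sorted(p for p in pairs if p[0] < p[1])  # drop zero persistence

print(persistence_1d([3, 1, 4, 0, 2, 5]))  # → [(0, inf), (1, 4)]
```

The two surviving pairs say the global minimum (value 0) persists forever, while the secondary minimum (value 1) merges into it at the saddle value 4; a merge tree records these same events as its branching structure.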
This chapter introduces a novel method for vortex detection in flow fields based on the corotation of line segments and glyph rendering. The corotation measure is defined as a point-symmetric scalar function on a sphere, suitable for direct representation in the form of a three-dimensional glyph. Appropriate placement of these glyphs in the domain...
Recent progress in retinal image acquisition techniques, including optical coherence tomography (OCT) and scanning laser ophthalmoscopy (SLO), combined with improved performance of adaptive optics (AO) instrumentation, has resulted in improvement in the quality of in vivo images of cellular structures in the human retina. Here, we present a short r...