Harald Obermaier

University of California, Davis, Davis, California, United States


Publications (31) · 23.59 Total Impact Points

  • Kevin Bensema · Luke Gosink · Harald Obermaier · Kenneth Joy
    ABSTRACT: Advances in computational power now enable domain scientists to address conceptual and parametric uncertainty by running simulations multiple times in order to sufficiently sample the uncertain input space. While this approach helps address conceptual and parametric uncertainties, the ensemble datasets produced by this technique present a special challenge to visualization researchers as the ensemble dataset records a distribution of possible values for each location in the domain. Contemporary visualization approaches that rely solely on summary statistics (e.g., mean and variance) cannot convey the detailed information encoded in ensemble distributions that is paramount to ensemble analysis; summary statistics provide no information about modality classification and modality persistence. To address this problem, we propose a novel technique that classifies high-variance locations based on the modality of the distribution of ensemble predictions. Additionally, we develop a set of confidence metrics to inform the end-user of the quality of fit between the distribution at a given location and its assigned class. Finally, for the special application of evaluating the stability of bimodal regions, we develop local and regional metrics.
    Article · Dec 2015 · IEEE Transactions on Visualization and Computer Graphics
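    As a minimal illustration of per-location modality classification, the sketch below compares one- and two-component Gaussian mixture fits by BIC and uses the BIC gap as a crude confidence value. The classifier choice, `classify_modality`, and its threshold are assumptions for illustration, not the authors' technique or confidence metrics.

```python
# Minimal sketch: classify the per-location ensemble distribution as
# unimodal or bimodal by comparing BIC scores of 1- and 2-component
# Gaussian mixtures. Illustrative only; not the authors' implementation.
import numpy as np
from sklearn.mixture import GaussianMixture

def classify_modality(samples, threshold=10.0):
    """samples: 1D array of ensemble values at one grid location."""
    x = np.asarray(samples, dtype=float).reshape(-1, 1)
    bic1 = GaussianMixture(n_components=1).fit(x).bic(x)
    bic2 = GaussianMixture(n_components=2).fit(x).bic(x)
    confidence = abs(bic1 - bic2)            # how decisively one model wins
    label = "bimodal" if bic2 + threshold < bic1 else "unimodal"
    return label, confidence

rng = np.random.default_rng(0)
ensemble = np.concatenate([rng.normal(-2, 0.5, 64), rng.normal(2, 0.5, 64)])
print(classify_modality(ensemble))           # expected: ('bimodal', <large gap>)
```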
  • Harald Obermaier · Kevin Bensema · Kenneth Joy
    ABSTRACT: Visualization and analysis techniques play a key role in the discovery of relevant features in ensemble data. Trends, in the form of persisting commonalities or differences in time-varying ensemble datasets, constitute one of the most expressive feature types in ensemble analysis. We develop a flow-graph representation as the core of a system designed for the visual analysis of trends in time-varying ensembles. In our interactive analysis framework, this graph is linked to a representation of ensemble parameter-space and the ensemble itself. This facilitates a detailed examination of trends and their correlations to properties of input-space. We demonstrate the utility of the proposed trends analysis framework in several benchmark data sets, highlighting its capability to support goal-driven design of time-varying simulations.
    Article · Dec 2015 · IEEE Transactions on Visualization and Computer Graphics
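    The flow-graph idea above can be made concrete with a small sketch: cluster ensemble members at every time step and connect clusters of consecutive steps by shared membership. The clustering choice (k-means), the `build_flow_graph` helper, and the data layout are assumptions for illustration, not the authors' system.

```python
# Minimal sketch of a trend flow graph: cluster ensemble members at each
# time step, then connect clusters of consecutive steps by the members
# they share. Clustering and data layout are illustrative assumptions.
import numpy as np
from collections import defaultdict
from sklearn.cluster import KMeans

def build_flow_graph(ensemble, k=3):
    """ensemble: array of shape (n_members, n_timesteps, n_features)."""
    n_members, n_steps, _ = ensemble.shape
    labels = np.empty((n_steps, n_members), dtype=int)
    for t in range(n_steps):
        labels[t] = KMeans(n_clusters=k, n_init=10).fit_predict(ensemble[:, t, :])
    edges = defaultdict(int)   # ((t, cluster), (t+1, cluster)) -> shared members
    for t in range(n_steps - 1):
        for m in range(n_members):
            edges[((t, labels[t, m]), (t + 1, labels[t + 1, m]))] += 1
    return edges               # edge weight = number of members following that trend

demo = np.random.default_rng(1).normal(size=(12, 5, 2))
for (src, dst), w in sorted(build_flow_graph(demo).items()):
    print(src, "->", dst, "members:", w)
```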
  • Harald Obermaier · Kenneth I Joy
    ABSTRACT: Effective display and visual analysis of complex 3D data is a challenging task. Occlusions, overlaps, and projective distortions, as frequently caused by typical 3D rendering techniques, can be major obstacles to unambiguous and robust data analysis. Slicing planes are a ubiquitous tool to resolve several of these issues. They act as simple clipping geometry to provide clear cut-away views of the data. We propose to enhance the visualization and analysis process by providing methods for automatic placement of such slicing planes based on local optimization of gradient vector flow. The resulting slicing planes maximize the total amount of information displayed with respect to a pre-specified importance function. We demonstrate how such automated slicing plane placement is able to support and enrich 3D data visualization and analysis in multiple scenarios, such as volume or surface rendering, and evaluate its performance in several benchmark data sets.
    Article · Nov 2015 · IEEE Transactions on Visualization and Computer Graphics
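    The paper drives plane placement by local optimization of gradient vector flow; the sketch below substitutes a brute-force search over plane orientations and offsets that maximizes the summed importance of nearby samples, purely to make the objective concrete. `best_slicing_plane` and its parameters are illustrative assumptions.

```python
# Simplified sketch: choose a slicing plane that maximizes the total
# importance of data points lying close to it. Exhaustive search is only a
# stand-in for the paper's gradient-vector-flow optimization.
import numpy as np

def best_slicing_plane(points, importance, slab=0.05, n_dirs=64, n_offsets=32):
    """points: (N, 3) sample positions; importance: (N,) weights."""
    normals = np.random.default_rng(0).normal(size=(n_dirs, 3))
    normals /= np.linalg.norm(normals, axis=1, keepdims=True)
    best = (-np.inf, None, None)
    for n in normals:
        d = points @ n                       # signed distance along the normal
        for off in np.linspace(d.min(), d.max(), n_offsets):
            score = importance[np.abs(d - off) < slab].sum()
            if score > best[0]:
                best = (score, n, off)
    return best                              # (score, plane normal, plane offset)

pts = np.random.default_rng(2).uniform(-1, 1, size=(2000, 3))
imp = np.exp(-np.abs(pts[:, 2]) * 8)         # importance concentrated near z = 0
score, normal, offset = best_slicing_plane(pts, imp)
print(normal, offset)                        # normal typically close to the ±z axis
```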
  • Jennifer Chandler · Harald Obermaier · Kenneth I. Joy
    ABSTRACT: Particle tracing in time-varying flow fields is traditionally performed by numerical integration of the underlying vector field. This procedure can become computationally expensive, especially in scattered, particle-based flow fields, which complicate interpolation due to the lack of an explicit neighborhood structure. If such a particle-based flow field allows for the identification of consecutive particle positions, an alternative approach to particle tracing can be employed: we substitute repeated numerical integration of vector data by geometric interpolation in the highly dynamic particle system as defined by the particle-based simulation. To allow for efficient and accurate location and interpolation of changing particle neighborhoods, we develop a modified k-d tree representation that is capable of creating a dynamic partitioning of even highly compressible data sets with strongly varying particle densities. With this representation we are able to efficiently perform pathline computation by identifying, tracking, and updating an enclosing, dynamic particle neighborhood as particles move over time. We investigate and evaluate the complexity, accuracy, and robustness of this interpolation-based alternative approach to trajectory generation in compressible and incompressible particle systems generated by simulation techniques such as Smoothed Particle Hydrodynamics (SPH).
    Article · Sep 2015 · IEEE Transactions on Visualization and Computer Graphics
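    A minimal sketch of interpolation-based tracing in a particle system: neighbors of the traced position are found with a k-d tree, and the position is moved by their inverse-distance-weighted displacement between consecutive frames. The paper's modified, dynamic k-d tree is replaced here by rebuilding scipy's cKDTree per step; `trace` and the weighting scheme are assumptions.

```python
# Sketch of interpolation-based particle tracing: at each time step, find
# the nearest SPH particles around the traced position and move it by their
# inverse-distance-weighted displacement to the next frame.
import numpy as np
from scipy.spatial import cKDTree

def trace(seed, frames, k=8, eps=1e-12):
    """frames: list of (n_particles, 3) arrays with consistent particle IDs."""
    pos = np.asarray(seed, dtype=float)
    path = [pos.copy()]
    for cur, nxt in zip(frames[:-1], frames[1:]):
        dist, idx = cKDTree(cur).query(pos, k=k)
        w = 1.0 / (dist + eps)
        disp = nxt[idx] - cur[idx]           # per-particle displacement
        pos = pos + (w[:, None] * disp).sum(axis=0) / w.sum()
        path.append(pos.copy())
    return np.array(path)

rng = np.random.default_rng(3)
base = rng.uniform(0, 1, size=(500, 3))
frames = [base + 0.01 * t * np.array([1.0, 0.0, 0.0]) for t in range(10)]
print(trace([0.5, 0.5, 0.5], frames)[-1])    # drifts in +x as the particles do
```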
  • A. Berres · H. Obermaier · K. Joy · H. Hagen
    ABSTRACT: Time surfaces are a versatile tool to visualise advection and deformation in flow fields. Due to complex flow behaviours involving stretching, shearing, and folding, straightforward mesh-based representations of these surfaces can develop artefacts and degenerate quickly. Common counter-measures rely on refinement and adaptive insertion of new particles which lead to an unpredictable increase in memory requirements. We propose a novel time surface extraction technique that keeps the number of required flow particles constant, while providing a high level of fidelity and enabling straightforward load balancing. Our solution implements a 2D particle relaxation procedure that makes use of local surface metric tensors to model surface deformations. We combine this with an accurate bicubic surface representation to provide an artefact-free surface visualisation. We demonstrate and evaluate benefits of the proposed method with respect to surface accuracy and computational efficiency.
    Article · Jul 2015
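    The extraction above keeps the particle budget fixed and relaxes particles in the 2D parameter domain; the sketch below performs a much simpler weighted-Laplacian relaxation that pulls parameter samples toward stretched surface regions. It stands in for, but does not reproduce, the metric-tensor-driven relaxation and bicubic representation of the paper.

```python
# Sketch of fixed-budget surface sampling: relax a 2D grid of (u, v)
# parameters so that samples migrate toward regions where the mapped
# surface stretches. Weights and step size are illustrative assumptions.
import numpy as np

def relax_parameters(uv, surface, iters=20, step=0.5):
    """uv: (H, W, 2) parameter grid; surface: callable (H, W, 2) -> (H, W, 3)."""
    uv = uv.copy()
    for _ in range(iters):
        xyz = surface(uv)
        new = uv.copy()
        for i in range(1, uv.shape[0] - 1):
            for j in range(1, uv.shape[1] - 1):
                nbrs = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
                w = np.array([np.linalg.norm(xyz[a, b] - xyz[i, j]) + 1e-9
                              for a, b in nbrs])
                target = sum(wk * uv[a, b] for wk, (a, b) in zip(w, nbrs)) / w.sum()
                new[i, j] = (1 - step) * uv[i, j] + step * target
        uv = new
    return uv

u, v = np.meshgrid(np.linspace(0, 1, 16), np.linspace(0, 1, 16), indexing="ij")
uv0 = np.stack([u, v], axis=-1)
surf = lambda uv: np.dstack([uv[..., 0], uv[..., 1], np.sin(3 * uv[..., 0])])
print(relax_parameters(uv0, surf).shape)     # (16, 16, 2), samples redistributed
```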
  • A. Agranovsky · H. Obermaier · C. Garth · K.I. Joy
    ABSTRACT: Whereas the computation of particle trajectories in classic vector field representations requires computationally involved numerical integration, a Lagrangian representation in the form of a flow map opens up alternative ways of extracting trajectories through interpolation. In this paper, we present a novel re-organization of the Lagrangian representation by sub-sampling a pre-computed set of trajectories into multiple levels of resolution, maintaining a bound over the amount of memory mapped by the file system. We exemplify the advantages of replacing integration with interpolation for particle trajectory calculation through a real-time, low memory cost, interactive exploration environment for the study of flow fields. Beginning with a base resolution, once an area of interest is located, additional trajectories from other levels of resolution are dynamically loaded, densely covering those regions of the flow field that are relevant for the extraction of the desired feature. We show that as more trajectories are loaded, the accuracy of the extracted features converges to the accuracy of the flow features extracted from numerical integration, with the added benefit of real-time, non-iterative, multi-resolution path and time surface extraction.
    Article · Jan 2015 · Proceedings of SPIE - The International Society for Optical Engineering
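    To make "interpolation instead of integration" concrete, the sketch below blends the nearest precomputed trajectories of a Lagrangian flow-map set and anchors the blend at a query seed; two nested subsets stand in for resolution levels. Data layout, `interpolate_trajectory`, and the weighting are illustrative assumptions.

```python
# Sketch of trajectory extraction by interpolation from precomputed
# Lagrangian trajectories ("flow maps") rather than numerical integration.
import numpy as np
from scipy.spatial import cKDTree

def interpolate_trajectory(seed, trajectories, k=4, eps=1e-12):
    """trajectories: (n, n_steps, 3) precomputed paths; seed: (3,) start point."""
    seed = np.asarray(seed, dtype=float)
    starts = trajectories[:, 0, :]
    dist, idx = cKDTree(starts).query(seed, k=k)
    w = 1.0 / (dist + eps)
    blended = (w[:, None, None] * trajectories[idx]).sum(axis=0) / w.sum()
    return seed + (blended - blended[0])     # anchor the blended path at the seed

rng = np.random.default_rng(4)
seeds = rng.uniform(0, 1, size=(256, 3))
paths = seeds[:, None, :] + np.linspace(0, 1, 20)[None, :, None] * 0.1
coarse, fine = paths[::4], paths             # two resolution levels
print(interpolate_trajectory([0.5, 0.5, 0.5], coarse)[-1])
print(interpolate_trajectory([0.5, 0.5, 0.5], fine)[-1])    # refined estimate
```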
  • Harald Obermaier · Kenneth I. Joy
    ABSTRACT: Simulating complex events is a challenge and often requires carefully selecting simulation parameters. As vast computation resources become available, researchers can run alternative parameter settings or simulation models in parallel, creating an ensemble of possible outcomes for a given event of interest. The visual analysis of ensembles is one of visualization's most important new areas and should greatly affect the field in the next few years. The goal is to develop expressive visualizations of an ensemble's properties to support scientists in this demanding parameter-space exploration.
    Article · May 2014 · IEEE Computer Graphics and Applications
  • ABSTRACT: Evaluation, solved and unsolved problems, and future directions are popular themes pervading the visualization community over the last decade. The top unsolved problem in both scientific and information visualization was the subject of an IEEE Visualization Conference panel in 2004. The future of graphics hardware was another important topic of discussion the same year. The subject of how to evaluate visualization returned a few years later. Chris Johnson published a list of 10 top problems in scientific visualization research. This was followed up by a report of both past achievements and future challenges in visualization research, as well as financial support recommendations to the National Science Foundation (NSF) and National Institutes of Health (NIH). Chen recently published the first list of top unsolved information visualization problems. Future research directions of topology-based visualization were also a major theme of a workshop on topology-based methods. Laramee and Kosara published a list of top future challenges in human-centered visualization.
    Article · Jan 2014 · Mathematics and Visualization
  • Harald Obermaier · Ronald Peikert
    ABSTRACT: Feature-based techniques are one of the main categories of methods used in scientific visualization. Features are structures in a dataset that are meaningful within the scientific or engineering context of the dataset. Extracted features can be visualized directly, or they can be used indirectly for modifying another type of visualization. In multifield data, each of the component fields can be searched for features, but in addition, there can be features of the multifield which rely on information from several of its components and which cannot be found by searching in a single field. In this chapter we give a survey of feature-based visualization of multifields, taking both of these feature types into account.
    Article · Jan 2014 · Mathematics and Visualization
  • ABSTRACT: Numerical ensemble forecasting is a powerful tool that drives many risk analysis efforts and decision making tasks. These ensembles are composed of individual simulations that each uniquely model a possible outcome for a common event of interest: e.g., the direction and force of a hurricane, or the path of travel and mortality rate of a pandemic. This paper presents a new visual strategy to help quantify and characterize a numerical ensemble's predictive uncertainty: i.e., the ability of ensemble constituents to accurately and consistently predict an event of interest based on ground truth observations. Our strategy employs a Bayesian framework to first construct a statistical aggregate from the ensemble. We extend the information obtained from the aggregate with a visualization strategy that characterizes predictive uncertainty at two levels: at a global level, which assesses the ensemble as a whole, as well as a local level, which examines each of the ensemble's constituents. Through this approach, modelers are able to better assess the predictive strengths and weaknesses of the ensemble as a whole, as well as individual models. We apply our method to two datasets to demonstrate its broad applicability.
    Article · Dec 2013
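    One generic way to build a Bayesian-style aggregate, used here purely as an illustration of the abstract above: weight ensemble members by their Gaussian likelihood against ground-truth observations (Bayesian model averaging with a flat prior). The paper's actual aggregate differs; `bma_weights`, `sigma`, and the synthetic data are assumptions.

```python
# Sketch of a Bayesian-style ensemble aggregate via Bayesian model averaging:
# members that match observations better receive larger weights.
import numpy as np

def bma_weights(predictions, observations, sigma=1.0):
    """predictions: (n_members, n_obs); observations: (n_obs,)."""
    resid = predictions - observations[None, :]
    log_lik = -0.5 * np.sum((resid / sigma) ** 2, axis=1)
    w = np.exp(log_lik - log_lik.max())      # stabilize before normalizing
    return w / w.sum()

def aggregate(predictions, weights):
    mean = weights @ predictions
    var = weights @ (predictions - mean) ** 2    # predictive spread per location
    return mean, var

rng = np.random.default_rng(5)
truth = np.sin(np.linspace(0, 3, 50))
members = truth[None, :] + rng.normal(0, [[0.1], [0.3], [0.8]], size=(3, 50))
w = bma_weights(members, truth)
print(w)                                      # accurate members dominate
print(aggregate(members, w)[0][:5])
```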
  • Mathias Hummel · Harald Obermaier · Christoph Garth · Kenneth I Joy
    ABSTRACT: Sets of simulation runs based on parameter and model variation, so-called ensembles, are increasingly used to model physical behaviors whose parameter space is too large or complex to be explored automatically. Visualization plays a key role in conveying important properties in ensembles, such as the degree to which members of the ensemble agree or disagree in their behavior. For ensembles of time-varying vector fields, there are numerous challenges for providing an expressive comparative visualization, among which is the requirement to relate the effect of individual flow divergence to joint transport characteristics of the ensemble. Yet, techniques developed for scalar ensembles are of little use in this context, as the notion of transport induced by a vector field cannot be modeled using such tools. We develop a Lagrangian framework for the comparison of flow fields in an ensemble. Our techniques evaluate individual and joint transport variance and introduce a classification space that facilitates incorporation of these properties into a common ensemble visualization. Variances of Lagrangian neighborhoods are computed using pathline integration and Principal Components Analysis. This allows for an inclusion of uncertainty measurements into the visualization and analysis approach. Our results demonstrate the usefulness and expressiveness of the presented method on several practical examples.
    Article · Dec 2013
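    A minimal sketch of Lagrangian neighborhood variance: advect a small neighborhood of seeds through each ensemble member's velocity field and apply PCA to the spread of endpoints across members. The analytic member fields, the forward-Euler integrator, and `endpoint_pca` are stand-ins, not the paper's pipeline.

```python
# Sketch of Lagrangian ensemble variance: pathline integration of a seed
# neighborhood per member, then PCA of the endpoint spread across members.
import numpy as np

def advect(seed, velocity, t_end=1.0, dt=0.01):
    p = np.asarray(seed, dtype=float)
    for _ in range(int(t_end / dt)):
        p = p + dt * velocity(p)              # forward Euler pathline step
    return p

def endpoint_pca(seeds, members):
    endpoints = np.array([[advect(s, v) for s in seeds] for v in members])
    mean_end = endpoints.mean(axis=0)         # per-seed mean over members
    centered = (endpoints - mean_end).reshape(-1, endpoints.shape[-1])
    cov = centered.T @ centered / len(centered)
    return np.linalg.eigvalsh(cov)[::-1]      # principal transport variances

members = [lambda p, a=a: np.array([p[1], -p[0]]) * a for a in (0.9, 1.0, 1.1)]
seeds = [np.array([1.0, 0.0]) + 0.05 * d
         for d in np.random.default_rng(6).normal(size=(10, 2))]
print(endpoint_pca(seeds, members))           # dominant eigenvalue = main variance
```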
  • ABSTRACT: Multifluid simulations often create volume fraction data, representing fluid volumes per region or cell of a fluid data set. Accurate and visually realistic extraction of fluid boundaries is a challenging and essential task for efficient analysis of multifluid data. In this work, we present a new material interface reconstruction method for such volume fraction data. Within each cell of the data set, our method utilizes a gradient field approximation based on trilinearly blended Coons-patches to generate a volume fraction function, representing the change in volume fractions over the cells. A continuously varying isovalue field is applied to this function to produce a smooth interface that preserves the given volume fractions well. Further, the method allows user-controlled balance between volume accuracy and physical plausibility of the interface. The method works on two- and three-dimensional Cartesian grids, and handles multiple materials. Calculations are performed locally and utilize only the one-ring of cells surrounding a given cell, allowing visualizations of the material interfaces to be easily generated on a GPU or in a large-scale distributed parallel environment. Our results demonstrate the robustness, accuracy, and flexibility of the developed algorithms.
    Article · Oct 2013
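    The volume-preservation idea can be isolated in a small sketch: given a scalar field inside one cell, bisect for the isovalue whose sub-level set matches the target volume fraction (estimated by Monte Carlo sampling). This illustrates only that step, not the Coons-patch gradient-field construction of the paper; `matching_isovalue` and the sample field are assumptions.

```python
# Sketch of volume-fraction matching inside one cell: bisect for the isovalue
# whose sub-level set occupies the requested fraction of the unit cell.
import numpy as np

def matching_isovalue(f, target_fraction, n_samples=20000, iters=40):
    """f: scalar function on the unit cell; target_fraction in (0, 1)."""
    pts = np.random.default_rng(7).uniform(0, 1, size=(n_samples, 3))
    vals = f(pts)
    lo, hi = vals.min(), vals.max()
    for _ in range(iters):
        iso = 0.5 * (lo + hi)
        frac = np.mean(vals < iso)            # fraction of the cell below iso
        lo, hi = (iso, hi) if frac < target_fraction else (lo, iso)
    return 0.5 * (lo + hi)

f = lambda p: p[:, 0] + 0.2 * p[:, 1]         # a smooth in-cell field
print(matching_isovalue(f, target_fraction=0.35))   # sub-level set covers ~35%
```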
  • ABSTRACT: Liquid–liquid extraction is a typical multi-fluid problem in chemical engineering where two types of immiscible fluids are mixed together. Mixing of two-phase fluids results in a time-varying fluid density distribution, quantitatively indicating the presence of liquid phases. For engineers who design extraction devices, it is crucial to understand the density distribution of each fluid, particularly flow regions that have a high concentration of the dispersed phase. The propagation of regions of high density can be studied by examining the topology of isosurfaces of the density data. We present a topology-based approach to track the splitting and merging events of these regions using Reeb graphs. Time is used as the third dimension in addition to two-dimensional (2D) point-based simulation data. Due to low time resolution of the input data set, a physics-based interpolation scheme is required in order to improve the accuracy of the proposed topology tracking method. The model used for interpolation produces a smooth time-dependent density field by applying Lagrangian-based advection to the given simulated point cloud data, conforming to the physical laws of flow evolution. Using the Reeb graph, the spatial and temporal locations of bifurcation and merging events can be readily identified, supporting in-depth analysis of the extraction process.
    Article · Jul 2013 · Computer Aided Geometric Design
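    As a simplified stand-in for the Reeb-graph tracking described above, the sketch below labels connected high-density regions per time step and reports merge and split events from label overlap between consecutive steps. The thresholding, `track_events`, and the synthetic frames are assumptions, not the paper's method.

```python
# Simplified proxy for Reeb-graph tracking: label connected high-density
# regions per time step and report merge/split events from label overlap.
import numpy as np
from scipy import ndimage

def track_events(density_frames, threshold):
    prev_labels, events = None, []
    for t, frame in enumerate(density_frames):
        labels, n = ndimage.label(frame > threshold)
        if prev_labels is not None:
            for region in range(1, n + 1):
                parents = set(prev_labels[labels == region]) - {0}
                if len(parents) > 1:
                    events.append((t, "merge", region, sorted(parents)))
            for region in range(1, prev_labels.max() + 1):
                children = set(labels[prev_labels == region]) - {0}
                if len(children) > 1:
                    events.append((t, "split", region, sorted(children)))
        prev_labels = labels
    return events

x, y = np.meshgrid(np.linspace(-2, 2, 64), np.linspace(-2, 2, 64))
blob = lambda cx: np.exp(-((x - cx) ** 2 + y ** 2) * 4)
frames = [blob(-d) + blob(d) for d in (0.0, 0.5, 1.0)]   # one region splits in two
print(track_events(frames, threshold=0.5))
```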
  • Harald Obermaier · Martin Hering-Bertram · Hans Hagen
    ABSTRACT: The input–output behavior or flow transfer function of typical mixing processes is highly relevant to the analysis of the dynamic system and its mixing quality. We aim to visualize this behavior by extracting topologically relevant flow volumes from statistics accumulated during particle traversal of the flow field. To guarantee a sufficiently dense sampling of the flow field, we use adaptive time-surfaces for the computation of these trajectory statistics. The proposed volume extraction technique operates in parameter space of the computed time-surfaces and facilitates fast extraction of boundary geometry at different levels of detail. Our results visualize flow transfer functions in the form of volumes for extrema of different time-surface statistics and demonstrate their benefit for flow analysis.
    Article · Jul 2013 · Computer Aided Geometric Design
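    A sketch of the parameter-space view used above for transfer-function volumes: seed a (u, v) grid of particles, advect them, accumulate a per-trajectory statistic, and threshold in parameter space to mark the region whose boundary would be extracted. The analytic field and the path-length statistic are stand-ins for the time-surface statistics of the paper.

```python
# Sketch: accumulate a per-trajectory statistic over a seeding parameter grid
# and threshold it in parameter space. Field and statistic are illustrative.
import numpy as np

def traversal_statistic(velocity, n=32, steps=200, dt=0.02):
    u, v = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n), indexing="ij")
    pos = np.stack([np.zeros_like(u), u, v], axis=-1)       # inlet plane x = 0
    path_length = np.zeros((n, n))
    for _ in range(steps):
        vel = velocity(pos)
        path_length += dt * np.linalg.norm(vel, axis=-1)    # accumulated statistic
        pos = pos + dt * vel
    return path_length

swirl = lambda p: np.stack([np.ones_like(p[..., 0]),
                            np.sin(4 * p[..., 2]),
                            np.cos(4 * p[..., 1])], axis=-1)
stat = traversal_statistic(swirl)
region = stat > np.percentile(stat, 90)       # parameter-space region of extrema
print(region.sum(), "of", region.size, "seeds")
```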

  • Conference Paper · Jan 2013
  • ABSTRACT: A fundamental characteristic of fluid flow is that it causes mixing: introduce a dye into a flow, and it will disperse. Mixing can be used as a method to visualize and characterize flow. Because mixing is a process that occurs over time, it is a 4D problem that presents a challenge for computation, visualization, and analysis. Motivated by a mixing problem in geophysics, we introduce a combination of methods to analyze, transform, and finally visualize mixing in simulations of convection in a self-gravitating 3D spherical shell representing convection in the Earth's mantle. Geophysicists use tools such as the finite element model CitcomS to simulate convection, and introduce massless, passive tracers to model mixing. The output of geophysical flow simulation is hard to analyze for domain experts because of overall data size and complexity. In addition, information overload and occlusion are problems when visualizing a whole-earth model. To address the large size of the data, we rearrange the simulation data using intelligent indexing for fast file access and efficient caching. To address information overload and interpret mixing, we compute tracer concentration statistics, which are used to characterize mixing in mantle convection models. Our visualization uses a specially tailored version of Direct Volume Rendering. The most important adjustment is the use of constant opacity. Because of this special area of application, i.e., the rendering of a spherical shell, many computations for volume rendering can be optimized. These optimizations are essential to a smooth animation of the time-dependent simulation data. Our results show how our system can be used to quickly assess the simulation output and test hypotheses regarding Earth's mantle convection. The integrated processing pipeline helps geoscientists to focus on their main task of analyzing mantle homogenization.
    Article · Dec 2012 · IEEE Transactions on Visualization and Computer Graphics
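    The tracer-concentration statistics mentioned above can be illustrated with a simple radial binning of massless tracers in a spherical shell; the per-shell concentration and its relative spread give a crude homogenization measure. The binning, shell bounds, and synthetic tracer cloud are assumptions, not the CitcomS pipeline.

```python
# Sketch of tracer concentration statistics: bin massless tracers into radial
# shells of a spherical-shell domain and report tracers per unit volume.
import numpy as np

def shell_concentration(tracer_xyz, r_inner, r_outer, n_shells=16):
    r = np.linalg.norm(tracer_xyz, axis=1)
    edges = np.linspace(r_inner, r_outer, n_shells + 1)
    counts, _ = np.histogram(r, bins=edges)
    shell_vol = 4.0 / 3.0 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
    return counts / shell_vol                 # tracers per unit volume, per shell

rng = np.random.default_rng(8)
pts = rng.normal(size=(50000, 3))
pts *= (rng.uniform(0.55, 1.0, size=50000) / np.linalg.norm(pts, axis=1))[:, None]
conc = shell_concentration(pts, r_inner=0.55, r_outer=1.0)
print(conc.std() / conc.mean())               # lower relative spread = better mixed
```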
  • Harald Obermaier · Kenneth I. Joy
    ABSTRACT: Integral flow surfaces constitute a widely used flow visualization tool due to their capability to convey important flow information such as fluid transport, mixing, and domain segmentation. Current flow surface rendering techniques limit their expressiveness, however, by focusing virtually exclusively on displacement visualization, visually neglecting the more complex notion of deformation such as shearing and stretching that is central to the field of continuum mechanics. To incorporate this information into the flow surface visualization and analysis process, we derive a metric tensor field that encodes local surface deformations as induced by the velocity gradient of the underlying flow field. We demonstrate how properties of the resulting metric tensor field are capable of enhancing present surface visualization and generation methods and develop novel surface querying, sampling, and visualization techniques. The provided results show how this step towards unifying classic flow visualization and more advanced concepts from continuum mechanics enables more detailed and improved flow analysis.
    Article · Dec 2012 · IEEE Transactions on Visualization and Computer Graphics
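    The deformation metric referred to above is, in continuum-mechanics terms, closely related to the right Cauchy-Green tensor C = F^T F of the flow map; the sketch below estimates F by finite differences of advected seeds for an analytic shear flow. The field, step sizes, and `cauchy_green` helper are assumptions for illustration, not the paper's derivation.

```python
# Sketch of a flow-induced deformation metric: estimate the flow-map Jacobian F
# by finite differences of advected seed points and form C = F^T F, whose
# eigenvalues describe local stretching.
import numpy as np

def flow_map(p, velocity, t_end=1.0, dt=0.01):
    p = np.asarray(p, dtype=float)
    for _ in range(int(t_end / dt)):
        p = p + dt * velocity(p)
    return p

def cauchy_green(p, velocity, h=1e-4):
    F = np.zeros((2, 2))
    for j in range(2):
        dp = np.zeros(2); dp[j] = h
        F[:, j] = (flow_map(p + dp, velocity) - flow_map(p - dp, velocity)) / (2 * h)
    return F.T @ F

shear = lambda q: np.array([q[1], 0.0])       # simple shear flow
C = cauchy_green(np.array([0.0, 0.0]), shear)
print(np.linalg.eigvalsh(C))                  # largest eigenvalue > 1: stretching
```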
  • Harald Obermaier · Kenneth I. Joy
    ABSTRACT: Modern time-varying flow visualization techniques that rely on advection are able to convey fluid transport, but cannot provide an accurate insight into local flow behavior over time or locally corresponding patterns in unsteady vector fields. We overcome these limitations of purely Lagrangian approaches by generalizing the concept of function fields to time-varying flows. This representation of unsteady vector fields as stationary function fields, where every position in space is associated with a vector-valued function, supports the application of novel analysis techniques based on function correlation and allows us to answer data analysis questions that remain unanswered with classic time-varying vector field analysis techniques. Our results demonstrate how analysis of time-varying flow fields can benefit from a conversion into function field representations and show the robustness of our presented clustering techniques.
    Chapter · Jul 2012
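    A sketch of the function-field view: sample each position's velocity over time to obtain a vector-valued function, then group positions whose functions are highly correlated. The analytic field, the sampling, and the greedy correlation grouping are assumptions; the chapter's clustering techniques are more involved.

```python
# Sketch of a function field: one flattened velocity-vs-time function per
# position, grouped by Pearson correlation of those functions.
import numpy as np

def function_field(velocity, positions, times):
    return np.array([np.concatenate([velocity(p, t) for t in times])
                     for p in positions])

def correlation_clusters(series, threshold=0.95):
    corr = np.corrcoef(series)
    clusters, assigned = [], set()
    for i in range(len(series)):
        if i in assigned:
            continue
        group = [j for j in range(len(series))
                 if j not in assigned and corr[i, j] >= threshold]
        assigned.update(group)
        clusters.append(group)
    return clusters

vel = lambda p, t: np.array([np.sin(t + p[0]), np.cos(t) * p[1]])
pts = [np.array([x, y]) for x in (0.0, 0.1, 2.0) for y in (1.0, 1.1)]
series = function_field(vel, pts, times=np.linspace(0, 6, 60))
print(correlation_clusters(series))           # nearby positions share a cluster
```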
  • ABSTRACT: Crease surfaces describe extremal structures of 3D scalar fields. We present a new region-growing-based approach to the meshless extraction of adaptive nonmanifold valley and ridge surfaces that overcomes limitations of previous approaches by decoupling point seeding and triangulation of the surface. Our method is capable of extracting valley surface skeletons as connected minimum structures. As our algorithm is inherently mesh-free and curvature adaptive, it is suitable for surface construction in fields with an arbitrary neighborhood structure. As an application for insightful visualization with valley surfaces, we choose a low frequency acoustics simulation. We use our valley surface construction approach to visualize the resulting complex-valued scalar pressure field for arbitrary frequencies to identify regions of sound cancellation. This provides an expressive visualization of the topology of wave node and antinode structures in simulated acoustics.
    Article · Feb 2012
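    A generic pointwise crease test (not the paper's meshless region-growing extraction): a valley point of a 2D scalar field is one where the gradient is orthogonal to the Hessian eigenvector of the largest eigenvalue and that eigenvalue is positive. The finite-difference field, `valley_mask`, and the tolerances below are assumptions for illustration.

```python
# Generic valley-line test on a sampled 2D scalar field using finite-difference
# gradients and Hessians; the paper's method is meshless and region-growing.
import numpy as np

def valley_mask(field, spacing=1.0, tol=1e-6):
    gy, gx = np.gradient(field, spacing)      # d/drow (y), d/dcol (x)
    gxy, gxx = np.gradient(gx, spacing)
    gyy, _ = np.gradient(gy, spacing)
    mask = np.zeros(field.shape, dtype=bool)
    for i in range(field.shape[0]):
        for j in range(field.shape[1]):
            H = np.array([[gxx[i, j], gxy[i, j]], [gxy[i, j], gyy[i, j]]])
            w, v = np.linalg.eigh(H)          # eigenvalues in ascending order
            e = v[:, 1]                        # eigenvector of largest eigenvalue
            dot = e @ np.array([gx[i, j], gy[i, j]])
            mask[i, j] = (w[1] > 0) and (abs(dot) < tol)
    return mask

x, y = np.meshgrid(np.linspace(-2, 2, 65), np.linspace(-2, 2, 65))
f = x ** 2 + 0.1 * y                           # valley along the line x = 0
print(np.argwhere(valley_mask(f, spacing=4 / 64))[:3])
```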

Publication Stats

38 Citations
23.59 Total Impact Points

Institutions

  • 2011-2015
    • University of California, Davis
      • Institute for Data Analysis and Visualization (IDAV)
      • Department of Computer Science
      Davis, California, United States
  • 2010-2011
    • Technische Universität Kaiserslautern
      • Department of Computer Science
      Kaiserslautern, Rheinland-Pfalz, Germany