James Ahrens’s research while affiliated with Los Alamos National Laboratory and other places


Publications (166)


An Exploration of How Volume Rendering is Impacted by Lossy Data Reduction
  • Conference Paper

November 2024 · 3 Reads

Yanni Etchi · Pascal Grosset · [...] · David Rogers

The ECP ALPINE project: In situ and post hoc visualization infrastructure and analysis capabilities for exascale
  • Article
  • Full-text available

October 2024 · 55 Reads · 1 Citation

The International Journal of High Performance Computing Applications

A significant challenge on an exascale computer is that the speed at which we compute results exceeds, by many orders of magnitude, the speed at which we can save those results. The Exascale Computing Project (ECP) ALPINE project therefore focuses on providing exascale-ready visualization solutions, including in situ processing. In situ visualization and analysis runs while the simulation runs, operating on simulation results as they are generated and avoiding the need to save entire simulations to storage for later analysis. The ALPINE project made the post hoc visualization tools ParaView and VisIt exascale ready and developed in situ algorithms and infrastructure. The suite of ALPINE algorithms developed under ECP includes novel approaches that enable automated data analysis and visualization to focus on the most important aspects of the simulation. Many of the algorithms also provide data reduction benefits to meet the I/O challenges at exascale. ALPINE also developed a new lightweight in situ infrastructure, Ascent.


Fig. 3: Overview of our proposed workflow for multi-resolution scientific data compression.
Fig. 14: Visualization of original data, decompressed data (generated by our workflow using ZFP, CR = 240), and decompressed data with uncertainty; the cyan/green box highlights the missing/cracking isosurface.
A High-Quality Workflow for Multi-Resolution Scientific Data Reduction and Visualization

July 2024 · 68 Reads

Multi-resolution methods such as Adaptive Mesh Refinement (AMR) can enhance storage efficiency for HPC applications that generate vast volumes of data. However, their applicability is limited: they cannot be universally deployed across all applications. Furthermore, integrating lossy compression with multi-resolution techniques to further boost storage efficiency faces significant barriers. To this end, we introduce an innovative workflow that facilitates high-quality multi-resolution data compression for both uniform and AMR simulations. Initially, to extend the usability of multi-resolution techniques, our workflow employs a compression-oriented Region of Interest (ROI) extraction method that transforms uniform data into a multi-resolution format. Subsequently, to bridge the gap between multi-resolution techniques and lossy compressors, we optimize three distinct compressors, ensuring their optimal performance on multi-resolution data. Lastly, we incorporate an advanced uncertainty visualization method into our workflow to understand the potential impacts of lossy compression. Experimental evaluation demonstrates that our workflow achieves significant improvements in compression quality.
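The error-bounded lossy compression this workflow builds on can be illustrated by its simplest building block, uniform scalar quantization: every decoded value is guaranteed to lie within a user-chosen error bound of the original. The sketch below is a minimal NumPy illustration of that guarantee, not the paper's actual compressors (which add prediction, transforms, and entropy coding):

```python
import numpy as np

def quantize(data, error_bound):
    """Uniformly quantize values so each decoded value is within
    error_bound of the original -- the pointwise guarantee that
    error-bounded compressors provide."""
    # A bin width of 2*error_bound keeps the round-trip error <= error_bound.
    return np.round(data / (2.0 * error_bound)).astype(np.int64)

def dequantize(codes, error_bound):
    return codes * (2.0 * error_bound)

field = np.random.default_rng(0).normal(size=(32, 32, 32))  # synthetic field
bound = 1e-2
decoded = dequantize(quantize(field, bound), bound)
# Pointwise error never exceeds the bound (up to float rounding).
assert np.max(np.abs(field - decoded)) <= bound + 1e-12
```

A real compressor would entropy-code the integer codes; the quantization step alone is what makes the error mathematically bounded.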





AMRIC: A Novel In Situ Lossy Compression Framework for Efficient I/O in Adaptive Mesh Refinement Applications

July 2023 · 74 Reads

As supercomputers advance toward exascale capabilities, computational intensity increases significantly and the volume of data requiring storage and transmission grows exponentially. Adaptive Mesh Refinement (AMR) has emerged as an effective solution to these two challenges, and error-bounded lossy compression is recognized as one of the most efficient approaches to the latter. Despite their respective advantages, few attempts have been made to investigate how AMR and error-bounded lossy compression can function together. To this end, this study presents AMRIC, a novel in situ lossy compression framework that employs an HDF5 filter to both reduce I/O costs and boost compression quality for AMR applications. We implement our solution in the AMReX framework and evaluate it on two real-world AMR applications, Nyx and WarpX, on the Summit supercomputer. Experiments with 4096 CPU cores demonstrate that AMRIC improves the compression ratio by up to 81X and I/O performance by up to 39X over AMReX's original compression solution.
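The idea of compressing an AMR hierarchy level by level, as AMRIC and related work do, can be sketched with a toy example: each refinement level is quantized within an error bound and then entropy-coded. The zlib stage here is a standard-library stand-in for the real compressor behind the HDF5 filter, and the two-level dictionary is a stand-in for an actual AMReX hierarchy:

```python
import zlib
import numpy as np

def compress_level(patch, error_bound):
    # Quantize within the error bound, then entropy-code the integer
    # codes with zlib (stand-in for a real error-bounded compressor).
    codes = np.round(patch / (2.0 * error_bound)).astype(np.int32)
    return zlib.compress(codes.tobytes()), codes.shape

def decompress_level(blob, shape, error_bound):
    codes = np.frombuffer(zlib.decompress(blob), dtype=np.int32).reshape(shape)
    return codes * (2.0 * error_bound)

# Toy AMR hierarchy: a coarse level plus one refined patch.
rng = np.random.default_rng(1)
amr = {0: rng.normal(size=(16, 16)), 1: rng.normal(size=(8, 8))}
bound = 1e-3

stored = {lvl: compress_level(p, bound) for lvl, p in amr.items()}
for lvl, (blob, shape) in stored.items():
    restored = decompress_level(blob, shape, bound)
    # Each level independently honors the error bound.
    assert np.max(np.abs(amr[lvl] - restored)) <= bound + 1e-12
```

Treating each level independently is what lets the compressor avoid mixing resolutions, one of the barriers the abstracts above describe.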


In Situ Analysis and Visualization of Extreme-Scale Particle Simulations

January 2023 · 30 Reads · 1 Citation

Lecture Notes in Computer Science

In situ analysis has emerged as a dominant paradigm for performing scalable visual analysis of extreme-scale computational simulation data. Compared to the traditional post hoc analysis pipeline, where data is first stored to disk and then analyzed offline, in situ analysis processes data on the supercomputer at the time of its generation, so that slow and expensive disk I/O is minimized. In this work, we present a new in situ visual analysis pipeline for the extreme-scale multiphase flow simulation MFiX-Exa and demonstrate how the pipeline can be used to process large particle fields in situ and produce informative visualizations of the data features. We deploy our analysis pipeline on Oak Ridge's Summit supercomputer to study its in situ applicability and usefulness.

Keywords: In situ analysis · Visualization · Feature detection · High performance computing · Computational science · Particle data
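The core in situ pattern the abstract describes, reducing each timestep's particle field to a small summary as it is generated rather than writing raw particles to disk, can be sketched as follows. The histogram reduction and the synthetic particle field are illustrative choices, not MFiX-Exa's actual analysis:

```python
import numpy as np

def in_situ_step(particle_values, bins, vmin, vmax):
    """Reduce one timestep's particle field to a fixed-size histogram
    in situ, so only the summary (not the raw particles) is retained."""
    hist, _ = np.histogram(particle_values, bins=bins, range=(vmin, vmax))
    return hist

rng = np.random.default_rng(2)
summaries = []
for step in range(5):                               # stand-in simulation loop
    particles = rng.gamma(2.0, 1.0, size=100_000)   # synthetic per-step field
    summaries.append(in_situ_step(particles, bins=64, vmin=0.0, vmax=20.0))

# Each 100k-particle timestep shrinks to a 64-bin summary.
assert all(s.shape == (64,) for s in summaries)
```

The payoff is the data-volume ratio: 100,000 values per step reduce to 64 bins, which is why the slow disk I/O the abstract mentions is minimized.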


Citations (71)


... For example, El-Rushaidat et al. [17] aimed to convert unstructured data into rectilinear grids, while Berger and Rigoutsos [28] further explored clustering and adaptive mesh refinement (AMR) algorithms to reduce grids into a non-uniform structure consisting of fewer rectangular patches. Various studies have also addressed compressing data in AMR form [29]-[31]. Our work is motivated by some of the mesh-to-grid data approximation techniques discussed in the aforementioned work but fundamentally differs in two key aspects: first, our primary objective is to reduce storage cost rather than accelerate analytic tasks such as visualization; second, we aim to maintain the general structure of the reduced data and mathematically bound the errors incurred during data compression. ...

Reference:

A General Framework for Error-controlled Unstructured Scientific Data Compression
Analyzing Impact of Data Reduction Techniques on Visualization for AMR Applications Using AMReX Framework

... Despite the existence of various AMR data compression solutions, none of the studies have comprehensively examined the impact of lossy compression on the visualization of AMR data. While there are studies that analyze the effects of data compression on non-AMR data visualization [31], the visualization of AMR data is more complex due to its hierarchical structure. ...

Analyzing the Impact of Lossy Data Reduction on Volume Rendering of Cosmology Data
  • Citing Conference Paper
  • November 2022

... Because all children in this cohort entered school at the same point in time, we cannot distinguish these two exposures and determine the separate effects of school vs. non-school exposure. In the United States, COVID-19 incidence has been shown to vary with the start and return to school following breaks, and thus these secular trends may also have been driven by school attendance [42][43][44]. In particular, the initial wave was generally coincident with the start of the school year in February, with a second smaller peak occurring around the typical winter break in July. ...

Assessing K-12 School Reopenings Under Different COVID-19 Spread Scenarios – United States, School Year 2020/21: A Retrospective Modeling Study

Epidemics

... Wang et al. [50] used a similar approach proposing an error-bounded lossy compression approach for 3D AMR data which was later extended in [51]. The authors compressed each refinement level of the AMR data separately. ...

TAC: Optimizing Error-Bounded Lossy Compression for Three-Dimensional Adaptive Mesh Refinement Simulations

... Therefore, the histogram+gradient-based version is generally better at retaining important features of the data than the histogram-only version of the sampling algorithm. The interested reader is directed to Biswas et al. (2021, 2022) for further information. ...

Sampling for Scientific Data Analysis and Reduction
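A rough sketch of histogram+gradient-based importance sampling, assuming one plausible weighting (rarity of a value's histogram bin combined with local gradient magnitude); the exact weighting used by Biswas et al. differs, so treat this only as an illustration of the idea:

```python
import numpy as np

def sample_indices(field, n_samples, n_bins=32, rng=None):
    """Pick sample locations favoring rare values (histogram term)
    and sharp features (gradient term)."""
    rng = rng or np.random.default_rng(0)
    flat = field.ravel()
    hist, edges = np.histogram(flat, bins=n_bins)
    bin_of = np.clip(np.digitize(flat, edges[1:-1]), 0, n_bins - 1)
    rarity = 1.0 / (hist[bin_of] + 1.0)          # rare values score high
    grad = np.abs(np.gradient(field)).sum(axis=0).ravel()  # feature strength
    weight = rarity * (1.0 + grad)               # illustrative combination
    p = weight / weight.sum()
    return rng.choice(flat.size, size=n_samples, replace=False, p=p)

# A smooth 2D ramp: samples concentrate where values are rare or changing.
field = np.linspace(0.0, 1.0, 64)[:, None] * np.ones((64, 64))
idx = sample_indices(field, n_samples=100)
assert idx.size == 100 and np.unique(idx).size == 100
```

Keeping the sampled indices plus values (instead of the full field) is the data-reduction step; reconstruction quality then depends on how well the weighting captured the important features.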

... To illustrate how pyDNMF-GPU can be used as a building block for more comprehensive workflows, we integrate pyDNMF-GPU with our existing model selection algorithm pyDNMFk, which enables automatic determination of the (usually unknown) number of latent features in large-scale datasets [4][5][6][7][8]. We previously used the integrated model selection algorithm to decompose the world's largest collection of human cancer genomes [9], defining cancer mutational signatures [10], and have successfully applied it to solve real-world problems in various fields [8,[11][12][13][14][15][16][17][18][19]. ...

Selection of Optimal Salient Time Steps by Non-negative Tucker Tensor Decomposition

... Because most machine learning methods are currently black-box models, they have no physical mechanism to constrain their estimates (Banesh et al., 2021). In our fused framework, the input variable selection is based on experience and previous studies. ...

An Image-Based Framework for Ocean Feature Detection and Analysis

Journal of Geovisualization and Spatial Analysis