Jesus Pulido
  • University of California, Davis

About

24 Publications
2,622 Reads
250 Citations
Current institution: University of California, Davis

Publications (24)
Preprint
Full-text available
Multi-resolution methods such as Adaptive Mesh Refinement (AMR) can enhance storage efficiency for HPC applications generating vast volumes of data. However, their applicability is limited: they cannot be deployed universally across all applications. Furthermore, integrating lossy compression with multi-resolution techniques to further boost storage...
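As a rough illustration of the multi-resolution idea, the sketch below uses an invented refinement criterion (local standard deviation), not the paper's scheme: fine-resolution blocks are kept only where the field varies strongly, and a single coarsened value is stored elsewhere.

```python
# Illustrative only: block-wise multi-resolution reduction with an invented
# refinement criterion, not the paper's method.
import numpy as np

def amr_reduce(field, block=8, threshold=0.1):
    """Keep full-resolution blocks where variation is high; store means elsewhere."""
    out = {}
    ny, nx = field.shape
    for j in range(0, ny, block):
        for i in range(0, nx, block):
            tile = field[j:j + block, i:i + block]
            if tile.std() > threshold:   # "refine": keep the fine data
                out[(j, i)] = tile.copy()
            else:                        # "coarsen": one value per block
                out[(j, i)] = tile.mean()
    return out

rng = np.random.default_rng(0)
field = rng.normal(size=(64, 64)) * np.hanning(64)   # noisy field, smooth envelope
reduced = amr_reduce(field)
stored = sum(np.size(v) for v in reduced.values())
print(f"stored values: {stored} / {field.size}")
```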
Preprint
Full-text available
As supercomputers advance towards exascale capabilities, computational intensity increases significantly and the volume of data requiring storage and transmission grows exponentially. Adaptive Mesh Refinement (AMR) has emerged as an effective solution to address these two challenges. Concurrently, error-bounded lossy compression is reco...
Preprint
Full-text available
Today's scientific simulations require a significant reduction of data volume because of the extremely large amounts of data they produce and the limited I/O bandwidth and storage space. Error-bounded lossy compression has been considered one of the most effective solutions to the above problem. However, little work has been done to improve error-bound...
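For readers unfamiliar with the error-bounded setting, a minimal sketch (generic uniform quantization, not any of the compressors studied in the paper) shows how an absolute error bound eb can be enforced before a lossless entropy-coding stage:

```python
# Generic error-bounded quantization, not the paper's compressor: a bin
# width of 2*eb guarantees |x - x_hat| <= eb; the integer codes would
# normally be passed on to a lossless entropy coder.
import numpy as np

def quantize(data, eb):
    return np.round(data / (2.0 * eb)).astype(np.int64)

def dequantize(codes, eb):
    return codes * (2.0 * eb)

rng = np.random.default_rng(1)
data = np.cumsum(rng.normal(size=100_000))   # smooth, simulation-like signal
eb = 1e-2                                    # absolute error bound
recon = dequantize(quantize(data, eb), eb)
print("max abs error:", np.max(np.abs(data - recon)), "<= eb =", eb)
```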
Article
Today's scientific simulations require significant data volume reduction because of the enormous amounts of data produced and the limited I/O bandwidth and storage space. Error-bounded lossy compression has been considered one of the most effective solutions to the above problem. However, little work has been done to improve error-bounded lossy com...
Conference Paper
Full-text available
Today's scientific simulations require a significant reduction of data volume because of the extremely large amounts of data they produce and the limited I/O bandwidth and storage space. Error-bounded lossy compression has been considered one of the most effective solutions to the above problem. However, little work has been done to improve error-bound...
Preprint
Full-text available
Today's scientific simulations require a significant reduction of data volume because of the extremely large amounts of data they produce and the limited I/O bandwidth and storage space. Error-bounded lossy compression has been considered one of the most effective solutions to the above problem. However, little work has been done to improve error-bound...
Conference Paper
Full-text available
Extreme-scale cosmological simulations have been widely used by today’s researchers and scientists on leadership supercomputers. A new generation of error-bounded lossy compressors has been used in workflows to reduce storage requirements and minimize the impact of throughput limitations while saving large snapshots of high-fidelity data for post-h...
Preprint
Full-text available
Extreme-scale cosmological simulations have been widely used by today's researchers and scientists on leadership supercomputers. A new generation of error-bounded lossy compressors has been used in workflows to reduce storage requirements and minimize the impact of throughput limitations while saving large snapshots of high-fidelity data for post-h...
Article
During large-scale simulations, intermediate data products such as image databases have become popular due to their low relative storage cost and fast in-situ analysis. Serving as a form of data reduction, these image databases have become an increasingly accepted basis for data analysis. We present an image-space detection and classification system for...
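A hedged sketch of the image-space idea on synthetic data, using generic thresholding plus connected-component labeling rather than the system described in the paper:

```python
# Synthetic example: threshold an image product, label connected components,
# and classify them by area. The paper's actual detector and classifier are
# not reproduced here.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
img = ndimage.gaussian_filter(rng.random((256, 256)), sigma=4)
mask = img > img.mean() + img.std()                  # candidate features
labels, n = ndimage.label(mask)
areas = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
large = int((areas > 50).sum())
print(f"{n} features detected; {large} classified as 'large'")
```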
Conference Paper
Full-text available
To help understand our universe better, researchers and scientists currently run extreme-scale cosmology simulations on leadership supercomputers. However, such simulations can generate large amounts of scientific data, which often results in high costs for data movement and storage. Lossy compression techniques have become...
Preprint
Full-text available
To help understand our universe better, researchers and scientists currently run extreme-scale cosmology simulations on leadership supercomputers. However, such simulations can generate large amounts of scientific data, which often results in high costs for data movement and storage. Lossy compression techniques have become...
Article
As more advanced and complex survey telescopes are developed, the size and scale of the data being captured grow at increasing rates. Across various domains, data compression through wavelets has enabled reductions in data size and gains in computational efficiency. In this paper, we provide qualitative and quantitative tests of a new wavelet-base...
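A minimal wavelet-thresholding sketch with PyWavelets (an assumed stand-in; the paper's codec, wavelet choice, and parameters are not reproduced):

```python
# Illustrative wavelet compression: transform, zero out small coefficients,
# and reconstruct. "db4" and the 95% threshold are assumptions.
import numpy as np
import pywt

y, x = np.mgrid[0:512, 0:512]
image = np.exp(-((x - 200.0) ** 2 + (y - 300.0) ** 2) / (2 * 40.0 ** 2))  # smooth "source"
image += 0.01 * np.random.default_rng(3).normal(size=image.shape)

coeffs = pywt.wavedec2(image, "db4", level=4)
arr, slices = pywt.coeffs_to_array(coeffs)
thresh = np.quantile(np.abs(arr), 0.95)              # keep ~5% of coefficients
arr[np.abs(arr) < thresh] = 0.0
recon = pywt.waverec2(pywt.array_to_coeffs(arr, slices, output_format="wavedec2"), "db4")
err = np.linalg.norm(image - recon) / np.linalg.norm(image)
print(f"kept {np.count_nonzero(arr)} coefficients, relative L2 error {err:.4f}")
```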
Article
Full-text available
This paper concerns the use of compression methods applied to large scientific data. Specifically, the paper addresses the effect of lossy compression on approximation error. Computer simulations, experiments, and imaging technologies generate terabyte-scale datasets, necessitating new approaches that couple compression with data analysis. Lossless...
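One way to make the effect on approximation error concrete is to measure both a pointwise metric and a derived analysis quantity after a lossy step; the sketch below uses a generic quantizer and generic metrics, not those used in the paper:

```python
# Generic metrics on a generic lossy step (uniform quantization); these are
# stand-ins, not the paper's compressors or error measures.
import numpy as np

def psnr(a, b):
    mse = np.mean((a - b) ** 2)
    return 10.0 * np.log10((a.max() - a.min()) ** 2 / mse)

rng = np.random.default_rng(4)
field = np.cumsum(np.cumsum(rng.normal(size=(256, 256)), axis=0), axis=1)
eb = 0.5
lossy = np.round(field / (2 * eb)) * (2 * eb)        # error-bounded surrogate

gy, gx = np.gradient(field)
gy2, gx2 = np.gradient(lossy)
g_true = np.sum(gy ** 2 + gx ** 2)                   # a derived analysis quantity
g_lossy = np.sum(gy2 ** 2 + gx2 ** 2)
print(f"PSNR {psnr(field, lossy):.1f} dB; "
      f"gradient-energy drift {abs(g_true - g_lossy) / g_true:.2%}")
```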
Conference Paper
Recent advancements in high-performance computing have enabled scientists to model various scientific phenomena in great detail. However, the analysis and visualization of the output data from such large-scale simulations are posing significant challenges due to their excessive size and disk I/O bottlenecks. One viable solution to this problem is t...
Article
The remote analysis and visualization of large raw turbulence datasets is challenging. Current accurate direct numerical simulations (DNS) of turbulent flows generate datasets with billions of points per time-step and several thousand time-steps per simulation. Until recently, the analysis and visualization of such datasets were restricted to scient...
Article
This paper compares the effectiveness of various multi-resolution geometric representation methods, such as B-spline, Daubechies, Coiflet and Dual-tree wavelets, curvelets and surfacelets, at capturing the structure of fully developed turbulence using a truncated set of coefficients. The turbulence dataset is obtained from a Direct Numerical Simulati...
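In the same spirit, a small PyWavelets experiment (limited to wavelet bases; the curvelets and surfacelets from the paper are omitted) compares the reconstruction error that remains when only the largest 1% of coefficients per basis are kept:

```python
# Keep the top 1% of coefficients per basis and compare reconstruction
# errors; the basis list and budget are assumptions, not the paper's setup.
import numpy as np
import pywt

rng = np.random.default_rng(5)
field = np.cumsum(np.cumsum(rng.normal(size=(256, 256)), axis=0), axis=1)

for wav in ["db1", "db4", "coif2"]:
    arr, slices = pywt.coeffs_to_array(pywt.wavedec2(field, wav, level=4))
    k = max(1, int(0.01 * arr.size))
    cutoff = np.partition(np.abs(arr).ravel(), -k)[-k]   # k-th largest magnitude
    arr[np.abs(arr) < cutoff] = 0.0
    recon = pywt.waverec2(pywt.array_to_coeffs(arr, slices, output_format="wavedec2"), wav)
    err = np.linalg.norm(field - recon) / np.linalg.norm(field)
    print(f"{wav}: relative error {err:.4f} with {k} coefficients kept")
```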
Article
Full-text available
This paper introduces an improved method for detecting objects of interest (galaxies and stars) in astronomical images. After a global detection scheme is applied, the result is refined by dividing the entire image into several irregularly sized sub-regions using the watershed segmentation method. A more refined detection procedure is perform...
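A hedged scikit-image sketch of the watershed refinement step (synthetic image; the paper's detection pipeline and parameters are not reproduced): a crude global mask is split into per-source sub-regions seeded at local maxima.

```python
# Synthetic example; the global detection scheme and refinement parameters
# here are assumptions.
import numpy as np
from scipy import ndimage
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

rng = np.random.default_rng(6)
img = ndimage.gaussian_filter(rng.random((256, 256)), sigma=6)
mask = img > np.percentile(img, 90)                  # crude global detection
peaks = peak_local_max(img, min_distance=10, labels=mask.astype(int))
markers = np.zeros(img.shape, dtype=int)
markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
regions = watershed(-img, markers, mask=mask)        # one sub-region per peak
print("irregular sub-regions:", regions.max())
```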
Article
A central problem in image processing and computer vision is the computation of corresponding interest points in a given set of images. Usually, interest points are considered as independent elements described by some local information. Due to the limitations of such an approach, many incorrect correspondences can be obtained. A specific contributi...
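To illustrate why purely local matching produces incorrect correspondences and how a global constraint prunes them, here is a generic OpenCV sketch (hypothetical input files view1.png and view2.png; a RANSAC homography stands in for the paper's specific contribution):

```python
# Generic illustration, not the paper's method: local descriptor matching
# yields outliers; a global geometric model (RANSAC homography) prunes them.
import cv2
import numpy as np

img1 = cv2.imread("view1.png", cv2.IMREAD_GRAYSCALE)  # hypothetical inputs
img2 = cv2.imread("view2.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(2000)
k1, d1 = orb.detectAndCompute(img1, None)
k2, d2 = orb.detectAndCompute(img2, None)
matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)

src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
print(f"{int(inlier_mask.sum())} / {len(matches)} correspondences survive the global check")
```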
Conference Paper
Processing images of underwater environments of Antarctic lakes is challenging due to poor lighting conditions, low saturation and noise. This paper presents a novel pipeline for dense point cloud scene reconstruction from underwater stereo images and video obtained with low-cost consumer recording hardware. Features in stereo frames are selected a...
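A minimal OpenCV sketch of the disparity-to-point-cloud step (hypothetical file names and assumed calibration values; the paper's low-light preprocessing and feature selection are not reproduced):

```python
# Hypothetical frames and assumed intrinsics (fx in pixels, baseline in
# meters); SGBM parameters are illustrative defaults.
import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128,
                             blockSize=5, P1=200, P2=800)
disp = sgbm.compute(left, right).astype(np.float32) / 16.0   # fixed-point scale

fx, baseline = 700.0, 0.12                   # assumed calibration
h, w = disp.shape
v, u = np.mgrid[0:h, 0:w]
valid = disp > 0
z = fx * baseline / disp[valid]              # depth from disparity
x = (u[valid] - w / 2.0) * z / fx
y = (v[valid] - h / 2.0) * z / fx
cloud = np.column_stack([x, y, z])           # dense point cloud, N x 3
print("reconstructed points:", len(cloud))
```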
