Figure 1: Image tile pyramid with 5 levels. In this example, a 16 MP image is tiled into 341 sub-images of 256 × 256 pixels each.
Source publication
Across all disciplines that work with image data - from astrophysics to medical research and historic preservation - there is a growing need for efficient ways to browse and inspect large sets of high-resolution images. We present the development of visualization software for solar physics data based on the JPEG 2000 image compression standard. O...
Contexts in source publication
Context 1
... images. Its goal is to help scientists discover new phenomena and link related data sets from various instruments that are currently often analyzed in isolation. In addition, it will make a huge amount of information available to the general public by visualizing it in intuitive and appealing ways. The current number of data browsing tools is limited, and each offers very specific functionality. For example, SOHO's web-accessible Near Real-time Image Browser [1] serves over twelve years of heliophysics data as JPEG images in two sizes: 1024 × 1024 and 512 × 512 pixels. The SOHO Movie Theater [2] uses the same data to create on-screen animations with basic movie control functionality. The Solar Weather Browser [3] allows quick browsing of highly compressed data and can handle up to two overlays. All of these widely used tools work well with current data volumes and meet many of the current browsing needs, but will be severely challenged in the immediate future. To address the new challenges, JHelioviewer has been built to provide an integrated solution for image encoding, storage and dissemination. Its advanced browsing capabilities (e.g. movies with arbitrary time cadence, unlimited overlays, image processing, event data integration, etc.) underscore the superiority and novelty of this approach.

The focus of this paper is the application of an image compression technique that is relatively new, yet established and well known in the digital image processing community, to the solar physics community's challenge of coping with the ever-growing amount of available data. While the impetus for this article is handling solar physics data, similar requirements for accessing, browsing and searching image data exist in other areas such as the earth sciences and medical research [4, 5, 6]. In the following sections we will compare the hierarchical structure of JPEG 2000 files to the popular image tiling approach for rendering and visualizing large image data sets over networks, and give a brief overview of the JPEG 2000 compression standard. We will then outline the specific requirements that led to the development of JHelioviewer and describe the architecture and implementation of the system. We will conclude by giving an outlook on possible future uses and extensions of the project.

A popular technique to handle image rendering and visualization of large data sets over networks is image tiling. For this method, each original image is divided into sub-images, or tiles, at various resolution (zoom) levels, thus creating a pyramid of image tiles for each image (Figure 1). Used by many providers of geospatial data, e.g. Google™ and MapQuest™, this approach has the advantage of transferring only the data for the chosen region of interest (ROI) and zoom level as image tiles from the server to the client. While this method works well for data repositories with only a few files, it has ...
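To make the tiling approach concrete, the following minimal Python sketch builds such a pyramid with Pillow. It is an illustration of the general technique, not code from the paper; the tile size, file naming scheme and halving-per-level convention are my assumptions.

```python
# Minimal tile-pyramid generator illustrating the approach described above.
# Assumes a square source image whose side length is a multiple of TILE
# at every level (e.g. 4096 x 4096 with 5 levels).
from PIL import Image

TILE = 256  # tile edge length in pixels

def build_pyramid(path: str, levels: int, out_prefix: str = "tile") -> int:
    """Write a tile pyramid: level 0 is the full image, and each coarser
    level halves both dimensions, so one tile covers four finer tiles."""
    img = Image.open(path)
    count = 0
    for level in range(levels):
        scale = 2 ** level
        w, h = img.width // scale, img.height // scale
        reduced = img.resize((w, h))
        for ty in range(0, h, TILE):
            for tx in range(0, w, TILE):
                tile = reduced.crop((tx, ty, tx + TILE, ty + TILE))
                tile.save(f"{out_prefix}_L{level}_{tx // TILE}_{ty // TILE}.jpg")
                count += 1
    return count

# For a 4096 x 4096 (16 MP) image and 5 levels this writes
# 256 + 64 + 16 + 4 + 1 = 341 files, matching the caption of Figure 1.
```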
Context 2
... disadvantages when the number of files increases.

• The number of tiles grows geometrically with the number of resolution levels: it is given by the finite geometric series $\sum_{k=0}^{n-1} z^k = (1 - z^n)/(1 - z)$, where z is the number of tiles each sub-image is divided into to create the next resolution level, and n is the number of resolution levels. Even for a modest image size of 16 MP divided into tiles of 256 × 256 pixels at five resolution levels, the tiling approach increases the number of files to be stored by a factor of 341 and the total data volume by at least about a factor of two (the total overhead in file size depends on the compression rate and the image content). SDO's Atmospheric Imaging Assembly (AIA) instrument will take 16 MP images of the Sun in 8 spectral channels at least every 10 seconds, i.e. on the order of 70,000 images per day or 30 million files per year. For data sets of this magnitude the number of tiles is staggering. Even for a modest fraction of the data, generating tiles becomes prohibitive.

• Typical use cases for image browsing involve repeated zooming in and out of different ROIs. For each zoom level, a new set of image tiles has to be transferred from the server to the client. This method uses significantly more network bandwidth than necessary, as it fails to exploit the fact that part of the information contained in the image has already been transferred at a different zoom level.

Clearly, image tiling is not a sufficient solution. A method that eliminates the need for tiling is the discrete wavelet transform (DWT), an image transform that is the basis of the JPEG 2000 image compression scheme. The JPEG 2000 standard [7] was created by the Joint Photographic Experts Group (JPEG) with the intention of improving upon, and superseding, the very successful JPEG standard [8] that has been in use for almost 20 years. It is a novel image compression ISO standard that offers both a lossless and a lossy compression mode and provides many new features, which make it a promising format for handling massive amounts of image data and associated metadata. Images have to be encoded only once at the highest desired quality and can subsequently be decoded in many ways to extract subfield images with a chosen spatial resolution, level of detail and region of interest. This offers significant advantages compared to storing multiple versions of images or tiles for different resolution levels and drastically reduces the size and complexity of storage and network transmission requirements. Figure 2 shows the JPEG 2000 equivalent of the image tile pyramid of Figure 1: Level 1 (L1) represents the image encoded at the highest quality and ...
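The tile-count formula above is easy to sanity-check; the following pure-Python snippet (variable names are mine) reproduces the factor of 341 from Figure 1 and scales it to the quoted AIA image rate:

```python
# Sanity check of the finite geometric series for the tile count.
# z is the branching factor (4 when each tile splits into 2 x 2 tiles at
# the next finer level); n is the number of resolution levels.
def tile_count(z: int, n: int) -> int:
    return (1 - z ** n) // (1 - z)  # equals sum(z**k for k in range(n))

assert tile_count(4, 5) == sum(4 ** k for k in range(5)) == 341

# Scale of the problem quoted above: roughly 70,000 AIA images per day.
print(70_000 * tile_count(4, 5))  # about 23.9 million tiles per day
```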
Similar publications
Data and Information Access Link (DIAL) is a web-based software package for remote sensing and geo-spatial data applications. Using DIAL, scientists can set up a web-based data server, organize data, build a metadata catalog and distribute data. Users can access DIAL with a WWW browser. DIAL has implemented the EOS Data Gateway (EDG) and CIP protocols th...
The next-generation astronomy digital archives will cover most of the sky at fine resolution in many wavelengths, from X-rays through ultraviolet, optical, and infrared. The archives will be stored at diverse geographical locations. One of the first of these projects, the Sloan Digital Sky Survey (SDSS), is creating a 5-wavelength catalog over 10,0...
The data sharing system for the resource and environment science databases of the Chinese Academy of Sciences (CAS) has an open three-tiered architecture, which integrates the geographical databases of about 9 CAS institutes through mechanisms of distributed unstructured data management, metadata integration, catalogue services, and security control...
Since 1986 we have investigated the problems and possibilities of applying modern information retrieval methods to large online public access library catalogs (OPACs). In the Retrieval Experiment—Virginia Tech OnLine Catalog (REVTOLC) study we carried out a large pilot test in 1987 and a larger, controlled investigation in 1990, with 216 users and...
We test the feasibility of rapidly detecting and characterizing earthquakes with the Quake-Catcher Network (QCN) that connects low-cost microelectromechanical systems accelerometers to a network of volunteer-owned, Internet-connected computers. Following the 3 September 2010 M 7.2 Darfield, New Zealand, earthquake we installed over 180 QCN sensors...
Citations
... In the framework of the Feature Finding Team project (FFT; Martens et al. 2012), the SDO Event Detection System (SDO EDS; Hurlburt et al. 2012) runs the SPoCA-suite to extract coronal hole information from AIA images in the 19.3 nm passband, and upload the entries every four hours to the Heliophysics Events Knowledgebase (Hurlburt et al. 2012). These entries are searchable through the graphical interface iSolSearch, the ontology software package of IDL Solarsoft, and the JHelioviewer visualization tool (Müller et al. 2009). ...
We demonstrate the use of machine learning algorithms in combination with segmentation techniques to distinguish coronal holes and filaments in SDO/AIA EUV images of the Sun. Based on two coronal hole detection techniques (intensity-based thresholding, SPoCA), we prepared data sets of manually labeled coronal hole and filament channel regions present on the Sun during the time range 2011–2013. By mapping the extracted regions from EUV observations onto HMI line-of-sight magnetograms, we also include their magnetic characteristics. We computed shape measures from the segmented binary maps as well as first-order and second-order texture statistics from the segmented regions in the EUV images and magnetograms. These attributes were used for data mining investigations to identify the best-performing rule to differentiate between coronal holes and filament channels. We applied several classifiers, namely Support Vector Machine (SVM), Linear SVM, Decision Tree, and Random Forest, and found that all classification rules achieve good results in general, with linear SVM providing the best performance (with a true skill statistic of ~0.90). Additional information from magnetic field data systematically improves the performance across all four classifiers for the SPoCA detection. Since the calculation is inexpensive in computing time, this approach is well suited for applications on real-time data. This study demonstrates how a machine learning approach may help improve upon an unsupervised feature extraction method.
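As a rough illustration of the classification step this abstract describes (not the authors' actual pipeline; the features, labels and sizes below are synthetic placeholders), a linear SVM over precomputed region attributes could be set up with scikit-learn as follows, with the true skill statistic quoted above computed from the confusion matrix:

```python
# Illustrative linear-SVM classification of coronal holes vs. filament
# channels from precomputed shape/texture/magnetic features.
# X and y are synthetic stand-ins for the paper's real data.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))       # 12 attributes per segmented region
y = rng.integers(0, 2, size=200)     # 0 = filament channel, 1 = coronal hole

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = make_pipeline(StandardScaler(), LinearSVC())
clf.fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
tss = tp / (tp + fn) + tn / (tn + fp) - 1  # true skill statistic
print(f"TSS = {tss:.2f}")  # ~0.90 is reported in the paper for real features
```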
... Downloading, browsing and analyzing significant areas of interest within these data volumes on a remote server is not easy, simply because these processes overload the existing Internet and network infrastructure. From a scientist's viewpoint, retrieving large data volumes from even a few repositories and dealing with immobile data sets poses the problems of searching, browsing and extracting interesting images while avoiding the needle-in-a-haystack problem explained in [1]. ...
... JHelioviewer, a JPEG 2000-based visualization and discovery software for SOHO image data, was developed by Müller et al. [1]. It provides remote access to compressed images as a client-server application. ...
Nothing is more important to us on Earth than the Sun. Without the heat and light of the Sun, life as we know it could not exist on Earth. The Sun exhibits phenomena on different spatial scales, timescales and wavelength ranges. Recent solar missions have increased the rate of solar data available for study, which presents both opportunities and challenges. Several satellites have been launched to observe the Sun, such as STEREO (Solar TErrestrial RElations Observatory) and SDO (Solar Dynamics Observatory). STEREO and SDO provide full-disk images of the Sun at different cadence rates in different wavelengths, with maximum resolutions of 2048 × 2048 and 4096 × 4096 pixels, respectively. The STEREO mission combines two spacecraft orbiting the Sun to provide simultaneous views from widely separated locations. SDO aims to study the solar atmosphere on small spatial and temporal scales and in many wavelengths. The STEREO and SDO missions produce huge volumes of data per day, so it is not easy to download, browse and analyze significant areas of interest within these data volumes on a remote server, simply because these processes overload the existing Internet and network infrastructures. In this paper, a tool for visualizing and analyzing STEREO and SDO data is introduced. The aim of this tool is to help scientists discover new phenomena and link related data sets from various instruments that are often analyzed in isolation. The proposed tool offers a number of useful image processing operations tailored to solar images, such as segmentation of active region(s), creating anaglyphs, extracting the solar limb, tracking solar events, etc.
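One of the listed operations, solar limb extraction, can be sketched with simple intensity thresholding; this is a hedged stand-in for whatever algorithm the proposed tool actually uses, and the threshold fraction and synthetic test image are my assumptions:

```python
# Toy solar limb extraction: mark on-disk pixels that border off-disk
# pixels. A stand-in illustration, not the proposed tool's algorithm.
import numpy as np

def limb_mask(img: np.ndarray, frac: float = 0.2) -> np.ndarray:
    """Boolean mask of limb pixels; `frac` sets the disk threshold as a
    fraction of the maximum intensity."""
    disk = img > frac * img.max()
    padded = np.pad(disk, 1, constant_values=False)
    # A limb pixel is on the disk with at least one off-disk 4-neighbor.
    off_neighbor = (~padded[:-2, 1:-1] | ~padded[2:, 1:-1] |
                    ~padded[1:-1, :-2] | ~padded[1:-1, 2:])
    return disk & off_neighbor

# Synthetic full-disk test image: a bright circle of radius 200 pixels.
yy, xx = np.mgrid[:512, :512]
sun = (((xx - 256) ** 2 + (yy - 256) ** 2) < 200 ** 2).astype(float)
print(limb_mask(sun).sum(), "limb pixels")  # roughly 2 * pi * 200
```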
... SDO produces 16-megapixel images every 10 seconds. This corresponds to 1.4 terabytes of information per day [4]. ...
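The quoted rate is easy to cross-check with back-of-the-envelope arithmetic; the 16-bit pixel depth and the absence of compression below are my simplifying assumptions, while the 16 MP size, 8 channels and 10 s cadence come from the source publication:

```python
# Back-of-the-envelope check of the SDO/AIA data rate quoted above.
images_per_day = 8 * 24 * 3600 // 10   # 8 channels, one image per 10 s
bytes_per_image = 4096 * 4096 * 2      # 16 MP at an assumed 16 bits/pixel
raw_tb_per_day = images_per_day * bytes_per_image / 1e12
print(f"{images_per_day} images/day, {raw_tb_per_day:.1f} TB/day raw")
# ~69,000 images/day and ~2.3 TB/day uncompressed, the same order of
# magnitude as the ~1.4 TB/day of (compressed) science data quoted above.
```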
There is a huge amount of information generated by telescopes in the form of astronomical images, and software and hardware can process these images to find new phenomena and obtain new knowledge about space. Considering the need for rapid processing of those images, we present Gaspra, a software package for astronomical image processing on high-performance computing clusters that uses shared and distributed memory systems. We designed Gaspra to batch-process large sets of astronomical images, allowing researchers to create scientific workflows to obtain new knowledge from these astronomical image data sets. Experiments with Gaspra show a 3.5-fold speedup when processing a single image on 5 processing nodes, each node supporting 64 threads.
This article compiles and examines a comprehensive coronal magnetic-null-point survey created by potential-field-source-surface (PFSS) modeling and Solar Dynamics Observatory/Atmospheric Imaging Assembly (SDO/AIA) observations. The locations of 582 potential magnetic null points in the corona were predicted from the PFSS model between Carrington Rotations (CR) 2098 (June 2010) and 2139 (July 2013). These locations were manually inspected, using contrast-enhanced SDO/AIA images in 171 Å at the East and West solar limb, for structures associated with nulls. A Kolmogorov–Smirnov (K–S) test showed a statistically significant difference between observed and predicted latitudinal distributions of null points. This finding is explored further to show that the observability of null points could be affected by the Sun’s asymmetric hemisphere activity. Additional K–S tests show no effect on observability related to eigenvalues associated with the fan and spine structure surrounding null points or to the orientation of the spine. We find that approximately 31 % of nulls obtained from the PFSS model were observed in SDO/AIA images at one of the solar limbs. An observed null on the East solar limb had a 51.6 % chance of being observed on the West solar limb. Predicted null points going back to CR 1893 (March 1995) were also used for comparing radial and latitudinal distributions of nulls to previous work and to test for correlation of solar activity to the number of predicted nulls.
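For readers unfamiliar with the two-sample Kolmogorov–Smirnov test used here, a minimal SciPy example follows; the latitude samples are synthetic stand-ins for the survey's real distributions:

```python
# Two-sample K-S test comparing predicted vs. observed null-point
# latitudes. The samples are synthetic, not the survey's data.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(1)
predicted_lat = rng.uniform(-60, 60, size=582)  # 582 predicted nulls
observed_lat = rng.normal(0, 20, size=180)      # hypothetical observed set

stat, p = ks_2samp(predicted_lat, observed_lat)
print(f"K-S statistic = {stat:.3f}, p-value = {p:.3g}")
# A small p-value indicates a statistically significant difference, as
# the article reports for the latitudinal distributions.
```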
A key goal for space weather studies is to define severe and extreme conditions that might plausibly afflict human technology. On 23 July 2012, solar active region 1520 (~141°W heliographic longitude) gave rise to a powerful coronal mass ejection (CME) with an initial speed that was determined to be 2500 ± 500 km/s. The eruption was directed away from Earth toward 125°W longitude. STEREO-A sensors detected the CME arrival only about 19 h later and made in situ measurements of the solar wind and interplanetary magnetic field. In this paper, we address the question of what would have happened if this powerful interplanetary event had been Earthward directed. Using a well-proven geomagnetic storm forecast model, we find that the 23-24 July event would certainly have produced a geomagnetic storm comparable to the largest events of the twentieth century (Dst ~ -500 nT). Using plausible assumptions about the seasonal and time-of-day orientation of the Earth's magnetic dipole, the most extreme modeled value of storm-time disturbance would have been Dst = -1182 nT. This is considerably larger than estimates for the famous Carrington storm of 1859. This finding has far-reaching implications because it demonstrates that extreme space weather conditions such as those during March of 1989 or September of 1859 can happen even during a modest solar activity cycle such as the one presently underway. We argue that this extreme event should immediately be employed by the space weather community to model severe space weather effects on technological systems such as the electric power grid.