Wiley

Earth and Space Science

Published by Wiley and American Geophysical Union

Online ISSN: 2333-5084

Disciplines: Earth and space science

Journal website
Author guidelines

Top-read articles

129 reads in the past 30 days

(a) Seismicity in a regional box of size 10° latitude by 10° longitude centered on Los Angeles, CA. Large red circles represent earthquakes with magnitudes M > 6.9; smaller blue circles are earthquakes with M > 5.9. (b) Time series of earthquakes in that region since 1970 with magnitudes M > 3.29. The blue curve is the exponential moving average (EMA) with number of weights N = 36 (https://en.wikipedia.org/wiki/Moving_average#Exponential_moving_average (accessed 7/20/2022)). (c) Time series of the mean number μ(t) of small earthquakes as a function of time. The mean is taken beginning in 1960 and is also shown since 1970. (d) Optimized state variable time series Θ(t): the EMA of the small-earthquake counts, adjusted using the current mean number μ(2022) of small earthquakes and a constant of proportionality λ. (e) The N-value and λ-value are obtained by optimizing the Receiver Operating Characteristic skill, shown as the total area under the red curve. Skill for a random time series is the area under the diagonal line, so random skill = 0.5.
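The EMA smoothing described in this caption is straightforward to reproduce; a minimal sketch in Python, assuming the standard weighting α = 2/(N + 1) and using hypothetical monthly counts rather than the actual catalog:

```python
import numpy as np

def ema(counts, n):
    """Exponential moving average with N weights,
    using the standard convention alpha = 2 / (N + 1)."""
    alpha = 2.0 / (n + 1)
    out = np.empty(len(counts), dtype=float)
    out[0] = counts[0]  # seed with the first observation
    for i in range(1, len(counts)):
        out[i] = alpha * counts[i] + (1 - alpha) * out[i - 1]
    return out

# Hypothetical monthly counts of small (M > 3.29) earthquakes
monthly_counts = np.array([12, 8, 15, 30, 22, 9, 11, 14], dtype=float)
smoothed = ema(monthly_counts, n=36)
```

With N = 36 the average responds slowly, which is the point: isolated bursts of small-earthquake activity are damped into a smooth state variable.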
(a) The same Receiver Operating Characteristic (ROC) diagram as in Figure 1e for a future time window of T_W = 1 year. The ROC is the red curve, a plot of the true positive rate (TPR, hit rate) as a function of the false positive rate (false alarm rate). The diagonal line is the TPR for an ensemble of 50 random time series, each obtained from the state variable time series Θ(t) using a bootstrap procedure of random sampling with replacement. The ensemble of random time series is shown as the cyan curves grouped near the diagonal line. (b) The skill, as a function of the future time window T_W, for fixed EMA N-value and λ-value. (c) The skill index SKI defined in Equation 1, also as a function of T_W. (d) The Shannon information entropy, Equation 3, as a function of future time window T_W. Here the information is computed from the probability mass function associated with the ROC curve. The horizontal dashed line is the information entropy for the random ROC curve (diagonal line), assuming N = 200 threshold values.
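The ROC skill referred to here is the area under the TPR-versus-FPR curve obtained by sweeping a threshold over the state variable. A self-contained sketch on synthetic data (the state variable and labels below are randomly generated stand-ins, not the paper's catalog):

```python
import numpy as np

def roc_skill(state, labels, n_thresholds=200):
    """Area under the ROC curve from a threshold sweep over the
    state variable; labels mark time windows that contained a
    large event."""
    thresholds = np.linspace(state.min(), state.max(), n_thresholds)
    pos = labels.astype(bool)
    neg = ~pos
    tpr, fpr = [], []
    for th in thresholds[::-1]:  # high threshold -> low: FPR runs 0 -> 1
        pred = state >= th
        tpr.append(pred[pos].mean())
        fpr.append(pred[neg].mean())
    # trapezoidal area under the (FPR, TPR) curve
    area = 0.0
    for i in range(1, len(fpr)):
        area += 0.5 * (tpr[i] + tpr[i - 1]) * (fpr[i] - fpr[i - 1])
    return area

rng = np.random.default_rng(0)
state = rng.random(500)
labels = (state + 0.3 * rng.standard_normal(500) > 0.8).astype(int)
skill = roc_skill(state, labels)  # > 0.5 means better than random
```

Random (bootstrap) series scatter around 0.5 by construction, so the distance of the red curve's area above 0.5 is what carries the predictive-skill claim.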
(a) The optimized state variable as a function of time, an enlarged version of Figure 1d. (b) The Positive Predictive Value (PPV), or Precision. The red curve is the PPV for the state variable shown in (a), where the vertical axis is the threshold TH. The cyan lines represent the PPV for 50 random time series; the mean of these series is the solid black line, and 1σ confidence is shown as the dashed lines. (c) The red curve is the corresponding self-information I_self, Equation 2, on the horizontal axis as a function of the threshold value TH on the vertical axis. Again, the cyan curves are the self-information for the ensemble of 50 random time series, with mean (solid black line) and 1σ confidence as the dashed lines.
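Precision and self-information at a given threshold can be computed directly. The confusion counts below are hypothetical, and the form I_self = −log₂(p) is an assumption about the paper's Equation 2 (the standard Shannon self-information):

```python
import math

def ppv(tp, fp):
    """Positive predictive value (precision): TP / (TP + FP)."""
    return tp / (tp + fp)

def self_information(p):
    """Shannon self-information in bits: -log2(p). Assumed form
    of the paper's Equation 2."""
    return -math.log2(p)

# Hypothetical confusion counts at one threshold TH
precision = ppv(tp=12, fp=4)        # 0.75
bits = self_information(precision)  # ~0.415 bits
```

A rarer (lower-probability) successful alarm carries more self-information, which is why the red curve is compared against the random ensemble.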
Results of a long simulation of 183 large “earthquakes.” We have constructed a time series Θ_sim(t) using Equation 4, which yields results generally similar to those in Figures 1 and 3. (a) Time series Θ_sim(t) as a function of “time” in “months” on the left, with the associated Precision (PPV) on the right; compare to Figure 3. The vertical red lines at the bottom of the time series are the large “earthquakes,” and the dashed blue line is the derivative of the time series, representing the activity. (b) Receiver Operating Characteristic (ROC) curve for the time series, as discussed in the text. The area of 0.97 under the ROC curve (the skill) is larger than the no-skill value of 0.5, with a skill index of 99.7%, indicating very significant predictive skill. Cyan curves are the skill for 50 random time series. (c) Histogram of intervals between the 183 large “earthquakes.” (d) Cumulative interval statistics, obtained by integrating the histogram in (c). Also shown is the dashed curve for Poisson (exponential) statistics having the same mean as the time series.
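The comparison of cumulative interval statistics against a Poisson (exponential) law with the same mean can be reproduced on synthetic data; the intervals below are drawn from an exponential distribution purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
# Stand-in intervals ("months") between 183 large "earthquakes"
intervals = rng.exponential(scale=12.0, size=183)

# Empirical cumulative interval distribution vs. the exponential CDF
# with the same mean, as in the cumulative-statistics panel
t = np.sort(intervals)
empirical_cdf = np.arange(1, len(t) + 1) / len(t)
poisson_cdf = 1.0 - np.exp(-t / intervals.mean())
max_gap = np.max(np.abs(empirical_cdf - poisson_cdf))  # KS-style distance
```

For a genuinely Poisson process this gap stays small; a large gap would indicate clustering or quasi-periodicity in the simulated event intervals.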
Does the Catalog of California Earthquakes, With Aftershocks Included, Contain Information About Future Large Earthquakes?

February 2023

·

2,320 Reads

·

9 Citations

·

Andrea Donnellan

·

[...]

·

James Crutchfield

97 reads in the past 30 days

Ten Years of Earth and Space Science: Introduction to the Special Collection

January 2025

·

98 Reads

The journal Earth and Space Science (ESS) was founded in 2014 to offer the scientific community a new platform for the dissemination of key new data, observations, methods, instruments, and models, presented within the context of their application. Thus, the aim of the journal was (and is) to highlight the complexity and importance of experimental design, methodology, and data acquisition and processing, intertwined with data interpretation. Such an approach is consistent with the mission of most AGU journals, but the distinctive element for ESS is its focus on the useful impact of a publication, progressively replacing the focus on conventional publication metrics. In this context, the journal has been, since its inception, the preferred home for studies stemming from both global and local geoscience research. This special collection contains 16 papers published in ESS, selected by the Editorial Board to highlight the aims, scope, and path of evolution and growth of the journal since its inaugural issue in 2014.

Aims and scope


Earth and Space Science is an open access journal publishing original articles spanning all of the Earth, planetary and space sciences. ESS particularly welcomes papers presenting key data sets, observations, methods, instruments, sensors, and algorithms and showing their applications.

Recent articles


Radiation Belt Losses: The Long‐Standing Debate Part II
  • Article
  • Full-text available

February 2025

·

8 Reads

L. W. Blum

·

F. Staples

·

A. Y. Drozdov

·

[...]

·

X. Fu

On 21 June 2022, during the annual Geospace Environment Modeling (GEM) workshop, a panel discussion titled “Radiation Belt Loss: The Long‐Standing Debate Part II” was organized by the focus group “System Understanding of Radiation Belt Particle Dynamics.” The panel focused on unresolved questions regarding the mechanisms driving electron loss in Earth's radiation belts, discussing topics including magnetopause shadowing, outward radial transport, and wave‐particle interactions driving particle precipitation. In this commentary, we provide an overview of the outcomes of this discussion and highlight future needs to better resolve outstanding questions.


Improving Typhoon Predictions by Integrating Data‐Driven Machine Learning Model With Physics Model Based on the Spectral Nudging and Data Assimilation

February 2025

·

15 Reads

The rapid advancement of data-driven machine learning (ML) models has improved typhoon track forecasts, but challenges remain, such as underestimated typhoon intensity and a lack of interpretability. This study introduces an ML-driven hybrid typhoon model, in which Pangu forecasts constrain the Weather Research and Forecasting (WRF) model through spectral nudging. The results indicate that track forecasts from the WRF simulation nudged by Pangu forecasts significantly outperform those from the WRF simulation using the NCEP GFS initial field and those from the ECMWF IFS for Typhoon Doksuri (2023). In addition, the typhoon intensity forecasts from Pangu nudging are notably stronger than those from the ECMWF IFS, demonstrating that the hybrid model effectively leverages the strengths of both ML and physical models. Furthermore, this study is the first to explore the significance of data assimilation in an ML-driven hybrid typhoon model. The findings reveal that after assimilating water vapor channels from the FY-4B AGRI, the errors in typhoon intensity forecasts are significantly reduced.


Long‐Term Trend and Seasonal Cycles of Gap‐Free Downscaled Diurnal/Nocturnal LST and the Interaction to Functional Plant Trait Under Tropical Monsoon Climate

February 2025

·

17 Reads

Pham Viet Hoa

·

·

Giang Thi Phuong Thao

·

[...]

·

Nguyen Cao Hanh

Land surface temperature (LST) monitoring via Earth observation constellations is becoming optimized and consistent, with spatiotemporally explicit characteristics. However, scientific evidence for the interaction between LST and vegetation biophysical variables remains limited in large-scale spatial assessments and seamless long-term tracking. This study addresses this gap by utilizing gap-filled fine-spatial-resolution LST products to understand the dynamics over the period 2000–2023 and the spatiotemporal relationship with leaf area index (LAI). Firstly, Moderate Resolution Imaging Spectroradiometer (MODIS) LST at 1,000 m, for both daytime and nighttime, was downscaled to a finer resolution of 250 m using the Random Forest algorithm. The Whittaker algorithm was then applied to obtain gap-free LST products, owing to the typical cloud cover under a tropical monsoon climate. Time series decomposition of the gap-filled fine-resolution LST revealed slight warming trends in daytime (0.005°C year⁻¹), nighttime (0.036°C year⁻¹), and the mean of all-day time (0.02°C year⁻¹) over the recent 24 years, while the seasonal amplitude in daytime (−3.7°C to 4.8°C) fluctuates more than in nighttime (−2.5°C to 1.9°C). Spatial correlations of monthly LSTs and LAI indicated a consistent negative correlation (R ranging from −0.717 to −0.45). These findings shed light on the quantitative relationship between vegetation LAI and LST, contributing to a more unified theoretical framework for understanding functional vegetation responses under diverse climatic conditions.


Operation timelines of Vigil‐related missions from 1996 to 2036. It is indicated if the mission has instruments for imaging and/or in‐situ measurements. The years marked in yellow are the minimal known planned operational timelines.
Occurrence of great and severe geomagnetic storms, according to the thresholds specified in Table 4, in the timeframe 1996–2024. The visualization is consistent with the dates listed in Table 5. The counts represent the number of events per month. It is noted that ∼80% of the considered extreme events occurred by the year 2006 and were observed only by the Solar and Heliospheric Observatory, WIND, and the Advanced Composition Explorer (refer to Figure 1).
Vigil-like view of an extreme space weather event: example dashboard of images taken by the Solar and Heliospheric Observatory (SOHO)/Michelson Doppler Imager, EIT, and LASCO imagers and in-situ data measured at L1 by SOHO/CELIAS and WIND/MFI for the event indexed as 21 in Table 5. The displayed time is in UTC. The full movie of data from the 20 days around the 20 November 2003 event is available online at our YouTube playlist of 4 great space weather events (https://www.youtube.com/playlist?list=PLNAJsgS6RlzgnhvJzieUyCZP9tV9‐bYPr).
Left: Overlap of 35 images of moments when the most extreme SWE events were produced, as captured by Solar and Heliospheric Observatory (SOHO)/EIT 195 Å and Solar Dynamics Observatory (SDO)/Atmospheric Imaging Assembly (AIA) 193 Å. Visible flares are numbered, corresponding to events listed in Table 5. It is noted that science‐ready FITS files were used to generate source images. However, the cross‐calibration was not considered between the instruments as we are here interested only in the locations of flares and not in their absolute intensity. Right: The distribution of time scales for events listed in Table 5. Here, the duration is calculated from the time when the event was visible on SOHO/EIT 195 Å, SDO/AIA 193 Å, and/or SOHO/LASCO C2 images until the time when the Dst index reached its extreme value. The four missing time scales (bars) are caused by unavailable images that are needed for the estimation of the event initialization time.
Extreme Space Weather Events of the Past 30 Years: Preparation for Data From Mission Vigil

February 2025

·

32 Reads

Extreme space weather events can negatively affect ground-based infrastructure and satellite communications. The European Space Agency plans to launch a new operational mission, Vigil, to monitor space weather activity and provide timely warnings about immediate danger. In this work, we identified 24 instruments that have already acquired data on 8 space missions and are similar to the instruments planned for mission Vigil. We then selected the 39 most extreme space weather events that affected the Earth in the past 30 years and gathered Vigil-like data for them. The objective of this work, and our main motivation, was to address the following question: “How would Vigil have observed extreme space weather events if it had been operational during those events?” To this end, we prepared a pipeline for the community to obtain images and in-situ measurements for these specific periods, allowing straightforward application in follow-up data-driven studies. This effort could maximize Vigil's potential. Additionally, we studied the sources of extreme space weather events and the time it took for solar plasma to reach Earth's magnetosphere. This analysis demonstrates the utilization of the gathered data set and provides interesting insights into the most hazardous space events that have influenced society in recent decades.


Integrated Remote Sensing for Enhanced Drought Assessment: A Multi‐Index Approach in Rajasthan, India

February 2025

·

13 Reads

This study investigates land-use/land-cover (LULC) changes, vegetation health, and drought severity in Rajasthan, India, from 1985 to 2020 using remote sensing techniques. By analyzing satellite imagery with the normalized difference vegetation index (NDVI), temperature condition index (TCI), vegetation condition index (VCI), and NDVI deviation (Dev_NDVI), we assess the spatial and temporal dynamics of the region's landscape and drought conditions. Our findings indicate significant LULC changes, including a decrease in water bodies from 6412.87 to 2248.51 km² and in dense forests by 61.37%, while built-up areas expanded by 890.50%, reflecting substantial human impact and environmental change. Drought analysis revealed that nearly 49% of the study area experienced moderate to severe drought conditions, with VCI levels below 40%, indicating widespread drought impact across different regions and time periods. The study employs weighted sum analysis of Dev_NDVI, VCI, and TCI to create a detailed drought severity map, revealing areas of severe and extreme drought that necessitate immediate action for sustainable management. The novelty of this approach lies in its integrated multi-index method for assessing drought over a 35-year period, providing a robust framework for analyzing environmental dynamics and the resilience of ecosystems to climatic stresses. This research emphasizes the value of remote sensing for continuous environmental monitoring and highlights future implications for integrating advanced satellite technologies to enhance drought management strategies, ultimately informing policy decisions for sustainable land and water resource management in Rajasthan and similar semi-arid regions globally.
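The condition indices named in the abstract have standard min-max definitions (Kogan's formulation); a minimal sketch with hypothetical pixel values, not the study's data:

```python
def vci(ndvi, ndvi_min, ndvi_max):
    """Vegetation Condition Index (%): position of the current NDVI
    within its historical min-max range; below ~40% signals drought."""
    return 100.0 * (ndvi - ndvi_min) / (ndvi_max - ndvi_min)

def tci(lst, lst_min, lst_max):
    """Temperature Condition Index (%): cooler-than-usual pixels
    score higher."""
    return 100.0 * (lst_max - lst) / (lst_max - lst_min)

# Hypothetical pixel: NDVI 0.45 within a historical range [0.2, 0.7],
# LST 302 K within a historical range [295, 310] K
v = vci(0.45, 0.2, 0.7)       # 50.0
t = tci(302.0, 295.0, 310.0)  # ~53.3
```

The study's drought severity map is then a weighted sum of such layers (Dev_NDVI, VCI, TCI); the specific weights are not given in the abstract.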


Greenland Ice Sheet Wide Supraglacial Lake Evolution and Dynamics: Insights From the 2018 and 2019 Melt Seasons

February 2025

·

13 Reads

Supraglacial lakes on the Greenland Ice Sheet (GrIS) can impact both the ice sheet surface mass balance and ice dynamics. Thus, understanding the evolution and dynamics of supraglacial lakes is important to provide improved parameterizations for ice sheet models to enable better projections of future GrIS changes. In this study, we utilize the growing inventory of optical and microwave satellite imagery to automatically determine the fate of Greenland-wide supraglacial lakes during 2018 and 2019, low and high melt seasons, respectively. We develop a novel time series classification method to categorize lakes into four classes: (a) refreezing, (b) rapidly draining, (c) slowly draining, and (d) buried. Our findings reveal significant interannual variability between the two melt seasons, with a notable increase in the proportion of draining lakes, and a particular dominance of slowly draining lakes, in 2019. We also find that as mean lake depth increases, so does the percentage of lakes that drain, indicating that lake depth may influence hydrofracture potential. We further observe rapidly draining lakes at higher elevations than the previously hypothesized upper-elevation hydrofracture limit (1,600 m), and that non-draining lakes are generally deeper during the lower-melt 2018 season. Our automatic classification approach and the resulting 2-year ice-sheet-wide data set provide new insights into GrIS supraglacial lake dynamics and evolution, offering a valuable resource for future research.


Algorithm Theoretical Basis for Version 3 TEMPO O2‐O2 Cloud Product

January 2025

·

34 Reads

This Algorithm Theoretical Basis Document (ATBD) describes the retrieval algorithm and sensitivities of the Version 3 cloud product derived from the spectra collected by the Tropospheric Emissions: Monitoring of POllution (TEMPO) instrument. The cloud product is primarily produced for supporting the retrievals of TEMPO trace gases that are important for understanding atmospheric chemistry and monitoring air pollution. The TEMPO cloud algorithm is adapted from NASA's Ozone Monitoring Instrument (OMI) oxygen collision complex (O2‐O2) cloud algorithm. The retrieval generates effective cloud fraction (ECF) from the normalized radiance at 466 nm and generates cloud optical centroid pressure (OCP) using the O2‐O2 column amount derived from the spectral absorption feature near 477 nm. The slant column of O2‐O2 is retrieved using Smithsonian Astrophysical Observatory's spectral fitting code with optimized retrieval parameters. ECF and OCP are used by TEMPO trace gas retrievals to calculate Air Mass Factors which convert slant columns to vertical columns. The sensitivities of the cloud retrieval to various input parameters are investigated.
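The slant-to-vertical conversion mentioned here is the standard air-mass-factor relation V = S / AMF; the column and AMF values below are illustrative only, not TEMPO retrieval numbers:

```python
def vertical_column(slant_column, amf):
    """Standard air-mass-factor conversion: V = S / AMF."""
    return slant_column / amf

# Illustrative slant column and a hypothetical AMF computed from
# the retrieved effective cloud fraction and optical centroid pressure
v = vertical_column(slant_column=2.4e16, amf=1.6)  # 1.5e16
```

In the TEMPO processing chain, ECF and OCP enter this step by determining the AMF for each trace-gas retrieval.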


The Critical Role of Sea Ice Products for Accurate Wind‐Wave Simulations in the Arctic

January 2025

·

13 Reads

The Arctic region is experiencing significant changes due to climate change, and the resulting decline in sea ice concentration and extent is already impacting ocean dynamics and exacerbating coastal hazards in the region. In this context, numerical models play a crucial role in simulating the interactions between the ocean, land, sea ice, and atmosphere, thus supporting scientific studies in the region. This research aims to evaluate how different sea ice products, with spatial resolutions varying from 2 to 25 km, influence the results of a phase-averaged spectral wave model in the Alaskan Arctic under storm conditions. Four events spanning the fall and winter seasons of 2019 were used to assess the accuracy of wave simulations generated under the dynamic sea ice conditions found in the Arctic. The selected sea ice products used to parameterize the numerical wave model include the National Snow and Ice Data Center (NSIDC) sea ice concentration, the European Centre for Medium-Range Weather Forecasts (ECMWF) Re-Analysis (ERA5), the HYbrid Coordinate Ocean Model-Community Ice CodE (HYCOM-CICE) system assimilated with Navy Coupled Ocean Data Assimilation (NCODA), and the High-resolution Ice-Ocean Modeling and Assimilation System (HIOMAS). The Simulating WAves Nearshore (SWAN) model's accuracy in simulating waves using these sea ice products was evaluated against Sea State Daily Multisensor L3 satellite observations. Results show that wave simulations using ERA5 consistently exhibited high correlation with the observations, maintaining correlations above 0.83 across all events. Conversely, HIOMAS demonstrated the weakest performance, particularly during winter, with a correlation with the observations as low as 0.40. Remarkably, ERA5 surpassed all other products by up to 30% in accuracy during the selected storm events, and even when an ensemble combining the selected sea ice products was assessed, ERA5's individual performance remained unmatched.
Our study provides insights for selecting sea ice products under different sea ice conditions for accurately simulating waves and coastal hazards in high latitudes.


Evolution and Structure of a Heavy‐Precipitation‐Producing Quasi‐Linear Convective System Along a Mesoscale Outflow Boundary

January 2025

·

2 Reads

This study explored the complex evolution mechanism and fine-scale structures of a quasi-linear convective system (QLCS) in the eastern Taihang Mountains from 1300 BST 12 to 0300 BST 13 August 2018, using Doppler radar data, high-resolution surface observations, and sounding data. The QLCS, which produced heavy precipitation, was maintained as southeasterly flow was lifted over a mesoscale outflow boundary (MOB) associated with a cold pool. The topographic blocking effect of the Taihang Mountains and the cold environmental northeasterly enhanced the uplift of the southeasterly at the southwest and northeast ends of the MOB. Northeastward extension of the QLCS was promoted by the prevailing southeasterly airflow and high convective available potential energy, while the dry, cold layer between 850 and 500 hPa clearly prevented southeastward movement of the QLCS. A clear increase in the disturbance pressure took place due to increased water loading rather than temperature dropping. Northwestward-oriented “echo training” of convective cells caused the well-structured QLCS to split into several meso-β-scale rain bands with irregular convergence along the MOB. Mesoscale convective vortices associated with slow-moving strong convective echoes played an important role in the development of the middle part of the QLCS, which accounts for the heavy precipitation.


Role of Barystatic Sea Level Change in Global Mass Conservation and Its Excitation to Length‐Of‐Day Variations

January 2025

·

17 Reads

Barystatic sea level stores excess water mass from the atmosphere and land to maintain global mass conservation within the Earth system. Besides the secular contribution to global sea-level rise, changes in barystatic sea level also play an important role in mass-induced length-of-day (LOD) variations over a few years or shorter periods. Compared to barystatic sea level changes deduced from geophysical models, Gravity Recovery and Climate Experiment and GRACE follow-on (GRACE/GFO) measurements provide actually observed ocean mass changes. Here, we investigate both short-term seasonal (annual and semiannual) and non-seasonal LOD variations caused by mass redistribution, using GRACE/GFO mass estimates and effective angular momentum (EAM) products, and in particular quantitatively assess the excitation from the barystatic sea level. Note that correcting the problem of global mass non-conservation is necessary for GRACE/GFO mass estimates in both spherical harmonic and mascon solutions to calculate the LOD excitation accurately. LOD mass term contributions derived from GRACE/GFO mass estimates considering global mass conservation show high consistency with satellite laser ranging results and are much closer to geodetic LOD observations than EAM products at seasonal and non-seasonal time scales. The barystatic sea level exhibits the most significant amplitude in mass-induced LOD variations, compensating for most land hydrological excitation, but shows no clear correlation with the atmosphere. Due to slight fluctuations in cryospheric effects and the substantial compensatory action of the barystatic sea level, differences in the land hydrological excitation do not lead to significant deviations in the total LOD mass term between EAM products and GRACE/GFO mass estimates.


Seismic Imaging of Halokinetic Sequences and Structures With High‐Resolution, Dual‐Element Acquisition, and Processing: Applications to the Gassum Structure in Eastern Jutland, Denmark

January 2025

·

35 Reads

Understanding the structural intricacies of subsurface halokinetic formations is crucial for various geological applications, including geological carbon storage (GCS). This study focuses on the seismic imaging of the Gassum structure in eastern Jutland, Denmark, employing high-resolution, dual-element acquisition and processing techniques. The investigation aims to unravel details of the evolution of the salt dome and its implications for GCS potential. High-resolution seismic data processing and interpretation reveal a skewed dome structure with steeper flanks on the western and northern sides, characterized by faults and stratigraphic thinning. The asymmetric growth of the dome suggests uneven salt loading during its genesis, influencing local stress fields and structural development, with evidence of syn-tectonic subsidence that produced salt welds. This is supported by the presence of stratigraphic wedges and an increased depth of imaged horizons within the steeper flanks of the dome. A mild piercement of the salt into overlying sediments, onlapping features, and the presence of normal faults that originate from the dome apex and extend radially all indicate a reactive piercement process in the salt pillow's development stage. This produced an extensional regime in overlying strata, inducing sequence thinning and graben structures. Analysis of reservoir and seal properties unveils adequate conditions for GCS, with a continuous reservoir and thick primary and secondary seals. However, the presence of faults intersecting these formations raises concerns regarding long-term storage stability. Further investigations into reservoir porosity, migration paths, and volumetric analysis are warranted for conclusive GCS assessments.


Using Radiogenic Noble Gas Nuclides to Identify and Characterize Rock Fracturing

January 2025

·

6 Reads

Fracture‐released radiogenic noble gas nuclides are used to identify locations and constrain the volume of new fracture creation during subsurface detonations. Real‐time, in situ noble gases and reactive gases were monitored using a field‐deployed mass spectrometer and automated sampling system in a multilevel borehole array. Released gases were measured after two different detonations having distinct energy, pressure, and gas volume characteristics. Explosive‐derived gases (N2O, CO2) and excess radiogenic ⁴He and ⁴⁰Ar above atmospheric background are used to identify locations of gas transport and new fracture creation after each detonation. Fracture‐released radiogenic ⁴He is used to constrain the volume of newly created fractures with a model of helium release from fracturing. Explosive by‐product gas was observed in multiple locations both near and distal to the shot locations for both detonations. Radiogenic ⁴He and ⁴⁰Ar release from rock damage was observed in locations near the detonation after the second, more powerful detonation. Observed ⁴He response is consistent with a model of diffusive release from newly created fractures. Volume of new fractures estimated from the ⁴He release ranges from 1 to 5 m² with apertures ranging from 0.1 to 1 μm. Our results provide evidence that radiogenic noble gases released during fracture creation can be identified at the field scale in real time and used to identify timing and location of fracture creation during deformation events. This technique could be useful in subsurface science and engineering problems where the location and amount of newly created rock fracturing is of interest, including fault rupture, mine safety, subsurface detonation monitoring, and reservoir stimulation.


Identifying Ocean Submesoscale Activity From Vertical Density Profiles Using Machine Learning

January 2025

·

7 Reads

Submesoscale eddies are important features in the upper ocean where they mediate air‐sea exchanges, convey heat and tracer fluxes into the ocean interior, and enhance biological production. However, due to their small size (0.1–10 km) and short lifetime (hours to days), directly observing submesoscales in the field generally requires targeted high resolution surveys. Submesoscales increase the vertical density stratification of the upper ocean and qualitatively modify the vertical density profile. In this paper, we propose an unsupervised machine learning algorithm to identify submesoscale activity using vertical density profiles. The algorithm, based on the profile classification model (PCM) approach, is trained and tested on two model‐based data sets with vastly different resolutions. One data set is extracted from a large‐eddy simulation (LES) in a 4 km by 4 km domain and the other from a regional model for a sector in the Southern Ocean. We show that the adapted PCM can identify regions with high submesoscale activity, as characterized by the vorticity field (i.e., where the surface vertical vorticity ζ is similar to the Coriolis frequency f and the Rossby number Ro = ζ/f ∼ O(1)), using solely the vertical density profiles, without any additional information on the velocity, the profile location, or horizontal density gradients. The results of this paper show that the adapted PCM can be applied to data sets from different sources and provides a method to study submesoscale eddies using global data sets (e.g., CTD profiles collected from ships, gliders, and Argo floats).
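The submesoscale criterion in this abstract (vertical vorticity comparable to the Coriolis frequency) is easy to evaluate; the vorticity value below is a hypothetical Southern Ocean figure, not data from the study:

```python
import math

OMEGA = 7.2921e-5  # Earth's rotation rate, rad/s

def coriolis_frequency(lat_deg):
    """f = 2 * Omega * sin(latitude)."""
    return 2.0 * OMEGA * math.sin(math.radians(lat_deg))

def rossby_number(zeta, lat_deg):
    """Ro = zeta / f; submesoscale activity has Ro ~ O(1)."""
    return zeta / coriolis_frequency(lat_deg)

f = coriolis_frequency(-60.0)                     # ~-1.26e-4 rad/s
ro = rossby_number(zeta=-1.2e-4, lat_deg=-60.0)   # ~0.95
```

Mesoscale eddies sit at Ro ≪ 1, so a profile classifier that flags Ro ∼ O(1) regions from density alone is separating dynamically distinct regimes.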


A New Age of SAR: How Can Commercial Smallsat Constellations Contribute to NASA's Surface Deformation and Change Mission?

January 2025

·

60 Reads

In response to the 2017 Decadal Survey, NASA conducted a five‐year study on the Surface Deformation and Change (SDC) designated observable to study potential mission concepts. As part of the SDC mission study, the Commercial Synthetic Aperture Radar (ComSAR) subgroup was tasked with evaluating the current landscape of the SAR and interferometric SAR (InSAR) industry to assess whether NASA could leverage commercial smallsat products to meet the needs of the SDC science mission. The assessment found that although the commercial SAR industry is growing rapidly, off‐the‐shelf products can currently only make a small—albeit distinct—contribution to SDC mission goals. This gap is due to different design goals between current commercial systems (which prioritize targeted high‐resolution, non‐interferometric observations at short wavelengths with a daily or faster revisit) and a future SDC architecture (which focuses on broad, moderate‐resolution, and interferometric observations at long wavelengths). Even by 2030, planned commercial constellations are expected to only cover ∼65% of the area needed to match NISAR coverage. Still, high‐resolution and rapid‐repeat capabilities can augment scientific findings from a future SDC mission, as demonstrated by recent contributions from commercial data to applied sciences, cryosphere, and volcanology. Future innovations on smallsat constellation concepts could further contribute to SDC science and applications. Although current constellation designs are not fully able to satisfy desired SDC science capabilities, initial positive feedback to a request for information indicates a potential future path for a customized SDC commercial architecture; more studies will be needed to determine the feasibility of these approaches.


Enhanced Forecasting and Assessment of Urban Air Quality by an Automated Machine Learning System: The AI‐Air

January 2025

·

33 Reads

An automated air quality forecasting system (AI‐Air) was developed to optimize and improve air quality forecasting for different typical cities, combined with the China Meteorological Administration Unified Atmospheric Chemistry Environmental Model (CUACE), and applied in Zhengzhou, a typical inland city, and Haikou, a coastal city, in China. The performance evaluation shows that for the PM2.5 forecasts, the correlation coefficient (R) is increased by 0.07–0.13, and the mean error (ME) and root mean square error (RMSE) are decreased by 3.2–3.5 and 3.8–4.7 μg/m³, respectively. Similarly, for the O3 forecasts, the R value is improved by 0.09–0.44, and the ME and RMSE values are reduced by 7.1–22.8 and 9.0–25.9 μg/m³, respectively. Case analyses of operational forecasting also indicate that the AI‐Air system can significantly improve the forecasting of pollutant concentrations and effectively correct underestimation or overestimation compared to the CUACE model. Additionally, explanatory analyses were performed to assess the key meteorological factors affecting air quality in cities with different topographic and climatic conditions. The AI‐Air system highlights the potential of AI techniques to improve forecast accuracy and efficiency, with promising applications in the field of air quality forecasting.
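The R, ME, and RMSE statistics quoted above are standard forecast-verification measures. A minimal sketch of how they can be computed (all names are hypothetical, and ME is taken here as the mean absolute error; the paper's exact definition may differ):

```python
import numpy as np

def evaluation_stats(forecast, observed):
    """Correlation coefficient (R), mean error (ME), and root mean
    square error (RMSE) between forecast and observed concentrations
    (e.g., in ug/m3)."""
    forecast = np.asarray(forecast, dtype=float)
    observed = np.asarray(observed, dtype=float)
    r = np.corrcoef(forecast, observed)[0, 1]
    # ME taken here as mean absolute error; some studies use signed bias
    me = np.mean(np.abs(forecast - observed))
    rmse = np.sqrt(np.mean((forecast - observed) ** 2))
    return r, me, rmse
```

With arrays of paired forecast/observation samples, an "improvement" such as the one reported is simply the difference in these statistics between the AI-corrected and raw model forecasts.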


Permafrost Dynamics Observatory: 3. Remote Sensing Big Data for the Active Layer, Soil Moisture, and Greening and Browning

January 2025

·

42 Reads

Because of the remote nature of permafrost, it is difficult to collect data over large geographic regions using ground surveys. Remote sensing enables us to study permafrost at high resolution and over large areas. The Arctic‐Boreal Vulnerability Experiment's Permafrost Dynamics Observatory (PDO) contains data about permafrost subsidence, active layer thickness (ALT), soil water content, and water table depth, derived from airborne radar measurements at 66 image swaths in 2017. With nearly 58,000,000 pixels available for analysis, this data set enables new discoveries and can corroborate findings from previous studies across the Arctic‐Boreal region. We analyze the distributions of these variables and use a space‐for‐time substitution to enable interpretation of the effects of climate trends. Higher soil volumetric water content (VWC) is associated with lower ALT and subsidence, suggesting that Arctic soil may become drier as the climate warms. Soil VWC is bimodal, with saturated soil occurring more commonly in burned areas, while unburned areas are more commonly unsaturated. All permafrost variables show statistically significant differences from one land cover type to another; in particular, cropland has thicker active layers and developed land has lower seasonal subsidence than most other land cover types, potentially related to disturbance and permafrost thaw. While vegetation browning is not strongly associated with any of the measured permafrost variables, more greening is associated with less subsidence and ALT and with higher bulk soil VWC.


Bi‐Directional Spectro‐Polarimetry of Olivine Sand

January 2025

·

40 Reads

We characterized the bi‐directional spectro‐polarimetry of olivine sands of varying grain size distributions for a comprehensive set of measurement and illumination angles over a wavelength range of 350–2,500 nm. Our laboratory instrumentation included a hyperspectral goniometer, a broadband linear polarizer, and a tungsten‐halogen illumination source. Three distinct grain size distributions of olivine sand samples were used in our experiments. As a function of azimuth, we measured a significant degree of anisotropic scattering that depends directly on polarization angles, resulting in a distribution that cannot be accurately described solely using phase angle. For media of uniform or similar composition, we observed robust separability of grain size distributions using spectro‐polarimetry. We compared Hapke's polarimetric model for semi‐infinite granular media with a new empirical polarimetric model that we developed. This empirical model more accurately replicates the scattering of unpolarized incident light as a function of all view azimuth, view zenith, and polarization angles for all incident zenith angles. Parameters of our empirical polarimetric model that determine the magnitude of polarization correlate linearly with the inverse diffuse reflectances of the olivine sand samples, exhibiting phenomenology that is most likely due to the Umov effect. Because of the linearity of the correlations, our results show that polarimetry can be used to retrieve medium parameters, such as grain size distributions. Our data are freely available online in a Zenodo/GitHub repository.


Nowcasting of a Warm‐Sector Rainfall Event in Southern China With the TRAMS Model: Sensitivity to Different Radar Reflectivity Retrieval Methods and Incremental Updating Strategies

January 2025

·

15 Reads

To improve the radar data assimilation scheme for the high‐resolution Tropical Regional Atmospheric Model System (TRAMS), this study investigates the sensitivity of simulating a warm‐sector rainfall event in southern China to different radar reflectivity retrieval methods and incremental updating strategies. The findings indicate that the ice cloud retrieval (ICR) method yields more reasonable cloud hydrometeors. However, the impact of different retrieval methods is minimal without corresponding adjustments to the dynamic field. Further assimilation of the wind field effectively reduced the overestimated southerly winds and successfully simulated the observed low‐level convergence in northern Guangdong, significantly improving precipitation forecasts. Both the incremental analysis update (IAU) and nudging methods were able to adjust the forecast to better match the observations, with IAU performing slightly better. These findings are beneficial for further improving the forecast accuracy of precipitation intensity. Extending the IAU relaxation time from 4 to 10 min has almost no impact on the actual forecast. However, by prioritizing the adjustment of the wind field through time‐dependent IAU weighting factors, the impact of cloud particle adjustments on the dynamical field can be avoided (e.g., the drag caused by the sinking of cloud particles may offset the upward motion induced by dynamical convergence adjustments). This allows more realistic low‐level wind convergence and precipitation forecasts to be obtained. Overall, the ICR method for retrieving cloud hydrometeors, combined with the IAU method using time‐dependent weighting factors, appears to be the more suitable option for the radar data assimilation scheme in the TRAMS model.


Post‐Fire Sediment Yield From a Western Sierra Nevada Watershed Burned by the 2021 Caldor Fire

January 2025

·

35 Reads

Watershed sediment yield commonly increases after wildfire, often causing negative impacts to downstream infrastructure and water resources. Post‐fire erosion is important to understand and quantify because it is increasingly placing water supplies, habitat, communities, and infrastructure at risk as fire regimes intensify in a warming climate. However, measurements of post‐fire sediment mobilization are lacking from many regions. We measured sediment yield from a forested, heavily managed 25.4‐km² watershed in the western Sierra Nevada, California, over 2 years following the 2021 Caldor Fire, by repeat mapping of a reservoir where sediment accumulated from terrain with moderate to high soil burn severity. Sediment yield was less than the geochronology‐derived long‐term average in the first year post‐fire (conservatively estimated at 21.8–28.0 t/km²), low enough to be difficult to measure with uncrewed airborne system (UAS) and bathymetric sonar survey methods that are most effective at detecting larger sedimentary signals. In the second year post‐fire the sediment delivery was 1,560–2,010 t/km², an order of magnitude above long‐term values, attributable to greater precipitation and intensive salvage logging. Hillslope erosion simulated by the Water Erosion Prediction Project (WEPP) model overestimated the measured amount by a factor of 90 in the first year, and by a factor of 1.9 in the second year, which aligned with previously determined model performance in northern California. We encourage additional field studies, and validation of erosion models where feasible, to further expand the range of conditions informing post‐fire hazard assessments and management decisions.


Ten Years of Earth and Space Science: Introduction to the Special Collection

January 2025

·

98 Reads

The journal Earth and Space Science (ESS) was founded in 2014 to offer the scientific community a new platform for the dissemination of key new data, observations, methods, instruments, and models, presented within the context of their application. Thus, the aim of the journal was (and is) to highlight the complexity and importance of experimental design, methodology, and data acquisition and processing, intertwined with data interpretation. Such an approach is consistent with the mission of most AGU journals, but the distinctive element of ESS is its focus on the useful impact of publication, progressively replacing the focus on conventional publication metrics. In this context, the journal has been, since its inception, the preferred home for studies stemming from both global and local geoscience research. This special collection contains 16 papers published in ESS, selected by the Editorial Board to highlight the aims, scope, and path of evolution and growth of the journal since its inaugural issue in 2014.


Four Generations of ECMWF Reanalyses: An Overview of the Successes in Modeling Precipitation and Remaining Challenges for Freshwater Budget of Ocean Models

January 2025

·

22 Reads

This study reviews the progress made in modeling precipitation across four generations of reanalyses from the European Centre for Medium‐Range Weather Forecasts (ECMWF), using traditional metrics and a new set of regional metrics. Regional metrics at oceanic basin scales and over large land catchment areas allow a more comprehensive analysis of the performance of the reanalyses. This leads to the conclusion that significant progress has been made over the past several decades in both the atmospheric model and the assimilation system at ECMWF, leading to more realistic precipitation. The most recent ERA5 reanalysis outperforms ERA‐Interim and its predecessors by all metrics considered. ERA5 is then used to force a modern ocean general circulation model, and the results show an improvement in terms of the freshwater budget, particularly after the year 2000. However, uncertainties remain about the magnitude and trends of the modeled evaporation.


Analysis of 42 Years of Cosmic Ray Measurements by the Neutron Monitor at Lomnický štít Observatory

January 2025

·

19 Reads

The correlation and physical interconnection between space weather indices and cosmic ray flux have been well established, with extensive literature on the topic. Our investigation is centered on the relationships among the solar radio flux, geomagnetic field activity, and cosmic ray flux, as observed by the Neutron Monitor at the Lomnický štít Observatory in Slovakia. We processed the raw neutron monitor data, generating the first publicly accessible data set spanning 42 years. The curated continuous data are available in .csv format in hourly resolution from December 1981 to July 2023 and in minute resolution from January 2001 to July 2023 (Institute of Experimental Physics SAS, 2024, https://doi.org/10.5281/zenodo.10790915). Validation of this processed data was accomplished by identifying distinctive events within the data set. As part of the selection of events for case studies, we report the discovery of TGEs visible in the data. Applying the Pearson method for statistical analysis, we quantified the linear correlation of the data sets. Additionally, a prediction power score was computed to reveal potential non‐linear relationships. Our findings demonstrate a significant anti‐correlation between cosmic ray and solar radio flux with a correlation coefficient of −0.74, coupled with a positive correlation concerning geomagnetic field strength. We also found that the neutron monitor measurements correlate better with a delay of 7–21 hr applied to the geomagnetic field strength data. The correlation between these data sets is further improved when inspecting periods of extreme solar events only. Lastly, the computed prediction power score of 0.22 for neutron flux in the context of geomagnetic field strength presents exciting possibilities for developing real‐time geomagnetic storm prediction models based on cosmic ray measurements.
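The reported 7–21 hr delay can be found by scanning lagged Pearson correlations between the two hourly series. A minimal sketch under assumed hourly sampling (the series names `neutron` and `bfield` are hypothetical):

```python
import numpy as np

def lagged_pearson(x, y, lag_hours):
    """Pearson correlation between x(t) and y delayed by `lag_hours`
    samples relative to x (hourly resolution assumed)."""
    if lag_hours > 0:
        x_seg, y_seg = x[lag_hours:], y[:-lag_hours]
    else:
        x_seg, y_seg = x, y
    return np.corrcoef(x_seg, y_seg)[0, 1]

# Example scan over candidate delays of 0-24 hr:
# best = max(range(25), key=lambda h: abs(lagged_pearson(neutron, bfield, h)))
```

The lag maximizing the (absolute) correlation is then the estimated delay between the geomagnetic signal and the neutron monitor response.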


Mapping 3D Overthrust Structures by a Hybrid Modeling Method

December 2024

·

55 Reads

A rational three‐dimensional (3D) geological model with complex characteristics, generated from a small amount of data, is a crucial data infrastructure for scientific research and many applications. However, reconstructing structures with multiple Z values at a single point, caused by folding or overthrusting, remains one of the bottlenecks in 3D geological modeling. Combining the multi‐point statistics (MPS) method with fully connected neural networks (FCNs), this study presents a hybrid framework for 3D geological modeling. The loss functions of the FCN and the conventional MPS method jointly form the kernel function of the proposed method, which is constrained by stratigraphic sequence and stratum thickness. The inputs and outputs of the FCN are the coordinates and corresponding elevations of geological contacts, respectively. To solve the kernel function, the initial model, in which geological surfaces are generated by the FCNs, is built through a sequential process. An iterative MPS process with an Expectation Maximization‐like (EM‐like) algorithm is then carried out to eliminate the artifacts in the initial model. Ten orthogonal cross‐sections extracted from the overthrust model created by SEG/EAGE serve as the modeling data source. The results illustrate that the geometry and spatial relationships of strata and faults are retained well under the geological constraints. A comparison of virtual boreholes from the results and the real model shows that the accuracy of the geological objects reaches 75%. The presented method provides a new approach to simulating 3D structures with multiple Z values, overcoming the limitations of the conventional MPS‐based 3D modeling method.
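The FCN component described above regresses contact elevations from coordinates. A minimal sketch of such a network (one tanh hidden layer trained by gradient descent, with entirely hypothetical data; the paper's actual architecture and loss terms are not specified here):

```python
import numpy as np

rng = np.random.default_rng(0)

def train_fcn(xy, z, hidden=16, lr=0.1, epochs=2000):
    """Fit elevation z ~ f(x, y) with one tanh hidden layer,
    trained by batch gradient descent on mean-squared error."""
    n, d = xy.shape
    W1 = rng.normal(0, 0.5, (d, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, hidden);      b2 = 0.0
    for _ in range(epochs):
        h = np.tanh(xy @ W1 + b1)      # hidden activations
        pred = h @ W2 + b2             # predicted elevations
        err = pred - z
        # backpropagation of the MSE loss
        gW2 = h.T @ err / n; gb2 = err.mean()
        gh = np.outer(err, W2) * (1 - h ** 2)
        gW1 = xy.T @ gh / n; gb1 = gh.mean(axis=0)
        W2 -= lr * gW2; b2 -= lr * gb2
        W1 -= lr * gW1; b1 -= lr * gb1
    return lambda q: np.tanh(q @ W1 + b1) @ W2 + b2
```

In the hybrid framework, a surface fitted this way would supply the initial model that the iterative MPS/EM-like step then refines.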


Evaluating GEMS HCHO Retrievals With TROPOMI Product, Pandora Observations, and GEOS‐Chem Simulations

December 2024

·

54 Reads

Satellite column formaldehyde (HCHO) is an indicator of regional volatile organic compound (VOC) emissions, as HCHO is a short‐lived intermediate oxidation product. The Geostationary Environment Monitoring Spectrometer (GEMS), launched in 2020, is the first geostationary satellite to monitor hourly HCHO. GEMS offers unprecedented potential to reveal the diurnal variations of VOC emissions in Asia. Here, we present the first study to evaluate year‐round GEMS HCHO retrievals using the TROPOMI satellite product and ground‐based Pandora spectrometers. Our study shows that GEMS HCHO aligns with TROPOMI (r = 0.59–0.85; differences within 20% for most areas). Moreover, GEMS captures monthly and diurnal HCHO variations observed by Pandora spectrometers across Asia, with differences overall within 15% (r ∼ 0.85). Diurnally, we find strong HCHO variations over urban areas but not in forests. During the fire season of mainland Southeast Asia, GEMS HCHO increases in the afternoon, in line with diurnal emission estimates from the Global Fire Emissions Database Version 4 with small fires (GFED4s) and GEOS‐Chem simulations. GEMS also captures the spatial patterns of fire emissions in GFED4s. GEMS HCHO shows a negative bias when observing at high (>60°) viewing zenith angles (VZA) and overly relies on model correction for observations north of 30°N.


Machine Learning Classification Strategy to Improve Streamflow Estimates in Diverse River Basins in the Colorado River Basin

December 2024

·

28 Reads

Streamflow in the Colorado River Basin (CRB) is significantly altered by human activities including land use/cover alterations, reservoir operation, irrigation, and water exports. Climate is also highly varied across the CRB, which contains snowpack‐dominated watersheds and arid, precipitation‐dominated basins. Recently, machine learning methods have improved the generalizability and accuracy of streamflow models. Previous successes with long short‐term memory (LSTM) modeling have primarily focused on unimpacted basins, and few studies have included human‐impacted systems in either regional or single‐basin modeling. We demonstrate that the diverse hydrological behavior of river basins in the CRB is too difficult to model with a single, regional model. We propose a method to delineate catchments into categories based on the level of predictability, hydrological characteristics, and the level of human influence. Lastly, we model streamflow in each category with climate and anthropogenic proxy data sets and use feature importance methods to assess whether model performance improves with additional relevant data. Overall, land use cover data at a low temporal resolution were not sufficient to capture the irregular patterns of reservoir releases, demonstrating the importance of having high‐resolution reservoir release data sets at a global scale. On the other hand, the classification approach reduced the complexity of the data and has the potential to improve streamflow forecasts in human‐altered regions.
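The classify-then-model strategy described above fits one model per catchment category rather than a single regional model. A minimal sketch using per-category least-squares fits (the category labels and features are hypothetical stand-ins for the paper's classification and predictor sets):

```python
import numpy as np

def fit_per_category(categories, X, y):
    """Fit one least-squares linear model per basin category.
    Returns {category: coefficient vector (with intercept last)}."""
    models = {}
    for cat in set(categories):
        idx = [i for i, c in enumerate(categories) if c == cat]
        A = np.column_stack([X[idx], np.ones(len(idx))])  # add intercept
        models[cat] = np.linalg.lstsq(A, y[idx], rcond=None)[0]
    return models

def predict(models, categories, X):
    """Predict each sample with the model of its basin's category."""
    A = np.column_stack([X, np.ones(len(X))])
    return np.array([A[i] @ models[c] for i, c in enumerate(categories)])
```

Replacing the per-category linear fit with an LSTM or other regressor gives the same overall structure: the classification step reduces heterogeneity so each model sees more uniform hydrological behavior.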


Journal metrics


2.9 (2023)

Journal Impact Factor™


54%

Acceptance rate


5.5 (2023)

CiteScore™


50 days

Submission to first decision


0.8 (2023)

Immediacy Index


0.00827 (2023)

Eigenfactor®


$2,420 / £1,510 / €1,800

Article processing charge

Editors