Published by Wiley and American Geophysical Union
Online ISSN: 1942-2466

Most read articles:
- Prediction Beyond the Medium Range With an Atmosphere‐Ocean Model That Combines Physics‐Based Modeling and Machine Learning (April 2025) · 306 Reads (250 in the past 30 days)
- Ultrafine‐Resolution Urban Climate Modeling: Resolving Processes Across Scales (June 2025) · 164 Reads (164 in the past 30 days)
- An Online Paleoclimate Data Assimilation With a Deep Learning‐Based Network (June 2025) · 105 Reads (105 in the past 30 days)
- A Variational LSTM Emulator of Sea Level Contribution From the Antarctic Ice Sheet (December 2023) · 966 Reads (78 in the past 30 days) · 21 Citations
- Key Gaps in Models' Physical Representation of Climate Intervention and Its Impacts (June 2025) · 87 Reads (65 in the past 30 days)
JAMES is an open access journal that publishes original research articles advancing the development and application of models at all scales in understanding the physical Earth system and its coupling to biological, geological and chemical systems.
June 2025 · 4 Reads
Yi Qin · Po‐Lun Ma · Mark D. Zelinka · [...] · Meng Huang
Recent studies reveal an anti‐correlation between global cloud feedback (CF) and effective radiative forcing due to aerosol‐cloud interaction (ERFaci) in Earth system models, but the physical mechanisms underlying it remain uncertain. Here we investigate how different turbulence representations contribute to this relationship over the global ocean using an ensemble of Energy Exascale Earth System Model version 2 simulations with perturbed turbulence parameters. The anti‐correlation appears only in the tropical ascent regime. In the Northern Hemisphere midlatitude and high latitude regimes, there is no significant correlation, and in the tropical marine low cloud and Southern Ocean regimes, the correlation is positive. These opposite correlations are primarily driven by opposing CF responses to perturbed parameters. We find that the mean‐state turbulent mixing strength affects both CF and ERFaci, enabling strong correlations in certain regimes. This study highlights the complex linkages between CF and ERFaci through turbulent processes across diverse cloud regimes.
June 2025 · 2 Reads
Yiling Huo · Hailong Wang · Milena Veneziani · [...] · Shixuan Zhang
Earth system models are essential tools for climate projections, but coarse resolutions limit regional accuracy, especially in the Arctic. Regionally refined meshes (RRMs) enhance resolution in key areas while maintaining computational efficiency. This paper provides an overview of the United States (U.S.) Department of Energy's (DOE's) Energy Exascale Earth System Model version 2.1 with an Arctic RRM, hereafter referred to as E3SMv2.1‐Arctic, for the atmosphere (25 km), land (25 km), and ocean/ice (10 km) components. We evaluate the atmospheric component and its interactions with land, ocean, and cryosphere by comparing the RRM (E3SMv2.1‐Arctic) historical simulations (1950–2014) with the uniform low‐resolution (LR) counterpart, reanalysis products, and observational data sets. The RRM generally reduces biases in the LR model, improving simulations of Arctic large‐scale mean fields, such as precipitation, atmospheric circulation, clouds, atmospheric river frequency, and sea ice thickness. However, it introduces a seasonally dependent surface air temperature bias, reducing the LR cold bias in summer but enhancing the LR warm bias in winter, which contributes to the underestimated winter sea ice area and volume. Radiative feedback analysis shows similar climate feedback strengths in both model configurations, with the RRM exhibiting a more positive surface albedo feedback and contributing to a stronger surface warming than LR. These findings underscore the importance of high‐resolution modeling for advancing our understanding of Arctic climate changes and their broader global impacts, although some persistent biases appear to be independent of model resolution at 10–100 km scales.
June 2025 · 3 Reads
Liran Peng · Peter N. Blossey · Walter M. Hannah · [...] · Michael S. Pritchard
This study investigates low cloud feedback in a warmer climate using global simulations from the High‐Resolution Multi‐scale Modeling Framework (HR‐MMF), which explicitly simulates small‐scale eddies globally. Two 5‐year simulations—one with present‐day sea surface temperatures (SSTs) and a second with SSTs warmed uniformly by 4 K—reveal a positive global shortwave cloud radiative effect (SWCRE = 0.3 W/m²/K), comparable to estimates from CMIP models. As the climate warms, significant reductions in low cloud cover occur over stratocumulus regions. This study is the first attempt to compare HR‐MMF results with predictions from idealized large‐eddy simulations from the CGILS intercomparison. Despite different underlying assumptions, we find qualitative agreement in SWCRE and inversion height changes between HR‐MMF and CGILS predictions. This suggests reasonable credibility for the CGILS framework in predicting cloud responses under the out‐of‐sample conditions found in HR‐MMF. However, the HR‐MMF exhibits stronger SWCRE changes than predicted by CGILS. We explore potential causes for this discrepancy, examining variations in cloud‐controlling factors (CCFs) and cloud conditions. Our results show a fairly homogeneous SWCRE response, with little systematic variation tied to the variations in CCFs. This reveals a dominant role for SST forcing in modulating SWCRE.
June 2025 · 35 Reads
Complex Land Surface Models (LSMs) rely on a plethora of parameters. These parameters and the associated process formulations are often poorly constrained, which hampers reliable predictions of ecosystem dynamics and climate feedbacks. Robust and uncertainty‐aware parameter estimation with observations is complicated by, for example, the high dimensionality of the model parameter space and the computational cost of LSM simulations. Herein, we adapt a novel Bayesian data assimilation (DA) and machine learning framework termed “calibrate, emulate, sample” (CES) to infer parameters in a widely‐used LSM coupled with a demographic vegetation model (CLM‐FATES). First, an iterative ensemble Kalman smoother provides an initial estimate of the posterior distribution (“calibrate”). Subsequently, a machine‐learning‐based emulator is trained on the resulting model‐observation mismatches to predict outcomes for unseen parameter combinations (“emulate”). Finally, this emulator replaces CLM‐FATES simulations in an adaptive Markov Chain Monte Carlo approach enabling computationally feasible posterior sampling with enhanced uncertainty quantification (“sample”). We test our implementation with synthetic and real observations representing a boreal forest site in southern Finland. We estimate a total of six plant‐functional‐type‐specific photosynthetic parameters by assimilating evapotranspiration (ET) and gross primary production (GPP) flux data. CES provided the best estimates of the synthetic truth parameters when compared to data‐blind emulator sampling designs while all approaches reduced model‐observation errors compared to a default parameter simulation (GPP: −10 % to −30 %, ET: −4 % to −6 %). Although errors were also consistently reduced with real data, comparing the emulator designs was less conclusive, which we mainly attribute to equifinality, structural uncertainty within CLM‐FATES, and/or unknown errors in the data that are not accounted for.
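As a schematic illustration of the "calibrate, emulate, sample" workflow described above, the sketch below strings the three stages together in Python. The simulator stand-in, the simplified ensemble Kalman update, and the Gaussian-process surrogate are illustrative placeholders with toy dimensions, not the configuration used in the study.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)

def run_clm_fates(theta):
    """Hypothetical stand-in for a CLM-FATES run: maps two parameters to two
    observable statistics (think site-level GPP and ET). Cheap toy function."""
    return np.array([np.sin(theta[0]) + theta[1], theta[0] * theta[1]])

y_obs = np.array([1.2, 0.3])           # "observed" flux statistics (toy values)
obs_var = 0.05 * np.ones(2)            # assumed observation-error variances

# 1) "Calibrate": a heavily simplified ensemble Kalman update toward y_obs.
ens = rng.normal(size=(50, 2))         # 50 members, 2 parameters
for _ in range(5):
    g = np.array([run_clm_fates(t) for t in ens])
    cov = np.cov(ens, g, rowvar=False)
    C_tg, C_gg = cov[:2, 2:], cov[2:, 2:] + np.diag(obs_var)
    gain = C_tg @ np.linalg.inv(C_gg)
    perturbed = y_obs + rng.normal(scale=np.sqrt(obs_var), size=g.shape)
    ens = ens + (perturbed - g) @ gain.T

# 2) "Emulate": train a cheap surrogate on the ensemble's parameter-output pairs.
emulator = GaussianProcessRegressor().fit(ens, np.array([run_clm_fates(t) for t in ens]))

# 3) "Sample": random-walk MCMC on the emulated posterior; no further CLM-FATES runs.
def log_post(theta):
    pred = emulator.predict(theta[None, :])[0]
    return -0.5 * np.sum((pred - y_obs) ** 2 / obs_var) - 0.5 * np.sum(theta ** 2)

theta, chain = ens.mean(axis=0), []
for _ in range(5000):
    prop = theta + 0.1 * rng.normal(size=2)
    if np.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop
    chain.append(theta)
posterior = np.array(chain)            # samples approximating p(theta | y_obs)
```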
June 2025 · 13 Reads
Modeling human‐environment feedbacks is critical for assessing the effectiveness of climate change mitigation and adaptation strategies under a changing climate. The Energy Exascale Earth System Model (E3SM) now includes a human component, with the Global Change Analysis Model (GCAM) at its core, that is synchronously coupled with the land and atmosphere components through the E3SM coupling software. Terrestrial productivity is passed from E3SM to GCAM to make climate‐responsive land use and CO2 emission projections for the next 5‐year period, which are interpolated and passed to E3SM annually. Key variables affected by the incorporation of these feedbacks include land use/cover change, crop prices, terrestrial carbon, local surface temperature, and climate extremes. Regional differences are more pronounced than global differences because the effects are driven primarily by differences in land use. This novel system enables a new type of scenario development and provides a powerful modeling framework that facilitates the addition of other feedbacks between these models. This system has the potential to explore how human responses to climate change impacts in a variety of sectors, including heating/cooling energy demand, water management, and energy production, may alter emissions trajectories and Earth system changes.
June 2025 · 87 Reads
Solar radiation modification (SRM) is increasingly discussed as a potential method to ameliorate some negative effects of climate change. However, unquantified uncertainties in physical and environmental impacts of SRM impede informed debate and decision making. Some uncertainties are due to lack of understanding of processes determining atmospheric effects of SRM and/or a lag in development of their representation in models, meaning even high‐quality model intercomparisons will not necessarily reveal or address them. Although climate models at multiple scales are advancing in complexity, there are specific areas of uncertainty where additional model development (often requiring new observations) could significantly advance understanding of SRM's effects, and improve our ability to assess and weigh potential risks against those of choosing to not use SRM. We convene expert panels in the areas of atmospheric science most critical to understanding the three most widely discussed forms of SRM. Each identifies three key modeling gaps relevant to either stratospheric aerosols, cirrus, or low‐altitude marine clouds. Within each area, key challenges remain in capturing impacts due to complex interactions in aerosol physics, atmospheric chemistry/dynamics, and aerosol‐cloud interactions. Across all three, in addition to arguing for more observations, the panels argue that model development work to either leverage different capabilities of existing models, bridge scales across which relevant processes operate, or address known modeling gaps could advance understanding. By focusing on these knowledge gaps we believe the modeling community could advance understanding of SRM's physical risks and potential benefits, allowing better‐informed decision‐making about whether and how to use SRM.
June 2025 · 3 Reads
Weak Temperature Gradient modeling using a small cloud‐resolving model admits multiple equilibria depending upon the initial model conditions. Two equilibrium states were previously thought to exist: a moist precipitating state and a dry non‐precipitating state. In this paper, we describe a periodic equilibrium that exhibits oscillatory behavior under static boundary conditions. We show that the periodic oscillation has the characteristics of a moisture mode in the vertical dimension, rather than in the horizontal dimension. Further, we show that the oscillation arises from a balance between vertical advection and radiation, which can be described using a simple two‐vertical‐mode model. The first mode is related to the column relative humidity anomaly and a first baroclinic mode, while the second mode is related to a moisture dipole centered around 600 hPa and a second baroclinic mode. The first mode is associated with the generation of a moisture dipole, while the second mode is associated with the generation of a column moisture anomaly.
June 2025 · 1 Read
Clouds exert a significant impact on global temperatures and climate change. Cloud‐radiative feedback (CRF) is one of the major sources of climate change uncertainty. Understanding CRF is therefore crucial for accurate climate projections. Biases like the double‐ITCZ problem in Global Climate Models (GCMs) hamper precise climate projections. Here, we explore a bias‐corrected downscaling method to constrain the cloud feedback uncertainties in the tropical and subtropical Atlantic region. We use regional climate model (RCM) simulations at convection‐permitting resolution, driven by debiased fields from three different GCMs. Bias‐corrected downscaling significantly reduces biases in ITCZ intensity and position, eliminating the double‐ITCZ bias across all six experiments (three GCMs for historical and future periods). We explore the new methodology's potential to investigate the CRF in comparison to that of the driving GCMs. Results indicate that additional GCMs and RCMs are necessary for a more comprehensive uncertainty estimation and more conclusive results, while our simulations suggest a potentially narrower range of CRF over the tropical and subtropical Atlantic, primarily due to an improved representation of stratocumulus clouds. Our study highlights the potential of bias‐corrected downscaling in constraining the uncertainty of simulations and estimates of cloud feedback and equilibrium climate sensitivity. The results advocate for further simulations with additional RCMs and domains for a more comprehensive analysis.
June 2025 · 64 Reads
Peatlands are significant carbon reservoirs vulnerable to climate change and to land use change such as drainage for cultivation or forestry. We modified the ORCHIDEE‐PEAT global land surface model, which has a detailed description of peat processes, by incorporating three new peatland‐specific plant functional types (PFTs), namely deciduous broadleaf shrub, moss and lichen, and evergreen needleleaf tree, in addition to the existing peatland graminoid PFT, to simulate peatland vegetation dynamics and soil CO2 fluxes. Model parameters controlling photosynthesis, autotrophic respiration, and carbon decomposition were optimized using eddy‐covariance observations from 14 European peatlands and a Bayesian optimization approach. Optimization was conducted either for each individual site (single‐site calibration) or for all sites simultaneously (multi‐site calibration). Single‐site calibration performed better, particularly for gross primary production (GPP), with root mean square deviation (RMSD) reduced by 53%. Multi‐site calibration showed more limited improvement (e.g., RMSD of GPP reduced by 22%) because the model cannot account for spatial parameter variations under different climatic contexts (trait‐climate correlations). Site‐optimized parameters, such as Q10, the temperature sensitivity of heterotrophic respiration, revealed strong empirical relationships with environmental factors, such as air temperature. For instance, Q10 decreased significantly at warmer sites, consistent with independent field data. To improve the model using the lessons from single‐site optimization, we incorporated two key trait‐climate relationships, for Q10 and Vcmax (maximum carboxylation rate), into a new version of the ORCHIDEE‐PEAT model. This description of the spatial variability of parameters holds significant promise for improving the accuracy of carbon cycle simulations in peatlands.
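For reference, temperature-sensitivity parameters such as Q10 conventionally enter heterotrophic respiration through the standard relation below (the exact formulation in ORCHIDEE‐PEAT may differ in detail):

\[ R_h(T) = R_{h,\mathrm{ref}}\; Q_{10}^{\,(T - T_{\mathrm{ref}})/10} \]

where \(R_{h,\mathrm{ref}}\) is the respiration rate at the reference temperature \(T_{\mathrm{ref}}\); a site-dependent Q10, as found here, makes this sensitivity vary with the local climate.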
June 2025 · 2 Reads
The radiative effects of black carbon depend critically on its atmospheric lifetime, which is controlled by the rate at which freshly emitted combustion particles become internally mixed with other aerosol components. Global aerosol models strive to represent this process, but the timescale for aerosol mixing is not easily constrained using observations. In this study, we apply a timescale parameterization derived from particle‐resolved simulations to quantify, in a global aerosol model, the timescale for internal mixing. We show that, while highly variable, the average timescale for internal mixing is approximately 3 hr, which is much shorter than the 24‐hr aging timescale traditionally applied in bulk aerosol models. We then use the mixing timescale to constrain the aging criterion in the Modal Aerosol Module. Our analysis reveals that, to best reflect timescales for internal mixing, modal models should assume that particles transition from the hydrophobic (fresh) to the hydrophilic (aged) class once they accumulate a coating thickness equal to four monolayers of sulfuric acid, as opposed to the model's current aging criterion of eight monolayers. We show that, in remote regions like the Arctic and Antarctic, predictions of black carbon loading and its seasonal variation are particularly sensitive to the model representation of aging. By constraining aging in global models to reflect mixing timescales simulated by the particle‐resolved model, we eliminate one of the free parameters governing black carbon's long‐range transport and spatiotemporal distribution.
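The aging criterion discussed above can be stated very compactly; the sketch below is an illustrative Python version in which a fresh particle is flagged as aged once its condensed coating reaches a given number of monolayers. The monolayer thickness and the function itself are assumptions for illustration, not the Modal Aerosol Module code.

```python
MONOLAYER_NM = 0.35   # assumed thickness of one sulfuric-acid monolayer (illustrative)

def is_aged(coating_thickness_nm: float, n_monolayers_required: int = 4) -> bool:
    """Flag a fresh (hydrophobic) black-carbon particle as aged (hydrophilic)
    once its condensed coating reaches the required number of monolayers.
    The study argues for a 4-monolayer criterion rather than the traditional 8."""
    return coating_thickness_nm >= n_monolayers_required * MONOLAYER_NM

# Example: a 1.0 nm coating is still "fresh" under the 4-monolayer criterion,
# while a 1.5 nm coating has aged.
print(is_aged(1.0), is_aged(1.5))   # False True
```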
June 2025 · 15 Reads · 1 Citation
Extreme events are the major weather‐related hazard for humanity. It is therefore crucially important to have a good understanding of their statistics and to be able to forecast them. However, the lack of sufficient data makes their study particularly challenging. In this work, we provide a simple framework for studying extreme events that tackles the lack‐of‐data issue by using the entire available data set, rather than focusing only on the extremes of the data set. To do so, we make the assumption that the set of predictors and the observable used to define the extreme event follow a jointly Gaussian distribution. This naturally gives the notion of an optimal projection of the predictors for forecasting the event. We take as a case study extreme heatwaves over France, and we test our method on an 8,000‐year‐long intermediate‐complexity climate model time series and on the ERA5 reanalysis data set. For a posteriori statistics, we observe and motivate the fact that composite maps of very extreme events look similar to those of less extreme ones. For prediction, we show that our method is competitive with off‐the‐shelf neural networks on the long data set and outperforms them on reanalysis. The optimal projection pattern, which makes our forecast intrinsically interpretable, highlights the importance of soil moisture deficit and quasi‐stationary Rossby waves as precursors to extreme heatwaves.
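Under the jointly Gaussian assumption, the conditional expectation of the observable given the predictors is linear, so the optimal projection reduces to a regression pattern. A minimal numpy sketch with toy data (variable names and dimensions are illustrative, not the study's configuration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy training data: X holds flattened predictor fields (e.g., soil moisture and
# geopotential anomalies); a is the heatwave-defining observable (e.g., a
# two-week temperature anomaly). Shapes: (n_samples, n_features) and (n_samples,).
X = rng.normal(size=(5000, 20))
a = X @ rng.normal(size=20) + 0.5 * rng.normal(size=5000)

Xc, ac = X - X.mean(0), a - a.mean()

# Jointly Gaussian assumption: E[a | x] = sigma_ax @ inv(Sigma_xx) @ x,
# so the optimal projection pattern is the regression vector below.
Sigma_xx = Xc.T @ Xc / len(Xc)
sigma_ax = Xc.T @ ac / len(Xc)
pattern = np.linalg.solve(Sigma_xx, sigma_ax)

# Forecast index for a new predictor state: project it onto the pattern.
x_new = rng.normal(size=20)
forecast_index = pattern @ (x_new - X.mean(0))
```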
June 2025 · 34 Reads
Machine learning has the potential to improve the physical realism and/or computational efficiency of parameterizations. A typical approach has been to feed concatenated vertical profiles to a dense neural network. However, feed‐forward networks lack the connections to propagate information sequentially through the vertical column. Here we examine whether predictions can be improved by instead traversing the column with recurrent neural networks (RNNs) such as Long Short‐Term Memory (LSTM) networks. This method encodes physical priors (locality) and uses parameters more efficiently. Firstly, we test RNN‐based radiation emulators in the Integrated Forecasting System. We achieve near‐perfect offline accuracy, and the forecast skill of a suite of global weather simulations using the emulator is for the most part statistically indistinguishable from reference runs. But can radiation emulators provide both high accuracy and a speed‐up? We find that optimized, state‐of‐the‐art radiation code on CPU is generally faster than RNN‐based emulators on GPU, although the latter can be more energy efficient. To test the method more broadly, and to explore recent challenges in parameterization, we also adapt it to data sets from other studies. RNNs outperform reference feed‐forward networks in emulating gravity waves and, when combined with horizontal convolutions, in non‐local unified parameterization. In emulation of moist physics with memory, the RNNs have offline accuracy similar to ResNets, the previous state of the art. However, the RNNs are more efficient, and more stable in autoregressive semi‐prognostic tests. Multi‐step autoregressive training improves performance in these tests and enables a latent representation of convective memory. Recently proposed linearly recurrent models achieve similar performance to LSTMs.
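A minimal PyTorch sketch of the core idea: the vertical levels are treated as the sequence dimension of an LSTM, so information propagates level by level rather than through one flat dense layer. Layer sizes and the bidirectional choice are illustrative, not the exact architecture used in the study.

```python
import torch
import torch.nn as nn

class ColumnLSTM(nn.Module):
    """Column-wise parameterization emulator: per-level inputs in, per-level
    tendencies out, with the vertical dimension treated as the RNN sequence."""
    def __init__(self, n_in: int, n_out: int, hidden: int = 64):
        super().__init__()
        # Bidirectional so information can propagate both upward and downward,
        # e.g., for upwelling and downwelling radiative fluxes.
        self.lstm = nn.LSTM(n_in, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_out)

    def forward(self, x):            # x: (batch, n_levels, n_in)
        h, _ = self.lstm(x)          # h: (batch, n_levels, 2*hidden)
        return self.head(h)          # (batch, n_levels, n_out)

# Example: 137 model levels, 6 input features per level, 2 output tendencies.
model = ColumnLSTM(n_in=6, n_out=2)
y = model(torch.randn(8, 137, 6))    # y.shape == (8, 137, 2)
```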
June 2025 · 17 Reads
In recent years, machine learning (ML) models have been used to improve physical parameterizations of general circulation models (GCMs). A significant challenge of integrating ML models into GCMs is the online instability when they are coupled for long‐term simulation. We present a new strategy that demonstrates robust online stability when the physical parameterization package of an atmospheric GCM is replaced by a deep ML model. The method uses experience replay with a multistep training scheme of the ML model in which the model's own output at the previous time step is used in the training. Predicted physics tendencies in the replay buffer with the most recent errors in the training iterations are reused, making the ML model learn from its own errors. The training method reduces the gap between the offline and online environments of the ML model. The method is used to train the ML model as the physical parameterization of the Community Atmosphere Model (CAM5) with training data from the Multi‐scale Modeling Framework high resolution simulations. Three 6‐year online simulations of the CAM5 are carried out by using the ML physics package. The simulated spatial distributions of precipitation, surface temperature and zonally averaged atmospheric fields demonstrate overall better accuracy than that of the standard CAM5 and benchmark model even without the use of additional physical constraints or tuning. This work is the first to demonstrate a solution to address the online instability problem in climate modeling with ML physics by using experience replay.
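The training strategy can be sketched as follows: the emulator's own prediction is used to build the next-step input, and such self-generated samples are kept in a replay buffer and mixed back into later updates so the model learns from its own errors. The state-update rule, buffer policy, and names below are simplifying assumptions, not the paper's exact scheme.

```python
import random
from collections import deque

import torch
import torch.nn.functional as F

# Each buffer entry is (input_state, target_tendency); entries built from the
# model's own earlier predictions are re-sampled in later training iterations.
replay_buffer = deque(maxlen=50_000)

def multistep_replay_step(model, optimizer, x0, y0, y1, replay_weight=0.5):
    """x0: atmospheric state; y0, y1: true physics tendencies at two consecutive
    steps. The model's own step-0 prediction builds the step-1 input (multistep
    training), and that self-generated sample is pushed into the replay buffer."""
    # Step 0: ordinary supervised loss.
    pred0 = model(x0)
    loss = F.mse_loss(pred0, y0)
    # Step 1: advance the state with the model's *own* prediction, not the truth
    # (here simply adding the tendency to the state, an illustrative assumption).
    x1_self = x0 + pred0.detach()
    loss = loss + F.mse_loss(model(x1_self), y1)
    # Mix in a sample replayed from earlier iterations (the experience replay).
    if len(replay_buffer) > 0:
        xr, yr = random.choice(list(replay_buffer))
        loss = loss + replay_weight * F.mse_loss(model(xr), yr)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    replay_buffer.append((x1_self, y1))   # store the self-generated pair for reuse
    return loss.item()
```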
June 2025 · 50 Reads
Chlorophyll underpins ocean productivity, yet simulating chlorophyll across biomes, seasons and depths remains challenging for Earth system models. Inconsistencies are often attributed to misrepresentation of the myriad nutrient supply, growth and loss processes that govern phytoplankton biomass. They may also arise, however, from unresolved or misspecified photoacclimation or photoadaptation responses. A series of global ocean ecosystem simulations were conducted to assess these latter sensitivities: alternative photoacclimation schemes implicitly modulated investments in light harvesting versus photodamage avoidance and other cellular functions. Photoadaptation experiments probed the impact of adding low‐ and high‐light adapted phytoplankton ecotypes. Results showed that photoacclimation and photoadaptation alternatives generate chlorophyll differences exceeding a factor of 2 in some regions and seasons. In stratified waters, photoadaptation and acclimation to light levels over mixing depths consistent with the timescale of photoadaptation (days) benefitted model performance. In regions and seasons with deep mixed layers, surface‐skewed photoacclimation yielded improved fidelity across satellite chlorophyll products. Large photoacclimation‐driven differences in chlorophyll concentration had small impacts on primary productivity and carbon export, unlike those arising from changes in the nutrient supply. Improved photoacclimation and photoadaptation constraints are thus needed to reduce ambiguities in the drivers of chlorophyll change and their biogeochemical implications.
June 2025 · 164 Reads
Recent advances in urban climate modeling resolution have improved the representation of complex urban environments, with large‐eddy simulation (LES) as a key approach, capturing not only building effects but also urban vegetation and other critical urban processes. Coupling these ultrafine‐resolution (hectometric and finer) approaches with larger‐scale regional and global models provides a promising pathway for cross‐scale urban climate simulations. However, several challenges remain, including the high computational cost that limits most urban LES applications to short‐term, small‐domain simulations, uncertainties in physical parameterizations, and gaps in representing additional urban processes. Addressing these limitations requires advances in computational techniques, numerical schemes, and the integration of diverse observational data. Machine learning presents new opportunities by emulating certain computationally expensive processes, enhancing data assimilation, and improving model accessibility for decision‐making. Future ultrafine‐resolution urban climate modeling should be more end‐user oriented, ensuring that model advancements translate into effective strategies for heat mitigation, disaster risk reduction, and sustainable urban planning.
June 2025 · 23 Reads
Aerosols emitted from biomass burning affect human health and climate, both regionally and globally. The magnitude of these impacts is altered by the biomass burning plume injection height (BB‐PIH). However, these alterations are not well understood on a global scale. We present a novel implementation of BB‐PIH in global simulations with an atmospheric chemistry model (GEOS‐Chem) coupled with detailed TwO‐Moment Aerosol Sectional (TOMAS) microphysics. We conduct BB‐PIH simulations under three scenarios: (a) all smoke is well mixed into the boundary layer, and (b) and (c) the smoke injection height is based on Global Fire Assimilation System (GFAS) plume heights. Elevating the BB‐PIH increases the simulated global‐mean aerosol optical depth (10%) despite a global‐mean decrease (1%) in near‐surface PM2.5. The increased tropospheric column mass enhances the cooling from the global‐mean clear‐sky biomass burning direct radiative effect. However, increasing the BB‐PIH places more smoke above clouds in some regions; thus, the all‐sky biomass burning direct radiative effect shows weaker cooling in these regions as a result. Elevating the BB‐PIH increases the simulated global‐mean cloud condensation nuclei concentrations at low‐cloud altitudes, strengthening the global‐mean cooling of the biomass burning aerosol indirect effect, with a more‐than‐twofold increase over marine areas. Elevating the BB‐PIH also generally improves model agreement with the satellite‐retrieved total and smoke extinction coefficient profiles. Our 2‐year global simulations with the new BB‐PIH capability enable understanding of the global‐scale impacts of BB‐PIH modeling on simulated air quality and radiative effects, going beyond the current understanding, which is limited to specific biomass burning regions and seasons.
June 2025 · 33 Reads
We describe internal, low‐frequency variability in a 21‐year simulation with a cloud‐resolving model. The model domain is the length of the equatorial Pacific and includes a slab ocean, which permits coherent cycles of sea surface temperature (SST), atmospheric convection, and the convectively coupled circulation. The warming phase of the cycle is associated with near‐uniform SST, less organized convection, and sparse low cloud cover, while the cooling phase exhibits strong SST gradients, highly organized convection, and enhanced low cloudiness. Both phases are quasi‐stable but, on long timescales, are ultimately susceptible to instabilities resulting in rapid phase transitions. The internal cycle is leveraged to understand the factors controlling the strength and structure of the tropical overturning circulation and the stratification of the tropical troposphere. The overturning circulation is strongly modulated by convective organization, with SST playing a lesser role. When convection is highly organized, the circulation is weaker and more bottom‐heavy. Alternatively, tropospheric stratification depends on both convective organization and SST, depending on the vertical level. SST‐driven variability dominates aloft while organization‐driven variability dominates at lower levels. A similar pattern is found in ERA5 reanalysis of the equatorial Pacific. The relationship between convective organization and stratification is explicated using a simple entraining plume model. The results highlight the importance of convective organization for tropical variability and lay a foundation for future work using coupled, idealized models that explicitly resolve convection.
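For context, the bulk entraining-plume relation on which such simple models are built takes the standard form (the study's exact formulation may differ):

\[ \frac{d\varphi_p}{dz} = -\,\epsilon\,\bigl(\varphi_p - \varphi_e\bigr) \]

where \(\varphi_p\) is a conserved plume property such as moist static energy, \(\varphi_e\) its environmental value, and \(\epsilon\) the fractional entrainment rate; the stratification the plume sets aloft thus depends on how much environmental air is mixed in on the way up.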
June 2025 · 28 Reads
Accurately modeling the large‐scale transport of trace gases and aerosols is critical for interpreting past (and projecting future) changes in atmospheric composition. Simulations of the stratospheric mean age‐of‐air continue to show persistent biases in chemistry climate models, although the drivers of these biases are not well understood. Here we identify one driver of simulated stratospheric transport differences among various NASA Global Earth Observing System (GEOS) candidate model versions under consideration for the upcoming GEOS Retrospective analysis for the 21st Century (GEOS‐R21C). In particular, we show that the simulated age‐of‐air values are sensitive to the so‐called "remapping" algorithm used within the finite‐volume dynamical core, which controls how individual material surfaces are vertically interpolated back to standard pressure levels after each horizontal advection time step. Differences in the age‐of‐air resulting from changes within the remapping algorithm approach ∼1 year over the high‐latitude middle stratosphere—or about 30% of climatological mean values—and imprint on several trace gases, including methane (CH4) and nitrous oxide (N2O). These transport sensitivities reflect, to first order, changes in the strength of tropical upwelling in the lower stratosphere (70–100 hPa), which are driven by changes in resolved wave convergence over northern midlatitudes as (critical lines of) wave propagation shift in latitude. Our results strongly support continued examination of the role of numerics in contributing to transport biases in composition modeling.
June 2025 · 105 Reads
An online paleoclimate data assimilation (PDA) that utilizes climate forecasts from a deep learning‐based network (NET), along with assimilation of proxies, to reconstruct surface air temperature is investigated here. The NET is trained on ensemble simulations from the Community Earth System Model‐Last Millennium Ensemble. Owing to its ability to capture nonlinear features of the high‐dimensional input, the NET achieves better predictive skill than the linear inverse model (LIM) in a reduced empirical orthogonal function (EOF) space. Thus, an alternative for online PDA is to couple the NET with the integrated hybrid ensemble Kalman filter (IHEnKF). Moreover, an analog blending strategy is proposed to increase ensemble spread and mitigate filter divergence, which blends analog ensembles, selected from climatological samples based on the proxies, with cycling ensembles advanced by the NET. To account for the underestimated uncertainties of real proxy data, an observation error inflation method is applied, which inflates the proxy error variance based on a comparison between the estimated proxy error variance and its climatological innovation. Consistent results are obtained from the pseudoproxy experiments and the real proxy experiments. The more informative ensemble priors from the online PDA using the NET yield better reconstructions than those from the online PDA using the LIM, and both outperform the offline PDA with randomly sampled climatological ensemble priors. The advantages of the online PDA with the NET over the online PDA with the LIM and the offline PDA become more pronounced as the proxy data become sparser.
May 2025 · 15 Reads
Data‐driven weather prediction (DDWP) has made significant advancements in recent years. However, weather prediction using DDWPs still requires an accurate initial field as the input. To fulfill this requirement, the four‐dimensional variational (4DVar) approach can offer initial fields. Recent studies have demonstrated the potential of deep learning (DL)‐based methods in accelerating 4DVar. In this study, we propose a novel model called the 4DVar‐informed generative adversarial network (4DVarGAN), which combines prior knowledge from 4DVar with the conditional generative network (CGAN). We employ a CGAN to non‐iteratively solve the 4DVar cost function and utilize a cycle‐consistent adversarial learning framework for data augmentation. Additionally, we incorporate a 4DVar‐based adaptive adjustment to the output of the proposed model's analysis increment‐generating component, which promotes reasonable stabilization. Experimental results using 500 hPa geopotential fields from the WeatherBench data set demonstrate that our approach achieves a 73‐fold acceleration compared to the 4DVar implemented by the DDWP model. Furthermore, our model exhibits the lowest initial and forecast errors, outperforming state‐of‐the‐art DL‐based data assimilation (DA) methods. Moreover, our method demonstrates effective performance when starting from background fields of varying qualities, consistently achieving stable results. These findings highlight the potential of CGANs in enhancing the reliability of data‐driven DA by incorporating the prior knowledge of the 4DVar method.
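For context, the strong-constraint 4DVar cost function that the CGAN learns to minimize non-iteratively has the conventional form (standard notation; the paper's configuration may differ in detail):

\[ J(\mathbf{x}_0)=\tfrac{1}{2}(\mathbf{x}_0-\mathbf{x}_b)^{\mathsf T}\mathbf{B}^{-1}(\mathbf{x}_0-\mathbf{x}_b)+\tfrac{1}{2}\sum_{i=0}^{N}\bigl(H_i(M_{0\to i}(\mathbf{x}_0))-\mathbf{y}_i\bigr)^{\mathsf T}\mathbf{R}_i^{-1}\bigl(H_i(M_{0\to i}(\mathbf{x}_0))-\mathbf{y}_i\bigr) \]

with background state \(\mathbf{x}_b\), background-error covariance \(\mathbf{B}\), observations \(\mathbf{y}_i\), observation operators \(H_i\), model propagators \(M_{0\to i}\), and observation-error covariances \(\mathbf{R}_i\); 4DVarGAN replaces the iterative minimization of \(J\) with a single generative forward pass.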
May 2025 · 9 Reads
Cloud microphysics has important consequences for climate and weather phenomena, and inaccurate representations can limit forecast accuracy. While atmospheric models increasingly resolve storms and clouds, the accuracy of the underlying microphysics remains limited by computationally expedient bulk moment schemes based on simplifying assumptions. Droplet‐based Lagrangian schemes are more accurate but are underutilized due to their large computational overhead. Machine learning (ML)‐based schemes can bridge this gap by learning from vast droplet‐based simulation data sets, but have so far struggled to match the accuracy and stability of bulk moment schemes. To address this challenge, we developed SuperdropNet, an ML‐based emulator of Lagrangian superdroplet simulations. To improve accuracy and stability, we employ multi‐step autoregressive prediction during training, impose physical constraints, and carefully control stochasticity in the training data. SuperdropNet predicted hydrometeor states and cloud‐to‐rain transition times more accurately than previous ML emulators, and matched or outperformed bulk moment schemes in many cases. We further carried out detailed analyses to reveal how multistep autoregressive training improves performance, and how the performance of SuperdropNet and other microphysical schemes depends on the hydrometeors' mass, number and size distributions. Together, our results suggest that ML models can effectively emulate cloud microphysics, in a manner consistent with droplet‐based simulations.
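One concrete example of the kind of physical constraint mentioned above is requiring that total liquid mass be conserved when the emulator transfers mass between cloud and rain categories. The post-processing sketch below is a hypothetical illustration, not necessarily how SuperdropNet enforces its constraints.

```python
def enforce_mass_conservation(pred_cloud_mass, pred_rain_mass, total_mass_before):
    """Rescale predicted cloud and rain water masses so their sum matches the
    known total liquid mass, which collision-coalescence alone cannot change."""
    total_pred = pred_cloud_mass + pred_rain_mass
    if total_pred <= 0:
        return 0.0, 0.0
    scale = total_mass_before / total_pred
    return pred_cloud_mass * scale, pred_rain_mass * scale

# Example: the raw network output gains mass spuriously; rescaling removes it.
qc, qr = enforce_mass_conservation(0.62e-3, 0.45e-3, 1.0e-3)  # mixing ratios, kg/kg
print(qc + qr)   # 1.0e-3, total liquid mass preserved
```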
May 2025 · 48 Reads
Snow cover fraction (SCF) accuracy in land surface models (LSMs) impacts the accuracy of surface albedo and land‐atmosphere interactions. However, SCF is a large source of uncertainty, partially because of the scale‐dependent nature of snow depletion curves that is not parameterized by LSMs. Using the spatially and temporally complete observationally‐informed STC‐MODSCAG and Snow Data Assimilation System data sets, we develop a new scale‐aware ground SCF parameterization and implement it into the Noah‐MP LSM. The new scale‐aware parameterization significantly reduces ground SCF errors and the scale‐dependence of errors in the western U.S. (WUS) compared with the baseline ground SCF formulation. Specifically, the baseline formulation overestimates ground SCF by 4%, 6%, 9%, and 12% at 1‐km, 3‐km, 13‐km, and 25‐km resolutions in the WUS, respectively, whereas biases from the enhanced scale‐aware scheme are reduced to 0%–2% in box model simulations and do not exhibit a relationship with spatial scales. Noah‐MP simulations using the scale‐aware parameterization have smaller mean (peak) ground SCF biases than the baseline simulation by 1%–2% (3%–5%), with spatiotemporal variability depending on land cover, topography, and snow depth. Noah‐MP simulations using the enhanced scale‐aware parameterization remove the baseline WUS surface albedo overestimates of 0.01–0.03 in the 1‐km to 25‐km resolution simulations, relative to Moderate Resolution Imaging Spectroradiometer retrievals. The Noah‐MP ground SCF and surface albedo improvements due to the scale‐aware parameterization are found across most land cover classifications and elevations, indicating the enhanced ground SCF scheme can improve simulated snowpack and surface energy budget accuracy across a variety of WUS landscapes.
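The essential idea of a scale-aware depletion curve can be illustrated schematically: SCF saturates with snow depth, and the steepness of that saturation is made a function of grid spacing so that coarser cells approach full cover more slowly. The functional form and constants below are purely illustrative placeholders, not the parameterization developed in the paper.

```python
import numpy as np

def scale_aware_scf(snow_depth_m, grid_spacing_km, k0=20.0, alpha=0.2):
    """Schematic scale-aware snow cover fraction: a saturating depletion curve
    whose steepness decreases with grid spacing, so coarse cells need deeper
    snow to approach full cover. k0 and alpha are illustrative constants."""
    k = k0 * grid_spacing_km ** (-alpha)        # shallower curve at coarse resolution
    return np.tanh(k * snow_depth_m)

# Example: 10 cm of snow covers most of a 1-km cell but less of a 25-km cell.
print(scale_aware_scf(0.10, 1.0), scale_aware_scf(0.10, 25.0))
```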
May 2025 · 25 Reads
This study addresses the boundary artifacts in machine‐learned (ML) parameterizations for ocean subgrid mesoscale momentum forcing, as identified in the online ML implementation from a previous study (Zhang et al., 2023, https://doi.org/10.1029/2023ms003697). We focus on the boundary condition (BC) treatment within the existing convolutional neural network (CNN) models and aim to mitigate the “out‐of‐sample” errors observed near complex coastal regions without developing new, complex network architectures. Our approach leverages two established strategies for placing BCs in CNN models, namely zero and replicate padding. Offline evaluations revealed that these padding strategies significantly reduce root mean squared error (RMSE) in coastal regions by limiting the dependence on random initialization of weights and restricting the range of out‐of‐sample predictions. Further online evaluations suggest that replicate padding consistently reduces boundary artifacts across various retrained CNN models. In contrast, zero padding sometimes intensifies artifacts in certain retrained models despite both strategies performing similarly in offline evaluations. This study underscores the need for BC treatments in CNN models trained on open water data when predicting near‐coastal subgrid forces in ML parameterizations. The application of replicate padding, in particular, offers a robust strategy to minimize the propagation of extreme values that can contaminate computational models or cause simulations to fail. Our findings provide insights for enhancing the accuracy and stability of ML parameterizations in the online implementation of ocean circulation models with coastlines.
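In frameworks such as PyTorch, the two boundary treatments compared here map directly onto the built-in padding modes of convolutional layers; a minimal sketch with illustrative layer sizes:

```python
import torch
import torch.nn as nn

# Zero padding: values outside the domain (e.g., past the coastline) are treated as 0.
conv_zero = nn.Conv2d(in_channels=2, out_channels=32, kernel_size=3,
                      padding=1, padding_mode="zeros")

# Replicate padding: the nearest interior value is repeated outward, avoiding an
# artificial jump at the boundary.
conv_repl = nn.Conv2d(in_channels=2, out_channels=32, kernel_size=3,
                      padding=1, padding_mode="replicate")

x = torch.randn(1, 2, 64, 64)          # e.g., (batch, [u, v] velocity fields, ny, nx)
print(conv_zero(x).shape, conv_repl(x).shape)   # both (1, 32, 64, 64)
```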
May 2025 · 11 Reads
Using a three‐dimensional cloud‐resolving model, a systematic exploration is undertaken of the response of a radiative‐convective equilibrium state to imposed vertical wind shear of varying magnitude. Domain‐averaged surface precipitation exhibits a non‐monotonic sensitivity to increasing shear magnitude, characterized by a decrease with increasing shear for weakly sheared conditions (<1.5 × 10⁻³ s⁻¹) and an increase under stronger shear (>1.5 × 10⁻³ s⁻¹), with a similar trend in surface heat fluxes. During the first 30–40 min after wind shear is imposed, convective activity and rainfall are suppressed, which is attributed to increased surface drag and reduced boundary layer eddy kinetic energy. As the shear persists over time, it eventually fosters the development of deep convection. An analysis of the condensed water budget shows that the overall response of the domain‐mean surface precipitation rate to increasing shear magnitude is mainly explained by changes in condensation rate, which in turn is primarily controlled by the cloudy updraft mass flux. In the lower to middle troposphere where most condensation occurs, cloudy updraft fraction steadily increases with increasing shear magnitude, whereas mean updraft vertical velocity exhibits a general decreasing trend as the shear magnitude increases. The compensating responses of updraft fraction and mean vertical velocity explain the non‐monotonic surface precipitation response to vertical wind shear. Vertical shear does not significantly impact the evaporation or precipitation efficiencies.
May 2025 · 12 Reads
Predicting Earth systems weeks or months into the future is an important yet challenging problem due to the high dimensionality, chaotic behavior, and coupled dynamics of the ocean, atmosphere, and other subsystems of the Earth. Numerical models invariably contain model error due to incomplete domain knowledge, limited capabilities of representation, and unresolved processes due to finite spatial resolution. Hybrid modeling, the pairing of a physics‐driven model with a data‐driven component, has shown promise in outperforming both purely physics‐driven and data‐driven approaches in predicting complex systems. Here we demonstrate two new hybrid methods that combine uninitialized temporal or spatiotemporal models with a data‐driven component that may be modally decomposed to give insight into model bias, or used to correct the bias of model projections. These techniques are demonstrated on a simulated chaotic system and two empirical ocean variables: coastal sea surface elevation and sea surface temperature, which highlight that the inclusion of the data‐driven components increases the state accuracy of their short‐term evolution. Our work demonstrates that these hybrid approaches may prove valuable for: improving models during model development, creating novel methods for data assimilation, and enhancing the predictive accuracy of forecasts when available models have significant structural error.
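The second hybrid strategy described above (a data-driven component that corrects the bias of an uninitialized model projection) can be sketched as a simple residual-learning setup. The toy signal, the lagged-regression correction, and the in-sample style of evaluation below are illustrative assumptions, not the paper's method.

```python
import numpy as np
from sklearn.linear_model import Ridge

# Toy example: a biased, uninitialized "physics" projection of an ocean variable
# (e.g., coastal sea surface height) versus the observed truth.
t = np.arange(3000)
truth = np.sin(0.02 * t) + 0.3 * np.sin(0.11 * t)
physics = np.sin(0.02 * t) + 0.1               # misses one mode and carries a bias

residual = truth - physics                     # what the data-driven part must learn

# Build lagged predictors from past values of the residual (requires observations
# up to the previous step, so this is an analysis-time correction, not a forecast).
lags = 10
X = np.column_stack([residual[i : len(residual) - lags + i] for i in range(lags)])
y = residual[lags:]

split = 2000
model = Ridge(alpha=1.0).fit(X[:split], y[:split])

# Hybrid estimate = physics projection + learned residual correction.
corrected = physics[lags:][split:] + model.predict(X[split:])
err_physics = np.mean((physics[lags:][split:] - truth[lags:][split:]) ** 2)
err_hybrid = np.mean((corrected - truth[lags:][split:]) ** 2)
print(err_physics, err_hybrid)                 # hybrid error should be smaller
```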