Laura P. Swiler’s research while affiliated with Sandia National Laboratories and other places


Publications (123)


Figure 11: Spearman correlations across multiple emission angles. (a) Directional autoencoder at 26°; (b) directional autoencoder at 0°; (c) traditional autoencoder at 0°.
AutoSciLab: A Self-Driving Laboratory For Interpretable Scientific Discovery
  • Preprint
  • File available

December 2024 · 15 Reads

Saaketh Desai · [...] · Jeffrey Y. Tsao · [...] · Prasad P. Iyer

Advances in robotic control and sensing have propelled the rise of automated scientific laboratories capable of high-throughput experiments. However, these laboratories are currently limited by human intuition in their ability to efficiently design and interpret experiments in high-dimensional spaces, throttling scientific discovery. We present AutoSciLab, a machine learning framework for driving autonomous scientific experiments that acts as a surrogate researcher for scientific discovery in high-dimensional spaces. AutoSciLab autonomously follows the scientific method in four steps: (i) generating high-dimensional experiments (x ∈ ℝ^D) using a variational autoencoder; (ii) selecting optimal experiments by forming hypotheses using active learning; (iii) distilling the experimental results to discover relevant low-dimensional latent variables (z ∈ ℝ^d, with d ≪ D) with a 'directional autoencoder'; and (iv) learning a human-interpretable equation connecting the discovered latent variables with a quantity of interest (y = f(z)) using a neural network equation learner. We validate the generalizability of AutoSciLab by rediscovering (a) the principles of projectile motion and (b) the phase transitions within the spin states of the Ising model (an NP-hard problem). Applying the framework to an open-ended nanophotonics challenge, AutoSciLab uncovers a fundamentally novel method for directing incoherent light emission that surpasses the current state of the art (Iyer et al. 2023b, 2020).
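As a rough illustration of step (iii), the sketch below uses plain PCA as a simple linear stand-in for the paper's directional autoencoder (which is nonlinear): it recovers low-dimensional latent coordinates z ∈ ℝ^d from synthetic high-dimensional "experiments". All data, dimensions, and the choice of PCA are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Stand-in for distilling experiments x in R^D down to latents z in R^d, d << D.
rng = np.random.default_rng(0)

D, d, n = 50, 2, 200                  # ambient dim, latent dim, no. of experiments
z_true = rng.normal(size=(n, d))      # hidden low-dimensional structure
mixing = rng.normal(size=(d, D))      # linear map embedding z into R^D
X = z_true @ mixing + 0.01 * rng.normal(size=(n, D))  # observed experiments + noise

Xc = X - X.mean(axis=0)               # center before computing principal axes
_, s, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:d].T                     # recovered latent coordinates z in R^d

explained = (s[:d] ** 2).sum() / (s ** 2).sum()
print(f"variance captured by {d} latent dims: {explained:.3f}")
```

Because the synthetic data is genuinely rank-2 plus small noise, two latent dimensions capture nearly all the variance; the directional autoencoder plays the analogous role for nonlinear structure.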


Spatio-temporal Multivariate Cluster Evolution Analysis for Detecting and Tracking Climate Impacts

October 2024 · 28 Reads

Recent years have seen growing concern about climate change and its impacts. While Earth System Models (ESMs) are invaluable tools for studying these impacts, the complex coupling processes encoded in ESMs and the large volumes of data they produce, together with the high internal variability of the Earth system, can obscure important source-to-impact relationships. This paper presents a novel and efficient unsupervised, data-driven approach for detecting statistically significant impacts and tracing spatio-temporal source-impact pathways in the climate through a unique combination of ideas from anomaly detection, clustering, and Natural Language Processing (NLP). Using the 1991 eruption of Mount Pinatubo in the Philippines as an exemplar, we demonstrate that the proposed approach is capable of detecting known post-eruption impacts/events. We additionally describe a methodology for extracting meaningful sequences of post-eruption impacts/events by using NLP to efficiently mine frequent multivariate cluster evolutions, which can be used to confirm or discover the chain of physical processes between a climate source and its impact(s).
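The frequent-sequence-mining idea can be sketched with a toy example: encode each region's cluster memberships over time as a "sentence" of labels, then count frequent n-grams of labels as candidate impact pathways, borrowing directly from NLP. The labels and sequences below are invented for illustration and are not from the paper's actual Pinatubo analysis.

```python
from collections import Counter

# Each "sentence" is one region's cluster-label trajectory over time (illustrative).
sequences = [
    ["aerosol_high", "temp_drop", "rain_deficit", "soil_dry"],
    ["aerosol_high", "temp_drop", "rain_deficit"],
    ["temp_drop", "soil_dry"],
]

def frequent_ngrams(seqs, n, min_support=2):
    """Count all length-n label subsequences and keep those seen >= min_support times."""
    counts = Counter(
        tuple(seq[i:i + n]) for seq in seqs for i in range(len(seq) - n + 1)
    )
    return {gram: c for gram, c in counts.items() if c >= min_support}

print(frequent_ngrams(sequences, n=2))
```

Frequent bigrams such as (aerosol_high → temp_drop) then serve as candidate links in a source-to-impact chain, to be confirmed against known physics.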


Figure 5. Soil moisture deficit index (SMDI_2) for the top 2 feet of ground depth, evaluated seasonally from 1991 to 1995. Grey shading marks grid cells where SMDI_2 is not statistically significant relative to the counterfactual ensemble.
Figure 10. Spatially averaged drought indices (SMDI_2 and ETDI) and anomalies for other drivers (surface temperature, precipitation plus irrigation, actual and potential evapotranspiration, and transpiration) at weekly scale for the Middle East (MDE) region (latitude 30° N–45° N, longitude 27° E–60° E; eastern Mediterranean / western Asia).
Figure 11. Spatially averaged drought indices (SMDI_2 and ETDI) and anomalies for the same drivers at weekly scale for the Northern Asia region (latitude 50° N–75° N, longitude 55° E–110° E). Blue stars mark weeks with average surface temperature below freezing.
Table: Details of regions demarcated according to regional characteristics at weekly scale.
Mount Pinatubo’s effect on the moisture-based drivers of plant productivity

September 2024 · 39 Reads

Large volcanic eruptions can significantly affect the state of the climate, including stratospheric sulfate concentrations, surface and top-of-atmosphere radiative fluxes, stratospheric and surface temperature, and regional hydroclimate. High natural variability in how regional rainfall responds to volcanic-induced climate perturbations creates a knowledge gap in our understanding of how eruptions affect ecohydrological conditions and plant productivity. Here we explore an understudied store (soil moisture) and flux (evapotranspiration) of water as short-term ecohydrological controls on plant productivity in response to the 1991 eruption of Mt. Pinatubo. We used NASA's Earth system model to simulate the 1991 Mt. Pinatubo eruption and detect the hydroclimate response. The model simulates a radiative perturbation of −5 W m⁻² and a mean surface cooling of ~0.5 °C following the eruption. The rainfall response is spatially heterogeneous, due to dominating internal variability, yet still shows suppressed rainfall in the northern hemisphere after the eruption. We find that up to 10–15% of land regions show a statistically significant agricultural response. Results confirm that analyzing these higher-order impacts yields a more robust understanding of inferred plant productivity impacts. Our results also explain the geographical dependence of the various contributing factors to the compound response and their implications for exploring the climate impacts of such episodic forcings.


Conditional multi-step attribution for climate forcings

September 2024 · 3 Reads

Attribution of climate impacts to a source forcing is critical to understanding, communicating, and addressing the effects of human influence on the climate. While standard attribution methods, such as optimal fingerprinting, have been successfully applied to long-term, widespread effects such as global surface temperature warming, they often struggle in low signal-to-noise regimes, typical of short-term climate forcings or climate variables which are loosely related to the forcing. Single-step approaches, which directly relate a source forcing and final impact, are unable to utilize additional climate information to improve attribution certainty. To address this shortcoming, this paper presents a novel multi-step attribution approach which is capable of analyzing multiple variables conditionally. A connected series of climate effects are treated as dependent, and relationships found in intermediary steps of a causal pathway are leveraged to better characterize the forcing impact. This enables attribution of the forcing level responsible for the observed impacts, while equivalent single-step approaches fail. Utilizing a scalar feature describing the forcing impact, simple forcing response models, and a conditional Bayesian formulation, this method can incorporate several causal pathways to identify the correct forcing magnitude. As an exemplar of a short-term, high-variance forcing, we demonstrate this method for the 1991 eruption of Mt. Pinatubo. Results indicate that including stratospheric and surface temperature and radiative flux measurements increases attribution certainty compared to analyses derived solely from temperature measurements. This framework has potential to improve climate attribution assessments for both geoengineering projects and long-term climate change, for which standard attribution methods may fail.
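A minimal numerical sketch of the conditional multi-step idea, using invented linear response models and noise levels (not the paper's actual models): chaining a likelihood for an intermediate variable m (say, a stratospheric temperature feature) with one for the final impact y sharpens the posterior over the forcing magnitude f relative to a single-step analysis using y alone.

```python
import numpy as np

# Grid over candidate forcing magnitudes; all models below are toy assumptions.
forcings = np.linspace(0.0, 20.0, 201)
f_true = 10.0
m_obs = 0.8 * f_true + 0.5          # observed intermediate feature (toy model m = 0.8 f + 0.5)
y_obs = 1.5 * m_obs - 1.0           # observed final impact (toy model y = 1.5 m - 1.0)

def gauss(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

prior = np.ones_like(forcings)                                        # flat prior p(f)
like_m = gauss(m_obs, 0.8 * forcings + 0.5, 1.0)                      # p(m | f)
like_y = gauss(y_obs, 1.5 * (0.8 * forcings + 0.5) - 1.0, 2.0)        # p(y | f) via the chain

post = prior * like_m * like_y      # conditional multi-step posterior p(f | m, y)
post /= post.sum()

post_single = prior * like_y        # single-step posterior p(f | y) only
post_single /= post_single.sum()

def posterior_std(p):
    mu = (forcings * p).sum()
    return np.sqrt(((forcings - mu) ** 2 * p).sum())

print("MAP forcing:", forcings[post.argmax()])
print("multi-step std:", posterior_std(post), "single-step std:", posterior_std(post_single))
```

Both posteriors peak at the true forcing, but the multi-step posterior is narrower: the intermediate observation contributes independent information along the causal pathway, which is the mechanism the abstract describes.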





Beyond PCA: Additional Dimension Reduction Techniques to Consider in the Development of Climate Fingerprints

December 2023 · 16 Reads · 2 Citations

Journal of Climate

Dimension reduction techniques are an essential part of the climate analyst's toolkit. Due to the enormous scale of climate data, dimension reduction methods are used to identify major patterns of variability within climate dynamics, to create compelling and informative visualizations, and to quantify major named modes such as the El Niño–Southern Oscillation. Principal Components Analysis (PCA), also known as the method of empirical orthogonal functions (EOFs), is the most commonly used form of dimension reduction, characterized by a remarkable confluence of attractive mathematical, statistical, and computational properties. Despite its ubiquity, PCA suffers from several difficulties relevant to climate science: a high computational burden with large data sets, decreased statistical accuracy in high dimensions, and difficulty comparing results across multiple data sets. In this paper, we introduce several variants of PCA that are likely to be of use in the climate sciences and that address these problems. Specifically, we introduce non-negative, sparse, and tensor PCA and demonstrate how each approach provides superior pattern recognition in climate data. We also discuss approaches to comparing PCA-family results within and across data sets in a domain-relevant manner. We demonstrate these approaches through an analysis of several runs of the E3SM climate model from 1991 to 1995, focusing on the simulated response to the Mt. Pinatubo eruption; our findings are consistent with a recently identified stratospheric warming fingerprint associated with this type of stratospheric aerosol injection.
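The interpretability argument for sparse PCA can be demonstrated on synthetic data: with an L1 penalty, the leading component's loadings contain exact zeros, isolating a localized mode, whereas ordinary PCA loadings are dense. The data, penalty strength, and use of scikit-learn's SparsePCA are illustrative choices, not the paper's setup.

```python
import numpy as np
from sklearn.decomposition import PCA, SparsePCA

rng = np.random.default_rng(1)
n, p = 100, 30
X = rng.normal(size=(n, p))
X[:, :5] += 3.0 * rng.normal(size=(n, 1))   # one localized mode shared by 5 features

dense = PCA(n_components=1).fit(X)
sparse = SparsePCA(n_components=1, alpha=2.0, random_state=0).fit(X)

# Sparse loadings have exact zeros on the unrelated features; dense loadings do not.
print("zeros in dense loading: ", int((dense.components_ == 0).sum()))
print("zeros in sparse loading:", int((sparse.components_ == 0).sum()))
```

For climate fields, the zeroed loadings correspond to grid cells excluded from a pattern, which is what makes a sparse component easier to read as a regional fingerprint.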


Machine Learning Surrogates of a Fuel Matrix Degradation Process Model for Performance Assessment of a Nuclear Waste Repository

May 2023 · 124 Reads · 4 Citations

Nuclear Technology

Spent nuclear fuel repository simulations are currently unable to incorporate detailed Fuel Matrix Degradation (FMD) process models due to their computational cost, especially when large numbers of waste packages breach. This paper uses machine learning to develop artificial neural network and k-nearest-neighbor regression surrogate models that approximate the detailed FMD process model while being computationally much faster to evaluate. Using fuel cask temperature, dose rate, and the environmental concentrations of CO₃²⁻, O₂, Fe²⁺, and H₂ as inputs, these surrogates show good agreement with the FMD process model predictions of the UO₂ degradation rate for conditions within the range of the training data. A demonstration in a full-scale shale repository reference case simulation shows that incorporating the surrogate models captures local and temporal environmental effects on fuel degradation rates while retaining good computational efficiency.
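A minimal sketch of the surrogate approach using scikit-learn's k-nearest-neighbor regressor; the response function, input ranges, and sample sizes below are invented stand-ins for the actual FMD process model and training data.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(2)

def fmd_stand_in(T, dose, h2):
    # Hypothetical smooth response surface; the real FMD chemistry differs.
    return np.exp(-2000.0 / T) * (1.0 + dose) / (1.0 + 10.0 * h2)

def to_physical(U):
    # Map unit-cube samples to physical ranges (kNN distances work best on scaled inputs):
    # temperature 300-400 K, dose rate 0-5 (arbitrary units), H2 concentration 0-1.
    return 300.0 + 100.0 * U[:, 0], 5.0 * U[:, 1], U[:, 2]

# Train the surrogate on 500 runs of the (stand-in) expensive process model.
U_train = rng.uniform(size=(500, 3))
y_train = fmd_stand_in(*to_physical(U_train))
surrogate = KNeighborsRegressor(n_neighbors=5).fit(U_train, y_train)

# Evaluate surrogate accuracy on held-out conditions within the training range.
U_test = rng.uniform(size=(100, 3))
err = np.abs(surrogate.predict(U_test) - fmd_stand_in(*to_physical(U_test))).mean()
print(f"mean abs surrogate error: {err:.2e}  (mean rate: {y_train.mean():.2e})")
```

The repository simulation would then call `surrogate.predict` per breached package instead of the full process model, which is the source of the speedup the abstract reports.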



Citations (80)


... Recent years have also seen the emergence of deep learning-based methods for climate attribution, detection and impact analysis [16,17,18,19,20]. These approaches typically train a neural network (NN) on an ensemble of climate data and use the NN to make predictions or as a surrogate in an inverse attribution workflow. ...

Reference:

Random forest regression feature importance for climate impact pathway detection
Stratospheric aerosol source inversion: Noise, variability, and uncertainty quantification
  • Citing Article
  • January 2024

Journal of Machine Learning for Modeling and Computing

... In this approach, spatial and/or temporal patterns are established under various disturbances (i.e., greenhouse gases, aerosol loading, etc.) and matched to observations [2,3]. Although the past few years have seen extensions of fingerprinting to regional analyses [4,5], multiple variables [6,7] and challenging problems with very small signal-to-noise ratios [8,9,10,11], the method is designed to work within a single step. There has been some recent work to develop conditional multi-step fingerprinting methods [12], but this field is still in its infancy. ...

Beyond PCA: Additional Dimension Reduction Techniques to Consider in the Development of Climate Fingerprints
  • Citing Article
  • December 2023

Journal of Climate

... Recent years have also seen the emergence of deep learning-based methods for climate attribution, detection and impact analysis [16,17,18,19,20]. These approaches typically train a neural network (NN) on an ensemble of climate data and use the NN to make predictions or as a surrogate in an inverse attribution workflow. ...

Solving High-Dimensional Inverse Problems with Auxiliary Uncertainty via Operator Learning with Limited Data
  • Citing Article
  • January 2023

Journal of Machine Learning for Modeling and Computing

... [15] used a surrogate model to accelerate uncertainty quantification of computationally expensive multiphase flow simulations involving heterogeneous porous media with high-dimensional input and function-valued output also in the context of radioactive waste repositories. Also, other studies exist that use surrogate modeling as part of the performance assessment of radioactive waste repositories [23,24,25]. While we show that the metamodeling tools in this study are adequate for the given problem, we are aware at the same time that future studies potentially require the inclusion of a number of additional parameters that might show the limits of the classical tools used herein. ...

Machine Learning Surrogates of a Fuel Matrix Degradation Process Model for Performance Assessment of a Nuclear Waste Repository

Nuclear Technology

... While LSA determines the impact of small input perturbations around nominal values on the model output, GSA considers simultaneously the whole combined variation range of the inputs. While both methods can provide relevant information for T-H-M-C coupling, GSA accounts for non-linearity and interactions among parameters in system responses in a more robust manner (Chaudhry et al. 2021; Delchini et al. 2021; Nguyen et al. 2009; Wainwright et al. 2013). Recently, GSA has also been used to enhance collaboration, education, joint code development, and demonstration of results. ...

Technical note: Extension of the NEAMS Workbench to parallel sensitivity and uncertainty analysis of thermal hydraulic parameters using DAKOTA and NEK5000
  • Citing Article
  • April 2021

Nuclear Engineering and Technology

... The optimal approach may vary for different ACV estimators, such as MLMC and MFMC. In future work, we plan to leverage advancements in multifidelity model tuning [64][65][66][67][68], which involve selecting the numerical discretization of LF models (with the same parameters) to reduce the error of the MF estimator within a given budget. This will enable us to develop algorithms that optimally determine r L for multiple LF models. ...

Exploration of multifidelity UQ sampling strategies for computer network applications
  • Citing Article
  • January 2021

International Journal for Uncertainty Quantification

... For example, a physics-informed kernel can be derived to introduce prior knowledge, 35 or constraints can be incorporated into the GP since boundary conditions are often known in advance in many engineering fields. 36 A detailed overview of the different approaches is given in Cross et al. 37 In the context of SHM, several authors have previously demonstrated that grey-box modelling can significantly improve predictive accuracy despite this research area's relative novelty. Jones et al. 38 presented a constrained kernel regarding the geometry of the structure of interest for damage localisation. ...

A Survey of Constrained Gaussian Process: Approaches and Implementation Challenges
  • Citing Article
  • January 2020

Journal of Machine Learning for Modeling and Computing

... We furthermore assume that the potential is parameterized by a set of material and geometric design parameters that describe the RVE. The universal approximation capabilities of neural networks (NNs) [43] make them a tempting choice as mapping functions to replace the free energy potential, even though other regression techniques have also been employed [44,45]. Through the derivatives of these neural networks, we can find the internal stresses and the tangent moduli. ...

Tensor Basis Gaussian Process Models of Hyperelastic Materials
  • Citing Article
  • January 2020

Journal of Machine Learning for Modeling and Computing

... This is extremely useful in cases where a model needs to perform well across discretizations or resolutions, or when the resolution differs between available training data and desired evaluation scenarios. Recently a great number of operator learning methods have been proposed, including Deep Operator Networks (Deep-ONets) [112,113], Fourier Neural Operators (FNOs) [114,115,116,117], Spectral Neural ...

An active learning high-throughput microstructure calibration framework for solving inverse structure-process problems in materials informatics

Acta Materialia

... Data-driven modeling by using machine learning methods in conjunction with crystal plasticity has been a current research focus (Weber et al., 2022;Veasna et al., 2023). Here, polycrystal simulations were coupled with large-scale applications to investigate mechanical behavior and uncertainty quantification (Tallman et al., 2020). Therefore, today's trends in digitization motivate the need for more efficient and robust crystal plasticity solvers. ...

Uncertainty propagation in reduced order models based on crystal plasticity
  • Citing Article
  • June 2020

Computer Methods in Applied Mechanics and Engineering