Elliot Saba's scientific contributions

Publications (6)

Article
Full-text available
Scientific computing is increasingly incorporating the advancements in machine learning to allow for data-driven physics-informed modeling approaches. However, re-targeting existing scientific computing workloads to machine learning frameworks is both costly and limiting, as scientific simulations te...
Preprint
Full-text available
In this paper we introduce JuliaSim, a high-performance programming environment designed to blend traditional modeling and simulation with machine learning. JuliaSim can build accelerated surrogates from component-based models, such as those conforming to the FMI standard, using continuous-time echo state networks (CTESN). The foundation of this en...
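
The snippet above only names the CTESN technique; the following is a minimal sketch of the underlying idea, assuming a randomly wired reservoir ODE driven by a reference solution and a linear readout fit by least squares. The toy model, sizes, and variable names are illustrative assumptions, not JuliaSim's API.

    # Minimal continuous-time echo state network (CTESN) surrogate sketch.
    # Illustrative only: names and sizes are assumptions, not JuliaSim's API.
    import numpy as np
    from scipy.integrate import solve_ivp

    rng = np.random.default_rng(0)

    # "Full" model we want to surrogate: a damped oscillator, x = (pos, vel).
    def full_model(t, x):
        return np.array([x[1], -x[0] - 0.1 * x[1]])

    t_eval = np.linspace(0.0, 20.0, 400)
    ref = solve_ivp(full_model, (0.0, 20.0), [1.0, 0.0],
                    t_eval=t_eval, dense_output=True)

    # Reservoir: a fixed, randomly wired ODE driven by the reference solution x(t).
    N = 200                                               # reservoir size
    A = rng.normal(scale=1.0 / np.sqrt(N), size=(N, N))   # recurrent weights
    W_in = rng.normal(size=(N, 2))                        # input weights

    def reservoir(t, r):
        return np.tanh(A @ r + W_in @ ref.sol(t))

    res = solve_ivp(reservoir, (0.0, 20.0), np.zeros(N), t_eval=t_eval)

    # Linear readout fit by least squares: W_out such that r(t) @ W_out ~ x(t).
    W_out, *_ = np.linalg.lstsq(res.y.T, ref.y.T, rcond=None)

    # Surrogate prediction is just the cheap projection of the reservoir state.
    x_hat = res.y.T @ W_out
    print("max abs error:", np.abs(x_hat - ref.y.T).max())

In the CTESN construction the expensive reservoir simulation is done once, and only the (linear or RBF) readout is refit at new parameter values, which is what makes the surrogate cheap to evaluate; the sketch above shows the fit at a single parameter point only.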
Preprint
Full-text available
Scientific computing is increasingly incorporating the advancements in machine learning and the ability to work with large amounts of data. At the same time, machine learning models are becoming increasingly sophisticated and exhibit many features often seen in scientific computing, stressing the capabilities of machine learning frameworks. Just as...
Preprint
Full-text available
Machine learning as a discipline has seen an incredible surge of interest in recent years due in large part to a perfect storm of new theory, superior tooling, and renewed interest in its capabilities. We present in this paper a framework named Flux that shows how further refinement of the core ideas of machine learning, built upon the foundation of th...
Preprint
Full-text available
Google's Cloud TPUs are a promising new hardware architecture for machine learning workloads. They have powered many of Google's milestone machine learning achievements in recent years. Google has now made TPUs available for general use on their cloud platform and as of very recently has opened them up further to allow use by non-TensorFlow fronten...

Citations

... While early approaches were primarily based on training ML methods, such as random forests and deep NNs, on observed or simulated physical data [14,15], and on discovering governing physics laws from data [16], recent methods focus on ML approaches where physics is an integral part of the model. This is achieved through physics (PDE)-informed loss functions that combine the existing PDE-based parameterizations of the physical processes with observed/simulated data [17], or through differentiable physics [6,18,19,20,21]. The exceptional success of these methods has brought tremendous attention to the emerging field of physics-informed machine learning [13,22]. ...
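
As a concrete illustration of the physics (PDE)-informed loss mentioned in this snippet, the sketch below combines a data-misfit term with a PDE-residual term for the 1-D Poisson problem u''(x) = f(x). The sine basis, forcing, collocation points, and weighting are illustrative assumptions, not taken from any of the cited works.

    # Sketch of a physics (PDE)-informed loss: data-misfit term plus PDE
    # residual term, for the 1-D Poisson problem u''(x) = f(x) on [0, 1].
    import numpy as np
    from scipy.optimize import minimize

    K = 8                                    # number of sine basis functions
    x_data = np.array([0.2, 0.5, 0.8])       # locations of (noisy) observations
    u_data = np.sin(np.pi * x_data) + 0.01   # observed values of u
    x_col = np.linspace(0.0, 1.0, 50)        # collocation points for the residual
    f = -np.pi**2 * np.sin(np.pi * x_col)    # known forcing, f = u'' for u = sin(pi x)

    def u(theta, x):       # candidate solution: sum_k theta_k sin(k pi x)
        k = np.arange(1, K + 1)
        return np.sin(np.outer(x, k * np.pi)) @ theta

    def u_xx(theta, x):    # its second derivative, analytic for the sine basis
        k = np.arange(1, K + 1)
        return -(np.sin(np.outer(x, k * np.pi)) * (k * np.pi) ** 2) @ theta

    def loss(theta, lam=1.0):
        data_term = np.mean((u(theta, x_data) - u_data) ** 2)
        pde_term = np.mean((u_xx(theta, x_col) - f) ** 2)
        return data_term + lam * pde_term

    result = minimize(loss, np.zeros(K), method="L-BFGS-B")
    print("recovered first coefficient (should be near 1):", result.x[0])

The same structure carries over when the candidate solution is a neural network and the derivatives in the residual are taken by automatic differentiation.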
... Typically, this is a linear projection estimated by least squares. In our work, however, we adopt a non-linear projection, the radial basis function (RBF), from [15]. RBFs are a method of interpolating unstructured data in high-dimensional spaces. ...
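
The RBF construction referred to in this snippet is standard interpolation of scattered data: fit weights so that s(x) = sum_j w_j * phi(||x - x_j||) matches the samples. A minimal sketch follows; the Gaussian kernel and the 2-D toy data are illustrative assumptions, not the setup of the cited work.

    # Minimal radial basis function (RBF) interpolation of scattered data.
    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.uniform(size=(30, 2))                  # scattered sample locations in 2-D
    y = np.sin(3 * X[:, 0]) * np.cos(3 * X[:, 1])  # sampled values

    def phi(r, eps=3.0):                           # Gaussian kernel (assumed choice)
        return np.exp(-(eps * r) ** 2)

    # Pairwise distances between sample points, then solve for the weights.
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    w = np.linalg.solve(phi(D), y)

    def interpolate(x_new):
        d = np.linalg.norm(x_new[None, :] - X, axis=-1)
        return phi(d) @ w

    print(interpolate(np.array([0.4, 0.6])), np.sin(1.2) * np.cos(1.8))

SciPy's scipy.interpolate.RBFInterpolator provides the same construction off the shelf, with a choice of kernels and optional smoothing.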