
Lu Lu
University of Pennsylvania | UP · Department of Chemical and Biomolecular Engineering
Doctor of Philosophy
About
57 Publications
62,990 Reads
2,215 Citations
Introduction
Skills and Expertise
Additional affiliations
September 2020 - May 2021
Education
September 2017 - May 2019
September 2016 - May 2020
September 2016 - May 2017
Publications (57)
Deep neural operators can learn operators mapping between infinite-dimensional function spaces via deep neural networks and have become an emerging paradigm of scientific machine learning. However, training neural operators usually requires a large amount of high-fidelity data, which is often difficult to obtain in real engineering problems. Here,...
Deep learning has been shown to be an effective tool in solving partial differential equations (PDEs) through physics-informed neural networks (PINNs). PINNs embed the PDE residual into the loss function of the neural network, and have been successfully employed to solve diverse forward and inverse PDE problems. However, one disadvantage of the fir...
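The following is a minimal, generic sketch of the PINN idea described in this abstract (not the method proposed in this particular paper): the PDE residual, computed by automatic differentiation, is added to the loss and minimized together with the boundary-condition mismatch. The 1D Poisson problem, network size, and training settings below are illustrative assumptions.

```python
# Minimal PINN sketch (illustrative): solve u''(x) = -pi^2 sin(pi x) on [0, 1]
# with u(0) = u(1) = 0, whose exact solution is u(x) = sin(pi x).
import torch

torch.manual_seed(0)
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

def pde_residual(x):
    x = x.requires_grad_(True)
    u = net(x)
    u_x = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x, x, torch.ones_like(u_x), create_graph=True)[0]
    f = -torch.pi**2 * torch.sin(torch.pi * x)
    return u_xx - f                                  # residual of u'' = f

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
x_col = torch.rand(128, 1)                           # collocation points in [0, 1]
x_bc = torch.tensor([[0.0], [1.0]])                  # boundary points
for step in range(2000):
    loss = (pde_residual(x_col) ** 2).mean() + (net(x_bc) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```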
Neural operators can learn nonlinear mappings between function spaces and offer a new simulation paradigm for real-time prediction of complex dynamics for realistic diverse applications as well as for system identification in science and engineering. Herein, we investigate the performance of two neural operators, which have shown promising results...
As an emerging paradigm in scientific machine learning, neural operators aim to learn operators, via neural networks, that map between infinite-dimensional function spaces. Several neural operators have been recently developed. However, all the existing neural operators are only designed to learn operators defined on a single Banach space, i.e., th...
The dynamics of systems biological processes are usually modeled by a system of ordinary differential equations (ODEs) with many unknown parameters that need to be inferred from noisy and sparse measurements. Here, we introduce systems-biology informed neural networks for parameter estimation by incorporating the system of ODEs into the neural netw...
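As a rough sketch of the general idea described here (treating unknown ODE parameters as trainable variables and penalizing the ODE residual alongside the data misfit): the toy exponential-decay model, its synthetic data, and all hyperparameters below are assumptions for illustration, not the systems-biology models used in the paper.

```python
# Illustrative sketch: infer an unknown rate constant k in dy/dt = -k*y
# from sparse noisy data by combining a data loss with the ODE residual.
import torch

torch.manual_seed(0)
net = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
log_k = torch.nn.Parameter(torch.tensor(0.0))        # unknown parameter (log-parameterized)

# Synthetic noisy measurements of y(t) = exp(-2t), i.e., true k = 2.
t_data = torch.linspace(0, 1, 6).reshape(-1, 1)
y_data = torch.exp(-2.0 * t_data) + 0.01 * torch.randn_like(t_data)

def ode_residual(t):
    t = t.requires_grad_(True)
    y = net(t)
    y_t = torch.autograd.grad(y, t, torch.ones_like(y), create_graph=True)[0]
    return y_t + torch.exp(log_k) * y                # residual of dy/dt + k*y = 0

opt = torch.optim.Adam(list(net.parameters()) + [log_k], lr=1e-3)
t_col = torch.rand(64, 1)                            # collocation points in [0, 1]
for step in range(5000):
    loss = ((net(t_data) - y_data) ** 2).mean() + (ode_residual(t_col) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

print("inferred k:", torch.exp(log_k).item())        # should approach 2
```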
Neural operators can learn nonlinear mappings between function spaces and offer a new simulation paradigm for real-time prediction of complex dynamics for realistic diverse applications as well as for system identification in science and engineering. Herein, we investigate the performance of two neural operators, and we develop new practical extens...
The spleen, the largest secondary lymphoid organ in humans, not only fulfils a broad range of immune functions, but also plays an important role in the life cycle of red blood cells (RBCs). Although much progress has been made to elucidate the critical biological processes involved in the maturation of young RBCs (reticulocytes) as well as removal of sene...
Deep learning has been shown to be an effective tool in solving partial differential equations (PDEs) through physics-informed neural networks (PINNs). PINNs embed the PDE residual into the loss function of the neural network, and have been successfully employed to solve diverse forward and inverse PDE problems. However, one disadvantage of the fir...
In high-speed flow past a normal shock, the fluid temperature rises rapidly triggering downstream chemical dissociation reactions. The chemical changes lead to appreciable changes in fluid properties, and these coupled multiphysics and the resulting multiscale dynamics are challenging to resolve numerically. Using conventional computational fluid d...
Accurate prediction of blood glucose variations in type 2 diabetes (T2D) will facilitate better glycemic control and decrease the occurrence of hypoglycemic episodes as well as the morbidity and mortality associated with T2D, hence improving patients' quality of life. Owing to the complexity of blood glucose dynamics, it is difficult to...
Despite great progress in simulating multiphysics problems using the numerical discretization of partial differential equations (PDEs), one still cannot seamlessly incorporate noisy data into existing algorithms, mesh generation remains complex, and high-dimensional problems governed by parameterized PDEs cannot be tackled. Moreover, solving invers...
Deep operator networks (DeepONets) are trained to predict the linear amplification of instability waves in high-speed boundary layers and to perform data assimilation. In contrast to traditional networks that approximate functions, DeepONets are designed to approximate operators. Using this framework, we train a DeepONet to take as inputs an upstre...
Discovering governing equations of a physical system, represented by partial differential equations (PDEs), from data is a central challenge in a variety of areas of science and engineering. Current methods require either some prior knowledge (e.g., candidate PDE terms) to discover the PDE form, or a large dataset to learn a surrogate model of the...
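To make the "candidate PDE terms" route mentioned above concrete, here is a small, self-contained sketch of the classical sparse-regression approach (it is not the method proposed in this paper): regress u_t onto a library of candidate terms and keep only the significant coefficients. The heat-equation data and the threshold are assumptions for illustration.

```python
# Sparse regression over a candidate library: recover u_t = 0.1 * u_xx from data.
import numpy as np

nu = 0.1                                             # true diffusivity
x, t = np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50), indexing="ij")
a = np.exp(-nu * np.pi**2 * t)                       # decay of the sin(pi x) mode
b = np.exp(-4 * nu * np.pi**2 * t)                   # decay of the sin(2 pi x) mode
u    = a * np.sin(np.pi * x) + 0.5 * b * np.sin(2 * np.pi * x)
u_t  = -nu * np.pi**2 * a * np.sin(np.pi * x) - 2 * nu * np.pi**2 * b * np.sin(2 * np.pi * x)
u_x  = np.pi * a * np.cos(np.pi * x) + np.pi * b * np.cos(2 * np.pi * x)
u_xx = -np.pi**2 * a * np.sin(np.pi * x) - 2 * np.pi**2 * b * np.sin(2 * np.pi * x)

# Candidate library Theta = [u, u_x, u_xx, u*u_x]; solve u_t ~ Theta @ xi, then threshold.
theta = np.column_stack([u.ravel(), u_x.ravel(), u_xx.ravel(), (u * u_x).ravel()])
xi, *_ = np.linalg.lstsq(theta, u_t.ravel(), rcond=None)
xi[np.abs(xi) < 1e-3] = 0.0                          # hard-threshold small coefficients
print(dict(zip(["u", "u_x", "u_xx", "u*u_x"], xi)))  # expect only the u_xx coefficient ~ 0.1
```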
Simulating and predicting multiscale problems that couple multiple physics and dynamics across many orders of spatiotemporal scales is a great challenge that has not been investigated systematically by deep neural networks (DNNs). Herein, we develop a framework based on operator regression, the so-called deep operator network (DeepONet), with the l...
It is widely known that neural networks (NNs) are universal approximators of continuous functions. However, a less known but powerful result is that a NN with a single hidden layer can accurately approximate any nonlinear continuous operator. This universal approximation theorem of operators is suggestive of the structure and potential of deep neur...
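The branch-trunk construction suggested by this theorem (the DeepONet architecture) can be sketched in a few lines; the layer sizes, sensor count, and random data below are placeholders, not settings from the paper.

```python
# Minimal DeepONet-style sketch: approximate G(u)(y) from u sampled at m fixed
# sensors (branch input) and a query location y (trunk input).
import torch

m, p = 100, 40                                       # number of sensors, latent width
branch = torch.nn.Sequential(torch.nn.Linear(m, 64), torch.nn.Tanh(), torch.nn.Linear(64, p))
trunk = torch.nn.Sequential(torch.nn.Linear(1, 64), torch.nn.Tanh(), torch.nn.Linear(64, p))
bias = torch.nn.Parameter(torch.zeros(1))

def deeponet(u_sensors, y):
    # u_sensors: (batch, m) samples of the input function; y: (batch, 1) query points.
    b = branch(u_sensors)                            # (batch, p)
    t = trunk(y)                                     # (batch, p)
    return (b * t).sum(dim=1, keepdim=True) + bias   # dot product of branch and trunk outputs

# One forward pass on random placeholder data.
print(deeponet(torch.rand(8, m), torch.rand(8, 1)).shape)   # torch.Size([8, 1])
```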
Electroconvection is a multiphysics problem involving coupling of the flow field with the electric field as well as the cation and anion concentration fields. Here, we use electroconvection as a benchmark problem to put forward a new data assimilation framework, the DeepM&Mnet, for simulating multiphysics and multiscale problems at speeds much fast...
We present convergence rates of operator learning in [Chen and Chen 1995] and [Lu et al. 2020] when the operators are solution operators of differential equations. In particular, we consider solution operators of both linear and nonlinear advection-diffusion equations.
Inverse design arises in a variety of areas in engineering such as acoustic, mechanics, thermal/electronic transport, electromagnetism, and optics. Topology optimization is a major form of inverse design, where we optimize a designed geometry to achieve targeted properties and the geometry is parameterized by a density function. This optimization i...
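As a toy illustration of density-based inverse design in general (not the framework proposed in this paper), the sketch below parameterizes a 1D "geometry" by a density field in [0, 1] and optimizes it by gradient descent so that a made-up linear response matches a target; the response operator, penalty weight, and grid are all assumptions.

```python
# Toy density-based design loop: optimize rho(x) in [0, 1] so that A @ rho matches a target.
import torch

torch.manual_seed(0)
n = 64
phi = torch.zeros(n, requires_grad=True)              # latent design variable
A = torch.rand(16, n) / n                             # toy linear response operator (assumption)
rho_target = (torch.linspace(0, 1, n) > 0.5).float()  # reference design used to build the target
y_target = A @ rho_target                             # targeted property

opt = torch.optim.Adam([phi], lr=0.05)
for step in range(2000):
    rho = torch.sigmoid(phi)                          # density field in [0, 1]
    misfit = ((A @ rho - y_target) ** 2).mean()       # match the targeted property
    penalty = (rho * (1 - rho)).mean()                # push densities toward 0 or 1
    loss = misfit + 0.01 * penalty
    opt.zero_grad()
    loss.backward()
    opt.step()

print("final volume fraction:", torch.sigmoid(phi).mean().item())
```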
Simulating and predicting multiscale problems that couple multiple physics and dynamics across many orders of spatiotemporal scales is a great challenge that has not been investigated systematically by deep neural networks (DNNs). Herein, we develop a framework based on operator regression, the so-called deep operator network (DeepONet), with the l...
Mathematical models of biological reactions at the system-level lead to a set of ordinary differential equations with many unknown parameters that need to be inferred using relatively few experimental measurements. Having a reliable and robust algorithm for parameter inference and prediction of the hidden dynamics has been one of the core subjects...
In high-speed flow past a normal shock, the fluid temperature rises rapidly triggering downstream chemical dissociation reactions. The chemical changes lead to appreciable changes in fluid properties, and these coupled multiphysics and the resulting multiscale dynamics are challenging to resolve numerically. Using conventional computational fluid d...
The dying ReLU refers to the problem in which ReLU neurons become inactive and output only 0 for any input. There are many empirical and heuristic explanations of why ReLU neurons die, but little is known theoretically. In this paper, we rigorously prove that a deep ReLU network will eventually die in probability as the depth goes...
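A quick numerical illustration of the phenomenon (not the paper's proof): count how often a randomly initialized deep, narrow ReLU network is "born dead", i.e., outputs a constant for every input. The widths, depths, and dead-output test below are arbitrary choices.

```python
# Empirical check of the dying-ReLU phenomenon with default PyTorch initialization.
import torch

def born_dead_fraction(depth, width=2, trials=200):
    dead = 0
    for _ in range(trials):
        layers = []
        for i in range(depth):
            layers += [torch.nn.Linear(width if i else 1, width), torch.nn.ReLU()]
        layers += [torch.nn.Linear(width, 1)]
        net = torch.nn.Sequential(*layers)
        with torch.no_grad():
            out = net(torch.linspace(-1, 1, 100).reshape(-1, 1))
        dead += int(out.std() < 1e-8)                # constant output => network is dead
    return dead / trials

for depth in (2, 5, 10, 20):
    print(depth, born_dead_fraction(depth))          # the fraction grows with depth
```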
Electroconvection is a multiphysics problem involving coupling of the flow field with the electric field as well as the cation and anion concentration fields. For small Debye lengths, very steep boundary layers are developed, but standard numerical methods can simulate the different regimes quite accurately. Here, we use electroconvection as a benc...
The accuracy of deep learning, i.e., deep neural networks, can be characterized by dividing the total error into three main types: approximation error, optimization error, and generalization error. Whereas there are some satisfactory answers to the problems of approximation and optimization, much less is known about the theory of generalization. Mo...
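The decomposition referred to above is commonly written via the triangle inequality; the notation below is generic (not taken from the paper): $f$ is the target function, $f_{\mathcal{F}}$ the best approximation in the hypothesis class $\mathcal{F}$, $f_{\mathcal{S}}$ the minimizer of the empirical loss on the training set $\mathcal{S}$, and $\tilde f_{\mathcal{S}}$ the model actually returned by the optimizer:

```latex
\| \tilde f_{\mathcal{S}} - f \|
  \;\le\;
  \underbrace{\| f_{\mathcal{F}} - f \|}_{\text{approximation error}}
  + \underbrace{\| f_{\mathcal{S}} - f_{\mathcal{F}} \|}_{\text{generalization error}}
  + \underbrace{\| \tilde f_{\mathcal{S}} - f_{\mathcal{S}} \|}_{\text{optimization error}}
```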
In this paper, we employ the emerging paradigm of physics-informed neural networks (PINNs) for the solution of representative inverse scattering problems in photonic metamaterials and nano-optics technologies. In particular, we successfully apply mesh-free PINNs to the difficult task of retrieving the effective permittivity parameters of a number o...
Instrumented indentation has been developed and widely utilized as one of the most versatile and practical means of extracting mechanical properties of materials. This method is particularly desirable for those applications where it is difficult to experimentally determine the mechanical properties using stress–strain data obtained from coupon spec...
In this paper we employ the emerging paradigm of physics-informed neural networks (PINNs) for the solution of representative inverse scattering problems in photonic metamaterials and nano-optics technologies. In particular, we successfully apply mesh-free PINNs to the difficult task of retrieving the effective permittivity parameters of a number of...
While it is widely known that neural networks are universal approximators of continuous functions, a less known and perhaps more powerful result is that a neural network with a single hidden layer can approximate accurately any nonlinear continuous operator [Chen and Chen 1995]. This universal approximation theorem is suggestive of the potentia...
While it is widely known that neural networks are universal approximators of continuous functions, a less known and perhaps more powerful result is that a neural network with a single hidden layer can approximate accurately any nonlinear continuous operator [5]. This universal approximation theorem is suggestive of the potential application of neur...
Sickle cell disease is induced by a mutation that converts normal adult hemoglobin to sickle hemoglobin (HbS) and engenders intracellular polymerization of deoxy-HbS and erythrocyte sickling. Development of anti-sickling therapies requires quantitative understanding of HbS polymerization kinetics under organ-specific conditions, which are difficult...
Deep learning has achieved remarkable success in diverse applications; however, its use in solving partial differential equations (PDEs) has emerged only recently. Here, we present an overview of physics-informed neural networks (PINNs), which embed a PDE into the loss of the neural network using automatic differentiation. The PINN algorithm is sim...
Physics-informed neural networks (PINNs) have recently emerged as an alternative way of numerically solving partial differential equations (PDEs) without the need to build elaborate grids; instead, they use a straightforward implementation. In particular, in addition to the deep neural network (DNN) for the solution, an auxiliary DNN is considered...
The accuracy of deep learning, i.e., deep neural networks, can be characterized by dividing the total error into three main types: approximation error, optimization error, and generalization error. Whereas there are some satisfactory answers to the problems of approximation and optimization, much less is known about the theory of generalization. Mo...
The dying ReLU refers to the problem in which ReLU neurons become inactive and output only 0 for any input. There are many empirical and heuristic explanations of why ReLU neurons die, but little is known theoretically. In this paper, we rigorously prove that a deep ReLU network will eventually die in probability as the depth goes...
Physics-informed neural networks (PINNs), introduced in [M. Raissi, P. Perdikaris, and G. Karniadakis, J. Comput. Phys., 378 (2019), pp. 686-707], are effective in solving integer-order partial differential equations (PDEs) based on scattered and noisy data. PINNs employ standard feedforward neural networks (NNs) with the PDEs explicitly encoded in...
Physics-informed neural networks (PINNs), introduced in [1], are effective in solving integer-order partial differential equations (PDEs) based on scattered and noisy data. PINNs employ standard feedforward neural networks (NNs) with the PDEs explicitly encoded into the NN using automatic differentiation, while the sum of the mean-squared PDE-resid...
Physics-informed neural networks (PINNs) have recently emerged as an alternative way of solving partial differential equations (PDEs) without the need to build elaborate grids; instead, they use a straightforward implementation. In particular, in addition to the deep neural network (DNN) for the solution, a second DNN is considered that represents...
Physics-informed neural networks (PINNs) have recently emerged as an alternative way of solving partial differential equations (PDEs) without the need to build elaborate grids; instead, they use a straightforward implementation. In particular, in addition to the deep neural network (DNN) for the solution, a second DNN is considered that represents...
He Li, Lu Lu, Xuejin Li, [...], Subra Suresh
In red blood cell (RBC) diseases, the spleen contributes to anemia by clearing the damaged RBCs, but its unique ability to mechanically challenge RBCs also poses the risk of inducing other pathogenic effects. We have analyzed RBCs in hereditary spherocytosis (HS) and hereditary elliptocytosis (HE), two typical examples of blood disorders that resul...
Recent theoretical work has demonstrated that deep neural networks have superior performance over shallow networks, but their training is more difficult, e.g., they suffer from the vanishing gradient problem. This problem can typically be resolved by the rectified linear unit (ReLU) activation. However, here we show that even for such activation, d...
Accumulation and aggregation of amyloid are associated with the pathogenesis of many human diseases, such as Alzheimer's disease (AD) and Type 2 Diabetes Mellitus (T2DM). Therefore, a quantitative understanding of the molecular mechanisms causing different aggregated structures and biomechanical properties of amyloid fibrils could shed some light i...
In red blood cell (RBC) disorders, such as sickle cell disease, hereditary spherocytosis, and diabetes, alterations to the size and shape of RBCs due to either mutations of RBC proteins or changes to the extracellular environment lead to compromised cell deformability, impaired cell stability, and increased propensity to aggregate. Numerous labora...
Reticulocytes, the precursors of erythrocytes, undergo drastic alterations in cell size, shape, and deformability during maturation. Experimental evidence suggests that young reticulocytes are stiffer and less stable than their mature counterparts; however, the underlying mechanism is yet to be fully understood. Here, we develop a coarse-grained mo...
In this work, we review previously developed coarse-grained (CG) particle models for biological membranes and red blood cells (RBCs) and discuss the advantages of the CG particle method over continuum and atomic simulations in modeling biological phenomena. CG particle models can greatly increase the length scale and time scale of atomic simulat...
Understanding of intracellular polymerization of sickle hemoglobin (HbS) and subsequent interaction with the membrane of a red blood cell (RBC) is important to predict the altered morphologies and mechanical properties of sickle RBCs in sickle cell anemia. However, modeling the integrated processes of HbS nucleation, polymerization, HbS fiber inter...
We present OpenRBC, a coarse-grained molecular dynamics code capable of performing an unprecedented in silico experiment: simulating an entire mammalian red blood cell lipid bilayer and cytoskeleton, modeled by 4 million mesoscopic particles, using a single shared-memory commodity workstation. To achieve this, we invented an adaptive...