Davide Murari
Norwegian University of Science and Technology (NTNU) · Department of Mathematical Sciences

About

9 Publications
1,235 Reads
25 Citations (since 2017)
[Chart: research items and citations per year, 2017–2023]

Publications (9)
Preprint
Motivated by classical work on the numerical integration of ordinary differential equations, we present a ResNet-styled neural network architecture that encodes non-expansive (1-Lipschitz) operators, as long as the spectral norms of the weights are appropriately constrained. This is to be contrasted with the ordinary ResNet architecture which, even...
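As a rough illustration of the spectral-norm constraint this abstract refers to (a minimal NumPy sketch under my own assumptions, not the architecture from the preprint; the names spectral_norm, constrain and layer are made up for illustration): rescaling a weight matrix so that its spectral norm is at most 1 makes the single layer x -> relu(W x + b) non-expansive, because ReLU is 1-Lipschitz.

import numpy as np

def spectral_norm(W, n_iter=50):
    # Estimate ||W||_2 (largest singular value) by power iteration on W^T W.
    v = np.random.default_rng(0).standard_normal(W.shape[1])
    v /= np.linalg.norm(v)
    for _ in range(n_iter):
        v = W.T @ (W @ v)
        v /= np.linalg.norm(v)
    return np.linalg.norm(W @ v)

def constrain(W, target=1.0):
    # Rescale W so that its (estimated) spectral norm does not exceed `target`.
    s = spectral_norm(W)
    return W if s <= target else W * (target / s)

def layer(x, W, b):
    # With ||W||_2 <= 1 and the 1-Lipschitz ReLU, this map is non-expansive.
    return np.maximum(W @ x + b, 0.0)

rng = np.random.default_rng(1)
W = constrain(rng.standard_normal((8, 8)))
b = rng.standard_normal(8)
x, y = rng.standard_normal(8), rng.standard_normal(8)
# Check non-expansiveness: ||f(x) - f(y)|| <= ||x - y||
assert np.linalg.norm(layer(x, W, b) - layer(y, W, b)) <= np.linalg.norm(x - y) + 1e-12

How the residual connection itself is made non-expansive is specific to the preprint and is not reproduced in this sketch.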
Preprint
Full-text available
Neural networks are the state-of-the-art for many approximation tasks in high-dimensional spaces, as supported by an abundance of experimental evidence. However, we still need a solid theoretical understanding of what they can approximate and, more importantly, at what cost and accuracy. One network architecture of practical use, especially for app...
Chapter
Since their introduction, Lie group integrators have become a method of choice in many application areas. Various formulations of these integrators exist, and in this work we focus on Runge-Kutta-Munthe-Kaas methods. First, we briefly introduce this class of integrators, considering some of the practical aspects of their implementation, such as ada...
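As background for this chapter's topic, here is a minimal sketch of the simplest Lie group integrator, the Lie–Euler method, which can be viewed as the first-order member of the Runge–Kutta–Munthe–Kaas family. The model problem Y' = hat(omega(t)) Y on SO(3) and the helper names are my own choices for illustration, not code from the chapter.

import numpy as np
from scipy.linalg import expm

def hat(w):
    # Map a vector in R^3 to the corresponding skew-symmetric matrix in so(3).
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def lie_euler(Y0, omega, h, steps):
    # Lie-Euler step for Y' = hat(omega(t)) Y:  Y_{n+1} = expm(h * hat(omega(t_n))) @ Y_n.
    Y, t = Y0.copy(), 0.0
    for _ in range(steps):
        Y = expm(h * hat(omega(t))) @ Y
        t += h
    return Y

omega = lambda t: np.array([np.sin(t), 1.0, np.cos(t)])  # an example angular velocity
Y = lie_euler(np.eye(3), omega, h=0.01, steps=1000)
print("orthogonality error:", np.linalg.norm(Y.T @ Y - np.eye(3)))  # at machine-precision level

Because each update multiplies by the exponential of a skew-symmetric matrix, the numerical solution stays on SO(3) by construction, which is the defining feature of this class of methods.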
Preprint
Full-text available
Neural networks have gained much interest because of their effectiveness in many applications. However, their mathematical properties are generally not well understood. If there is some underlying geometric structure inherent to the data or to the function to approximate, it is often desirable to take this into account in the design of the neural n...
Article
Full-text available
Recently, there has been an increasing interest in modelling and computation of physical systems with neural networks. Hamiltonian systems are an elegant and compact formalism in classical mechanics, where the dynamics is fully determined by one scalar function, the Hamiltonian. The solution trajectories are often constrained to evolve on a submani...
Preprint
Full-text available
Recently, there has been an increasing interest in modelling and computation of physical systems with neural networks. Hamiltonian systems are an elegant and compact formalism in classical mechanics, where the dynamics is fully determined by one scalar function, the Hamiltonian. The solution trajectories are often constrained to evolve on a submani...
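To make the Hamiltonian setting of the two entries above concrete (a generic textbook example, not the neural-network models studied in these papers): for the pendulum Hamiltonian H(q, p) = p^2/2 - cos(q), Hamilton's equations q' = dH/dp, p' = -dH/dq are fully determined by the scalar H. A small NumPy sketch using the symplectic Euler method:

import numpy as np

def grad_H(q, p):
    # Pendulum Hamiltonian H(q, p) = p**2 / 2 - cos(q); returns (dH/dq, dH/dp).
    return np.sin(q), p

def symplectic_euler(q, p, h, steps):
    # Semi-implicit (symplectic) Euler: update the momentum first, then the position.
    for _ in range(steps):
        dHq, _ = grad_H(q, p)
        p = p - h * dHq
        _, dHp = grad_H(q, p)
        q = q + h * dHp
    return q, p

q0, p0 = 1.0, 0.0
H0 = p0**2 / 2 - np.cos(q0)
q, p = symplectic_euler(q0, p0, h=0.05, steps=2000)
print("energy error:", p**2 / 2 - np.cos(q) - H0)  # stays bounded and small, no secular drift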
Preprint
Full-text available
Since their introduction, Lie group integrators have become a method of choice in many application areas. Various formulations of these integrators exist, and in this work we focus on Runge–Kutta–Munthe–Kaas methods. First, we briefly introduce this class of integrators, considering some of the practical aspects of their implementation, such as...
Article
Full-text available
Since they were introduced in the 1990s, Lie group integrators have become a method of choice in many application areas. These include multibody dynamics, shape analysis, data science, image registration and biophysical simulations. Two important classes of intrinsic Lie group integrators are the Runge–Kutta–Munthe–Kaas methods and the commutator f...
Preprint
Full-text available
Since they were introduced in the 1990s, Lie group integrators have become a method of choice in many application areas. These include multibody dynamics, shape analysis, data science, image registration and biophysical simulations. Two important classes of intrinsic Lie group integrators are the Runge–Kutta–Munthe–Kaas methods and the commutato...
