
Olivier Nerrand
- Thesis
- CISO at HEC Paris
About
- Publications: 9
- Reads: 4,180
- Citations: 415
Publications (9)
The development of engineering applications of neural networks makes it necessary to clarify the similarities and differences between the concepts and methods developed for neural networks and those used in more classical fields such as filtering and control. In previous papers [Nerrand et al. 1993], [Marcos et al. 1993], the relations...
The paper first summarizes a general approach to the training of recurrent neural networks by gradient-based algorithms, which leads to the introduction of four families of training algorithms. Because of the variety of possibilities thus available to the "neural network designer," the choice of the appropriate algorithm to solve a given problem be...
The paper proposes a general framework that encompasses the training of neural networks and the adaptation of filters. We show that neural networks can be considered as general nonlinear filters that can be trained adaptively, that is, that can undergo continual training with a possibly infinite number of time-ordered examples. We introduce the can...
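To make the notion of adaptive (continual) training concrete, here is a minimal sketch, not taken from the paper, of a tiny recurrent nonlinear filter updated once per incoming sample; the network size, learning rate, and toy signals are assumptions chosen only for illustration, and the gradient is truncated to the current time step.

```python
# Minimal sketch (illustrative, not the paper's algorithm): adaptive training
# of a tiny recurrent nonlinear filter on a stream of time-ordered samples,
# one gradient step per sample.
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=3)   # weights for [x(n), y_hat(n-1), bias]
lr = 0.01                           # learning rate (assumed value)
y_prev = 0.0                        # fed-back network output

def step(x_n, d_n, w, y_prev):
    """One adaptive update: predict, compare with desired output d_n, update w.

    The gradient is truncated to the current time step (the feedback term
    y_prev is treated as a constant), which keeps the update purely local.
    """
    u = np.array([x_n, y_prev, 1.0])
    y_hat = np.tanh(w @ u)                 # nonlinear filter output
    e = d_n - y_hat                        # instantaneous error
    grad = -e * (1.0 - y_hat**2) * u       # d(e^2/2)/dw, truncated
    return w - lr * grad, y_hat

# Continual training on a (possibly infinite) time-ordered stream of toy data.
for n in range(10_000):
    x_n = np.sin(0.05 * n) + 0.1 * rng.normal()
    d_n = 0.8 * np.sin(0.05 * (n - 1))     # toy desired signal
    w, y_prev = step(x_n, d_n, w, y_prev)
```

Because each update uses only the current sample and the fed-back output, the loop can run indefinitely over a time-ordered stream, which is the sense in which such a filter is trained adaptively.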
We introduce a family of adaptive algorithms that allow neural networks to be used as nonlinear adaptive filters, that is, systems capable of undergoing continual training from a possibly infinite number of examples presented in a given order. These algorithms, based on techniques for evaluating the g...
Training of feedback neural networks for non-linear filtering, identification and control
The authors propose a general framework which encompasses the training of neural networks and the adaptation of filters. It is shown that neural networks can be considered as general nonlinear filters which can be trained adaptively, i.e., which can undergo continual training. A unified view of gradient-based training algorithms for feedback networ...
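One design choice such a unified view has to cover is whether the feedback input during training is formed from the measured output or from the network's own output. The sketch below is only illustrative and is not claimed to reproduce the paper's algorithms: `directed=True` feeds back the desired signal (teacher forcing, series-parallel style), while `directed=False` feeds back the model's own previous output (parallel style); the toy data and the truncated gradient update are assumptions.

```python
# Illustrative sketch (assumed toy model and data): two ways of forming the
# feedback input while training a feedback network used as a nonlinear filter.
import numpy as np

rng = np.random.default_rng(1)

def train(x, d, directed, lr=0.01, epochs=20):
    w = rng.normal(scale=0.1, size=3)       # weights for [x(n), feedback, bias]
    for _ in range(epochs):
        y_prev = 0.0
        for n in range(len(x)):
            # Directed: feed back the measured output; otherwise the model's own.
            fb = d[n - 1] if (directed and n > 0) else y_prev
            u = np.array([x[n], fb, 1.0])
            y_hat = np.tanh(w @ u)
            e = d[n] - y_hat
            w += lr * e * (1.0 - y_hat**2) * u   # truncated gradient step
            y_prev = y_hat
    return w

x = rng.normal(size=500)
d = np.tanh(0.5 * x + 0.3 * np.roll(x, 1))       # toy target process
w_directed = train(x, d, directed=True)
w_feedback = train(x, d, directed=False)
```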
There are a wide variety of cost functions, techniques for estimating their gradient, and adaptive algorithms for updating the coefficients of neural networks used as nonlinear adaptive filters. The authors discuss the algorithms which result from various choices of criteria and of gradient estimation techniques. New algorithms are introduc...
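As a rough illustration of how the choice of cost function changes the coefficient update, the sketch below compares a purely instantaneous squared error with a sliding-window sum of squared errors; a deliberately simple linear-in-the-parameters filter stands in for the neural network, and the window length, step size, and data are assumed values.

```python
# Sketch only (assumed window lengths, step size, and toy data): the same
# simple filter adapted under two different cost choices.
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=2000)
d = 0.7 * x + 0.2 * np.roll(x, 1)          # toy desired response

def adapt(window, lr=0.05):
    w = np.zeros(2)
    for n in range(1, len(x)):
        grad = np.zeros(2)
        # Accumulate the error gradient over the last `window` samples.
        for k in range(max(1, n - window + 1), n + 1):
            u = np.array([x[k], x[k - 1]])
            e = d[k] - w @ u
            grad -= e * u
        w -= lr * grad / window
    return w

w_instant = adapt(window=1)    # cost = instantaneous squared error e(n)^2
w_window = adapt(window=10)    # cost = sliding-window sum of squared errors
```

With `window=1` the update reduces to the familiar stochastic-gradient (LMS-style) step, while longer windows trade reactivity for a smoother gradient estimate.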