Weight error sensitivity of fixed point attractors in associative memory networks
INFO-COM Dept., Roma Univ., Italy
Electronics Letters, 12/1990; 26(23):1985-1986. DOI: 10.1049/el:19901283
Source: IEEE Xplore
The sensitivity of Hopfield neural networks with two-state neurons to weight errors is investigated. Simple expressions are derived for the probability that an equilibrium point of the nominal connection matrix remains a fixed point and an attractor, as a function of the relative error in the weights. This probability decreases as the number of neurons in the network and the number of stored patterns increase.
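The setting described in the abstract can be illustrated with a small Monte Carlo sketch: store patterns in a two-state Hopfield network via the Hebb rule, multiply each weight by a random relative error, and check whether the stored patterns remain fixed points. The Hebbian storage rule, the uniform error model, and the function names below are illustrative assumptions, not the paper's derivation (the paper gives closed-form probability expressions rather than a simulation).

```python
import random

def hebbian_weights(patterns):
    """Nominal connection matrix from the Hebb rule (zero diagonal) -- an
    assumed storage rule for illustration."""
    n = len(patterns[0])
    W = [[0.0] * n for _ in range(n)]
    for x in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i][j] += x[i] * x[j] / n
    return W

def is_fixed_point(W, x):
    """x (entries +/-1) is a fixed point if sign(W x) reproduces x at
    every neuron."""
    n = len(x)
    for i in range(n):
        h = sum(W[i][j] * x[j] for j in range(n))
        if (1 if h >= 0 else -1) != x[i]:
            return False
    return True

def perturb(W, eps, rng):
    """Multiply each weight by (1 + delta), delta uniform in [-eps, eps]:
    one possible model of relative weight error."""
    return [[w * (1 + rng.uniform(-eps, eps)) for w in row] for row in W]

def survival_probability(patterns, eps, trials=200, seed=0):
    """Monte Carlo estimate of the probability that all stored patterns
    stay fixed points after relative weight errors of size eps."""
    rng = random.Random(seed)
    W = hebbian_weights(patterns)
    hits = 0
    for _ in range(trials):
        Wp = perturb(W, eps, rng)
        if all(is_fixed_point(Wp, x) for x in patterns):
            hits += 1
    return hits / trials

# Example: a single stored 8-neuron pattern is a fixed point of the
# nominal matrix by construction.
pattern = [1, -1, 1, 1, -1, -1, 1, -1]
W = hebbian_weights([pattern])
print(is_fixed_point(W, pattern))              # True
print(survival_probability([pattern], eps=0.5))
```

Repeating the estimate for larger networks and more stored patterns reproduces the qualitative trend stated in the abstract: the survival probability falls as either quantity grows.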
ABSTRACT: The problem addressed is that of finding multilayer perceptron inputs belonging to the "basins of attraction" of outputs not in the learning set. A practical algorithm based on stochastic optimization is proposed. The algorithm searches for both connected and disconnected parts of a spurious attractor. Its main advantage is that neither part of the algorithm depends on the learning process.
In: Proceedings of the 12th IAPR International Conference on Pattern Recognition, Vol. 2 (Conference B: Computer Vision & Image Processing); 11/1994.
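As a rough illustration of the kind of search such an abstract describes, the sketch below runs random-restart stochastic hill climbing over the input space of a small fixed-weight perceptron, looking for inputs whose output lands near a given target; restarts allow it to reach disconnected input regions. The network weights, target, and search parameters are all invented for illustration; this is a generic stochastic-optimization stand-in, not the authors' algorithm.

```python
import math
import random

# Hypothetical fixed MLP weights (illustrative only, not from the paper).
W1 = [[1.5, -0.8], [-0.6, 1.2]]   # hidden layer: 2 inputs -> 2 units
B1 = [0.1, -0.2]
W2 = [[1.0, -1.0]]                # output layer: 2 hidden -> 1 output
B2 = [0.0]

def mlp(x):
    """Two-layer perceptron with tanh activations."""
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, B1)]
    return [math.tanh(sum(w * hi for w, hi in zip(row, h)) + b)
            for row, b in zip(W2, B2)]

def loss(x, target):
    """Squared distance of the network output from the target output."""
    return sum((o - t) ** 2 for o, t in zip(mlp(x), target))

def stochastic_search(target, dim=2, restarts=5, iters=1000, step=0.3, seed=0):
    """Random-restart stochastic hill climbing: from several random starting
    points, perturb the input and keep moves that bring the output closer
    to the target.  Note the search never touches the learning process."""
    rng = random.Random(seed)
    best_x, best_l = None, float("inf")
    for _ in range(restarts):
        x = [rng.uniform(-2, 2) for _ in range(dim)]
        l = loss(x, target)
        for _ in range(iters):
            cand = [xi + rng.gauss(0, step) for xi in x]
            cl = loss(cand, target)
            if cl < l:
                x, l = cand, cl
        if l < best_l:
            best_x, best_l = x, l
    return best_x, best_l

x_star, l_star = stochastic_search(target=[0.5])
print(x_star, l_star)  # an input whose output is close to the target
```

Collecting the distinct inputs found across restarts, rather than only the best one, would expose separate (disconnected) regions that map to the same target output.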