Weight error sensitivity of fixed point attractors in associative memory networks

INFO-COM Dept., Roma Univ., Italy
Electronics Letters (Impact Factor: 0.93). 12/1990; 26(23):1985 - 1986. DOI: 10.1049/el:19901283
Source: IEEE Xplore


The sensitivity of Hopfield neural networks with two-state neurons is investigated. Simple expressions are derived for the probability that an equilibrium point of the nominal connection matrix remains a fixed point and an attractor, as a function of the relative error in the weights. This probability decreases as the number of neurons in the network and the number of stored patterns increase.
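The effect described in the abstract can be probed numerically. The sketch below is a Monte Carlo illustration, not the authors' analytical derivation: it stores random bipolar patterns with the Hebbian outer-product rule, perturbs the weights by a zero-mean error scaled to the weight spread (this particular error model is an assumption made for illustration), and checks whether a stored pattern is still a fixed point of the one-step update.

```python
import numpy as np

def remains_fixed_point(n=64, p=4, rel_err=0.1, rng=None):
    """One Monte Carlo trial: store p random +/-1 patterns in an n-neuron
    Hopfield network, perturb the weights, and report whether the first
    stored pattern is still a fixed point of sign(W x)."""
    rng = rng if rng is not None else np.random.default_rng()
    patterns = rng.choice([-1, 1], size=(p, n)).astype(float)
    W = patterns.T @ patterns / n        # Hebbian outer-product rule
    np.fill_diagonal(W, 0.0)             # no self-connections
    # Relative weight error: zero-mean Gaussian noise scaled by the
    # weight spread (assumed error model, for illustration only).
    W_err = W + rel_err * np.std(W) * rng.standard_normal((n, n))
    xi = patterns[0]
    return bool(np.all(np.sign(W_err @ xi) == xi))

def p_fixed(rel_err, trials=200, **kw):
    """Estimate the probability that a stored pattern survives as a
    fixed point at a given relative weight error."""
    rng = np.random.default_rng(0)
    return float(np.mean([remains_fixed_point(rel_err=rel_err, rng=rng, **kw)
                          for _ in range(trials)]))
```

Sweeping `rel_err` (and the sizes `n`, `p`) with `p_fixed` reproduces the qualitative trend stated in the abstract: the survival probability falls as the perturbation, the network size, and the number of stored patterns grow.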

ABSTRACT: The problem of finding multilayer perceptron inputs belonging to "basins of attraction" of outputs not from the learning set is considered. A practical algorithm based on stochastic optimization is proposed. The algorithm searches both connected and disconnected parts of a spurious attractor. Its main advantage is that neither part of the algorithm depends on the learning process.
Proceedings of the 12th IAPR International Conference on Pattern Recognition, Vol. 2, Conference B: Computer Vision & Image Processing; 11/1994
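A search of the kind this abstract describes can be sketched as random-restart stochastic optimization over the input space. The following is a hedged illustration only, not the paper's algorithm: the toy MLP, its parameters, and the perturbation schedule are all assumptions; the restarts stand in for reaching disconnected regions of a spurious attractor, and nothing here touches the learning process itself.

```python
import numpy as np

def mlp(x, W1, b1, W2, b2):
    """Toy two-layer perceptron (tanh activations); assumed architecture."""
    h = np.tanh(W1 @ x + b1)
    return np.tanh(W2 @ h + b2)

def stochastic_search(target, dim, params, steps=500, restarts=5,
                      step=0.1, rng=None):
    """Random-restart perturbation search for an input whose network
    output approaches a target not produced by the learning set."""
    rng = rng if rng is not None else np.random.default_rng(0)
    best_x, best_loss = None, np.inf
    for _ in range(restarts):      # restarts can land in disconnected regions
        x = rng.uniform(-1.0, 1.0, dim)
        loss = float(np.sum((mlp(x, *params) - target) ** 2))
        for _ in range(steps):     # accept only improving random moves
            cand = x + step * rng.standard_normal(dim)
            cand_loss = float(np.sum((mlp(cand, *params) - target) ** 2))
            if cand_loss < loss:
                x, loss = cand, cand_loss
        if loss < best_loss:
            best_x, best_loss = x, loss
    return best_x, best_loss
```

Because the search only queries the trained network, it is independent of how the weights were obtained, which mirrors the independence-from-learning property the abstract claims.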