A novel Boolean self-organization mapping based on fuzzy geometrical expansion

Conference Paper · January 2004
DOI: 10.1109/ICICS.2003.1292653 · Source: IEEE Xplore
Conference: Proceedings of the 2003 Joint Conference of the Fourth International Conference on Information, Communications and Signal Processing and the Fourth Pacific Rim Conference on Multimedia, Volume 2
Abstract
A novel self-organization mapping algorithm for Boolean neural networks (BSOM), based on geometrical expansion, is proposed in this paper. The proposed BSOM algorithm possesses generalization capability. Unlike traditional self-organization mapping (SOM) algorithms, the BSOM algorithm is based on geometrical expansion rather than gradient descent, and it memorizes multiple vectors in each hidden neuron instead of only a single exemplar at the center of a SOM cell. Finally, the BSOM algorithm requires fewer iterations and simpler training equations. Test results are given on simple Boolean functions and on a randomly generated Boolean function with 10 variables.
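The abstract does not give the algorithm itself, so the following is only an illustrative sketch (all names hypothetical, not the authors' code) of the one concrete idea it states: a hidden neuron that memorizes a set of Boolean vectors rather than a single exemplar, with an input matched by minimum Hamming distance over everything the neuron has stored.

```python
# Purely illustrative sketch (not the published BSOM algorithm): a hidden
# neuron memorizing several Boolean vectors instead of one exemplar.
def hamming(u, v):
    """Number of bit positions in which two Boolean vectors differ."""
    return sum(a != b for a, b in zip(u, v))

class BooleanNeuron:
    def __init__(self):
        self.memory = []              # memorized Boolean vectors (tuples of 0/1)

    def memorize(self, vec):
        self.memory.append(tuple(vec))

    def distance(self, vec):
        # Distance from the input to the *nearest* memorized vector,
        # rather than to a single cell-center exemplar.
        return min(hamming(m, vec) for m in self.memory)

n = BooleanNeuron()
n.memorize([0, 0, 1])
n.memorize([1, 1, 1])
print(n.distance([1, 0, 1]))  # → 1 (nearest stored vector differs in one bit)
```

A classifier built this way would assign an input to the hidden neuron reporting the smallest such distance; how the memorized sets are grown by geometrical expansion is the subject of the paper itself.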
  • ABSTRACT: In this paper, the learning algorithm called expand-and-truncate learning (ETL) is proposed to train multilayer binary neural networks (BNN) with guaranteed convergence for any binary-to-binary mapping. The most significant contribution of this paper is the development of a learning algorithm for three-layer BNNs which guarantees convergence while automatically determining the required number of neurons in the hidden layer. Furthermore, the learning speed of the proposed ETL algorithm is much faster than that of the backpropagation learning algorithm in a binary field. Neurons in the proposed BNN employ a hard-limiter activation function with only integer weights and integer thresholds. This will therefore greatly facilitate actual hardware implementation of the proposed BNN using currently available digital VLSI technology.
    Article · Feb 1995
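The abstract above does specify the neuron model: a hard-limiter activation with integer weights and an integer threshold. A minimal sketch of such a neuron (the function name and example weights are illustrative, not from the paper):

```python
# Sketch of a hard-limiter binary neuron with integer weights and an
# integer threshold, as described in the ETL abstract above.
def hard_limiter_neuron(inputs, weights, threshold):
    """Return 1 if the integer-weighted sum reaches the threshold, else 0."""
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s >= threshold else 0

# Example: weights [1, 1] with threshold 2 realize logical AND on binary inputs.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", hard_limiter_neuron([a, b], [1, 1], 2))
```

Because every quantity is an integer and the activation is a simple comparison, such neurons map directly onto digital adder/comparator hardware, which is the VLSI point the abstract makes.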
  • ABSTRACT: This paper proposes a novel learning algorithm that can realize any binary-to-binary mapping using three-layer binary neural networks. The algorithm includes an improved expand-and-truncate learning routine that reduces the number of hidden neurons compared with conventional methods. The output-layer parameters can also be given by simple analytic formulae.
    Conference Paper · Jul 1997 · IEEE Transactions on Neural Networks
  • Conference Paper · Jun 2001 · IEEE Transactions on Neural Networks