Conference Paper

Opposite Transfer Functions and Backpropagation Through Time

Systems Design Engineering Department, University of Waterloo, Ontario
DOI: 10.1109/FOCI.2007.371529 Conference: Proceedings of the IEEE Symposium on Foundations of Computational Intelligence, FOCI 2007, part of the IEEE Symposium Series on Computational Intelligence 2007, Honolulu, Hawaii, USA, 1-5 April 2007
Source: DBLP

ABSTRACT Backpropagation through time is a very popular training algorithm for discrete-time recurrent neural networks. However, the computational time required for the learning process to reach high accuracy is large. While many approaches have been proposed that alter the learning algorithm itself, this paper presents a computationally inexpensive method, based on the concept of opposite transfer functions, to improve learning in the backpropagation through time algorithm. Specifically, we show improvements in accuracy and stability, as well as an acceleration of learning. We use three common benchmarks to provide experimental evidence of these improvements.
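The paper's exact construction is not reproduced on this page, but in opposition-based learning the opposite of a value in an interval is its reflection through the interval midpoint. Applied to a logistic transfer function with outputs in (0, 1), an opposite transfer function can be sketched as 1 - f(x), which for the logistic equals f(-x). A minimal illustrative sketch (function names are ours, not the paper's, and the paper's actual scheme may differ):

```python
import math

def sigmoid(x):
    """Standard logistic transfer function; output lies in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def opposite_sigmoid(x):
    """Sketch of an opposite transfer function: reflect the output
    through the midpoint of its range (0, 1), i.e. 1 - f(x).
    For the logistic this coincides with f(-x)."""
    return 1.0 - sigmoid(x)

x = 0.8
print(sigmoid(x))           # ~0.6900
print(opposite_sigmoid(x))  # ~0.3100, identical to sigmoid(-x)
```

During training, a network could probe both a neuron's transfer function and its opposite and keep whichever yields lower error; the abstract does not specify the selection rule, so this sketch only shows the function pair itself.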

  • ABSTRACT: This chapter discusses the application of opposition-based computing to reducing the number of function calls required for optimization by population-based search. We provide motivation and a comparison to similar but distinct approaches, including antithetic variates and quasi-random/low-discrepancy sequences. We employ differential evolution and population-based incremental learning as optimization methods for image thresholding. Our results confirm reductions in required function calls, as well as support for the oppositional principles used to attain them.
    06/2010: pages 49-71;
  • ABSTRACT: In algorithm design, efficiency is one of the most important considerations, and many algorithm design paradigms exist and are used to improve algorithms' efficiency. The Opposition-based Learning (OBL) paradigm was recently introduced as a new way of thinking during the design of algorithms. The concepts of opposition have already been applied in several applications from different fields, such as optimization algorithms, learning algorithms and fuzzy logic. The reported results confirm that the OBL paradigm is promising for accelerating soft computing algorithms or enhancing their accuracy. This paper presents a survey of existing applications of opposition-based computing.
    International Joint Conference on Neural Networks, IJCNN 2010, Barcelona, Spain, 18-23 July, 2010; 01/2010
  • ABSTRACT: The concept of opposition-based learning using the current optimum is proposed and combined with differential evolution for function optimization. The distance between the opposite points and the global optimum is short enough to keep a high utilization rate of the opposition population during evolution, especially in the later stages. Experiments on 33 widely used benchmark problems show that the proposed algorithm improves performance significantly through the use of opposite points.
    Journal of Computational Information Systems 01/2011; 75:1582-1591.
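The classic opposition-based learning construction underlying these works defines the opposite of a candidate solution coordinate-wise as its reflection through the midpoint of the search interval. A minimal sketch of that standard definition (the current-optimum variant in the last abstract centres the reflection on the best-so-far solution instead; its exact formula is not given on this page):

```python
def opposite_point(x, lower, upper):
    """Classic opposition-based learning: for each coordinate x_i in
    [a_i, b_i], the opposite point has coordinate a_i + b_i - x_i,
    i.e. the reflection of x_i through the interval midpoint."""
    return [a + b - xi for xi, a, b in zip(x, lower, upper)]

x  = [0.2, -1.0, 3.5]
lo = [0.0, -2.0, 0.0]
hi = [1.0,  2.0, 4.0]
print(opposite_point(x, lo, hi))  # [0.8, 1.0, 0.5]
```

In opposition-based variants of population search, both each candidate and its opposite are evaluated and the fitter of the two is retained, which is how the function-call savings reported above are obtained.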
