Conference Paper

Opposite Transfer Functions and Backpropagation Through Time

Department of Systems Design Engineering, University of Waterloo, Ontario
DOI: 10.1109/FOCI.2007.371529
Conference: Proceedings of the IEEE Symposium on Foundations of Computational Intelligence (FOCI 2007), part of the IEEE Symposium Series on Computational Intelligence 2007, Honolulu, Hawaii, USA, 1-5 April 2007
Source: DBLP

ABSTRACT Backpropagation through time is a very popular training algorithm for discrete-time recurrent neural networks. However, the computational time required for the learning process to reach high accuracy is substantial. While many approaches have been proposed that alter the learning algorithm itself, this paper presents a computationally inexpensive method based on the concept of opposite transfer functions to improve learning in the backpropagation through time algorithm. Specifically, we show improvements in accuracy and stability, as well as an acceleration of learning. We use three common benchmarks to provide experimental evidence of these improvements.
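
A minimal sketch of the underlying idea follows, assuming the opposite of the logistic sigmoid is taken as f(-x) = 1 - f(x); the function names and the error-based swap heuristic below are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def sigmoid(x):
    """Logistic sigmoid transfer function."""
    return 1.0 / (1.0 + np.exp(-x))

def opposite_sigmoid(x):
    """Opposite transfer function, assumed here to be f(-x) = 1 - f(x)."""
    return sigmoid(-x)

# Illustrative heuristic (not the paper's exact procedure): compare the training
# error obtained with a neuron's original transfer function against the error
# obtained with its opposite, and keep whichever variant does better.
def pick_transfer(error_original, error_opposite):
    return sigmoid if error_original <= error_opposite else opposite_sigmoid

x = np.linspace(-4.0, 4.0, 9)
assert np.allclose(sigmoid(x) + opposite_sigmoid(x), 1.0)  # the sigmoid and its opposite sum to 1
```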

  • ABSTRACT: The concept of opposition-based learning around the current optimum is proposed and combined with differential evolution for function optimization. The opposite points stay close enough to the global optimum to keep the utilization rate of the opposition population high throughout evolution, especially in its later stages. Experiments on 33 widely used benchmark problems show that the proposed algorithm improves performance significantly because of the opposite points (a sketch of the opposite-point construction appears after this list).
    Journal of Computational Information Systems 01/2011; 75:1582-1591.
  • ABSTRACT: An evolutionary circle detection method based on a novel Chaotic Hybrid Algorithm (CHA) is proposed. The method combines the strengths of particle swarm optimization, genetic algorithms, and chaotic dynamics: it uses the standard velocity and position update rules of PSO together with the selection, crossover, and mutation operators of GA. Opposition-based learning (OBL) is employed in CHA for population initialization. In addition, the notion of species is introduced into CHA to enhance its performance on multimodal problems. The effectiveness of the Species-based Chaotic Hybrid Algorithm (SCHA) is demonstrated through simulations and benchmarking, and it is then successfully applied to circle detection problems. To make it more robust in complicated circumstances, the notion of a 'tolerant radius' is proposed and incorporated into the SCHA-based method. Simulation tests on several hand-drawn sketches and natural photos clearly show the effectiveness of the proposed method.
    Computers & Mathematics with Applications 09/2012; 64(6):1886–1902.
  • ABSTRACT: Backpropagation is the most popular algorithm for training neural networks. However, this gradient-based training method is known to suffer from very long training times and convergence to local optima. Various remedies have been proposed, including, but not limited to, alternative training algorithms, automatic architecture design, and different transfer functions. In this chapter we continue the exploration of improving gradient-based learning through dynamic transfer function modification. We propose opposite transfer functions as a means of improving the numerical conditioning of neural networks and extend two backpropagation-based learning algorithms accordingly. Our experimental results show an improvement in accuracy and generalization ability on common benchmark functions. The experiments also examine the sensitivity of the approach to the learning parameters, the type of transfer function, and the number of neurons in the network.
    09/2008: pages 255-284;
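
For the opposition-based learning referenced in the first two publications above, the standard construction reflects a candidate about the centre of its search interval. The sketch below illustrates that construction on a toy objective; the sphere objective, the bounds, and the keep-the-better-point step are assumptions made for illustration, not details taken from either paper.

```python
import numpy as np

def opposite_point(x, lower, upper):
    """Opposition-based learning: the opposite of x in [lower, upper] is
    lower + upper - x (applied element-wise for vectors)."""
    return lower + upper - x

# Toy illustration (assumed setup): evaluate a random candidate and its
# opposite on a sphere objective and keep the better of the two.
rng = np.random.default_rng(0)
lower, upper = -5.0, 5.0
candidate = rng.uniform(lower, upper, size=3)
opposite = opposite_point(candidate, lower, upper)
sphere = lambda v: float(np.sum(v ** 2))
best = candidate if sphere(candidate) <= sphere(opposite) else opposite
print(best)
```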
