Article

An approach for constructing parsimonious generalized Gaussian kernel regression models

Intelligent Systems and Diagnostics Group, Department of Electronic and Computer Engineering, University of Portsmouth, Anglesea Building, Anglesea Road, Portsmouth PO1 3DJ, UK
Neurocomputing. 12/2004; 62(1):441-457. DOI: 10.1016/j.neucom.2004.06.003
Source: DBLP

ABSTRACT

The paper proposes a novel construction algorithm for generalized Gaussian kernel regression models. Each kernel regressor in the generalized Gaussian kernel regression model has an individual diagonal covariance matrix, which is determined by maximizing the correlation between the training data and the regressor using a repeated guided random search based on boosting optimization. The standard orthogonal least squares algorithm is then used to select a sparse generalized kernel regression model from the resulting full regression matrix. Experimental results involving two real data sets demonstrate the effectiveness of the proposed regression modeling approach.
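
The pipeline described in the abstract can be illustrated in code. The following is a minimal sketch, not the paper's algorithm: a plain uniform random search stands in for the boosting-based guided search, and a greedy error-reduction loop stands in for the full orthogonal least squares procedure; the function names (gauss_kernel, fit_widths, ols_select) and all parameter choices are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def gauss_kernel(X, centre, widths):
    # Generalized Gaussian regressor: one width per input dimension,
    # i.e. an individual diagonal covariance matrix for this kernel.
    d = (X - centre) / widths
    return np.exp(-0.5 * np.sum(d * d, axis=1))

def fit_widths(X, y, centre, n_trials=200, low=0.1, high=5.0):
    # Random-search stand-in for the boosting-based guided search:
    # keep the diagonal widths whose regressor output correlates
    # most strongly with the training targets.
    best_w, best_corr = np.ones(X.shape[1]), -np.inf
    for _ in range(n_trials):
        w = rng.uniform(low, high, size=X.shape[1])
        corr = abs(np.corrcoef(gauss_kernel(X, centre, w), y)[0, 1])
        if corr > best_corr:
            best_w, best_corr = w, corr
    return best_w

def ols_select(Phi, y, n_terms):
    # Greedy forward selection in the spirit of orthogonal least
    # squares: pick the column with the largest error reduction,
    # then orthogonalize the remaining columns against it.
    Phi, y_res, selected = Phi.copy(), y.astype(float).copy(), []
    for _ in range(n_terms):
        norms = np.sum(Phi**2, axis=0)
        norms[norms < 1e-12] = np.inf        # skip exhausted columns
        k = int(np.argmax((Phi.T @ y_res)**2 / norms))
        selected.append(k)
        q = Phi[:, k] / np.linalg.norm(Phi[:, k])
        y_res = y_res - q * (q @ y_res)      # deflate the target
        Phi = Phi - np.outer(q, q @ Phi)     # deflate the dictionary
    return selected

# Toy usage: each training input is a candidate kernel centre.
X = rng.uniform(-1.0, 1.0, size=(50, 2))
y = np.sin(3.0 * X[:, 0]) + 0.5 * X[:, 1]**2
Phi = np.column_stack([gauss_kernel(X, c, fit_widths(X, y, c)) for c in X])
print(ols_select(Phi, y, n_terms=8))
```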

  • ABSTRACT: Conventional model-based data processing methods are computationally expensive and require experts' knowledge for the modelling of a system; neural networks provide a model-free, adaptive, parallel-processing solution. Neural Networks in a Softcomputing Framework presents a thorough review of the most popular neural-network methods and their associated techniques. This concise but comprehensive textbook provides a powerful and universal paradigm for information processing. Each chapter provides state-of-the-art descriptions of the important major research results of the respective neural-network methods. A range of relevant computational intelligence topics, such as fuzzy logic and evolutionary algorithms, are introduced. These are powerful tools for neural-network learning. Array signal processing problems are discussed in order to illustrate the applications of each neural-network model. Neural Networks in a Softcomputing Framework is an ideal textbook for graduate students and researchers in this field because in addition to grasping the fundamentals, they can discover the most recent advances in each of the popular models. The systematic survey of each neural-network model and the exhaustive list of references will enable researchers and students to find suitable topics for future research. The important algorithms outlined also make this textbook a valuable reference for scientists and practitioners working in pattern recognition, signal processing, speech and image processing, data analysis and artificial intelligence.
    Full-text · Book · Oct 2006
  • ABSTRACT: Instead of minimizing the observed training error, Support Vector Regression (SVR) attempts to minimize a bound on the generalization error so as to achieve good generalization performance. The idea of SVR is based on the computation of a linear regression function in a high-dimensional feature space into which the input data are mapped via a nonlinear function. SVR has been applied in various fields: time series and financial (noisy and risky) prediction, approximation of complex engineering analyses, convex quadratic programming and choices of loss functions, and more. This paper reviews the existing theory, methods, recent developments and scope of SVR. (A minimal usage sketch appears after this list.)
    Full-text · Article · Nov 2007
  • ABSTRACT: A novel modelling framework is proposed for constructing parsimonious and flexible multiscale radial basis function (RBF) networks. Unlike a conventional single-scale RBF network, where all the basis functions share a common kernel width, the new network structure adopts multiscale Gaussian functions as the bases: each selected centre has multiple kernel widths, giving more flexible representations with better generalization properties for general nonlinear dynamical systems. As a direct extension of traditional single-scale Gaussian networks, the new multiscale network is easy to implement and quick to learn using standard learning algorithms. A k-means clustering algorithm and an improved orthogonal least squares (OLS) algorithm are used to determine the unknown parameters of the network model, including the centres and widths of the basis functions and the weights between the basis functions. It is demonstrated that the new network leads to a parsimonious model with much better generalization than traditional single-width RBF networks. (See the sketch after this list.)
    No preview · Article · Jan 2008 · Neural Networks
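
The SVR survey above prescribes no particular implementation; purely as a usage illustration, here is a minimal sketch with scikit-learn's SVR on a synthetic problem. The library choice and all hyperparameter values are assumptions, not the survey's.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Noisy one-dimensional regression problem.
X = np.sort(rng.uniform(0.0, 5.0, size=(80, 1)), axis=0)
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(80)

# The RBF kernel supplies the nonlinear map to a high-dimensional
# feature space; epsilon sets the insensitive tube around the fit,
# and C trades flatness of the regression function against errors.
model = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X, y)
y_hat = model.predict(X)
print(f"training RMSE: {np.sqrt(np.mean((y - y_hat) ** 2)):.3f}")
```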
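
The multiscale RBF idea in the last item (one centre, several widths) can be sketched just as briefly. The fragment below, assuming scikit-learn's KMeans for the centres, builds a multiscale dictionary and fits weights by ordinary least squares; the paper's improved OLS term selection is not reproduced here, and the width grid is an arbitrary illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

def rbf(X, centre, width):
    # Isotropic Gaussian basis exp(-||x - c||^2 / (2 * width^2)).
    return np.exp(-np.sum((X - centre) ** 2, axis=1) / (2.0 * width**2))

X = rng.uniform(-1.0, 1.0, size=(100, 2))
y = np.exp(-4.0 * X[:, 0] ** 2) + 0.3 * np.sin(2.0 * X[:, 1])

# k-means supplies the candidate centres, as in the paper.
centres = KMeans(n_clusters=10, n_init=10, random_state=1).fit(X).cluster_centers_

# Multiscale dictionary: every centre appears at several widths, so a
# selection stage can mix coarse and fine bases at the same location.
widths = (0.2, 0.5, 1.0)
Phi = np.column_stack([rbf(X, c, s) for c in centres for s in widths])

# Plain least-squares weights over the full dictionary; the paper
# instead prunes the dictionary with an improved OLS algorithm.
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
```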