An approach for constructing parsimonious generalized Gaussian kernel regression models

Intelligent Systems and Diagnostics Group, Department of Electronic and Computer Engineering, University of Portsmouth, Anglesea Building, Anglesea Road, Portsmouth PO1 3DJ, UK
Neurocomputing (Impact Factor: 2.08). 12/2004; 62:441-457. DOI: 10.1016/j.neucom.2004.06.003
Source: DBLP


The paper proposes a novel construction algorithm for generalized Gaussian kernel regression models. Each kernel regressor in the generalized Gaussian kernel regression model has an individual diagonal covariance matrix, which is determined by maximizing the correlation between the training data and the regressor using a repeated guided random search based on boosting optimization. The standard orthogonal least squares algorithm is then used to select a sparse generalized kernel regression model from the resulting full regression matrix. Experimental results involving two real data sets demonstrate the effectiveness of the proposed regression modeling approach.
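
The two-stage procedure described above can be sketched in a few lines of numpy: a Gaussian kernel shaped by a per-regressor diagonal covariance, followed by forward orthogonal least squares (OLS) selection over the full regression matrix by error-reduction ratio. This is a minimal sketch; the boosting-based covariance search is not reproduced, and the function names are illustrative rather than taken from the paper.

```python
import numpy as np

def gaussian_kernel(x, centre, diag_cov):
    # Generalized Gaussian kernel: each regressor carries its own
    # diagonal covariance, so the kernel shape varies per centre.
    d = x - centre
    return np.exp(-0.5 * np.sum(d * d / diag_cov, axis=-1))

def ols_select(Phi, y, n_terms):
    # Forward OLS: greedily pick the column of the full regression
    # matrix Phi with the largest error-reduction ratio, then deflate
    # (Gram-Schmidt orthogonalize) the remaining columns against it.
    Phi = Phi.astype(float).copy()
    y = y.astype(float)
    selected = []
    for _ in range(n_terms):
        norms = np.sum(Phi ** 2, axis=0)
        norms[norms == 0] = np.inf          # skip exhausted columns
        err = (Phi.T @ y) ** 2 / (norms * (y @ y))  # error-reduction ratios
        k = int(np.argmax(err))
        selected.append(k)
        w = Phi[:, k] / np.sqrt(Phi[:, k] @ Phi[:, k])
        Phi -= np.outer(w, w @ Phi)         # deflate all columns
    return selected
```

With an orthogonal candidate set the selection is exact, which makes the greedy error-reduction criterion easy to verify by hand.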

Available from: David J Brown, Sep 02, 2014
  • ABSTRACT: Instead of minimizing the observed training error, Support Vector Regression (SVR) attempts to minimize a bound on the generalization error, so as to achieve good generalization performance. The idea of SVR is to compute a linear regression function in a high-dimensional feature space into which the input data are mapped via a nonlinear function. SVR has been applied in various fields: time series and financial (noisy and risky) prediction, approximation of complex engineering analyses, convex quadratic programming and choices of loss functions, etc. This paper reviews the existing theory, methods, recent developments and scope of SVR.
  • ABSTRACT: A novel modelling framework is proposed for constructing parsimonious and flexible multiscale radial basis function (RBF) networks. Unlike a conventional single-scale RBF network, in which all basis functions share a common kernel width, the new network structure adopts multiscale Gaussian functions as the bases: each selected centre has multiple kernel widths, providing more flexible representations with better generalization properties for general nonlinear dynamical systems. As a direct extension of traditional single-scale Gaussian networks, the new multiscale network is easy to implement and quick to learn using standard learning algorithms. A k-means clustering algorithm and an improved orthogonal least squares (OLS) algorithm are used to determine the unknown parameters of the network model, including the centres and widths of the basis functions and the output weights attached to them. It is demonstrated that the new network leads to a parsimonious model with much better generalization properties than traditional single-width RBF networks.
    Neural Networks 01/2008; 20(10):1081-94. DOI:10.1016/j.neunet.2007.09.017 · 2.71 Impact Factor
  • ABSTRACT: This paper presents a growing algorithm for designing the architecture of an RBF neural network, called the growing RBF neural network (GRBF) algorithm. GRBF starts from a single prototype randomly initialized in the feature space; the algorithm consists of two major parts, a structure learning phase and a parameter adjusting phase. In the structure learning phase, a growing strategy based on a sensitivity analysis of the network output is used to judge when and where hidden-layer units should be added. In the parameter adjusting phase, all the weights of the network are adjusted to improve its overall capability. Finally, the proposed GRBF network is employed to track nonlinear functions. The computational complexity analysis and simulation results confirm the efficiency of the proposed algorithm.
    08/2009: pages 73-82;
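
The multiscale RBF idea in the second abstract above, where one centre contributes basis functions at several kernel widths, amounts to building a wider candidate dictionary before selection. A minimal numpy sketch (the function name is illustrative; the paper's k-means and improved OLS steps are not reproduced):

```python
import numpy as np

def multiscale_design(X, centres, widths):
    """Candidate regression matrix for a multiscale RBF network.

    Each centre contributes one Gaussian column per kernel width, so
    the dictionary has len(centres) * len(widths) columns; a selection
    algorithm such as OLS can then pick a sparse subset.
    """
    cols = []
    for c in centres:
        d2 = np.sum((X - c) ** 2, axis=1)   # squared distances to this centre
        for s in widths:
            cols.append(np.exp(-d2 / (2.0 * s ** 2)))
    return np.column_stack(cols)
```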
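
The grow-then-adjust loop of the GRBF abstract can also be sketched. The version below is a deliberate simplification and an assumption on my part: instead of the paper's sensitivity analysis of the network output, it places each new hidden unit at the training sample with the worst residual, and "parameter adjusting" is plain least squares on the output weights.

```python
import numpy as np

def grow_rbf(X, y, width=1.0, max_units=10, tol=1e-3):
    # Start from a single prototype (first sample, for determinism;
    # the paper initializes it randomly).
    centres = [X[0]]
    while True:
        # Hidden-layer outputs for the current set of centres.
        Phi = np.exp(-np.array([[np.sum((x - c) ** 2) for c in centres]
                                for x in X]) / (2.0 * width ** 2))
        # "Parameter adjusting": refit output weights by least squares.
        w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
        resid = y - Phi @ w
        # Stop when the fit is good enough or the network is full.
        if np.max(np.abs(resid)) < tol or len(centres) >= max_units:
            return np.array(centres), w
        # "Structure learning": grow a unit where the error is worst.
        centres.append(X[int(np.argmax(np.abs(resid)))])
```

The residual-driven growth rule is a common stand-in for sensitivity-based criteria; swapping in the paper's criterion would only change the line that chooses where to append the next centre.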