Article

An approach for constructing parsimonious generalized Gaussian kernel regression models

Intelligent Systems and Diagnostics Group, Department of Electronic and Computer Engineering, University of Portsmouth, Anglesea Building, Anglesea Road, Portsmouth PO1 3DJ, UK
Neurocomputing 12/2004; 62:441-457. DOI: 10.1016/j.neucom.2004.06.003
Source: DBLP

ABSTRACT: The paper proposes a novel construction algorithm for generalized Gaussian kernel regression models. Each kernel regressor in the generalized Gaussian kernel regression model has an individual diagonal covariance matrix, which is determined by maximizing the correlation between the training data and the regressor using a repeated guided random search based on boosting optimization. The standard orthogonal least squares algorithm is then used to select a sparse generalized kernel regression model from the resulting full regression matrix. Experimental results involving two real data sets demonstrate the effectiveness of the proposed regression modeling approach.
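
As a concrete illustration of the two-stage construction described above, the following Python sketch (not the authors' code) builds a full regression matrix from generalized Gaussian kernels, each with its own diagonal covariance, and then selects a sparse subset by classical orthogonal least squares. The boosting-based guided random search for the covariances is replaced here by a plain random draw, and all names and values are illustrative.

```python
import numpy as np

def gaussian_kernel(X, centre, diag_cov):
    """Response of one generalized Gaussian regressor at each row of X.

    X        : (N, d) input data
    centre   : (d,)   kernel centre
    diag_cov : (d,)   diagonal of this regressor's covariance matrix
    """
    diff = X - centre
    return np.exp(-0.5 * np.sum(diff ** 2 / diag_cov, axis=1))

def ols_forward_selection(P, y, n_terms):
    """Standard orthogonal least squares (Gram-Schmidt) forward selection:
    greedily pick the column of P with the largest error-reduction ratio."""
    selected, basis = [], []
    y_energy = y @ y
    for _ in range(n_terms):
        best_err, best_j, best_w = -1.0, None, None
        for j in range(P.shape[1]):
            if j in selected:
                continue
            w = P[:, j].copy()
            for q in basis:                       # orthogonalize against chosen terms
                w -= (q @ P[:, j]) / (q @ q) * q
            wn = w @ w
            if wn < 1e-12:                        # numerically dependent column
                continue
            err = (w @ y) ** 2 / (wn * y_energy)  # error-reduction ratio
            if err > best_err:
                best_err, best_j, best_w = err, j, w
        selected.append(best_j)
        basis.append(best_w)
    return selected

# Hypothetical usage: centres at the training inputs; each regressor's diagonal
# covariance is drawn at random as a stand-in for the paper's boosting-based
# guided random search.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = np.sin(3.0 * X[:, 0]) * np.cos(2.0 * X[:, 1])
covs = rng.uniform(0.05, 0.5, size=X.shape)       # one diagonal per regressor
P = np.column_stack([gaussian_kernel(X, X[i], covs[i]) for i in range(len(X))])
terms = ols_forward_selection(P, y, n_terms=15)
theta, *_ = np.linalg.lstsq(P[:, terms], y, rcond=None)  # sparse model weights
```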

ABSTRACT: Inertial tolerancing [1]-[3] is a statistical way of expressing the tolerance of a characteristic without defining a specification interval, as is traditionally the case. Conformity is instead defined by the "inertia" (a Taguchi loss function [4]) around the target. Although this tolerancing method offers numerous advantages in terms of the quality/cost ratio, it raises new problems in the customer/supplier relationship. This paper develops an acceptance sampling method designed for inertial tolerancing and a sorting strategy for when a lot is rejected.
International Journal of Web Engineering and Technology 04/2013; 2(10):265-271.
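
For readers unfamiliar with the criterion, here is a minimal sketch, assuming the standard definition of inertia I = sqrt(sigma^2 + (mean - target)^2), i.e. the root of the mean quadratic (Taguchi) loss around the target; the simulated lot and the inertia limit below are made up for illustration.

```python
import numpy as np

def inertia(sample, target):
    """Inertia of a characteristic: sqrt(variance + bias^2), i.e. the root
    mean squared deviation of the sample from the target value."""
    sample = np.asarray(sample, dtype=float)
    bias = sample.mean() - target
    return np.sqrt(sample.var() + bias ** 2)

# Conformity is judged against a maximum allowed inertia rather than against
# an interval: a well-centred, low-spread lot passes without a tolerance band.
lot = np.random.default_rng(1).normal(loc=10.02, scale=0.03, size=500)
print(inertia(lot, target=10.0) <= 0.05)   # True for this simulated lot
```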

ABSTRACT: The radial basis function (RBF) network has its foundation in conventional approximation theory and has the capability of universal approximation. The RBF network is a popular alternative to the well-known multilayer perceptron (MLP), since it has a simpler structure and a much faster training process. In this paper, we give a comprehensive survey of the RBF network and its learning. Many aspects of the RBF network are described, such as network structure, universal approximation capability, radial basis functions, RBF network learning, structure optimization, normalized RBF networks, application to dynamic system modeling, and nonlinear complex-valued signal processing. We also compare the features and capabilities of the two models.
03/2012; 2012. DOI: 10.5402/2012/324194
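
To make the contrast with the MLP concrete, here is a minimal Gaussian RBF network sketch (an illustration, not the survey's code), assuming a shared kernel width and centres drawn from the training data: the hidden layer is fixed, so training reduces to one linear least-squares solve for the output weights, which is what makes it much faster than backpropagation.

```python
import numpy as np

def rbf_hidden(X, centres, width):
    """Hidden-layer matrix of a Gaussian RBF network: one column per centre."""
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(300, 1))
y = np.sinc(X[:, 0])                                     # target function
centres = X[rng.choice(len(X), size=20, replace=False)]  # centres from the data

H = rbf_hidden(X, centres, width=0.5)
w, *_ = np.linalg.lstsq(H, y, rcond=None)  # linear solve for the output weights
y_hat = H @ w                              # network predictions on the inputs
```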

ABSTRACT: Instead of minimizing the observed training error, Support Vector Regression (SVR) attempts to minimize a bound on the generalization error so as to achieve good generalized performance. The idea of SVR is based on the computation of a linear regression function in a high-dimensional feature space into which the input data are mapped via a nonlinear function. SVR has been applied in various fields: time series and financial (noisy and risky) prediction, approximation of complex engineering analyses, convex quadratic programming and choices of loss functions, and so on. In this paper, an attempt has been made to review the existing theory, methods, recent developments, and scope of SVR.
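
As a concrete starting point for the ideas above, here is a minimal epsilon-SVR sketch on synthetic data; the use of scikit-learn is an assumption of convenience, since the review itself is tool-agnostic.

```python
import numpy as np
from sklearn.svm import SVR

# epsilon-SVR with an RBF kernel: residuals inside the epsilon-tube incur no
# loss, so the fitted function depends only on a subset of the data (the
# support vectors), while the RBF kernel supplies the implicit nonlinear map
# into a high-dimensional feature space.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 5.0, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)   # noisy observations

model = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X, y)
y_hat = model.predict(X)
print(model.support_.size, "support vectors")           # sparsity of the solution
```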