Article

An approach for constructing parsimonious generalized Gaussian kernel regression models

Intelligent Systems and Diagnostics Group, Department of Electronic and Computer Engineering, University of Portsmouth, Anglesea Building, Anglesea Road, Portsmouth PO1 3DJ, UK; School of Electronics and Computer Science, University of Southampton, Highfield, Southampton SO17 1BJ, UK
Neurocomputing (Impact Factor: 1.63). 12/2004; DOI: 10.1016/j.neucom.2004.06.003
Source: DBLP

ABSTRACT The paper proposes a novel construction algorithm for generalized Gaussian kernel regression models. Each kernel regressor in the generalized Gaussian kernel regression model has an individual diagonal covariance matrix, which is determined by maximizing the correlation between the training data and the regressor using a repeated guided random search based on boosting optimization. The standard orthogonal least squares algorithm is then used to select a sparse generalized kernel regression model from the resulting full regression matrix. Experimental results involving two real data sets demonstrate the effectiveness of the proposed regression modeling approach.
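
A minimal sketch of the two-stage construction described above, assuming numpy is available. The plain random search stands in for the paper's repeated guided random search based on boosting optimization, and ols_select is a simplified forward orthogonal least squares selection; the function names and parameters are illustrative, not the authors' exact implementation:

import numpy as np

def gaussian_regressor(X, center, diag_cov):
    # Generalized Gaussian kernel with an individual diagonal covariance.
    return np.exp(-0.5 * ((X - center) ** 2 / diag_cov).sum(axis=1))

def fit_diag_cov(X, y, center, n_trials=200, seed=None):
    # Choose the diagonal covariance that maximizes |corr(regressor, y)|,
    # here by plain random search (a stand-in for the boosting-based search).
    rng = np.random.default_rng(seed)
    best_cov, best_corr = None, -np.inf
    for _ in range(n_trials):
        cand = 10.0 ** rng.uniform(-1.0, 1.0, size=X.shape[1])  # log-uniform widths
        corr = abs(np.corrcoef(gaussian_regressor(X, center, cand), y)[0, 1])
        if corr > best_corr:
            best_cov, best_corr = cand, corr
    return best_cov

def ols_select(P, y, n_terms):
    # Greedy forward selection by error-reduction ratio, orthogonalizing the
    # remaining candidate columns after each pick (OLS-style).
    P = P.astype(float).copy()
    selected = []
    for _ in range(n_terms):
        err = [(y @ P[:, j]) ** 2 / ((P[:, j] @ P[:, j]) * (y @ y) + 1e-12)
               if j not in selected else -np.inf for j in range(P.shape[1])]
        j = int(np.argmax(err))
        selected.append(j)
        w = P[:, j]
        for k in range(P.shape[1]):
            if k not in selected:
                P[:, k] -= (w @ P[:, k]) / (w @ w) * w
    return selected

# Toy usage: one candidate regressor per training point, then sparse selection.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(80, 2))
y = np.sin(X[:, 0]) * np.exp(-X[:, 1] ** 2)
P = np.column_stack([gaussian_regressor(X, x, fit_diag_cov(X, y, x, seed=1))
                     for x in X])          # full regression matrix
terms = ols_select(P, y, n_terms=8)        # indices of the sparse model's kernels

Because every kernel searches its own diagonal covariance before selection, the candidate pool is richer than in a fixed-width RBF construction, which is what lets the OLS stage pick a smaller (more parsimonious) subset.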

Related publications:

  • ABSTRACT: This paper presents a growing algorithm for designing the architecture of radial basis function (RBF) neural networks, called the growing RBF (GRBF) algorithm. GRBF starts from a single prototype randomly initialized in the feature space, and the algorithm consists of two major parts: a structure-learning phase and a parameter-adjusting phase. In the structure-learning phase, a growing strategy judges when and where hidden nodes should be added, based on a sensitivity analysis of the network output. In the parameter-adjusting phase, all the network weights are adjusted to improve the overall capabilities of the GRBF. Finally, the proposed GRBF network is applied to tracking nonlinear functions. A computational complexity analysis and simulation results confirm the efficiency of the proposed algorithm. (A simplified sketch of this grow-then-adjust loop appears after this list.)
    08/2009: pages 73-82.
  • ABSTRACT: The radial basis function (RBF) network has its foundation in conventional approximation theory and has the capability of universal approximation. The RBF network is a popular alternative to the well-known multilayer perceptron (MLP), since it has a simpler structure and a much faster training process. In this paper, we give a comprehensive survey of the RBF network and its learning. Many aspects of the RBF network are described, including network structure, universal approximation capability, radial basis functions, RBF network learning, structure optimization, normalized RBF networks, application to dynamic system modeling, and nonlinear complex-valued signal processing. We also compare the features and capabilities of the two models.
    ISRN Applied Mathematics. 03/2012; 2012.
  • ABSTRACT: This paper presents a repair algorithm for the design of a radial basis function (RBF) neural network. The proposed repair RBF (RRBF) algorithm starts from a single prototype randomly initialized in the feature space. The algorithm has two main phases: an architecture-learning phase and a parameter-adjustment phase. The architecture-learning phase uses a repair strategy based on a sensitivity analysis (SA) of the network's output to judge when and where hidden nodes should be added; new nodes are added to repair the architecture whenever the current prototypes do not meet the accuracy requirements. The parameter-adjustment phase then improves the network's capabilities by modifying all of its weights. The algorithm is applied to two problems: approximating a nonlinear function, and modeling chemical oxygen demand (COD), a key parameter in the wastewater treatment process. Simulation results show that the algorithm provides an efficient solution to both problems. (The sketch after this list also covers this repair strategy.)
    International Journal of Neural Systems 02/2010; 20(1):63-74. · 5.05 Impact Factor
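
A minimal sketch of the grow/repair idea shared by the first and third items above, assuming the same numpy setup as the earlier sketch. The growth criterion here, adding a node at the training point with the largest residual error, is a crude surrogate for the sensitivity analysis those papers use, whose exact form the abstracts do not give; likewise, re-fitting only the output weights by least squares is a simplification of their full parameter-adjustment phase:

import numpy as np

def rbf_design(X, centers, width=1.0):
    # Design matrix of fixed-width Gaussian hidden nodes.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-0.5 * d2 / width ** 2)

def grow_rbf(X, y, max_nodes=10, tol=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.integers(len(X))][None, :]       # single random prototype
    while True:
        H = rbf_design(X, centers)
        w, *_ = np.linalg.lstsq(H, y, rcond=None)    # adjust all output weights
        resid = y - H @ w
        if np.mean(resid ** 2) < tol or len(centers) >= max_nodes:
            return centers, w
        # Structure learning: grow a node where the error is largest
        # (stand-in for the papers' output-sensitivity criterion).
        centers = np.vstack([centers, X[[np.argmax(np.abs(resid))]]])

centers, w = grow_rbf(X, y)   # X, y as in the previous sketch

The loop alternates the two phases both abstracts describe: check whether the current structure is adequate, grow or repair it where it fails, then re-adjust the weights of the enlarged network.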
