Article

Non‐parametric regression with wavelet kernels

Applied Stochastic Models in Business and Industry (Impact Factor: 0.53). 03/2005; 21(2):153 - 163. DOI: 10.1002/asmb.533

ABSTRACT This paper introduces a method to construct reproducing wavelet kernel Hilbert spaces for non-parametric regression estimation when the sampling points are not equally spaced. Another objective is to make high-dimensional wavelet estimation problems tractable. It then provides a theoretical foundation for building reproducing kernels from operators and a practical technique for obtaining reproducing kernel Hilbert spaces spanned by a set of wavelets. A multiscale approximation technique that aims to take advantage of the multiresolution structure of wavelets is also described. Examples on a toy regression task and a real-world problem illustrate the effectiveness of these wavelet kernels. Copyright © 2005 John Wiley & Sons, Ltd.
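The idea can be illustrated with a small numerical sketch. The Python code below is not the authors' operator-based construction; it assumes a finite dictionary of dilated and translated Mexican-hat wavelets, forms the induced kernel K(x, y) = sum_{j,k} psi_{j,k}(x) psi_{j,k}(y), and uses it in kernel ridge regression on unequally spaced sample points. The scales, translations, and ridge parameter lam are illustrative choices only.

import numpy as np

def mexican_hat(u):
    # Mexican-hat (Ricker) mother wavelet
    return (1.0 - u**2) * np.exp(-0.5 * u**2)

def wavelet_features(x, scales=(1.0, 0.5, 0.25), n_shifts=16):
    # Evaluate a finite dictionary psi_{j,k}(x) = psi((x - b_k) / a_j) / sqrt(a_j)
    x = np.asarray(x, dtype=float).ravel()
    feats = []
    for a in scales:
        for b in np.linspace(0.0, 1.0, n_shifts):        # translations over [0, 1]
            feats.append(mexican_hat((x - b) / a) / np.sqrt(a))
    return np.column_stack(feats)                        # shape (n_samples, n_atoms)

def wavelet_kernel(x, y):
    # K(x, y) = sum_{j,k} psi_{j,k}(x) * psi_{j,k}(y): the reproducing kernel of the
    # finite-dimensional space spanned by the wavelet dictionary
    return wavelet_features(x) @ wavelet_features(y).T

# Kernel ridge regression on unequally spaced sample points
rng = np.random.default_rng(0)
x_train = np.sort(rng.uniform(0.0, 1.0, 60))             # irregular design points
y_train = np.sin(6 * np.pi * x_train**2) + 0.1 * rng.standard_normal(60)

lam = 1e-2                                               # ridge regularization
K = wavelet_kernel(x_train, x_train)
alpha = np.linalg.solve(K + lam * np.eye(len(x_train)), y_train)

x_test = np.linspace(0.0, 1.0, 200)
y_hat = wavelet_kernel(x_test, x_train) @ alpha          # estimated regression function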

  • ABSTRACT: Estimating a non-flat function that contains both steep and smooth variations is a hard problem. The results achieved by common support vector methods such as SVR, LPR, and LS-SVM are often unsatisfactory because they cannot avoid underfitting and overfitting simultaneously. This paper treats the problem as linear regression in a combined feature space implicitly defined by a set of translation-invariant kernels with different scales, and proposes a multi-scale support vector regression (MS-SVR) method. In the reported experiments, MS-SVR performs better than SVR, LPR, and LS-SVM.
    Neurocomputing 12/2006; 70(1):420-429. DOI:10.1016/j.neucom.2005.12.128 · 2.01 Impact Factor
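A minimal sketch of the multi-scale idea above, assuming the combined feature space is induced by a sum of RBF kernels with different widths and that the regression is solved with scikit-learn's SVR on a precomputed Gram matrix; the widths, C, and epsilon are illustrative values, not the ones used in the paper.

import numpy as np
from sklearn.svm import SVR

def multi_scale_kernel(X, Y, gammas=(0.5, 5.0, 50.0)):
    # K(x, y) = sum_s exp(-gamma_s * ||x - y||^2): a sum of RBF kernels at
    # several scales, itself a valid positive-definite kernel
    X, Y = np.atleast_2d(X), np.atleast_2d(Y)
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return sum(np.exp(-g * sq) for g in gammas)

# Target with both a steep and a smooth region
rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, (120, 1))
y = np.where(X[:, 0] > 0, np.sin(20 * X[:, 0]), 0.2 * X[:, 0])

svr = SVR(kernel="precomputed", C=10.0, epsilon=0.01)
svr.fit(multi_scale_kernel(X, X), y)                     # Gram matrix of training data

X_new = np.linspace(-1.0, 1.0, 200)[:, None]
y_pred = svr.predict(multi_scale_kernel(X_new, X))       # cross-kernel: test vs. train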
  • ABSTRACT: Combining support vector machines (SVM) with wavelet analysis, we construct a wavelet SVM (WSVM) classifier based on wavelet kernel functions in a reproducing kernel Hilbert space (RKHS). In conventional kernel theory, SVM faces the bottleneck of kernel parameter selection, which leads to time-consuming training and low classification accuracy. The wavelet kernel in RKHS is a kind of multidimensional wavelet function that can approximate arbitrary nonlinear functions; implications for semiparametric estimation are also proposed. An Airborne Operational Modular Imaging Spectrometer II (OMIS II) hyperspectral remote sensing image with 64 bands and Reflective Optics System Imaging Spectrometer (ROSIS) data with 115 bands were used to evaluate the performance and accuracy of the proposed WSVM classifier. The experimental results indicate that the WSVM classifier obtains the highest accuracy when the Coiflet kernel function is used in the wavelet transform. Compared with traditional classifiers, including Spectral Angle Mapping (SAM), Minimum Distance Classification (MDC), and an SVM classifier using a radial basis function kernel, the proposed WSVM classifier noticeably improves classification accuracy.
    Optics Communications 12/2010; 283(24):4978-4984. DOI:10.1016/j.optcom.2010.08.009 · 1.54 Impact Factor
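A hedged sketch of a wavelet-kernel SVM classifier in the spirit of the abstract above. Because Coiflets have no closed form, a Morlet-style product kernel is used here as a stand-in for the paper's Coiflet kernel, and random vectors replace the OMIS II / ROSIS pixels; the scale a and penalty C are arbitrary.

import numpy as np
from sklearn.svm import SVC

def morlet_wavelet_kernel(X, Y, a=16.0):
    # Translation-invariant product kernel
    # K(x, y) = prod_d cos(1.75 * u_d) * exp(-u_d^2 / 2) with u = (x - y) / a
    U = (np.atleast_2d(X)[:, None, :] - np.atleast_2d(Y)[None, :, :]) / a
    return np.prod(np.cos(1.75 * U) * np.exp(-0.5 * U**2), axis=-1)

rng = np.random.default_rng(2)
n_bands = 64                                  # band count similar to OMIS II
X = rng.standard_normal((200, n_bands))       # stand-in for hyperspectral pixels
y = (X[:, :8].sum(axis=1) > 0).astype(int)    # synthetic two-class labels

clf = SVC(kernel="precomputed", C=1.0)
clf.fit(morlet_wavelet_kernel(X, X), y)
train_acc = clf.score(morlet_wavelet_kernel(X, X), y)    # resubstitution accuracy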
  • ABSTRACT: The prediction algorithm plays an important part in the development of wind power forecasting. In this paper, based on the principles of support vector machines (SVM) and wavelets, a wavelet SVM model for short-term wind power prediction is built after analyzing the characteristics of the power curves of wind turbine generator systems. Operational data from a wind farm in North China are used to test the proposed model; the mean relative error of the wavelet SVM model is 6.05% lower than that of the traditional RBF SVM model. For the one-hour-ahead time frame, the average error of the optimal wind turbine prediction method is 12.07%.
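A rough sketch of one-hour-ahead forecasting with a wavelet-kernel SVR, assuming lagged hourly power values as inputs and a synthetic series in place of the wind-farm measurements; the kernel choice, lag count, and SVR parameters are illustrative and not taken from the paper.

import numpy as np
from sklearn.svm import SVR

def morlet_kernel(X, Y, a=2.0):
    # Same Morlet-style wavelet kernel as in the classification sketch above
    U = (np.atleast_2d(X)[:, None, :] - np.atleast_2d(Y)[None, :, :]) / a
    return np.prod(np.cos(1.75 * U) * np.exp(-0.5 * U**2), axis=-1)

# Synthetic hourly power series standing in for the wind-farm data
rng = np.random.default_rng(3)
t = np.arange(600)
power = 0.5 + 0.3 * np.sin(2 * np.pi * t / 24) + 0.05 * rng.standard_normal(600)

n_lags = 6                                               # previous 6 hours as inputs
X = np.column_stack([power[i:len(power) - n_lags + i] for i in range(n_lags)])
y = power[n_lags:]                                       # value one hour ahead
X_tr, X_te, y_tr, y_te = X[:400], X[400:], y[:400], y[400:]

svr = SVR(kernel="precomputed", C=10.0, epsilon=0.01)
svr.fit(morlet_kernel(X_tr, X_tr), y_tr)
y_hat = svr.predict(morlet_kernel(X_te, X_tr))
mre = np.mean(np.abs(y_hat - y_te) / np.abs(y_te))       # mean relative error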