Joo Yong Shim

Catholic University of Daegu, Kayō, North Gyeongsang, South Korea


Publications (5)

  • Joo Yong Shim · Changha Hwang · Sungkyun Nau
    ABSTRACT: The least squares support vector machine (LS-SVM) is a widely applicable and useful machine learning technique for classification and regression. The solution of LS-SVM is easily obtained from linear Karush-Kuhn-Tucker conditions rather than from the quadratic programming problem of SVM. However, LS-SVM is less robust because of its assumption on the errors and its use of a squared loss function. In this paper we propose a robust LS-SVM regression method that imposes robustness on the LS-SVM regression estimate by assigning each data point a weight representing its membership degree in a cluster. In numerical studies, the robust LS-SVM regression is compared with ordinary LS-SVM regression.
    Conference Paper · Sep 2006
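The weighted LS-SVM regression summarized in the abstract above reduces to a single linear solve of the KKT system. The following is a minimal sketch, not the authors' exact algorithm: the RBF kernel, the way the outlier weight is chosen, and the parameter values (`gamma`, `sigma`) are all illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma):
    # Gaussian RBF kernel matrix between two sets of points
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def weighted_lssvm_fit(X, y, w, gamma, sigma):
    # Solve the linear KKT system of weighted LS-SVM regression:
    # [ 0        1^T              ] [ b     ]   [ 0 ]
    # [ 1   K + diag(1/(gamma*w)) ] [ alpha ] = [ y ]
    # A small weight w_i means heavy regularization of point i,
    # so low-membership (outlying) points barely influence the fit.
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.diag(1.0 / (gamma * w))
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[1:], sol[0]          # alpha, b

def lssvm_predict(Xtr, alpha, b, Xte, sigma):
    return rbf_kernel(Xte, Xtr, sigma) @ alpha + b

# Toy data with one outlier that receives a near-zero membership weight.
rng = np.random.default_rng(0)
X = np.linspace(0, 1, 30).reshape(-1, 1)
y = np.sin(2 * np.pi * X).ravel() + 0.05 * rng.standard_normal(30)
y[15] += 3.0                        # inject an outlier
w = np.ones(30)
w[15] = 1e-3                        # membership-style downweighting
alpha, b = weighted_lssvm_fit(X, y, w, gamma=10.0, sigma=0.1)
yhat = lssvm_predict(X, alpha, b, X, sigma=0.1)
```

Because the outlier's weight is tiny, the fitted curve ignores it (its residual stays large), which is the robustness effect the abstract describes.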
  • Changha Hwang · Dug Hun Hong · Eunyoung Na · [...] · Joo Yong Shim
    ABSTRACT: This paper deals with interval regression analysis using the support vector machine and the quantile regression method. The algorithm consists of two phases: the identification of the main trend of the data, and interval regression based on the acquired main trend. Using the principle of the support vector machine, the linear interval regression can be extended to nonlinear interval regression. Numerical studies are then presented which indicate the performance of this algorithm.
    Conference Paper · Aug 2005
  • Changha Hwang · Joo Yong Shim
    ABSTRACT: This paper deals with the estimation of linear and nonlinear quantile regressions using the idea of the support vector machine. Accordingly, the optimization problem is transformed into the Lagrangian dual problem, which is easier to solve. In particular, for the nonlinear quantile regression the idea of a kernel function is introduced, which allows us to perform operations in the input space rather than in the high-dimensional feature space. Experimental results are then presented which illustrate the performance of the proposed method.
    Conference Paper · Aug 2005
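Quantile regression, as in the abstract above, minimizes the pinball (check) loss, whose minimizer is the conditional tau-quantile. The sketch below fits the linear case by plain subgradient descent rather than the paper's Lagrangian dual formulation; the learning rate, epoch count, and toy data are illustrative assumptions.

```python
import numpy as np

def linear_quantile_fit(X, y, tau, lr=0.02, epochs=5000):
    # Minimize the average pinball loss
    #   L(r) = tau * r        if r >= 0   (point above the fit)
    #        = (tau - 1) * r  if r <  0   (point below the fit)
    # with r = y - (Xw + b), by subgradient descent.
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        r = y - (X @ w + b)
        # subgradient of the loss w.r.t. the prediction f = Xw + b
        g = np.where(r >= 0, -tau, 1.0 - tau)
        w -= lr * (X.T @ g) / n
        b -= lr * g.mean()
    return w, b

# Toy data: at the optimum, about tau of the points lie below the fit.
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(500, 1))
y = 2 * X.ravel() + rng.normal(0, 0.2, 500)
w, b = linear_quantile_fit(X, y, tau=0.9)
frac_below = np.mean(y <= X @ w + b)    # should be close to 0.9
```

Replacing the inner products with a kernel function, as the abstract notes, lifts this to the nonlinear case without ever working in the feature space explicitly.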
  • Joo Yong Shim · Changha Hwang
    Conference Paper · Jan 2005
  • Byung-Joo Kim · Joo Yong Shim · Changha Hwang · [...] · Joon Hyun Song
    ABSTRACT: A new incremental kernel principal component analysis is proposed for nonlinear feature extraction from data. One problem with batch kernel principal component analysis is that the computation becomes prohibitive when the data set is large. Another is that, in order to update the eigenvectors with additional data, the whole decomposition must be recomputed from scratch. The proposed method overcomes these problems by incrementally updating the eigenspace and by using the empirical kernel map as the kernel function. The proposed method requires less memory than batch kernel principal component analysis and can easily be improved by re-learning the data. Our experiments show that the proposed method is comparable in performance to batch kernel principal component analysis on a nonlinear classification data set.
    Conference Paper · Oct 2003
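The batch KPCA baseline whose cost motivates the incremental method above can be sketched as follows. This shows the full n-by-n Gram matrix and its eigendecomposition (the O(n^2) memory and O(n^3) time the incremental update avoids), not the authors' incremental algorithm; the RBF kernel and all parameters are assumptions for the toy example.

```python
import numpy as np

def batch_kpca(X, n_components, sigma):
    # Batch kernel PCA: build and double-center the full n x n Gram
    # matrix, then eigendecompose it. Both steps scale poorly with n,
    # and adding one new point forces a full recomputation.
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * sigma ** 2))
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one     # center in feature space
    vals, vecs = np.linalg.eigh(Kc)                # ascending eigenvalues
    vals, vecs = vals[::-1], vecs[:, ::-1]         # largest first
    # scale eigenvectors so feature-space components have unit norm
    alphas = vecs[:, :n_components] / np.sqrt(np.maximum(vals[:n_components], 1e-12))
    return Kc @ alphas                             # projections of the data

# Two concentric rings: nonlinear structure that linear PCA cannot unfold.
rng = np.random.default_rng(2)
t = rng.uniform(0, 2 * np.pi, 200)
r = np.concatenate([np.full(100, 1.0), np.full(100, 3.0)])
X = np.c_[r * np.cos(t), r * np.sin(t)] + 0.05 * rng.standard_normal((200, 2))
Z = batch_kpca(X, n_components=2, sigma=1.0)
```

An incremental scheme instead maintains a small eigenspace and updates it per arriving point, so the Gram matrix is never held in full.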