Joo Yong Shim

Catholic University of Daegu, Gyeongsan, North Gyeongsang, South Korea

Publications (5) · 0 Total impact

  • Joo Yong Shim, Changha Hwang, Sungkyun Nau
    ABSTRACT: The least squares support vector machine (LS-SVM) is a widely applicable and useful machine learning technique for classification and regression. The solution of LS-SVM is obtained from a system of linear Karush-Kuhn-Tucker conditions rather than from the quadratic programming problem of the standard SVM. However, LS-SVM is less robust because of its assumptions on the errors and its use of a squared loss function. In this paper we propose a robust LS-SVM regression method which imposes robustness on the LS-SVM regression estimate by assigning each data point a weight that represents its degree of membership to a cluster. In the numerical studies, the robust LS-SVM regression is compared with the ordinary LS-SVM regression. (A minimal weighted-system sketch appears after this publication list.)
    Advances in Natural Computation, Second International Conference, ICNC 2006, Xi'an, China, September 24-28, 2006. Proceedings, Part I; 01/2006
  • ABSTRACT: This paper deals with interval regression analysis using the support vector machine and the quantile regression method. The algorithm consists of two phases: identification of the main trend of the data, followed by interval regression based on the acquired main trend. Using the principle of the support vector machine, the linear interval regression can be extended to nonlinear interval regression. Numerical studies are then presented which indicate the performance of this algorithm. (A two-phase sketch appears after this publication list.)
    Fuzzy Systems and Knowledge Discovery, Second International Conference, FSKD 2005, Changsha, China, August 27-29, 2005, Proceedings, Part I; 01/2005
  • Joo Yong Shim, Changha Hwang
    Advances in Natural Computation, First International Conference, ICNC 2005, Changsha, China, August 27-29, 2005, Proceedings, Part I; 01/2005
  • Changha Hwang, Joo Yong Shim
    ABSTRACT: This paper deals with the estimation of linear and nonlinear quantile regressions using the idea of the support vector machine. The optimization problem is transformed into its Lagrangian dual problem, which is easier to solve. In particular, for nonlinear quantile regression the idea of a kernel function is introduced, which allows us to perform operations in the input space rather than in the high-dimensional feature space. Experimental results are then presented which illustrate the performance of the proposed method. (A kernelized sketch appears after this publication list.)
    Advances in Natural Computation, First International Conference, ICNC 2005, Changsha, China, August 27-29, 2005, Proceedings, Part I; 01/2005
  • ABSTRACT: A new incremental kernel principal component analysis is proposed for nonlinear feature extraction from data. One problem of batch kernel principal component analysis is that the computation becomes prohibitive when the data set is large. Another is that, in order to update the eigenvectors with additional data, the whole decomposition must be recomputed from scratch. The proposed method overcomes these problems by incrementally updating the eigenspace and by using the empirical kernel map as the kernel function. It requires less memory than batch kernel principal component analysis and can easily be improved by re-learning the data. Our experiments show that the proposed method is comparable in performance to batch kernel principal component analysis on a nonlinear classification data set. (A streaming sketch appears after this publication list.)
    Foundations of Intelligent Systems, 14th International Symposium, ISMIS 2003, Maebashi City, Japan, October 28-31, 2003, Proceedings; 01/2003
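
Illustrative code sketches

The weighted LS-SVM regression paper above hinges on solving the usual LS-SVM linear KKT system with a per-point weight on the squared loss. The sketch below is a minimal rendering of that idea under stated assumptions: it uses an RBF kernel, and it derives the weights from the residuals of an ordinary first pass rather than from the paper's cluster-membership degrees. All names (`weighted_lssvm_fit`, `C`, `gamma`) are illustrative, not from the paper.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X1 and X2
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return np.exp(-gamma * d2)

def weighted_lssvm_fit(X, y, weights, C=10.0, gamma=1.0):
    # Weighted LS-SVM KKT system (b = bias, alpha = dual coefficients):
    #   [ 0       1^T              ] [b]       [0]
    #   [ 1   K + diag(1/(C*w_i))  ] [alpha] = [y]
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, gamma) + np.diag(1.0 / (C * weights))
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]

def lssvm_predict(X_train, alpha, b, X_new, gamma=1.0):
    return rbf_kernel(X_new, X_train, gamma) @ alpha + b

# Toy data with outliers; an ordinary pass supplies residual-based weights.
rng = np.random.default_rng(0)
X = np.linspace(0, 4, 60)[:, None]
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(60)
y[::15] += 2.0                          # inject outliers
b0, a0 = weighted_lssvm_fit(X, y, np.ones(60))
resid = np.abs(y - lssvm_predict(X, a0, b0, X))
w = 1.0 / (1.0 + resid**2)              # stand-in for membership degrees
b1, a1 = weighted_lssvm_fit(X, y, w)    # robust re-fit down-weights outliers
```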
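
The two-phase interval regression algorithm (identify the main trend, then build the interval around it) can be mimicked with three quantile fits. This sketch makes simplifying assumptions: it trains a plain linear model by subgradient descent on the pinball loss, where the paper uses an SVM formulation, and the 0.1/0.9 levels for the band are arbitrary choices; `pinball_fit_linear` is a hypothetical helper, not from the paper.

```python
import numpy as np

def pinball_fit_linear(X, y, tau, lr=0.01, epochs=2000):
    # Linear quantile regression via subgradient descent on the pinball loss
    Xb = np.hstack([X, np.ones((len(X), 1))])   # append intercept column
    beta = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        r = y - Xb @ beta
        g = np.where(r > 0, -tau, 1.0 - tau)    # d(pinball)/d(prediction)
        beta -= lr * (Xb.T @ g) / len(y)
    return beta

rng = np.random.default_rng(1)
X = np.sort(rng.uniform(0, 5, 80))[:, None]
y = 2 * X.ravel() + rng.standard_normal(80)

trend = pinball_fit_linear(X, y, 0.5)   # phase 1: main trend (median)
lower = pinball_fit_linear(X, y, 0.1)   # phase 2: interval bounds
upper = pinball_fit_linear(X, y, 0.9)   #          around the trend
```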
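
The quantile regression paper's kernel trick can be shown directly: represent f(x) = Σᵢ αᵢ k(x, xᵢ) + b, so that every computation happens through kernel evaluations on the input space. The paper solves the Lagrangian dual QP; the sketch below instead minimizes the same pinball objective (plus a small ridge penalty) by subgradient descent, which is only a stand-in for the authors' solver.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=0.5):
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return np.exp(-gamma * d2)

def kernel_quantile_fit(X, y, tau, lam=1e-3, lr=0.05, epochs=3000):
    # f(x) = K(x, X) @ alpha + b, trained on pinball loss + ridge penalty
    n = len(y)
    K = rbf_kernel(X, X)
    alpha, b = np.zeros(n), 0.0
    for _ in range(epochs):
        r = y - (K @ alpha + b)
        g = np.where(r > 0, -tau, 1.0 - tau)    # pinball subgradient
        alpha -= lr * (K @ g / n + lam * K @ alpha)
        b -= lr * g.mean()
    return alpha, b

rng = np.random.default_rng(2)
X = np.sort(rng.uniform(0, 4, 60))[:, None]
y = np.sin(2 * X).ravel() + 0.2 * rng.standard_normal(60)
a90, b90 = kernel_quantile_fit(X, y, tau=0.9)   # nonlinear 90th-percentile curve
f90 = rbf_kernel(X, X) @ a90 + b90              # evaluated via kernels only
```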
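
The incremental KPCA abstract combines an empirical kernel map (representing each point by its kernel values against a fixed reference set) with an eigenspace updated one sample at a time, so the full kernel matrix is never decomposed from scratch. The class below is one simple way to realize that combination, using a Welford-style streaming covariance in the empirical-kernel-map space; it is a sketch, not the authors' exact eigenspace update, and `IncrementalKPCA` and the anchor-set choice are illustrative.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return np.exp(-gamma * d2)

class IncrementalKPCA:
    # Empirical kernel map onto fixed anchors, then streaming PCA:
    # the covariance is updated per sample, so no batch n x n eigenproblem.
    def __init__(self, anchors, n_components=2, gamma=1.0):
        self.anchors, self.gamma, self.k = anchors, gamma, n_components
        d = len(anchors)
        self.n, self.mean, self.cov = 0, np.zeros(d), np.zeros((d, d))

    def partial_fit(self, x):
        phi = rbf_kernel(x[None, :], self.anchors, self.gamma).ravel()
        self.n += 1
        delta = phi - self.mean
        self.mean += delta / self.n
        self.cov += np.outer(delta, phi - self.mean)   # Welford update

    def transform(self, X):
        # top-k eigenvectors of the small d x d covariance estimate
        evals, evecs = np.linalg.eigh(self.cov / max(self.n - 1, 1))
        proj = rbf_kernel(X, self.anchors, self.gamma) - self.mean
        return proj @ evecs[:, ::-1][:, :self.k]

rng = np.random.default_rng(3)
data = rng.standard_normal((200, 2))
ikpca = IncrementalKPCA(anchors=data[:20].copy(), n_components=2)
for x in data:
    ikpca.partial_fit(x)        # single pass over the stream
Z = ikpca.transform(data)       # nonlinear features for classification
```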