TaeSeok Jin

Dongseo University, Pusan, Busan, South Korea

Publications (16) · 1.87 Total Impact

  • ABSTRACT: A simplified Fuzzy-PID controller is designed for a 2-axis antenna stabilization and tracking system. The performance of the controller is verified by computer simulations in Matlab and by experiments with a 2-axis antenna and a small unmanned test helicopter. Combining the advantages of an embedded controller with simplified fuzzy control theory, high-performance antenna tracking control techniques are designed. To verify the designed antenna servo control system, the Fuzzy-PID controller is compared against a conventional PID controller designed with the same PID gains. The proposed Fuzzy-PID controller shows superior performance to the conventional PID controller in all simulations and experiments.
    02/2008: pages 1083-1088;
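The fuzzy-gain-scheduled PID idea summarized above can be sketched as follows. This is a minimal illustration, not the paper's controller: the gain law, the triangular membership, and all constants are assumptions.

```python
def fuzzy_weight(e, lo=0.0, hi=1.0):
    """Triangular membership on |e|: 0 at |e| <= lo, 1 at |e| >= hi."""
    a = (abs(e) - lo) / (hi - lo)
    return min(1.0, max(0.0, a))

class FuzzyPID:
    """PID step whose gains are rescaled by a single fuzzy weight on the error."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_e = 0.0

    def step(self, setpoint, measured):
        e = setpoint - measured
        w = fuzzy_weight(e)                 # 0..1 scaling from the fuzzy rule
        self.integral += e * self.dt
        de = (e - self.prev_e) / self.dt
        self.prev_e = e
        # Large errors: boost proportional action, damp integral wind-up.
        return (self.kp * (1 + w) * e
                + self.ki * (1 - 0.5 * w) * self.integral
                + self.kd * de)
```

A conventional PID with the same base gains is recovered by forcing `w = 0`, which is the kind of comparison the abstract describes.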
  • Taeseok Jin, Jinwoo Park
    ABSTRACT: Position estimation is one of the most important functions for a mobile robot navigating in Robotic Space. To achieve this, we present a method for representing, tracking, and following humans by fusing distributed multiple vision systems in Robotic Space, with application to pedestrian tracking in a crowd. The article also presents the integration of color distributions into SOM (Self-Organizing Map) based particle filtering. Particle filters provide a robust tracking framework under ambiguous conditions. We propose to track the moving objects by generating hypotheses not in the image plane but on a top-view reconstruction of the scene. Comparative results on real video sequences show the advantage of our method for multi-motion tracking. Simulations are carried out to evaluate the proposed performance, and the method is applied to the intelligent environment, where its performance is verified by experiments.
    01/2007;
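The top-view particle-filtering idea recurs in several of the entries above: hypotheses live on the ground plane rather than in the image plane, and each hypothesis is weighted by a color model. The sketch below is illustrative only; the SOM component is omitted and the color likelihood is replaced by a stand-in Gaussian of distance.

```python
import random, math

def color_likelihood(pos, target):
    """Stand-in for the color-distribution match score (not the paper's model)."""
    d2 = (pos[0] - target[0]) ** 2 + (pos[1] - target[1]) ** 2
    return math.exp(-d2 / 0.5)

def particle_filter_step(particles, target, noise=0.1):
    """One predict-weight-resample cycle on top-view (ground-plane) coordinates."""
    # 1. Predict: diffuse each ground-plane hypothesis with Gaussian noise.
    moved = [(x + random.gauss(0, noise), y + random.gauss(0, noise))
             for x, y in particles]
    # 2. Weight: score each hypothesis with the (stand-in) color model.
    weights = [color_likelihood(p, target) for p in moved]
    total = sum(weights)
    weights = [w / total for w in weights]
    # 3. Resample: draw particles proportionally to their weights.
    return random.choices(moved, weights=weights, k=len(moved))
```

Working on the top-view plane keeps the state space two-dimensional regardless of how many cameras observe the scene, which is the computational advantage the abstracts claim.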
  • ABSTRACT: This paper proposes a framework for VR applications based on three different kinds of hierarchical structures for spatial, temporal, and semantic relationships. To achieve this, we incorporate a scene graph, a time graph, and an ontology into the framework. This approach enables the seamless integration of synthetic space, real space, and knowledge as a new way of developing immersive tangible space.
    Virtual Reality, Second International Conference, ICVR 2007, Held as part of HCI International 2007, Beijing, China, July 22-27, 2007, Proceedings; 01/2007
  • ABSTRACT: We present a method for representing, tracking, and human following by fusing distributed multiple vision systems in intelligent space, with applications to pedestrian tracking in a crowd. In this context, particle filters provide a robust tracking framework under ambiguous conditions. The particle filter technique is used in this work, but in order to reduce its computational complexity and increase its robustness, we propose to track the moving objects by generating hypotheses not in the image plane but on a top-view reconstruction of the scene. Comparative results on real video sequences show the advantage of our method for multi-object tracking. Simulations are carried out to evaluate the proposed performance, and the method is applied to the intelligent environment, where its performance is verified by experiments.
    Artificial Life and Robotics 10/2006; 10(2):96-101.
  • ABSTRACT: The latest advances in hardware technology and the state of the art in mobile robot and artificial intelligence research can be employed to develop autonomous, distributed monitoring systems. A mobile service robot requires the perception of its present position to coexist with humans and to support them effectively in populated environments. To realize this, a robot needs to keep track of relevant changes in the environment. To achieve these goals, this paper proposes localization of a mobile robot using images recognized by distributed intelligent networked devices in Intelligent Space (ISpace). The scheme combines data from the observed position, using dead-reckoning sensors, with the estimated position, using images of moving objects such as a walking human captured by a camera system, to determine the location of the mobile robot. The moving object is assumed to be a point object and is projected onto an image plane to form a geometrical constraint equation that provides position data of the object based on the kinematics of the ISpace. Using the a priori known path of the moving object and a perspective camera model, geometric constraint equations are derived that relate the image-frame coordinates of the moving object to the estimated robot position. The proposed method utilizes the error between the observed and estimated image coordinates to localize the mobile robot, and a Kalman filtering scheme is used to estimate the mobile robot's location. The approach is applied to a mobile robot in ISpace to show the reduction of uncertainty in determining its location, and its performance is verified by computer simulation and experiment.
    Advanced Robotics 01/2006; 20:737-762. · 0.51 Impact Factor
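The core fusion step in the abstract above, combining a dead-reckoned prediction with a camera-derived position estimate, can be illustrated with a scalar Kalman filter. This is a one-dimensional sketch under simplifying assumptions, not the paper's full formulation; all variances and inputs are hypothetical.

```python
def dead_reckon(x, u, q, p):
    """Propagate state x by odometry input u; process-noise variance q inflates p."""
    return x + u, p + q

def kalman_update(x_pred, p_pred, z, r):
    """Correct prediction x_pred (variance p_pred) with measurement z (variance r)."""
    k = p_pred / (p_pred + r)        # Kalman gain
    x = x_pred + k * (z - x_pred)    # corrected state
    p = (1 - k) * p_pred             # reduced uncertainty
    return x, p
```

Usage: predict from odometry, then correct with the image-based estimate; the posterior variance is always smaller than the predicted one, which is the "reduction of uncertainty" the abstract refers to.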
  • Source
    TaeSeok Jin, HongChul Kim, JangMyung Lee
    ABSTRACT: This paper introduces a Fuzzy Neural Network controller to increase the ability of a mobile robot to react to dynamic environments. States of the robot and environment, for example the distance between the mobile robot and obstacles and the velocity of the mobile robot, are used as inputs to the fuzzy logic controller. The navigation strategy is based on a combination of fuzzy rules tuned for both goal approach and obstacle avoidance. To identify the environment, a sensor fusion technique is introduced in which the data from ultrasonic sensors and a vision sensor are fused in the identification process. Preliminary experiments and results demonstrate the merit of the introduced navigation control algorithm.
    AI 2006: Advances in Artificial Intelligence, 19th Australian Joint Conference on Artificial Intelligence, Hobart, Australia, December 4-8, 2006, Proceedings; 01/2006
  • ABSTRACT: We present a method for representing, tracking, and human following by fusing distributed multiple vision systems in ISpace, with application to pedestrian tracking in a crowd. The article also presents the integration of color distributions into particle filtering. Particle filters provide a robust tracking framework under ambiguous conditions. We propose to track the moving objects by generating hypotheses not in the image plane but on a top-view reconstruction of the scene. Comparative results on real video sequences show the advantage of our method for multi-object tracking. Simulations are carried out to evaluate the proposed performance, and the method is applied to the intelligent environment, where its performance is verified by experiments.
    PRICAI 2006: Trends in Artificial Intelligence, 9th Pacific Rim International Conference on Artificial Intelligence, Guilin, China, August 7-11, 2006, Proceedings; 01/2006
  • TaeSeok Jin, JangMyung Lee
    ABSTRACT: Robotic Space is a space in which many intelligent sensing and tracking devices, such as computers and multiple sensors, are distributed. For these distributed devices to cooperate, it is very important that the system know location information in order to offer useful services. To achieve these goals, we present a method for representing, tracking, and human following by fusing distributed multiple vision systems in Robotic Space, with application to pedestrian tracking in a crowd. The article also presents the integration of color distributions into SOM-based particle filtering. Particle filters provide a robust tracking framework under ambiguous conditions. We propose to track the moving objects by generating hypotheses not in the image plane but on a top-view reconstruction of the scene. Comparative results on real video sequences show the advantage of our method for multi-motion tracking. Simulations are carried out to evaluate the proposed performance, and the method is applied to the intelligent environment, where its performance is verified by experiments.
    Advances in Artificial Reality and Tele-Existence, 16th International Conference on Artificial Reality and Telexistence, ICAT 2006, Hangzhou, China, November 29 - December 1, 2006, Proceedings; 01/2006
  • ABSTRACT: Knowledge of human walking behavior is of primary importance for a mobile agent operating in space shared with humans, with minimal disturbance to them. This paper introduces an observation and learning framework that can acquire human walking behavior from observations of human walking, using the CCD cameras of the Intelligent Space. The proposed behavior-learning framework applies a Fuzzy Neural Network (FNN) to approximate the observed human behavior, with observation-data clustering used to extract important training data from the observations. Preliminary experiments and results demonstrate the merit of the introduced framework.
    Neural Information Processing, 13th International Conference, ICONIP 2006, Hong Kong, China, October 3-6, 2006, Proceedings, Part III; 01/2006
  • Jong-kwon Kim, SooHong Park, TaeSeok Jin
    PRICAI 2006: Trends in Artificial Intelligence, 9th Pacific Rim International Conference on Artificial Intelligence, Guilin, China, August 7-11, 2006, Proceedings; 01/2006
  • ABSTRACT: The aim of this paper is to investigate a control framework for mobile robots operating in an environment shared with humans. The Intelligent Space (iSpace) can sense the whole space and evaluate the situations in it through distributed sensors. The mobile agents that serve the inhabitants of the space utilize the information evaluated by the iSpace. The iSpace evaluates the situations in the space and learns the walking behavior of the inhabitants. Human intelligence manifests in the space as behavior, as a response to the situation in the space. The iSpace learns this behavior and applies it to mobile-agent motion planning and control. This paper introduces the application of a fuzzy neural network to describe the obstacle-avoidance behavior learned from humans. Simulation and experimental results demonstrate the efficiency of this method.
    Neural Information Processing, 13th International Conference, ICONIP 2006, Hong Kong, China, October 3-6, 2006, Proceedings, Part III; 01/2006
  • TaeSeok Jin, JangMyung Lee, H. Hashimoto
    ABSTRACT: The possibility of operating in remote environments by means of teleoperated systems has always been considered of relevant interest in robotics. For this reason, in this paper, the relationship between a slave robot and the uncertain remote environment is modeled as an impedance used to generate the virtual force fed back to the operator. For the control of a teleoperated mobile robot equipped with a camera, the robot takes pictures of the remote environment and sends the visual information back to the operator over the Internet. Because of the limited communication bandwidth and the narrow view angle of the camera, it is not possible to observe the environment clearly, especially shadowed and curved areas. To overcome this problem, the virtual force is generated according to both the distance between the obstacle and the robot and the approach velocity of the obstacle. This virtual force is transferred back to the master over the Internet, and the master, which can generate force, enables a human operator to estimate the position of the obstacle in the remote environment. By holding this master, the operator can, in spite of limited visual information, feel a spatial sense of the remote environment. This force reflection improves the performance of the teleoperated mobile robot significantly.
    Intelligent Robots and Systems, 2004. (IROS 2004). Proceedings. 2004 IEEE/RSJ International Conference on; 01/2004
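The virtual-force generation described above (force growing with obstacle proximity and approach speed) can be sketched as a simple repulsive law. The force law, gains, and cutoff distance below are illustrative assumptions, not the paper's impedance model.

```python
def virtual_force(distance, approach_velocity,
                  d_max=2.0, k_d=1.0, k_v=0.5):
    """Repulsive force fed back to the operator; zero beyond range d_max.

    distance: range to the nearest obstacle (m)
    approach_velocity: closing speed toward the obstacle (m/s, > 0 when closing)
    """
    if distance >= d_max:
        return 0.0
    proximity = (d_max - distance) / d_max   # 0..1, larger when closer
    closing = max(0.0, approach_velocity)    # only penalize closing motion
    return k_d * proximity + k_v * proximity * closing
```

With this shape, a fast approach toward a nearby obstacle produces a strong force on the master device, letting the operator "feel" obstacles that the camera view does not show clearly.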
  • Source
    TaeSeok Jin, JangMyung Lee, S. K. Tso
    ABSTRACT: To fully utilize the information from the sensors, this paper proposes a new sensor-fusion technique in which the data sets from previous moments are properly transformed and fused into the current data sets to enable an accurate measurement. Exploration of an unknown environment is an important task for the new generation of mobile service robots. The mobile robots may navigate by means of a number of monitoring systems, such as a sonar-sensing system or a visual-sensing system. Note that in conventional fusion schemes the measurement depends on the current data sets only; therefore, more sensors are required to measure a given physical parameter or to improve the accuracy of the measurement. In this approach, instead of adding more sensors to the system, the temporal sequence of the data sets is stored and utilized for accurate measurement. The theoretical basis is illustrated by examples, and the effectiveness is proved through simulations and experiments. The newly proposed STSF (Space and Time Sensor Fusion) scheme is applied to the navigation of a mobile robot in both unstructured and structured environments, and the experimental results show the performance of the system.
    Robotica 01/2004; 22:51-59. · 0.88 Impact Factor
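The STSF idea above can be illustrated in one dimension under simplifying assumptions: a past range reading is transformed by the robot's known displacement into the current frame, then fused with the current reading by inverse-variance weighting. All names and numbers are illustrative, not the paper's formulation.

```python
def transform_to_current(old_reading, robot_displacement):
    """Shift a past 1-D range reading into the current robot frame."""
    return old_reading - robot_displacement

def stsf_fuse(current, current_var, past, past_var, displacement):
    """Fuse a current reading with a motion-compensated past reading."""
    past_now = transform_to_current(past, displacement)
    w1, w2 = 1.0 / current_var, 1.0 / past_var   # inverse-variance weights
    fused = (w1 * current + w2 * past_now) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)                  # always below either input
    return fused, fused_var
```

The fused variance is smaller than either input variance, which captures the abstract's point: reusing the temporal sequence improves accuracy without adding sensors.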
  • Source
    TaeSeok Jin, JangMyung Lee, S. K. Tso
    ABSTRACT: To fully utilize the information from the sensors of mobile robot, this paper proposes a new sensor-fusion technique where the sample data set obtained at a previous instant is properly transformed and fused with the current data sets to produce a reliable estimate for navigation control. Exploration of an unknown environment is an important task for the new generation of mobile service robots. The mobile robots may navigate by means of a number of monitoring systems such as the sonar-sensing system or the visual-sensing system. Notice that in the conventional fusion schemes, the measurement is dependent on the current data sets only. Therefore, more sensors are required to measure a given physical parameter or to improve the reliability of the measurement. However, in this approach, instead of adding more sensors to the system, the temporal sequences of the data sets are stored and utilized for the purpose. The basic principle is illustrated by examples and the effectiveness is proved through simulations and experiments. The newly proposed STSF (space and time sensor fusion) scheme is applied to the navigation of a mobile robot in an environment using landmarks, and the experimental results demonstrate the effective performance of the system. © 2004 Wiley Periodicals, Inc.
    Journal of Robotic Systems 01/2004; 21:389-400. · 0.48 Impact Factor
  • Taeseok Jin, Soomin Park, JangMyung Lee
    ABSTRACT: Position estimation is one of the most important functions for a mobile robot navigating in an unstructured environment. Most previous localization schemes estimate the current position and pose of a mobile robot by applying localization algorithms to information obtained from sensors mounted on the robot, or by recognizing an artificial landmark attached to a wall, or objects in the environment serving as natural landmarks, in an indoor environment. Several drawbacks of these approaches have been identified. To compensate for them, a new localization method is proposed that estimates the absolute position of the mobile robot by using a camera fixed on the ceiling of a corridor. The proposed method, which calculates the real size of an object, also improves the success rate of position estimation. This scheme is not a relative localization, which decreases the position error through algorithms operating on noisy sensor data, but a kind of absolute localization. The effectiveness of the proposed localization scheme is demonstrated through experiments.
    Computational Intelligence in Robotics and Automation, 2003. Proceedings. 2003 IEEE International Symposium on; 08/2003
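The geometry behind the ceiling-camera scheme above can be sketched with a pinhole model: for a camera looking straight down from a known height, a pixel maps directly to a floor position. This is an illustrative simplification (the camera parameters below are hypothetical, and lens distortion and the robot's own height are ignored).

```python
def pixel_to_floor(u, v, cam_height, focal_px, cx, cy):
    """Map image pixel (u, v) to floor coordinates under a downward-facing camera.

    cam_height: camera height above the floor (m)
    focal_px:   focal length in pixels
    (cx, cy):   principal point (image center) in pixels
    """
    # Similar triangles: floor offset = pixel offset * depth / focal length.
    x = (u - cx) * cam_height / focal_px
    y = (v - cy) * cam_height / focal_px
    return x, y
```

Because the camera is fixed and its pose is known, this mapping yields an absolute position, unlike odometry-based schemes whose error accumulates over time.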
  • Source
    Taeseok Jin, Jinwoo Park, JangMyung Lee
    ABSTRACT: A new method of estimating the pose of a mobile-task robot is developed based upon an active calibration scheme. The utility of a mobile-task robot, formed by the serial connection of a mobile robot and a task robot, is widely recognized. For the control of the mobile robot, an absolute position sensor is necessary. This paper proposes an active calibration scheme to estimate the pose of a mobile robot that carries a task robot on top. The active calibration scheme estimates the pose of the mobile robot using its relative position/orientation to a known object whose location, size, and shape are known a priori. Through homogeneous transformations, the absolute position/orientation of the camera is calculated and then propagated to obtain the pose of the mobile robot. The proposed active calibration scheme is verified experimentally in a corridor.

Publication Stats

21 Citations
1.87 Total Impact Points

Institutions

  • 2006–2008
    • Dongseo University
      • Department of Mechatronics Engineering
      Pusan, Busan, South Korea
  • 2004
    • The University of Tokyo
      Edo, Tōkyō, Japan
  • 2003–2004
    • Pusan National University
      • Division of Electrical and Electronics Engineering
      Tsau-liang-hai, Busan, South Korea