Conference Proceeding

A robotic ball catcher with embedded visual servo processor

Department of Electrical and Control Engineering, National Chiao Tung University, Hsinchu, Taiwan
Proceedings of the 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2010), November 2010. DOI: 10.1109/IROS.2010.5648912
Source: IEEE Xplore

ABSTRACT: In this work we present a robotic ball catcher with an embedded visual servo processor. The embedded visual servo processor, with powerful parallel computing capability, serves as the computation platform for tracking a flying ball and triangulating its 3D position from stereo vision. A recursive least squares algorithm for model-based path prediction of the flying ball determines the catch time and position. Experimental results for real-time catching of a flying ball with a 6-DOF robot arm are presented. The success rate of the robotic ball catcher was approximately 60% for balls thrown toward it from five meters away.
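The prediction step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it runs a per-axis recursive least squares fit of a ballistic model (constant velocity in x and y, gravity in z) on triangulated ball positions, then solves for the time the predicted path crosses an assumed catch plane. All variable names and the plane placement are assumptions.

```python
import numpy as np

class RLS:
    """Recursive least squares for a model y = phi @ theta."""
    def __init__(self, n_params, p0=1e6):
        self.theta = np.zeros(n_params)
        self.P = np.eye(n_params) * p0

    def update(self, phi, y):
        phi = np.asarray(phi, dtype=float)
        P_phi = self.P @ phi
        k = P_phi / (1.0 + phi @ P_phi)          # gain vector
        self.theta = self.theta + k * (y - phi @ self.theta)
        self.P = self.P - np.outer(k, P_phi)

G = 9.81  # gravity [m/s^2]

# One estimator per axis; the ballistic model is linear in (position, velocity)
# once the known gravity term is compensated on the z measurements.
rls_x, rls_y, rls_z = RLS(2), RLS(2), RLS(2)

def feed(t, pos):
    """Update the estimators with a triangulated 3D ball position at time t."""
    phi = np.array([1.0, t])
    rls_x.update(phi, pos[0])
    rls_y.update(phi, pos[1])
    rls_z.update(phi, pos[2] + 0.5 * G * t**2)   # remove the gravity term

def predict_catch(x_catch):
    """Catch time and position where the predicted path crosses the
    (hypothetical) catch plane x = x_catch."""
    x0, vx = rls_x.theta
    t_c = (x_catch - x0) / vx
    y0, vy = rls_y.theta
    z0, vz = rls_z.theta
    return t_c, np.array([x_catch,
                          y0 + vy * t_c,
                          z0 + vz * t_c - 0.5 * G * t_c**2])
```

Because each new stereo measurement only updates the RLS state, the predicted catch point can be refined continuously while the ball is in flight.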

  • ABSTRACT: This work proposes a technique for calibrating an eye-to-hand system. The goal of hand-eye calibration is to estimate the geometric transformation between the hand and the eye. The method simultaneously considers the camera intrinsic parameters and the geometric relations of a working plane in space. A laser pointer casually mounted on the hand is used. By manipulating the robot and projecting the laser beam onto a plane of unknown orientation, a batch of image positions of the light spots is extracted from the camera images. Since the laser is rigidly mounted and the plane is fixed at each orientation, the geometric parameters and measurement data must satisfy certain nonlinear constraints, from which the parameters can be estimated. A closed-form solution is derived by decoupling the nonlinear equations into linear forms to compute initial values for all parameters. As a result, the calibration method does not need any manual initial guess of the unknown parameters. To achieve higher accuracy, a nonlinear optimization method refines the estimate. The advantage of using a laser pointer is that the technique works even when the eye does not see the hand. Experimental results from simulations and real data demonstrate the validity and the simple requirements of the proposed algorithm.
    IEEE International Conference on Robotics and Automation (ICRA 2011), Shanghai, China, 9-13 May 2011.
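The nonlinear constraints mentioned in the abstract come from a forward measurement model: a laser ray from the hand intersects the working plane, and the resulting spot projects into the camera. A minimal sketch of that chain, with the camera intrinsics K, the camera pose (R, t), and the ray and plane parameters all hypothetical placeholders:

```python
import numpy as np

def ray_plane_intersection(o, r, n, d):
    """Intersect the laser ray X(s) = o + s * r with the plane n . X = d."""
    s = (d - n @ o) / (n @ r)
    return o + s * r

def project(K, R, t, X):
    """Pinhole projection of a world point X; (R, t) maps world coordinates
    to camera coordinates."""
    Xc = R @ X + t
    uvw = K @ (Xc / Xc[2])
    return uvw[:2]

def predicted_spot(K, R, t, o, r, n, d):
    """Predicted image position of one laser spot: the quantity that must
    agree with the extracted light-spot measurements at every robot pose."""
    return project(K, R, t, ray_plane_intersection(o, r, n, d))
```

The calibration solves the inverse problem: given many observed spot positions across robot poses and plane orientations, recover the unknown geometric parameters so that the predicted and observed spots agree.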
  • ABSTRACT:
    Purpose – The purpose of this paper is to propose a calibration method that can calibrate the relationships among the robot manipulator, the camera and the workspace.
    Design/methodology/approach – The method uses a laser pointer rigidly mounted on the manipulator and projects the laser beam onto the work plane. Nonlinear constraints governing the relationships between the geometric parameters and the measurement data are derived. The uniqueness of the solution is guaranteed when the camera is calibrated in advance. As a result, a decoupled multi-stage closed-form solution can be derived based on parallel-line constraints, line/plane intersection and projective geometry. The closed-form solution can be further refined by nonlinear optimization that considers all parameters simultaneously in the nonlinear model.
    Findings – Computer simulations and experimental tests using actual data confirm the effectiveness of the proposed calibration method and illustrate its ability to work even when the eye cannot see the hand.
    Originality/value – Only a laser pointer is required for this calibration method, and it works without any manual measurement. In addition, the method can also be applied when the robot is not within the camera's field of view.
    Industrial Robot, 39(2):197-207, March 2012.

Jwu-Sheng Hu