Conference Paper

An experimental evaluation of a novel minimum-jerk Cartesian controller for humanoid robots.

Department of Robotics, Brain and Cognitive Sciences (RBCS), Italian Institute of Technology (IIT), Genova, Italy
DOI: 10.1109/IROS.2010.5650851 Conference: 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, October 18-22, 2010, Taipei, Taiwan
Source: IEEE Xplore

ABSTRACT In this paper we describe the design of a Cartesian controller for a generic robot manipulator and address some of the challenges that are typically encountered in the field of humanoid robotics. The solution we propose deals with a large number of degrees of freedom, produces smooth, human-like motion, and computes the trajectory on-line. We also support the idea that, to produce significant advancements in robotics, it is important to compare different approaches not only at the theoretical level but also at the implementation level. For this reason we test our software on the iCub platform and compare its performance against other available solutions.
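For context, the classical minimum-jerk point-to-point profile (Flash and Hogan) reaches the target with zero boundary velocity and acceleration. The sketch below evaluates that closed-form profile for a Cartesian position; it is a generic illustration of the minimum-jerk shape, not the on-line controller evaluated in the paper.

    # Generic illustration of the closed-form minimum-jerk profile
    # x(t) = x0 + (xf - x0) * (10 tau^3 - 15 tau^4 + 6 tau^5), tau = t/T.
    # This is the textbook profile, not the paper's on-line controller.
    import numpy as np

    def minimum_jerk(x0, xf, T, n=200):
        t = np.linspace(0.0, T, n)
        tau = t / T
        s = 10 * tau**3 - 15 * tau**4 + 6 * tau**5   # smooth 0 -> 1 blend, zero vel/acc at the ends
        x = np.asarray(x0) + np.outer(s, np.asarray(xf) - np.asarray(x0))
        return t, x

    # Example: move the hand 20 cm along x in 2 seconds.
    t, x = minimum_jerk([0.0, 0.0, 0.0], [0.2, 0.0, 0.0], T=2.0)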

Related publications:
  • ABSTRACT: We propose a grasping pipeline for dealing with unknown objects in the real world. We focus on power grasps, which are characterized by large areas of contact between the object and the surfaces of the palm and fingers. Our method seeks object regions that match the curvature of the robot's palm. The entire procedure relies on binocular vision, which provides a 3D point cloud of the visible part of the object. The point cloud is segmented into smooth surfaces, and a score function measures the quality of the graspable points on the basis of the surface they belong to. A component of the score function is learned from experience and is used to map the curvature of the object's surfaces to the curvature of the robot's hand. The user can further provide top-down information on the preferred grasping regions. We guarantee the feasibility of a chosen hand configuration by measuring its manipulability. We demonstrate the effectiveness of the proposed approach by tasking a humanoid robot to grasp a number of unknown real objects.
    Proceedings of IEEE International Conference on Advanced Robotics; 01/2013
  • ABSTRACT: Pointing at something means orienting the hand, the arm, the head or the body in the direction of an object or an event. This skill constitutes a basic communicative ability for cognitive agents such as humanoid robots. The goal of this study is to show that approximate and, in particular, precise pointing can be learned as a direct mapping from the object's pixel coordinates in the visual field to hand positions or to joint angles. This highly nonlinear mapping defines the pose and orientation of the robot's arm. The study underlines that this is possible without explicitly computing the object's depth and 3D position, since only the direction is required. To this aim, three state-of-the-art neural network paradigms (multilayer perceptron, extreme learning machine and reservoir computing) are evaluated on real-world data gathered from the humanoid robot iCub. Training data for precise pointing are interactively generated and recorded through kinesthetic teaching. Successful generalization is verified on the iCub using a laser pointer attached to its hand. (A minimal mapping sketch follows this list.)
    Neurocomputing, 02/2013
  • ABSTRACT: Identification of inertial parameters is fundamental for the implementation of torque-based control in humanoids. At the same time, good models of friction and actuator dynamics are critical for the low-level control of joint torques. We propose a novel method to identify inertial, friction and motor parameters in a single procedure. The identification exploits the measurements of the PWM of the DC motors and of a 6-axis force/torque sensor mounted inside the kinematic chain. The partial least-squares (PLS) method is used to perform the regression. We identified the inertial, friction and motor parameters of the right arm of the iCub humanoid robot and verified that the identified model accurately predicts the force/torque sensor measurements and the motor voltages. Moreover, we compared the identified parameters against the CAD parameters in predicting the force/torque sensor measurements. Finally, we showed that the estimated model can effectively detect external contacts, comparing it against tactile-based contact detection. The presented approach offers some advantages over other state-of-the-art methods because of its completeness (it identifies inertial, friction and motor parameters) and simplicity (only one data collection, with no particular requirements). (A minimal regression sketch follows this list.)
    2013 13th IEEE-RAS International Conference on Humanoid Robots (Humanoids 2013), Atlanta, Georgia, USA; 10/2013
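The pointing work above learns a direct mapping from image pixel coordinates to arm joint angles. The following is a minimal, illustrative regression sketch under assumed data and dimensions (synthetic pixel/joint pairs, a single small multilayer perceptron); it is not the setup of that paper, which evaluates three network paradigms on data recorded from the iCub.

    # Illustrative only: learn a direct mapping pixels -> joint angles, as in the
    # pointing study above. Data, image size and joint count are assumptions.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    pixels = rng.uniform(0.0, 320.0, size=(500, 2))   # (u, v) object coordinates in the image
    joints = np.tanh(pixels / 160.0 - 1.0) * 0.8      # stand-in for recorded arm joint angles (rad)

    mlp = MLPRegressor(hidden_layer_sizes=(50,), max_iter=2000, random_state=0)
    mlp.fit(pixels, joints)                           # fit the nonlinear pixel-to-joint mapping
    print(mlp.predict([[160.0, 120.0]]))              # predicted pointing posture for an image location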
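The parameter-identification work above uses partial least-squares (PLS) regression on motor PWM and force/torque measurements. Below is a rough sketch of the PLS regression step alone, with a synthetic linear model standing in for the real regressor; the feature and target layouts are assumptions, not the paper's formulation.

    # Rough sketch of a PLS regression step, with synthetic data standing in for
    # the real regressor (joint states, PWM) and measurements (F/T wrench, voltages).
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(1)
    X = rng.normal(size=(1000, 12))                   # assumed regressor columns
    W = rng.normal(size=(12, 7))                      # "true" parameters to recover
    Y = X @ W + 0.01 * rng.normal(size=(1000, 7))     # assumed measurement channels

    pls = PLSRegression(n_components=8)
    pls.fit(X, Y)                                     # identify the linear parameters
    print(np.abs(Y - pls.predict(X)).mean())          # prediction error of the identified model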
