Conference Paper

Learning Kinematic Models for Articulated Objects

Conference: IJCAI 2009, Proceedings of the 21st International Joint Conference on Artificial Intelligence, Pasadena, California, USA, July 11-17, 2009
Source: DBLP

ABSTRACT: Robots operating in home environments must be able to interact with articulated objects such as doors or drawers. Ideally, a robot can infer the articulation models of such objects autonomously from observation. In this paper, we present an approach that learns kinematic models by inferring the connectivity of the rigid parts and the articulation models of the corresponding links. Our method uses a mixture of parameterized and parameter-free (Gaussian process) representations and finds the low-dimensional manifolds that best explain the given observations. Our approach has been implemented and evaluated on real data obtained in various realistic home environment settings.
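
To make the model-selection idea concrete, here is a minimal sketch (Python/NumPy; not the authors' code — the function names, the BIC-style score, and the assumed 5 mm observation noise are all illustrative). It fits a prismatic (line) and a revolute (circle) candidate model to observed 3D positions of a single part and keeps whichever candidate scores better; the paper's method operates on full pose observations and, as the abstract notes, additionally covers parameter-free Gaussian-process models and infers the connectivity between parts.

```python
import numpy as np

def fit_prismatic(X):
    """Fit a straight line through the observations via PCA.
    Returns (sum of squared residuals, rough parameter count)."""
    c = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - c)
    d = Vt[0]                                # dominant direction = candidate joint axis
    proj = c + np.outer((X - c) @ d, d)      # observations projected onto the line
    return np.sum((X - proj) ** 2), 4        # a line in 3D has 4 effective DOF

def fit_revolute(X):
    """Fit a circle in the dominant plane of the observations (linear least squares).
    Returns (sum of squared residuals, rough parameter count)."""
    c = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - c)
    P = (X - c) @ Vt[:2].T                   # coordinates in the best-fit plane
    A = np.column_stack([2 * P, np.ones(len(P))])
    b = (P ** 2).sum(axis=1)
    (cx, cy, k), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(k + cx ** 2 + cy ** 2)
    sse = np.sum((np.hypot(P[:, 0] - cx, P[:, 1] - cy) - r) ** 2)
    sse += np.sum(((X - c) @ Vt[2]) ** 2)    # out-of-plane residual
    return sse, 7                            # circle in 3D: center, axis, radius

def bic(sse, n_params, n_obs, sigma=0.005):
    """BIC under an (assumed) isotropic Gaussian noise model with std sigma [m]."""
    return sse / sigma ** 2 + n_params * np.log(n_obs)

# Synthetic check: a door handle sweeping a 60-degree arc of radius 0.8 m.
rng = np.random.default_rng(0)
t = np.linspace(0.0, np.pi / 3, 40)
X = np.column_stack([0.8 * np.cos(t), 0.8 * np.sin(t), np.zeros_like(t)])
X += rng.normal(scale=0.005, size=X.shape)

scores = {name: bic(*fit(X), len(X))
          for name, fit in [("prismatic", fit_prismatic), ("revolute", fit_revolute)]}
print(min(scores, key=scores.get))           # -> revolute
```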

  • ABSTRACT: We present an algorithm called Procrustes-Lo-RANSAC (PLR) to recover complete 3D models of articulated objects. Structure-from-motion techniques are used to capture 3D point cloud models of an object in two different configurations. Procrustes analysis, combined with a locally optimized RANSAC sampling strategy, facilitates a straightforward geometric approach to recovering the joint axes, as well as classifying them automatically as either revolute or prismatic. With the resulting articulated model, a robotic system is then able to manipulate the object along its joint axes at a specified grasp point in order to exercise its degrees of freedom. Because the models capture all sides of the object, they are occlusion-aware, meaning that the robot has knowledge of parts of the object that are not visible in the current view. Our algorithm does not require prior knowledge of the object, nor does it make any assumptions about the planarity of the object or scene. Experiments with a PUMA 500 robotic arm demonstrate the effectiveness of the approach on a variety of real-world objects containing both revolute and prismatic joints.
    Robotics and Autonomous Systems, 01/2013. (A sketch of the Procrustes alignment and joint-classification step appears after this list.)
  • ABSTRACT: Mobile manipulation robots are envisioned to provide many useful services in domestic environments as well as in industrial contexts. In this paper, we present novel approaches that allow mobile manipulation systems to autonomously adapt to new or changing situations. The approaches developed in this paper cover four topics: (1) learning the robot's kinematic structure and properties using actuation and visual feedback, (2) learning about articulated objects in the environment in which the robot is operating, (3) using tactile feedback to augment visual perception, and (4) learning novel manipulation tasks from human demonstrations.
    Proceedings of the Twenty-Third International Joint Conference on Artificial Intelligence, 08/2013.
  • ABSTRACT: Based on a lifetime of experience, people anticipate the forces associated with performing a manipulation task. In contrast, most robots lack common sense about the forces involved in everyday manipulation tasks. In this paper, we present data-driven methods to inform robots about the forces they are likely to encounter when performing specific tasks. In the context of door opening, we demonstrate that data-driven, object-centric models can be used to haptically recognize specific doors, haptically recognize classes of doors (e.g., refrigerator vs. kitchen cabinet), and haptically detect anomalous forces while opening a door, even when opening a specific door for the first time. We also demonstrate that two distinct robots can use forces captured from people opening doors to better detect anomalous forces. These results illustrate the potential for robots to use shared databases of forces to better manipulate the world and attain common sense about everyday forces.
    Autonomous Robots, 10/2013; 35(2-3). (A sketch of such an object-centric force model also appears after this list.)
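
The PLR abstract above describes recovering joint axes with Procrustes analysis and classifying them as revolute or prismatic. The following sketch (Python/NumPy) shows the core geometric step under simplifying assumptions that are not in the paper: point correspondences are known and outlier-free, so the locally optimized RANSAC loop is omitted, and the joint is classified directly from the rotation angle of the recovered rigid motion.

```python
import numpy as np

def procrustes_rigid(P, Q):
    """Least-squares rigid transform (R, t) with Q ~ P @ R.T + t (Kabsch algorithm).
    Assumes known correspondences and no outliers; PLR additionally uses a
    locally optimized RANSAC loop to segment parts and reject mismatches."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    R = Vt.T @ D @ U.T
    return R, cq - R @ cp

def classify_joint(R, angle_thresh_deg=5.0):
    """Label the relative motion revolute if it rotates noticeably, else prismatic."""
    cos_angle = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)
    return "revolute" if np.degrees(np.arccos(cos_angle)) > angle_thresh_deg else "prismatic"

# Usage: the same 50 surface points of one part, seen in two configurations.
rng = np.random.default_rng(1)
P = rng.uniform(-0.2, 0.2, size=(50, 3))
Rz = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])  # 90-degree hinge swing
Q = P @ Rz.T + np.array([0.1, 0.0, 0.05])
R, t = procrustes_rigid(P, Q)
print(classify_joint(R))  # -> revolute
```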
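
The door-force abstract suggests object-centric models of expected force. As a minimal illustration (not the paper's implementation; the angle grid, the 3-sigma test, and all names are assumptions), the sketch below models one door as the pointwise mean and standard deviation of handle force versus door angle over prior trials, and flags a new trial wherever it deviates by more than k standard deviations.

```python
import numpy as np

ANGLES = np.linspace(0.0, 60.0, 61)   # common abscissa: door angle in degrees

def build_force_model(trials):
    """trials: list of (angle, force) arrays from prior openings of one door.
    Resample each trial onto ANGLES; keep the pointwise mean and std."""
    F = np.stack([np.interp(ANGLES, a, f) for a, f in trials])
    return F.mean(axis=0), F.std(axis=0)

def anomalous(angle, force, mean, std, k=3.0):
    """True where the measured force deviates more than k sigma from the model."""
    mu = np.interp(angle, ANGLES, mean)
    sigma = np.interp(angle, ANGLES, std) + 1e-2   # floor against zero variance
    return np.abs(force - mu) > k * sigma

# Usage: ten prior trials of a spring-loaded door, then a partially blocked opening.
rng = np.random.default_rng(2)
trials = [(ANGLES, 5.0 + 0.05 * ANGLES + rng.normal(scale=0.2, size=ANGLES.size))
          for _ in range(10)]
mean, std = build_force_model(trials)
test = 5.0 + 0.05 * ANGLES + np.where(ANGLES > 40.0, 6.0, 0.0)  # jams past 40 deg
print(ANGLES[anomalous(ANGLES, test, mean, std)])               # angles past the jam
```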

Full-text available from Cyrill Stachniss.