Conference Paper

Detecting repeated motion patterns via Dynamic Programming using motion density

Fac. of Eng., Kyushu Univ., Fukuoka, Japan
DOI: 10.1109/ROBOT.2009.5152643 Conference: Robotics and Automation, 2009. ICRA '09. IEEE International Conference on
Source: IEEE Xplore

ABSTRACT In this paper, we propose a method that efficiently detects repeated motion patterns in a long motion sequence. Repeated motion patterns are structured information that can be obtained without knowledge of the context of the motions. They can be used as a seed for finding causal relationships between motions or for obtaining contextual information about human activity, which is useful for intelligent systems that support human activity in everyday environments. The major contribution of the proposed method is two-fold: (1) motion density is proposed as a repeatability measure, and (2) the problem of finding consecutive time frames with large motion density is formulated as a combinatorial optimization problem, which is solved via Dynamic Programming (DP) in polynomial time O(N log N), where N is the total amount of data. The proposed method was evaluated by detecting repeated interactions between objects in everyday manipulation tasks, and it outperformed the previous method in terms of both detectability and computational time.
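The interval-selection step described in the abstract can be illustrated with a small dynamic-programming sketch. This is a hypothetical reconstruction, not the authors' implementation: per-frame motion-density scores (shifted so that low-activity frames are negative) are partitioned into non-overlapping intervals, each paying a fixed start-up `penalty`. The naive recurrence below runs in O(N^2); the paper's O(N log N) bound relies on additional structure not reproduced here.

```python
def find_dense_intervals(density, penalty):
    """Pick non-overlapping intervals maximising total density minus
    a per-interval penalty.  Illustrative O(N^2) DP, not the paper's
    O(N log N) algorithm; `density` values are assumed to be shifted
    so that uninteresting frames score below zero."""
    n = len(density)
    prefix = [0.0] * (n + 1)
    for i in range(n):
        prefix[i + 1] = prefix[i] + density[i]
    best = [0.0] * (n + 1)     # best[i]: max total score over frames 0..i-1
    choice = [None] * (n + 1)  # start j of an interval ending at i, if any
    for i in range(1, n + 1):
        best[i] = best[i - 1]  # frame i-1 left outside any interval
        for j in range(i):     # or an interval covering frames j..i-1
            score = best[j] + (prefix[i] - prefix[j]) - penalty
            if score > best[i]:
                best[i] = score
                choice[i] = j
    # Backtrack to recover the chosen half-open intervals (start, end).
    intervals = []
    i = n
    while i > 0:
        if choice[i] is None:
            i -= 1
        else:
            intervals.append((choice[i], i))
            i = choice[i]
    intervals.reverse()
    return intervals
```

On a toy sequence with two bursts of activity, the DP isolates the two dense intervals rather than merging them across the low-density gap.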

  • ABSTRACT: In this work, an approach to extracting features from multiple observations for manipulation-task recognition is proposed. The diversity of information, such as hand motion, finger flexure, and object trajectory, is important for representing a manipulation task. Using the relevant features, we can generate a general form representing a specific dataset of manipulation tasks. A generalization of the hand motion is obtained so that, given a new observation, the task can later be identified.
    Technological Innovation for Sustainability - Second IFIP WG 5.5/SOCOLNET Doctoral Conference on Computing, Electrical and Industrial Systems, DoCEIS 2011, Costa de Caparica, Portugal, February 21-23, 2011. Proceedings; 01/2011
  • ABSTRACT: Humans excel in manipulation tasks, a basic skill for our survival and a key feature in our man-made world of artefacts and devices. In this work, we study how humans manipulate simple daily objects, and construct a probabilistic representation model of the tasks and objects useful for autonomous grasping and manipulation by robotic hands. Human demonstrations of predefined object manipulation tasks are recorded from both the human-hand and object points of view. The multimodal data acquisition system records human gaze, hand and finger 6D pose, finger flexure, tactile forces distributed on the inside of the hand, colour images and stereo depth maps, as well as object 6D pose and object tactile forces using instrumented objects. From the acquired data, relevant features are detected concerning motion patterns, tactile forces and hand-object states. This enables modelling a class of tasks from sets of repeated demonstrations of the same task, so that a generalised probabilistic representation is derived for task planning in artificial systems. An object-centred probabilistic volumetric model is proposed to fuse the multimodal data and map contact regions, gaze, and tactile forces during stable grasps. This model is refined by segmenting the volume into components approximated by superquadrics, and overlaying the contact points, taking the task context into account. Results show that the extracted features are sufficient to distinguish key patterns that characterise each stage of a manipulation task, ranging from simple object displacement, where the same grasp is employed throughout (homogeneous manipulation), to more complex interactions such as object reorientation, fine positioning, and sequential in-hand rotation (dexterous manipulation). The framework retains the relevant data from human demonstrations, concerning both the manipulation and the object characteristics, for use by future grasp planners in artificial systems performing autonomous grasping.
    Robotics and Autonomous Systems 01/2012; 60:396-410.
  • ABSTRACT: Repeated patterns are useful clues for learning previously unknown events in an unsupervised way. This paper presents a novel method that efficiently detects relatively long, variable-length, unknown repeated patterns in a motion sequence. The major contribution of the paper is two-fold: (1) Partly Locality Sensitive Hashing (PLSH) [1] is employed to find repeated patterns efficiently, and (2) the problem of finding consecutive time frames that contain a large number of repeated patterns is formulated as a combinatorial optimization problem, which is solved via Dynamic Programming (DP) in polynomial time O(N^(1+1/α)) thanks to PLSH, where N is the total amount of data. The proposed method was evaluated by detecting repeated interactions between objects in everyday manipulation tasks and outperformed previous methods in terms of accuracy or computational time.
    Intelligent Robots and Systems (IROS), 2010 IEEE/RSJ International Conference on; 11/2010
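The hashing step in the abstract above can be illustrated with a much-simplified sketch. A plain dictionary over grid-quantised feature vectors stands in for PLSH (which matches approximate neighbours sublinearly); here only exact bucket collisions count as repeats, and the grid size `cell` is an assumed parameter, not one from the paper.

```python
from collections import defaultdict

def repeat_counts(frames, cell=0.5):
    """For each frame's feature vector, count how many other frames
    fall into the same quantisation bucket.  A dict stands in for the
    PLSH index; only exact bucket collisions are counted here."""
    buckets = defaultdict(list)
    for t, f in enumerate(frames):
        key = tuple(int(round(x / cell)) for x in f)
        buckets[key].append(t)
    counts = [0] * len(frames)
    for idxs in buckets.values():
        for t in idxs:
            counts[t] = len(idxs) - 1  # matches other than the frame itself
    return counts
```

The resulting per-frame repeat counts are exactly the kind of score that the DP stage can then aggregate over consecutive time frames.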
