Conference Paper

A biped robot that keeps steps in time with musical beats while listening to music with its own ears

Kyoto Univ., Kyoto
DOI: 10.1109/IROS.2007.4399244 Conference: IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2007)
Source: IEEE Xplore

ABSTRACT We aim to enable a biped robot to interact with humans through real-world music in daily-life environments, e.g., to autonomously keep its steps (stamps) in time with musical beats. To achieve this, the robot must robustly predict beat times in real time while listening to the musical performance with its own ears (head-embedded microphones). Most previous studies on music-synchronized robots have not addressed this, because predicting beat times in real-world music is difficult. To solve this problem, we implemented a beat-tracking method developed in the field of music information processing. The predicted beat times are then used by a feedback-control method that adjusts the robot's step intervals to synchronize its steps with the beats. Experimental results show that the robot can adjust its steps to the beat times as the tempo changes; after a tempo change, the robot needed about 25 s to recognize the new tempo and resynchronize its steps.
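The abstract describes coupling a real-time beat tracker to a feedback controller that adjusts step intervals. As a minimal illustration, the following Python sketch shows one plausible form of such a step-interval feedback law; the gain, the clamping range, and all names are illustrative assumptions, not the paper's actual controller.

# Hedged sketch of a step-interval feedback law: the interval is pulled
# toward the estimated beat period while the phase error between the next
# planned step and the next predicted beat is absorbed. All values and
# names are assumptions for illustration only.
def adjust_step_interval(step_interval, next_step_time, next_beat_time,
                         beat_period, gain=0.3):
    phase_error = next_beat_time - next_step_time   # > 0: step lands early
    new_interval = (step_interval
                    + gain * (beat_period - step_interval)  # match tempo
                    + gain * phase_error)                   # match phase
    # Clamp to a mechanically feasible range of step durations (assumed).
    return max(0.3, min(1.5, new_interval))

For example, with a current step interval of 0.8 s and a predicted beat period of 0.6 s, repeated calls shrink the interval toward 0.6 s while pulling the planned steps onto the predicted beats.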

Related publications
ABSTRACT: This paper presents the development of a dancing robot that can listen to and dance along with musical performances. A key capability of this robot is modifying its dance motions for varying tempos, without exceeding motor limitations, in the same way that human dancers modify their motions. We first observe human performances of the same musical piece at varying tempos and then analyze the modification strategies humans use. The analysis considers three body components: the lower, middle, and upper body, which we assume serve different purposes in a dance and are therefore modified with different strategies. Across the motions of all three components, we found that certain fixed postures, which we call keyposes, tend to be preserved. This paper therefore presents a method that uses these keyposes to create robot motions at a new musical tempo from human motion captured at an original tempo. We implemented these algorithms as an automatic process and validated their effectiveness on the physical humanoid robot HRP-2, which succeeded in performing the Aizu-bandaisan dance, a traditional Japanese folk dance, 1.2 and 1.5 times faster than the originally learned tempo while respecting its physical constraints. Although we have not yet achieved a dancing robot that autonomously adapts to varying musical tempos, we believe our method is a vital component of the dancing-to-music capability.
IEEE Transactions on Robotics, 30(3):771-778, June 2014.
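To make the keypose idea concrete, the following Python sketch retimes motion between keyposes for a new tempo, stretching any transition that would exceed a joint-velocity limit; the data layout, the velocity check, and all names are assumptions for illustration, not the authors' algorithm.

# Hypothetical keypose retiming: transition durations are scaled to the
# new tempo, and any transition that would demand joint speeds beyond the
# motor limit is stretched just enough to stay feasible.
def retime_keyposes(keyposes, orig_tempo, new_tempo, max_joint_vel):
    # keyposes: ordered list of (time_sec, joint_angles) at the original tempo
    scale = orig_tempo / new_tempo           # faster tempo -> shorter gaps
    retimed = [(keyposes[0][0] * scale, keyposes[0][1])]
    for (t0, q0), (t1, q1) in zip(keyposes, keyposes[1:]):
        dt = (t1 - t0) * scale
        max_delta = max(abs(a - b) for a, b in zip(q0, q1))
        if max_delta > max_joint_vel * dt:   # would violate the motor limit
            dt = max_delta / max_joint_vel   # stretch this transition
        retimed.append((retimed[-1][0] + dt, q1))
    return retimed

The keyposes themselves are preserved exactly; only the timing between them changes, mirroring the paper's observation that human dancers keep keyposes while compressing the motion in between.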
ABSTRACT: In this paper we propose integrating an online audio beat-tracking system into the general framework of robot audition, enabling its use in musically interactive robotic scenarios. To this end, we introduced a state-recovery mechanism into our beat-tracking algorithm for handling continuous musical stimuli, and applied different multi-channel preprocessing algorithms (e.g., beamforming, ego-noise suppression) to enhance noisy auditory signals captured live in a real environment. We assessed and compared the robustness of our audio beat tracker through a set of experimental setups under live acoustic conditions of incremental complexity. These included continuous musical stimuli consisting of concatenated musical pieces; noises of different natures (e.g., robot motion, speech); and the simultaneous on-the-fly processing of different audio sources for music and speech. We successfully tackled all of these challenging acoustic conditions, improving beat-tracking accuracy and reaction time to music transitions while simultaneously achieving robust automatic speech recognition.
IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2012.
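A state-recovery mechanism like the one described can be pictured as a thin wrapper that re-initializes the tracker when its tempo/phase hypothesis stops matching the audio, e.g., at a transition between concatenated pieces. The tracker API, the confidence threshold, and the miss limit in this Python sketch are assumptions, not the paper's actual system.

# Hedged sketch of state recovery around an online beat tracker: if the
# tracker's confidence stays low for several consecutive frames, its state
# is discarded and re-initialized (e.g., at a piece transition).
class RecoveringBeatTracker:
    def __init__(self, tracker_factory, miss_limit=8, threshold=0.5):
        self.tracker_factory = tracker_factory  # builds a fresh tracker
        self.tracker = tracker_factory()
        self.miss_limit = miss_limit
        self.threshold = threshold
        self.misses = 0

    def process(self, audio_frame):
        beat_time, confidence = self.tracker.step(audio_frame)
        self.misses = 0 if confidence >= self.threshold else self.misses + 1
        if self.misses >= self.miss_limit:   # hypothesis no longer fits
            self.tracker = self.tracker_factory()
            self.misses = 0
        return beat_time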
ABSTRACT: This paper suggests turning to the performative arts for insights that may help the fluent coordination and joint-action timing of human-robot interaction (HRI). We argue that theater acting and musical performance robotics could serve as useful testbeds for developing and evaluating action coordination in robotics. We also offer two insights for HRI from the theater-acting literature: the maintenance of continuous sub-surface processes that manifest in motor action, and an emphasis on fast, inaccurate responsiveness using partial information and priming in action selection.
